Feed aggregator

Limited email privacy breach on Drupal.org on April 15th

Drupal News -

On April 15th, a change to a Drupal.org website permission inadvertently allowed a small segment of users to view a report listing the email addresses of recently logged in users. No passwords were involved. The problem was mitigated within 13 hours of being introduced and within 3 hours of being reported. The problem was completely resolved within 24 hours of introduction. The number of affected email addresses is relatively small – fewer than 500. Those users are being contacted directly if their email was affected. Users with maintainer access or the community role and above were not affected by this incident.

The users with permission to see this report were limited to community members who have shown frequent contribution to Drupal.org. The possible exposure window was also limited, from April 15, 2015 20:53 UTC to April 16, 2015 9:00 UTC. Approximately 44 IP addresses accessed the information during that time, mostly belonging to Drupal.org administrators and the community members who first reported the incident.

Even though the exposure of email addresses was limited as described above, we recommend that all users be cautious of any email that asks for personal information.

We want to thank the community members who moved quickly to alert the Drupal Security and Drupal.org infrastructure teams about the problem.

Front page news: Drupal News
Drupal version: Drupal 7.x

Accepting Session Proposals - Twin Cities Drupal Camp 2015

Wisconsin Drupal Group -

We are now accepting session submissions for this year's Twin Cities Drupal Camp (and have been for the past week).

Also - it's time to register for camp!

Coming to the camp in June? Why not help make it awesome by submitting a session proposal? This camp wouldn't happen without your willingness to go out on a limb and share your experiences, your tools, and your crazy new ideas about Drupal. If your session is selected, we’ll pay your camp registration fee and you’ll be invited to an exclusive thank-you dinner for sponsors and speakers on Thursday, June 25.

The following factors will be considered in session selection: presenter experience, relevance of topic, uniqueness of topic, and variety of sessions for beginner, intermediate and advanced participants.

In general, we're looking for sessions that address one or more of the following:

  • Best practices and case studies for Drupal business owners, DevOps, developers, and designers
  • Beginner-level topics related to the Drupal ecosystem, including PHP, JavaScript, jQuery, MySQL, etc.
  • How-to guides for implementing various features using Drupal
  • Sessions helping prepare participants for Drupal 8
  • Growing and maintaining a healthy Drupal community

Examples of possible session titles (feel free to submit one of these!):

  • Leveling up your Drupal skills when you live on a desert island
  • Version control strategies for teams of 1 to 1000
  • Drupal theming with grid frameworks
  • SEO Tools in Drupal and beyond
  • An intro to Drupal Commerce
  • Usability wins for content creators
  • Advanced uses of the Rules Module
  • Manage your project, not your project management suite
  • Drupal 8 Plugins explained
  • Using automated testing tools with Drupal

For more inspiration, check out this list of Drupal Camp session ideas compiled by Cathy Theys.

We will be accepting submissions until Wednesday, May 20. Submit your sessions now!


Drupal.org Featured Case Studies: National Baseball Hall of Fame and Museum

Planet Drupal -

Completed Drupal site or project URL: http://www.baseballhall.org/

The National Baseball Hall of Fame and Museum (BHoF) is an American institution. For 75 years they have housed the archive of America's favorite game, welcoming new inductees each year and connecting generations with their huge love and knowledge of the sport.

BHoF has a large and dedicated audience, but their location in Central New York limits the number of physical visits to the museum. To reach a wider audience, they needed to unlock the full potential of their online presence.

Cogapp helps organizations use digital media, specializing in large-scale, mission-critical projects for prominent institutions.

BHoF appointed Cogapp to perform a discovery phase to research user engagement, the kinds of content that are of interest to users, and key value propositions of the website to its visitors. This work then fed into developing the site, with the central objective being to showcase the vast number of artifacts in the Hall's collection, creating connections that bring these objects to life for site visitors.

Key modules/theme/distribution used: Islandora, Imagecache External, Paragraphs, Entity API, Metatag, Features, Strongarm, Master, Varnish HTTP Accelerator Integration
Organizations involved: Cogapp
Team members: alxbridge, chapabu, tassos

Drupal Watchdog: VIDEO: DrupalCon Amsterdam Interview: Angie Byron

Planet Drupal -

Angie Byron is Director of Community Development at Acquia. For this interview, during the final day of DrupalCon Amsterdam, we were able to find an empty auditorium. Alas, filming with my brand-new GoPro camera, we got off to a topsy-turvy start...

RONNIE RAY: I’ve had you upside down.

ANGIE BYRON: Oh hahaha!

I go by Angie Byron or webchick, and more people know me as webchick than Angie Byron.

Today, what I love to do at DrupalCons, on the last day of the sprint days, is just walk around all the tables and see what everyone is working on, cause there’s hundreds of people here and they’re all sort of scratching their own itches on everything from Drupal-dot-org to, like, what is the newest coolest content staging thing gonna be?, to how are we going to get Drupal 8 done?

And everybody working together and collaborating with people they don’t get to see all the time, it’s a lot of fun for me.

I feel like we made a lot of really great decisions about the Drupal 8 release management stuff here that we’ll be able to put into practice, and help try and focus efforts on getting the critical issues resolved, trying to clean up the loose ends that we still have, and getting the release out the door faster.

And the other thing I'm going to work on for the next month is something called Drupal Module Upgrader, a script that helps contrib maintainers port their modules to Drupal 8. It automates a lot of that task.

Now that Beta is here it’s a great time for people to update their modules, so I want to work on tools to help facilitate that.

RR: What are you reading, besides books on Drupal?

AB: Not much. Although I love reading kids books, because I have a daughter who’s 16 months now and she loves to be read to. So my latest books I’ve been reading are Where is the Green Sheep? and Go, Dog, Go! and a bunch of Richard Scarry stuff and things like that because she loves to know what everything’s called. She loves books.

There’s a Dr. Seuss book called Oh, The Places You’ll Go! That book is dark, man, that is like a dark book. It’s entertaining. I remember it from when I was a kid but I don’t remember it like that!

RR: Music?

AB: I listen to a lot of old music cause I’m one of those curmudgeonly people who thinks the best music was already made. So, like I’ve been having like a ‘70s rock, ‘80s pop, ‘90s punk rock, like – that’s sort of what’s in my chain all the time. Hair metal, junk like that. How to relive my kid-age stuff.

I think the community has grown to such an enormous size now that one thing I wonder about – not really worry about, but am curious about – is whether we can still maintain that small-knit community feel that we had back when I started, when we were 70 people at a DrupalCon – not the 2,500 people we have now.

It’s cool to kind of walk around DrupalCon, especially on a sprint day, especially because I feel we have retained that – and people are finding people to connect with and cool things to work on and stuff like that.

I think something we all need to collectively be intentional about is, you know, that it's not enough that Drupal is just a great software project; it's also about the people, and trying to maintain that welcoming feeling – the one that got us all in the door – for generations to come.

So that’s something I would leave as a parting note.

Tags: DrupalCon, DrupalCon Amsterdam, Video

Aten Design Group: Speeding up Complex Drupal Data Loads with Custom Caches

Planet Drupal -

Recently we had the task of loading data from a content type with 350 fields. Each node is a University’s enrollment data for one year by major, gender, minority, and a number of other categories. CSV exports of this data obviously became problematic. Even before we got to 350 fields, with the overhead of the Views module we would hit PHP timeouts when exporting all the nodes. If you’re not familiar with Drupal's database structure, each field’s data is stored in a table named ‘field_data_FIELDNAME’. Loading an entire node means JOINing the node table by entity_id with each related field table. When a node only has a handful of fields, those JOINs work fine, but at 350 fields the query runs slow.

On this site we’re also plotting some of the data using highcharts.js. We really hit a wall when trying to generate aggregate data to plot alongside a single university's. This meant loading every node of this content type to calculate the averages, which turned our slow query into a very slow query. We even hit a limit on the number of database JOINs that can be done at one time.

In retrospect this is a perfect case for a custom entity, but we already had thousands of nodes in the existing content type. Migrating them and implementing a custom entity was no longer a good use of time. Instead, we added a custom table that keeps all the single value fields in a serialized string.

The table gets defined with a hook_schema in our module's .install file:

function ncwit_charts_schema() {
  $schema['ncwit_charts_inst_data'] = array(
    'description' => 'Table for serialized institution data.',
    'fields' => array(
      'nid' => array(
        'type' => 'int',
        'default' => 0,
        'not null' => TRUE,
        'description' => 'node id for this row',
      ),
      'tid' => array(
        'type' => 'int',
        'default' => 0,
        'not null' => TRUE,
        'description' => 'institution term id that this data belongs to',
      ),
      'year' => array(
        'type' => 'int',
        'default' => 0,
        'not null' => TRUE,
        'description' => 'school year for this node',
      ),
      'data' => array(
        'type' => 'blob',
        'not null' => FALSE,
        'size' => 'big',
        'serialize' => TRUE,
        'description' => 'A serialized array of name value pairs that store the field data for a survey data node.',
      ),
    ),
    'primary key' => array('nid'),
  );

  return $schema;
}

The most important part of the array is 'data', with type 'blob' and size 'big'; in MySQL this becomes a LONGBLOB, which can hold far more than the roughly 64 kB of a standard BLOB. Not shown is another array that creates a table for our aggregate data.

When a new node is saved, hook_node_insert() is invoked; hook_node_update() fires when an existing node is updated. Both hooks delegate to the same helper function.

/**
 * Implements hook_node_insert().
 *
 * Saves serialized field data to the inst_data table for a new node.
 */
function ncwit_charts_node_insert($node) {
  ncwit_charts_serialize_save($node);
}

/**
 * Implements hook_node_update().
 *
 * Saves serialized field data to the inst_data table when an existing
 * node is updated. New nodes are handled by hook_node_insert() above.
 */
function ncwit_charts_node_update($node) {
  if (isset($node->nid)) {
    ncwit_charts_serialize_save($node);
  }
}

Now we actually process the fields to be serialized and store. This section will vary greatly depending on your fields.

function ncwit_charts_serialize_save($node) {
  // Save each single-value field as a simple key => value pair.
  $data = array();
  foreach ($node as $key => $value) {
    $data[$key] = $value[LANGUAGE_NONE][0]['value'];
  }

  $fields = array();
  $fields['nid'] = $node->nid;
  $fields['tid'] = $node->field_institution_term[LANGUAGE_NONE][0]['tid'];
  $fields['year'] = $node->field_school_year[LANGUAGE_NONE][0]['value'];
  $fields['data'] = serialize($data);

  db_merge('ncwit_charts_inst_data')
    ->key(array('nid' => $node->nid))
    ->fields($fields)
    ->execute();
}

When a node is deleted we have some clean-up to do.

/**
 * Implements hook_node_delete().
 *
 * Also removes the node's data from inst_data.
 */
function ncwit_charts_node_delete($node) {
  if ($node->type !== 'data_survey') {
    // We only care about data_survey nodes.
    return;
  }

  $query = db_select('ncwit_charts_inst_data', 'i');
  $query->fields('i')->condition('i.nid', $node->nid);
  $result = $query->execute();
  $data = $result->fetchAssoc();
  if ($data) {
    db_delete('ncwit_charts_inst_data')
      ->condition('nid', $node->nid)
      ->execute();
  }
}

When first installed or when fields get changed, we added a batch process that re-saves the serialized strings. Aggregate data is calculated during cron and saved in another table. Rather than loading every node with JOINs, the data comes from a simple query of this custom table.

Pulling the data out of the database and calling unserialize() gives us a simple associative array of the data. To pass this data to highcharts.js we have a callback defined that returns the arrays encoded as JSON. Obviously this gets more complicated when dealing with multiple languages or multi-value fields. But in our case almost everything is a simple integer.
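To make that read path concrete, here is a minimal plain-PHP sketch of the unserialize-and-encode step. The $row array stands in for a record fetched from ncwit_charts_inst_data, and the field names are illustrative, not the site's actual fields:

```php
<?php
// A row as it might come back from db_select() on ncwit_charts_inst_data;
// the 'data' column holds the serialized single-value fields.
$row = array(
  'nid' => 42,
  'year' => 2014,
  'data' => serialize(array(
    'field_total_enrollment' => 1200,   // illustrative field names
    'field_female_enrollment' => 480,
  )),
);

// One unserialize() call recovers the whole associative array;
// no per-field JOINs required.
$fields = unserialize($row['data']);

// Encode a slice of it for a highcharts.js callback.
print json_encode(array(
  'year'   => $row['year'],
  'total'  => $fields['field_total_enrollment'],
  'female' => $fields['field_female_enrollment'],
));
```

In the real module the $row would come from a db_select() on the custom table, but the unserialize()/json_encode() round trip is the same.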

This process of caching our nodes as serialized data changed our loading speed from painfully slow to almost instant. If you run into similar challenges, hopefully this approach will help you too.

A Modern Imagery Processing Pipeline

Drupal Fire -

Development Seed (via DrupalFire)

Satellite data is a tremendously powerful resource for governments and development organizations. We built a suite of tools to make open Landsat data more accessible and usable. These allow our development partners to process imagery and perform analysis more quickly, and that can make all the difference in rapidly evolving situations.

Often our partners need commercial imagery with greater resolution and faster refresh times than Landsat 8 offers. We have great partnerships with commercial imagery providers to offer all sorts of imagery. Too often, though, receiving and processing commercial imagery is a huge pain point that slows us down and makes it harder to make use of the data. As developers, we know it could be better.

Astro Digital gave us an opportunity to rebuild this workflow from the ground up. We’ve worked closely with their team to build a satellite imagery pipeline for developers and end users. We just launched a browsing and publishing platform with Astro Digital to allow anyone to discover, process, and share satellite imagery in an incredibly quick and intuitive manner. A process that previously could take days has now been cut down to minutes.

API first

We built an end-to-end data processing pipeline that feeds a powerful data API that unlocks possibilities for others. We broke down the fundamental goals of the Platform and built API calls around each. Those goals were to search, process, and publish.

We built and exposed a performant Elasticsearch-powered endpoint, based on our previous landsat-api work, that allows complex queries to find exactly the data that is needed.

{% highlight bash %}
$ curl https://api.astrodigital.com/v1/search?search=cloudCoverageFull:[0+TO+20]
{% endhighlight %}

But how is the imagery processed? We extended our existing open source landsat-util tool to handle varying band combinations, and the API offers several, including true color, vegetation health false color, and urban false color.

{% highlight bash %}
$ curl https://api.astrodigital.com/v1/methods
{% endhighlight %}

And finally, there is a simple request that can be made to process the imagery and receive a tiled map URL. This URL can be used with tools like Mapbox or Leaflet to build upon the processed imagery in any way. Full documentation, including interactive samples, can be found at docs.astrodigital.com.

{% highlight bash %}
$ curl -X POST --data "sceneID=LC80430332014262LGN00&process=urbanFalse&satellite=l8" https://api.astrodigital.com/v1/publish
{% endhighlight %}

{% highlight json %}
{"status":"Image is being processed."}
{% endhighlight %}

Frictionless publishing

We are using this data pipeline to power an extremely easy and visual imagery browser and publishing tool. We started with Libra as a base and modified it to meet the Astro Digital specific workflow. Libra was already designed to be quick and intuitive. We added a simple publish workflow that will process and publish images and email a link to the tiled map after processing has completed. For images processed within this visual workflow, the email contains a link to an embeddable map that can be used anywhere across the web.

Working with Astro Digital, we built a modern publishing pipeline that we hope will push the entire industry to build more usable tools. This is good for the industry and good for users, particularly the small governments and development organizations that are the next wave of satellite data power users.

DrupalOnWindows: NetPhp Tutorial / User Manual

Planet Drupal -

Midwestern Mac, LLC: Thoughts on the Acquia Certified Developer - Front End Specialist Exam

Planet Drupal -

Previously, I posted my thoughts on the Acquia Certified Developer - Back End Specialist exam as well as my thoughts on the Certified Developer exam. To round out the trifecta of developer-oriented exams, I took the Front End Specialist exam this morning, and am posting some observations for those interested in taking the exam.

My Theming Background

I started my Drupal journey working on design/theme-related work, and the first few Drupal themes I built were in the Drupal 5 days (I inherited some 4.7 sites, but I only really started learning how Drupal's front end worked in Drupal 5+). Luckily for me, a lot of the basics have remained the same (or at least similar) from 5-7.

For the past couple years, though, I have shied away from front end work, only doing as much as I need to keep building out features on sites like Hosted Apache Solr and Server Check.in, and making all my older Drupal sites responsive (and sometimes, mobile-first) to avoid penalization in Google's search rankings... and to build a more usable web :)

Acquia: Sites that Cannot Fail -- Forecasting the Big Storm

Planet Drupal -

Sometimes we can’t plan for it. Sometimes we have a moment’s notice. Other times it’s our most anticipated day of the year. No matter the situation, every organization has experienced a time when their digital properties could not fail—or the business impact would be devastating.

In this blog series, we’re showcasing what it meant for three of our largest customers to have a site that could not fail. We’ll highlight both business and technical preparation, continuous improvements, platform insights, and the importance of always listening to those providing feedback on the experience.

The Story

The Weather Channel’s weather.com, one of the top 20 most trafficked sites in the US, provides millions of people every day with the world's best weather forecasts, content, and data. On average, it serves 15 million page views per day to 30 million unique visitors per month. But when major weather events loom, like a hurricane or nor’easter, the site will serve up to a billion requests a week.

These requests include delivering hundreds of dynamic maps and streaming video to users in over three million forecast locations. The site has to remain stable, with instantaneous page loads and 100 percent uptime, despite bad-weather traffic bumps of up to 300 percent.

The Weather Channel’s legacy platform was groaning under this pressure. It was using approximately 144 servers across three data centers to deliver more than 17,000 articles updated on a minute-by-minute basis.
So in November 2014, weather.com moved its entire website, which serves more than 20 million pages of content, to Drupal and the Acquia Platform, facilitated by the experts at Acquia partner MediaCurrent.

Within weeks, one of the nastiest winters on record began moving into the Midwest and Northeastern part of the US. Prodigious web traffic followed.

The new site, now the busiest Drupal site in the world, never buckled. In fact, it thrived, delivering faster, more efficiently cached pages to customers.

“weather.com is thinking ahead to a future where up-to-the-minute weather information requires an open delivery platform that adapts to fast changes in technology,” Tom Erickson, CEO, Acquia, said at the time. “The Weather Channel is leading the transformation of how we interact with weather news; people expect accurate weather forecasts on-demand, and they want to be alerted to events that may impact their life, work, travel, and leisure. weather.com is gaining the agility to deliver on customers’ increasing expectations. It’s leading the charge with contextual weather insight that anticipates every user’s needs.”

A recent global survey of more than 500 businesses for the Reducing Customer Struggle report found that companies are losing nearly a quarter of their annual online revenue due to a bad website experience. That’s billions of dollars lost and customers who won’t come back because of a digital experience that left a bad impression.

Whether you’re a weather site watching traffic rise with the barometric pressure, an enterprise facing transformation in an industry where digital transformation is lacking, or a smaller brand on the cusp of breaking into a new market, your digital presence can’t fail.

Dave Terry, co-founder and partner of client services at Mediacurrent, said, “Acquia opens up all kinds of opportunities for weather.com. The site relies heavily on the ability to quickly create and distribute massive amounts of content, and with Drupal, weather.com gains editorial agility and the ability to innovate and bring the latest applications and features to the user experience.”

Behind the Scenes

When it comes to capacity planning, some organizations plan for a worst-case scenario. They purchase larger-than-necessary capacity to be permanently available. But this is wasted money. Conversely, some organizations under-plan for traffic. Without the means to increase capacity on demand, they suffer outages and, ultimately, loss of revenue.

With Acquia Cloud, the guesswork is eliminated. You only pay for what you need. Acquia Cloud scales with burstable and elastic resources, which can be added quickly and easily on demand. Our operations team can scale up resources for any period of time, and then return resources back to normal levels when traffic subsides.

We know that scaling is complex, so we do the work for you. We add resources in real time to address changing traffic conditions seamlessly when a site needs it most. Scaling on Acquia Cloud does not require risky architectural changes like migrations and resizing. But we do scale the ecosystem, not just the hardware. We scale across all layers of the environment––web servers, file systems, databases, and load balancers. The architecture scales across the MySQL database layer using data replication and the file system layer utilizing GlusterFS to ensure syncing. The web server layer is scaled up by running active web servers in multiple availability zones. We run dedicated Memcached servers for sites with high workloads and multiple load balancers to ensure traffic is distributed. This level of Drupal-aware customization doesn't exist outside of Acquia.

As part of the scaling enablement strategy, it is important for customers to have a site insulation strategy so that visitors are unaware of traffic increases. Acquia uses Varnish caching in front of all traffic to speed up sites. Additional features such as geolocation, mobile redirection, and CDN implementation can be enabled. Acquia has over 25 personnel across our Professional Services, Technical Account Management, and Support organizations who specialize in performance, focusing on load testing, database query rewriting, stack tracing, and more.

At Acquia, our passion is customer success. Because of that, your site doesn’t become the next headline. Your best day doesn’t become your worst; your biggest events are uneventful behind the scenes. In essence, we don’t sleep, so you can. Our team of experts is on hand 24 hours a day, seven days a week, 365 days a year so that you don’t fail. You get a true partnership with Acquia.

No matter the time of day, or the size of the traffic spike, we have your back. So instead of downtime, your traffic spikes yield growth and success.

photo: NASA Goddard Space Flight Center

Tags:  web platform drupal acquia drupal planet

Acquia: Drupal is fun to use - meet Karen Grey

Planet Drupal -


I sat down with Karen Grey at Drupal Camp Brighton 2015 to find out more about who she is and what she does with Drupal. I apologize for taking her out of the code sprints for that time! Since we spoke, Karen has taken on a position as Senior Drupal Developer at i-KOS in their Brighton office.

Drupal Watchdog: RESTful Web Services Module Basics

Planet Drupal -

Article

Drupal 7 does not have built-in support for representational state transfer (REST) functionality. However, the RESTful Web Services module is arguably the most efficient way to provide resource representations for all the entity types, by leveraging Drupal's powerful Entity API. Unmodified, the module makes it possible to output the instances of the core entity types – node, file, and user – in JSON or XML format. Further entity type resources and formats are possible utilizing hooks in added code.

As with any REST solution, the RESTful Web Services module supports all four of the fundamental operations of data manipulation: create, read, update, and delete (CRUD). The corresponding RESTful API HTTP methods are POST, GET, PUT, and DELETE, respectively.

Anyone hoping to learn and make use of this module – especially for the first time – will likely be frustrated by the current project documentation, which is incomplete, uneven, and lacking clear examples. This article – a brief overview – is intended to introduce what is possible with this module, and help anyone getting started with it.

We begin with a clean Drupal 7 installation (using the Standard profile) running on a virtual host with the domain name "drupal_7_test". After installing and enabling the module, we find that it does not have the configuration user interface one might expect. In the demonstration code below, we focus on the node entity type.

Nabbing a Node

The simplest operation – reading an entity instance – is performed using a simple GET request containing the machine name of the entity type and the entity's ID.
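With the module's default configuration, that GET request takes the form of the entity path plus a format suffix, e.g. http://drupal_7_test/node/1.json (node ID 1 is illustrative). Below is a sketch of decoding such a response in plain PHP; the canned JSON body stands in for the live HTTP response, and the field values are examples only:

```php
<?php
// Trimmed-down stand-in for the JSON body returned by
// GET http://drupal_7_test/node/1.json (values illustrative).
$response_body = '{"nid":"1","type":"article","title":"Hello world","status":"1"}';

// In a real client the body would come from an HTTP GET, e.g. via
// drupal_http_request() or curl; here we only decode it.
$node = json_decode($response_body, TRUE);

printf("Node %s (%s): %s\n", $node['nid'], $node['type'], $node['title']);
```

Note that the module returns entity property values as strings in JSON, so a client should not assume integer types without casting.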

Drupal Association News: We Love Our Volunteers!

Planet Drupal -

This week is National Volunteer Week, a week to recognize that volunteerism is a building block of a strong and thriving community. The Drupal community is no different: as an open-source project, our volunteers are vital to the health and growth of our project. There are so many roles and levels of contribution within our Drupal ecosystem that we at the Drupal Association wanted to highlight how much your contribution means to us and our work. I took some time and asked around; here's some of the glowing praise our staff has for our phenomenal volunteers.

“I am continually impressed with the volunteers that I get to work with.  Not only do they rock at their jobs, but they are so dedicated to the work that they do for Drupal and the Cons specifically!  Anyone who has volunteered for a Con knows that it is a large undertaking, and a responsibility that isn't taken lightly. These volunteers come back each week with positive attitudes, valuable ideas and great results.  Although I have only been at the Association for a little over six months, I can truly say that these volunteers are what gives our Cons the 'special sauce' and I am lucky to get to work with volunteers from around the globe on a daily basis.” 

- Amanda Gosner, DrupalCon Coordinator

“Most of my day is spent with Drupal Association staff, who have the luxury of getting paid to think about Drupal for 8 hours a day. A good chunk of my job is working with volunteers, though: the Board of Directors, Drupal.org Working Groups, community organizers, DrupalCon session speakers. So many of you give so much of your time and your smarts back to the project and the community, and it's my privilege and duty to learn from you all.”

- Holly Ross, Executive Director

"I look forward to working with community volunteers to help build and improve Drupal.org. The site would not be where it is today without everyone's work."

- Neil Drumm, Drupal.org Lead Architect


“I want to thank Cathy and Jared for being my sprint mentors at DrupalCon Latin America. I made my first comment on the issue queue. It felt so good to cross into that world finally, even if it was just a baby toe crossing over.”

- Megan Sanicki, COO


“It feels like I’m hearing news every day about the amazing programs our community members put together all over the world — from Los Angeles to Uganda and beyond. Without help from amazing community volunteers who donate time working on social media, in the issue queues, or even volunteers who take a brief moment to drop a note in my inbox (“have you seen this?”), these stories would never be shared with our wider community.” 

- Leigh Carver, Content Writer

Today, we invite you to take a few minutes to recognize your fellow Drupal contributors by tweeting or sending a message via IRC to appreciate each other.  After all, without our volunteers, our Drupal Community would not be as lively, bright, and welcoming.  Want to lend a hand?  Our get involved page has plenty of ways to volunteer with the project.

KnackForge: Drupal 7 - Hooking Ajax events and views refresh

Planet Drupal -

Drupal has a solid Ajax interface, and we can hook into its Ajax events at various points. I will explain five important methods:

  • beforeSerialize - called before the form data is packed; runs before beforeSubmit and beforeSend
  • beforeSubmit - called before the Ajax request
  • beforeSend - called just before the Ajax request is sent
  • success - called after the Ajax request returns data
  • complete - called after the request ends

Let's say you want to capture some Ajax event (built in, or provided by another module) to do something like a Views refresh. We can use a very simple approach to do that.
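One common Drupal 7 pattern for this is wrapping a method on Drupal.ajax.prototype so custom code runs alongside the original handler. The sketch below demonstrates the wrapping technique itself; the Drupal object is a stub standing in for the real global from misc/ajax.js, and the lastStatus property is purely illustrative:

```javascript
// Stub standing in for the Drupal global that misc/ajax.js provides.
var Drupal = { ajax: function () {} };

// Simplified stand-in for core's success handler.
Drupal.ajax.prototype.success = function (response, status) {
  return 'core-success';
};

// Wrap the original so our code runs after every Ajax success,
// without losing the core behavior.
var originalSuccess = Drupal.ajax.prototype.success;
Drupal.ajax.prototype.success = function (response, status) {
  var result = originalSuccess.apply(this, arguments);
  this.lastStatus = status; // custom reaction (e.g. refresh a view) goes here
  return result;
};

var instance = new Drupal.ajax();
instance.success({}, 'ok');
```

On a real site the same wrap is done in your module's JavaScript after misc/ajax.js has loaded, and the custom reaction would typically re-trigger the view's own Ajax refresh rather than set a property.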


Subscribe to Cruiskeen Consulting LLC aggregator