Feed aggregator

Holly Ross on the Drupal Association

Drupal Fire -

Lullabot (via DrupalFire)

In this episode of Hacking Culture, Matthew Tift talks with Holly Ross, the Executive Director of the Drupal Association, about the Drupal community, the Drupal Association, non-profits, business, tax codes, and more. They get into some controversial issues, and some of Holly's answers may surprise you!

Dcycle: Add unit testing to legacy code

Planet Drupal -

To me, modern code must be tracked by a continuous integration server, and must have automated tests. Anything else is legacy code, even if it was rolled out this morning.

In the last year, I have adopted a policy of never modifying any legacy code, because even a one-line change can have unanticipated effects on functionality, plus there is no guarantee that you won't be re-fixing the same problem in 6 months.

This article will focus on a simple technique I use to bring legacy Drupal code under a test harness (hence transforming it into modern code), which is my first step before working on it.

Unit vs. functional testing

If you have already written automated tests for Drupal, you know about Simpletest and the concept of functional web-request tests against a temporary database: the vast majority of tests written for Drupal 7 code are based on DrupalWebTestCase, which builds a Drupal site from scratch on a temporary database (often installing something like a site deployment module) and then allows your test to make web requests against that site. It's all automatic, and the temporary environment is destroyed when the tests are done.

It's great, and it really simulates how your site is used, but it has some drawbacks. First, it's a bit of a pain to set up: your continuous integration server needs a LAMP stack or has to spin up Vagrant boxes or Docker containers, and you need to set up virtual hosts for your code. Most importantly, it's very time-consuming, because each test case in each test class creates a brand-new Drupal site, installs your modules, and destroys the environment when it's done.

(I even had to write a module, Simpletest Turbo, to perform some caching, or else my tests were taking hours to run (at which point everyone starts ignoring them) -- but that is just a stopgap measure.)

Unit tests, on the other hand, don't require a database, don't do web requests, and are lightning fast, often running in less than a second.

This article will detail how I use unit testing on legacy code.

Typical legacy code

Typically, you will be asked to make a "small change" to a function which is often 200+ lines long, uses global variables, performs database requests, and makes REST calls to external services. But I'm not judging the authors of such code -- more often than not, git blame tells me that I wrote it myself.

For the purposes of our example, let's imagine that you are asked to make a change to a function which returns a "score" for the current user.

function mymodule_user_score() {
  global $user;
  $user = user_load($user->uid);
  $node = node_load($user->field_score_nid['und'][0]['value']);
  return $node->field_score['und'][0]['value'];
}

This example is not too menacing, but it's still not unit testable: the function calls the database, and uses global variables.

Now, the above function is not very elegant; our first task is to ignore our impulse to improve it. Remember: we're not going to even touch any code that's not under a test harness.

As mentioned above, we could write a subclass of DrupalWebTestCase which provisions a database, creates and populates a user and a node, and then runs the function.

But we would rather write a unit test, which does not need externalities like the database or global variables.

But our function depends on externalities! How can we ignore them? We'll use a technique called dependency injection. There are several approaches to dependency injection, and Drupal 8 supports it very well with PHPUnit, but we'll use a simple implementation which requires the following steps:

  • Move the code to a class method
  • Move dependencies into their own methods
  • Write a subclass that replaces dependencies (not logic) with mock implementations
  • Write a test
  • Then, and only then, make the "small change" requested by the client

Let's get started!

Move the code to a class method

For dependency injection to work, we need to put the above code in a class, so our code will now look like this:

class MyModuleUserScore {
  function mymodule_user_score() {
    global $user;
    $user = user_load($user->uid);
    $node = node_load($user->field_score_nid['und'][0]['value']);
    return $node->field_score['und'][0]['value'];
  }
}

function mymodule_user_score() {
  $score = new MyModuleUserScore();
  return $score->mymodule_user_score();
}

That wasn't that hard, right? I like to keep each of my classes in its own file, but for simplicity's sake let's assume everything is in the same file.

Move dependencies into their own methods

There are a few dependencies in this function: the global $user, user_load(), and node_load(). None of these are available to unit tests, so we need to move them out of the function, like this:

class MyModuleUserScore {
  function mymodule_user_score() {
    $user = $this->globalUser();
    $user = $this->user_load($user->uid);
    $node = $this->node_load($user->field_score_nid['und'][0]['value']);
    return $node->field_score['und'][0]['value'];
  }

  function globalUser() {
    global $user;
    return $user;
  }

  function user_load($uid) {
    return user_load($uid);
  }

  function node_load($nid) {
    return node_load($nid);
  }
}

Your dependency methods should generally only contain one line. The above code should behave in exactly the same way as the original.

Override dependencies in a subclass

Our next step will be to provide mock versions of our dependencies. The trick here is to make our mock versions return values which are expected by the main function. For example, we can surmise that our user is expected to have a field_score_nid, which is expected to contain a valid node id. We can also make similar assumptions about how our node is structured. Let's make mock responses with these assumptions:

class MyModuleUserScoreMock extends MyModuleUserScore {
  function globalUser() {
    return (object) array(
      'uid' => 123,
    );
  }

  function user_load($uid) {
    if ($uid == 123) {
      return (object) array(
        'field_score_nid' => array(
          LANGUAGE_NONE => array(
            array(
              'value' => 234,
            ),
          ),
        ),
      );
    }
  }

  function node_load($nid) {
    if ($nid == 234) {
      return (object) array(
        'field_score' => array(
          LANGUAGE_NONE => array(
            array(
              'value' => 3000,
            ),
          ),
        ),
      );
    }
  }
}

Notice that our return values are not meant to be complete: they only contain the minimal data expected by our function: our mock user object does not even contain a uid property! But that does not matter, because our function is not expecting it.

Write a test

It is now possible to write a unit test for our logic without requiring the database. You can copy the contents of this sample unit test to your module folder as mymodule.test, add files[] = mymodule.test to your mymodule.info, enable the Simpletest module, and clear your cache.

There remains the task of actually writing the test: in your testModule() function, the following lines will do:

public function testModule() {
  // Load the file or files where your classes are located. This can
  // also be done in the setUp() function.
  module_load_include('module', 'mymodule');
  $score = new MyModuleUserScoreMock();
  $this->assertTrue($score->mymodule_user_score() == 3000, 'User score function returns the expected score');
}

Run your test

All that's left now is to run your test:

php ./scripts/run-tests.sh --class mymoduleTestCase

Then add the above line to your continuous integration server to make sure you're notified when someone breaks it.

Your code is now ready to be fixed

Now, when your client asks for a small or big change, you can use test-driven development to implement it. For example, let's say your client wants all scores to be multiplied by 10 (30000 should be the score when 3000 is the value in the node):

  • First, modify your unit test to make sure it fails: make the test expect 30000 instead of 3000
  • Next, change your code iteratively until your test passes.
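To make that red-green cycle concrete, here is a sketch of what the code looks like once the change passes, with the mock data from earlier inlined so it runs outside Drupal. The class name and field values follow the article's example, but this is an illustration, not the article's exact code:

```php
<?php
// Hedged sketch: the score class with its dependency methods replaced inline
// by the same minimal mock data used earlier, so the "small change" can be
// test-driven without a database.
class MyModuleUserScoreExample {
  function globalUser() {
    // Minimal mock of the global $user.
    return (object) array('uid' => 123);
  }

  function user_load($uid) {
    // Minimal mock user: only the field the logic actually reads.
    return (object) array('field_score_nid' => array('und' => array(array('value' => 234))));
  }

  function node_load($nid) {
    // Minimal mock node holding the raw score.
    return (object) array('field_score' => array('und' => array(array('value' => 3000))));
  }

  function mymodule_user_score() {
    $user = $this->globalUser();
    $user = $this->user_load($user->uid);
    $node = $this->node_load($user->field_score_nid['und'][0]['value']);
    // The client's requested change: multiply the stored value by 10.
    return $node->field_score['und'][0]['value'] * 10;
  }
}

$score = new MyModuleUserScoreExample();
// The expectation was changed to 30000 first; it fails against the old
// one-line return, and passes once the multiplication is added.
assert($score->mymodule_user_score() == 30000);
```

The order matters: seeing the updated test fail first proves the test actually exercises the line you are about to change.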
What's next

This has been a very simple introduction to dependency injection and unit testing for legacy code: if you want to do even more, you can make your Mock subclass as complex as you wish, simulating corrupt data, nodes which don't load, and so on.

I highly recommend getting familiar with PHPUnit, which is part of Drupal 8, and which takes dependency injection to a whole new level: Juan Treminio's "Unit Testing Tutorial Part I: Introduction to PHPUnit", March 1, 2013 is the best introduction I've found.

I do not recommend doing away entirely with functional, database, and web tests. But a layered approach, where most of your tests are unit tests and you limit the use of functional tests, will keep your test runs below an acceptable duration, making them all the more useful and increasing the overall quality of new and even legacy code.

Tags: blog, planet

Acquia: Real world change with PHP and community: "The sky's the limit."

Planet Drupal -


Michelle Sanver, developer at Liip, and I sat down and talked at SymfonyCon 2014 in Madrid. Michelle and I have a number of interests in common (community, FTW!) and I really enjoyed getting to know her better in a conversation in front of my microphone and camera. We covered her long history in PHP, her SymfonyCon presentation (Life After Assetic: State of Art Symfony2 Frontend Dev), the PHP Renaissance bringing communities together, Michelle's "open source addiction", building PHP applications that touch the lives of almost everyone in Switzerland, and more.

CiviCRM Blog: Load test Drupal and CiviCRM with LoadImpact

Planet Drupal -

This has been my approach (together with CiviCoop) to load testing a big CiviCRM site where most visitors were expected to log in.
Let me know if you agree with this approach or if you have a better alternative.

Every big Drupal site needs load testing before going live.

These are the key questions you should have answered in the final stages before deployment:

  • How does your infrastructure handle the expected number of visitors?
  • How does it perform with the maximum number of visitors?
  • At what number of visitors does it start to crumble?

For anonymous load testing there are a number of tools available.
For logged in users there are not so many available.

But what about sites like Drupal, where logging in is secured with a unique token per user visit?

How do you load test those?

You can usually record or script what you need to post during login.

But sites like Drupal secure their login pages with a unique token, so you do not know beforehand everything you will need to post.

With LoadImpact.com that problem is solvable.

LoadImpact automates load testing and gives graphs like:

Example of LoadImpact graphs

To set up a load test on Load Impact for logged-in users, follow these steps:

Step 1: Record one or more user scenarios with the Load Impact Chrome plugin: https://chrome.google.com/webstore/detail/load-impact-user-scenario/comn...

Step 2: Export the user scenario to Load Impact.

Step 3: Look into the generated LUA code and find the GET request to the login page.

Step 4: Change it so the form token is gathered and placed in a variable:

For example:

http.page_start("Page 1")
local pages = http.request_batch({
  {"GET", "https://www.domain.com/user", response_body_bytes=10240}
})
local body = pages[1]['body']
local token = string.match(body, 'input type="hidden" name="form_build_id" value="(.-)"')

Step 5: Find the POST request to the Drupal login page and replace the "form_build_id" value with the token:

if token ~= nil then
  http.request_batch({
    {"POST", "https://www.domain.com/user",
      headers={["Content-Type"]="application/x-www-form-urlencoded"},
      data="form_build_id=" .. token .. "&form_id=user_login&name=<username>&op=Log%20in&pass=<password>",
      auto_decompress=true}
  })
else
  log.error("failed to find token: " .. body)
end

And you're done. Load tests can now be performed with thousands of concurrent logged-in users on your Drupal site.

If your user scenario contains other form submissions you can repeat this for the other forms as well.

Using CiviCRM as an example: something similar is needed when CiviCRM searches are performed.

CiviCRM adds a session dependent qfKey to every search. Without the right qfKey a search will not be executed properly, harming the load test.

To solve this you have to execute the following steps in the Load Impact user scenario.

Step 1: Find the GET page for the search and place the qfKey in a variable

local pages = http.request_batch({
  {"GET", "https://www.domain.com/civicrm/contact/search?reset=1", response_body_bytes=102400}
})
local body = pages[1]['body']
local token = string.match(body, 'input type="hidden" name="qfKey" value="(.-)"')

Step 2: Find the POST request to the search page and replace the qfKey with the token:

if token ~= nil then
  http.page_start("Page 5")
  http.request_batch({
    {"POST", "https://www.domain.com/civicrm/contact/search",
      headers={["Content-Type"]="application/x-www-form-urlencoded"},
      data="_qf_Basic_refresh=Search&_qf_default=Basic%3Arefresh&contact_type=&entryURL=https%3A%2F%2Fwww.domain.com%2Fcivicrm%2Fcontact%2Fsearch%3Freset%3D1&group=&qfKey=" .. token .. "&sort_name=&tag=",
      auto_decompress=true}
  })
  http.page_end("Page 5")
else
  log.error("failed to find token: " .. body)
end

With that, proper CiviCRM searches also work in your Load Impact user scenario, and you can load test your Drupal+CiviCRM site before deployment.

Originally posted on http://orgis.com/en/blog/web/professional-load-testing-drupal-and-civicr...


What is an isomorphic application?

Lullabot -

Javascript was traditionally the language of the web browser, performing computations directly on a user’s machine. This is referred to as “client-side” processing. With the advent of Node.js, JavaScript has become a compelling “server-side” language as well, which was traditionally the domain of languages like Java, Python and PHP.


KnackForge: How to install Gitlab 7.8 on Centos 5.5 with Apache and MySQL

Planet Drupal -

GitLab is a web-based Git repository manager with wiki and issue tracking features. GitLab is written in Ruby on Rails.

Installing the GitLab omnibus package isn't difficult if you follow the guide given here. But installing GitLab on CentOS 5.5 isn't as easy, because no omnibus packages are available for CentOS versions below 6.5. So let's look at the steps needed to install GitLab successfully on CentOS 5.5:

1. Install the development tools necessary to compile applications from source

Modules Unraveled: 138 Organize and Manage Your Drupal Projects Using Dropfort with Mathew Winstone - Modules Unraveled Podcast

Planet Drupal -

Published: Wed, 06/10/15
Download this episode

Dropfort
  • What is Dropfort?
    • Dropfort is a suite of tools to develop and manage Drupal applications.

On the development side, it integrates with GitHub and GitLab to track commits, issues and tags. It then packages releases based on those tags and lets you share those releases with your Drupal sites, the same way you tag and create releases on Drupal.org. The only difference is that the released modules are private, meaning a site that wants to use those modules needs to authenticate to download them.

For example, if you want to download a custom module to your site from dropfort, you can just do a “drush dl mymodule --source=https://app.dropfort.com/fserver/release-history”. Dropfort generates the same XML data for its modules as does Drupal.org for contrib modules. Meaning the Update module works with Dropfort, all your Drush commands work and Drush make works too. It’s all pretty seamless. The only difference with our XML is that it's not publicly available. Your site has to be allowed to see the update feed which is what you configure in the Dropfort web app itself.

The other half of Dropfort is the operations or “ops” tools. You connect your sites to Dropfort using the Dropfort Update module (which is available on Drupal.org) and it will start doing a few things. The most obvious is tracking the update status of your site. Being a Drupal shop, we monitor a few dozen Drupal sites at once, so logging into each site to see what modules need updating and the status of those sites is time-consuming. What Dropfort lets us do is see all those sites in one dashboard. I can log in and see the update status and status report data from all the sites in one place. Dropfort then uses this data to generate some graphs about your sites. For example it can tell you how many dev modules you’re using across all your sites, it shows you a list of what sites have security updates and it does some fancy calculations to grade your site’s health as well. Lots of metrics to know what’s going on and how things are changing over time.

The last part, and this is what I’ve been working on mostly these last few months, is an Environment manager. It’s still pretty fresh and there are some rough edges but it does work. You can create a set of environments (dev, testing or production) to store machine configurations to both do your development and run your Drupal applications. You can say “I want a server running apache, with MySQL and the Commerce Kickstart distro” on it. Then you can either download a Vagrant file which will provision a vm, or you can download a docker container which will do the same. Or you can run a bash script on an existing machine to link the server to Dropfort and have that configuration deployed. It’s pretty neat stuff. Basically any server anywhere can be turned into a managed Drupal cloud.

Like I said, there’s a whole suite of stuff in here.

  • Why did you create Dropfort? Where did it start?

It all really came from using Feature Server for Drupal 6. When we incorporated Coldfront in 2011 and really turned it into a full time job, we wanted a way to distribute code to all our clients (even though at the time I think we only had one). We set up FServer to deploy code to our client sites. But manually creating the releases and the pages and stuff was kind of a pain. So we came up with a special commit syntax we could add to Subversion (this was before Git was a big thing). So we could make our svn commit, and in the message we’d write [release:full] and some post-commit scripts would run on the svn server. They’d look at the commit message and then take the code, create a tgz file, create a tag, commit the tag and then upload the tgz file to FServer using some REST web service endpoints we created on a Drupal 6 site with Services. That would create the release page, the release notes, add the files and generate the update XML. It was pretty well a mini Drupal.org but with Subversion (instead of CVS which D.O was still using at the time). It actually worked really well. So well in fact that the University of Ottawa asked if we could install a version for them to manage their Drupal stuff (which they’re actually still using today until their git migration is done). That’s when the lightbulb went off I guess. We had built this stuff for us but as it turns out, other people want to deploy custom modules too! Who’da thunk it?

And that’s when the idea for a more “web app” version of our SVN workflow came to be. At the time I thought “Yeah we can totally rewrite this in no time, I give it 6 months and we’ll have a web app ready to go”. That was 3 years ago I think? It took a bit longer than expected. Mostly I kept adding features… Yay scope creep. But now we’ve got a pretty awesome suite of tools and we’re focusing on the polish now. I’ve been told I’m not allowed to make feature requests for at least a month. We’ll see how that goes.

  • Can this be used to compete with Drupal.org? Meaning can people share public releases here instead of on d.o?

Nope. Public modules should be on Drupal.org. No module can be downloaded from Dropfort without authenticating first. We don’t want to supersede d.o in any way. We actually looked into writing a feature to automatically move a project from being private to public on d.o, but since Drupal.org doesn’t have an API we couldn’t do that. But yeah, this is for privately distributed modules only.

  • Does the monitoring dashboard check both the private projects as well as public ones on d.o?

    • Yes
  • What did you build it with? Is Dropfort open-source?

The original SVN workflow used Subversion, CLI PHP, Drupal 6, 2 custom modules and some REST / Services stuff.

Dropfort uses Git, Drupal 7, Services, about 20 or so custom modules, Puppet and our Drupal 7 port of FServer. So Dropfort is a Drupal application itself. We actually use Dropfort to manage Dropfort. Meaning we track its own updates and status using itself, and it packages releases for itself. A little inceptiony, but it works.

Most of the parts which make up Dropfort are open. Some of the custom modules aren’t openly available. But that’s mostly because we don’t have the bandwidth to help and support the distro on D.o. Especially the stuff involving setting up a Puppet master. We’d spend more of our time debugging that than actually making things work. Doesn’t mean we won’t share everything eventually, just not right now.

  • What’s the plan for Dropfort? Is it a paid service or is it free?

Right now it’s free to use. The main reason for that is we haven’t written the commerce component yet so we can’t actually charge for it so… yeah. But we’re looking at different ways of monetizing. It’s tricky cause we want people to use it but at the same time we don’t really know what people will use. There’s such a variety of things in there it’s tough to decide what should be charged for. For example Github is pretty straightforward in their pay structure. If the code is open, your repo is free. If your code is closed, you pay for the repo. For us, I think it’s a question of usage. We’re leaning towards all tools are free to use with any account, it’s just a question of how much storage or how many sites you’re using. But regardless, anyone who uses it now is free to use it as much as they want. And we’ll have some special plans for early adopters as a thanks for their feedback. More than likely a bunch of free stuff.

  • How does this compare to other tools like Platform.sh, Pantheon, Acquia Dev Cloud?

The big difference is that they’re primarily a hosting platform. Dropfort is a management platform. You can connect a Pantheon site or an Acquia Dev cloud site to Dropfort and use most of the features no problem. You’d probably skip the release packaging stuff and environment management (for now) but the stats tracking and collaboration tools would work just fine. Dropfort doesn’t care where or how you run your Drupal site. As long as it can reach the internet, you can use Dropfort.

But you can use Dropfort with GitHub or GitLab or neither. You can use Vagrant or Docker or both. We do our best to integrate with anything which might make building Drupal applications easier. It’s all about choice.

As for the hosting side of things, we give you tools to deploy your own server or cloud of servers. Meaning you can run an optimized network of Drupal web servers on whatever provider you want. It’s a philosophical difference. We let you host your code and sites wherever you want whereas with others you live on their machines. Which can have a lot of advantages and for the majority of folks out there, that’s fine with them.

But for us we’ve found it difficult for some enterprises here in Canada to get hosting on services in the US which are bound by the Patriot Act. We have FIPA, the Freedom of Information and Privacy Act which states that we can’t share information about users unless the user has explicitly allowed that agency access. The Patriot Act is pretty much the exact opposite of that. So we figured we’d bring most if not all of the advantages of a cloud solution (the optimized configuration, automated deployments / scaling, generated environments) to anyone’s infrastructure. You just supply the hardware, Dropfort does the rest.

I see it as just another option in how you can host your Drupal site. You can choose how much or how little you want to be involved in managing the hosting environment. Whichever way works best for you is the one you should go with.

  • How does this handle dev/staging/live scenarios?
  • How about local?
Use Cases
  • Let’s talk about the current use cases for Dropfort.
    • Managing several sites in one place
    • Create custom, shareable development environments
    • Create releases of projects destined for more than one application
  • Why would you use Dropfort instead of just Git to manage deployments?
    • We use Drush Make for just about everything. We control our releases using Drush make. We apply patches with Drush make. We really like Drush make. And we really don’t like merge conflicts. The number of times I’ve come into a project where the entirety of Drupal core and all the contrib modules are in a single repo with a team of people trying to all work on it at once has taught me that’s not the way to work. Treat your projects like d.o does, as self contained sets of functionality. Use make files to build your application and drush to manage updates. This is how Drupal is designed to work. Drupal is a collection of modules. When you all of a sudden lump it all together into a single repo you’re breaking how Drupal was meant to be managed.
  • Can you explain a bit more about how Drush Make works?
  • You just mentioned automated updates.
  • What’s in the near, and far future for Dropfort?
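The Drush Make workflow described above can be sketched with a minimal make file. All project names, versions and the Git URL below are placeholders for illustration, not Dropfort's or Coldfront's actual configuration:

```ini
; mymodule_site.make - hypothetical build definition for a Drupal 7 application.
core = 7.x
api = 2

; Drupal core is downloaded at build time, never committed to your repository.
projects[drupal][type] = core

; Contrib modules pinned to specific releases, exactly as d.o packages them.
projects[views][version] = 3.11
projects[ctools][version] = 1.7

; A private custom module pulled from its own repository (URL is a placeholder).
projects[mymodule][type] = module
projects[mymodule][download][type] = git
projects[mymodule][download][url] = "git@example.com:client/mymodule.git"
```

Running drush make against a file like this assembles the whole application into a docroot, so version control only holds the make file and the custom code, which is what avoids the single-giant-repo merge conflicts described above.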
Episode Links:
  • Mathew on drupal.org
  • Mathew on Twitter
  • Dropfort on Twitter
  • Dropfort Website
  • Coldfrontlabs.ca
  • Coldfrontlabs GitHub

Tags: Monitoring, Updates, planet-drupal

Drupal Association News: Bart's Bash: Breaking World Records With Drupal

Planet Drupal -

Breaking a Guinness world record is no easy feat, but in 2014, the folks behind Bart’s Bash did just that. With help from Drupal, they coordinated the world’s largest-ever sailing race — a fundraising event in memory of Andrew “Bart” Simpson.

Bart Simpson was a British sailor who won a gold medal at the 2008 Summer Olympics in Beijing, a silver medal at the 2012 Summer Olympics in London, and medaled in numerous World and European Championships. After Simpson was killed in a sailing accident in May of 2013 while training for the 2013 America’s Cup, his friends and family went on to found the Andrew Simpson Sailing Foundation in his memory.

“Andrew had passed away six months before [we began organizing Bart’s Bash],” said David Bishop, who built the website for Bart’s Bash. David is a sailor and runs NinetyOne Consulting out of Shropshire, England with his wife, who did much of the design work for the Bart’s Bash site. “When we set out initially, our goal was [to reach] only fifty sailing clubs, to raise £10,000, and see 2,000 people on the water."

The Andrew Simpson Sailing Foundation exists to inspire personal growth in young people through sailing. According to the Bart’s Bash 2014 website, "Many of our Olympic sailors have described the first time they were given charge of a boat as their moment of clarity – the first instance they felt true responsibility and in command of their destiny.  Whether or not children will take up sailing as a pastime, many studies have shown that children who are confident, have self worth and personal resilience do better in every way.  They are happier in their personal and family life, they are better able to learn, do better at school and in employment and they are more open to new experiences in life.  We aim to provide an avenue to that fulfilment and have global ambitions to spread the attitude, inspiration and personality of Andrew Simpson around the world."

“Initially, there was a Facebook page that had been set up in memory of Bart, and it only had about five thousand followers,” said David. “So we built a one page website for the event, and I put social sharing buttons on it. We were very quickly up to several thousand shares on Facebook, and hundreds on Twitter.

"Within three or four weeks, over 300 sailing clubs had come to us and said, ‘we want to be involved,’” David continued. “So we had to change what the event was going to be. Initially, we were just going to be a dinghy event in the UK, but because of the international interest from yacht clubs, kitesurfing clubs, model yacht clubs... all these people wanted to be a part of it, and we wanted to accommodate them as much as possible.”

The perfect platform for breaking records

As it turned out, Drupal was the perfect platform for this rapidly-growing event. “The whole concept of Bart’s Bash was that there’s no overriding governance. It's about engaging sailing clubs and getting someone at each venue to say, 'I’ll hold an event here, I’ll manage it,’” said David. “From that point of view it was a massively volunteer, community driven event. We’ve been as open as possible about making sure clubs can make their own pages and manage their own content, to make the event as successful as possible."

For David, that meant building a platform that sailing clubs around the world could use and make their own.

“I’ve built the system so that each club can create their own page,” David said. "They log in to a control panel, upload their own content, and manage it themselves. With the flexibility of the Simple CCK module, and blocks and views, it was possible for me to do rapid development. I built the whole thing myself. I had a little help from a local web development company — a day’s support, maybe — but other than that, one person built this whole system, and the scale it gives you is phenomenal.

“It’s interesting, because one of the areas that this has shown that the foundation can go into is providing services around the world just as a club web page. A lot of sailing clubs might not have a page that looks as nice as this, or that isn’t mobile responsive. But all of this is. So that’s actually one of the services that the foundation is looking at: we’re thinking of turning this into a ‘Learn to Sail' directory where you can find information about sailing at clubs near you.

“It’s amazing how good Drupal is as a platform — it definitely works for something like this,” David continued. “It’s just so flexible and so scalable. We put up the site for the 2014 event, and translated one of the key pages into eight or nine different languages. As you know, you turn on the international module and add the different variations, and you’re done. Drupal is the only platform out there that does this."

“A lovely festival of sailing"

Building a scalable, global website was only the beginning of holding a worldwide race, however.

“One of the biggest challenges was that it was going to be a global race — so how do you rank people racing in different time zones, in different classes?” David said. “We worked with a formula so we could calculate speed — a handicapped speed, if you will — so people in fast boats were adjusted for slow boats. Ultimately it came down to where the wind was in the world on that day. We were fortunate to receive a lot of help from the UK’s Royal Yachting Association with this challenge."

“After the race, we split the results up by age, experience, wind conditions, country, and boat class, which was key,” David continued. “We were able to produce a very nice set of statistics, and that’s something that hasn’t really been done in sailing before. In most sailing races, you just get a very straightforward set of results to see the winners. But it turns out our way was really popular: we saw a lot more traffic to the website after the events, and it continued for the next few weeks. As more results came in over those few weeks, it was great to see how the top 10 moved up and down."

But for the sailors, it turns out it wasn’t all about winning. “We thought people were going to be obsessed about the results, and we weren’t sure how we’d validate it,” David said. “But in reality, we had massive boats in the same start lines as a 7-year-old kid in a tiny boat. It turned out people didn’t care about the race so much. Instead, it became this lovely festival of sailing."

Breaking world records

With the size of the event, the Bart’s Bash organizers were certain they’d be able to break a world record.

"For Guinness we had to get video of every start and every finish, plus steward and witness statements, and then we had to send each club’s bundle in. With more than 500 venues around the world participating, we wound up with nearly 10,000 boats qualified as being part of the world record,” said David. In total, the group collected and calculated results for 30,754 sailors across 52 countries.

“It was another great way to get people involved in the event,” he continued. “Telling them that they're going to be a Guinness world record holder."

When it comes to the next year of races, David has high hopes. “For 2015, the Guinness restrictions have been lifted, as we want to encourage small clubs that were not large enough to qualify under the rules Guinness required last year. Also in 2015, we want more non-sailors on the water at more clubs around the world. To help make this happen, we have come up with an idea called ‘Bart’s Buddies’ aimed at taking your mates sailing. There will also be a special ‘Buoy Race’ which will make it easier to get all of the wonderful volunteers sailing. To help showcase that, this year’s website is much more geared around showing the photos and videos taken by each club around the world."

“Ultimately, three things brought the whole event together last year, and are pushing it forward this year, too,” David said. “First, it's a worthy fundraising reason. People want to do something in Andy’s memory. Second, it's a challenge — and sailors love challenges. Lastly, though, it brings a global community together, and Drupal as a platform enabled that to happen. We could create maps showing where people were, using the OpenLayers module. We could personalize the website for different people, and could drill down into data and results.”

“Really, this is the first census for sailing activity done around the world in one day. It hadn’t been done before, which makes this website and event historic from that point of view,” said David. “We’ve been approached by other sailing associations and foundations, saying 'we want to do this, can we use the data you’ve collected.’"

As for what comes next, David is excited for the race coming up in September.

“A big sailing club signed up to participate in Barcelona last year,” David said. “And this year, the race is on 20 September — the day before DrupalCon Barcelona happens. Perhaps we’ll be able to get some Drupalers out there?

“The fact that Drupal exists means that Bart’s Bash happened. It has a lot of thanks to give to Drupal,” David concluded.

If you're interested in participating in Bart's Bash at DrupalCon Barcelona, let us know.

Sailing image credit to Gorazd Božič on Flickr.

Importing CSS Breakpoints Into Javascript

Lullabot -

There are a lot of challenges within responsive web design, and one that has constantly been a pain is triggering JavaScript based on the current CSS media query breakpoint. The problem is that the breakpoints are in CSS, which JavaScript has no native way to access. Many solutions (including window.matchMedia(), and Enquire.js) involve declaring your breakpoints in both CSS and JS, or require IE10+.

Cheeky Monkey Media: Responsive images with Foundation Interchange

Planet Drupal -

Having a mobile-friendly, responsive website is always a good idea. Having a responsive website that loads really fast is even better. Large images are often a bottleneck and a common cause of slow page loads. A great way to solve this is to serve up different images based on the screen size instead of scaling a large image to fit.

To solve this dilemma, I recently discovered the Zurb Interchange module. Since I already use Foundation as a base theme/framework, I thought I would...

Chromatic: Understanding and Using HSL in Your CSS

Planet Drupal -

Color! Without it, life can be pretty monotone, so I’m going to introduce you to the most awesome of ways you can represent it in your CSS: hue, saturation, and lightness.

"I use HEX and RGB all the time, what’s so great about HSL?"

HSL is easier to read, modify, and improvise with, and it’s supported back to IE9. To see why it’s awesome and how to become an HSL master, let’s take a look at how it works.

Here’s an example:

hsl(30, 75%, 50%);

  • Hue: The color itself, represented as a position in the 360 degrees of the HSL color wheel.

  • Saturation: This ranges from 0 to 100, with 0 completely desaturated and 100 the full saturation of your hue.

  • Lightness: This also ranges from 0 to 100, with 0 denoting black and 100 returning white.
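To get a feel for how these three components combine into a color, here is a quick sketch (an illustration, not from the original article) using Python's standard colorsys module, which implements the same math. Note that colorsys calls the model HLS (lightness before saturation) and expects each component as a 0–1 fraction:

```python
import colorsys

def hsl_to_rgb255(h, s, l):
    """Convert CSS-style hsl(h, s%, l%) to 0-255 RGB channel values."""
    # colorsys expects hue, lightness, and saturation as 0-1 fractions,
    # and its function is named hls_to_rgb (lightness comes second)
    r, g, b = colorsys.hls_to_rgb(h / 360, l / 100, s / 100)
    return tuple(round(c * 255) for c in (r, g, b))

print(hsl_to_rgb255(60, 75, 50))   # yellow: (223, 223, 32)
print(hsl_to_rgb255(30, 75, 50))   # orange: approximately (223, 128, 32)
print(hsl_to_rgb255(0, 0, 100))    # lightness 100 returns white: (255, 255, 255)
```

Rotating only the first argument while holding the other two fixed is exactly the hue manipulation described below.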

Here’s an HSL color wheel to get an understanding of how this behaves.

Want to change your orange to yellow? Just add another 30°.

Sure, you could do this with HEX and RGB, but if a request came down the line to add a little green to your color and make it 20% darker, which of the formats below would be easier to interpret and change?

HSL: hsl(60, 75%, 50%);
RGB: rgb(223, 223, 32);
HEX: #dfdf20;

With its simple manipulation, HSL also lets you create common color harmonies fast.

Want your color’s complementary color? No sweat: add 180° to the hue value. Is your hue greater than 180° already? HSL is smart enough to loop around the wheel once more.

$primary-color: hsl(30, 75%, 50%);
$complementary-color: hsl(210, 75%, 50%); // 30 + 180 = 210
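That wrap-around behavior is just modular arithmetic on the 360-degree wheel. A hypothetical helper (not part of the article) makes it explicit:

```python
def rotate_hue(hue, degrees):
    # hues live on a 360-degree wheel, so rotation wraps with modulo
    return (hue + degrees) % 360

print(rotate_hue(30, 180))   # 210: the complement of orange
print(rotate_hue(210, 180))  # 30: past 360, looping back around the wheel
```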

Here are some additional color schemes that are common in color theory:

Analogous:
$red: hsl(0, 75%, 50%);
$orange: hsl(30, 75%, 50%);
$yellow: hsl(60, 75%, 50%);

Triadic:
$orange: hsl(30, 75%, 50%);
$blue-green: hsl(150, 75%, 50%);
$purple: hsl(270, 75%, 50%);

Split-complementary:
$orange: hsl(30, 75%, 50%);
$cyan: hsl(180, 75%, 50%);
$blue: hsl(240, 75%, 50%);

If you use Sass, you may know that there are built-in functions that utilize HSL. If you’ve used adjust-hue(), saturate() or darken() for example, you’ve already employed HSL as these derive their values from HSL.

"Why should I use HSL if Sass can make these adjustments for me?"

Besides making values easier to read when you or someone else alters the colors of a project, it also lets you write cleaner code as you get more ambitious with your own color schemes.

As an example, let’s create our own pattern based on the analogous principle of color theory with HSL.

Protip: Color schemes tend to work best when the hue difference is wide, but saturation remains similar.

$hue: 40;
$saturation: 100;
$lightness: 70;

$second-color: hsl($hue - 25, $saturation - 20, $lightness - 10);
$third-color: hsl($hue - 15, $saturation - 10, $lightness - 10);
$primary-color: hsl($hue, $saturation, $lightness);
$fourth-color: hsl($hue + 15, $saturation - 15, $lightness);
$fifth-color: hsl($hue + 25, $saturation - 35, $lightness - 15);
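The same pattern works outside Sass, too. Here is a rough Python equivalent of the snippet above (illustrative only), returning each color as an (h, s, l) tuple:

```python
def color_scheme(hue, saturation=100, lightness=70):
    # mirrors the Sass variables: each color is an offset from the primary,
    # with the hue wrapped onto the 360-degree wheel
    return {
        "second":  ((hue - 25) % 360, saturation - 20, lightness - 10),
        "third":   ((hue - 15) % 360, saturation - 10, lightness - 10),
        "primary": (hue % 360, saturation, lightness),
        "fourth":  ((hue + 15) % 360, saturation - 15, lightness),
        "fifth":   ((hue + 25) % 360, saturation - 35, lightness - 15),
    }

print(color_scheme(40)["primary"])  # (40, 100, 70)
print(color_scheme(210)["second"])  # (185, 80, 60)
```

Changing the single hue argument regenerates the whole scheme, which is the point of the pattern.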

You can swap the hue to see how it looks with other colors:

$hue: 210;

Change one value, and you’ve created a different color system. Nice!

I hope you’re down with HSL and see the light of how awesome it is. Now go out, you MacGyver of color, and enjoy your new abilities with HSL!

Drupal Watchdog: JSON or XML

Planet Drupal -


Now that Drupal 8 has built-in support for Web Services, you’re likely thinking about exposing the content in your site with an API. But should you make the data available in JSON, XML, or both?

A Short History of XML and JSON

XML and JSON are the primary formats used for data exchange on the web. XML was born when some individuals involved in the Standard Generalized Markup Language (SGML) effort became early adopters of the Web. SGML is a way of defining languages for marking up documents, like HTML; XML borrowed many of the core principles and simplified the rest. The initial draft of XML was completed by a subcommittee of the W3C’s SGML Activity in 1996. Even in the early drafting process, it had support from many large technology companies.

In contrast, JSON (JavaScript Object Notation) is known for having been more discovered than invented: Douglas Crockford saw that language constructs already existing in JavaScript could be used to represent objects as strings. He coined the term JSON for this usage in 2001. It didn’t go through the standardization process, in part because it is a proper subset of the JavaScript standard. When Crockford was told by clients that they couldn’t use JSON because it wasn’t a standard, he bought json.org and put up a Web page declaring it a standard. JSON slowly gained popularity as people discovered the page. Since then, it has become an official standard, and support for encoding to and decoding from JSON has been added to many languages.

The choice between these two has been a topic of debate for nearly a decade.

Why is JSON Better?

JSON is lightweight. It often takes fewer characters to transmit the same information. For example, compare the following data in XML with the same data in JSON.

<foo>text goes here</foo>
<bar>and here</bar>
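The excerpt ends before showing the JSON side of the comparison, but the equivalent object would look something like the snippet below. This small Python check (our reconstruction, not the author's) compares the two encodings using the standard json module:

```python
import json

data = {"foo": "text goes here", "bar": "and here"}

# the XML from the example above; each field pays for an open and close tag
xml_text = "<foo>text goes here</foo><bar>and here</bar>"
json_text = json.dumps(data, separators=(",", ":"))  # compact JSON encoding

print(json_text)                            # {"foo":"text goes here","bar":"and here"}
print(len(json_text), "vs", len(xml_text))  # 41 vs 44 characters
```

Even on this tiny document JSON comes out shorter, and the gap grows as the number of fields (and hence closing tags) increases.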


Mediacurrent: Mediacurrent Dropcast: Episode 6

Planet Drupal -

In this episode, we have a special guest, Mickey Williamson, who talks about the importance of Web Accessibility. We also talk about developing a RESTful todo application with Backbone.js and, as always, Drupal 8 updates and other Drupal news. If you would like to be a guest or have any questions, email us at dropcast@mediacurrent.com.


Subscribe to Cruiskeen Consulting LLC aggregator