Let’s hear your ideas, thoughts, feedback on core
DrupalCon offers a Core Conversation track where people present on topics ranging from what’s currently in core and how we can improve it, to what should go into future releases, and sometimes what is not going so well in the core world.
Drupal Code Standards: Twig in Drupal 8
This is the seventh post in a series about coding standards. In this post we’ll talk about how to adhere to standards while implementing Twig templating in Drupal 8.
Drush SQL Sync Alternative: SQL Sync Pipe
Using Drush to sync databases (drush sql-sync) is a valuable tool, but it is not always an efficient choice when dealing with large databases (think over 1GB).
Create a Custom Views Sort Plugin with Drupal 8
Tap into the power of Views with a custom sort plugin in Drupal 8. Code samples included.
The Road to Speaking at DrupalCon
Earlier this year I was fortunate to speak at DrupalCon New Orleans. I'd been working towards speaking at DrupalCon for a few years and it wasn’t until after I spoke that I reflected on just how much effort went into it. I had underestimated the process. Because I’m likely not alone in doing so, I’d like to share what I have learned along the way.
Drupal.org's Composer endpoints are out of beta

Drupal.org's Composer endpoints have been available in beta for some time now, and in that time we've begun to see many, many people use Composer to manage Drupal modules and themes. We first launched these repositories before DrupalCon New Orleans as an alpha release, and moved into beta a few months later. After receiving your feedback and bug reports we've made updates, and are ready to call this service stable.
What is Composer?
"Composer is a tool for dependency management in PHP. It allows you to declare the libraries your project depends on and it will manage (install/update) them for you. … Composer is strongly inspired by node's npm and ruby's bundler." - Source
In a nutshell, Composer allows you to declare the dependencies of your project in a composer.json file in the root of your PHP project. Those dependencies, which you then install through Composer, can have their own composer.json files and their own dependencies—all of which will be automatically managed and installed by Composer. When you need repeatable installs of exact dependency versions, Composer records the resolved versions in a composer.lock file.
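As a sketch, a minimal composer.json for a PHP project might look like this (the package name and constraint are illustrative):

```
{
  "name": "example/my-project",
  "require": {
    "monolog/monolog": "^1.21"
  }
}
```

Running composer install then resolves and downloads the dependencies, and writes the exact versions it picked into composer.lock.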
You can read more about Composer at GetComposer.org.
How do Drupal.org's Composer repositories work?
Drupal.org offers two Composer repositories—one for Drupal 7, and one for Drupal 8. Composer requires that packages adhere to semantic versioning, which Drupal 8 core does, but Drupal 8 contrib, and Drupal 7 core and contrib, don’t. To solve this problem, we've created a Composer façade, which takes all of the metadata about projects on Drupal.org and translates it into a format Composer can understand—including translating the Drupal-specific versioning for Drupal 7 and for contrib projects into semantic versioning.
By creating this façade, we've made sure that Drupal.org is still the canonical source for metadata about Drupal.org projects, and that we can update this translation layer as the versioning schema changes. (Learn more about the effort to move Contrib projects to semantic versioning).
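As a concrete illustration, the façade exposes Drupal-style release numbers under translated semantic versions along these lines (the release names below show the pattern and are not specific tags):

```
8.x-1.9        →  1.9.0
7.x-3.14       →  3.14.0
8.x-1.0-beta1  →  1.0.0-beta1
```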
In addition to providing endpoints for building projects, Drupal's automated testing suite—DrupalCI—now uses Composer to test Drupal core and contributed projects. This allows developers to test projects that have external dependencies.
How do I use Drupal.org's Composer repositories?
To begin using Drupal.org's Composer repositories, you'll need to update your composer.json file to include the appropriate Composer repository for your version of Drupal. For Drupal 7, use the repository URL https://packages.drupal.org/7; for Drupal 8, use https://packages.drupal.org/8, as in this example.
After installing Composer, simply run the command:
$ composer config repositories.drupal composer https://packages.drupal.org/8
And your project's composer.json should be updated to look like the following:
{
  "repositories": {
    "drupal": {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  }
}
Once you've made that change, you should be able to use Composer for Drupal modules and themes as you would for any other PHP package, using the drupal/ namespace:
$ composer require drupal/<modulename>
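If you need to pin a release line, you can add a semantic version constraint to the same command (the module name remains a placeholder here; the constraint is matched against the façade's translated versions):

```
$ composer require drupal/<modulename>:^1.0
```

Composer resolves the constraint against the versions the façade publishes and records the result in composer.lock.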
There is one caveat about the pattern: there are some namespace collisions among modules, and so it is on our roadmap to update Drupal.org project pages to specify the exact namespace to use to require a given project.
To learn more about how to use Drupal.org's Composer repositories, and for some troubleshooting tips, read the Project Composer documentation.
What about licensing?
All the projects hosted on Drupal.org are licensed GPLv2 or later, or have an entry in the packaging whitelist. This means that you can rely on Drupal Core and contributed modules and themes being licensed under the GPL or a compatible license. And if you need to redistribute code created with Drupal projects, it must be redistributed as GPLv2 or later.
However, because Composer is a tool that can manage packages in the wider PHP ecosystem, you might find that you want to require a non-GPL package in your project. Using GPL-licensed Drupal projects with external packages that are GPL compatible is fine. Just be aware that if you redistribute that code, you will have to redistribute under a GPL license.
We cannot provide legal advice for your use of open source software. If you use Composer to install packages that are not compatible with the GPL alongside GPL-licensed projects like Drupal, you may use that software together, but per the terms of the GPL you may not copy, distribute, or modify that software.
"Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted…" GPL 2.0 Section 0.
For more questions about Drupal and the GPL you can read the Licensing FAQ.
What's next?At this point, the Drupal.org Composer service is stable and you can use it to manage modules and themes in your production websites. That said, we do have a roadmap of additional features that we'd like to add. And your contributions are welcome!
As development on Drupal.org's Composer service continues, we want to focus on the following features:
- Supporting Composer-based workflows for distributions and install profiles
- Providing sub-tree splits of Drupal Core
- Updating project pages to provide information about using Composer with any given Drupal.org hosted project
- Adding features to the updates service, to collect statistics about projects installed with Composer, and to explore providing update alerts about external dependencies
We also hope to work with core maintainers to add the Drupal.org Composer repositories to Drupal Core's composer.json file.
If you're interested in learning more about our roadmap for Composer, or contributing to this service on Drupal.org, you can learn more in the Composer plan issue.
How you can help
If you’re interested in helping to improve Drupal.org's support for Composer workflows, please take a look at the issue above, find us on irc in #drupal-infrastructure, or send us a volunteer proposal.
Thanks to our Community Initiative contributors
We'd like to thank the individuals who worked with us as part of this Community Initiative.
In particular, we'd like to thank:
- seldaek, the creator of Composer and Packagist.org
- webflo, the creator and maintainer of http://packagist.drupal-composer.org
- timmillwood
- dixon_
- badjava
- cweagans
- tstoeckler
We'd also like to thank Appnovation, who sponsored the initial development of Drupal.org's Composer endpoints.
To these volunteers and sponsoring organizations—it is your expertise, your insight, and your affirmation of our work that make these Community Initiatives successful. Thank you!
Decoupled Drupal with JSON API and Ember: Consuming Drupal with Ember Adapters and Models

Among the most crucial steps in architecting decoupled Drupal-backed applications is to bridge the gap between Drupal and the designated front end so that the latter can receive and manipulate data on the Drupal data repository via API calls. For some frameworks, this can be a rather tedious exercise in navigating the server-side APIs and crafting the correct requests on the client side. Luckily, with JSON API now proposed as a core experimental module for Drupal 8, the tightrope walk between Drupal and Ember is about to become more of a cinch.
Drupal 8: twig_xdebug module tutorial
Upgrading Drupal VM in a BLT-powered project
Limiting the amount of surprises you get when developing a large-scale Drupal project is always a good thing. And to that end, Acquia's BLT (Build and Launch Tools) wisely chooses to leave Drupal VM alone when updating BLT itself. Updates to Drupal VM can and should be done independently of install profile and development and deployment tooling.

But this creates a conundrum: how do you upgrade Drupal VM within a project that uses BLT and has Drupal VM as one of its Composer dependencies? It's actually quite simple—and since I just did it for one of my projects, I thought I'd document the process here for future reference:
Provisionally approved coding standards proposals December 20, 2016
The TWG coding standards committee has provisionally approved 3 coding standards change proposals. These will need to be finally approved and implemented by core before being fully ratified.
The process for proposing and ratifying changes is documented on the coding standards project page. Please note, a change to this process is being proposed to streamline the interaction between the coding standards body, Drupal Core, and the Coder project; please provide any feedback on that issue.
Provisionally approved proposals awaiting core approval and implementation:
- Adopt airbnb javascript coding standards
- Set a standard for @var inline variable type declarations
- PHP 5.4 short array syntax coding standards
Interested in helping out?
You can get started quickly by helping us update an issue summary or two, or dive in and check out the full list of open proposals to see if there's anything you'd like to champion.
These proposals will be re-evaluated during the next coding standards meeting currently scheduled for December 20th. At that point the discussion may be extended, or if clear consensus has been reached one or more policies may be dismissed or ratified and moved to the next step in the process.
Adding a class to default images in Drupal 8
As with many sites, images are an important part of the design for my art gallery listings site, The Gallery Guide. The tricky part is that the site is full of user-generated content, so not all of the listings have images attached.
Where there isn't an image, we use a placeholder - partly for our 'tiles' design pattern, but also because the image is an integral part of the design for an individual listing on large screens. On a small screen, this placeholder image takes up a lot of space, and doesn't add much, so I wanted to hide it at a certain breakpoint.
It took me a while to get my head around how to get into the right place to be able to do this in Drupal 8, following my old standby of sprinkling my code with dpm statements (replaced in D8 with kint). I should really have cracked out XDebug, but it does slow things down quite a bit, and I was too lazy to uncomment a few lines in php.ini. In this case it would have definitely made sense, because the kint output was so large that it slowed my browser down to a crawl - probably because I hadn't previously read this DrupalEasy article.
Having looked at the variables in scope inside template_preprocess_field, I saw that the relevant object was an instance of the class FileFieldItemList, which extends EntityReferenceFieldItemList. This is a good example of where a good IDE like PHPStorm can really help - having found the class, I could quickly navigate to its parent, and see its methods - in this case the relevant one was referencedEntities():
/**
 * Implements hook_preprocess_field().
 */
function mytheme_preprocess_field(&$variables, $hook) {
  switch ($variables['element']['#field_name']) {
    // Machine name of the field.
    case 'field_image':
      // If this is the default image, add a class.
      $images = $variables['element']['#items']->referencedEntities();
      if (empty($images)) {
        $variables['attributes']['class'][] = 'image__default';
      }
      break;
  }
}
Once that class has been added, we can apply CSS to it - in my case it's in a SASS mixin that gets applied to a few different elements:
@mixin image-main {
  &.image__default img {
    @include breakpoint($mobile-only) {
      display: none;
    }
  }
}
Here's an example on the live site of a listing with no image uploaded, and by comparison, a listing with an image - as you can see, the design wouldn't work on large screens if the placeholder image wasn't there, but on small screens the placeholder just takes up space without giving the user anything worth looking at.
The solution isn't perfect, because the browser will still download the image, even if it's set to display: none, as Tim Kadlec wrote about a while ago. But it'll do for a side project...
Multiple MailChimp Accounts with Drupal
A couple of months ago, team ThinkShout quietly introduced a feature to the MailChimp module that some of us have really wanted for a long time—the ability to support multiple MailChimp accounts from a single Drupal installation. This happened, in part, after I reached out to them on behalf of the stakeholders at Cornell University's ILR School, where I work.
aaron Tue, 12/20/2016 - 11:33
Migrating Content References in Drupal 8
An Introduction to Stubs
A common need in our projects is the ability to migrate data that references other content. This can be in the form of taxonomy hierarchy (i.e. parent > child relationship) or content that is attached such as images, videos, or other nodes that exist as standalone entities in the system.
Adding Performance Metrics to Your Behat Test Runs
Adding behavioural testing to your project can do wonders for catching regressions, and even if you don't start out your development writing tests in true Behaviour-Driven Development style, you'll greatly benefit from having this type of test coverage.
Within my team, we use an external service to run the tests automatically when pull requests are made to our development branch. The results that come back are immensely helpful when looking over a pull request and no code is merged into the codebase until the test run is green. We have caught several bugs and regressions using this method, but we'd like to add more test coverage to better catch future regressions.
Additional Feedback
While developing and finessing the testing stack, we started thinking about what other feedback we could gain from our test runs. After all, we build a whole site to test out site creation from our profile and then make requests to a bunch of pages on the built test site.
Why not try and get some performance metrics from the test run? You might immediately be concerned that adding code and database queries to a test run might negatively impact the test run, but we haven't seen an increase in the time of the test runs or any failing tests related to added performance-related code. You might also question turning on modules that won't be on in production while you're trying to test sites as if they are in production. That concern is a very legitimate one, but in my experience, you'll run into something you have to change for setup on a hosted test runner service. We weren't concerned with either potential issue.
Drupal Modules
After we had come up with the idea of adding performance metrics to our test runs and weighed the potential drawbacks, we needed to write some code to complete our fantastic and awesome idea. Luckily in the Drupal world, there's usually a module for that.
We decided to use the Performance Logging and Monitoring module for our use case. The module "provides performance statistics logging for a site, such as page generation times, and memory usage, for each page load." That description is exactly what we were looking for. Contrib for the win!
Module Setup
Once you enable the module, you'll first have to grant your role the permission to administer performance logging before you can do anything. Not having this permission on my developer role threw me for a loop for a couple of minutes, so don't let it get you too!
The module's configuration page lies under the Development subsection of the Admin Configuration menu. You can choose from summary, detailed, and/or query logging as well as exclude certain paths from logging and set an access threshold for displaying only the most accessed pages in your summary report.
We have a test setup where we change a few settings from how our production sites are configured. In a testing core module, we list the performance module as a dependency and set up some initial performance logging variables for the test run.
// Setting performance logging variables.
variable_set('performance_detail', 1);
variable_set('performance_nodrush', 1);
variable_set('performance_threshold_accesses', '2');
variable_set('performance_query', 1);
variable_set('performance_summary', 1);
Performance Logging...Caused Performance Problems
Initially, I thought I had picked a good configuration for what we wanted to get out of the report. The problem was that our test runs were no longer passing on the feature branch I had put the code in. The tests were erroring on memory exhaustion when trying to save nodes or beans.
We started to think something we added to the codebase had caused a regression and that this coding error might eat up all the memory on our production servers. Needless to say, we focused a lot on our memory exhaustion error.
I had a face palm moment when I realised that the database query logging was causing the issue. It was even written on the configuration page, "Enabling this will incurr some memory overhead as query times and the actual query strings are cached in memory as arrays for each page, hence skewing the overall page memory reported." But I didn't notice the memory warning while initially setting up the module.
// Setting performance logging variables.
variable_set('performance_detail', 1);
variable_set('performance_nodrush', 1);
variable_set('performance_threshold_accesses', '2');
// Don't check for summary detail since not using in report.
// variable_set('performance_summary', 1);
// Don't add query logging since it adds memory overhead.
// variable_set('performance_query', 1);
Our variable sets on module install were modified to reflect the code above. Unfortunately, we had to axe the query logging due to our memory issue, and we also disabled the performance logging summary, since that table is used to create a display for the module UI and we were creating our own report and display.
Adding the Performance Logging to Our Travis CI Test Runs
Now that we could see that the module logged the stats we wanted, and knew which configuration would log performance data while still allowing the test runs to pass, adding the logging setup to our Travis CI script was fairly straightforward.
- drush si express express_profile_configure_form.express_core_version=cu_testing_core
We build our sites using a custom distribution profile, and to modify how our Drupal site is installed for the test run, we added a step in the Drupal installation phase for choosing which core module you want to use for site creation. We enable the testing core module to set up some variables just for the Travis test runs.
// Turn on error reporting only for serious errors.
// Warnings were causing dumb exceptions in Behat and the messages don't
// interfere with the tests.
error_reporting(E_ERROR | E_PARSE);
One important thing to note is that PHP warnings and notices end up causing Behat test runs on Travis CI to fail, exiting with a non-zero error code. Because we know we have a lot of warnings and notices (hey, nobody's perfect), we decided to turn off reporting and only enable it for the most serious of PHP errors. Other than that, we mainly turned off things like Varnish and Memcache, which are hard and resource-intensive to test out on Travis.
Displaying the Results at the End of the Test Run
Instead of doing some fancy posting of our test results to an external URL, we decided to just print the logging after our verbose Behat test run output.
# Run Behat tests.
- ./bin/behat --config behat.yml --verbose
# Output performance logging data.
- drush scr profiles/express/tests/travis-ci/log-express-performance.php
The build simply installs a site, runs the Behat test suite, and then prints our logging results at the end of the Travis test run log. We decided to print out four data points for our display output.
print_r('Path: ' . $path['path'] . "\n");
print_r('Accessed: ' . $count . "\n");
print_r('Memory Consumption: ' . $memory_average . "MB\n");
print_r('Load Time: ' . $load_average . " Milliseconds\n");
We limited our report to the top 15 pages by the number of requests. Our top three pages were the front page, the user dashboard, and the "node/add" form.
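The log-express-performance.php script itself isn't shown in this post; conceptually it reduces the per-request log entries to the four data points printed above. Here is a rough sketch of that aggregation, with hypothetical input rows standing in for the performance module's logged data (the field names and values below are illustrative only):

```
<?php

// Hypothetical per-request log rows; the real script reads what the
// performance module recorded during the Behat run.
$rows = [
  ['path' => 'node/1', 'bytes' => 52428800, 'ms' => 412],
  ['path' => 'node/1', 'bytes' => 50331648, 'ms' => 398],
  ['path' => 'user', 'bytes' => 73400320, 'ms' => 655],
];

// Group the rows by path.
$by_path = [];
foreach ($rows as $row) {
  $by_path[$row['path']][] = $row;
}

// Print the same four data points per path as in the report output.
foreach ($by_path as $path => $hits) {
  $count = count($hits);
  $memory_average = round(array_sum(array_column($hits, 'bytes')) / $count / 1048576, 2);
  $load_average = round(array_sum(array_column($hits, 'ms')) / $count);
  print_r('Path: ' . $path . "\n");
  print_r('Accessed: ' . $count . "\n");
  print_r('Memory Consumption: ' . $memory_average . "MB\n");
  print_r('Load Time: ' . $load_average . " Milliseconds\n");
}
```

Sorting by access count and slicing the top 15 paths then gives the report described above.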
That about wraps it up. We do intend to add more data to our report and gain more insight into code performance by doing memory-intensive things and adding Behat tests solely for performance logging issues. Visiting the Features page and reverting features are good examples of performance tests we could add.
Also, while having the performance metrics displayed at the end of the test run report is nice, you can't really get a quick sense of trends in the data unless you manually add up all the reports. We use the ELK stack for displaying other logs, and we plan to POST data back to that instance for better trend monitoring in the future.
I highly encourage you to try getting more feedback data from your automated test runs. While it may impact your test runs and add memory overhead, you can always use the reports for marking general trends over time, even if the results aren't perfectly accurate.
Fixing Drupal 8 Xampp cURL error 60: SSL certificate
Recently, we wrote a guide on using Xampp with Drupal 8 for local development.
After reading the guide, one of our users asked,
How do you fix the cURL 60 SSL error that I keep getting?
The cURL SSL 60 error occurs when you're trying to install a Drupal module by copying the FTP link from drupal.org into the "Add modules" screen.

This error is caused by the default SSL certificate provided by Xampp.
Commands I use when creating a patch for drupal.org
I have honed a selection of commands that I regularly use in the creation and application of patches. Here is a run-down of my most useful commands.
War Child: new MMP site in Drupal launched
War Child UK describes itself as “a small charity that protects children living in some of the world's most dangerous war zones”. Founded in 1993, the charity works with vulnerable children and communities in war torn areas; providing safe spaces, access to education, skills training and much more to ensure that children’s lives are not torn apart by war.
War Child International has multiple offices all over the world, protecting, educating, and empowering marginalised children.
This week in new features - Build-time variables
As we wrote about recently, all of the inputs to an application running on Platform.sh are clearly defined, predictable, and with one exception entirely under your control as a user. We’re happy to announce that we’ve added one more input, and it’s also under user control: Project-level variables available at build time.
Building Views Query Plugins for Drupal 8, Part 1
Three years ago Greg Dunlap wrote a series of articles about building Views query plugins in Drupal 7. A lot has changed since then. Drupal 8 has been out for a year now and Views is in core. I recently had an opportunity to write a Views query plugin for a Drupal 8 project, and it worked out surprisingly well. So, how has the process of implementing a Views query plugin changed? As in the original series, we’ll take a look at how you build one from scratch.
The original article series used Flickr as an example of a remote service you could expose to Views via a query plugin. The Flickr API module that was used does not have a Drupal 8 port, so I needed to port something for the series. We’re big Fitbit users at Lullabot, myself included, so I thought it might be fun to use the Fitbit API as an example of what can be done with Views query plugins. We’ll use the Fitbit API, exposed via a Views query plugin, to build our own custom leader board for our Drupal site.
What’s a Views query plugin?
Since Views 3 in Drupal 7, you could write your own plugin to replace Views’ built-in SQL-query engine. This means that you can make Views query against any kind of data source, using the same Views UI your site administrators are accustomed to. The most common use case is to create views that query a remote web service.
The big picture of how you write a Views query plugin hasn’t changed much. You’ll still go through the same steps as in Drupal 7, so we’ll follow the original article series and divide it into three parts:
- Planning and modeling your data
- Creating a basic query plugin
- Exposing configuration options and handling arguments and filters
Probably the biggest change in writing Views query plugins in Drupal 8 is the use of Drupal 8’s plugin system. It’s helpful to have a general understanding of Drupal 8 plugins as we’ll skip over some of those details. The Drupalize.me Drupal 8 Module Development Guide is an excellent source for general information about Drupal 8 plugins.
Let’s do it!
Modeling your plugin data
One of the first things you need to do before coding your plugin is sit down and think about how the data being returned from your API maps to the data that Views expects. There are a lot of moving parts when writing a Views query plugin, so it’s good to resist the temptation to dive right into code. Let's first figure out the nature of the remote API and what we need to do to transform the data into a Views-friendly format.
Views is designed to represent tabular data, the basis of which is a row of fields. Many API endpoints do not follow this model. For example, a single request may contain a lot of nested data. The query plugin I wrote on a recent project wrapped a Search endpoint that returned faculty member search results. The results contained arrays of research interests and degrees. It was easy enough to flatten these arrays into comma separated lists and present to Views a single field, but it’s not difficult to imagine more complex data. Perhaps the results should include details about the Department the faculty member belongs to. Complex nested data may necessitate a Views relationship plugin.
The Fitbit API exposes a number of endpoints. We’ll first home in on the User endpoint. It’s pretty straightforward in that it returns nearly tabular data: issue a request and you get back a list of key-value pairs. It’s also got just enough data in the response to put together a simple leader board based on users’ average daily steps. Here is an example of what it returns (clipped for brevity):
{
  "user": {
    "age": 32,
    "avatar": "https://d6y8zfzc2qfsl.cloudfront.net/B8395E1F-346C-1E31-E74C-0AE2512A38BD_profile_100_square.jpg",
    "avatar150": "https://d6y8zfzc2qfsl.cloudfront.net/B8395E1F-346C-1E31-E74C-0AE2512A38BD_profile_150_square.jpg",
    "averageDailySteps": 7334,
    "displayName": "Matthew O.",
    "topBadges": [
      {
        "image100px": "https://static0.fitbit.com/images/badges_new/100px/badge_daily_steps30k.png",
        "name": "Trail Shoe (30,000 steps in a day)"
      },
      {
        "image100px": "https://static0.fitbit.com/images/badges_new/100px/badge_lifetime_miles1997.png",
        "name": "India (3,213 lifetime kilometers)"
      }
    ]
  }
}
There are a few values with nested data, in particular topBadges, but we’ll see that it’s not that difficult to manage these. From a Views standpoint, we want to expose each of these pieces of data as a field.
There is a wrinkle when it comes to the Fitbit API and our ability to translate it for Views; perhaps you’ve already noticed it. Each response contains only a single user. The API does not allow multiple users’ data to be returned by a single request. This is because the data can be very personal; values like gender, height, and weight are included in some responses, but only with the individual user's explicit permission. Fitbit’s API follows the OAuth 2.0 Authorization Code Grant. Under this scheme, your application, in this case our Drupal module, has to be registered with Fitbit. Then, in order to query the API on behalf of users, each user must grant access to the application. When a user grants access, they have the option of selectively granting scope: for example, they can choose to share activity data, such as steps, distance, calories burned, and active minutes, but choose to omit profile data, which includes values like height and weight. Authentication is outside the scope of this article, but take a look at the Fitbit base module if you’re interested.
What does this mean for our Views implementation? Well, unlike a SQL-query backend, which can return multiple rows for a single request, we’ll need to write our Views query plugin so that it loops over all of our authenticated users and makes a single API request per user to aggregate the “table” we give back to Views. This kind of complexity is the trickiest part of writing a Views query plugin. Remember, Views deals in tabular data, so you have to analyze your remote data source and, anywhere it doesn’t fit the bill, translate it for Views.
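Concretely, that per-user loop ends up in the query plugin's execute() method. Here is a simplified sketch; the Fitbit client service and its method names are assumptions for illustration, not the real module's API:

```
/**
 * Sketch: aggregate one API request per authenticated user into Views rows.
 */
public function execute(ViewExecutable $view) {
  $index = 0;
  foreach ($this->fitbitClient->getAuthenticatedUserIds() as $uid) {
    // One request per user: the Fitbit API returns a single user's data.
    $data = $this->fitbitClient->getUserProfile($uid);
    // Each aggregated row becomes one Views result row.
    $view->result[] = new ResultRow([
      'display_name' => $data['user']['displayName'],
      'average_daily_steps' => $data['user']['averageDailySteps'],
      'index' => $index++,
    ]);
  }
}
```

The key point is that the "table" Views renders is assembled in memory from many requests, rather than arriving in one query result.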
Data spread across more than one API request is another common hang-up. Here too you have to aggregate the data and then translate it into a tabular format. There are a couple of ways to do this: you can either hit each endpoint in turn and present the aggregate fields to Views as a single “table,” or you can implement a Views relationship plugin. By implementing a Views relationship plugin, you defer the decision of which API endpoints are hit to the site administrator.
Take our Fitbit module for example. The Fitbit API has an endpoint to retrieve user profile data, and a separate endpoint to retrieve daily activity summary data. To surface data from both endpoints to Views, we could just hit both endpoints all the time and give the aggregate result back to Views; however, that could be wasteful. The site administrator may have no interest in daily activity summary data. Instead, we can use Views relationships to give the site administrator the option to opt in to daily activity summary data. When the relationship is present, our Views query plugin will know to hit both endpoints.
Other considerations
If the number of authenticated Fitbit users on our site grew, we’d reach a point where performance and rate limiting would become a concern, and we’d probably want to investigate a caching strategy to reduce the number of round trips. Another concern I had at the beginning of the project was that I preferred not to interact directly with the API, but to instead use a module or library that abstracted a lot of that work away for me. In particular, I thought it would be nice not to have to write the Fitbit authentication code myself.
There wasn’t any module available for Drupal 8, but I did find a Composer library that I could use to do a lot of that work for me. The OAuth 2.0 Client Library together with the Fitbit provider provided a great stepping stone. So in true Drupal 8 fashion, I stepped off the island and harnessed the power of these external dependencies to take care of the heavy lifting around authentication. The author of the Fitbit provider was even open to pull requests, making the experience much more fun.
Conclusion
When writing any piece of sufficiently complex code, taking the time to think about your problem and the best way to solve it will pay dividends down the road. Writing a set of Views plugins is no exception. You need to think about the data you have, the data Views expects, and how to deal with the complications that arise when the two don’t fit together perfectly. If you’re designing your own system from scratch, you have the luxury of building the APIs 'just so' to fit your desired use case. Sadly, life is rarely so neat. Resist the temptation to dive straight into code, and first figure out what it is you need to build.
In part 2 of this series, we’ll go through the steps of building our plugin, ending up with the simple use case of having a Fitbit leaderboard. Until next time!