THE BRIEF
Never Stopping, Always Evolving
Leica Geosystems was founded on cutting-edge technology and continues to push the envelope with revolutionary products. Founded by Heinrich Wild, the company made its first rangefinder in 1921. Fast forward to the 21st century, and Leica Geosystems is the leading manufacturer of precision laser technology used for measurements in architecture, construction, historic preservation, and DIY home remodeling projects.
Oomph and Leica collaborated on an initial project in 2014 and have completed multiple projects since. We transitioned the site into a brand new codebase with Drupal 8. With this conversion, Oomph smoothed out the Leica team’s pain points related to a multisite architecture. We created a tightly integrated single site that can still serve multiple countries, languages, and currencies.
THE CHALLENGE
Feeling the Pain-points with Multisite
Leica’s e-commerce store is active in multiple countries and languages. Managing content in a Drupal multisite environment meant managing a separate site for each country, so product, content, and price changes were difficult. Oomph’s challenge was to make content and product management easier for the Leica team while also supporting the ability to create new country sites on demand. Leica’s new e-commerce site needed to support:
MULTIPLE COUNTRIES AND A GLOBAL OPTION
SIX LANGUAGES
MANY 3RD-PARTY INTEGRATIONS
The pain points of the previous Multisite architecture were that each country was a silo:
- No Single Sign On (SSO): Multiple admin log-ins to remember
- Repetitive updates: Running Drupal’s update script on every site and testing was a lengthy process
- Multiple stores: Multiple product lists, product features, and prices
- Multiple sites to translate: Each site was sent individually to be translated into one language
THE APPROACH
Creating a Singularity with Drupal 8, Domain Access, & Drupal Commerce
A move to Drupal 8 in combination with some smart choices in module support and customization simplified many aspects of the Leica team’s workflow, including:
- Configuration management: Drupal 8’s introduction of configuration management in core means that point-and-click admin configuration can be exported from one environment and imported into another, syncing multiple environments and saving configuration in our code repository
- One Database to Rule Them All: Admins have a single site to log into and do their work, and developers have one site to update, patch, and configure
- One Commerce Install, Multiple Stores: There is one Drupal Commerce 2.x install with multiple stores sharing one set of products. Each product can be assigned to multiple stores, and per-country price lists control product pricing
- One Page in Multiple Countries and Multiple Languages: The new single-site model gives a piece of content one place to live. Authors control which countries the content is available in, and the same content is translated into all available languages just once.
- Future-proof: With a smooth upgrade path to Drupal 9 in 2020, the Drupal 8 site gives Leica more longevity in the Drupal ecosystem
LEARN VS. SHOP
Supporting Visitor Intention with Two Different Modes
While the technical challenges were being worked out, the user experience and design had to reflect a cutting-edge company. With the launch of their revolutionary product, the BLK 360, in 2018, Leica positioned itself as the Apple of the geospatial measurement community — sleek, cool, cutting-edge and easy to use. While many companies want to look as good as Apple, few of them actually have the content and product to back it up.
The navigation for the site went through many rounds of feedback and testing before we decided on something radically simple — Learn or Shop. A customer on the website is either in an exploratory state of mind — browsing, comparing, reviewing pricing and specifications — or they are ready to buy. We made it very clear which part of the website was for which.
This allowed us to talk directly to the customer in two very different ways. On the Learn side, the pages educate and convince. They give the customer information about the product, reviews, articles, sample data files, and the like. The content is big, sleek, and leverages video and other embedded content, like VR, to educate.
On the Shop side, the pages are unapologetically transactional: they give the visitor the right information to support a purchase, clearly delivering specs and options like software and warranties without any marketing. We could assume the customer was there to purchase, not to be convinced, so the page content could concentrate on order completion. The entire checkout process was simplified as much as possible to reduce friction. We studied the buying habits and patterns of Leica’s user base over the past few site iterations to inform our choices about where to simplify and where to offer options.
THE RESULTS
More Nimble Together
The willingness of the Drupal community to support the needs of this project cannot be overlooked, either. Oomph has been able to leverage our team’s commitment to open source contributions to get other developers to add features to the modules they support. Without the give and take of the community and our commitment to give back, many modifications and customizations for this project would have been much more difficult. The team at Centarro, maintainers of the Commerce module, were fantastic to work with and we thank them.
We look forward to continuing to support Leica Geosystems and their product line worldwide. With a smooth upgrade path to Drupal 9 in 2020, the site is ready for the next big upgrade.
THE BRIEF
The Virtual Lab School (VLS) supports military educators with training and enrichment around educational practices for children from birth through age 12. Their curriculum was developed through a partnership between Ohio State University and the U.S. Department of Defense to assist direct-care providers, curriculum specialists, management personnel, and home-based care providers. Because educators are distributed around the world, courses and certifications are offered virtually through the VLS website.
Comprehensive Platform Assessment
The existing online learning platform had a deep level of complexity under the surface. For a student educator taking a certification course, the site tracks progress through the curriculum. Training leaders need to see how their students are progressing, assign additional coursework, and assist a student educator through a particular certification.
Learning platforms in general are complex, and this one is no different. Add to this an intertwined set of military-style administration privileges and it produces a complex tree of layers and permutations.
The focus of the platform assessment phase was to catalog features of the largely undocumented legacy system, uncover complexity that could be simplified, and most importantly identify opportunities for efficiencies.
THE RESULTS
Personalized Online Learning Experience
Enrollment and Administration Portal
Administrators and instructors leverage an enrollment portal to manage the onboarding of new students and view progress on coursework and certifications.
Course Material Delivery
Students experience the course material through a combination of reading, video, and offline coursework downloads for completion and submission.
Learning Assessments & Grading
Students are tested with online assessments, where grading and suggestions are delivered in real time, and through offline assignments submitted for review by instructors.
Progress Pathways
A personalized student dashboard is the window into progress, allowing students to see which courses have been started, how much is left to complete, and the status of their certifications.
Certification
Completed coursework and assessments lead students to a point of certification resulting in a printable Certificate of Completion.
FINAL THOUGHTS
Faster and More Secure than Ever Before
When building for speed and scalability, fully leveraging Drupal’s advanced caching system is a major way to support those goals. The system design leverages query- and render-caching to support a high level of performance while also supporting personalization to an individual level. This is accomplished with computed fields and auto-placeholdering utilizing lazy builder.
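As a rough illustration of the lazy builder piece, here is a minimal sketch; the module, service, and callback names are hypothetical, not the production code:
<?php

namespace Drupal\my_module;

use Drupal\Core\Security\TrustedCallbackInterface;

/**
 * Lazy builder for the per-user fragment of an otherwise cacheable page.
 *
 * Registered as a service (say, 'my_module.greeting_builder') and wired
 * into a render array like so:
 *
 * @code
 * $build['greeting'] = [
 *   '#lazy_builder' => ['my_module.greeting_builder:build', []],
 *   '#create_placeholder' => TRUE,
 * ];
 * @endcode
 */
class GreetingBuilder implements TrustedCallbackInterface {

  public static function trustedCallbacks() {
    return ['build'];
  }

  public function build() {
    return [
      '#markup' => t('Welcome back, @name', [
        '@name' => \Drupal::currentUser()->getDisplayName(),
      ]),
      // Only this fragment varies per user; the page shell around it can
      // be served straight from the render cache.
      '#cache' => ['contexts' => ['user']],
    ];
  }

}
PHP
With auto-placeholdering, Drupal serves the cached page shell to everyone and fills in the placeholder per user.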
The result is an application that is quicker to load, more secure, and able to support hundreds more concurrent users.
Is your digital platform still on Drupal 7? By now, hopefully, you’ve heard that this revered content management system is approaching its end of life. In November 2023, official Drupal 7 support from the Drupal community will end, including bug fixes, critical security updates, and other enhancements.
If you’re not already planning for a transition from Drupal 7, it’s time to start.
With nearly 12 million websites currently hacked or infected, Drupal 7’s end of life carries significant security implications for your platform. In addition, any plug-ins or modules that power your site won’t be supported, and site maintenance will depend entirely on your internal resources.
Let’s take a brief look at your options for transitioning off Drupal 7, along with five crucial planning considerations.
What Are Your Options?
In a nutshell, you have two choices: upgrade to Drupal 9, or migrate to a completely new CMS.
With Drupal 9’s advanced features and functionalities, migrating from Drupal 7 to 9 involves much more than applying an update to your existing platform. You’ll need to migrate all of your data to a brand-new Drupal 9 site, with a whole new theme system and platform requirements.
Drupal 9 also requires a different developer skill set. If your developers have been on Drupal 7 for a long time, you’ll need to factor a learning curve into your schedule and budget, not only for the migration but also for ongoing development work and maintenance after the upgrade.
As an alternative, you could take the opportunity to build a new platform with a completely different CMS. This might make sense if you’ve experienced a lot of pain points with Drupal 7, although it’s worth investigating whether those problems have been addressed in Drupal 9.
Drupal 10
What of Drupal 10, you ask? Drupal 10 is slated to be released in December 2022, but it will take some time for community-contributed modules to be updated to support the new version. Once ready, updating from Drupal 9 to Drupal 10 should be a breeze.
Preparing for the Transition
Whether you decide to migrate to Drupal 9 or a new CMS, your planning should include the following five elements:
Content Audit
Do a thorough content inventory and audit, examine user analytics, and revisit your content strategy, so you can identify which content is adding real value to your business (and which isn’t).
Some questions to ask:
- Does your current content support your strategy and business goals?
- How much of the content is still relevant for your platform?
- Does the content effectively engage your target audience?
Another thing to consider is your overall content architecture: Is it a Frankenstein that needs refinement or revision? Hold that thought; in just a bit, we’ll cover some factors in choosing a CMS.
Design Evaluation
As digital experiences have evolved, so have user expectations. Chances are, your Drupal 7 site is starting to show its age. Be sure to budget time for design effort, even if you would prefer to keep your current design.
Drupal 7 site themes can’t be moved to a new CMS, or a different version of Drupal, without a lot of development work. Even if you want to keep the existing design, you’ll have to retheme the entire site, because you can’t apply your old theme to a new backend.
You’ll also want to consider the impact of a new design and architecture on your existing content, as there’s a good chance that even using an existing theme will require some content development.
Integrations
What integrations does your current site have, or need? How relevant and secure are your existing features and modules? Be sure to identify any modules that have been customized or are not yet compatible with Drupal 9, as they’ll likely require development work if you want to keep them.
Are your current vendors working well for you, or is it time to try something new? There are more microservices available than ever, offering specialized services that provide immense value with low development costs. Plus, a lot of functionalities from contributed modules in D7 are now a part of the D9 core.
CMS Selection & Architecture
Building the next iteration of your platform requires more than just CMS selection. It’s about defining an architecture that balances cost with flexibility. Did your Drupal 7 site feel rigid and inflexible? What features do you wish you had – more layout flexibility, different integrations, more workflow tools, better reporting, a faster user interface?
For content being brought forward, will migrating be difficult? Can it be automated? Should it be? If you’re not upgrading to Drupal 9, you could take the opportunity to switch to a headless CMS product, where the content repository is independent of the platform’s front end. In that case, you’d likely be paying for a product with monthly fees but no need for maintenance. On the flip side, many headless CMS platforms (like Contentful, for example) don’t allow for customization.
Budget Considerations
There are three major areas to consider in planning your budget: hosting costs, feature enhancements, and ongoing maintenance and support. All of these are affected by the size and scope of your migration and the nature of your internal development team.
Key things to consider:
- Will the newly proposed architecture be supported by existing hosting? Will you need one vendor or several (if you’re using microservices)?
- If you take advantage of low or no-cost options for hosting, you could put the cost savings toward microservices for feature enhancements.
- Given Drupal 9’s platform requirements, you’ll need to factor in the development costs of upgrading your platform infrastructure.
- Will your new platform need regular updates? How often might you add feature enhancements and performance optimizations? And do you have an internal dev team to handle the work?
Look on the Bright Side
Whatever path you choose, transitioning from Drupal 7 to a new CMS is no small task. But there’s a bright side, if you view it as an opportunity. A lot has changed on the web since Drupal 7 was launched! This is a great time to explore how you can update and enhance your platform user experience.
Re-platforming is a complex process, and it’s one that we love guiding clients through. By partnering with an experienced Drupal development firm, most migrations can be planned and implemented more quickly, with better outcomes for your brand and your audience.
From organizing the project, to vendor and service selection, to budget management, we’ve helped a number of organizations bring their platforms into the modern web. To see an example, check out this design update for the American Veterinary Medical Association.
Need help figuring out the best next step for your Drupal 7 platform? Contact us today for a free consultation.
As a back-end developer, I’m used to building stuff that people interact with every day but never actually see. I create that layer beneath the surface that makes things work the way people expect them to — and I’m the one who gets called in when something goes wrong. Either way, I spend a lot of time unraveling puzzles and reimagining solutions, forever pushing the limits of what software can do.
I make things work; that’s what I love about my job. It’s also the reason why I like being part of the Open Source Software (OSS) community. OSS offers nearly infinite opportunities to solve problems or build things that didn’t exist before.
Plus, as an open source contributor, by writing one little fix you could be helping hundreds or thousands of digital platforms.
Why Open Source Matters to Me
I first got involved with OSS back in 2007, when I used Drupal CMS for a client project. I spent a few years getting comfortable with the software and kind of dipping a toe in the community aspect. After a while, I’d been consuming Drupal so much that I started to feel bad about using all that free software and not giving anything back.
Then I had a project come along that needed a custom feature for Drupal that didn’t exist yet in the open source space. I wrote the module and gave it back to the community, working with other open source contributors who needed it too. That’s when I discovered how rewarding it is when you build something and people actually use it.
When you’re working on a project and you find a problem, you could just fix it and move on. But when you say, “Hey, somebody else might find this useful,” and you take that extra 30 minutes to give the code back to the community, you make everybody better.
I love the feeling that giving back gives you, especially when you fix something that thousands of other people use.
From Dipping a Toe to the Deep End
For me, being an OSS contributor comes with a sense of responsibility. It’s rewarding when you fix an issue for other developers, but it also makes you not want to screw up something that thousands of other sites report using. So I’m always mindful of releasing quality code. Maybe that’s also why I became a maintainer.
Years ago, I was using a contributed theme that someone else had written as a starting point for a lot of the projects I worked on. Then the sole maintainer passed away, and a lot of us still wanted to use the theme. So a coworker and I offered to maintain it — I couldn’t just walk away from something I’d been benefiting from for so long.
Today, I regularly submit code to open source communities, and I’m a maintainer of nine different open source modules. How did that happen, you ask? Well… sometimes I’ll recommend that a client uses an unmaintained module in their application, and then, you know, somebody’s got to take care of it.
What can I say? I also feed the stray cats in my neighborhood.
I Get to Do This for Work?!
Problem-solving is the best part of my job. And with OSS, you’re always going to have a problem to solve, or a bug to fix, or a new feature to build every time someone says, “You know, it would be great if this module did this.” That’s when my teammates and I roll up our sleeves and say, “Okay, how are we going to make this work?” When we find a solution, we give it back.
Fortunately, I work at an agency that actively encourages us to make open source contributions. We recognize the value we get from OSS, and that’s why we go the extra mile to support the community. I can build OSS fixes while I’m working on client projects, or take professional growth days to focus on open source work.
I don’t know if that’s true of all agencies, but I’m lucky enough to work somewhere that believes in making an impact.
Want to come build things with us? Check out our open positions.
How we leveraged Drupal’s native APIs to push notifications to the many department websites for the State of Rhode Island. RI.gov is a custom Drupal distribution that was built with the sole purpose of running hundreds of department websites for the state. The platform leverages a design system for flexible page building, custom authoring permissions, and a series of custom tools to make authoring and distributing content across multiple sites more efficient.
The Challenge
The platform had many business requirements, and one stated that a global notification needed to be published to all department sites in near real-time. These notifications would communicate important department information on all related sites. Further, these notifications needed to be ingested by the individual websites as local content to enable indexing them for search.
The hierarchy of the departments and their sites added a layer of complexity to this requirement. A department needs to create notifications that broadcast only to subsidiary sites, not the entire network. For example, the Department of Health might need to create a health department specific notification that would get pushed to the Covid site, the RIHavens site, and the RIDelivers sites — but not to an unrelated department, like DEM.
Exploration
Aggregator
Our first idea was to utilize Drupal’s built-in Aggregator module to pull notifications from the hub. A proof of concept showed that while it worked well for pulling content from the hub site, it had a few problems:
- It relied heavily on the local site’s cron job to pull updates, which led to timing issues in getting the content — it was not in near real-time. Due to server limitations, we could not run cron as often as would be necessary
- Another issue was that we would need to maintain two entity types: one for global notifications and a second for local site notifications. Keeping local and global notifications as the same entity type allowed for easier maintenance of this subsystem.
Feeds
Another thought was to utilize the Feeds module to pull content from the hub into the local sites. This was a better solution than the aggregator because the nodes would be created locally and could be indexed for local searching. Unfortunately, Feeds relied on cron as well.
Our Solution
JSON API
We created a suite of custom modules that centered around moving data between the network sites using Drupal’s JSON API. The API was used to register new sites to the main hub when they came online. It was also used to pass content entities from the main hub down to all sites within the network and from the network sites back to the hub.
Notifications
In order to share content between all of the sites, we needed to ensure that the data structure was identical on all sites in the network. We started by creating a new notification content type that had a title field, a body field, and a boolean checkbox indicating whether the notification should be considered global. Then, we packaged the configuration for this content type using the Features module.
By requiring our new notification feature module in the installation profile, we ensured that all sites would have the required data structure whenever a new site was created. Features also allowed us to ensure that any changes to the notification data model could be applied to all sites in the future, maintaining the consistency we needed.
Network Domain Entity
In order for the main hub, ri.gov, to communicate with all sites in the network, we needed a way to know what Drupal sites existed. To do this, we created a custom configuration entity that stored the URL of each site within the network. Using these domain entities, we were able to query all known sites and pass the global notification nodes created on ri.gov to each one using the JSON API.
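To make that concrete, here is a simplified sketch of the push. The network_domain entity type ID, field names, and function name are illustrative, and in practice the requests were queued rather than fired inline, as described next:
<?php

use Drupal\node\NodeInterface;

/**
 * Pushes a notification node to every registered site in the network.
 */
function mymodule_push_notification(NodeInterface $node, string $token): void {
  // Load every site registered via the custom domain config entity.
  $domains = \Drupal::entityTypeManager()
    ->getStorage('network_domain')
    ->loadMultiple();
  $client = \Drupal::httpClient();

  foreach ($domains as $domain) {
    // POST the node to the remote site's JSON:API endpoint.
    $client->post($domain->get('url') . '/jsonapi/node/notification', [
      'headers' => [
        'Content-Type' => 'application/vnd.api+json',
        'Authorization' => 'Bearer ' . $token,
      ],
      'json' => [
        'data' => [
          'type' => 'node--notification',
          'attributes' => [
            'title' => $node->label(),
            'field_global' => TRUE,
          ],
        ],
      ],
    ]);
  }
}
PHP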
Queue API
To ensure that the notification nodes were posted to all the sites without timeouts, we decided to utilize Drupal’s Queue API. Once the notification content was created on the ri.gov hub, we queried the known domain entities and created a queue item that would use cron to actually post the notification node to each site’s JSON API endpoint. We decided to use cron in this instance to give us some assurance that a post to many websites wouldn’t timeout and fail.
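A bare-bones sketch of such a worker; the plugin ID and payload shape are hypothetical:
<?php

namespace Drupal\mymodule\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Posts one queued notification to one site each time cron runs.
 *
 * @QueueWorker(
 *   id = "notification_publisher",
 *   title = @Translation("Notification publisher"),
 *   cron = {"time" = 60}
 * )
 */
class NotificationPublisher extends QueueWorkerBase {

  public function processItem($data) {
    // Items are queued when a notification is saved, e.g.:
    // \Drupal::queue('notification_publisher')->createItem([
    //   'nid' => $node->id(),
    //   'url' => $domain->get('url'),
    // ]);
    // Here we would load the node and POST it to the target site's
    // JSON:API endpoint, as in the earlier sketch.
  }

}
PHP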
Batch API
To allow time-sensitive notifications to be pushed immediately, we created a custom batch operation that reads all of the queued notifications and pushes them out one at a time. If any errors are encountered, the notification is re-queued at the end of the stack and the process continues until all notifications have been posted to the network sites.
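A sketch of what that batch could look like, reusing the hypothetical notification_publisher queue from above:
<?php

/**
 * Builds a batch that processes every queued notification right away.
 */
function mymodule_publish_now(): void {
  $queue = \Drupal::queue('notification_publisher');
  $operations = [];
  for ($i = 0; $i < $queue->numberOfItems(); $i++) {
    $operations[] = ['mymodule_publish_queue_item', []];
  }
  batch_set([
    'title' => t('Publishing notifications'),
    'operations' => $operations,
  ]);
}

/**
 * Batch callback: claims one item, posts it, and re-queues it on failure.
 */
function mymodule_publish_queue_item(array &$context): void {
  $queue = \Drupal::queue('notification_publisher');
  if ($item = $queue->claimItem()) {
    try {
      // POST the notification to the target site (see earlier sketch).
      $queue->deleteItem($item);
    }
    catch (\Exception $e) {
      // Release the item so it returns to the queue and is retried.
      $queue->releaseItem($item);
    }
  }
}
PHP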
New site registrations
In order to ensure that new sites receive notifications from the hub, we needed a site registration process. Whenever a new site is spun up, a custom module is installed that calls out to the hub using JSON API and registers itself by creating a new network domain entity with its endpoint URL. This allows the hub to know about the new site so it can push any new notifications to it in the future.
The installation process also queries the hub for any existing notifications, using the JSON API to get a list of all notification nodes from the hub and add them to its local queue for creation. Then, the local site uses cron to query the hub and get the details of each notification node to create it locally. This ensures that when a new site comes online, it has an up-to-date list of all the important notifications from the hub.
Authentication
Passing this data between sites is one challenge; doing it securely adds another layer of complexity. All of the requests between the sites are authenticated using the Simple OAuth module. When a new site is created, an installation process creates a dedicated user in the local database that owns all notification nodes created by the syndication process. The installation process also creates the appropriate Simple OAuth consumers, which allows authenticated connections to be made between the sites.
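Fetching a token from the hub with those consumers might look roughly like this; Simple OAuth provides the /oauth/token endpoint, while the helper function and credential handling are illustrative:
<?php

/**
 * Requests an access token from the hub's Simple OAuth token endpoint.
 */
function mymodule_get_token(string $hub_url, string $id, string $secret): string {
  $response = \Drupal::httpClient()->post($hub_url . '/oauth/token', [
    'form_params' => [
      'grant_type' => 'client_credentials',
      'client_id' => $id,
      'client_secret' => $secret,
    ],
  ]);
  $payload = json_decode((string) $response->getBody(), TRUE);
  // The token is then sent as an "Authorization: Bearer" header on
  // JSON:API requests between the sites.
  return $payload['access_token'];
}
PHP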
Department sites
Once all of the groundwork was in place, we were able, with minimal effort, to let department sites act as hubs for their own subsidiary sites. Thus, the Department of Health can create notifications that go only to its sites, keeping them separate from adjacent departments.
Translations
The entire process also works with translations. After a notification is created in the default language, it gets queued and sent to the subsidiary sites. Then, a content author can create a translation of that same node and the translation will get queued and posted to the network of sites in the same manner as the original. All content and translations can be managed at the hub site, which will trickle down to the subsidiary sites.
Moving in the opposite direction
With all of the authorization, queues, batches, and APIs in place, the next challenge was making this entire system work with a Press Release content type. This provided two new challenges that we needed to overcome:
- Instead of moving content from the top down, we needed to move from the bottom up. Press release nodes get created on the affiliate sites and would need to be replicated on the hub site.
- Press release nodes were more complex than the notification nodes. These content types included media references, taxonomy term references, and, toughest of all, paragraph references.
Solving the first challenge was pretty simple – we were able to reuse the custom publishing module and instruct the queue API to send the press release nodes to the hub sites.
Getting this working with a complex entity like the press release node meant that we needed to not only push the press release node, but we also needed to push all entities that the initial node referenced. In order for it all to work, the entities needed to be created in reverse order.
Once a press release node was created or updated, we used the EntityInterface referencedEntities() method to recursively drill into all of the entities that were referenced by the press release node. In some cases, this meant getting paragraph entities that were nested two, three, even four levels deep inside of other paragraphs. Once we reached the bottom of the referenced entity pile, we began queuing those entities from the bottom up, so the paragraph that was nested four levels deep was the first to get sent and the actual node was the last.
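A rough sketch of that depth-first traversal; the function name is ours, and a real implementation would also filter out entities that should not be syndicated, such as users:
<?php

use Drupal\Core\Entity\EntityInterface;

/**
 * Collects referenced entities depth-first, deepest first.
 */
function mymodule_collect_references(EntityInterface $entity, array &$stack = [], array &$visited = []): array {
  foreach ($entity->referencedEntities() as $referenced) {
    $uuid = $referenced->uuid();
    // Skip anything already seen to avoid circular reference loops.
    if (isset($visited[$uuid])) {
      continue;
    }
    $visited[$uuid] = TRUE;
    // Recurse before adding, so a paragraph nested four levels deep
    // lands in the stack ahead of its parents.
    mymodule_collect_references($referenced, $stack, $visited);
    $stack[$uuid] = $referenced;
  }
  return $stack;
}
PHP
Queuing the collected entities in stack order, with the press release node itself last, guarantees every reference exists on the hub before the node that points to it arrives.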
Conclusion
Drupal’s powerful suite of APIs gave us all the tools necessary to build a platform that allows the State of Rhode Island to easily keep its citizens informed of important information, while giving its editing team the ease of a create-once, publish-everywhere workflow.
Each spring, students at the Rhode Island School of Design (RISD) exhibit their undergraduate and master’s thesis projects at the RISD Museum. Due to Covid-19, they were unable to prepare and stage physical exhibits in the spring of 2020.
Not to be deterred, the school and museum agreed to host the student work online as fully digital exhibits. The Museum previously partnered with Oomph to build out the award-winning “Raid the Icebox” online publication using Drupal and Layout Builder, so it felt familiar and natural to build out similar features for university student projects.
The necessary work involved extending the existing online gallery features to hundreds of additional artists, so we needed to build a system that could scale. Along the way, we were also tasked with adding new features to the platform. Why not have lofty goals?
The Timeline
We kicked off the first stage of the project on April 20, 2020, aiming for a two-staged release. Most of the new code would need to be deployed by the last week of May, with the additional features released two weeks later. The basic infrastructure would have to be established along with the custom permissions for artists, editors, and museum administrators. A second stage would add refinement to font selection and color palette.
What the Artists Needed
- A platform for routine editor tasks such as editing content, uploading media, and altering the layout, plus resources to perform many tasks outside the usual scope of content editors.
- The ability to add primary, secondary, and system webfonts to custom content types as well as their associated layout builder templates.
- A custom color palette whose colors were chosen by an admin user. This kind of style addition had to be available through Layout Builder for new publication nodes.
- A few trusted student authors also needed the ability to add JavaScript directly into page content. This was an intimidating requirement from a security standpoint, but the end results were highly engaging.
What the Staff Needed
- A robust set of permissions to enable multiple departments to have ownership and administrative controls over their particular domains, including:
- Bulk upload of new users in anticipation of needing to add hundreds of students.
- Node clone functionality (the ability to duplicate pages) to speed up time creating new pieces of content.
- Custom permissions for trusted editors for all content in a particular section.
- Enabling those editors to grant artists permission to view, edit, and access Layout Builder for a particular node.
A Deeper Dive
Overall Approach
We leveraged Drupal to build out our new features “the Drupal way.” The Node Clone and Bulk User Import modules could be installed and enabled with our Composer workflow and used right out of the box to offer additional powerful functionality. Now a user with the Editor role could craft a meticulously designed template and then clone it for other school departments. A user with the Web Administrator role would not have to add users one-by-one through the user interface but could import large numbers of new users — while specifying the user role — with CSV files.
We added the new custom fields, content types, user roles, and text formats manually through Drupal’s UI. We could later use preprocess functions in the theme and Twig templates to render content as needed.
There were a lot of fields needed, covering different aspects of the typography. Here are a few:RISD
Since it was a Drupal 8 project, we made extensive use of config sync to export and import config files. The front-end and back-end developers could work independently until it was time to merge branches for testing. Then we were able to seamlessly push changes to higher environments as part of our deploy process.
Note: As a rule, we recommend setting config to read-only, especially on projects that have many web admin users.
Custom Webfont Example
With those new fields in place, a user sees text input fields on the node edit view of each publication where they can enter custom font URLs or names.
In terms of rendering to the page when someone is viewing the node, this requires both a preprocess hook in the [custom_theme].theme file and changes to the Twig template.
Note: Please be aware that allowing hundreds of users to input free text is not an ideal situation, and that security measures should be taken when processing free text.
Here is what the preprocess hook looks like in the mytheme.theme file:
use Drupal\node\Entity\Node;

/**
 * Implements hook_preprocess_HOOK().
 */
function mytheme_preprocess_html(array &$variables) {
  $routeMatch = \Drupal::routeMatch();
  $node = $routeMatch->getParameter('node');
  if ($node instanceof Node && $node->getType() === 'publication') {
    if (isset($node->field_primary_webfont_url) && !$node->field_primary_webfont_url->isEmpty()) {
      $variables['primary_webfont_url'] = $node->field_primary_webfont_url->value;
      $variables['primary_webfont_family'] = $node->field_primary_webfont_family->value;
      $variables['primary_webfont_type'] = $node->field_primary_webfont_type->value;
    }
    // The secondary webfont and color override fields are handled the
    // same way.
  }
}
PHP
Then in the Twig template, which is at this path: myproject/docroot/themes/custom/mytheme/templates/layout/html.html.twig
<!DOCTYPE html>
<html{{ html_attributes }}>
  <head>
    <title>{{ head_title }}</title>
    {% if primary_webfont_url|length %}
      <link rel="stylesheet prefetch" media="screen" href="{{ primary_webfont_url }}">
      <style type="text/css">
        :root {
          --ff__serif: '{{ primary_webfont_family }}', {{ primary_webfont_type }};
        }
      </style>
    {% endif %}
    {% if secondary_webfont_url|length %}
      <link rel="stylesheet prefetch" media="screen" href="{{ secondary_webfont_url }}">
      <style type="text/css">
        :root {
          --ff__sans: '{{ secondary_webfont_family }}', {{ secondary_webfont_type }};
        }
      </style>
    {% endif %}
    {% if background_color_override|length and foreground_color_override|length %}
      <style type="text/css">
        :root {
          --c__primary--bg: {{ background_color_override }};
          --c__primary--fg: {{ foreground_color_override }};
        }
      </style>
    {% endif %}
  </head>
  <body{{ attributes }}>
    {{ page_top }}
    {{ page }}
    {{ page_bottom }}
  </body>
</html>
HTML
Most of the creative work for each piece of content happened behind the scenes in Layout Builder. Each block or section could be configured individually, which gave the artists a lot of ability to customize their online territories to the fullest extent possible.
In addition to being able to choose a foreground or background color on the node level, an artist or editor can choose to change the color of just one block in Layout Builder simply by clicking on the “Style Settings” link.
Another inline-editing window will pop up with additional options. In the “Add a style” dropdown menu, the artist or editor can select “Component Background Color,” click “Add Styles,” and choose from one of the colors in the palette to be applied to the block.
Along with the preprocessing described in the previous section, we extended Layout Builder’s features with a custom module to alter layouts. The plugin class lives at: docroot/modules/custom/my_module/Plugin/Layout/LayoutBase.php
<?php

namespace Drupal\my_module\Plugin\Layout;

use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Layout\LayoutDefault;
use Drupal\Core\Plugin\PluginFormInterface;

/**
 * Provides a layout base for custom layouts.
 */
abstract class LayoutBase extends LayoutDefault implements PluginFormInterface {

  public const NO_BACKGROUND_COLOR = 0;

  public function build(array $regions): array {
    $build = parent::build($regions);
    $backgroundColor = $this->configuration['background_color'];
    if ($backgroundColor) {
      $build['#attributes']['class'][] = 'rpp__bg-color--' . $backgroundColor;
    }
    return $build;
  }

  public function defaultConfiguration(): array {
    return [
      'background_color' => self::NO_BACKGROUND_COLOR,
      'id' => NULL,
      'background_color_override' => NULL,
    ];
  }

  public function buildConfigurationForm(array $form, FormStateInterface $form_state): array {
    $form['background'] = [
      '#type' => 'details',
      '#title' => $this->t('Background'),
      '#open' => TRUE,
      '#weight' => 20,
    ];
    $form['background']['background_color'] = [
      '#type' => 'radios',
      // The '#options' for the color palette are supplied by the
      // implementing layout (omitted here).
      '#default_value' => $this->configuration['background_color'],
    ];
    $form['background']['overrides'] = [
      '#type' => 'fieldset',
      '#title' => $this->t('Overrides'),
    ];
    $form['background']['overrides']['background_color_override'] = [
      '#type' => 'textfield',
      '#title' => $this->t('Background Color'),
      '#default_value' => $this->configuration['background_color_override'],
      '#attributes' => [
        'placeholder' => '#000000',
      ],
    ];
    return $form;
  }

  public function submitConfigurationForm(array &$form, FormStateInterface $form_state) {
    $values = $form_state->getValues();
    $this->configuration['background_color'] = $values['background']['background_color'];
    // The 'extra' section (attributes such as an ID) is built elsewhere.
    $this->configuration['id'] = $values['extra']['attributes']['id'] ?? NULL;
    $this->configuration['background_color_override'] = $values['background']['overrides']['background_color_override'];
  }

}
PHP
The Background Color form gets inserted into the Layout Builder form, and the user’s color selections get submitted and saved to configuration in the submitConfigurationForm() method.
The custom layout needs to be registered, so it is added in a file called my_module.layouts.yml that looks like this:
layout_base:
  label: 'New Layout'
  category: 'Custom Layouts'
  template: templates/layout-base
  default_region: main
  regions:
    main:
      label: Main content
YAML
Now this custom layout with color overrides and whatever else you want to add will be available for users with the appropriate permissions to edit content in Layout Builder.
Conclusion
Jeremy Radtke, Assistant Director of Digital Initiatives at the RISD Museum, said in a recent presentation to the Museum Publishing Digital Interest Group that RISD sees the museum as a site of creative collaboration. In terms of the end-of-year digital showcase, this is demonstrated in the emphasis on student artists having a high degree of creative control over their projects. They were able to radically alter the existing layout templates offered to them, changing fonts, colors, and other elements of the theme. They were able to configure blocks to add static images, animated gifs, and other media files such as short films to stretch the limits of the digital space.
There were a total of 700 undergraduates and graduate students featured in the online exhibit, which spanned 16 departments. The art school is attached to the RISD Museum, and Radtke said the museum’s style is very much along the lines of an art school, in that it employs critique — asking questions and solving problems is strongly emphasized. This project was about content delivery, but also about how to generate content. Oomph was proud to be part of that collective journey of exploration and experimentation.
Related Resources
- Oomph case study about Ziggurat and “RAID the Icebox”
- Museum Publishing Digital Interest Group presentation on RISD Museum online exhibits
The Challenge
Execute on a digital platform strategy for a global private equity firm to create a centralized employee destination to support onboarding, create interpersonal connections between offices, and drive employee satisfaction.
The key components would be:
- An employee directory complete with photos, bios, roles, and organizational structure
- News, events, and other communications made easily available and organized per location as well as across all locations
- The firm’s investment portfolio shared through a dashboard view with all pertinent information, including the team involved
These components, and the expected tactical assets that an intranet provides, would help the firm deepen connections with and among employees at the firm, accelerate onboarding, and increase knowledge sharing.
The Approach
Supporting Multiple Intentions: Browsing vs. Working
An effective employee engagement platform, or intranet, needs to support two distinct modes — task mode and explore mode. In task mode, employees have access to intuitive navigation, quick page loading, and dynamic search or filtering while performing daily tasks. They get what they need fast and proceed with their day.
At the same time, a platform must also encourage and enable employees to explore company knowledge, receive company-wide communications, and connect with others. For this firm, the bulk of content available in explore mode revolves around the firm’s culture, with a special focus on philanthropic initiatives and recognition of key successes.
Both modes benefit from intuitive searching and filtering capabilities for team members, news, events, FAQs, and portfolio content. News and events can be browsed in a personalized way — what is happening at my location — or a global way — what is happening across the company. The mode was considered for every interaction within the platform and influenced nearly all design decisions.
From a technical standpoint, the private equity firm needed to support security by hosting the intranet on its own network. This, along with the need to completely customize the experience for close alignment with the firm’s brand, meant that no off-the-shelf intranet solution would work. We went with Drupal 8 to make this intranet scalable, secure, and tailor-made for an optimal employee experience.
The Results
The platform deployment came at a time when it was most needed, playing a crucial role for the firm during a global pandemic that kept employees at home. What was originally designed as a platform to deepen employee connections between offices quickly became the firm’s hub for connecting employees within an office. As many businesses are, the firm is actively re-evaluating its approach to the traditional office model, and the early success of the new platform indicates that it is likely to play an even larger role in the future.
Test Driven Development (TDD) facilitates clean and stable code. Drupal 8 has embraced this paradigm with a suite of testing tools that allow a developer to write unit tests, functional tests, and functional JavaScript tests for their custom code. Unfortunately, there is no JavaScript unit testing framework readily available in Drupal core, but don’t fret. This article will show you how to implement JavaScript unit testing.
Why unit test your JavaScript code?
Testing units of code is a great practice that also guards against future developers committing a regression to your logic. Adding unit coverage for JavaScript code lets you test specific logical blocks quickly and efficiently, without the development and execution overhead of functional tests.
An example of JavaScript code that would benefit from unit testing would be an input field validator. For demonstration purposes, let’s say you have a field label that permits certain characters, but you want to let the user know immediately if they entered something incorrectly, maybe with a warning message.
Here’s a crude example of a validator that checks an input field for changes. If the user enters a value that is not permitted, they are met with an error alert.
(($, Drupal) => {
  Drupal.behaviors.labelValidator = {
    attach(context) {
      const fieldName = "form.form-class input[name=label]";
      const $field = $(fieldName);
      $field.on("change", () => {
        const currentValue = $field.val();
        if (currentValue.length > 0 && !/^[a-zA-Z0-9-]+$/.test(currentValue)) {
          alert("The value you entered is incorrect!");
        }
      });
    }
  };
})(jQuery, Drupal);
JavaScript
We only allow letters, numbers, and hyphens in this sample validator. We now have a good idea of test data we can create for our test.
Setting up JS Unit Testing
In the world of JavaScript unit testing, Jest has a well-defined feature set, a large community, and is the most popular choice among developers. To begin using Jest, add jest as a development dependency with your favorite package manager. Then create a Jest config file and add your directories for testing. I recommend enabling lcov, a test coverage reporter that converts test results into local HTML pages.
Writing a Test
We want to test our Drupal behavior, but we need jQuery and the global Drupal object. Have no fear! We can mock all of this. For simplicity’s sake, we can mock both jQuery and Drupal to test the code we want. The point here is to collect the validation logic and run it on our test cases.
There are a couple of different techniques we can use to meet our requirements. You could create a test DOM using a library like JSDOM and require the jQuery library, which gives you the ability to simulate HTML and DOM events. This approach is fine, but our goal is to test our custom validation logic, not to test third-party libraries or simulate the DOM. Just as we can mock classes and methods in PHPUnit, we can do the same with Jest.
Our testing environment is Node, so we can leverage the global object to mock Drupal, jQuery, and even the alert function. Please see Node’s global variable documentation for more information on this object. We can do this in the setup logic of Jest with beforeAll:
beforeAll(() => {
  global.alert = jest.fn();
  global.Drupal = {
    behaviors: {}
  };
  global.jQuery = jest.fn(selector => ({
    on(event, callback) {
      validator = callback;
    },
    val() {
      return fieldValue;
    }
  }));
  const behavior = require("label-validator.es6.js");
  Drupal.behaviors.labelValidator.attach();
});
JavaScript
This makes our behavior available to the global Drupal object. We also have mocked jQuery, so we can collect the callback on which we want to run the tests. We run the attach method on the behavior to collect the callback. You may have noticed that we never declared the validator or fieldValue variables; we do this at the top of our test so we have them available in our tests.
// The validation logic we collect from the `change` event.
let validator = () => "";
// The value of the input we set in our tests.
let fieldValue = "";
JavaScript
With the intention of cleanup, we want to unset all the global objects after we have run our tests. In our case, the globals we are mocking do not exist in Node, so it is safe to set them to null. In cases in which we are mocking defined values, we would want to save a backup of that global and then mock it. After we are done testing, we would set the backup back to its corresponding global. There are also many techniques related to mocking globals and even core Node libraries. For an example, check out the documentation on the jest website.
Here is our tear-down logic. We use the Jest function afterAll to achieve this:
afterAll(() => {
  global.Drupal = null;
  global.jQuery = null;
  global.alert = null;
});
JavaScript
We need to create an array of values that we know should pass validation and another that should fail validation. We will call them validLabels and invalidLabels, respectively:
/**
 * List of valid labels for the input.
 *
 * @type {string[]}
 */
const validLabels = [
  "123ABVf123",
  "123",
  "AB",
  "1",
  "",
  "abcdefghijklmnop12345678910",
  "ab-3-cd"
];

/**
 * List of invalid labels for the input.
 *
 * @type {string[]}
 */
const invalidLabels = [
  "!@#fff",
  "test test",
  "(123)",
  "ABCDEF123!",
  "^Y1",
  " ",
  "'12346'",
];
JavaScript
Finally, we are ready to start writing our tests. We can use Jest’s provided test function, or we can use the “describe it” pattern. I prefer the “describe it” pattern because you can provide detailed information on what you are testing and keep it in the same test scope.
Firstly, we want to test our valid data, and we know that these values should never trigger an alert. We will call the validator on each test value and set the expectation that the alert function is never called. But before we write the test, we want to make sure to clear all our mocks between tests to prevent mock pollution. We can achieve this with beforeEach:
beforeEach(() => {
  jest.clearAllMocks();
});
JavaScript
After writing our valid data test, we will write our invalid data test. This test should expect an alert for each invalid value sent. Putting it all together we have:
describe("Tests label validation logic", () => {
beforeEach(() => {
jest.clearAllMocks();
});
it("valid label test", () => {
validLabels.forEach(value => {
fieldValue = value;
validator();
});
expect(global.alert.mock.calls.length).toBe(0);
});
it("invalid label test", () => {
invalidLabels.forEach(value => {
fieldValue = value;
validator();
});
expect(global.alert.mock.calls.length).toBe(invalidLabels.length);
});
});
JavaScript
After writing our tests, we can check our coverage and see we have hit 100%!
Jest is extremely flexible and has a large ecosystem. There are many different ways we could have achieved the above results; hopefully this gives you some useful ideas on how to unit test your JavaScript code.
The entire sample Jest test:
/* global test expect beforeEach afterAll beforeAll describe jest it */

// The validation logic we collect from the `change` event.
let validator = () => "";
// The value of the input we set in our tests.
let fieldValue = "";

// The setup function where we set our globals.
beforeAll(() => {
  global.alert = jest.fn();
  global.Drupal = {
    behaviors: {}
  };
  global.jQuery = jest.fn(selector => ({
    on(event, callback) {
      validator = callback;
    },
    val() {
      return fieldValue;
    }
  }));
  const behavior = require("label-validator.es6.js");
  Drupal.behaviors.labelValidator.attach();
});

// Global tear-down function we use to remove our mocks.
afterAll(() => {
  global.Drupal = null;
  global.jQuery = null;
  global.alert = null;
});

/**
 * List of valid labels for the input.
 *
 * @type {string[]}
 */
const validLabels = [
  "123ABVf123",
  "123",
  "AB",
  "1",
  "",
  "abcdefghijklmnop12345678910",
  "ab-3-cd"
];

/**
 * List of invalid labels for the input.
 *
 * @type {string[]}
 */
const invalidLabels = [
  "!@#fff",
  "test test",
  "(123)",
  "ABCDEF123!",
  "^Y1",
  " ",
  "'12346'",
];

// The tests.
describe("Tests label validation logic", () => {
  beforeEach(() => {
    jest.clearAllMocks();
  });
  it("valid label test", () => {
    validLabels.forEach(value => {
      fieldValue = value;
      validator();
    });
    expect(global.alert.mock.calls.length).toBe(0);
  });
  it("invalid label test", () => {
    invalidLabels.forEach(value => {
      fieldValue = value;
      validator();
    });
    expect(global.alert.mock.calls.length).toBe(invalidLabels.length);
  });
});
JavaScript
Resources
- Jest JavaScript testing framework – Jest
- NPM trends JS testing framework popularity – npm trends js testing
- Jest configuration file documentation – https://jestjs.io/docs/en/configuration.html
- A JS DOM testing library – JSDOM
- Jest mocking documentation – jestjs.io/docs/en/manual-mocks
Continuous integration and delivery (CI/CD) is an important part of any modern software development cycle. It ensures code quality remains high, helps keep applications secure, and bridges the gap between everyday work and your visitors’ experience.
Nowadays it’s a given that a CI/CD pipeline will be part of a workflow, but choosing a provider and/or platform can be difficult. Oomph has made use of a number of CI/CD tools over the years: DeployBot, Jenkins, and Travis CI have all made appearances. Most of our projects in the last few years have used Travis, but more recently we’ve found it to be unreliable. Just as we began searching for a new provider, full CI/CD support was announced for GitHub Actions.
We immediately added Actions to the list of providers we were interested in, and after some comparison, we began migrating projects to it. Overall we’ve found it to be beneficial — the syntax is well-designed, workflows are extensible and modular, the platform is reliable and performant, and we’ve experienced no major trouble.
There are already plenty of good guides and articles on how to use GitHub Actions; we won’t repeat that here. Instead, we’ll look at a few gotchas and issues that we’ve encountered while using the platform, to give an accurate picture of things you may come across while implementing GitHub Actions.
Considerations
The team behind GitHub Actions knew what they were doing, and it’s clear they learned from and improved on previous CI/CD implementations. This is most obvious in the clear structure of the syntax, the straightforward pricing model, and the useful feature set. However, Actions’ in-progress state is apparent in some areas.
Artifact Storage and Billing
GitHub provides a generous amount of free build time for all repositories and organizations. Storage, though, is much more limited — only 2GB is included for GitHub Teams organizations. If you want to store build artifacts for all of your CI/CD jobs (a good idea for testing and repeatability) you may need to configure a “spending limit” — i.e. a maximum amount you’re willing to spend each month on storage. GitHub charges $0.25/GB for storage beyond the included 2GB.
Artifact storage is still rudimentary. Jobs can upload artifacts for download by other jobs later in the workflow, but the lifetime of those artifacts cannot be configured; they expire after 90 days, and the only way to delete them sooner is manually. Manual deletions also take some time to free up storage space.
We also experienced an issue where our reported usage for Actions storage was greatly (~500%) exaggerated, putting us far past our spending limit and breaking builds. When we reached out to GitHub’s support, though, they responded quickly to let us know this was a system-wide issue and they were working on it; the issue was resolved some days later and we were not charged for the extra storage. We were able to work around it in the meantime by extending our spending limit.
Restarting and Debugging Jobs
If a workflow fails or is canceled, it can be restarted from the workflow page. However, it’s not yet possible to restart certain jobs; the entire workflow has to be run again. GitHub is working on support for job-specific restarts.
Debugging job failures also is not yet officially supported, but various community projects make this possible. We’ve used Max Schmitt’s action-tmate to debug our builds, and that does the job. In fact, I prefer this approach to the Travis method; with this we can specify the point of the workflow where we want to start debugging, whereas Travis always starts debugging at the beginning of the build.
Log Output
GitHub Actions has an excellent layout for viewing the output of jobs. Each job in a workflow can be viewed and within that each step can be expanded on its own. The output from the current step can also be seen in near-real-time. Unfortunately, this last bit has been somewhat unreliable for us, lagging behind by a bit or failing to show the output for short steps. (To be fair to GitHub, I have never used a CI/CD platform where the live output worked flawlessly.) Viewing the logs after completion has never been a problem.
Configuring Variables/Outputs
GitHub Actions allows you to configure outputs for an action, so a later step can use some value or outcome from an earlier step. However, this only applies to packaged actions that are included with the uses directive.
To do something similar with a free-form step is more convoluted. First, the step must use some odd syntax to set an output parameter, e.g.:
- name: Build
  id: build
  run: |
    ./scripts/build.sh
    echo "::set-output name=appsize::$(du -csh --block-size=1G build/ | tail -n1 | cut -d$'\t' -f1)"
YAML
Then a later step can reference this parameter with the steps
context:
- name: Provision server
  run: terraform apply -var "app_ebs_volume_size=${{ steps.build.outputs.appsize }}"
YAML
However, the scope of the above is limited to the job it takes place inside of. To reference values across jobs, you must also set the values within the outputs map in the jobs context, e.g.:
jobs:
  build:
    runs-on: ubuntu-latest
    outputs:
      appsize: ${{ steps.build.outputs.appsize }}
    steps:
      - name: Build
        id: build
        run: |
          ./scripts/build.sh
          echo "::set-output name=appsize::$(du -csh --block-size=1G build/ | tail -n1 | cut -d$'\t' -f1)"
  infra:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - run: terraform apply -var "app_ebs_volume_size=${{ needs.build.outputs.appsize }}"
YAML
Importantly, the outputs map from a previous job is only made available to jobs that require it with the needs directive.
While this setup is workable, the syntax feels a little weird, and the lack of documentation on it makes it difficult to be certain of what you're doing. This is evolving, as well; the jobs.<job_id>.outputs context was only released in early April. Before that was added, persisting data across jobs required the use of build artifacts, which was clunky and precluded its use for sensitive values.
Self-hosted Runners
Sometimes security or access requirements prohibit a cloud-hosted CI/CD runner from reaching into an environment to deploy code or provision resources, or some sensitive data needs to be secured. For these scenarios, GitHub provides the ability to self-host Actions runners. Self-hosted runners can instead run the CI/CD process from an arbitrary VM or container within the secured network or environment. You can use them alongside cloud-hosted runners; as an example, in some situations we use cloud-hosted runners to test and validate builds before having the self-hosted runners deploy those builds to an environment.
This feature is currently in beta, but it has proven reliable and extremely useful in the places we've needed it.
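Targeting a self-hosted runner is just a matter of the runs-on key; something like this (the labels and deploy script are illustrative):
jobs:
  deploy:
    # Routes the job to a registered self-hosted runner instead of a GitHub-hosted VM.
    runs-on: [self-hosted, linux]
    steps:
      - run: ./scripts/deploy.sh
YAML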
Reliability and Performance
Overall GitHub Actions has been very reliable for us. There have been periods of trouble here and there but GitHub is open about the issues and generally addresses them in short order. We have not (yet) been seriously impeded by any outages or degradation, which is a significant improvement over our previous CI/CD situation.
Overall Experience
In general, the switch to GitHub Actions has been a positive experience. We have made significant improvements to our CI/CD workflows by switching to Actions; the platform has some great features and it has certainly been beneficial for our development lifecycle. While Actions may have a few quirks or small issues here and there we wouldn’t hesitate to recommend it as a CI/CD platform.
The first stable release for Drupal 9 shipped right on schedule — June 3, 2020. The Drupal 8.9.0 release was available the same day, and that means end-of-life for 8.7.x.
Since we all have migrated our sites from Drupal 7 to 8.9.x already (right??), it should be a fairly straightforward process to port everything from 8 to 9 when the time comes. This article covers what is involved with the 8 to 9 migration, sharing some of the gotchas we encountered in the hopes that you can have a smooth transition.
Are you familiar with what is coming in Drupal 9? How can you assess what is needed? How do you know what code needs to be updated? What other steps are involved?
This will help prepare you when it comes time to make the leap and to reassure you that this should be a straightforward and painless process.
Drupal 9
Drupal 9 is not being built in a separate codebase from Drupal 8, so all new features will be backward-compatible. That is a significant departure if you recently went through a Drupal 6 to 7 or Drupal 7 to 8 migration. You won't have to map content types and fields using migration modules or custom migration plugins, and you won't have to restructure your custom modules from scratch. This is really good news for companies and organizations that want to port sites before Drupal 8's end of life in November 2021 and want to avoid or minimize the disruption that can come with a complicated migration.
In terms of what the code looks like, Drupal 9 will be the same as the last Drupal 8 minor release (which is set to be 8.9), with deprecated code removed and third-party dependencies updated. Upgrading to Drupal 9 should be like any other minor upgrade, so long as you have removed or replaced all deprecated code.
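Assuming a composer-managed site built on the drupal/core-recommended template (and with all deprecations already cleaned up), the jump itself can be as small as a constraint change:
composer require drupal/core-recommended:^9.0 --update-with-all-dependencies
Bash
Related packages such as drupal/core-composer-scaffold and drupal/core-dev need the same version bump if you use them.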
The Drupal.org documentation includes an image visualizing the differences between Drupal 8.9 and Drupal 9.
Upgrades
Symfony 3 -> 4.4
The biggest change for third party dependencies is the use of Symfony 4.4 for Drupal 9. Drupal 8 relies on Symfony 3, and to ensure security support, Symfony will have to be updated for Drupal 9.
Twig 1 -> 2
Drupal 9 will use Twig 2 instead of Twig 1 (Drupal 8). CKEditor 5 is planned to be used for a future version of Drupal 9; this issue references 9.1.x for the transition. Drupal 9 will still depend on jQuery, but most components of jQuery UI will be removed from core.
PHPUnit 6 -> 7
For testing, PHPUnit 7 will be used instead of version 6. The Simpletest API will be deprecated in Drupal 9 and PHPUnit is recommended in its place. If you have an existing test suite using PHPUnit, you might have to replace a lot of deprecated code, just as you will do for custom modules.
6 Month release schedule
Along the lines of how Drupal 8 releases worked, Drupal 9.1.0, 9.2.0, and so on, will each contain new backwards-compatible features for Drupal 9 every six months after the initial Drupal 9.0 release. The list of Strategic Initiatives gives a detailed overview of major undertakings that have been completed for Drupal 8 or are proposed and underway for Drupal 9. We might see automatic updates for 9.1, or drush included in core.
How can you assess what is needed to upgrade?
There are some comprehensive guides available on Drupal.org that highlight the steps needed for Drupal 9 readiness. A lot of functions, constants, and classes in Drupal core have been deprecated in Drupal 9.
Some deprecations call for easy swap-outs, like the example below:
Call to deprecated method url() of class Drupal\file\Entity\File. Deprecated in drupal:8.0.0 and is removed from drupal:9.0.0. Please use toUrl() instead.
You can see a patch that has been created that swaps out url() with toUrl() straightforwardly:
- $menuItem['thumbnail_url'] = file_url_transform_relative($imageFile->Url());
+ $menuItem['thumbnail_url'] = file_url_transform_relative($imageFile->toUrl()->toString());
Some deprecations are more involved and do require some code rewrites if your custom modules are relying on the outdated code.
Example:
Call to deprecated function pager_default_initialize() in drupal:8.8.0 and is removed from drupal:9.0.0. Use \Drupal\Core\Pager\PagerManagerInterface->defaultInitialize() instead.
There is an active issue in the Drupal core issue queue for this deprecation. Rewriting outdated code sometimes requires going through issue queue comments and doing some research to figure out how the core module has been reconfigured. Often it is easiest to look at the core code itself, or to grep for that function in other core modules to see how they have handled the deprecation.
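For example, a quick grep from the Drupal root will surface every core usage worth studying:
grep -rn "pager_default_initialize" core/modules/
Bash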
This is how I ended up replacing the deprecated pager_default_initialize() function in the limit() method of our custom module:
use Drupal\Core\Database\Query\PagerSelectExtender;
+ use Drupal\Core\Pager\PagerManagerInterface;

class CountingPagerSelectExtender extends PagerSelectExtender {

  /**
   * {@inheritdoc}
   */
  public function limit($limit = 10) {
    parent::limit($limit);
+   /** @var \Drupal\Core\Pager\PagerManagerInterface $pager_manager */
+   $pager_manager = \Drupal::service('pager.manager');
    if (empty($this->limit)) {
      return $this;
    }
    $this->ensureElement();
    $total_items = $this
      ->getCountQuery()
      ->execute()
      ->fetchField();
-   $current_page = pager_default_initialize($total_items, $this->limit, $this->element);
+   $pager = $pager_manager->createPager($total_items, $this->limit, $this->element);
+   $current_page = $pager->getCurrentPage();
    $this->range($current_page * $this->limit, $this->limit);
    return $this;
  }

}
PHP
How do you know what code needs to be updated?
Fortunately, as is usually the case with Drupal, there is a module for that: Upgrade Status! This contributed module allows you to scan all the code of installed modules. Sometimes a scan can take a while, so it might make sense to scan custom modules one by one if you want to step through your project. Upgrade Status generates reports on the deprecated code that must be replaced and can be exported in HTML format to share with others on your team.
If you are using a composer-based workflow, install Upgrade Status using the following command:
composer require 'drupal/upgrade_status:^2.0'
Bash
You might also need the Git Deploy contributed module as a dependency; our projects did.
The Upgrade Status module relies on a lot of internals from the Drupal Check package. You can install Drupal Check with composer and run it if you want a quicker tool in the terminal to go through the codebase to identify code deprecations, and you don’t care about visual reporting or the additional checks offered by Upgrade Status.
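If you go that route, installing and running it looks like this (the path assumes your custom modules live in web/modules/custom):
composer require --dev mglaman/drupal-check
./vendor/bin/drupal-check web/modules/custom
Bash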
Tools such as Upgrade Status and Drupal Check are extremely useful in helping to pinpoint which code will no longer be supported once you upgrade your project to Drupal 9. The full list of deprecated code was finalized with the Drupal 8.8.0 release in December 2019. There could be some future additions but only if absolutely necessary. The Drupal Core Deprecation Policy page goes into a lot more detail behind the justification for and mechanics of phasing out methods, services, hooks, and more.
@deprecated in drupal:8.3.0 and is removed from drupal:9.0.0.
Use \Drupal\Foo\Bar::baz() instead.
@see http://drupal.org/node/the-change-notice-nid
PHP
The deprecation policy page explains how the PHPdoc tags indicate deprecated code.
For the most part, all deprecated APIs are documented at: api.drupal.org/api/drupal/deprecated
Since so many maintainers are currently in the process of preparing their projects for Drupal 9, there is a lot of good example code out there for the kinds of errors that you will most likely see in your reports.
Check out the issues on Drupal.org with Issue Tag “Drupal 9 compatibility”, and if you have a few thousand spare hours to wade through the queues, feel free to help contributed module maintainers work towards Drupal 9 readiness!
Upgrade Status note
My experience was that I went through several rounds of addressing the errors in the Upgrade Status report. For several custom modules, after I cleared out one error, re-scanning surfaced a bunch more. My first pass was like painting a wall with a roller. The second and third passes entailed further requirements and touch-ups to achieve a polished result.
What about previous Drupal releases?
Drupal 8 will continue to be supported until November 2021, since it is dependent on Symfony 3, which has an end-of-life at the same time.
Drupal 7 will also continue to be supported by the community until November 2021, with vendor extended support offered at least until 2024.
Now is a good time to get started on preparing for Drupal 9!
This post will assume you have already completed the base setup of enabling Layout Builder and added the ability to manage layouts to one of your content types. If you are not at that point yet, check out Drupal.org's documentation on Layout Builder or this article by Tyler Fahey, which goes over setup and some popular contrib module enhancements.
As we mentioned in part 1 of this series, you should expect a little DIY with Layout Builder. So far the best way we have found to theme Layout Builder is by creating a custom module to provide our own custom layouts and settings. By defining custom layouts in a custom module we get the ability to control each layout’s markup as well as the ability to add/remove classes based on the settings we define.
Writing the custom layout module
Set up the module
Start by creating your custom module and providing the required .info.yml file.
demo_layout.info.yml:
name: Demo Layout
description: Custom layout builder functionality for our theme.
type: module
core: 8.x
package: Demo
dependencies:
- drupal:layout_builder
YAML
Remove default core layouts
Layout Builder comes with some standard layouts by default. There’s nothing wrong with these, but generally for our clients, we want them only using our layouts. This hook removes those core layouts, leaving only the layouts that we will later define:
demo_layout.module
/**
 * Implements hook_plugin_filter_TYPE__CONSUMER_alter().
 */
function demo_layout_plugin_filter_layout__layout_builder_alter(array &$definitions): void {
  // Remove all non-demo layouts from Layout Builder.
  foreach ($definitions as $id => $definition) {
    if (!preg_match('/^demo_layout__/', $id)) {
      unset($definitions[$id]);
    }
  }
}
PHP
Register custom layouts and their regions
The next step is to register the custom layouts and their respective regions. This process is well documented in the following drupal.org documentation: https://www.drupal.org/docs/8/api/layout-api/how-to-register-layouts
For this particular demo module we are going to define a one-column and a two-column layout. These columns can be sized later using the settings we provide.
demo_layout.layouts.yml
demo_layout__one_column:
  label: 'One Column'
  path: layouts/one-column
  template: layout--one-column
  class: Drupal\demo_layout\Plugin\Layout\OneColumnLayout
  category: 'Columns: 1'
  default_region: first
  icon_map:
    - [first]
  regions:
    first:
      label: First

demo_layout__two_column:
  label: 'Two Column'
  path: layouts/two-column
  template: layout--two-column
  class: Drupal\demo_layout\Plugin\Layout\TwoColumnLayout
  category: 'Columns: 2'
  default_region: first
  icon_map:
    - [first, second]
  regions:
    first:
      label: First
    second:
      label: Second
YAML
Pay close attention to the path, template, and class declarations. These determine where the twig templates and their respective layout classes get placed, as shown in the file structure below.
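Putting those declarations together, the finished module ends up with a file structure like this:
demo_layout/
├── demo_layout.info.yml
├── demo_layout.layouts.yml
├── demo_layout.module
├── layouts/
│   ├── one-column/
│   │   └── layout--one-column.html.twig
│   └── two-column/
│       └── layout--two-column.html.twig
└── src/
    ├── DemoLayout.php
    └── Plugin/
        └── Layout/
            ├── LayoutBase.php
            ├── OneColumnLayout.php
            └── TwoColumnLayout.php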
Creating the base layout class
Now that we have registered our layouts, it’s time to write a base class that all of the custom layouts will inherit from. For this demo we will be providing the following settings:
- Column width
- Column padding (top and bottom)
- Background color
- Custom classes
However, there is a lot of PHP to make this happen. Thankfully, for the most part it follows a general pattern. To make it easier to digest, we will break down each section for the Column Width setting only and then provide the entire module at the end, which has all of the settings.
src/Plugin/Layout/LayoutBase.php
<?php

declare(strict_types = 1);

namespace Drupal\demo_layout\Plugin\Layout;

use Drupal\demo_layout\DemoLayout;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Layout\LayoutDefault;

/**
 * Provides a layout base for custom layouts.
 */
abstract class LayoutBase extends LayoutDefault {

}
PHP
Above is the layout class declaration. There isn't a whole lot to cover here other than to mention use Drupal\demo_layout\DemoLayout;. This class isn't necessary, but it does provide a nice one-stop place to set all of your constant values. An example is shown below:
src/DemoLayout.php
<?php

declare(strict_types = 1);

namespace Drupal\demo_layout;

/**
 * Provides constants for the Demo Layout module.
 */
final class DemoLayout {

  public const ROW_WIDTH_100 = '100';
  public const ROW_WIDTH_75 = '75';
  public const ROW_WIDTH_50 = '50';
  public const ROW_WIDTH_25 = '25';
  public const ROW_WIDTH_25_75 = '25-75';
  public const ROW_WIDTH_50_50 = '50-50';
  public const ROW_WIDTH_75_25 = '75-25';

}
PHP
The bulk of the base class logic is setting up a custom settings form using the Form API. This form will allow us to formulate a string of classes that get placed on the section or to modify the markup depending on the form values. We are not going to dive into a whole lot of detail as all of this is general Form API work that is well documented in other resources.
Set up the form:
/**
 * {@inheritdoc}
 */
public function buildConfigurationForm(array $form, FormStateInterface $form_state): array {
  $columnWidths = $this->getColumnWidths();
  if (!empty($columnWidths)) {
    $form['layout'] = [
      '#type' => 'details',
      '#title' => $this->t('Layout'),
      '#open' => TRUE,
      '#weight' => 30,
    ];
    $form['layout']['column_width'] = [
      '#type' => 'radios',
      '#title' => $this->t('Column Width'),
      '#options' => $columnWidths,
      '#default_value' => $this->configuration['column_width'],
      '#required' => TRUE,
    ];
  }
  $form['#attached']['library'][] = 'demo_layout/layout_builder';
  return $form;
}

/**
 * {@inheritdoc}
 */
public function validateConfigurationForm(array &$form, FormStateInterface $form_state) {
}

/**
 * {@inheritdoc}
 */
public function submitConfigurationForm(array &$form, FormStateInterface $form_state) {
  $values = $form_state->getValues();
  $this->configuration['column_width'] = $values['layout']['column_width'];
}

/**
 * Get the column widths.
 *
 * @return array
 *   The column widths.
 */
abstract protected function getColumnWidths(): array;
PHP
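One piece worth calling out: for $this->configuration['column_width'] to have a starting value, the base class also needs a defaultConfiguration() implementation. Here is a minimal sketch (the full module linked at the end has the complete version) that leans on the getDefaultColumnWidth() method each column class provides:
/**
 * {@inheritdoc}
 */
public function defaultConfiguration(): array {
  return parent::defaultConfiguration() + [
    // Each concrete layout class supplies its own default width.
    'column_width' => $this->getDefaultColumnWidth(),
  ];
}

/**
 * Get the default column width.
 *
 * @return string
 *   The default column width.
 */
abstract protected function getDefaultColumnWidth(): string;
PHP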
Finally, we add the build function and pass the column width class:
/**
 * {@inheritdoc}
 */
public function build(array $regions): array {
  $build = parent::build($regions);
  $columnWidth = $this->configuration['column_width'];
  if ($columnWidth) {
    $build['#attributes']['class'][] = 'demo-layout__row-width--' . $columnWidth;
  }
  return $build;
}
PHP
Write the column classes
Now that the base class is written, we can write column-specific classes that extend it. These classes are very minimal since most of the logic is contained in the base class. All that is necessary is to provide the width options for each individual class.
src/Plugin/Layout/OneColumnLayout.php
<?php

declare(strict_types = 1);

namespace Drupal\demo_layout\Plugin\Layout;

use Drupal\demo_layout\DemoLayout;

/**
 * Provides a plugin class for one column layouts.
 */
final class OneColumnLayout extends LayoutBase {

  /**
   * {@inheritdoc}
   */
  protected function getColumnWidths(): array {
    return [
      DemoLayout::ROW_WIDTH_25 => $this->t('25%'),
      DemoLayout::ROW_WIDTH_50 => $this->t('50%'),
      DemoLayout::ROW_WIDTH_75 => $this->t('75%'),
      DemoLayout::ROW_WIDTH_100 => $this->t('100%'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  protected function getDefaultColumnWidth(): string {
    return DemoLayout::ROW_WIDTH_100;
  }

}
PHP
src/Plugin/Layout/TwoColumnLayout.php
<?php

declare(strict_types = 1);

namespace Drupal\demo_layout\Plugin\Layout;

use Drupal\demo_layout\DemoLayout;

/**
 * Provides a plugin class for two column layouts.
 */
final class TwoColumnLayout extends LayoutBase {

  /**
   * {@inheritdoc}
   */
  protected function getColumnWidths(): array {
    return [
      DemoLayout::ROW_WIDTH_25_75 => $this->t('25% / 75%'),
      DemoLayout::ROW_WIDTH_50_50 => $this->t('50% / 50%'),
      DemoLayout::ROW_WIDTH_75_25 => $this->t('75% / 25%'),
    ];
  }

  /**
   * {@inheritdoc}
   */
  protected function getDefaultColumnWidth(): string {
    return DemoLayout::ROW_WIDTH_50_50;
  }

}
PHP
We can now check out the admin interface and see our custom form in action.
One column options:
Two column options:
Add twig templates
The last step is to provide the twig templates that were declared earlier in the demo_layout.layouts.yml file. The variables to be aware of are:
- content: contains the block content for this layout, separated by region
- attributes: contains the custom classes that were passed in the base class build function
- settings: contains the submitted form values from the settings form
layouts/one-column/layout--one-column.html.twig
{#
/**
 * @file
 * Default theme implementation to display a one-column layout.
 *
 * Available variables:
 * - content: The content for this layout.
 * - attributes: HTML attributes for the layout <div>.
 * - settings: The custom form settings for the layout.
 *
 * @ingroup themeable
 */
#}
{%
  set row_classes = [
    'row',
    'demo-layout__row',
    'demo-layout__row--one-column'
  ]
%}
{% if content %}
  <div{{ attributes.addClass(row_classes|join(' ')) }}>
    <div{{ region_attributes.first.addClass('column', 'column--first') }}>
      {{ content.first }}
    </div>
  </div>
{% endif %}
Twig
layouts/two-column/layout--two-column.html.twig
{#
/**
 * @file
 * Default theme implementation to display a two-column layout.
 *
 * Available variables:
 * - content: The content for this layout.
 * - attributes: HTML attributes for the layout <div>.
 * - settings: The custom form settings for the layout.
 *
 * @ingroup themeable
 */
#}
{# Get the column widths. #}
{% set column_widths = settings.column_width|split('-') %}
{%
  set row_classes = [
    'row',
    'demo-layout__row',
    'demo-layout__row--two-column'
  ]
%}
{% if content %}
  <div{{ attributes.addClass(row_classes|join(' ')) }}>
    {% if content.first %}
      <div{{ region_attributes.first.addClass('column', 'column--' ~ column_widths.0, 'column--first') }}>
        {{ content.first }}
      </div>
    {% endif %}
    {% if content.second %}
      <div{{ region_attributes.second.addClass('column', 'column--' ~ column_widths.1, 'column--second') }}>
        {{ content.second }}
      </div>
    {% endif %}
  </div>
{% endif %}
Twig
Notice settings.column_width was passed with a string: 75-25. We need to split it and place each value on its respective column, which results in the following output:
<div class="demo-layout__row-width--75-25 row demo-layout__row demo-layout__row--two-column">
  <div class="column column--75 column--first"></div>
  <div class="column column--25 column--second"></div>
</div>
HTML
Since these are custom classes, and we haven’t written any CSS, these columns do not have any styling. Depending on your preference, you can implement your own custom column styles or wire up a grid framework such as Bootstrap in order to get the columns to properly size themselves.
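If you roll your own, a small flexbox ruleset keyed to the generated classes is enough to get started (a sketch only; adjust breakpoints and gutters to taste):
/* Columns flex within the row; widths come from the generated classes. */
.demo-layout__row {
  display: flex;
  flex-wrap: wrap;
}
.column--25 { flex: 0 0 25%; }
.column--50 { flex: 0 0 50%; }
.column--75 { flex: 0 0 75%; }
.column--100 { flex: 0 0 100%; }
CSS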
Wrapping it up
You should be at a point where you have an idea of how to create custom settings in order to theme layout builder sections. You can take this method and extend it however you need to for your particular project. There’s no definitive best way to do anything in the world of web development, and Layout Builder is no exception to that rule. It’s a great addition to Drupal’s core functionality, but for larger sites, it likely won’t be and shouldn’t be the only way you handle layout. Much like Drupal itself though, as more and more people use it, Layout Builder will only become stronger, more robust, more fully-featured, and better documented. If it doesn’t seem like a good fit for you right now, it may become a better fit as it grows. If it does seem like a good fit, be ready to get your hands dirty!
The full demo layouts module with all of the custom settings is available here: https://github.com/oomphinc/layout-builder-demo/tree/master/moduleexamples/demolayout
THE BRIEF
The RISD Museum publishes a document for every exhibition in the museum. Most of them are scholarly essays about the historical context around a body of work. Some of them are interviews with the artist or a peek into the process behind the art. Until very recently, they have not had a web component.
The time, energy, and investment in creating a print publication were becoming unsustainable. The limitations of the printed page in a media-driven culture are a large drawback as well. For the last printed exhibition publication, the Museum created a one-off web experience — but that was not scalable.
The Museum was ready for a modern publishing platform that could be a visually-driven experience, not one that would require coding knowledge. They needed an authoring tool that emphasized time-based media — audio and video — to immediately set it apart from printed publications of their past. They needed a visual framework that could scale and produce a publication with 4 objects or one with 400.
THE APPROACH
A Flexible Design System
Ziggurat was born of two parents — Oomph provided the design system architecture and the programmatic visual options while RISD provided creative inspiration. Each team influenced the other to make a very flexible system that would allow any story to work within its boundaries. Multimedia was part of the core experience — sound and video are integral to expressing some of these stories.
The process of talking, architecting, designing, then building, then using the tool, then tweaking the tool pushed and pulled both teams into interesting places. As architects, we started to get very excited by what we saw their team doing with the tool. The original design ideas that provided the inspiration got so much better once they became animated and interactive.
Design/content options include:
- Multiple responsive column patterns inside row containers
- Additionally, text fields have the ability to display as multiple columns
- “Hero” rows where an image is the primary design driver, and text/headline is secondary. Video heroes are possible
- Up to 10 colors to be used as row backgrounds or text colors
- Choose typefaces from Google Fonts for injection publication-wide or override on a page-by-page basis
- Rich text options for heading, pull-quotes, and text colors
- Video, audio, image, and gallery support inside any size container
- Video and audio player controls in a light or dark theme
- Autoplaying videos (where browsers allow) while muted
- Images optionally have the ability to zoom in place (hover or touch the image to see it scale by 200%) or open more
There are 8 chapters total in RAID the Icebox Now, plus four supporting pages. For those who know library systems and scholarly publications, notice the citations and credits for each chapter; a few liberally use the footnote system. Each page in this publication is rich with content, both written and visual.
RAPID RESPONSE
An Unexpected Solution to a New Problem
The story does not end with the first successful online museum publication. In March of 2020, COVID-19 gripped the nation and colleges cut their semesters short or moved classes online. Students who would normally have an in-person end-of-year exhibition in the museum no longer had the opportunity.
Spurred on by the Museum, the university invested in upgrades to the Publication platform that could support 300+ new authors in the system (students) and specialized permissions to limit each author's access to only their own content. A few new features were fast-tracked, and an innovative ability for some authors to add custom JavaScript to department landing pages opened the platform up for experimentation. The result was two online exhibitions that launched six weeks after the concepts were approved: one for 270+ graduate students and one for 450+ undergraduates.