The Drupal Association brought a new challenge to the Drupal community this past summer. At the beginning of May 2024, Dries Buytaert, the founder and leading visionary of the Drupal platform, announced an ambitious plan codenamed Starshot. The community, Oomph included, rapidly came together around the concept and started planning how to make this vision of the future a reality.
What is Starshot/Drupal CMS?
Codename Starshot is now known as Drupal CMS. Drupal is a free, open-source content management system (CMS) that authors and developers use to build and maintain websites. Drupal has been around since 2001, and it has historically focused on being a developer-friendly platform that supports complex integrations and custom features.
Drupal CMS is a reimagining of Drupal for a wider market. Drupal already successfully supports the complexity that governments, high-volume editorial sites, and membership organizations require. But the barrier to entry for those who wanted to start with a small, simple site was too high.
Drupal CMS is the community’s solution to drastically lower the barrier to entry by providing a new onboarding and page-building experience, recipes for common features, advanced SEO features, and “AI Agents” that assist authors with content migration and site-building acceleration. Dries challenged the community to start building towards a working prototype in less than 4 months, in time to demonstrate significant progress for the audience at DrupalCon Barcelona in mid-September.
The Contact Form Track
The Contact Form is an officially recommended recipe. As the name suggests, its purpose is to install the necessary modules and default content to support a useful but simple Contact Form.
The primary user persona for Drupal CMS is a non-technical marketer or digital strategist: someone who wants to set up a simple website to promote themselves, a product, or a service. A Contact Form should start simple but be ready for customization, such as integrations with popular email newsletter services for exporting contacts and collecting email opt-ins.
Research and Competitive Analysis
Drupal CMS aims to compete with juggernauts like WordPress and relative newcomers like SquareSpace, Wix, and Webflow. To create a Contact Form that could compete with these well-known CMSs, our first step was to do some competitive research.
We went in two directions for the competitive analysis (Figma whiteboard). First, we researched the kinds of experiences and default contact forms that competitor CMSs provide. Second, we took stock of common Contact Form patterns, including those from well-known SaaS products, to see the kinds of fields that sales lead generation forms typically leverage. With both of these initiatives, we learned a few things quickly:
- The common fields for a simple Contact Form are generally consistent from platform to platform
- More complex sales lead forms also had much in common, though every form had something custom that directly related to the product offered
- WordPress does not have a Contact Form solution out of the box! Site owners need to research commonly used plugins to achieve this
Our approach was starting to take shape. We internally documented our decisions and high-value MVP requirements and presented them to the advisory board for feedback. With that, we were off to create the start of our Contact Form recipe.
Recipe and Future Phases
Phil Frilling started the Contact Form recipe, which is currently under peer review. The recipe is barebones for Phase 1 and will install the required modules to support a default Contact Form and email the site owner when messages are received. Once the initial recipe is accepted, a round of testing, documentation, and additional UI in a custom module may be required.
Our plans include additional fields set as optional for the site owner to turn on or off as they need. Some customization will be supported in a non-technical, user-friendly way, but all the power of Drupal’s Webform module will be available to those who want to dig deeper into customizing their lead forms.
In the short term, we are proposing:
- Database storage of contacts that safeguards valuable leads that come in through forms
- Quick integrations with common CRMs and Newsletter providers
- Enhanced point-and-click admin UI through the in-progress Experience Builder
- Advanced fields to handle specialty data, like price ranges, date ranges, and similar
- Conditional defaults: During the initial setup, a site owner who specifies an Editorial site gets one default Contact Form, while someone who specifies E-commerce gets another
- Feedback mechanism to request new fields
Next stop, the Moon
DrupalCon Barcelona took place last week, September 24 through 27, 2024, and the Drupal CMS prototype was displayed for all to see. Early 2025 is the next target date for a market-ready version of Drupal CMS. The community is continuing to push hard to create a fantastic future for the platform and for authors who are dissatisfied with the current CMS marketplace.
Oomph’s team will continue to work on the Contact Form Track while contributing in other ways with the full range of skills we have. The great part about such a large and momentous initiative as Drupal CMS is that the whole company can be involved, and each can contribute from their experience and expertise.
We’ll continue to share our progress in the weeks to come!
Thanks!
Track Lead J. Hogue, with Philip Frilling as contributing engineer and Akili Greer and Rachel Hart as researchers. Thanks again to Rachel Hart for bringing the Contact Form track lead opportunity to Oomph for consideration.
Oomph has been quiet about our excitement for artificial intelligence (A.I.). While the tech world has exploded with new A.I. products, offerings, and add-ons to existing product suites, we have been formulating an approach to recommend A.I.-related services to our clients.
One of the biggest reasons we have been quiet is the complexity and fast pace of change in the landscape. Giant companies have been experimenting with A.I., with some loud public failures. The investment and venture capital community is hyped on A.I. but has recently become cautious as the promised productivity and profit gains have not materialized. It is a familiar boom-then-bust cycle of attention that we have seen before — most recently with AR/VR after the Apple Vision Pro launch five months ago, and previously with the Metaverse, Blockchain/NFTs, and Bitcoin.
There are many reasons to be optimistic about applications for A.I. in business, and there continue to be many reasons to be cautious as well. Just like any digital tool, A.I. has pros and cons, and Oomph has carefully evaluated each. We are sharing our internal thoughts in the hope that your business can use the same criteria when considering a potential investment in A.I.
Using A.I.: Not If, but How
Most digital tools now have some kind of A.I. or machine learning built into them. A.I. has become ubiquitous and embedded in many systems we use every day. Given investor hype for companies that are leveraging A.I., more and more tools are likely to incorporate A.I.
This is not a new phenomenon. Grammarly has been around since 2015, and by many measures it is an A.I. tool — it is trained on human-written language to provide contextual corrections and suggestions for improvements.
Recently, though, embedded A.I. has exploded across markets. Many of the tools Oomph team members use every day have A.I. embedded in them, across sales, design, engineering, and project management — from Google Suite and Zoom to Github and Figma.
The market has already decided that business customers want access to time-saving A.I. tools. Some welcome these options, and others will use them reluctantly.
The Risks That A.I. Poses
Every technological breakthrough comes with risks. Some pundits (both for and against A.I. advancements) have likened its emergence to the Industrial Revolution. A similarly high level of positive significance is possible, while the cultural, societal, and environmental repercussions could follow a similar trajectory as well.
A.I. has its downsides. When evaluating A.I. tools as a solution to our clients’ problems, we keep this list of drawbacks handy so that we can review it and think about how to mitigate each negative effect:
- A.I. is built upon biased and flawed data
- Bias & flawed data lead to the perpetuation of stereotypes
- Flawed data leads to Hallucinations & harms Brands
- Poor A.I. answers erode Consumer Trust
- A.I.’s appetite for electricity is unsustainable
We have also found that our company values are a lens through which we can evaluate new technology and any proposed solutions. Oomph has three cultural values that form the center of our approach and our mission, and we add our stated 1% For the Planet commitment to that list as well:
- Smart
- Driven
- Personal
- Environmentally Committed
For each of A.I.’s drawbacks, we use the lens of our cultural values to guide our approach to evaluating and mitigating those potential ill effects.
A.I. is built upon biased and flawed data
At its core, A.I. is built upon terabytes of data and billions, if not trillions, of individual pieces of content. Training data for Large Language Models (LLMs) like ChatGPT, Llama, and Claude encompasses mostly public content, along with special subscriptions through relationships with data providers like the New York Times and Reddit. Image generation tools like Midjourney and Adobe Firefly require billions of images for training and have skirted similar copyright issues while gobbling up as much free public data as they can find.
Because LLMs require such a massive amount of data, it is impossible to curate those data sets to only what we may deem as “true” facts or the “perfect” images. Even if we were able to curate these training sets, who makes the determination of what to include or exclude?
The training data would need to be free of bias and free of sarcasm (a very human trait) for it to be reliable and useful. We’ve seen this play out with sometimes hilarious results. Google “A.I. Overviews” have told people to put glue on pizza to prevent the cheese from sliding off or to eat one rock a day for vitamins & minerals. Researchers and journalists traced these suggestions back to the training data from Reddit and The Onion.
Information architects have a saying: “All Data is Dirty.” It means no one creates “perfect” data, where every entry is reviewed, cross-checked for accuracy, and evaluated by a shared set of objective standards. Human bias and accidents always enter the data. Even the simple act of deciding what data to include (and therefore, which data is excluded) is bias. All data is dirty.
Bias & flawed data lead to the perpetuation of stereotypes
Many of the drawbacks of A.I. are interrelated — dirty data, for instance, is directly tied to D.E.I. Gender and racial biases surface in the answers A.I. provides, and A.I. will perpetuate the harms those biases produce as the tools become easier to use and more prevalent. These are harms that society has only recently begun grappling with in a deep and meaningful way, and A.I. could roll back much of that progress.
We’ve seen this start to happen. Early reports from image creation tools describe a European white male bias inherent in these tools — ask for an image of someone in a specific occupation, and receive mostly white males in the results, unless that occupation is stereotypically “women’s work.” When A.I. is used to perform HR tasks, the software often advances those it perceives as male more quickly and penalizes applications that contain female names and pronouns.
The bias is in the data and very, very difficult to remove. The entirety of digital written language over-represents privileged white Europeans who could afford the tools to become authors. This comparably small pool of participants is also predominantly male, and the content they have created emphasizes white male perspectives. Curating the bias out of the training data to create an equally representative pool is nearly impossible, especially when you consider the exponentially larger sets of data that new LLMs require for training.
Further, D.E.I. overflows into environmental impact. Last fall, the Fifth National Climate Assessment outlined the country’s climate status. Not only is the U.S. warming faster than the rest of the world, but the assessment directly linked reductions in greenhouse gas emissions with reducing racial disparities. Climate impacts are felt most heavily in communities of color and low-income communities; therefore, climate justice and racial justice are directly related.
Flawed data leads to “Hallucinations” & harms Brands
“Brand Safety” and How A.I. can harm Brands
Brand safety is the practice of protecting a company’s brand and reputation by monitoring online content related to the brand. This includes content the brand is directly responsible for creating about itself as well as the content created by authorized agents (most typically customer service reps, but now AI systems as well).
The data that comes out of an A.I. agent reflects on the brand employing it. A real-life example is Air Canada: its A.I. chatbot gave a customer an answer that contradicted the information in the URL it provided. The customer chose to believe the A.I. answer, while the company argued it could not be held responsible if the customer didn’t follow the URL to the more authoritative information. In court, the customer won and Air Canada lost, resulting in bad publicity for the company.
Brand safety can also be compromised when a third party feeds A.I. tools proprietary client data. Some terms-and-conditions statements for A.I. tools are murky, while others are direct. Midjourney’s terms state:
“By using the Services, You grant to Midjourney […] a perpetual, worldwide, non-exclusive, sublicensable no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense, and distribute text and image prompts You input into the Services”
Midjourney’s Terms of Service Statement
That makes it pretty clear that by using Midjourney, you implicitly agree that your data will become part of their system.
The implication that our clients’ data might become available to everyone is a huge professional risk that Oomph avoids. Even using ChatGPT to summarize content covered by an NDA can open hidden risks.
What are “Hallucinations” and why do they happen?
It’s important to remember how current A.I. chatbots work. Like a smartphone’s predictive text tool, LLMs form statements by stitching together words, characters, and numbers based on the probability of each unit succeeding the previously generated units. The predictions can be very complex, adhering to grammatical structure and situational context as well as the initial prompt. Given this, they do not truly understand language or context.
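To make that mechanic concrete, here is a toy sketch in Python of next-unit prediction. The tiny vocabulary and its probabilities are invented for illustration; real models learn billions of weights from training data, but the core loop (sample the next word in proportion to its learned probability, never checking it against reality) is the same.

```python
import random

# Toy "language model": for each current word, the learned probabilities
# of the next word. Real LLMs learn billions of such weights.
NEXT_WORD_PROBS = {
    "the":    {"cheese": 0.5, "pizza": 0.3, "glue": 0.2},
    "cheese": {"slides": 0.6, "melts": 0.4},
    "pizza":  {"tastes": 0.7, "slides": 0.3},
    "glue":   {"holds": 0.8, "tastes": 0.2},
}

def generate(prompt: str, max_words: int = 10) -> str:
    words = prompt.lower().split()
    while len(words) < max_words:
        choices = NEXT_WORD_PROBS.get(words[-1])
        if not choices:
            break  # no learned continuation; stop generating
        # Sample the next word in proportion to its probability. Nothing
        # here checks whether the resulting statement is *true*.
        next_word = random.choices(list(choices), weights=list(choices.values()))[0]
        words.append(next_word)
    return " ".join(words)

print(generate("the"))  # e.g. "the glue tastes" -- fluent, never fact-checked
```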
At best, A.I. chatbots are a mirror that reflects how humans sound without a deep understanding of what any of the words mean.
A.I. systems try their best to provide an accurate and truthful answer without a complete understanding of the words they are using. A “hallucination” can occur for a variety of reasons, and it is not always possible to trace a hallucination’s origin or reverse-engineer it out of a system.
As many recent news stories attest, hallucinations are a huge problem with A.I. Companies like IBM and McDonald’s couldn’t get hallucinations under control and pulled A.I. from their stores because of the headaches it caused. If they can’t make their investments in A.I. pay off, it makes us wonder about the usefulness of A.I. for consumer applications in general. And all of these gaffes hurt consumers’ perception of the brands and the services they provide.
Poor A.I. answers erode Consumer Trust
The aforementioned problems with A.I. are well-known in the tech industry. In the consumer sphere, A.I. has only just started to break into the public consciousness. Consumers are outcome-driven. If A.I. is a tool that can reliably save them time and reduce work, they don’t care how it works, but they do care about its accuracy.
Consumers are also misinformed or have only a surface-level understanding of how A.I. works. In one study, only 30% of people correctly identified six different applications of A.I. People don’t have a complete picture of how pervasive A.I.-powered services already are.
The news media loves a good fail story, and A.I. has been providing plenty of those. With most of the media coverage of A.I. being either fear-mongering (“A.I. will take your job!”) or about hilarious hallucinations (“A.I. suggests you eat rocks!”), consumers will be conditioned to mistrust products and tools labeled “A.I.”
And for those who have had a first-hand experience with an A.I. tool, a poor A.I. experience makes all A.I. seem poor.
A.I.’s appetite for electricity is unsustainable
The environmental impact of our digital lives is invisible. Cloud services that store our lifetime of photographs sound like feathery, lightweight repositories, but they are actually giant, electricity-guzzling warehouses full of heat-producing servers. Cooling these data factories and providing the electricity to run them is a major infrastructure issue for cities around the country. And then A.I. came along.
While difficult to quantify, there are some scientists and journalists studying this issue, and they have found some alarming statistics:
- Training GPT-3 required more than 1,200 MWh, which led to 500 metric tons of greenhouse gas emissions — equivalent to the energy used by 1 million homes in one hour and the emissions of driving 1 million miles. GPT-4 has even greater needs.
- Research suggests a single generative A.I. query consumes energy at four or five times the magnitude of a typical search engine request.
- Northern Virginia needs the equivalent of several large nuclear power plants to serve all the new data centers planned and under construction.
- Even as consumers shift demand away from direct fossil fuel use (think electric cars and more electric heat and cooking), power plant executives are lobbying to keep coal-powered plants around longer to meet increased electricity demand. Already, soaring power consumption is delaying coal plant closures in Kansas, Nebraska, Wisconsin, and South Carolina.
- Google’s emissions grew 48% in the past five years, in large part because of its wide deployment of A.I.
While the consumption needs are troubling, quickly creating more infrastructure to support these needs is not possible. New energy grids take multiple years and millions if not billions of dollars of investment. Parts of the country are already straining under the weight of our current energy needs and will continue to do so — peak summer demand is projected to grow by 38,000 megawatts nationwide in the next five years.
While a data center can be built in about a year, it can take five years or longer to connect renewable energy projects to the grid. Most new power projects built in 2024 are clean energy (solar, wind, hydro), but they are not being built fast enough. And utilities note that data centers need power 24 hours a day, something most clean sources can’t provide. It is heartbreaking that carbon-producing fuels like coal and gas are being kept online to support our data needs.
Oomph’s commitment to 1% for the Planet means that we want to design specific uses for A.I. instead of very broad ones. The environmental impact of A.I.’s energy demands is a major factor we consider when deciding how and when to use A.I.
Using our Values to Guide the Evaluation of A.I.
As we previously stated, our company values provide a lens through which we can evaluate A.I. and look to mitigate its negative effects. Many of the solutions below cross over to mitigate more than one effect, and all represent a shared commitment to extracting the best results from any tool in our set.
Smart
- Limit direct consumer access to the outputs of any A.I. tools, and put a well-trained human in the middle as curator. Despite the pitfalls of human bias, it’s better to be aware of them rather than allow A.I. to run unchecked
- Employ 3rd-party solutions with a proven track record of hallucination reduction
Driven
- When possible, introduce a second proprietary dataset that can counterbalance training data or provide additional context for generated answers that are specific to the client’s use case and audience
- Restrict A.I. answers when qualifying, quantifying, or categorizing other humans, directly or indirectly
Personal
- Always provide training to authors using A.I. tools and be clear with help text and microcopy instructions about the limitations and biases of such datasets
1% for the Planet
- Limit the amount of A.I. an interface pushes at people without first allowing them to opt in — A.I. should not be the default
- Leverage “green” data centers if possible, or encourage the client using A.I. to purchase carbon offset credits
In Summary
While this article may read as strongly anti-A.I., we still have optimism and excitement about how A.I. systems can be used to augment and support human effort. Tools created with A.I. can make tasks and interactions more efficient, can help non-creatives jumpstart their creativity, and can eventually become agents that assist with complex tasks that are draining and unfulfilling for humans to perform.
For consumers or our clients to trust A.I., however, we need to provide ethical evaluation criteria. We cannot use A.I. as a solve-all tool when it has clearly displayed limitations. We aim to continue to learn from others, experiment ourselves, and evaluate appropriate uses for A.I. with a clear set of criteria that aligns with our company culture.
To have a conversation about how your company might want to leverage A.I. responsibly, please contact us anytime.
Additional Reading List
- “The Politics of Classification” (YouTube). Dan Klyn, guest lecture at the UM School of Information Architecture, 09 April 2024. A review of IA problems vs. A.I. problems, how classification is problematic, and how mathematical smoothness is unattainable.
- “Models All the Way Down.” Christo Buschek and Jer Thorp, Knowing Machines. A fascinating visual deep dive into training sets and the problematic ways in which these sets were curated by A.I. or humans, both with their own pitfalls.
- “AI spam is already starting to ruin the internet.” Katie Notopoulos, Business Insider, 29 January 2024. When garbage results flood Google, it’s bad for users — and for Google.
- “Racial Discrimination in Face Recognition Technology.” Harvard University, 24 October 2020. The title of this article explains itself well.
- “Women are more likely to be replaced by AI, according to LinkedIn.” Fast Company, 04 April 2024. Many workers are worried that their jobs will be replaced by artificial intelligence, and a growing body of research suggests that women have the most cause for concern.
- “Brand Safety and AI.” Writer.com. An overview of what brand safety means and how it is usually governed.
- “AI and designers: the ethical and legal implications.” UX Design, 25 February 2024. Not only can using training data potentially introduce legal troubles, but submitting your data to be processed by A.I. does as well.
- “Can Generative AI’s Hallucination Problem Be Overcome?” Louis Poirier, C3.ai, 31 August 2023. A company claims to have a solution for A.I. hallucinations but doesn’t completely describe how in its marketing.
- “Why AI-generated hands are the stuff of nightmares, explained by a scientist.” Science Focus, 04 February 2023. Whether it’s hands with seven fingers or extra-long palms, A.I. just can’t seem to get it right.
- “Sycophancy in Generative-AI Chatbots.” NN/g, 12 January 2024. Beyond hallucinations, LLMs have other problems that can erode trust: “Large language models like ChatGPT can lie to elicit approval from users. This phenomenon, called sycophancy, can be detected in state-of-the-art models.”
- “Consumer attitudes towards AI and ML’s brand usage U.S. 2023.” Valentina Dencheva, Statista, 09 February 2023.
- “What the data says about Americans’ views of artificial intelligence.” Pew Research Center, 21 November 2023.
- “Exploring the Spectrum of ‘Needfulness’ in AI Products.” Emily Campbell, The Shape of AI, 28 March 2024.
- “AI’s Impact On The Future Of Consumer Behavior And Expectations.” Jean-Baptiste Hironde, Forbes, 31 August 2023.
- “Is generative AI bad for the environment? A computer scientist explains the carbon footprint of ChatGPT and its cousins.” The Conversation, 23 May 2023.
Everyone’s been saying it (and, frankly, we tend to agree): We are currently in unprecedented times. It may feel like a cliché, but truly, when you stop and look around right now, not since the advent of the first consumer-friendly smartphone in 2007 has the digital web design and development industry seen such vast technological advances.
A few of these innovations have been kicking around for decades, but they’ve only moved into the greater public consciousness in the past year. Versions of artificial intelligence (AI) and chatbots have been around since the 1960s, and even virtual reality (VR) and augmented reality (AR) have been attempted with some success since the 1990s (Thad Starner). But now, these technologies have reached a tipping point as companies join the rush to create new products that leverage AI and VR/AR.
What should we do with all this change? Let’s think about the immediate future for a moment (not the long-range future, because who knows what that holds). We at Oomph have been thinking about how we can start to use this new technology now — for ourselves and for our clients. Which ideas that seemed far-fetched only a year ago are now possible?
For this article, we’ll take a closer look at VR/AR, two digital technologies that either layer on top of or fully replace our real world.
VR/AR and the Vision Pro
Apple’s much-anticipated launch into the headset game shipped in early February 2024. With it came much hype, most centered around the price tag and limited ecosystem (for now). But after all the dust has settled, what has this flagship device told us about the future?
Meta, Oculus, Sony, and others have been in this space since 2017, but the Apple device debuted a better experience in many respects. For one, Apple nailed the 3D visuals, using many cameras and low latency to reproduce a digital version of the real world around the wearer — in real time. All of this tells us that VR headsets are moving beyond gaming applications and becoming more mainstream for specific types of interactions and experiences, like virtually visiting the Eiffel Tower or watching the upcoming Summer Olympics.
What Is VR/AR Not Good At?
Comfort
Apple’s version of the device is large, uncomfortable, and too heavy to wear for long, and its competitors are not much better. Devices will steadily become smaller and more powerful, but for now, wearing one as an infinite virtual monitor for the entire workday is impractical.
Space
VR generally needs space for the wearer to move around. The Vision Pro is very good at overlaying virtual items into the physical world around the wearer, but for an application that requires the wearer to be fully immersed in a virtual world, it is a poor experience to pantomime moving through a confined space. Immersion is best when the movements required to interact are small or when the wearer has adequate space to participate.
Haptics
“Haptic” feedback is the sense that physical objects provide. Think about turning a doorknob: You feel the surface, the warmth or coolness of the material, how the object can be rotated (as opposed to pulled like a lever), and the resistance from the springs.
Phones provide small amounts of haptic feedback in the form of vibrations and sounds. Haptics are on the horizon for many VR platforms but have yet to be built into headset systems. For now, haptics are provided by add-on products like this haptic gaming chair.
What Is VR/AR Good For?
Even without haptics and free spatial range, immersion and presence in VR is very effective. It turns out that the brain only requires sight and sound to create a believable sense of immersion. Have you tried a virtual roller coaster? If so, you know it doesn’t take much to feel a sense of presence in a virtual environment.
Live Events
VR and AR’s most promising applications are with live in-person and televised events. In addition to a flat “screen” showing the event, AR-generated spatial representations of the action and new ways to interact with it are expanding. A prototype video with Formula 1 racing is a great example of how this application can increase engagement with these events.
Imagine if your next virtual conference were available in VR and AR. How much more immersed would you feel?
Museum and Cultural Institution Experiences
Similar to live events, AR can enhance museum experiences greatly. With AR, viewers can look at an object in its real space — for example, a sarcophagus would actually appear in a tomb — and access additional information about that object, like the time and place it was created and the artist.
Museums are already experimenting with experiences that leverage your phone’s camera or VR headsets. Some have experimented with virtually displaying artwork owned by other museums to show a wider range of an artist’s work within a single exhibition.
With the expansion of personal VR equipment like the Vision Pro, the next obvious step is to bring the museum to your living room, much like the National Gallery in London bringing its collection into public spaces (see bullet point #5).
Try Before You Buy (TBYB)
Using a version of AR with your phone to preview furniture in your home is not new. But what other experiences can benefit from an immersive “try before you buy” experience?
- Test-drive a new car with VR, or experience driving a real car on a real track in a mixed-reality game. As haptic feedback becomes more prevalent, the experience of test-driving will become even closer to the real thing.
- Even for small purchases, brands have used VR and AR successfully to let shoppers trial products, including AR for fashion retail, eyeglass virtual try-ons, and preview apps for cosmetics. Even do-it-yourself retailer Lowe’s experimented with fully haptic VR in 2018. But those are all big-name retailers. The real future for VR/AR-powered TBYB experiences lies in letting smaller companies jump into the space, as Shopify enabled for its merchants.
- Visit destinations before traveling. With VR, you could visit fragile ecosystems without affecting the physical environment or get a sense of the physical space before traveling to a new spot. Visitors who require special assistance could preview the amenities beforehand. Games have been developed for generic experiences like deep sea diving, but we expect more specific travel destinations to provide VR experiences of their own, like California’s Redwood Forest.
What’s Possible With VR/AR?
The above examples of what VR/AR is good at are just a few ways the technology is already in use — each of which can be a jumping-off point for leveraging VR/AR for your own business.
But what are some new frontiers that have yet to be fully explored? What else is possible?
- What if a digital sculptor or 3D model maker could create new three-dimensional models in a three-dimensional virtual space? The application for architects and urban planners is just as impactful.
- What if medical training could be immersive, anatomically accurate, and reduce the need for cadavers? What if rare conditions could be simulated to increase exposure and aid in accurate diagnoses?
- What if mental health disorders could be treated with the aid of immersive virtual environments? Exposure therapy can aid in treating and dealing with anxiety, depression, and PTSD.
- What if highly skilled workers could have technical mentors virtually assist and verify the quality of a build? Aerospace, automotive, and other manufacturing industry experts could visit multiple locations virtually and go where they’re needed most.
- What if complex, mathematics-based sciences could provide immersive, data-manipulable environments for exploration? Think of the possibilities for fields like geology, astronomy, and climate science.
- What if movies were told from a more personal point of view? What if the movie viewer felt more like a participant? How could someone’s range of experiences expand with such immersive storytelling?
Continue the AR/VR Conversation
The Vision Pro hasn’t taken the world by storm, as Apple likely hoped. It may still be too early for the market to figure out what AR/VR is good for. But we think it won’t go away completely, either. With big investments like Apple’s, it is reasonable to assume the next version will find a stronger foothold in the market.
Here at Oomph, we’ll keep pondering and researching impactful ways that tomorrow’s technology can help solve today’s problems. We hope these ideas have inspired some of your own explorations, and if so, we’d love to hear more about them.
Drop us a line and let’s chat about how VR/AR could engage your audience.
More than two years after Google announced the launch of its powerful new website analytics platform, Google Analytics 4 (GA4), the final countdown to make the switch is on.
GA4 will officially replace Google’s previous analytics platform, Universal Analytics (UA), on July 1, 2023. It’s the first major analytics update from Google since 2012 — and it’s a big deal. As we discussed in a blog post last year, GA4 uses big data and machine learning to provide a next-generation approach to measurement, including:
- Unifying data across multiple websites and apps
- A new focus on events vs. sessions
- Cookieless user tracking
- More personalized and predictive analytics
At Oomph, we’ve learned a thing or two about making the transition seamless while handling GA4 migrations for our clients – including a few platform “gotchas” that are definitely better to know in advance. Before you start your migration, do yourself a favor and explore our GA4 setup guide.
Your 12-Step GA4 Migration Checklist
Step 1: Create a GA4 Analytics Property and Implement Tagging
The Gist: Launch the GA4 setup assistant to create a new GA4 property for your site or app. For sites that already have UA installed, Google will automatically create GA4 properties beginning in March 2023 (unless you opt out). If you’re migrating from UA, you can connect your UA property to your GA4 property to reuse the existing Google tracking tag on your site. For new sites, you’ll need to add the tag directly to your site or via Google Tag Manager.
The Gotcha: During property setup, Google will ask which data streams you’d like to add (websites, apps, etc.). This is simple if you’re just tracking one site but gets more complex for organizations with multiple properties, like educational institutions or retailers with individual locations. While UA allowed you to separate data streams by geography or line of business, GA4 handles this differently. This Google guide can help you choose the ideal configuration for your business model.
Step 2: Update Your Data Retention Settings
The Gist: GA4 lets you control how long you retain data on users and events before it’s automatically deleted from Google’s servers. For user-level data, including conversions, you can hang on to data for up to 14 months. For other event data, you have the option to retain the information for 2 months or 14 months.
The Gotcha: The data retention limits are much shorter than UA, which allowed you to keep Google-signals data for up to 26 months in some cases. The default retention setting in GA4 is 2 months for some types of data – a surprisingly short window, in our opinion – so be sure to extend it to avoid data loss.
Step 3: Initialize BigQuery
The Gist: Have a lot of data to analyze? GA4 integrates with BigQuery, Google’s cloud-based data warehouse, so you can store historical data and run analyses on massive datasets. Google walks you through the steps here.
The Gotcha: Since GA4 has tight time limits on data retention as well as data limits on reporting, skipping this step could compromise your reporting. BigQuery is a helpful workaround for storing, analyzing, and visualizing large amounts of complex data.
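As a sketch of what that workaround looks like, the Python snippet below queries a GA4 BigQuery export with Google’s google-cloud-bigquery client library. The project ID and property number are placeholders; GA4’s export writes one events_YYYYMMDD table per day into a dataset named analytics_<property_id>.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # placeholder project ID

# Count events per day and name across the last week of daily export tables.
query = """
    SELECT event_date, event_name, COUNT(*) AS event_count
    FROM `your-gcp-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN
        FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
        AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
    GROUP BY event_date, event_name
    ORDER BY event_date, event_count DESC
"""

for row in client.query(query).result():
    print(row.event_date, row.event_name, row.event_count)
```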
Step 4: Configure Enhanced Measurements
The Gist: GA4 measures much more than pageviews – you can now track actions like outbound link clicks, scrolls, and engagements with YouTube videos automatically through the platform. When you set up GA4, simply check the box for any metrics you want GA4 to monitor. You can still use Google tags to customize tracking for other types of events or use Google’s Measurement Protocol for advanced tracking.
The Gotcha: If you were previously measuring events through Google tags that GA4 will now measure automatically, take the time to review which ones to keep to avoid duplicating efforts. It may be simpler to use GA4 tracking – giving you a good reason to do that Google Tag Manager cleanup you’ve been meaning to get to.
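If you go the Measurement Protocol route for advanced tracking, here is a minimal sketch in Python of sending a server-side event. The measurement ID, API secret, client ID, and event name are placeholders; the endpoint and payload shape follow the GA4 Measurement Protocol.

```python
# pip install requests
import requests

MEASUREMENT_ID = "G-XXXXXXXXXX"  # placeholder: your GA4 data stream's measurement ID
API_SECRET = "your-api-secret"   # placeholder: created under the data stream's MP settings

def send_event(client_id: str, name: str, params: dict) -> int:
    """Send one server-side event to GA4 via the Measurement Protocol."""
    response = requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json={
            "client_id": client_id,  # ties the event to a browser or device
            "events": [{"name": name, "params": params}],
        },
        timeout=10,
    )
    # GA4 returns a 2xx even for malformed events; use the /debug/mp/collect
    # endpoint while developing to validate payloads.
    return response.status_code

# Example: record a newsletter signup that happens in a backend process.
send_event("555.123", "newsletter_signup", {"method": "footer_form"})
```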
Step 5: Configure Internal and Developer Traffic Settings
The Gist: To avoid having employees or IT teams cloud your insights, set up filters for internal and developer traffic. You can create up to 10 filters per property.
The Gotcha: Setting up filters for these users is only the first step – you’ll also need to toggle the filter to “Active” for it to take effect (a step that didn’t exist in UA). Make sure to turn yours on for accurate reporting.
Step 6: Migrate Users
The Gist: If you were previously using UA, you’ll need to migrate your users and their permission settings to GA4. Google has a step-by-step guide for migrating users.
The Gotcha: Migrating users is a little more complex than just clicking a button. You’ll need to install the GA4 Migrator for Google Analytics add-on, then decide how to migrate each user from UA. You also have the option to add users manually.
Step 7: Migrate Custom Events
The Gist: Event tracking has fundamentally changed in GA4. While UA offered three default parameters for events (eventCategory, eventAction, and eventLabel), GA4 lets you create any custom conventions you’d like. With more options at your fingertips, it’s a great opportunity to think through your overall measurement approach and which data is truly useful for your business intelligence.
When mapping UA events to GA4, look first to see if GA4 is collecting the data as an enhanced measurement, automatically collected, or recommended event. If not, you can create your own custom event using custom definitions. Google has the details for mapping events.
The Gotcha: Don’t go overboard creating custom definitions – GA4 limits you to 50 per property.
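To illustrate the mapping, here is a small hypothetical sketch in Python: a UA-style category/action/label event translated into a GA4 event with a descriptive name and named parameters (the names are our own inventions, not Google’s).

```python
# A UA event was three fixed strings:
ua_event = {
    "eventCategory": "Videos",
    "eventAction": "Play",
    "eventLabel": "Homepage Hero",
}

# One possible GA4 translation. The event name replaces category + action,
# and each remaining field becomes a named parameter you register as a
# custom definition (remember the limit: 50 per property).
ga4_event = {
    "name": "video_play",
    "params": {
        "video_title": ua_event["eventLabel"],        # label -> meaningful parameter
        "event_category": ua_event["eventCategory"],  # optional, for reporting continuity
    },
}

# This dict drops straight into the "events" array of a Measurement Protocol
# payload (see the Step 4 sketch) or maps onto an equivalent gtag.js call.
print(ga4_event)
```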
Step 8: Migrate Custom Filters to Insights
The Gist: Custom filters in UA have become Insights in GA4. The platform offers two types of insights: automated insights based on unusual changes or emerging trends, and custom insights based on conditions that matter to you. As you implement GA4, you can set up custom insights for Google to display on your Insights dashboard. Google will also email alerts upon request.
The Gotcha: Similar to custom events, GA4 limits you to 50 custom insights per property.
Step 9: Migrate Your Segments
The Gist: Segments work differently in GA4 than they do in UA. In GA4, you’ll only find segments in Explorations. The good news is you can now set up segments for events, allowing you to segment data based on user behavior as well as more traditional segments like user geography or demographics.
The Gotcha: Each Exploration has a limit of 10 segments. If you’re using a lot of segments currently in UA, you’ll likely need to create individual reports to see data for each segment. While you can also create comparisons in reports for data subsets, those are even more limited at just four comparisons per report.
Step 10: Migrate Your Audiences
The Gist: Just like UA, GA4 allows you to set up audiences to explore trends among specific user groups. To migrate your audiences from one platform to another, you’ll need to manually create each audience in GA4.
The Gotcha: You can create a maximum of 100 audiences for each GA4 property (starting to sense a theme here?). Also, keep in mind that GA4 audiences don’t apply retroactively. While Google will provide information on users in the last 30 days who meet your audience criteria — for example, visitors from California who donated more than $100 — it won’t apply the audience filter to users earlier than that.
Step 11: Migrate Goals to Conversion Events
The Gist: If you were previously tracking goals in UA, you’ll need to migrate them over to GA4, where they’re now called conversion events. GA4 has a goals migration tool that makes this process pretty simple.
The Gotcha: GA4 limits you to 30 custom conversion events per property. If you’re in e-commerce or another industry with complex marketing needs, those 30 conversion events will add up very quickly. With GA4, it will be important to review conversion events regularly and retire ones that aren’t relevant anymore, like conversions for previous campaigns.
Step 12: Migrate Alerts
The Gist: Using custom alerts in UA? As we covered in Step 8, you can now set up custom insights to keep tabs on key changes in user activity. GA4 will deliver alerts through your Insights dashboard or email, based on your preferences.
The Gotcha: This one is actually more of a bonus – GA4 will now evaluate your data hourly, so you can learn about and respond to changes more quickly.
The Future of Measurement Is Here
GA4 is already transforming how brands think about measurement and user insights – and it’s only the beginning. While Google has been tight-lipped about the GA4 roadmap, we can likely expect even more enhancements and capabilities in the not-too-distant future. The sooner you make the transition to GA4, the sooner you’ll have access to a new level of intelligence to shape your digital roadmap and business decisions.
Need a hand getting started? We’re here to help – reach out to book a chat with us.
Was this blog written by ChatGPT? How would you really know? And what impact would it have on Oomph’s site if it were?
Yes, we know there are some great AI-detecting tools out there. But for the typical reader, picking an AI article out of a crowd can be challenging. And with AI tools like ChatGPT delivering better-quality results than ever, many companies are struggling to decide whether to hand their content and SEO reins over to the machines.
While AI can add value to your content, companies should proceed with caution to avoid some potentially big pitfalls. Here’s why.
Quality Content Is Critical to SEO
All the way back in 1996, Bill Gates declared that “Content is King.” This phrase became ubiquitous in the early years of SEO. At that time, you could rank well simply by writing about a search topic, then optimizing your writing with the right keywords.
Since then, search algorithms have evolved, and the Google search engine results page (SERP) is more crowded than ever (not to mention the new continuous scroll). While ranking isn’t as easy as it used to be, content — whether it’s a video, an image, a product, a blog, or a news story — still matters. When content ranks well, it’s an ad-spend-free magnet for readers that eventually become customers and subscribers. What else on your website can do that?
That makes your content special. It also puts a premium on producing a high volume of relevant content quickly. For years, brands have done this the old-fashioned way: with copywriters and designers researching, writing, revising, creating images, and publishing ad infinitum.
Until AI.
AI-Powered Content Generation Changes How We Make Content
There’s no point in denying it: AI will impact SEO. But it’s still up for debate just how deep that impact will be.
The rise of AI-powered language processing tools like ChatGPT and Meta’s Llama makes quick content generation a reality. They can easily produce high-quality content that will likely only get better with time. ChatGPT can produce an article in minutes, not hours, and even suggest keywords for you.
For all those reasons, marketers have embraced these new tools — ChatGPT shattered records when it reached 100 million monthly active users in a mere two months. As the saying goes, though, just because we can, doesn’t mean we should — especially if using it means compromising on quality or losing what makes us, well, us.
After all, AI is a tool, not a human replacement. If you’re considering integrating AI into your martech stack, here’s what to keep in mind.
4 Things to Know Before Using AI for Content
1. Google may ding you.
In August 2022, Google released an algorithm update called the “helpful content update.” You can read more on Google’s blog, but what’s important is this: Google prioritizes content by people, for people.
In other words, using AI solely to churn out articles that feed the search engine beast won’t work. You should continue to create content that puts your audience first, is genuinely helpful, and is 100% unique to your brand.
But Google’s stance does raise the question: Will they really know if you used AI to generate your content? Well, they do have a very special set of skills…
While AI content can be good quality and easy to read, tools like Crossplag’s AI Content Detector and OpenAI’s GPT-2 Output Detector can detect whether something was written by a human or by an AI bot. That comes down, in part, to the fact that AI lacks E-E-A-T – experience, expertise, authoritativeness, and trustworthiness – which is not just a major factor in SEO, but a major reason why people consume content in the first place.
2. AI isn’t always accurate.
AI culls information from across the web, and it’s remarkably good at it. But aggregating massive amounts of information isn’t the same as fact-checking. AI tools often get it wrong – even with easily verifiable, fact-based questions.
Take the question: What is the largest country in Central America that isn’t Mexico?
AI’s answer:
The largest country in Central America that isn’t Mexico is Guatemala. Guatemala has an area of about 108,890 square kilometers, making it the largest country in Central America by land area. It is bordered by Mexico to the north, Belize to the northeast, Honduras to the east, El Salvador to the southeast, and the Pacific Ocean to the south.
Google’s answer:
Nicaragua is the largest country in the subregion, with a total area of over 130 thousand square kilometers, followed by Honduras, with more than 112 thousand square kilometers.
(Hint: Google is right)
This is a problem for your business and for your SEO. Accuracy is an important part of EEAT. It’s especially critical for “your money or your life” (YMYL) content, like financial or medical advice. In these cases, the content you publish can and does impact real people’s lives and livelihoods.
Spotty accuracy has even prompted some sites, like StackOverflow, to ban AI-generated content.
3. You don’t have the rights to your AI-generated content.
AI-generated content isn’t actually copyrightable. Yes, you read that right.
As it stands, the courts have interpreted the Copyright Act to mean that only human-authored works can be copyrighted. A work is only legally protectable when it involves at least a minimal amount of human creativity.
We’re all familiar with this concept when it comes to books, TV shows, movies, and paintings, but it matters for your website, too. You want your content and your ideas to be yours. If you use AI-generated content, be aware that it isn’t subject to standard intellectual property rules and may not be protected.
4. AI-generated content can’t capture your voice.
Even if you fly under Google’s radar with your AI content, it still won’t really feel like you. You are the only you. We know that sounds like it belongs on an inspirational poster, but it’s true. Your voice is what readers will connect with, believe in, and ultimately trust.
Sure, AI may succeed at stringing together facts and keywords to create content that ranks. And that content may even drive people to your site. But it lacks the emotional intelligence to infuse your content with real-life examples and anecdotes that make readers more likely to read, share, and engage with your content and your brand.
Your voice is also what sets you apart from other brands in your industry. Without that, why would a customer choose you?
AI and SEO Is a Journey, Not a Destination
AI is not the end of human-driven SEO. In reality, AI has only just arrived. But the real opportunity lies in finding out how AI can enhance, not replace, our work to create winning SEO content.
Think about content translation. Hand translation is the most premium translation option out there. It’s also costly. While machine translation on its own can be a bit of a mess, many translation companies actually start with an automated solution, then bring in the humans to polish that first translation into a final product. If you ask us, AI and SEO will work in much the same way.
Even in a post-AI world, SEO all comes down to this guidance from Google:
“If it is useful, helpful, original, and satisfies aspects of E-E-A-T, it might do well in Search. If it doesn’t, it might not.”
If and when you do decide to leverage AI, keep these tips in mind:
- Use AI to generate ideas, not create finished products. Asking ChatGPT to provide five industry trends that you turn into a series of articles is one thing; cutting and pasting a full AI-generated article onto your website is another.
- Fact-check anything and everything AI tells you.
- Infuse your brand into every piece of AI-generated copy. Personal stories, insights, and anecdotes are what make great content great.
At Oomph, we believe quality branded content is just one component of a digital experience that engages and inspires your audience.
Need help integrating SEO content into your company’s website? Let’s talk.
There’s a phrase often used to gauge healthcare quality: the right care, at the right time, in the right place. When those elements are out of sync, the patient experience can take a turn for the worse. Think about missed appointments, misunderstood pre-op instructions, mismanagement of medication… all issues that require clear and timely communication to ensure positive outcomes.
Many healthcare organizations are tapping into patient engagement tools that use artificial intelligence (AI) to drive better healthcare experiences. In this article, we’ll cover a number of use cases for AI within healthcare, showing how it can benefit providers, their patients, and their staff in an increasingly digital world.
Healthcare Consumers are Going Digital
Use of AI in the clinical space has been growing for years, from Google’s AI aiding diagnostic screenings to IBM’s Watson AI informing clinical decision making. But there are many other touchpoints along a patient’s continuum of care that can impact patient outcomes.
The industry is seeing a shift towards more personalized and data-driven patient engagement, with recent studies showing that patients are ready to integrate AI and other digital tools into their healthcare experiences.
For instance, healthcare consumers are increasingly comfortable with doctors using AI to make better decisions about their care. They also want personalized engagement to motivate them on their health journey, with 65% of patients agreeing that communication from providers makes them want to do more to improve their health.
At the same time, 80% of consumers prefer to use digital channels (online messaging, virtual appointments, text, etc.) to communicate with healthcare providers at least some of the time. This points to significant opportunities for digital tools to help providers and patients manage the healthcare experience.
Filling in Gaps: AI Use Cases for Healthcare
Healthcare will always need skilled, highly trained experts to deliver high quality care. But, AI can fill in some gaps by addressing staffing shortages, easing workflows, and improving communication. Many healthcare executives also believe AI can provide a full return on investment in less than three years.
Here are some ways AI can support healthcare consumers and providers to improve patients’ outcomes and experiences.
Streamline basic communications
Using AI as the first line to a patient for basic information enables convenient, personalized service without tying up staff resources. With tools like text-based messaging, chatbots, and automated tasks, providers can communicate with people on the devices, and at the times, that they prefer.
Examples include:
- Scheduling appointments
- Sending appointment reminders
- Answering insurance questions
- Following up on a specialty referral
Remove barriers to access
AI algorithms are being used in some settings to conduct initial interviews that help patients determine whether they need to see a live medical professional — and then send them to the right provider.
AI can offer a bridge for patients who, for a host of reasons, are stuck on taking the first step. For instance, making the first touchpoint a chatbot helps overcome a barrier for patients seeking care within often-stigmatized specialties, such as behavioral health. It can also minimize time wasted at the point of care communicating things like address changes and insurance providers.
Reduce no-show rates
In the U.S., patient no-show rates range from 5.5% to 50%, depending on the location and type of practice. Missed appointments not only result in lost revenue and operational inefficiencies for health systems, they can also delay preventive care, increase readmissions, and harm long-term outcomes for patients.
AI-driven communications help ensure that patients receive critical reminders at optimal times, mitigating these risks. For instance:
- Text-based procedure prep. Automated, time-specific reminders sent to patients prior to a procedure can ensure they correctly follow instructions — and remember to show up
- Post-procedure support. Chatbots can deliver post-op care instructions or follow-up visit reminders, with a phone number to call if things don’t improve
Close information gaps
Imagine a patient at home, alone, not feeling well, and confused about how to take their medication or how to handle post-operative care. Not having that critical information can lead to poor outcomes, including readmission.
Delivering information at the right time, in the right place, is key. But multiple issues can arise, such as:
- A patient needs help outside normal business hours
- Different care team members provide conflicting instructions
- An understaffed care team is unable to return a patient’s call in a timely manner
By providing consistent, accurate, and timely information, AI-enabled tools can provide critical support for patients and care teams.
Minimize staff burnout
Burnout and low morale have contributed to severe staffing shortages in the US healthcare system. The result is an increase in negative patient outcomes, in addition to massive hikes in labor costs for hospitals and health systems.
AI can help lighten the burden on healthcare employees through automated touchpoints in the patient journey, such as self-scheduling platforms or FAQ-answering chatbots. AI can even perform triage informed by machine learning, helping streamline the intake process and getting patients the right care as quickly as possible.
This frees up staff to focus on more meaningful downstream conversations between patients and care teams. It can also reduce phone center wait times for those patients (often seniors) who still rely on phone calls with live staff members.
Maximize staff resources
When 80% of healthcare consumers are willing to switch providers for convenience factors alone, it’s crucial to communicate with patients through their preferred channels. Some people respond to asynchronous requests (such as scheduling confirmations) late at night, while others must speak to a live staff member during the day.
Using multimodal communication channels (phone, text, email, web) offers two major benefits for healthcare providers. For one, you can better engage patients who prefer asynchronous communication. You can also identify the ratio of patients who prefer live calls and staff accordingly when it’s needed most.
Leverage customer feedback
AI provides fast, seamless avenues for gathering patient satisfaction data and creating a reliable, continual customer feedback loop. Tools like chatbots and text messaging expand the ways patients can communicate with healthcare providers, making it easier to leave feedback, improving the digital customer experience, and potentially lifting the satisfaction scores that can affect payment or quality ratings.
AI offers another benefit, too: the ability to identify and respond more quickly to negative feedback. The more swiftly a problem is resolved, the better the consumer experience.
A Few Tips for Getting Started
First, find a trusted technology partner who has experience with healthcare IT stacks and understands how AI fits into the landscape. The healthcare industry is distinctly different from other verticals that might use tools like chatbots and automated tasks. You need a partner who’s familiar with the nuances of the healthcare consumer experience and regulatory compliance requirements.
Next, start small. It’s best to choose your first AI applications in a strategic, coordinated manner. One approach is to identify the biggest bottlenecks for care teams and/or patients, then assess which areas present the lowest risk to the customer experience and the greatest chance of operational success.
Finally, track the progress of your first implementation. Evaluate, iterate, evaluate again, and then expand into other areas when you’re comfortable with the results.
Focal points for iteration:
- Standardize and optimize scripts
- Simplify processes for both patients and staff
- Learn the pathways that users take to accomplish tasks
- Monitor feedback and make improvements as needed
Above all, remember that successful use of AI isn’t just about how well you implement the technology. It’s about the impact those digital tools have on improving patient outcomes and increasing patient satisfaction with their healthcare experience.
Interested in exploring the specific ways AI can benefit your care team and patients? We’re here to help! Contact us today.
The circular economy aims to help the environment by reducing waste, mainly by keeping goods and services in circulation for as long as possible. Unlike the traditional linear economy, in which things are produced, consumed, and then discarded, a circular economy ensures that resources are shared, repaired, reused, and recycled, over and over.
What does this have to do with your digital platform? In a nutshell: everything.
From tackling climate change to creating more resilient markets, the circular economy is a systems-level solution for global environmental and economic issues. By building digital platforms for the circular economy, your business will be better prepared for whatever the future brings.
The Circular Economy isn’t Coming. It’s Here.
With environmental challenges growing day by day, businesses all over the world are going circular. Here are a few examples:
- Target plans for 100% of its branded products to last longer, be easier to repair or recycle, and be made from materials that are regenerative, recyclable, or sustainably sourced.
- Trove’s ecommerce platform lets companies buy back and resell their own products. This extends each product’s use cycle, lowering the environmental and social cost per item.
- Renault is increasing the life of its vehicle parts by restoring old engine parts. This limits waste, prolongs the life of older cars, and reduces emissions from manufacturing.
One area where nearly every business could adopt a circular model is the creation and use of digital platforms. The process of building websites and apps, along with their use over time, consumes precious resources (both people and energy). That’s why Oomph joined 1% For the Planet earlier this year. Our membership reflects our commitment to do more collective good — and to hold ourselves accountable for our collective impact on the environment.
But, we’re not just donating profits to environmental causes. We’re helping companies build sustainable digital platforms for the circular economy.
Curious about your platform’s environmental impact? Enter your URL into this tool to get an estimate of your digital platform’s carbon footprint.
Changing Your Platform From Linear to Circular
If protecting the environment and promoting sustainability is a priority for your business, it’s time to change the way you build and operate your websites and apps. Here’s what switching to a platform for the circular economy could look like.
From a linear mindset…
When building new sites or apps, many companies fail to focus on longevity or performance. Within just a few years, their platforms become obsolete, either as a result of business changes or a desire to keep up with rapidly evolving technologies.
So, every few years, they have to start all over again — with all the associated resource costs of building a new platform and migrating content from the old one.
Platforms that aren’t built with performance in mind tend to waste a ton of energy (and money) in their daily operation. As these platforms grow in complexity and slow down in performance, one unfortunate solution is to just increase computing power. That means new hardware to power the computing cycles, which leads to more e-waste, more mining for metals, more pollution from manufacturing, and more electricity to power the entire supply chain.
Enter the circular economy.
…to a circular approach.
Building a platform for the circular economy is about reducing harmful impacts and wasteful resource use, and increasing the longevity of systems and components. There are three main areas you can address:
1. Design out waste and pollution from the start.
At Oomph, we begin every project with a thorough and thoughtful discovery process that gets to the heart of what we’re building, and why. By identifying what your business truly needs in a platform — today and potentially tomorrow — you’ll minimize the need to rebuild again later.
It’s also crucial to build efficiencies into your backend code. Clean, efficient code loads and runs faster, requiring fewer compute cycles per task.
Look for existing frameworks, tools, and third-party services that provide the functions you need and will continue to stay in service for years or decades to come. And, instead of building a monolith platform that has to be upgraded every few years or requires massive computing power, consider switching to a more nimble and efficient microservices architecture.
2. Keep products and services in use.
Regular maintenance and timely patching is key to prolonging the life of your platform. So is proactively looking for performance issues. Be sure to regularly test and assess your platform’s speed and efficiency, so you can address problems early on.
While we’re advocating for using products and services for as long as possible, if your platform is built on microservices, don’t be afraid to replace an existing service with a new one. Just make sure the new service provides a benefit that outweighs the resource costs of implementing it.
3. Aim to regenerate natural systems.
The term “regenerate” describes a process that mimics the cycles of nature by restoring or renewing sources of energy and materials. It might seem like the natural world is far removed from your in-house tech, but there are a number of ways that your IT choices impact the environment.
For starters, you can factor sustainability into your decisions around vendors and equipment. Look for digital hosting companies and data centers that are green or LEED-certified. Power your hardware with renewable energy sources. Ultimately, the goal is to consider not just how to reduce your platform’s impact on the environment, but how you can create a net-positive effect by doing better with less.
Get Ready for the Future
We’ve long seen that the ways in which businesses and societies use resources can transform local and global communities. And we know that environmental quality is inextricably linked to human wellbeing and prosperity. The circular economy, then, provides a way to improve our future readiness.
Companies that invest in sustainability generally experience better resilience, improved operational performance, and longer-lasting growth. They’re also better suited to meet the new business landscape, as governments incentivize sustainable activities, customers prefer sustainable products, and employees demand sustainable leadership.
Interested in exploring how you can join the new circular economy with your digital platforms? We’d love to help you explore your options, just contact us.
Google Analytics 4, or GA4, is Google’s fourth iteration of its website analytics platform. This is no ordinary upgrade! Leveraging the power of big data and machine learning, GA4 offers entirely new ways to collect and analyze user activity data across websites and apps.
While GA4 provides access to robust new tools and features for data-driven decision making, it also sheds many of the metrics and reports we’re used to in Google Analytics 3 (a.k.a. Universal Analytics, or UA).
Google will be sunsetting UA properties in July 2023. Here’s what you need to know about GA4’s capabilities — and why you should start the transition sooner rather than later.
Not sure which platform you currently have (UA vs. GA4)?
Take a look at this cheat sheet.
Key Benefits of Google Analytics 4
We’re living in a more privacy-centric world, and GA4 is Google’s answer to stricter data laws and browser regulations. GA4 is designed to function without third-party cookies, using machine learning and statistical modeling instead to collect data.
This change comes with a range of benefits, from more actionable user insights to enhanced reporting capabilities.
Broader Insights
Unlike UA, GA4 has the ability to track users across devices and platforms, combining all the data into a single property with a unified set of metrics and dimensions. This gives you a more complete picture of how users interact with your brand, whether they’re on your website, your mobile app, or both.
Another major advantage is that you can more effectively track conversions — particularly for users who might visit on mobile, come back on desktop, and then download/purchase/register through your app. Because GA4 attributes actions to users across devices and platforms, you can see the entire journey a user takes from start to finish.
Predictive Metrics
Using machine learning, GA4 offers powerful new metrics to predict user actions and includes new data buckets like Acquisition, Engagement, Monetization, and Retention. These predictive metrics can help you better understand your audience and make more informed decisions, so you can do things like tailoring your website experience for different users or creating targeted marketing campaigns.
Customized Reporting
UA offers a set of standard reports with some customization options. By contrast, GA4 enables and encourages users to create custom reports with only the data they need.
With greater freedom to create reports, you can declutter your dashboard and make decisions more quickly by drilling down to the data that’s most important to you. You can even create a separate “Audiences” report with custom user definitions, further tailoring the data to support your business needs.
Key Features of Google Analytics 4
With comprehensive user tracking, predictive metrics, customizable reports, and more, GA4 promises to be much more powerful than any previous version of Google Analytics. Here are the core capabilities driving all of those benefits.
Event-Based Tracking
One of the biggest changes in GA4 is how user data is collected. In UA, data is collected via tags placed on each page of a website. Users are tracked via sessions, or set periods that begin and end when a user enters and exits a site.
Instead of relying on pageviews and sessions, GA4 tracks user interactions, known as “events,” as users complete them. This focus on individual user interactions provides a more complete picture of each user’s journey across your website or app.
This event-based model also makes it possible to track interactions that don’t happen on web pages but can be influenced by digital marketing, such as in-store visits or in-app purchases. And, it allows Google to more accurately deduplicate users.
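For example, with the standard gtag.js snippet already installed on a page, recording one of these events is a single JavaScript call. Here is a minimal sketch; the event name and parameters are hypothetical, not a required schema:

```typescript
// Assumes the standard gtag.js snippet is already on the page,
// which defines a global `gtag` function.
declare function gtag(...args: unknown[]): void;

// Record a hypothetical "newsletter_signup" event the moment it happens,
// rather than inferring it from pageviews or session boundaries.
function trackSignup(method: string): void {
  gtag("event", "newsletter_signup", {
    method, // e.g. "footer_form", a custom parameter
    send_to: "G-XXXXXXXXXX", // your GA4 measurement ID
  });
}

trackSignup("footer_form");
```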
Cross-Platform Data Consolidation
In UA, “properties” are where Analytics data is collected for individual websites and apps. You can then use views to see and report on the data in various ways.
GA4 uses individual data streams to combine data from different platforms into a single property. You can add multiple data streams into a property and create different views based on certain criteria.
For example, you could create a stream for all web traffic, a stream for all app traffic, or a stream for traffic from both that covers a given geographic area. By placing the same tracking code across different digital platforms, you can consolidate data to track users who move between the streams.
Advanced Analytics
Maybe the most exciting feature for data geeks like us, GA4’s Explorations Hub offers a suite of advanced data and analytical techniques that go well beyond standard reports. The Explore section lets you create custom analyses to uncover deeper insights about your website and app performance, with filters and segments so you can drill down even further.
GA4 also integrates with BigQuery, Google’s cloud-based data warehouse, where you can run complex analyses of very large datasets. Bonus: BigQuery offers near-unlimited data storage.
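To sketch what that looks like in practice, you can query the daily events_* export tables from Node with Google’s BigQuery client. The project and dataset names below are hypothetical placeholders:

```typescript
import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery();

// Count events by name for January 2023. GA4 exports one
// `events_YYYYMMDD` table per day; `_TABLE_SUFFIX` selects a date range.
// `my-project.analytics_123456789` is a hypothetical dataset name.
async function topEvents(): Promise<void> {
  const query = `
    SELECT event_name, COUNT(*) AS event_count
    FROM \`my-project.analytics_123456789.events_*\`
    WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131'
    GROUP BY event_name
    ORDER BY event_count DESC
    LIMIT 10`;
  const [rows] = await bigquery.query({ query });
  for (const row of rows) {
    console.log(`${row.event_name}: ${row.event_count}`);
  }
}

topEvents().catch(console.error);
```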
Machine Learning
In an increasingly cookie-less world, Google is attempting to balance privacy limitations with usable insights. Using machine learning, GA4 fills in data gaps and provides predictive insights about user behavior and trends.
Machine learning is a branch of artificial intelligence (AI) that fills in gaps and makes predictions. It essentially looks for patterns of activity that can be fed into an algorithm to understand and predict how users behave online.
As an example, GA4’s AI-powered insights can help identify user actions that are most likely to lead to conversions. Using metrics like purchase probability, churn probability, and revenue prediction, you can customize marketing campaigns or target specific audiences to achieve your conversion goals.
Why You Should Switch to GA4 ASAP
You’ll be able to collect and use platform data in your existing UA property until July 1, 2023. After that, you’ll be able to access historical data for only six months. That’s why we strongly recommend you implement GA4 as soon as possible.
Transitioning now will allow you to:
Feed The Machine
Many of GA4’s core features rely on machine learning, and in order for machine learning to be effective, the algorithm needs time to learn. The sooner you set up and start collecting data in GA4, the more time your models will have to analyze and learn, shaping the insights you’ll need down the road.
Train Your People
Those using GA4 will need time to learn the new terminology, user interface, and capabilities. Switching early gives your team time to get used to the new platform and work out new processes and reporting while you still have UA to fall back on.
Get Year-Over-Year Data
GA4 is forward-facing only, which means your new GA4 property will only collect data from the time of creation; it won’t import past data from UA. Once UA sunsets next year, you’ll be relying solely on GA4 for year-over-year data.
Why does that matter? Here at Oomph, when we launch client projects, we use Google Analytics data to analyze digital platform performance so we can develop the best possible user experience. By examining user flows, page visits, common search terms, engagement metrics, and more, we can very quickly get a picture of where a platform has strengths and weak points. And we need your historical data to do it.
Ready to switch to Google Analytics 4? It’s a relatively simple process. Just follow the steps Google provides, whether you want to switch from UA to GA4 or set up a GA4 property alongside an existing UA property.
If you’re not feeling confident about handling the transition alone, we’d love to help. Get in touch with us today.
In the age of hyper-personalization by the likes of Amazon and Netflix, customized user experiences are now table stakes for digital platforms. Businesses that invest in personalization are rewarded with loyalty and revenue. Those that don’t, get left behind.
But making that investment isn’t a straightforward affair. Many services that pitch themselves as personalization tools don’t even come close to creating a truly customized experience. And today’s savvy web users aren’t fooled:
- 74% of customers feel frustrated when website content isn’t personalized.
- 84% of consumers say being treated like a person, not a number, is important to winning their business.
Where we’ve seen businesses stumble is in substituting personification for true personalization. While personalization involves tailoring content based on direct personal information, personification is based on categories of consumers, not individual people.
Here’s what you need to know about the difference.
Perils of Personification
Gartner defines personification as “the delivery of relevant digital experiences to individuals based on their inferred membership in a defined customer segment, rather than their personal identity.” It’s the digital equivalent of calling someone “buddy” or “champ” because you can’t remember their name. I know that I know you, but I don’t know who you are.
Personification tools can track user behavior and use AI to place users into, say, one of several marketing personas you’ve developed. But in terms of driving meaningful, personalized interactions with users, personification falls short.
Here are a few critical issues with commonly used personification tools:
User Session Data
Information about a user’s interactions with an application is stored temporarily on the application’s server, not the browser.
EXAMPLE: During this session, I see that you’ve visited a piece of content that falls in a specific category. For the rest of your session, I can serve up other content tagged with the same category (often in Featured, Related, or You May Also Like sections).
PROBLEM 1: As soon as the browser session is closed, the user data is lost.
PROBLEM 2: The moment you switch from one device (e.g. mobile) to another (e.g. tablet) you lose all session data.
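To make the mechanics concrete, here is a minimal sketch of session-based tracking using Express and the express-session middleware; the route and field names are hypothetical. Note how everything hinges on a single browser session:

```typescript
import express from "express";
import session from "express-session";

// Tell TypeScript about our custom session field (hypothetical).
declare module "express-session" {
  interface SessionData {
    viewedCategories?: string[];
  }
}

const app = express();

// Session data lives on the server, keyed to a cookie holding a session ID.
// Close the browser or switch devices, and the link to that data is gone.
app.use(
  session({ secret: "dev-only-secret", resave: false, saveUninitialized: true })
);

app.get("/articles/:category/:slug", (req, res) => {
  // Track which categories this visitor viewed during this session...
  const viewed = req.session.viewedCategories ?? [];
  if (!viewed.includes(req.params.category)) viewed.push(req.params.category);
  req.session.viewedCategories = viewed;

  // ...so a "You May Also Like" section can favor them, for now.
  res.send(`Categories viewed this session: ${viewed.join(", ")}`);
});

app.listen(3000);
```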
Contextual Data
Marketing automation or location intelligence software can use AI to gather environmental data about a user to deliver customized content or services.
EXAMPLE: I see that you’re in Los Angeles, California. Knowing your local weather, time zone, and other regional attributes, I can tailor the content you see to be more specific to your area.
PROBLEM: I have to ask you first if I can track your location, and you might say no.
First Party Cookie Data
By storing information about a user’s behavior directly on a domain, site owners can collect analytics data and remember language settings, among other functions.
EXAMPLE: Last time you visited my website, you commented on a certain piece of content. I may even have asked, “Do you want to see more of this type of content?” Now that you’re back, I can serve up newly published content of the same type. I can even feature it right on the homepage.
PROBLEM 1: I need to ask you if I can use cookies with you, and you can say no.
PROBLEM 2: If you clear the cookies in your browser, I’ll lose that valuable data.
PROBLEM 3: Another family member is using the same application on the same device, and now I’m getting mixed signals. This is completely messing with my AI.
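For comparison, here is a minimal sketch of the first-party cookie approach. The cookie name is hypothetical, and the technique remains subject to all three problems above:

```typescript
// Store the visitor's preferred content category in a first-party cookie.
// Only works if the user consented, and vanishes if they clear their
// browser data or move to another device.
function savePreferredCategory(category: string): void {
  const oneYear = 60 * 60 * 24 * 365; // max-age is in seconds
  document.cookie =
    `preferred_category=${encodeURIComponent(category)}; ` +
    `max-age=${oneYear}; path=/; SameSite=Lax`;
}

// Read it back on the next visit to feature matching content.
function getPreferredCategory(): string | null {
  const match = document.cookie.match(/(?:^|;\s*)preferred_category=([^;]*)/);
  return match ? decodeURIComponent(match[1]) : null;
}
```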
Bottom line: personification is not really personalization. Even worse, you may lose your data and have to start from square one. To deliver true personalization, you need first-party data from authenticated users. Instead of guessing who your customer is, get to know who they really are.
Next-Level Personalization
True personalization is difficult to achieve outside of a digital platform, where people register as users (versus just casually visiting a website). Once someone becomes an authenticated user, it’s easier to learn a number of things about them.
83% of consumers are willing to share their data to enable personalized experiences. Platform users in particular are more open to providing personal information, because they’re specifically looking for a customized experience. With that first-party data, you can track preferences and interactions to improve the user experience. And you’re not going to lose the historical data when a user closes a session or clears their cookies.
Here are some key benefits:
- I actually know who you are, and over time I can continue to learn more about you and your interests. Plus, I’ll only lose that data if you quit my tool, service, or platform.
- Your data follows you across any devices where you use my application (mobile, tablet, desktop, etc.).
- I can start a two-way conversation with you, so you can tell me how you want to personalize your experience and what kind of content you want to see.
- I can reach out to you with personalized suggestions, driving more engagement and giving you a reason to return more often.
- While you can always say no to first-party cookies if you have privacy concerns, by signing up for my platform, you’re indicating a level of trust and consent.
Looking for Middle Ground?
In the end, you’ll deliver the best personalization (and earn the most engagement) by building an interactive platform and leveraging first-party data. But what if you have a decent website, and you’re not ready to shift to a platform?
You could approach it as a testing ground for personalization instead. By creating a series of micro-interactions using personification tools, you can test whether your users actually want a personalized experience, and if so, what they want to personalize.
Let’s say you’re a news outlet. You could just let people come and read your content online. At the next level, you can try to guess who they are through personification (via cookie requests, location prompts, etc.). If users are interacting with your prompts, it’s likely they’re interested in having a personalized experience.
Finally, you could build a platform for registered users and offer true personalization. You’ll not only deliver a better user experience, you’ll increase engagement and return visits — not to mention sales and other revenue.
At whatever level you can, go the extra mile and give your users what they want. We’re happy to help! Contact us today to learn more.
You’ve decided to decouple, you’re building your stack, and the options are limitless – oh, the freedom of escaping the LAMP stack and the boundaries of the conventional CMS! Utilities that were once lumped together into one immovable bundle can now be separately selected, or not selected at all. It is indeed refreshing to pick and choose the individual services best fitted to your project. But now you have to choose.
One of those choices is your backend content storage. Even though decoupling means breaking free from monolithic architecture, certain concepts persist: content modeling, field types, and the content editor experience.
I recently evaluated four headless CMS options: Contentful, Cosmic, Dato, and Prismic. Prior to that, I had no experience with any of them. Fortunately they all offer a free plan to test and trial their software. For simpler projects, that may be all you need. But if not, each CMS offers multiple tiers of features and support, and costs vary widely depending on your requirements.
I was tasked with finding a CMS and plan that met the following specs:
- 10 separate content editors
- Multiple editor roles (admin, contributor, etc.)
- Draft/scheduled publishing, with some sort of editorial workflow, even as simple as limiting a role to draft-only status
Although this doesn’t seem like a big ask for any CMS, these requirements eliminated the free plans for all four services, so cost became a factor.
Along with cost, I focused my evaluation on the editor experience, modeling options, integration potential, and other features. While I found lots of similarities between the four, each had something a little different to offer.
It’s worth mentioning that development is active on all four CMSs. New features and improvements were added just within the span of time it took to write this article. So keep in mind that current limitations could be resolved in a future update.
Contentful
Contentful’s Team package is currently priced at $489 per month, making it the most expensive of the four. This package includes 10 content editors and 2 separate roles. There is no editorial workflow without paying extra, but scheduled publishing is included.
Terminology
A site is a “space” and content types are “content types.”
What I love
The media library. Media of many different types and sources – from images to videos to documents and more – can be easily organized and filtered. Each asset has a freeform name and description field for searching and filtering. And since you can provide your own asset name, you’re not stuck with image_8456_blah.jpeg or whatever nonsense title your asset had when you uploaded it. Additionally, image dimensions are shown on the list view, which is a quick, helpful reference.
Video description
- Upload images from many sources: Contentful supports the addition of media from sources such as Google Photos and Facebook, not just the files on your computer.
- Give your files meaningful names and add searchable descriptions.
RUNNER UP
Dato’s Media Area offers similar filtering and a searchable notes field.
What I like
Commenting. Every piece of content has an admin comments area for notes or questions, with a threaded Reply feature.
My Views. My Views is an option in the content navigation panel. With a single click, you can display only content that you created or edited – very convenient when working with multiple editors and a large volume of content.
What could be better
Price. Contentful is expensive if your project needs don’t allow you to use the free/community plan. You do get a lot of features for the paid plans, but there’s a big jump between the free plan and the first tier paid plan.
Cosmic
Cosmic is the second most expensive for our requirements, at $299 per month for the Pro package. This package includes 10 editors and 4 predefined roles. It has draft/scheduled publishing, and individual editor accounts can be limited to draft status only.
Terminology
A site is a “bucket” and content types are “object types.”
What I love
Developer Tools. Developer Tools is a handy button you can click at the object or object type level to view your REST endpoint and response. It also shows other ways (GraphQL, CLI, etc.) to connect to a resource, using real code that is specific to your bucket and objects.
Video description
- Find the Developer Tools button on all objects or a single object.
- Browse the different methods to connect to your resource and return data.
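As a rough illustration of the kind of code Developer Tools surfaces, fetching objects from a bucket over REST might look like the sketch below. The bucket slug, read key, and query parameters are placeholders; the exact endpoint for your bucket (which can vary by API version) is precisely what the Developer Tools button shows you:

```typescript
// Hypothetical sketch: fetch published objects from a Cosmic bucket's REST API.
// Placeholders throughout; copy the real URL from your bucket's Developer Tools.
async function listPosts(): Promise<void> {
  const url =
    "https://api.cosmicjs.com/v3/buckets/my-bucket/objects" +
    "?read_key=YOUR_READ_KEY&props=slug,title";
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Cosmic API error: ${response.status}`);
  const { objects } = await response.json();
  for (const post of objects as Array<{ slug: string; title: string }>) {
    console.log(`${post.slug}: ${post.title}`);
  }
}

listPosts().catch(console.error);
```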
RUNNER UP
Dato has an API Explorer for writing and running GraphQL queries.
The Slack Community. The Cosmic Slack community offers a convenient way to get technical support – in some cases, even down to lines-of-code level support – with quick response times.
What I like
View as editor. This is a toggle button in the navigation panel to hide developer features – even if your account is assigned the developer or admin role – allowing you to view the CMS as the editor role sees it. This is useful for documenting an editor’s process or troubleshooting their workflow.
Extensions. Cosmic provides several plug-and-play extensions, including importers for Contentful and WordPress content, as well as Algolia Search, Stripe, and more. I tested the Algolia extension; it took only minutes to set up and immediately began syncing content to Algolia indexes.1 You can also write your own extensions and upload them to your account.
What could be better
Price/price structure. I found Cosmic’s pricing structure to be the most confusing, with extra monthly charges for common features like localization, backups, versioning, and webhooks. It’s hard to know what you’ll actually pay per month until you add up all the extras. And once you do, you may be close to the cost of Contentful’s lower tier.
Content model changes. Changing the content model after you’ve created or imported a lot of content is tricky. Content model changes don’t flow down to existing content without a manual process of unlocking, editing and re-publishing each piece of content, which can be very inefficient and confusing.
Dato
Dato’s Professional package is priced at €99 (about $120) per month, making it the second least expensive for our requirements. It includes 10 content editors and 15 roles, with configurable options to limit publishing rights.
Terminology
A site is a “project” and content types are “models.”
What I love
Tree-like collections. Dato lets you organize and display records in a hierarchical structure with visual nesting. The other CMSs give you roundabout ways to accomplish this, usually requiring extra fields. But Dato lets you do it without altering the content model. And creating hierarchy is as simple as dragging and dropping one record under another, making things like taxonomy a breeze to build.
Video description
- Organize the records in your model in a hierarchical structure by dragging and dropping.
- Dato lets you easily visualize the nested relationships.
RUNNER UP
No other CMS in this comparison offers hierarchical organizing quite like Dato, but Cosmic provides a parent field type, and Prismic has a documented strategy for creating hierarchical relationships.
What I like
Maintenance Mode. You can temporarily disable writes on your project and display a warning message to logged in editors. If you need to prevent editors from adding/editing content — for instance, during content model changes — this is a useful feature.
What could be better
Field types. Out of the box, Dato doesn’t provide field types for dropdowns or checkboxes. There’s a plugin available that transforms a JSON field into a multiselect, but it’s presented as a list of toggles/booleans rather than a true multiselect. And managing that field means working with JSON, which isn’t a great experience for content editors.
Dato is also missing a simple repeater field for adding one or more of something. I created repeater-like functionality using the Modular Content field type, but this feels overly complicated, especially when every other CMS in my comparison implements either a Repeater field type (Cosmic, Prismic) or a multi-value field setting (Contentful).
Prismic
Prismic is the least expensive, at $100 per month for the Medium package. This package includes 25 content editors, 3 predefined roles, draft/scheduled publishing, and an editorial workflow.
Terminology
A site is a “repository” and content types are “custom types.”
What I love
Field types. Prismic gives you 16 unique field types for modeling your content, but it’s not the number of types that I love; it’s the particular combination of options: the dedicated Title type for headings, the Media link type, the Embed type, the Color Picker. Plus, the UI is so intuitive, content editors know exactly what they’re expected to do when populating a field.
Take the Title type for example. If you need a heading field in the other CMSs, you’d probably use a plain text or rich text field type. Using rich text almost guarantees you’ll get unwanted stuff (paragraph tags, in particular) wrapped around whatever heading the editor enters. Using plain text doesn’t let the editor pick which heading they want. Prismic’s Title type field solves both of these problems.
Video description
- Prismic gives you a unique combination of field type options, some that you won’t find elsewhere.
- Use the Title type to create designated heading fields, and select or deselect the heading tags you want to include.
RUNNER UP
This is a tough one, but I’m leaning toward Contentful. What they lack in the number of available field types, they make up for in Appearance settings that allow you to render a field type to the editor in different formats.
Price. Unlimited documents, custom types, API calls, and locales are included in the Medium package for a reasonable price. Additionally, Prismic has more packages and support tiers than any of the others, with one paid plan as low as $7/mo.
What I like
Slices. Slices are an interesting addition to back-end content modeling, because they’re essentially components: things you build on the front end. Prismic lets you create custom components, or use their predefined ones — blockquotes, a list of articles, an image gallery, etc. I admit I didn’t test how these components render on the front end, but Slices deserve further exploration.
What could be better
Integration options/plugins. Although Webhooks are included in all of Prismic’s plans, there doesn’t seem to be any development of plugins or ways to quickly extend functionality. Every other CMS in this comparison offers simple, click-to-install extensions and integrations to common services.
A note on Front-end Frameworks
A headless CMS, by simple definition, is a content storage container. It does not provide the markup that your website visitors will see and use. Therefore, your project planning will include choosing and testing a front-end system or framework, such as Gatsby JS. It’s important to find out early in the process what obstacles, if any, exist in connecting your chosen CMS to your chosen front-end.
At Oomph, we’ve successfully used both Contentful and Cosmic with a Gatsby front-end. However, Gatsby plugins exist for Prismic and Dato as well.
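As one example, sourcing Contentful content into Gatsby is mostly a matter of configuring the official source plugin. Here is a minimal sketch with placeholder credentials (TypeScript config files require a reasonably recent Gatsby version):

```typescript
// gatsby-config.ts: wire a Contentful space into Gatsby's data layer.
// The space ID and access token below are hypothetical placeholders.
import type { GatsbyConfig } from "gatsby";

const config: GatsbyConfig = {
  plugins: [
    {
      resolve: "gatsby-source-contentful",
      options: {
        spaceId: "YOUR_SPACE_ID",
        accessToken: "YOUR_DELIVERY_API_TOKEN",
      },
    },
  ],
};

export default config;
```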
Summary
As with any decoupled service, your headless CMS choice will be determined by your project’s distinct requirements. Make sure to build into your project plan enough time to experiment with any CMS options you’re considering. If you haven’t worked with a particular CMS yet, give yourself a full day to explore, build a sample content model, add some content and media, and test the connection to your front-end.
Does a clear winner emerge from this comparison? I don’t think so. They each succeed and stand out in different ways. Use this article to kickstart your own evaluation, and see what works for you!
1. At the time of this writing, there are some field types that the extension doesn’t pass from Cosmic to Algolia.
If you live in an area with a lot of freight or commuter trains, you may have noticed that trains often have more than one engine powering the cars. Sometimes it is an engine in front and one in back, or in the case of long freight lines, there could be an engine in the middle. This is known as “distributed power,” a relatively recent engineering strategy. Evenly distributed power allows trains to carry more, and carry it more efficiently.1
When it comes to your website, the same engineering can apply. If the Content Management System (CMS) is the only source of power, it may not have enough oomph to load pages quickly and concurrently for many users. Not only that, but a single source of power may slow down innovation and delivery to multiple sources in today’s multi-channel digital ecosystems.
One of the benefits of decoupled platform architecture is that power is distributed more evenly across the endpoints. Decoupled means that the authoring system and the rendering system for site visitors are not the same. Instead of one CMS powering content authoring and page rendering, two systems handle each task discretely.
Digital properties are ever growing and evolving. While evaluating how to grow your own system, it’s important to know the difference between coupled and decoupled CMS architectures. Selecting the best structure for your organization will ensure you not only get what you want, but what is best for your entire team — editors, developers, designers, and marketers alike.
What is a traditional CMS architecture?
In a traditional, or coupled, CMS, the architecture tightly links the back-end content administration experience to the front-end user experience.
Content such as basic pages, news posts, or blog articles is created, managed, and stored, along with all media assets, through the CMS’s back-end administration screens. The back end is also where site developers create and store customized applications and design templates for use by the front end of the site.
Essentially, the two sides of the CMS are bound within the same system: it stores content created by authenticated users and is also directly responsible for delivering that content to the browser and end users (the front end).
From a technical standpoint, a traditional CMS platform consists of:
- A private database-driven CMS in which content editors create and maintain content for the site, generally through some CMS administration interfaces we’re used to (think WordPress or Drupal authoring interfaces)
- An application where engineers create and apply design schemas. Extra permissions and features within the CMS give developers more options to extend the application and control the front end output
- A public front end that displays published content on HTML pages
What is a decoupled CMS architecture?
Decoupled CMS architecture separates, or decouples, the back-end and front-end management of a website into two different systems — one for content creation and storage, and another for consuming content and presenting it to the user.
In a decoupled CMS, these two systems are housed separately and work independently of each other. Once content is created and edited in the back end, this front-end-agnostic approach takes advantage of flexible and fast web services and APIs to deliver the raw content to any front-end system on any device or channel. It is even possible for an authoring system to deliver content to more than one front end (i.e., an article is published in the back end and pushed out to a website as well as a mobile app).
From a technical standpoint, a decoupled CMS platform consists of:
- A private database-driven CMS in which content editors create and maintain content for the site, generally through the same CMS administration interfaces we’re used to — though it doesn’t have to be2
- A web-service API that lets the front-end application consume the data — usually in a RESTful manner and in a mashup-friendly format such as JSON
- Popular front-end frameworks such as React, VueJS, or GatsbyJS, which deliver the public visitor experience via a JavaScript application rendering the output of the API into HTML, as sketched below
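Here is a minimal sketch of that last piece, assuming a hypothetical JSON endpoint and field names: the back end serves raw content, and a small JavaScript application decides how it becomes markup.

```typescript
// A tiny front end that consumes a decoupled CMS's JSON API and renders HTML.
// The endpoint and field names are hypothetical; trusted content is assumed,
// since injecting untrusted strings via innerHTML would be an XSS risk.
interface Article {
  title: string;
  body: string;
}

async function renderArticles(container: HTMLElement): Promise<void> {
  // The back end only hands over raw content as JSON...
  const response = await fetch("https://cms.example.com/api/articles");
  const articles: Article[] = await response.json();

  // ...and the front end owns the presentation entirely.
  container.innerHTML = articles
    .map((a) => `<article><h2>${a.title}</h2><p>${a.body}</p></article>`)
    .join("");
}

renderArticles(document.querySelector("main")!);
```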
Benefits of decoupled
By moving the responsibility for the user experience completely into the browser, the decoupled model provides a number of benefits:
Push the envelope
Shifting the end-user experience out of the conventions and structures of the back-end allows UX Engineers and front-end masterminds to push the boundaries of the experience. Decoupled development gives front-end specialists full control using their native tools.
This is largely because traditional back-end platforms have been focused on the flexibility of authoring content and less so on the experience of public visitors. Too often the programming experience slows engineers down and makes it more difficult to deliver an experience that “wows” your users.
Need for speed
Traditional CMS structures are bogged down by “out-of-the-box” features that many sites don’t use, causing unnecessary bloat. Decoupled CMS structures allow your web development team to choose only what code they need and remove what they don’t. This leaner codebase can result in faster content delivery times and can allow the authoring site to load more quickly for your editors.
Made to order
Not only can decoupled architecture be faster, but it can allow for richer interactions. The front-end system can be focused on delivering a truly interactive experience in the form of in-browser applications, potentially delivering content without a visitor reloading the page.
The back-end becomes the system of record and “state machine,” but back-and-forth interaction will happen in the browser and in real time.
Security Guard
Decoupling the back-end from the front-end is more secure. Since the front-end does not expose its connection to the authoring system, it makes the ecosystem less vulnerable to hackers. Further, depending on how the front-end communication is set up, if the back-end goes offline, it may not interrupt the front-end experience.
In it for the long haul
Decoupled architectures integrate easily with new technology and innovations and allow for flexibility with future technologies. More and more, this is the way that digital platform development is moving. Lean back-end only or “flat file” content management systems have entered the market — like Contentful and Cosmic — while server hosting companies are dealing with the needs of decoupled architecture as well.
The best of both worlds
Decoupled architecture allows the best decisions for two very different sets of users. Content editors and authors can continue to use some of the same CMSs they are familiar with. These CMSs offer great power and flexibility for content modeling and authoring workflows, and will continue to be useful tools. At the same time, front-end developers can get the power and flexibility they need from a completely different system. And your customers can get the amazing user experiences they have come to expect.
The New Age of Content Management Systems
Today’s modern CMS revolution is driving up demand for more flexible, scalable, customizable content management systems that deliver the experience businesses want and customers expect. Separating the front- and back-ends can enable organizations to quicken page load times, iterate new ideas and features faster, and deliver experiences that “wow” your audience.
1. Great article on the distributed power of trains: Why is there an engine in the middle of that train?
2. Non-monolithic CMSs have been hitting the market lately, and include products like Contentful, CosmicJS, and Prismic, among others.
We’ve dreamed about having conversations with our computers for a long time. Stanley Kubrick’s 1968 film 2001: A Space Odyssey imagined a sentient computer named HAL. In the past few years, with the rise of Siri, Alexa, and more, we have come to live in that reality.
Chatbots are a simpler version of natural language processing apps like Siri. 2016 saw the rise of the chatbot, and 2017 will continue that trend, with more and more users having “conversations” via the keyboard to find information and complete tasks instead of clicking around in search engines and on websites. Some of us have never interacted with a chatbot, so what is it like? And what is it like to design one?
The experience of interacting with a chatbot is very similar to interacting with a smart and eager-to-please dog — the bot understands a lot, but won’t always give you the feedback that you expect; its intentions are good, but it is not always helpful; and you end up training yourself as much as you try to train it.
When it comes to designing one, it is very much like training a dog as well. Since there is no visual interface, the tools of brand are limited to the voice and tone of the snippets of conversation that the bot can have. If you are lucky, there might be an avatar that the bot can use, but beyond that, the “personality” of the bot comes solely from the way it reacts and the words that it uses.
No UI is still a UI
First, a slight tangent, if I may.
There is a saying, coined by Golden Krishna, that goes, “The best user interface is no user interface,” or, “The Future of UI is no UI.” While on the surface that is a nice, catchy statement, I don’t think it is true. To get semantic, I’d refine it to say, “The Future of UI is no GUI.”
The absence of a graphical user interface (GUI) is still an interface. If your bot uses Facebook Messenger, or WeChat, or Slack, it still has a GUI. It’s just not one that you designed, and you have to work within the constraints of that system.
Further, I would argue that a Conversational UI is still a UI. How could it not be? To design one, you need to make the same decisions that you would if you were designing anything else — there are just a lot more constraints to work within.
You can’t choose the typeface, you can’t rely on colors, or design details such as borders, drop shadows, and the like. Instead, your UI and all the personality of this little bot comes through in the way it can have a conversation. It would be a very different experience to have the bot use a military tone of voice — gruff, coarse, quick and to the point with very few conversational niceties — than if you used a friendly, happy, and more naturally intoned voice.
A Conversational UI may not be a graphical UI, but it is still a UI. There are decisions to be made, but because there are no visuals to work with, the decisions you make are even more crucial to the expression of your bot’s personality and the brand that you want it to reflect.
Our good little boy, Gitpup
As a way to experiment with Conversational UIs, and to create something useful for our team that might also have a place in the larger marketplace, Oomph designed, built, and implemented a chatbot for [Slack](https://slack.com) (a Slackbot) and [Hipchat](https://www.hipchat.com) called [Gitpup](https://www.gitpup.io).
The problem that we wanted to solve was “How do you make peer-to-peer code review easier to manage and more fun?” As the name suggests, Gitpup is an assistant that we connect to our Github account. Its job is to announce when a Pull Request is ready for review, assign it to someone, track it, and then close it when the review is approved. Once a review has been completed, the user is given “karma” points that come with bragging rights and have become a currency of sorts amongst our team.
Designing the chatbot
Our “design” and personality came from deciding who this bot was. We toyed with a bunch of ideas, but once we decided that the bot was a dog, it was easy to start to have some fun with it.
As a dog, Gitpup is everything you would want one to be: helpful, quick to respond, and obedient. But Gitpup has a little sass, too. We wanted it to have some personality, and be somewhat of a jokester. It has random phrases that it throws back at you if you mention it by name, and while the randomness is uncontrollable by nature, it does lead to some fun interactions.
A sample conversation with Gitpup
nate: whats unassigned
Gitpup APP: No open, unapproved and unassigned Pull Requests
nate: Thanks gitpup
Gitpup APP: You’re welcome, nate!
nichole: hey gitpup
Gitpup APP: See if I play fetch with you next time, @nichole
nichole: y r u so mean?
It utters plenty of encouragement when a Pull Request is taken or completed by a team member.
Sample workflow with Gitpup
Gitpup APP: [Review #3366] marc has been assigned to Bring react ish into main oomph repo!, #233 on oomphinc.com. Go get ’em, Champ!
Gitpup APP: [Review #3364] brett closed and merged Add logintoboggan module, #269 on drupal-repo. +1 Karma (1,766) Yes. You. Can!
Gitpup APP: [Review #3365] brandon approved PR Add style for when there are logos in a grid within content (SASS), #207 on kab. Magnifique!
Gitpup APP: [Review #3365] Thanks for participating, brandon. +5 Karma (903). Woof!
We also built some randomness into the daily grind. In addition to helping with reviews, assignments, and approvals, once a day at a random time Gitpup announces a raffle for any Karma that might be unassigned. You can type “I’m in” or “I’ll join” to enter the raffle.
Joining Gitpup’s daily Karma raffle
Gitpup APP: Attention! Today’s community Karma raffle is in effect! Say “I’m in!” within the next 5 minutes to join for your chance to win 45 Karma!
nate: I’m in
Gitpup APP: Thanks for joining, @nate, you’re the first and only participant in this raffle, Purrrrrfect! Chances are looking good!
nichole: im in
Gitpup APP: Fingers crossed, @nichole! There are now 2 participants in this raffle.
brett: ill join
Gitpup APP: This could be the one, @brett! There are now 3 participants in this raffle.
stephen: I’m in as well
Gitpup APP: May the odds be ever in your favor, @stephen! There are now 4 participants in this raffle.
Gitpup APP: And the winner of today’s raffle is… @nathandentzau! Congratulations! +45 Karma
After interacting with Gitpup for a few days, it is easy to forget that it is just a bot that we programmed. Personality is an easy thing to create if you start from a place of authenticity. For us, we were not trying to create a person, we simply wanted an amiable cartoon character that would be fun to interact with and make what might be a dreary task into something a little more engaging.
A well-trained Bot will also train the User
This little experiment into Conversational UIs helped show us how any UI will also train the user in ways to get the most out of it. With a visual UI, it might be lots of trial and error — clicking around to get a sense of what things are and where they go. With Gitpup, it was the same sort of trial and error, but the downside is that a command can simply go unnoticed if it is not in the bot’s vocabulary.
For instance, our patterns match phrases with a certain degree of fuzziness — we ignore punctuation and plurals, just so the commands don’t have to be perfectly typed. To ask Gitpup which PRs are not assigned to someone yet, you can type “whats unassigned” and Gitpup will respond. There is a drawback to a Slackbot that is in an open room where other conversations are happening — you don’t want it to respond to everything that anyone says, so, if you try to get Gitpup’s attention but don’t use the right terms, you won’t get any response.
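A minimal sketch of that kind of forgiving matcher might look like this (illustrative only, not Gitpup’s actual patterns):

```typescript
// Normalize incoming chat text so commands tolerate punctuation and plurals.
// Illustrative only; not Gitpup's real implementation.
function normalize(message: string): string {
  return message
    .toLowerCase()
    .replace(/[^\w\s]/g, "") // drop punctuation: "what's" -> "whats"
    .replace(/s\b/g, ""); // crude plural stripping: "whats" -> "what"
}

function handleMessage(message: string): string | null {
  const text = normalize(message);
  if (text.includes("what unassigned")) {
    return "No open, unapproved and unassigned Pull Requests";
  }
  // In an open room, stay silent rather than replying to every message.
  // This is exactly why an unrecognized command gets no response at all.
  return null;
}
```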
Users can quickly learn the different commands that a simple bot will recognize, and it’s not so bad to use trial and error to figure it out. But with a more complicated bot that you are in a direct conversation with (via a text app), it can be a little more awkward if your bot does not recognize a lot of natural language. Be sure to design some phrases in the bot’s conversational database that can be used when it does not understand something or cannot complete a particular task. Success messages are easy to create, but it is harder to create a failure message that does not frustrate a user or make them think that the bot is simply useless.
brett: gimme 4034
Gitpup APP: Ruh roh, I don’t know about Review #4034
nichole: give 3466 to @thatguy
Gitpup APP: Darn, I don’t know who that is!
When is a Chatbot right for you?
While chatbots are all the rage right now, I don’t think normal point-and-click websites are going away anytime soon. A chatbot service isn’t right for everyone because the service it offers is very transactional in nature — get me this, research that. A chatbot can offer information that a user didn’t know they wanted, but that is often more complicated, and in the end, the user needs to opt into the service somehow.
The questions you can ask are:
- How complex is your service? Would the simple conversational style of a chatbot be helpful for your customers, or frustrating?
- Can you imagine having a conversation with your service? What would it sound like? Are visuals necessary to the experience?
- Would your brand identity translate into a “personality” for a bot? Can you imagine what the “voice” of the service might sound like?
- Where are your customers? Are they likely to already use a message service like Facebook Messenger, WeChat or Slack? Do you want to be where they are?
- Are your customers likely to have an Amazon Echo or Google Home device? Is your service better suited as an add-on to these personal assistants?
- Will customers need to authorize a purchase? Will you need to create your own checkout experience, or can you leverage a pre-existing service like Amazon?
- Do you want your service to work over SMS? Will the limitations of text messaging (message length, possible fees for users) hinder the app’s functionality?
Find a Chatbot and Use it!
If you are a marketer looking for new ways to connect with customers, or a manager looking for tools to keep employees efficient and engaged, you should become familiar with chatbots. We’d recommend these as good starting points:
- Gitpup (of course): if you manage a development team that works with GitHub
- x.ai: A personal assistant chatbot that works by being CC’ed on your emails
- healthtap.com: A Facebook Messenger bot that answers your medical questions and is backed up by real people
- meetcleo.com: An SMS chatbot for managing your money
- assi.st: An integrated platform of bots that work for Google Home, Messenger, Twitter, and Alexa.
The barriers between our computers and our normal everyday interactions are coming down. A computer that can hear your voice commands is probably not far away from you at any given time. Conversational UIs are making it easier to get what you need at any moment without finding a particular app, typing on a tiny screen, or doing a lot of clicking around. They become another tool in the belt and another way to be where your customers are.
If you’d like more information about chatbots and whether or not your company can benefit from the power of No GUI, drop us a line. We’d love to chat person to person. 😉