Oomph has been quiet about our excitement for artificial intelligence (A.I.). While the tech world has exploded with new A.I. products, offerings, and add-ons to existing product suites, we have been formulating an approach to recommend A.I.-related services to our clients.
One of the biggest reasons we have stayed quiet is the complexity and fast pace of change in the landscape. Giant companies have experimented with A.I., with some loud public failures. The investment and venture capital community is hyped on A.I. but has recently become cautious as productivity and profit have not been boosted. It is a familiar boom-then-bust cycle of attention that we have seen before — most recently with AR/VR after the Apple Vision Pro launch five months ago, and previously with the Metaverse, Blockchain/NFTs, and Bitcoin.
There are many reasons to be optimistic about applications for A.I. in business. And there continue to be many reasons to be cautious as well. Just like any digital tool, A.I. has pros and cons and Oomph has carefully evaluated each. We are sharing our internal thoughts in the hopes that your business can use the same criteria when considering a potential investment in A.I.
Using A.I.: Not If, but How
Most digital tools now have some kind of A.I. or machine-learning built into them. A.I. has become ubiquitous and embedded in many systems we use every day. Given investor hype for companies that are leveraging A.I., more and more tools are likely to incorporate A.I.
This is not a new phenomenon. Grammarly has been around since 2015 and by many measures, it is an A.I. tool — it is trained on human written language to provide contextual corrections and suggestions for improvements.
Recently, though, embedded A.I. has exploded across markets. Many of the tools Oomph team members use every day have A.I. embedded in them, across sales, design, engineering, and project management — from Google Suite and Zoom to Github and Figma.
The market has already decided that business customers want access to time-saving A.I. tools. Some welcome these options, and others will use them reluctantly.
The Risks that A.I. Pose
Every technological breakthrough comes with risks. Some pundits (both for and against A.I. advancements) have likened its emergence to the Industrial Revolution. A similarly high level of positive transformation is possible, while the cultural, societal, and environmental repercussions could follow a similar trajectory as well.
A.I. has its downsides. When evaluating A.I. tools as solutions to our clients’ problems, we keep this list of drawbacks handy so that we can think about how to mitigate each negative effect:
- A.I. is built upon biased and flawed data
- Bias & flawed data leads to the perpetuation of stereotypes
- Flawed data leads to Hallucinations & harms Brands
- Poor A.I. answers erode Consumer Trust
- A.I.’s appetite for electricity is unsustainable
We have also found that our company values are a lens through which we can evaluate new technology and any proposed solutions. Oomph has three cultural values that form the center of our approach and our mission, and we add our stated 1% For the Planet commitment to that list as well:
- Smart
- Driven
- Personal
- Environmentally Committed
For each of A.I.’s drawbacks, we use the lens of our cultural values to guide our approach to evaluating and mitigating those potential ill effects.
A.I. is built upon biased and flawed data
At its core, A.I. is built upon terabytes of data and billions, if not trillions, of individual pieces of content. Training data for Large Language Models (LLMs) like ChatGPT, Llama, and Claude encompasses mostly public content, along with special subscriptions through relationships with data providers like the New York Times and Reddit. Image-generation tools like Midjourney and Adobe Firefly require billions of images to train them and have skirted similar copyright issues while gobbling up as much free public data as they can find.
Because LLMs require such a massive amount of data, it is impossible to curate those data sets to only what we may deem as “true” facts or the “perfect” images. Even if we were able to curate these training sets, who makes the determination of what to include or exclude?
The training data would need to be free of bias and free of sarcasm (a very human trait) for it to be reliable and useful. We’ve seen this play out with sometimes hilarious results. Google “A.I. Overviews” have told people to put glue on pizza to prevent the cheese from sliding off or to eat one rock a day for vitamins & minerals. Researchers and journalists traced these suggestions back to the training data from Reddit and The Onion.
Information architects have a saying: “All Data is Dirty.” It means no one creates “perfect” data, where every entry is reviewed, cross-checked for accuracy, and evaluated by a shared set of objective standards. Human bias and accidents always enter the data. Even the simple act of deciding what data to include (and therefore, which data is excluded) is bias. All data is dirty.
Bias & flawed data leads to the perpetuation of stereotypes
Many of the drawbacks of A.I. are interrelated — dirty data is also a diversity, equity, and inclusion (D.E.I.) problem. Gender and racial biases surface in the answers A.I. provides, and as these tools become easier to use and more prevalent, A.I. will perpetuate the harms those biases produce. These are harms that society has only recently begun grappling with in a deep and meaningful way, and A.I. could roll back much of our progress.
We’ve seen this start to happen. Early reports from image-creation tools describe a white European male bias inherent in these tools — ask for an image of someone in a specific occupation, and the results skew heavily toward white males, unless that occupation is stereotypically “women’s work.” When A.I. is used to perform HR tasks, the software often advances those it perceives as male more quickly and penalizes applications that contain female names and pronouns.
The bias is in the data and is very, very difficult to remove. The entirety of digital written language over-represents privileged white Europeans who could afford the tools to become authors. This comparably small pool of participants is also predominantly male, and the content they have created emphasizes white male perspectives. Curating bias out of the training data to create an equally representative pool is nearly impossible, especially when you consider the exponentially larger sets of data that new LLM models require for training.
Further, D.E.I. overlaps with environmental impact. Last fall, the Fifth National Climate Assessment outlined the country’s climate status. Not only is the U.S. warming faster than the rest of the world, but the report directly linked reductions in greenhouse gas emissions with reducing racial disparities. Climate impacts are felt most heavily in communities of color and low-income communities; therefore, climate justice and racial justice are directly related.
Flawed data leads to “Hallucinations” & harms Brands
“Brand Safety” and How A.I. can harm Brands
Brand safety is the practice of protecting a company’s brand and reputation by monitoring online content related to the brand. This includes content the brand is directly responsible for creating about itself as well as the content created by authorized agents (most typically customer service reps, but now A.I. systems as well).
The answers that come out of an A.I. agent reflect on the brand employing it. A real-life example is Air Canada, whose A.I. chatbot gave a customer an answer that contradicted the information at the URL it provided. The customer chose to believe the A.I. answer, while the company argued that it could not be held responsible if the customer didn’t follow the URL to the more authoritative information. In court, the customer won and Air Canada lost, resulting in bad publicity for the company.
Brand safety can also be compromised when a third party feeds proprietary client data into A.I. tools. Some terms-and-conditions statements for A.I. tools are murky, while others are direct. Midjourney’s terms state:
“By using the Services, You grant to Midjourney […] a perpetual, worldwide, non-exclusive, sublicensable no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense, and distribute text and image prompts You input into the Services”
Midjourney’s Terms of Service Statement
That makes it pretty clear that by using Midjourney, you implicitly agree that your data will become part of their system.
The implication that our clients’ data might become available to everyone is a huge professional risk that Oomph avoids. Even using ChatGPT to summarize content covered by an NDA can open hidden risks.
What are “Hallucinations” and why do they happen?
It’s important to remember how current A.I. chatbots work. Like a smartphone’s predictive-text tool, LLMs form statements by stitching together words, characters, and numbers based on the probability of each unit succeeding the previously generated units. The predictions can be very complex, adhering to grammatical structure and situational context as well as the initial prompt. But for all that complexity, these systems do not truly understand language or context.
At best, A.I. chatbots are a mirror that reflects how humans sound without a deep understanding of what any of the words mean.
An A.I. system tries its best to provide an accurate and truthful answer without a complete understanding of the words it is using. A “hallucination” can occur for a variety of reasons, and it is not always possible to trace its origins or reverse-engineer it out of a system.
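The next-word mechanism described above can be illustrated with a toy model. Below is a minimal sketch, assuming nothing beyond a tiny made-up corpus: a simple bigram counter that generates text purely from the probability of one word following another, with no understanding of meaning. (This is an illustration of the statistical principle, not how a production LLM is built.)

```python
from collections import Counter, defaultdict

# A tiny "training corpus" -- real LLMs train on trillions of tokens.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word -- no understanding involved."""
    candidates = following[word]
    return candidates.most_common(1)[0][0] if candidates else None

# Generate a short sequence by repeatedly taking the most probable next word.
word, output = "the", ["the"]
for _ in range(4):
    word = predict_next(word)
    output.append(word)

print(" ".join(output))
```

The output sounds grammatical only because the source text was grammatical; the model has no idea what a “cat” is. Scale that idea up by many orders of magnitude and you have the core of why an LLM can sound fluent while confidently stating something false.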
As many recent news stories attest, hallucinations are a huge problem with A.I. McDonald’s, working with IBM, could not get hallucinations under control and pulled A.I. ordering from its drive-thrus because of the headaches it caused. If companies of that size can’t make their investments in A.I. pay off, it makes us wonder about the usefulness of A.I. for consumer applications in general. And all of these gaffes hurt consumers’ perception of the brands and the services they provide.
Poor A.I. answers erode Consumer Trust
The aforementioned problems with A.I. are well-known in the tech industry. In the consumer sphere, A.I. has only just started to break into the public consciousness. Consumers are outcome-driven. If A.I. is a tool that can reliably save them time and reduce work, they don’t care how it works, but they do care about its accuracy.
Consumers are also misinformed or have only a surface-level understanding of how A.I. works. In one study, only 30% of people correctly identified six different applications of A.I. People don’t have a complete picture of how pervasive A.I.-powered services already are.
The news media loves a good fail story, and A.I. has been providing plenty of those. With most of the media coverage of A.I. being either fear-mongering (“A.I. will take your job!”) or about hilarious hallucinations (“A.I. suggests you eat rocks!”), consumers will be conditioned to mistrust products and tools labeled “A.I.”
And for those who have had a first-hand experience with an A.I. tool, a poor A.I. experience makes all A.I. seem poor.
A.I.’s appetite for electricity is unsustainable
The environmental impact of our digital lives is invisible. Cloud services that store our lifetime of photographs sound like feathery, lightweight repositories, but they are actually giant, electricity-guzzling warehouses full of heat-producing servers. Cooling these data factories and providing the electricity to run them is a major infrastructure issue for cities around the country. And then A.I. came along.
While difficult to quantify, there are some scientists and journalists studying this issue, and they have found some alarming statistics:
- Training GPT-3 required more than 1,200 MWh and led to 500 metric tons of greenhouse gas emissions — equivalent to the energy used by 1 million homes in one hour and the emissions of driving 1 million miles. GPT-4 has even greater needs.
- Research suggests a single generative A.I. query consumes energy at four or five times the magnitude of a typical search engine request.
- Northern Virginia needs the equivalent of several large nuclear power plants to serve all the new data centers planned and under construction.
- Even as consumers shift demand away from fossil fuels (think electric cars and more electric heat and cooking), power plant executives are lobbying to keep coal-fired plants around longer to meet increased electricity demand. Already, soaring power consumption is delaying coal plant closures in Kansas, Nebraska, Wisconsin, and South Carolina.
- Google emissions grew 48% in the past five years in large part because of its wide deployment of A.I.
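Some of these figures can be sanity-checked with back-of-envelope arithmetic. Here is a quick sketch using the numbers above; the per-home consumption rate and the per-search energy figure are our assumptions (roughly 1.2 kWh per U.S. household per hour and a widely cited ~0.3 Wh per traditional search), not values from the studies themselves:

```python
# Back-of-envelope check of the GPT-3 training figure quoted above.
training_mwh = 1_200            # reported energy to train GPT-3
kwh_per_home_per_hour = 1.2     # assumed U.S. average household draw (~10,500 kWh/year)

training_kwh = training_mwh * 1_000
homes_for_one_hour = training_kwh / kwh_per_home_per_hour
print(f"{homes_for_one_hour:,.0f} homes powered for one hour")

# If one generative A.I. query uses ~4-5x the energy of a search query,
# swapping in A.I. for a billion daily searches adds up quickly.
search_wh = 0.3                 # assumed energy per traditional search, in watt-hours
ai_query_wh = search_wh * 4.5   # midpoint of the 4-5x figure above
daily_queries = 1_000_000_000
extra_mwh_per_day = daily_queries * (ai_query_wh - search_wh) / 1_000_000  # Wh -> MWh
print(f"~{extra_mwh_per_day:,.0f} extra MWh per day at a billion queries")
```

Under these assumptions, the “1 million homes for one hour” claim checks out almost exactly, and shifting a billion searches a day to generative A.I. would add roughly another GPT-3 training run’s worth of energy consumption every day.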
While the consumption needs are troubling, quickly creating more infrastructure to support these needs is not possible. New energy grids take multiple years and millions if not billions of dollars of investment. Parts of the country are already straining under the weight of our current energy needs and will continue to do so — peak summer demand is projected to grow by 38,000 megawatts nationwide in the next five years.
While a data center can be built in about a year, it can take five years or longer to connect renewable energy projects to the grid. While most new power projects built in 2024 are clean energy (solar, wind, hydro), they are not being built fast enough. And utilities note that data centers need power 24 hours a day, something most clean sources can’t provide. It should be heartbreaking that carbon-producing fuels like coal and gas are being kept online to support our data needs.
Oomph’s commitment to 1% for the Planet means that we want to design specific uses for A.I. instead of very broad ones. The environmental impact of A.I.’s energy demands is a major factor we consider when deciding how and when to use A.I.
Using our Values to Guide the Evaluation of A.I.
As we previously stated, our company values provide a lens through which we can evaluate A.I. and mitigate its negative effects. Many of the solutions below address more than one drawback, and together they represent a shared commitment to extracting the best results from any tool in our set.
Smart
- Limit direct consumer access to the outputs of any A.I. tools, and put a well-trained human in the middle as curator. Despite the pitfalls of human bias, it’s better to be aware of them rather than allow A.I. to run unchecked
- Employ 3rd-party solutions with a proven track-record of hallucination reduction
Driven
- When possible, introduce a second proprietary dataset that can counterbalance training data or provide additional context for generated answers that are specific to the client’s use case and audience
- Restrict A.I. answers when qualifying, quantifying, or categorizing other humans, directly or indirectly
Personal
- Always provide training to authors using A.I. tools and be clear with help text and microcopy instructions about the limitations and biases of such datasets
1% for the Planet
- Limit the amount of A.I. an interface pushes at people without first allowing them to opt in — A.I. should not be the default
- Leverage “green” data centers if possible, or encourage the client using A.I. to purchase carbon offset credits
In Summary
While this article feels like we are strongly anti-A.I., we still have optimism and excitement about how A.I. systems can be used to augment and support human effort. Tools created with A.I. can make tasks and interactions more efficient, can help non-creatives jumpstart their creativity, and can eventually become agents that assist with complex tasks that are draining and unfulfilling for humans to perform.
For consumers or our clients to trust A.I., however, we need ethical evaluation criteria. We cannot use A.I. as a solve-all tool when it has clearly displayed limitations. We aim to continue to learn from others, experiment ourselves, and evaluate appropriate uses for A.I. with a clear set of criteria that aligns with our company culture.
To have a conversation about how your company might want to leverage A.I. responsibly, please contact us anytime.
Additional Reading List
- “The Politics of Classification” (YouTube). Dan Klyn, guest lecture at UM School of Information Architecture. 09 April 2024. A review of IA problems vs. AI problems, how classification is problematic, and how mathematical smoothness is unattainable.
- “Models All the Way Down.” Christo Buschek and Jer Thorp, Knowing Machines. A fascinating visual deep dive into training sets and the problematic ways in which these sets were curated by AI or humans, both with their own pitfalls.
- “AI spam is already starting to ruin the internet.” Katie Notopoulos, Business Insider, 29 January 2024. When garbage results flood Google, it’s bad for users — and Google.
- Racial Discrimination in Face Recognition Technology, Harvard, 24 October 2020. The title of this article explains itself well.
- Women are more likely to be replaced by AI, according to LinkedIn, Fast Company, 04 April 2024. Many workers are worried that their jobs will be replaced by artificial intelligence, and a growing body of research suggests that women have the most cause for concern.
- Brand Safety and AI, Writer.com. An overview of what brand safety means and how it is usually governed.
- AI and designers: the ethical and legal implications, UX Design, 25 February 2024. Not only can using training data potentially introduce legal troubles, but submitting your data to be processed by A.I. does as well.
- Can Generative AI’s Hallucination Problem be Overcome? Louis Poirier, C3.ai. 31 August 2023. A company claims to have a solution for A.I. hallucinations but doesn’t completely describe how in their marketing.
- Why AI-generated hands are the stuff of nightmares, explained by a scientist, Science Focus, 04 February 2023. Whether it’s hands with seven fingers or extra long palms, AI just can’t seem to get it right.
- Sycophancy in Generative-AI Chatbots, NNg. 12 January 2024. Human summary: Beyond hallucinations, LLMs have other problems that can erode trust: “Large language models like ChatGPT can lie to elicit approval from users. This phenomenon, called sycophancy, can be detected in state-of-the-art models.”
- Consumer attitudes towards AI and ML’s brand usage U.S. 2023. Valentina Dencheva, Statista. 09 February 2023
- What the data says about Americans’ views of artificial intelligence. Pew Research Center. 21 November 2023
- Exploring the Spectrum of “Needfulness” in AI Products. Emily Campbell, The Shape of AI. 28 March 2024
- AI’s Impact On The Future Of Consumer Behavior And Expectations. Jean-Baptiste Hironde, Forbes. 31 August 2023.
- Is generative AI bad for the environment? A computer scientist explains the carbon footprint of ChatGPT and its cousins. The Conversation. 23 May 2023
Everyone’s been saying it (and, frankly, we tend to agree): We are currently in unprecedented times. It may feel like a cliché. But truly, when you stop and look around, not since the advent of the first consumer-friendly smartphone in 2007 has the digital web design and development industry seen such vast technological advances.
A few of these innovations have been kicking around for decades, but they’ve only moved into the greater public consciousness in the past year. Versions of artificial intelligence (AI) and chatbots have been around since the 1960s, and even virtual reality (VR)/augmented reality (AR) has been attempted with some success since the 1990s (see Thad Starner’s early wearable computing work). But now, these technologies have reached a tipping point as companies join the rush to create new products that leverage AI and VR/AR.
What should we do with all this change? Let’s think about the immediate future for a moment (not the long-range future, because who knows what that holds). We at Oomph have been thinking about how we can start to use this new technology now — for ourselves and for our clients. Which ideas that seemed far-fetched only a year ago are now possible?
For this article, we’ll take a closer look at VR/AR, two digital technologies that either layer on top of or fully replace our real world.
VR/AR and the Vision Pro
Apple’s much-anticipated launch into the headset game shipped in early February 2024. With it came much hype, most centered around the price tag and limited ecosystem (for now). But after all the dust has settled, what has this flagship device told us about the future?
Meta, Oculus, Sony, and others have been in this space since 2017, but the Apple device debuted a better experience in many respects. For one, Apple nailed the 3D visuals, using many cameras and low latency to reproduce a digital version of the real world around the wearer — in real time. All of this tells us that VR headsets are moving beyond gaming applications and becoming more mainstream for specific types of interactions and experiences, like virtually visiting the Eiffel Tower or watching the upcoming Summer Olympics.
What Is VR/AR Not Good At?
Comfort
Apple’s version of the device is large, uncomfortable, and too heavy to wear for long, and its competitors are not much better. These devices will become smaller and more powerful over time, but for now, wearing one as an infinite virtual monitor for the entire workday is not practical.
Space
VR generally needs space for the wearer to move around. The Vision Pro is very good at overlaying virtual items into the physical world around the wearer, but for an application that requires the wearer to be fully immersed in a virtual world, it is a poor experience to pantomime moving through a confined space. Immersion is best when the movements required to interact are small or when the wearer has adequate space to participate.
Haptics
“Haptic” feedback is the sense that physical objects provide. Think about turning a doorknob: You feel the surface, the warmth or coolness of the material, how the object can be rotated (as opposed to pulled like a lever), and the resistance from the springs.
Phones provide small amounts of haptic feedback in the form of vibrations and sounds. Haptics are on the horizon for many VR platforms but have yet to be built into headset systems. For now, haptics are provided by add-on products like this haptic gaming chair.
What Is VR/AR Good For?
Even without haptics and free spatial range, immersion and presence in VR are very effective. It turns out that the brain only requires sight and sound to create a believable sense of immersion. Have you tried a virtual roller coaster? If so, you know it doesn’t take much to feel a sense of presence in a virtual environment.
Live Events
VR and AR’s most promising applications are with live in-person and televised events. In addition to a flat “screen” of the event, AR-generated spatial representations of the event and ways to interact with the event are expanding. A prototype video with Formula 1 racing is a great example of how this application can increase engagement with these events.
Imagine if your next virtual conference were available in VR and AR. How much more immersed would you feel?
Museum and Cultural Institution Experiences
Similar to live events, AR can enhance museum experiences greatly. With AR, viewers can look at an object in its real space — for example, a sarcophagus would actually appear in a tomb — and access additional information about that object, like the time and place it was created and the artist.
Museums are already experimenting with experiences that leverage your phone’s camera or VR headsets. Some have experimented with virtually showing artwork by the same artist that other museums own to display a wider range of work within an exhibition.
With the expansion of personal VR equipment like the Vision Pro, the next obvious step is to bring the museum to your living room, much like the National Gallery in London bringing its collection into public spaces (see bullet point #5).
Try Before You Buy (TBYB)
Using a version of AR with your phone to preview furniture in your home is not new. But what other experiences can benefit from an immersive “try before you buy” experience?
- Test-drive a new car with VR, or experience driving a real car on a real track in a mixed-reality game. As haptic feedback becomes more prevalent, the experience of test-driving will become even closer to the real thing.
- Even small purchases have been using VR and AR successfully to trial their products, including AR for fashion retail, eyeglass virtual try-ons, and preview apps for cosmetics. Even do-it-yourself retailer Lowe’s experimented with fully haptic VR in 2018. But those are all big-name retailers. The real future for VR/AR-powered TBYB experiences will allow smaller companies to jump into the space, like Shopify enabled for its merchants.
- Visit destinations before traveling. With VR, you could visit fragile ecosystems without affecting the physical environment or get a sense of the physical space before traveling to a new spot. Visitors who require special assistance could preview the amenities beforehand. Games have been developed for generic experiences like deep sea diving, but we expect more specific travel destinations to provide VR experiences of their own, like California’s Redwood Forest.
What’s Possible With VR/AR?
The above examples of what VR/AR is good at are just a few ways the technology is already in use — each of which can be a jumping-off point for leveraging VR/AR for your own business.
But what are some new frontiers that have yet to be fully explored? What else is possible?
- What if a digital sculptor or 3D model maker could create new three-dimensional models in a three-dimensional virtual space? The application for architects and urban planners is just as impactful.
- What if medical training could be immersive, anatomically accurate, and reduce the need for cadavers? What if rare conditions could be simulated to increase exposure and aid in accurate diagnoses?
- What if mental health disorders could be treated with the aid of immersive virtual environments? Exposure therapy can aid in treating and dealing with anxiety, depression, and PTSD.
- What if highly skilled workers could have technical mentors virtually assist and verify the quality of a build? Aerospace, automotive, and other manufacturing industry experts could visit multiple locations virtually and go where they’re needed most.
- What if complex mathematic-based sciences could provide immersive, data-manipulative environments for exploration? Think of the possibilities for fields like geology, astronomy, and climate change.
- What if movies were told from a more personal point of view? What if the movie viewer felt more like a participant? How could someone’s range of experiences expand with such immersive storytelling?
Continue the AR/VR Conversation
The Vision Pro hasn’t taken the world by storm, as Apple likely hoped. It may still be too early for the market to figure out what AR/VR is good for. But we think it won’t go away completely, either. With big investments like Apple’s, it is reasonable to assume the next version will find a stronger foothold in the market.
Here at Oomph, we’ll keep pondering and researching impactful ways that tomorrow’s technology can help solve today’s problems. We hope these ideas have inspired some of your own explorations, and if so, we’d love to hear more about them.
Drop us a line and let’s chat about how VR/AR could engage your audience.
Feel like you’re seeing a lot more website pop-up banners these days asking about your cookie preferences? Those cookie banners are here to stay, and they’re a vital part of compliance for websites of all sizes.
As global standards for consumer privacy and data protection continue to climb, businesses are burning more time and resources to keep up. One VentureBeat article pegged the cost for a business of maintaining data privacy compliance at an eye-popping $31 million — and the costs of non-compliance can be even higher. Failing to stay on top of this complex patchwork of regulations can trigger real consequences, from steep fines and penalties to the indirect costs of reputational harm and lost business.
Cookie consent is one part of a holistic data privacy strategy — and an increasingly important one. Global privacy laws, such as the General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), and Brazil’s General Data Protection Law (LGPD), require companies to inform visitors about the data collected on their website via cookies and provide them with granular choices about what they’re willing to share. Cookie consent management solutions help users manage cookie preferences when they enter your site, presenting a banner that informs users about how cookies are used and letting them decide which information (if any) they want cookies to collect.
Cookie consent management solutions are rapidly evolving to keep up with changing data privacy standards. CookiePro is a solution from OneTrust designed specifically for small to medium businesses, offering a more automated way to ensure website and mobile applications stay compliant with cookie consent and global privacy regulations. At Oomph, we’ve helped several clients integrate CookiePro into their sites in recent months and think it’s on track to become an industry standard for cookie consent management.
For organizations that are already juggling multiple site integrations, does it make sense to add another? To answer that, let’s take a look at why cookie consent matters, how a tool like CookiePro can help, and if it’s right for you.
Why Do I Need a Cookie Consent Solution?
To comply with privacy laws and provide a transparent experience that builds trust, many website owners are rethinking how they manage compliance. Adding a cookie consent tool to your website can improve the experience for you and your users.
Ensure Compliance
Not taking data privacy seriously can cost you. In December 2022, Meta (the parent company of Facebook) agreed to pay $725 million to settle several class-action lawsuits that found Facebook had let third parties access users’ private data and their friends’ data without user permission. Oracle has been sued for collecting 4.5 billion personal records from consumers who had specifically opted out of sharing, and Starbucks is potentially facing a lawsuit for continuing to “track customers ‘after they’ve declined all but required cookies.’”
While big-name companies get most of the bad press around data privacy, you don’t have to be a global enterprise to face similar consequences. In 2022, the total value of settlements for class-action lawsuits set a new record at $63 billion — and data breach and privacy class action settlements were among the top 10 settlement categories. Instead of risking a costly settlement, a much less expensive approach is to invest in a solution to help manage the work of compliance.
Build Trust
Beyond protecting your organization from legal action, demonstrating that you care about compliance helps your business build trust and long-term relationships with users. Data privacy is becoming more important to consumers of all ages, with 74% of people ranking data privacy as one of their top values.
A cookie consent solution lets users know that they’re in charge of their own data. It clearly discloses which information your business collects and uses, putting the power in their hands to control the data they share. If users want to change what they’re comfortable sharing later, they can easily update their settings. That level of transparency helps set the tone for your customer interactions, turning users into loyal brand advocates.
Optimize Efficiency
If your website serves users in multiple states or countries, keeping up with the patchwork of state, federal, and international laws is virtually impossible without software. Eleven states have unique data privacy laws in place right now, and 16 states introduced privacy bills during the 2022 to 2023 legislative cycle.
Factor in international regulations like GDPR, and it would take more hours than there are in a day to curate the individual preferences of your customer base. Plus, which of your team members is watching in case any regulations change? The most efficient approach is to use an automated cookie solution to curate consent requirements based on the user’s location and more.
What Is CookiePro?
Developed by OneTrust, which offers more robust data privacy solutions for enterprises, CookiePro started as a product in the OneTrust platform. After recognizing the need among small and medium businesses for a turnkey consent tool, OneTrust spun off CookiePro as a standalone solution.
CookiePro offers plans starting at around $40 per month, making it a budget-friendly alternative to enterprise solutions like OneTrust (or the cost of a lawsuit settlement). CookiePro comes with core compliance features like user-level consent management, acceptance customization, data mapping and recordkeeping, support for over 250 user languages, and additional security features.
After helping several of our clients implement CookiePro, there are a few key features that stand out for us:
- Easy installation: It just takes a few minutes to add a snippet of code to your website to enable CookiePro. It’s compatible with Drupal, WordPress, and other major site platforms.
- Automated cookie blocking: CookiePro’s auto-blocking tool scans your website to identify third-party tracking technologies, categorizes the cookies, and automatically blocks all cookies until users have given consent.
- Robust customizations: You can tailor your CookiePro banner to match your branding by customizing colors, content, and consent language. CookiePro also allows you to customize the user experience by choosing your consent approach and giving users granular control over their cookie settings.
- Upgrade path: Whether you have a small site or one with hundreds of thousands of visitors, CookiePro can support growing business needs. If you find that you need more support or functionality, you can upgrade to OneTrust’s Trust Intelligence Platform to unify all your data privacy management activities.
- Tag management system integrations: You can integrate tag management systems with your cookie consent solution if you use analytics and other platform tags on your website. CookiePro has integrations with many major tag management systems, including Google Tag Manager and Tealium, so you don’t have to change your current setup.
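To make the auto-blocking idea above concrete, here is a conceptual sketch in Python of the core decision a consent tool automates: nothing but strictly necessary cookies get set until the user opts in to a category. The cookie names and categories below are made up for illustration; this is not CookiePro’s actual code, just the logic it handles for you.

```python
# Conceptual sketch of consent-based cookie gating. The cookie names
# and categories are hypothetical, not CookiePro's implementation.

# Cookies discovered by a site scan, grouped into consent categories.
SCANNED_COOKIES = {
    "session_id": "strictly_necessary",
    "_ga": "analytics",
    "_fbp": "advertising",
    "lang_pref": "functional",
}

def allowed_cookies(consent):
    """Return the cookies that may be set, given the user's choices.

    `consent` maps a category name to True/False. Strictly necessary
    cookies are always allowed; everything else defaults to blocked
    until the user opts in.
    """
    allowed = set()
    for name, category in SCANNED_COOKIES.items():
        if category == "strictly_necessary" or consent.get(category, False):
            allowed.add(name)
    return allowed

# A user who accepts only analytics cookies:
print(sorted(allowed_cookies({"analytics": True})))  # ['_ga', 'session_id']
```

The key design choice, and the one regulators care about, is deny-by-default: consent is opt-in per category, never assumed.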
Beyond CookiePro, there are a growing number of other cookie consent solutions on the market, such as Termly and Cookiebot by Usercentrics. The right choice for you will depend on your existing tech stack, budget, and goals — the most important step is to put something in place to protect yourself and your users.
Where Should I Start?
Taking a proactive approach is key to ensuring data privacy for your users and avoiding costly consequences. Educate yourself on the different regulations and requirements, figure out the gaps in your compliance approach, and invest in tools that can help reduce risk and manual effort for your team.
Feeling overwhelmed or need a fresh perspective? Oomph’s accessibility and compliance audit is a great place to start. We can help you go beyond cookie consent to meet Web Content Accessibility Guidelines (WCAG), Americans With Disabilities Act (ADA), and other regulatory standards, helping you mitigate risk and deliver on user expectations. Reach out to us to schedule your site audit.
With low-code and no-code development tools, anyone can be a developer. Right?
That depends. While working in low-code/no-code tools may feel like you’ve unlocked the power of the digital universe, there are still many projects that require traditional full-code solutions.
According to Zapier’s recent no-code report, over 50% of no-code users started in the past year, many of whom are self-taught. Industry analysts also expect that by 2025, over 70% of the applications organizations develop will rely on no-code/low-code tools. That’s not surprising, given that these tools lower the barrier to entry – and the cost – of developing new sites and apps.
With a slew of effective low-code/no-code solutions on the market today, the question isn’t whether you should use no-code/low-code tools to evolve your digital footprint. It’s how and when you should use them so that the tools work for your organization, not against it.
What Is Low-Code/No-Code?
There are three ways to build websites or apps: full-code, low-code, and no-code. Developers hold the keys to the proverbial full-code city, but low-code and no-code open the door to people without a coding background.
While it’s tempting to brush off low-code and no-code as “same same but different,” the differences do matter. Understanding what they are and how they work will help you choose the best route for whatever digital property you need to build.
Low-code development
Low-code development uses APIs, drag-and-drop tools, code and process templates, and more to help build websites, apps, and workflows. These tools typically require some coding skills, but nothing like what you’d need to create a full-code solution. That makes it much quicker and easier to create a product using low-code development than writing all of the code from scratch.
No-code development
No-code development uses visual builders and other simple tools that allow people without any coding skills to build digital experiences. Through drag-and-drop, visual flows, and templated plug-ins, you can build something beautiful without having to touch the code at all. They’re one step more accessible than low-code solutions, making them compelling options for organizations that need fast and cost-effective development.
Pros and Cons of Low-Code/No-Code Development
Low-code/no-code tools take a lot of the time, cost, and aggravation out of traditional development – but they’re not a cure-all for your coding challenges. Before you dive in, keep their strengths and limitations in mind.
Pros of low-code/no-code
- Speed to market: Your path to launch is much quicker with low-code/no-code tools. Stand up a simple brochure-style website or multi-step workflow in a matter of hours, rather than weeks or even months.
- Less expensive: Many businesses are on a budget that can’t flex for an experienced developer. Both low-code and no-code tools can deliver an effective product at a fraction of the price. Once you’ve built your site, you can also rely on your internal team to tweak the copy or update an image down the road instead of hiring outside help.
- Expansive options: The low-code/no-code tools on the market have expanded along with their user base. There are more options than ever before, with themes and plug-ins that look so sleek you’d think they were full code.
Cons of low-code/no-code
- Limited customization: You can create something appealing with no-code/low-code options. But will it totally look and feel like your brand? Probably not. Because these tools are designed to be easy to use, they don’t have the custom capabilities you’ll get through coding.
- Somewhat breakable: Low-code/no-code tools do use code; it’s just written and templated by someone else. That means that if you don’t use the features correctly, you can break the code – and you might still have to call in a developer to make important fixes.
- Time intensive: Drag-and-drop may sound simple, but getting it just right can actually be a time suck for people with minimal web dev skills. You may end up spending more time on your site or app than if you hired someone with development experience.
When Should You Use Low-Code/No-Code Tools?
For simple projects where hitting budgets and timelines is more important than highly customized design, low-code/no-code tools can be a great solve. They’re especially good for:
- Proof-of-concept builds: Looking to stand up a site or app for a new product? Rather than spending months on traditional development (and losing precious time in the market), low-code/no-code tools allow you to build the core functionality, test your idea with users, and gain buy-in or funding from key decision-makers. If you pick the right tool, your MVP could even serve as the foundation for a larger build later.
- Creating simple websites and apps: For basic builds like customer portals, knowledge centers, or bug ticketing systems, you don’t need to reinvent the wheel. Low-code/no-code tools offer tons of templates that make it easy to launch straightforward digital experiences.
- Updating an existing digital experience: Thanks to the bevy of CMS tools out there, you can hire a developer for an initial build and have them set up CMS functionality for simple updates later. This hybrid approach gives you the freedom to adapt your site or app as your organization evolves, while still maintaining a custom look and feel.
- Reporting dashboards and workflow automations: Need a new setup for your business intelligence? Since dashboards and workflows are mainly built for internal users, achieving a custom look isn’t typically a priority – and options abound for automating virtually any task.
What Should You Look For in a Low-Code/No-Code Tool?
Before you choose a solution, consider whether anyone on your team has basic coding skills. If yes, low-code tools may be up your alley. If not, consider narrowing your focus to the many no-code tools around.
Whichever route you go, look for these features in both low-code and no-code tools:
- An extensive marketplace: Because low-code and no-code solutions aren’t very customizable, look for a tool with robust built-in features. You’ll want to be able to choose from a diverse set of additional features, tools, and pre-built APIs that integrate with any tech you need.
- Integration support: You may need custom code solutions here and there. Make sure the low-code or no-code tool you choose will allow you to add extra code.
- Visual builder tools: With these tools, you’ll be able to edit the user interface and not the back end, which is key if you want an approachable no-code/low-code tool. Look for drag-and-drop capabilities that can be used to create things like data workflow mapping or dashboard building. Without that, you may still need someone with development experience.
- Granular permission systems: Your website shouldn’t be an open door. Look for low-code/no-code tools that allow you to restrict access to certain apps and data sets by assigning different permissions to different users (administrators, edit access, view-only, etc.).
When Should You Bring in an Agency To Build a Full-Code Solution?
Sometimes, only a custom or full-code solution will do. The more unique you want your digital property to be, the more likely it is that you’ll need to call in an expert. We also suggest you look for support if:
- You have a complex digital environment: If you need a site with hundreds of pages or a multisite architecture, you’ll probably need a custom solution to get the results you’re looking for.
- You need a custom feature: Low-code/no-code tools aren’t bottomless wells of features and plug-ins. If you need a unique-to-you feature or integration that isn’t readily available, it’s time to call in someone who can build one.
- You need a custom user interface: Cookie-cutter templates won’t work for all brands and products. If you need a specific user interface, you may not be able to achieve that within even a low-code platform.
- You want the product to be proprietary: Low-code/no-code products must be hosted within a specific low-code/no-code platform. You’ll need a custom, full-code solution if you want to own the product and sell it as a proprietary tool.
Get Help Leveraging the Right Tools for the Right Projects
You wouldn’t build a house on shaky ground, would you? Then why build an experience on a platform that might not actually be able to support it?
Though no-code/low-code tools certainly democratize the web development market, they aren’t a silver bullet. If you know that whatever you’re building is simple enough that a no-code/low-code tool and your existing team can support it, we say go for it.
But if you’re even a little uncertain, consider getting an outside opinion on how to lay a strong foundation for your next development project.
Want help deciding whether no-code, low-code, or full-code is best for you? We’d love to talk with you about your needs.
The circular economy aims to help the environment by reducing waste, mainly by keeping goods and services in circulation for as long as possible. Unlike the traditional linear economy, in which things are produced, consumed, and then discarded, a circular economy ensures that resources are shared, repaired, reused, and recycled, over and over.
What does this have to do with your digital platform? In a nutshell: everything.
From tackling climate change to creating more resilient markets, the circular economy is a systems-level solution for global environmental and economic issues. By building digital platforms for the circular economy, your business will be better prepared for whatever the future brings.
The Circular Economy Isn’t Coming. It’s Here.
With environmental challenges growing day by day, businesses all over the world are going circular. Here are a few examples:
- Target plans for 100% of its branded products to last longer, be easier to repair or recycle, and be made from materials that are regenerative, recyclable, or sustainably sourced.
- Trove’s ecommerce platform lets companies buy back and resell their own products. This extends each product’s use cycle, lowering the environmental and social cost per item.
- Renault is increasing the life of its vehicle parts by restoring old engine parts. This limits waste, prolongs the life of older cars, and reduces emissions from manufacturing.
One area where nearly every business could adopt a circular model is the creation and use of digital platforms. The process of building websites and apps, along with their use over time, consumes precious resources (both people and energy). That’s why Oomph joined 1% For the Planet earlier this year. Our membership reflects our commitment to do more collective good — and to hold ourselves accountable for our collective impact on the environment.
But, we’re not just donating profits to environmental causes. We’re helping companies build sustainable digital platforms for the circular economy.
Curious about your platform’s environmental impact? Enter your URL into this tool to get an estimate of your digital platform’s carbon footprint.
Changing Your Platform From Linear to Circular
If protecting the environment and promoting sustainability is a priority for your business, it’s time to change the way you build and operate your websites and apps. Here’s what switching to a platform for the circular economy could look like.
From a linear mindset…
When building new sites or apps, many companies fail to focus on longevity or performance. Within just a few years, their platforms become obsolete, either as a result of business changes or a desire to keep up with rapidly evolving technologies.
So, every few years, they have to start all over again — with all the associated resource costs of building a new platform and migrating content from the old one.
Platforms that aren’t built with performance in mind tend to waste a ton of energy (and money) in their daily operation. As these platforms grow in complexity and slow down in performance, one unfortunate solution is to just increase computing power. That means you need new hardware to power the computing cycles, which leads to more e-waste, more mining for metals, more pollution from manufacturing, and more electricity to power the entire supply chain.
Enter the circular economy.
…to a circular approach.
Building a platform for the circular economy is about reducing harmful impacts and wasteful resource use, and increasing the longevity of systems and components. There are three main areas you can address:
1. Design out waste and pollution from the start.
At Oomph, we begin every project with a thorough and thoughtful discovery process that gets to the heart of what we’re building, and why. By identifying what your business truly needs in a platform — today and potentially tomorrow — you’ll minimize the need to rebuild again later.
It’s also crucial to build efficiencies into your backend code. Clean, efficient code makes things load faster and run more quickly, with fewer energy cycles required per output.
Look for existing frameworks, tools, and third-party services that provide the functions you need and will continue to stay in service for years or decades to come. And, instead of building a monolith platform that has to be upgraded every few years or requires massive computing power, consider switching to a more nimble and efficient microservices architecture.
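As a toy illustration of designing out wasted compute, the sketch below memoizes an expensive operation so that repeat requests reuse the first result instead of burning energy cycles recomputing it. `expensive_report` is a hypothetical stand-in for any slow query or render in your backend.

```python
# Toy illustration of designing out wasted compute. `expensive_report`
# is a hypothetical stand-in for a slow query or render; memoizing it
# means repeat requests reuse the first result instead of recomputing.
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive_report(month: str) -> str:
    # Imagine a heavy database query and template render here.
    return f"report for {month}"

first = expensive_report("2023-12")   # computed once
second = expensive_report("2023-12")  # served from the cache
```

Small choices like this compound: fewer compute cycles per request means less hardware, less electricity, and a smaller footprint at scale.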
2. Keep products and services in use.
Regular maintenance and timely patching are key to prolonging the life of your platform. So is proactively looking for performance issues. Be sure to regularly test and assess your platform’s speed and efficiency, so you can address problems early on.
While we’re advocating for using products and services for as long as possible, if your platform is built on microservices, don’t be afraid to replace an existing service with a new one. Just make sure the new service provides a benefit that outweighs the resource costs of implementing it.
3. Aim to regenerate natural systems.
The term “regenerate” describes a process that mimics the cycles of nature by restoring or renewing sources of energy and materials. It might seem like the natural world is far removed from your in-house tech, but there are a number of ways that your IT choices impact the environment.
For starters, you can factor sustainability into your decisions around vendors and equipment. Look for digital hosting companies and data centers that are green or LEED-certified. Power your hardware with renewable energy sources. Ultimately, the goal is to consider not just how to reduce your platform’s impact on the environment, but how you can create a net-positive effect by doing better with less.
Get Ready for the Future
We’ve long seen that the ways in which businesses and societies use resources can transform local and global communities. And we know that environmental quality is inextricably linked to human wellbeing and prosperity. The circular economy, then, provides a way to improve our future readiness.
Companies that invest in sustainability generally experience better resilience, improved operational performance, and longer-lasting growth. They’re also better suited to meet the new business landscape, as governments incentivize sustainable activities, customers prefer sustainable products, and employees demand sustainable leadership.
Interested in exploring how you can join the new circular economy with your digital platforms? We’d love to help you explore your options, just contact us.
In our previous post, we broadly discussed the mindset of composable business. While “composable” can be a long-term, company-wide strategy for the future, companies shouldn’t overlook the smaller-scale opportunities that exist at every level to introduce more flexibility and longevity and to reduce the costs of technology investments.
For maximum ROI, think big, then start small
Many organizations are daunted by the concept of shifting a legacy application or monolith to a microservices architecture. This is exacerbated when an application is nearing end of life.
Don’t discount the fact that a move to a microservices architecture can be done progressively over time. Replatforming a monolith, by contrast, is a huge investment of both time and money, and the returns may not be realized for years, until the new application is ready to deploy.
A progressive approach allows organizations to:
- Move faster and allow for adjustments as needed
- Begin realizing returns on investments faster
- Reduce risk by making smaller investments and deployments
- Ease the budgeting process by funding an overhaul in stages
- Improve quality by minimizing the scope of tests
- Save money on initial investment and maintenance where services are centralized
- Benefit from longevity of a component-based system
Prioritizing the approach by aligning technical architecture with business objectives
As with any application development initiative, aligning business objectives with technology decisions is essential. Unlike replatforming a monolith, however, prioritizing and planning the order of development and deployments is crucial to the success of the initiative.
Start with clearly defining your application with a requirements and feature matrix. Then evaluate each using three lenses to see priorities begin to emerge:
- With a current state lens, evaluate each item. Is it broken? Is it costly to maintain? Is it leveraged by multiple business units or external applications?
- Then with a future state lens, evaluate each item. Could it be significantly improved? Could it be leveraged by other business units? Could it be leveraged outside the organization (partners, etc…)? Could it be leveraged in other applications, devices, or locations?
- Lastly, evaluate the emerging priority items with a cost and effort lens. What is the level of effort to develop the feature as a service? What is the likely duration of the effort?
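One hypothetical way to turn the three lenses into an emerging priority list is a simple scoring matrix. The features, scores, and formula below are purely illustrative, not a prescribed method; the point is that scoring current-state pain and future-state opportunity against extraction effort surfaces candidates quickly.

```python
# Hypothetical scoring sketch of the three-lens evaluation.
# Features and scores (1-5) are illustrative placeholders.

features = {
    # feature: (current_state_pain, future_state_opportunity, effort)
    "member lookup":  (5, 5, 2),
    "billing export": (2, 3, 4),
    "notifications":  (4, 4, 5),
}

def priority(current, future, effort):
    # Favor items that hurt today and unlock value tomorrow,
    # discounted by the effort to extract them as a service.
    return (current + future) / effort

ranked = sorted(features, key=lambda f: priority(*features[f]), reverse=True)
print(ranked)  # ['member lookup', 'notifications', 'billing export']
```

Even a rough model like this gets stakeholders arguing about the right things: which capabilities matter most, and what each one costs to carve out.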
Key considerations when planning a progressive approach
Planning is critical to any successful application development initiative, and architecting a microservices based architecture is no different. Be sure to consider the following key items as part of your planning exercises:
- Remember that rearchitecting a monolith feature as a service can open the door to new opportunities and new ways of thinking. It is helpful to ask, “If this feature were a standalone service, we could __”
- Be careful of designing services that are too big in scope. Work diligently to break down the application into the smallest possible parts, even if it is later determined that some should be grouped together
- Keep security front of mind. Where a monolith allows for a straightforward security policy, with everything under one roof, a services architecture offers the opportunity for a more customized policy, but it also requires you to define how separate services are allowed to communicate with each other and the outside world
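As a sketch of what “defining how services communicate” can look like, the following deny-by-default allowlist makes every permitted call explicit. The service names are hypothetical, and in practice this policy might live in a gateway config or a service mesh rather than application code; the principle is the same.

```python
# Illustrative service-to-service communication policy with
# hypothetical service names. In a monolith these calls are implicit;
# in a services architecture they must be defined somewhere: a gateway
# config, a service mesh policy, or code like this.

ALLOWED_CALLS = {
    ("web-frontend", "member-lookup"),
    ("web-frontend", "billing"),
    ("billing", "member-lookup"),
}

def may_call(caller: str, callee: str) -> bool:
    """Deny by default: only explicitly allowed service pairs may talk."""
    return (caller, callee) in ALLOWED_CALLS
```

Note that permissions aren’t mutual: billing may call member lookup, but nothing here lets member lookup reach back into billing, which keeps the blast radius of a compromised service small.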
In summary
A microservices architecture is an approach that can help organizations move faster, be more flexible and agile, and reduce costs on development and maintenance of software applications. By taking a progressive approach when architecting a monolith application, businesses can move quickly, reduce risk, improve quality, and reduce costs.
If you’re interested in introducing composability to your organization, we’d love to help! Contact us today to talk about your options.
While the terminology was first spotlighted by IBM back in 2014, the concept of a composable business has recently gained much traction, thanks in large part to the global pandemic. Today, organizations are combining more agile business models with flexible digital architecture, to adapt to the ever-evolving needs of their company and their customers.
Here’s a high-level look at building a composable business.
What is a Composable Business?
The term “composable” encompasses a mindset, technology, and processes that enable organizations to innovate and adapt quickly to changing business needs.
A composable business is like a collection of interchangeable building blocks (think: Lego) that can be added, rearranged, and jettisoned as needed. Compare that with an inflexible, monolithic organization that’s slow and difficult to evolve (think: cinderblock). By assembling and reassembling various elements, composable businesses can respond quickly to market shifts.
Gartner offers four principles of composable business:
- Discovery: React faster by sensing when change is happening.
- Modularity: Achieve greater agility with interchangeable components.
- Orchestration: Mix and match business functions to respond to changing needs.
- Autonomy: Create greater resilience via independent business units.
These four principles shape the business architecture and technology that support composability. From structural capabilities to digital applications, composable businesses rely on tools for today and tomorrow.
So, how do you get there?
Start With a Composable Mindset…
A composable mindset involves thinking about what could happen in the future, predicting what your business may need, and designing a flexible architecture to meet those needs. Essentially, it’s about embracing a modular philosophy and preparing for multiple possible futures.
Where do you begin? Research by Gartner suggests the first step in transitioning to a composable enterprise is to define a longer-term vision of composability for your business. Ask forward-thinking questions, such as:
- How will the markets we operate in evolve over the next 3-5 years?
- How will the competitive landscape change in that time?
- How are the needs and expectations of our customers changing?
- What new business models or new markets might we pursue?
- What product, service, or process innovations would help us outpace competitors?
These kinds of questions provide insights into the market forces that will impact your business, helping you prepare for multiple futures. But you also need to adopt a modular philosophy, thinking about all the assets in your organization — every bit of data, every process, every application — as the building blocks of your composable business.
…Then Leverage Composable Technology
A long-term vision helps create purpose and structure for a composable business. Technology provides the tools that bring it to life. Composable technology begets sustainable business architectures that are ready to address the challenges of the future, not the past.
For many organizations, the shift to composability means evolving from an inflexible, monolithic digital architecture to a modular application portfolio. The portfolio is made up of packaged business capabilities, or PBCs, which form the foundation of composable technology.
The ABCs of PBCs
PBCs are software components that provide specific business capabilities. Although similar in some respects to microservices, PBCs address more than technological needs. A specific application may leverage a microservice to provide a feature; when that feature represents a business capability beyond just the application at hand, it is a PBC.
Because PBCs can be curated, assembled, and reassembled as needed, you can adapt your technology practically at the pace of business change. You can also experiment with different services, shed things that aren’t working, and plug in new options without disrupting your entire ecosystem.
When building an application portfolio with PBCs, the key is to identify the capabilities your business needs to be flexible and resilient. What are the foundational elements of your long-term vision? Your target architecture should drive the business outcomes that support your strategic goals.
Build or Buy?
PBCs can either be developed internally or sourced from third parties. Vendors may include traditional packaged-software vendors and nontraditional parties, such as global service integrators or financial services companies.
When deciding whether to build or buy a PBC, consider whether your target capability is unique to your business. For example, a CMS is something many businesses need, and thus it’s a readily available PBC that can be more cost-effective to buy. But if, through vendor selection, you find that your particular needs are unique, you may want to invest in building your own.
Real-World Example
While building a new member retention platform for a large health insurer, we discovered a need to quickly look up member status during the onboarding process. Because the company had a unique way of identifying members, it required building custom software.
Although we initially conceived the feature in the context of the platform being created, a composable mindset led us to develop a standalone, API-first service — a true PBC providing member lookup capability to applications across the organization, and waiting to serve the applications of the future.
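As a simplified sketch, the member lookup capability might boil down to something like this. The ID format, names, and data below are entirely hypothetical, and the real service sits behind an HTTP API; the point is that the capability stands alone, ready for any application to call.

```python
# Minimal sketch of an API-first member-lookup capability. The ID
# format, names, and data are hypothetical placeholders; a real PBC
# would expose this function behind an HTTP API.

import re

# The insurer's custom ID scheme (illustrative): region code + digits.
MEMBER_ID_PATTERN = re.compile(r"^[A-Z]{2}-\d{6}$")

# Stand-in for the system of record.
_MEMBER_STATUS = {
    "NE-123456": "active",
    "SW-654321": "lapsed",
}

def lookup_member_status(member_id: str) -> str:
    """Return a member's status, reusable by any application."""
    if not MEMBER_ID_PATTERN.match(member_id):
        raise ValueError(f"malformed member id: {member_id}")
    return _MEMBER_STATUS.get(member_id, "unknown")
```

Because the lookup lives outside any one application, the onboarding flow that prompted it is just the first consumer; future apps plug into the same capability without reimplementing it.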
A Final Word
Disruption is here to stay. While you can’t predict every major shift, innovation, or crisis that will impact your organization, you can (almost) future-proof your business with a composable approach.
Start with the mindset, lay out a roadmap, and then design a step-by-step program for digital transformation. The beauty of an API-led approach is that you can slowly but surely transform your technology, piece by piece.
If you’re interested in exploring a shift to composability, we’d love to help. Contact us today to talk about your options.