Where to Find Hidden Intelligence in Unstructured Data

Producing dashboards and generating reports from structured data has long been the standard method for extracting insights from accumulated customer information. While this approach has worked well in industries such as online retail, where every consumer click, swipe, or purchase is stored in a relational database, many sectors, such as healthcare, have troves of unstructured data that sit unused.

Deciphering signal from this enormous pile of noise presents companies with a tremendous opportunity to better understand the customer behaviors that lead to better customer experiences.

Specifically, healthcare could see a wave of innovations in preventative care and transformative patient experiences based on new intelligence from previously untapped unstructured data. With an estimated 90% or more of data in unstructured form, this will be the new gold rush in customer analytics.

[Image: The expanding digital universe]

 

Sorting Through the Universe of Unstructured Data

So, what exactly is unstructured data, and why is there so much of it? Virtually every person-to-person communication today produces some form of unstructured data, typically in raw text-based form. Text messages, chat, documents, and emails are the primary culprits. Now, with video, audio, and still images, the universe of unstructured data has expanded dramatically, but it comes in forms even more challenging to interpret in a large-scale, automated way.

For example, drones are now used to capture video and still images for everything from homeowners insurance claims to measuring customer traffic patterns at shopping centers and amusement parks by photographing parking lots at various times. The challenge lies in evaluating this mountain of data without requiring human intervention; otherwise, it just won’t scale.

Medical records present a more mundane but potentially transformational example of the power of unstructured data. A typical patient record likely contains a jumble of hand-written notes and documents that may or may not be held within a “certified” Electronic Medical Records (EMR) system. While EMR systems have been adopted by 90% of office-based physicians, consensus estimates are that 80% of that data is in unstructured form. As a result, it remains largely untapped for more proactive and preventative individual care, or for use across patient populations to inform research and public policy. The reason is that these records are likely stored as scanned image files or PDF documents rather than in a relational database that is easy to search, extract, and analyze.
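For a sense of what unlocking such records involves, here is a minimal sketch of converting a scanned PDF into searchable text, assuming the open-source pdf2image and pytesseract packages (plus the Poppler and Tesseract system tools) are installed; the file name is hypothetical.

```python
# A minimal sketch: render each page of a scanned record as an image,
# then OCR it into plain text that can be indexed or loaded into a database.
from pdf2image import convert_from_path  # requires the Poppler utilities
import pytesseract                       # requires the Tesseract OCR engine

pages = convert_from_path("patient_record.pdf", dpi=300)  # hypothetical file
text = "\n".join(pytesseract.image_to_string(page) for page in pages)

print(text[:500])  # now searchable, extractable, analyzable
```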

The Opportunity in Healthcare

IU Health is using data analytics to develop a deeper understanding of their patients to improve the patient experience as well as health outcomes. They do this by augmenting their internal data with external data to help spot correlations that could lead to more targeted intervention. According to Richard Chadderton, senior vice president, engagement and strategy, “It is promising that we can augment patient data with external data to determine how to better engage with people about their health. We are creating the underlying platform to uncover those correlations and are trying to create something more systemic.” They are looking at correlations, such as between housing density and high disengagement from health, to spot areas for early intervention.

By taking this “outside-in” approach, providers like IU Health can capture the full context of individual patient behaviors and outcomes. With better understanding, you can imagine a more consumer-like approach to healthcare that would lend itself to use cases such as:

Proactive and preventative care – Patients and doctors with better longitudinal data, delivered via shared online dashboards, could map trends in vitals, cholesterol, blood sugar, weight, and other factors to anticipate and head off more significant problems. Payers could provide financial incentives for patients to participate, lowering premiums for participation and for hitting certain milestones. Patients with mobile phones could receive alerts and text messages that “nudge” them to continue to monitor their health and adhere to a schedule of preventative care visits that includes blood tests and physicals.

Population-wide data use – By aggregating more granular individual patient data, providers and payers can start to isolate the root causes of broader health problems and develop community-wide preventative programs to educate patient populations and drive behaviors that lead to better overall outcomes. 

The key is to start to get better data on individual patients by unlocking what’s hidden in unstructured form, uncovering trends at the individual patient level, and discovering correlations at the community level that drive innovations in preventative care.  After all, the best outcome is less healthcare and more healthy people.  

Photo credit: Crew on Unsplash

Why the Data Revolution Stalled in Healthcare

For consumers, the digital and data revolution is well underway. From your smartphone, you can access everything in your personal and work lives: paying bills, transferring money, or ordering any conceivable product on Amazon—all while sipping that latte at your local café. But what you can’t do is access or manage your health records. That requires a phone call to the doctor’s office, a request form that needs to be faxed, resulting in hard copies that need to be mailed. All accomplished in roughly 2-3 weeks if you’re lucky.

Unless you feel nostalgic for the bygone era of curly paper and the screeching/whooshing noise of the fax machine making a connection (which always elicited a satisfying grin), this way of handling patient data is, let’s just say, suboptimal. The problem goes well beyond consumer convenience, as the same lack of access limits clinicians’ ability to better understand a patient’s medical history. Furthermore, with records locked away in file cabinets, there is no opportunity to anonymize and aggregate data to better understand correlations between treatments or procedures and outcomes across patient populations. Right now, Facebook and Google are capturing data on virtually every online activity and likely have more digitized personal and behavioral data on consumers than any health organization.

Healthcare Data Gap

In their Age of Analytics study on the adoption of data-driven practices across industry sectors, the McKinsey Global Institute (MGI) found that U.S. healthcare lagged other industries in “capturing value from data and analytics.” Not surprisingly, location-based data (GPS) and U.S. retail led the way in value capture (see chart), as turn-by-turn directions and one-click purchasing have become essential capabilities on smartphones. According to MGI, U.S. retail has been able to capture three to four times the value that healthcare has, with GPS presenting an even stronger case, at roughly five to six times.

[Chart: MGI, value captured from data and analytics by sector]

These sectors were able to leap ahead due to the widespread adoption of standardization and integration of technology. Standard protocols for GPS technology, first established by the military, have been around for decades. The accelerated growth of the mobile platforms that drove online retail was made possible by standardization on operating systems (iOS and Android) and on languages and frameworks like Python and Ruby on Rails, which armies of developers used to accelerate development cycles and enable interoperability.

The question remains: how does healthcare catch up? How does the industry start to capture greater value from digital technologies, data, and analytics? Let’s start with what’s getting in the way.

Barriers to Adoption

In their research, MGI identifies several barriers to greater adoption of analytics in healthcare, including:

  • Lack of incentives,
  • Difficulty of process and organizational changes,
  • Shortage of technical talent,
  • Data-sharing challenges, and
  • Regulations

Scanning this list, you start to understand why healthcare lags behind, say, retail in adopting analytics. Let’s face it: healthcare is a beast. It’s highly regulated, decentralized, specialized, complex, and fraught with risk (legal, financial, ethical). Purchasing diapers and batteries online is not exactly open-heart surgery.

While direct comparisons are difficult to make, perhaps we can draw some parallels to the underlying conditions within certain sectors, like retail, that drive the wider use of data and analytics to improve decision-making and create better customer experiences. Personalization, choice, and consumer control are the hallmarks of successful retailers today, particularly the online, digital-native firms. Beyond the obvious example of Amazon, there are specialty retailers like Warby Parker and Zappos, and prepared-food providers like Blue Apron, that combine choice, convenience, and personalization fueled by data (customer profiles as well as purchase history). To make this work, consumers today have a tacit understanding with online retailers: they know that with every interaction these companies are collecting more information about them. They are fine with this to the extent the companies use this information to understand consumers’ needs, wants, desires, and preferences and deliver something of value in return. As Google discovered with Gmail ads, when the value exchange becomes unbalanced — i.e., pummeling your inbox with random ads — consumers will rebel.

The fact remains that the digital revolution of the last decade was consumer-focused and consumer-led. The customer was at the center of the action. This is where incentives must be focused. The fundamental difference with healthcare is that historically the consumer has been a bystander, as providers and payers interacted to control the type of care, determine prices, and handle payments. The industry has looked and acted more like traditional B2B: slow to innovate, adopt new technologies, and experiment with radically new ways of doing business. Until the consumer is front and center, informed, empowered, and able to make their own decisions, things won’t change. Healthcare needs to get personal.

Personalized Health Using Data

The proliferation of portable monitoring devices—be they commercially available heart rate monitors or sophisticated diagnostic equipment—will produce vast amounts of new data. The question is, will this just create more “noise,” or will it lead to changes in patient behavior, more effective treatment options, and better outcomes? There are some early signs of progress. Essentia Health instituted home monitoring for patients with congestive heart failure and saw a significant drop in readmission rates (2% vs. the 25% industry average).

The answer to whether data can make a difference lies in the level of actual or perceived value the patient gets out of the data-sharing deal. In a consumer-focused world, it’s all about what’s in it for me: convenience, choice, cost savings. If consumers are going to make an effort to share their data (not to mention overcoming privacy concerns), what do they get in return? What they need first is to feel like they are in control, that they can influence the process. It’s no secret why buying a car or a house are two of the most stressful and least satisfying consumer experiences. They involve complicated, consequential decisions with very little control and limited access to information. Online players like TrueCar and Zillow have started to chip away at this, offering greater transparency through access to historical price comparisons and enabling consumers to make more informed decisions.

For the data revolution to take hold in healthcare, this same level of consumer control, driven by data transparency, will need to become a reality. Only then will consumer demand begin to grow such that payers and providers will have to innovate and compete based on publicly available data on quality of care and cost, and private, consumer-controlled patient profiles. The result will be better outcomes for all. 

 

Photo credit:  CommScope via Visualhunt / CC BY-NC-ND

Customer Metrics that Matter in Distribution

Having satisfied customers is your goal, right? But is satisfaction enough? What about loyalty and advocacy? You want to create advocates who will tell your story and recommend you to friends and colleagues. You may be using Net Promoter Score (NPS) to gauge overall customer satisfaction and loyalty, but you also need detailed feedback that is actionable, that allows you to pinpoint problems across your customers’ journey. Also, according to research by the Corporate Executive Board (CEB), customer satisfaction alone can be a poor indicator of customer loyalty. They found that 1 in 5 (20%) of self-described “satisfied” customers said they intended to leave the company in question, and more than 1 in 4 (28%) of “dissatisfied” customers intended to stay.

So, what does create loyalty?  Reducing customer effort makes a difference. 

From the CEB research: “First, delighting customers doesn’t build loyalty; reducing their effort—the work they must do to get their problem solved—does. Second, acting deliberately on this insight can help improve customer service, reduce customer service costs, and decrease customer churn.”

Using NPS is still essential: it gives you a consistent metric for tracking the degree to which customers are satisfied, based on their willingness to recommend your product or service. This provides a good baseline, but you need to uncover what’s driving your NPS scores. Adding the Customer Effort Score (CES) to your arsenal to complement your NPS metrics will give you the causal relationships you’re looking for related to how customer effort impacts satisfaction and loyalty.
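For the mechanics, here is a minimal sketch of how both metrics are typically computed; the survey responses are invented for illustration.

```python
# NPS: percent promoters (9-10) minus percent detractors (0-6) on a 0-10
# "would you recommend us?" scale. CES: average of 1-7 effort ratings.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def ces(scores):
    return sum(scores) / len(scores)

recommend = [10, 9, 8, 6, 10, 7, 3, 9]  # illustrative responses
effort = [6, 7, 5, 2, 6, 4, 3, 7]       # "how easy was it to resolve your issue?"

print(f"NPS: {nps(recommend):+.0f}")    # +25
print(f"CES: {ces(effort):.1f} / 7")    # 5.0
```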

Advantages of Using Customer Effort Score

In the manufacturing and distribution sectors, CEB’s Lara Ponomareff notes that there are several advantages to using CES for these types of firms, including:

  1. Practical in nature – Distributors can make tangible changes to their customer service programs by identifying customer pain points. By gauging effort at various touch points, from the initial support call to final issue resolution, managers can isolate process inefficiencies, ineffective policies, or employees who need coaching or counseling.
  2. Subjective – Ponomareff notes that the notion of customer effort is subjective; in CEB’s research they found that “effort is actually two-thirds of what we call ‘feel,’ or how I felt about the interaction as a customer, and only one-third is what we call ‘do,’ or the actions and steps I actually had to take.” Closely monitoring and measuring customer service outcomes can help understand and influence perceptions of effort. For example, redirecting customers to a self-service site with ready answers and sophisticated search may seem like a low-effort alternative for getting a question answered. But after just a few minutes of scanning FAQs and prior posts, and only getting partial answers, the perceived level of effort can quickly build, leading to frustrated customers.
  3. Actionable – The level of effort can vary throughout the stages of the customer journey, and these variances can help you pinpoint problem areas. Your online product discovery, selection, and checkout may be a breeze, but when a problem occurs that requires escalation to a contact center, the customer’s perceived effort may start to climb. Interminable wait times, multiple hand-offs, and poorly trained or ill-informed reps can make issue resolution seem laborious. Measuring CES at each point of interaction can help isolate issues and fix problems.
  4. Reduces cost – Callbacks, escalations, and backlogged customer complaints can be costly, as they require staff to resolve current issues and be on call to address the next wave of problems. By isolating issues using CES, customer service operations can adopt practices that preempt or anticipate future issues. For example, by tracking issues or questions related to the initial use of a specific product after purchase, companies could send a link to a how-to video that addresses the most common questions related to the product and preempts calls to the contact center.

Conclusion

On the adequacy of customer satisfaction as a loyalty metric, perhaps customer experience consultant Joseph Michelli put it best: “At the end of the day, the customer satisfaction score is very little more than a measure of your competence – your perceived competence – in the mind of a customer.” Reinforcing the competitive imperative facing companies today, he emphasizes that, “If you can’t satisfy 90 percent of your customers 90 percent of the time, you should be in another career field.”

A cautionary note indeed for those relying on satisfaction alone to gauge loyalty. Adding metrics like CES could yield additional insights that will reduce churn and service expenditures, ultimately making your customers happier and you more competitive.

Why There Is Major Disruption Ahead for the Food Service Industry

Guest Post by Matt Gutermuth, Founder G7 Leadership

From the moment that Julia Child entered our living rooms over 50 years ago, bringing refined French cuisine to the masses, the American palate changed forever. Today, celebrity chefs receive unprecedented exposure across all forms of media (television, web, social, etc.), and names like Bobby Flay, Giada De Laurentiis, and Guy Fieri are now as recognizable as superstar athletes and rock stars.

We now have an entire consumer segment of “Foodies” who eat out more often, enjoy trying global fare, and are much more educated about the food they eat. They want natural, organic, locally sourced, clean-label products, whether they are enjoying a meal at their favorite restaurant or shopping at their neighborhood grocery store. In 2016, for the first time in history, retail sales at US eating establishments surpassed those of grocery stores. And there has been a steady supply of new restaurants to meet this demand. In 2001 there were 469,018 restaurants in the country. By 2016, that number had jumped to just over 600,000, an increase of roughly 30%.

Data: US Census Bureau; US Retail Spending



But if you start to dig a little deeper into the restaurant growth story, there are some troubling signs and legitimate concerns about overcapacity — or dare we say a “bubble.” According to NPD, in 2016 the number of independent restaurants in the US dropped by 3%, and the overall number of restaurants (independent and chain) fell by 1%. While certain segments of food service are maintaining moderate sales growth, NPD data shows that the casual dining and midscale/family dining segments continue to be soft. Visits to casual dining restaurants fell by 4 percent, and midscale/family dining lost 3 percent of its trips, during the first quarter of 2017. In addition, consumers are defining “dining out” much differently than they have in the past. Traditional restaurants are losing share to food retailers (Wegmans, Whole Foods, HEB, etc.), convenience stores (Wawa, Sheetz, etc.), and meal kits (Blue Apron, HelloFresh, Plated, etc.). Many food retailers now offer restaurant-quality meals that can be consumed on site or brought home at a value price point. In fact, the fastest-growing segment of food retail happens to be food service offerings created and sold inside the grocery store. Convenience stores are also beginning to steal restaurant trips with their food service offerings. If you have been inside a Wawa lately, you have probably noticed that it is certainly not your father’s gas station!

We are more fascinated with food than ever, and admittedly the majority of us don’t know how to cook. A growing number of online providers like Blue Apron, HelloFresh, and Plated deliver restaurant-quality meals with the prep work already done and easy-to-follow instructions, so that even those of us who struggle to boil water can create terrific gourmet meals in our own kitchens. With these meal kits, household celebrity chefs now have other ways to get their gourmet fix without going to a restaurant or a grocery store.

Larger Trends

As we look ahead to the next decade, technology will challenge the status quo at traditional grocery chains and restaurants. Today’s consumer has much different expectations than just 10 years ago (thank you, iPhone, Amazon, and Google!). We expect to engage with brands on our terms, the way we choose to, not the way the brand “markets” to us. We expect things now, customized to our liking. Amazon can get you what you want, in some cases within the hour, and Google can provide any answer in seconds. Now that Amazon has entered the food industry directly with their acquisition of Whole Foods, we could see the greatest disruption the food industry has seen in over 50 years.

Emerging Leaders

The lines between food service and food retail are already blurred and will only become more so over time. The daily question we all face, “What do you want to do for dinner?”, was once black and white, with “cooking” representing a trip to the grocery store and “eating out” representing a trip to a restaurant. That daily decision is no longer black and white, as restaurant-quality meals become a larger, more profitable, and growing segment for grocery stores, convenience stores, and meal kit providers. Leading the way is Wegmans, which has unseated both Publix and Trader Joe’s as America’s favorite grocery chain. The research firm CRC projects that prepared foods will represent 6.7% of grocery store sales five years from now, up from just 1.7% five years ago, and predicts that prepared food sales could exceed $65 billion annually by then. Today, “going out” to eat no longer exclusively means a trip to your local restaurant; it is increasingly likely to mean a Wegmans, Whole Foods, HEB, or any number of other traditional grocery retailers that continue to improve their prepared offerings every day. Every dollar spent on prepared meals, and every trip made to “dine out” in traditional grocery outlets, represents a lost opportunity for the food service channel. Food retailers have been successful at growing food service because they deliver all the essential elements that drive consumer behavior, which according to NPD’s Warren Solochek include “convenience, quality food, value, and a positive experience.”

Lessons from Walmart

When Walmart entered the grocery business, there was a good bit of skepticism, with established grocery chains scoffing at the notion that a big-box, non-food retailer could possibly be successful selling food. “They don’t know our business” and “Food is much more difficult than TV sets” was the traditional food retail perspective at the time. By the late 1980s, Walmart had proved to be a fast learner, launching their first Supercenter in 1987. They are now the largest “grocery store” in the country, with over 21 percent market share of the U.S. traditional grocery industry. Walmart’s mastery of supply chain and logistics, honed over previous decades, enabled them to execute their mission of “Save Money, Live Better” and changed how consumers bought their groceries. They became, in essence, a supply chain and logistics company with stores that sold food. Their inventory management and cost controls continue to be the envy of the industry and create a significant competitive advantage. They have forever changed how products are sourced, transported, and priced on the shelf. Walmart’s size and scale, coupled with their unmatched supply chain and logistics expertise, has put enormous pressure on their traditional food retail competitors and in many ways changed how consumers shop for food.

The Future with Amazon

Perhaps the most disruptive force in the food industry today is Amazon. Arguably, Amazon has the most robust household information of any retailer in the world (brick-and-mortar or online). Consumers today require customization and want personalization, and Amazon is poised to deliver both in a way that other retailers can’t and won’t be able to for some time. Couple their knowledge of the consumer with supply chain and logistics expertise that rivals Walmart’s, and it is not a stretch to suggest that Amazon is a formidable threat. Much of what we heard in the ’80s when Walmart entered the food business, we are hearing again in reference to Amazon’s desire to win in food. Assuming that the acquisition of Whole Foods gets completed, Bezos has already done what most naysayers claimed he couldn’t do — quickly scale a physical food presence across the United States. With over 450 stores (mini distribution centers) located in sought-after locations (affluent neighborhoods), he’s done just that. So, what does this mean for the industry, and which players will be impacted the most? Food retail? Food service? The short answer is all of the above. Amazon’s move into this space is an absolute game-changer, and the impact will be felt across the food landscape (restaurants and grocery stores) for years to come. Amazon will unleash their superior household-level insight along with their supply chain and logistics expertise to once again change how the consumer shops for and interacts with food, much like Walmart did in the ’80s and ’90s. In addition to their operational advantages, Amazon doesn’t have to play by the same rules on Wall Street as their traditional competitors do. This may change over time as Amazon continues to grow, but today their profit expectations are very different from those of other food industry companies trying to compete, freeing them up to take risks that others can’t.

Amazon is betting that they can innovate faster and execute better than incumbents by using their technology, knowledge of the consumer, and supply chain and logistics advantages to change how both consumers and culinarians purchase and interact with food. They have redefined choice and convenience for consumers in the online world, and who’s to say they can’t do the same in food service or food retail? Amazon has already built out local distribution centers that enable same-day delivery of merchandise. How soon will they scale up food delivery to the home? When do they start supplying directly to restaurants? Price transparency, convenience, and choice are not easy to offer in the food service world today, but they are becoming an expectation of consumers everywhere. Some of these same consumers are chefs and restaurant owners who will welcome the transparency and ease of doing business that Amazon currently provides. In addition, most of these culinarians are probably already Prime or Amazon Business customers (over 50% of US households are Prime members). Can Amazon deliver the impossible: size, scale, and highly differentiated offerings that are personalized? I would not bet against them!

Conclusion

In the face of this relatively new and formidable threat, coupled with a more educated and demanding consumer, food retailers, restaurants, and the entire ecosystem that supports and supplies them must reevaluate everything they do. It has always been important to start with the consumer, but in today’s environment the consumer has more control than ever before, and the failure to keep up with them will be devastating. If you are a food company that does not have the consumer front and center (in how you operate every single day, not just in a company slogan or mission statement), you will struggle to compete and survive in this new world.

Amazon operates each day with a “Day 1” mindset. This approach to the business enables them to move quickly and provide consumers with things they didn’t even realize they wanted. A large segment of the food industry is just now building online ordering capability and an omni-channel strategy. Amazon is already there and is close to taking “ordering” completely out of the equation with automated replenishment. Satisfying a very different consumer and competing in this new digital world will require new thinking and bold leadership. Ask yourself: is the product or service you provide fast, convenient, transparent, and easy? If not, that is a gap that Amazon will exploit. Ask yourself: are your consumer performance standards high enough, and are you delivering on those standards at a rate that will enable you to compete in the future? If not, you need to challenge your existing metrics as they pertain to the consumer/customer.

Being “good” is simply not enough when the expectation is great!  The most important first step is the realization that the competitive landscape has shifted dramatically and only those able to adapt and change will survive.  There is some time to react, but the clock is definitely ticking. 

About Matt: Matt was formerly President & CEO of Safeway.com and held senior executive positions at Sysco and Winn-Dixie. He is now founder of G7 Leadership, inspiring others to be great leaders by sharing 25+ years of leadership experience to help others navigate change.

 

Photo credit: Premshree Pillai via Visual Hunt

 

 

 

Machine Learning and Price Optimization

Determining what price to charge for your product or service can at times seem deceptively easy: figure out what your competition is doing and either match or beat that price. This approach works fine for commodity markets, where prices are transparent and comparisons are easy. But what happens when there aren’t readily available comparisons? Or when products have “similar” characteristics, but you can’t access the other information that may have influenced pricing decisions, such as bundled services, seasonality, or location? In that case, you often go with a gut feeling or accumulated experience to price products and hope there is adequate demand at your chosen price level.

For most organizations, even if they did have data on all the product characteristics and external factors, they would still struggle to process this information in a way that informs day-to-day pricing decisions. The task of combing through and analyzing large data sets, determining correlations, and assigning weighting factors to various product characteristics and other variables is still beyond the capabilities of most organizations’ technology stacks, which were designed for managing supply chains and customers, not high-powered analytics.

Machine learning technology is starting to fill this gap, and traditional companies and startups alike are changing how pricing is done, using smart analytics, processing power, and human intuition to optimize pricing. Let’s take a look at a couple of real-world applications in the insurance and hospitality industries.

Insurance Industry Application

As one of the largest insurers in the world, AXA has massive amounts of data on customer claim histories, and they are putting this to good use to help prevent large loss claims. Every year, 7%-10% of the company’s customers cause an accident, with most involving small claims of hundreds or thousands of dollars. 

However, approximately 1% of customers are involved in “large-loss” cases of over $10,000, and AXA needed a better way to predict, and hence prevent, the number and size of these cases. They had been using a more traditional machine learning technique called Random Forest but were getting prediction accuracy rates of less than 40%. In the hope of better results, they started using Google’s TensorFlow deep learning framework and saw their prediction accuracy climb to 78%. They were able to do this by tapping into the advanced neural network models that Google has been refining over the years and combining them with the scale of Google’s cloud offerings to deliver the computing power necessary to handle the processing load. AXA is now in a position to accurately price risk based on a better understanding of the attributes of policyholders and other factors that lead to large-loss cases.
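To make the modeling progression concrete, here is an illustrative sketch of a Random Forest baseline versus a simple TensorFlow neural network on a synthetic, heavily imbalanced dataset. AXA’s actual features, architecture, and data are not public in this detail, so everything here is a stand-in.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
import tensorflow as tf

# Synthetic claims data: ~1% positive class, mimicking rare large-loss cases
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.99],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: Random Forest
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("RF accuracy:", rf.score(X_test, y_test))

# Deep learning alternative (TensorFlow/Keras)
nn = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
nn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
nn.fit(X_train, y_train, epochs=10, batch_size=64, verbose=0)
print("NN accuracy:", nn.evaluate(X_test, y_test, verbose=0)[1])
# Note: with this much class imbalance, a metric like AUC is more
# informative than raw accuracy in practice.
```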

Airbnb’s Pricing Algorithm

Airbnb’s pricing challenge is a bit more complicated than most, as the users, not the company, are responsible for setting prices. To support hosts in their pricing decisions, the company needed to provide the tools and data to help them optimize the price received while maintaining occupancy levels. While conducting user research, Airbnb observed that during the initial sign-up process, when hosts came to the pricing page, they immediately began to search for other similar properties. Not only was this a laborious and time-consuming process, but hosts often had trouble locating similar properties. They discovered what most people learn when trying to sell their home: it’s tough to find exact comparables. They also had to contend with pricing comps across an entire city, spanning multiple neighborhoods. Airbnb needed a way to automate this analysis and provide meaningful price guidance to hosts.

So the technical team at Airbnb set their sights on solving two problems: (1) automate the property comparison process, and (2) understand supply and demand dynamics to make timely price adjustments.

Unlike eBay, where there aren’t any location or time dependencies — you can buy and sell anything from anywhere at any time — lodging is very location- and date-specific. And in the Airbnb model, listings can be as varied and idiosyncratic as the people who own the properties. To solve for this, Airbnb developed a list of the prime characteristics of properties, applied weightings to each one based on their importance to potential renters, and then ran these assumptions against years of transaction data to model against actual outcomes (i.e., what was the final price).
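As an illustration of the general approach (not Airbnb’s proprietary model), here is a sketch that encodes hypothetical listing characteristics, learns their relationship to realized booking prices from synthetic historical data, and suggests a price for a new listing.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 2000
# Hypothetical listing features: bedrooms, review count, photo quality
# score, distance to city center (km), weekend flag.
X = np.column_stack([
    rng.integers(1, 5, n),      # bedrooms
    rng.integers(0, 200, n),    # number of reviews
    rng.uniform(0, 1, n),       # photo quality score
    rng.uniform(0, 20, n),      # distance to center
    rng.integers(0, 2, n),      # weekend stay?
])
# Synthetic "realized price" with plausible feature effects plus noise
price = (40*X[:, 0] + 0.2*X[:, 1] + 30*X[:, 2] - 2*X[:, 3] + 15*X[:, 4]
         + rng.normal(0, 10, n))

model = GradientBoostingRegressor().fit(X, price)
new_listing = [[2, 25, 0.8, 3.5, 1]]
print(f"Suggested nightly price: ${model.predict(new_listing)[0]:.0f}")
```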

Image: Airbnb pricing tool


They were looking at how each variable correlated with price, to understand the key drivers of value and to inform their pricing engine to make better pricing recommendations. They discovered, for example, that the number of ratings correlated with higher demand, and that the use of certain types of photos translated into higher prices. Surprisingly, professional photos of living rooms didn’t fare as well as the cozy bedroom shots taken by the owner.

With these new insights, Airbnb was able to provide a more useful pricing tool for hosts that not only allowed them to price their properties based on a more comprehensive comparative analysis but also provided dynamic pricing recommendations in response to changing demand. Similar to how airlines handle pricing, hosts get ongoing guidance based on market conditions so they can make adjustments that will drive higher occupancy.

Conclusion

Machine learning augments human decisions by narrowing a set of choices. But just running a “black box” in the background that produces a miraculous answer is not sufficient. In the examples above, insurance agents need to be able to explain the rationale behind premium differences, and rental hosts need to understand why and how price recommendations were determined, to maintain trust and confidence in the information. It’s important to keep in mind that while the machine learns and provides answers, humans still need to explain what it means and why the results should be trusted. The product lead for Airbnb put it best: “We wanted to build an easy-to-use tool to feed hosts information that is helpful as they decide what to charge for their spaces while making the reasons for its pricing tips clear.”

A Closer Look at Einstein, Salesforce's New AI Features

Is there any promise for the use of AI in sales and marketing? In a B2B context? Leading CRM solution provider Salesforce seems to think there is. In the past year, they rolled out AI-enabled enhancements to their cloud-based sales, marketing, and support solutions that are designed to deliver more predictive analysis, helping sales reps identify the most qualified leads and giving marketers the intel to know whom to target with what offer.

To determine whether there is hope for such solutions or just hype, we’ll take a quick look at the major features of Salesforce’s Einstein AI, review some early critiques by the experts, and ponder some real-world use cases that might yield breakthrough results. Salesforce has deployed Einstein across their entire suite of solutions, but for brevity’s sake, we’ll focus just on the Sales Cloud.

Feature overview of Einstein Sales

  • Einstein Lead Scoring: Einstein Lead Scoring models are built specifically for each customer and organization, which ensures that the models are tailored to the business. Einstein Lead Scoring analyzes all standard and custom fields attached to the Lead object, then tries different predictive models like Logistic Regression, Random Forests, and Naïve Bayes, automatically selecting the best one based on a sample dataset (a sketch of this kind of automatic model selection follows this list).
  • Einstein Opportunity & Account Insights: Sales Cloud Einstein analyzes all the standard fields attached to the Opportunity data in addition to email and calendar data, and then uses machine learning, natural language processing, and statistical analysis to provide sales reps and managers with "Predictions", "Key Moments", and "Smart Follow-Ups."
  • Einstein Activity Capture: This logs historical emails and calendar events from up to six months back for Gmail and up to two years back for Office 365.  It then works in the background to passively capture every email or calendar event sent or received. The captured emails and events are all displayed in an activity timeline, providing a history of the team’s relationship with a customer.
  • Einstein Follow-Ups: This provides proactive email notifications, letting reps know when a customer needs an immediate response, or lets them set a follow-up reminder.
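To illustrate the kind of automatic model selection the Lead Scoring bullet describes, here is a minimal sketch that cross-validates the three model families named above on a sample dataset and keeps the best performer. It shows the general technique, not Salesforce’s internal implementation.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Stand-in for an organization's lead fields and converted/not-converted labels
X, y = make_classification(n_samples=1000, n_features=15, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "naive_bayes": GaussianNB(),
}
# Score each candidate with 5-fold cross-validation, then keep the winner
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print("Selected model:", best)
```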

Early critiques

Having lived through many “hype cycles” over the years, technology buyers tend to react the same way to any breakthrough new technology: “So, what problem does it actually solve?” In a recent article on new AI solutions, The Next Web talked about how “AI-powered tools are now helping scale the efforts of sales teams by gleaning useful patterns from data, finding successful courses of action, and taking care of the bulk of the work in addressing customer needs and grievances.” TechCrunch takes a more pragmatic view of Salesforce’s AI: “certainly automatic model generation, if it works as described and truly delivers the best models in an automated fashion, is highly sophisticated technology, but in the end, users don’t care about any of that. They want tools that help them do their jobs better, and if AI contributes to that, all the better.” On how to think about AI in the technology solution stack, they noted, “the fact is AI is not a product in the true sense, so much as a set of technologies. We should keep that in mind as we judge these announcements, looking at how they improve the overall products and not at the shiny bells and whistles.”

Possible Use Cases

Complex B2B sales remain a mostly human activity, and any technology deployed to support the process should augment, not replace, human judgment. Applied correctly, AI could help spot consistent patterns that narrow down a list of highly qualified leads for reps to contact given certain triggers. This is no doubt useful and could drive efficiency, but if the objective is to close larger, more complicated enterprise sales, the most likely use case is AI that tells reps whom to talk to, but not what to say or do next. As we have discussed before, buyers and the buying process are not perfectly rational, and algorithms need good data.

CRM systems can be full of human-keyed data that is inconsistent, inaccurate, or lacking sufficient depth to be meaningful. Additionally, much of what’s entered can be subjective (close dates, probability of close, deal size) and is often overly optimistic. What ultimately matters are customer behaviors: what products did they buy, when did they buy, what did they pay? Using actual prior transaction data for the AI analysis would likely improve the relevance and accuracy of predictions, making marketing and sales more efficient and, more importantly, more productive.

Lessons from Google Data Centers: “Gaming” Their Way to Better Efficiency

Google data centers consume lots of power. By recent estimates, Google runs over 2.5 million servers, which consumed 4,402,836 MWh of electricity in 2014, equivalent to the average yearly consumption of about 366,903 U.S. family homes. Over the years they’ve had scores of PhDs focused on coming up with solutions to optimize data center efficiency. Then they unleashed machine learning on the machines.

Using the same AI technology that taught itself to play Atari games and beat the world champion at Go, Google’s DeepMind machine learning algorithms now control 120 different variables in their data centers, constantly learning what combination of adjustments maximizes efficiency. The result? DeepMind was able to achieve a 40% reduction in the energy used for cooling and a 15% improvement in overall power usage effectiveness, translating into hundreds of millions in cost savings.
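Published accounts describe the general recipe as learning a model that predicts efficiency from operating settings, then searching for the settings the model predicts will perform best. Here is a toy sketch of that idea; the data, variables, and model are invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_vars = 10  # stand-in for the ~120 real control variables
X = rng.uniform(0, 1, (5000, n_vars))  # historical control settings
# Synthetic efficiency outcome (lower PUE is better)
pue = (1.1 + 0.3 * (X[:, 0] - 0.4) ** 2 + 0.2 * X[:, 1] * X[:, 2]
       + rng.normal(0, 0.01, 5000))

# Learn to predict efficiency from settings
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000).fit(X, pue)

# Search candidate settings for the lowest predicted PUE (random search
# here; a real system would respect safety constraints and keep human
# operators in the loop).
candidates = rng.uniform(0, 1, (20000, n_vars))
best = candidates[np.argmin(model.predict(candidates))]
print("Predicted-best settings:", np.round(best, 2))
```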

Commenting on these results, author and MIT professor Erik Brynjolfsson addressed the broader implications: “You can imagine if you take that level of improvement and apply it to all of our systems — our factories, our warehouses, our transportation systems, we could get a lot of improvement in our living standards.”

Apparently, we’ve barely scratched the surface. According to McKinsey, “while 90 percent of all digital data has been created within the last two years, only one percent of it has been analyzed, across both public and private sectors.” And behemoths like GE are fully on board with advanced analytics, spending $1 billion this year alone to analyze data from sensors on gas turbines, jet engines, and oil pipelines. If they can achieve Google-like results, the implications could be staggering.

A Thought Experiment

Most organizations don’t have the resources of Google or GE, but they do experience similar problems that could be solved with a better understanding of all the variables that impact performance and a mindset of constant improvement. It’s important to keep in mind: Google already had some of the most efficient data centers in the industry before they unleashed DeepMind on the problem.

Obviously, you can’t snap your fingers and suddenly become Google. So perhaps a thought experiment is in order: one where you, for a moment, suspend disbelief, set aside current constraints, and think about what’s possible. With the Google example in mind, in what areas of your organization could you reap the greatest benefit with respect to, for example, production or servicing costs, or the close ratios and customer retention that drive revenue? What are the key variables that impact each of these areas, and if you had perfect information, what would it tell you? If you come up with, for instance, five variables that impact customer support costs, try to come up with 10 or even 20. Challenge your team to do the same. The point is not to engage in some pie-in-the-sky exercise, but to appreciate the level of complexity inherent in any activity within your business, and to start to look for correlations between events, activities, behaviors, and outcomes.
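As a starting point for that search, here is a minimal sketch of ranking brainstormed variables by their correlation with an outcome you care about; the column names and data are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Hypothetical operational variables alongside the outcome of interest
df = pd.DataFrame({
    "first_response_minutes": rng.uniform(1, 120, 500),
    "handoffs_per_ticket": rng.integers(0, 5, 500),
    "self_service_visits": rng.integers(0, 10, 500),
    "support_cost_per_ticket": rng.uniform(5, 80, 500),
})

# Rank candidate drivers by the strength of their correlation with cost
corr = df.corr()["support_cost_per_ticket"].drop("support_cost_per_ticket")
print(corr.sort_values(key=abs, ascending=False))
```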

Further, you need to challenge the conventional wisdom in your organization that reinforces the notion that finding the “single cause” of performance issues will produce optimal outcomes, when in fact understanding the broader collection of variables will likely produce better results. Google identified 120 variables just for data center energy consumption. How about you?

Digital Transformation: Where are You Now and Where Do You Need to Be?

We hear a lot about digital transformation and disruption, with boards pushing CEOs to “become digital” and completely rethink their business models. Geoffrey Moore provides an interesting framework for thinking about digital disruption as a continuum, or series of steps, with firms having different starting points based on where they are in their life cycle: (1) new entrants incubating and scaling a truly digital business model, or (2) established companies modernizing an already-scaled “industrial” model.

Regardless of your starting point, creating and building a strong analytics competency is essential to remain competitive.  His point is that digital is data.  And when we talk about disruption, it’s about how companies use data and analytics to create new business models or services.  

Moore sees competitive firms in the future as those able to read the “signals” from customer data:  

“In the digital economy, such signals live at the intersection of two types of datasets—systems of record, which capture transactional data, and systems of engagement, whose log files capture all the peripheral interactions that occur in and around a transaction.”

Getting to this point requires climbing a series of “stairs” to reach the point of digital disruption. But first you need to figure out where you are now. 

Climbing the Stairs to Digital Disruption

According to Moore’s model, there are five steps that firms must ascend, each corresponding to a level of digital IT maturity: 1. systems of record, 2. systems of engagement, 3. engagement analytics, 4. systems of intelligence, and 5. systems of disruption.

Here’s a quick synopsis of each phase:

  1. Systems of Record: ERP and CRM systems provide a single view of the customer and streamline the quote-to-cash process. Key challenge – systems are still organized around products, making it difficult to get a single view of the customer.
  2. Systems of Engagement: Mobile applications and omni-channel communications improve customer experience, reduce time to transact, and eliminate disintermediation. Key challenge – if your systems of record are behind in their “accommodation of customer-centricity,” according to Moore, “you now have a ‘two-stair’ challenge ahead of you.”
  3. Engagement Analytics: Dashboards and reports extract insights from systems of engagement about customer preferences, market trends, system inefficiencies, and user adoption. Key challenge – at this phase you still have “human-in-the-loop computing” that relies on people being able to “detect patterns and infer relationships.” Innovation still moves at a “human-centric pace.”
  4. Systems of Intelligence: Machine learning detects near-invisible correlations, infers causation, enables prediction, and proposes prescriptions, in order to optimize all types of interaction. Key challenge – you need the right talent to “secure the data science expertise to work the algorithms, and then you need to get access to enormous amounts of data to feed the beast.”
  5. Systems of Disruption: Systems of intelligence leverage proprietary insights to disrupt inefficient markets with novel digital services. Key challenge – getting through steps 1-4, which ultimately may require a new infrastructure model, a new operating model, and a new business model.

Moore posits that today most established companies operating in more traditional industries (i.e. not the digital natives) are somewhere between systems of record and systems of engagement, with a smaller number of innovators reaching Stage 3 - Engagement Analytics. He warns that established companies need to be firmly at stage 3 by the end of this decade or face a real existential crisis.

The "Two-Stair" Challenge

So, are you facing a two-stair challenge today? Based on Moore’s framework, the degree of “customer-centricity” in your current systems and processes is a good indicator. Firms that have attained just the systems-of-record level tend to be more inwardly focused on efficiency and less externally focused on the effectiveness of customer interactions. Readjusting your focus externally and understanding your customer, using historic transaction data and the “interactions that occur in and around a transaction,” is the key to accelerating your ascent to digital disruption and maintaining competitiveness in the new digital economy.

In our latest eBook: The New Customer Experience: Using Data and Analytics to Drive Digital Transformation, we discuss the key elements of the new B2B customer experience; the four common barriers to digital transformation; your essential analytics toolset; and how to get started down this path using feasibility studies to gauge where you are now and where to invest next in your digital journey. 

 

Photo via VisualHunt

Using a Journey Map to Improve Customer Experience

The old adage “you never get a second chance to make a first impression” still holds true today. However, the reality is that customers have multiple “first impressions” along their journey, from evaluation to purchase to post-sales support. And a bad experience at any point can wipe out any goodwill generated up to that point. Gartner calls each of these points a “moment of truth”: a critical decision customers make at various points along their journey that can make or break a relationship, driving the customer to abandon their purchase, or perhaps the relationship entirely.

Companies use a variety of customer surveys and tools to try to gauge customer satisfaction and determine problem areas. While an essential part of a company’s toolkit, surveys are just one input to a comprehensive customer journey map that shows where, when, and how the company dropped the ball. To determine how a journey map might work for you, you need to understand the core elements of a typical map, why they are important, and how you might use them to pinpoint problems and identify opportunities for improvement.

Primary Components of a Customer Journey Map

There is no single type of customer journey (that would be too easy); there can be many permutations based on what you provide (product or service) and the breadth of your focus (a single customer persona or a complete process). Regardless, there are some common core elements found in all good journey or experience maps.

The folks at Adaptive Path use “Experience Maps” to capture the complete customer experience and identify areas of customer pain and opportunities for improvement.  It starts with establishing guiding principles and includes the journey model, qualitative insights, quantitative metrics, and key takeaways. It’s an “artifact that serves to illuminate the complete experience a person may have with a product or service.”  

 

Guiding Principles – These principles define the context for the experience or journey map and the scope of the analysis, be it specific personas or value propositions. The objective is to gauge, at multiple points across the customer journey, how well the customer experience aligns with these guiding principles.

Journey Model – This is where you document the path the customer takes and the transitions they have to make between phases (sales, delivery) and channels (web to phone support). Here you want to capture not just the steps but illustrate something about the process: what is not working, the scope of the problem (how many customers are affected), the nature of the activity (linear or variable steps), and what systems and tools are involved.

Qualitative Insights – These insights include the “doing” (the journey) but also the thinking and feeling—the frame of mind of the customer at any given point in the journey. They may feel anxious, confused, angry, or disappointed. You also want to understand what they are thinking: “What is the easiest way to get from A to B?” “I want to get the best price, but I’m willing to pay more for convenience.” “The answer I’m looking for is not on the website. What now?”

Quantitative Info – Here is where you can use survey data, web traffic, or abandonment rates to understand the source and magnitude of the problem. By including clear metrics on the journey map (survey data, in Adaptive Path’s Rail Europe example), you can quickly pinpoint problem areas.
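As a simple illustration of attaching quantitative metrics to the map, here is a sketch that computes an abandonment rate per journey stage from event counts; the stage names and numbers are invented.

```python
# Each tuple: (stage, customers entering the stage, customers completing it)
stages = [
    ("discover", 10000, 7200),
    ("select",    7200, 6900),
    ("checkout",  6900, 5100),
    ("support",   1300,  900),
]
for stage, entered, completed in stages:
    abandon_rate = 100 * (entered - completed) / entered
    print(f"{stage:<10} abandonment: {abandon_rate:5.1f}%")
```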

[Image: Adaptive Path experience map, quantitative metrics]

 

Takeaways – The takeaways should guide decisions related to solving the problems identified in the journey mapping exercise: reducing pain points and taking advantage of opportunities to improve your customer experience. These bullet points provide a clear summary for your team of priorities going forward and areas for investment that will deliver measurable value.

Conclusion

A customer journey map can be a valuable tool to help you isolate customer experience challenges. It can also be an unwieldy, tangled mess if you don’t apply some basic structure to the upfront research, the construction of the map, and the evaluation of key takeaways. To stay grounded and focused, start with the guiding principles and use them as “guardrails” to keep you on track to better customer insights.

 

Photo via Visual hunt

 

Will Algorithms Replace Human Judgment in the B2B Sales Cycle?

With all the talk about advanced algorithms, artificial intelligence, and chatbots, one begins to wonder when virtually every B2C or B2B transaction will be automated. In this utopian (dystopian?) future, the machines will know exactly what you want, buy it for you, and deliver it to you the same day. But what role will humans play? Are we to be disintermediated by the machines? Replaced by algorithms? Future thinker and researcher Andrew McAfee makes the case that algorithms can and do outperform “experts” who rely on accumulated experience and good old human judgment—but only under certain conditions.

Understanding where you can use advanced algorithms will help you think through where to apply investments in analytics and what complementary skills you need on your marketing and sales teams.

Humans vs. Machines

According to McAfee, there is an abundance of evidence indicating that algorithms outperform human experts in their predictive prowess. One research study he cited, which involved a meta-analysis of 136 studies comparing the prediction accuracy of machine vs. man, showed that in only 8 of the 136 studies the “expert judgments were clearly better than their purely data-driven equivalents.” He further noted that “Most of these studies took place in messy, complex, real-world environments, not stripped-down laboratory settings.” So, why is this the case? In what situations or conditions do algorithms have the advantage? And what about human intuition? To answer this, he calls on a bit of theory regarding the ideal conditions for decisions based on judgment and intuition. The ideal conditions for human judgment include:

  • an environment that is sufficiently regular to be predictable
  • an opportunity to learn these regularities through prolonged practice

In the medical field, you can find examples of expert judgment that fit the above criteria. McAfee notes that since human biology changes very slowly, medicine meets the first criterion, but gyrating stock markets certainly don’t. The second condition benefits from fast and consistent feedback loops that promote learning that can be applied to future decisions. Anesthesiologists working with dozens of patients experience rapid feedback loops—seeing the effects of their actions—that help improve decisions. Where human intuition and judgment tend to break down is in “noisy,” highly variable environments with large data sets that aren’t easily interpreted.

Implications for B2B Marketing and Sales

For marketing and sales professionals, there are two aspects of the above analysis that impact the effectiveness of algorithms for lead qualification and selling, and perhaps tilt the scales toward human decisions: the availability of complete and accurate data to feed the algorithm, and the unknown preferences and biases of real buyers. The quality of an algorithm’s output is highly dependent on the veracity of its inputs. If there is missing or incomplete data, or a small sample size that skews results, the algorithm will suffer. Additionally, the complexity of the typical B2B sale makes automation with analytics trickier. In a B2B sale, there are often multiple decision-makers and influencers; they can be less than forthcoming in sharing their intentions, and they face reputational risk when making buying decisions. All of this requires more handholding, coaching, educating, understanding, and communicating—none of which is easily automated.

Conclusion

If we apply the ideal conditions for intuitive decision-making listed above to the B2B marketing and sales functions, we can see where the line could be drawn between automated algorithms and human decisions.  The first condition describes an “environment that is sufficiently regular to be predictable,” which would apply to a sales process whereby qualified prospects have a consistent set of pain points and requirements.  The second condition is met when the sales team “learns these regularities” and becomes more adept at educating, positioning, and pricing based on this understanding, resulting in faster sales cycles and higher close rates.

The antithesis of this is random leads with no consistency and high variability, which limits your sales team’s intuitive decision-making ability in determining what is “right” for the customer.  In this context, the job of analytics and algorithms is to eliminate the randomness by combing through data to find patterns that identify the consistent characteristics of qualified leads and deliver them to sales.  It’s not utopian to imagine machines and humans cooperating to make better decisions; you just need to have each focus on the task it is best equipped to handle.


Using Customer Lifetime Value to Create a Data-Driven Culture

Recent research shows that businesses have made some progress with their Big Data and analytics projects, but success is mostly limited to expense reduction initiatives.  Business transformation efforts and new revenue streams continue to lag.

Analytics Projects Still Expense-Driven

The results from a New Vantage survey of Fortune 1000 executives regarding their Big Data projects show that “decrease expenses” was the area with the highest response (49.2%) for “Started and seen value.”  “Add revenue” and “Transform the business for the future” received the highest responses for “Not started.”  Interestingly, “Establish a data-driven culture” received the highest response (41.5%) for “Started and not seen value.”

The report hints at the potential problem:

“In spite of the successes, executives still see lingering cultural impediments as a barrier to realizing the full value and full business adoption of Big Data in the corporate world.”

If one assumes that Big Data or advanced analytics is a major element of any business transformation that will create differentiation and competitive advantage, then removing the impediments to this transformation is paramount for executives. The key to creating a data-driven culture may lie not in focusing on data per se, but on customers, the value they create for your firm, and the value you deliver to them. Paradoxically, focusing externally on your customers may be the best way to drive internal cultural change.

Using CLV Metrics to Drive Change

MIT’s Michael Schrage talks about how companies can use customer lifetime value (CLV) to bring a more rigorous, data-driven approach to building long-term customer relationships. Talking about the value of CLV, he noted:

“By imposing economic discipline, ruthlessly prioritizing segmentation, retention, and monetization, the metric assures future customer profitability is top of mind.”

He also notes that CLV alone is not enough: “While delighting customers and meeting their needs remain important, they’re not enough for a lifetime.” He argues that CLV metrics should measure how effectively “innovation investment” increases customer health and wealth.  In his workshops, he found that clients talked about how customers become more valuable to a company when “they buy more stuff,” “they pay more,” or “they’re loyal to our brand.”  All of these are traditional CLV-type metrics.  He advocates going beyond them to incorporate more of an “investment ethos” that looks at the customer value created when customers:

  • Share good ideas
  • Evangelize for you on social media
  • Reduce your costs through self-service
  • Introduce you to new customers
  • Share data

By expanding the notion of what constitutes customer value, companies can start to rethink segmentation, pricing, and promotions. It might also educate and better align your employees, regardless of their job title, around a complete view of customer value and the importance of measuring and tracking it. This investment view of CLV helps sales understand how new customer introductions create new opportunities; marketing can appreciate how evangelizing on social media drives more leads; product development gets new ideas; and customer support becomes more efficient through greater customer self-service.  Once employees see the potential benefit to them, they just might be more motivated to seek out and use these metrics, thereby creating the data-driven behaviors and decision making that are key to transformation.
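
As a concrete illustration, here is a minimal sketch in Python of what an expanded CLV calculation might look like. The discounting formula is the standard one; the “investment ethos” components and all dollar figures are hypothetical placeholders, not Schrage’s actual methodology.

```python
# Minimal sketch of an expanded CLV calculation. Inputs and component
# values are invented for illustration.

def traditional_clv(annual_margin, retention_rate, discount_rate, years=5):
    """Classic CLV: discounted margin over the expected relationship."""
    return sum(
        annual_margin * (retention_rate ** t) / ((1 + discount_rate) ** t)
        for t in range(1, years + 1)
    )

def expanded_clv(annual_margin, retention_rate, discount_rate,
                 referral_value=0.0, idea_value=0.0,
                 self_service_savings=0.0, advocacy_value=0.0):
    """Adds 'investment ethos' components: referrals, shared ideas,
    support-cost savings from self-service, and social advocacy."""
    base = traditional_clv(annual_margin, retention_rate, discount_rate)
    return base + referral_value + idea_value + self_service_savings + advocacy_value

base = traditional_clv(annual_margin=10_000, retention_rate=0.85, discount_rate=0.10)
full = expanded_clv(annual_margin=10_000, retention_rate=0.85, discount_rate=0.10,
                    referral_value=4_000, self_service_savings=1_500)
print(f"traditional: ${base:,.0f}   expanded: ${full:,.0f}")
```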

Schrage observed in one of his workshops how participants kept interchanging references to the creation of lifetime value as when “we” do something or when “they” (customers) do something. He noted that there were much broader and deeper discussions around how to engage with and invest in their customers. And more comprehensive CLV metrics are the method for tracking how well the company is engaging and investing.

Conclusion

Cultural change is, and has always been, a difficult proposition for companies of any size.  Using a broader definition of CLV, and the metrics to track it, could help align multiple areas of your organization around customer value and jump-start the data-driven cultural change that drives transformation.  By clarifying what customer value means, how it is measured, and how each employee impacts those metrics, you have a chance of creating a broader sense of purpose, increasing customer value, around which your team can rally.

Why Your Customer Profiles Need Behavioral Data

Developing customer profiles and segmentation strategies is essential to delivering personalized and relevant products and services. But too often in the B2B space, segmentation stops at the demographic level (size, industry, geography) and doesn’t include buyers’ behaviors and actions. For companies facing stagnant sales growth, building deeper and broader customer profiles that include a behavioral component may reveal the keys to greater growth and profitability.  B2C companies have led the way on profiling, segmentation, and understanding buyer behaviors. It’s time for B2B firms to catch up.

Buyers as Rational Actors

In the consumer B2C space you will likely find irrational, impulsive purchase decisions driven by emotion. Conversely, business buying decisions in the B2B world are driven by careful, measured, cool-headed analysis, devoid of any emotion. Or so we think. When comparing the two segments, you find one unifying element: humans are involved. And in either the consumer or business context, buyers rarely act as rationally as one would assume. There is plenty of research to back up this notion of irrational human behavior when it comes to evaluating risk, loss, probabilities, and the other “cognitive biases” associated with decision making.

The most notable work in this field is the Nobel Prize-winning research of Kahneman and Tversky, who through decades of research showed that humans engage in two competing decision processes: one fast, intuitive, and emotional; the other slower, more deliberative, and more logical.  They discovered that when faced with decisions involving high degrees of uncertainty or potential risk, even smart and experienced people can fall prey to biases that lead to bad outcomes.

To develop more meaningful customer profiles, you need to understand how certain types of decision-making biases may be influencing your buyer’s behavior. You need to know how these customer decision biases might influence how and when you offer price discounts, rebates, service contracts, extended warranties and add-on services.

“Loss aversion” is one of the decision-making biases uncovered in their research, and understanding how it influences customer decisions may help you structure your offerings in ways that could improve conversions.  From Kahneman:

“For most people, the fear of losing $100 is more intense than the hope of gaining $150. We concluded from many such observations that losses loom larger than gains and that people are loss averse.” 

How does loss aversion play out in everyday life?  Homeowners are less likely to sell their home when prices are falling, and investors are less inclined to dump stocks when the market is dropping.  Either action would force them to recognize the loss, and thus mentally process it. You also find this phenomenon in sports: professional golfers make a higher percentage of par putts (which risk losing a stroke) than birdie putts of equal distance.
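
For the quantitatively inclined, the asymmetry Kahneman describes can be sketched with the prospect-theory value function. The parameters below (curvature of about 0.88 and a loss-aversion coefficient of about 2.25) are the commonly cited estimates from Tversky and Kahneman’s 1992 work; treat the sketch as illustrative, not as a pricing tool.

```python
# Minimal sketch of the prospect-theory value function, using the
# commonly cited 1992 parameter estimates (alpha ~= 0.88, lambda ~= 2.25).

def subjective_value(x, alpha=0.88, loss_aversion=2.25):
    """Losses loom larger than gains: losing $100 'feels' bigger
    than gaining $150 feels good."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * ((-x) ** alpha)

print(subjective_value(150))   # felt value of gaining $150 (~ +82)
print(subjective_value(-100))  # felt value of losing $100  (~ -130)
# Net felt value of a 50/50 bet to win $150 or lose $100 is negative,
# which is why most people decline it:
print(subjective_value(150) + subjective_value(-100))
```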

Recognizing that behavioral biases impact decisions, you need to ask yourself: Is the way you position and price your products and services mindful of these potential biases?  Are you presenting the “upside” of your offering when the customer is really concerned about downside risk or loss?  Do all your buyers think and behave the same way? Deeper customer profiles and segmentation that incorporate prior actions and behaviors can help you begin to answer these questions.  So what are the core elements of your new, comprehensive customer profile?

Elements of a Customer Profile

To develop a more comprehensive B2B customer profile, we can borrow somewhat from the B2C world.  In the consumer space, there are six key areas that matter to marketers:

  1. demographic (age, gender, income)
  2. geographic (where they live/roam)
  3. attention (what they concentrate on)
  4. consumption (what they buy)
  5. behavioral (what matters to them)
  6. intentional (what they’re about to do)

Taking these in order, you will likely have acceptable quality and depth of demographic and geographic data.  You may also have a basic idea as to “attention” if, for example, you are tracking any basic online activity.

Where things get interesting is when you overlay consumption and behavioral data based on prior purchase history. With this data in hand, you have a distinct pattern of purchasing behavior that can lead you to the ultimate end game: intentions. You want to know what buyers are likely to do next—what, when, and how much they will purchase—and be ready with the right combination of offerings when that moment arrives.
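
To make the six elements tangible, here is a minimal sketch in Python of what such a profile record might look like. The field names are invented for illustration; a real schema would map to your CRM and transaction systems.

```python
# A minimal, illustrative B2B customer profile covering the six areas
# above. Field names are invented, not a standard schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CustomerProfile:
    # demographic/geographic (firmographic in B2B)
    industry: str
    employee_count: int
    region: str
    # attention: what they concentrate on
    content_viewed: List[str] = field(default_factory=list)
    # consumption: what they buy (prior purchase history)
    purchases: List[dict] = field(default_factory=list)
    # behavioral: decision tendencies observed in past deals
    discount_sensitive: bool = False
    loss_averse: bool = False
    # intentional: what they're about to do (inferred, not observed)
    predicted_next_purchase: Optional[str] = None

profile = CustomerProfile(industry="manufacturing", employee_count=450,
                          region="Midwest", loss_averse=True)
print(profile)
```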

Conclusion

The process of developing and using customer profiles is, at its core, a process of testing assumptions: you structure pricing and offerings targeting a specific segment and expect certain outcomes. The process spans understanding the quirks of the human mind (buyer decision making), the depth and breadth of the data about the customer (the profile), the combinations of products, prices, and promotions you test, the results you see, and the adjustments you make based on those results. All of this requires digging a bit deeper on your profiles, adding consumption and behavioral data to help you find new opportunities for growth.

What’s Holding Back Your Digital Transformation

Everyone seems to be on the digital transformation bandwagon.  Recent research by Mulesoft shows that 88% of IT decision makers either have an initiative underway or will within three years.  So, what do they hope to accomplish with these transformative efforts?  Over 60% said they want to “create great customer experiences,” and 77% are looking to improve existing business processes.  But recent experience sheds light on the execution challenges they face: respondents indicated that only half of the digital transformation projects undertaken in the past year were completed.  The reasons cited: time constraints and misaligned priorities.

Priorities and Alignment

The top priorities for IT decision makers in the study included the usual suspects: security, cloud, application integration, and BI/Analytics.  Although these may be the must-haves for IT, business leaders may have other priorities, and the mismatch can result in unfunded or stalled projects. Poor organizational alignment shows up as a lack of shared objectives across teams, functions, or business units. You see it play out all the time: CIOs want to focus on securing infrastructure and data, while the business just wants to keep innovating, creating conflicting priorities. Both parties need to agree that security is a priority, otherwise you will have constant struggles over resources and budget dollars, and a steady loss of momentum on projects.

To get better alignment, you need to start with a shared sense of purpose, and this usually begins at the organizational level.  Understanding your purpose is the foundation for your strategy, business model, operational model, resources, and systems.  You will invest in and pay attention to what is most important: your reason for being.  But before you schedule the company offsite to ponder the question of why you exist, there may be a ready answer.  In his recent letter to shareholders, Jeff Bezos of Amazon described why Amazon exists and hence where it focuses: "You can be competitor focused, you can be product focused, you can be technology focused, you can be business model focused, and there are more. But in my view, obsessive customer focus is by far the most protective of Day 1 vitality."  In Bezos-speak, “Day 1” refers to the life stage of a business that is growing and innovating.  Day 2 is the stage when maturity is reached, followed by stagnation and ultimately the demise of the business. Amazon’s reason for being is the customer.  In the letter, he expands on the point: "Staying in Day 1 requires you to experiment patiently, accept failures, plant seeds, protect saplings, and double down when you see customer delight. A customer-obsessed culture best creates the conditions where all of that can happen."

So, aligning every aspect of your organization around the customer could be the rallying point that develops a shared sense of purpose and solves the misalignment issues that can bog down initiatives, or “experiments,” to use Bezos’ terminology.  As you scan the list of technology priorities from the study, one in particular stands out as having the potential to encourage the alignment necessary for digital transformation: BI/Analytics could be a great way to align around the customer and prioritize the projects that will deliver the greatest impact on customer experience.  Let’s look at a few reasons why.

The Case for BI/Analytics and Alignment

There are three reasons better alignment can be achieved with BI/Analytics:

  1. Creates an external focus on the customer – Rather than internally focused process improvement, organizations can refocus externally on the data-driven customer journey, moving from efficiency to innovation by using data and analytics in new ways that drive value.
  2. Aligns with growth and innovation – By using data and analytics, internal IT becomes an engine of innovation rather than just a support function.  This puts the IT team on the same side of the table as their business counterparts, developing and executing on new ideas.
  3. Exposes untapped potential – Existing customer data is one of the last untapped resources within organizations. By combing through previous transaction and support data, you can better understand the behaviors and actions that can inform decisions regarding new offerings and opportunities to delight your customers.

Conclusion

Proper alignment between IT and the business can start by putting the customer first and understanding their needs, wants, and desires through the data they leave behind. In that data you will likely find ways to grow your business, with IT as the planter and cultivator of the seeds of growth.

The New B2B Customer Experience: Table Stakes for the Enterprise

Buried in your mountain of customer transaction data may lie the key to delivering a vastly better customer experience.  In our latest eBook -- The New Customer Experience: Using Data and Analytics to Drive Digital Transformation -- we talk about how B2B firms that are looking to stay competitive need to make the transition from a “transaction-based” mindset to one based on deeper customer knowledge driven by data. Data that you already have!  This “data-led digital transformation” is essential for creating better customer experiences that lead to greater customer trust, loyalty, and ultimately profitability.

Transaction-based Mindset

Organizations that operate with a transaction-based analytics mindset have two primary characteristics: they tend to focus on “what happened,” reporting on historical data, and they are company- rather than customer-oriented.  What’s lacking from the traditional dashboards and reports consumed at the management level is an understanding of why things happened. Sales have fallen for the last three quarters and customer attrition is growing, but management often sees these events only in the aggregate, affecting a monolithic group called “customers,” without a deeper understanding of what is happening in the different customer segments that may disproportionately impact results. The other challenge with a focus on past results is that you have no ability to alter the outcomes: that drop in sales already happened, and you can’t go back in time. The easy way to determine whether you are operating in transaction mode is the degree to which you understand and can differentiate your customers. Customer segmentation is where every data-led digital transformation begins.

A transaction mindset is about managers using data to make decisions after the fact, while digital transformation is about sales reps and customers using analytics to make decisions in real time. This transformation is essential to delivering the new B2B customer experience, and it has now become competitive table stakes for enterprises across industries.
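
One hypothetical starting point for that segmentation is a simple RFM (recency, frequency, monetary) analysis of the transaction data you already have. The sketch below, with invented column names and data, ranks customers on each dimension; a real implementation would bin into quantiles and act on the resulting segments.

```python
# Minimal sketch: RFM segmentation from raw transactions. Data and
# column names are invented for illustration.
import pandas as pd

tx = pd.DataFrame({
    "customer": ["A", "A", "B", "C", "C", "C"],
    "date": pd.to_datetime(["2017-01-05", "2017-03-01", "2016-06-10",
                            "2017-02-20", "2017-03-10", "2017-03-28"]),
    "amount": [1200, 800, 300, 5000, 4200, 4800],
})

today = tx["date"].max()
rfm = tx.groupby("customer").agg(
    recency=("date", lambda d: (today - d.max()).days),
    frequency=("date", "count"),
    monetary=("amount", "sum"),
)

# Rank customers on each dimension; a higher score is better.
for col, ascending in [("recency", False), ("frequency", True), ("monetary", True)]:
    rfm[col + "_score"] = rfm[col].rank(ascending=ascending).astype(int)

print(rfm.sort_values("monetary", ascending=False))
```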

Data-led Digital Transformation

The new B2B customer experience is about helping buyers make smarter, faster decisions by being “context-aware” during every phase of the customer lifecycle. Context is most easily understood as the buyer’s current situation or circumstances. Are they a new or repeat customer? What type of customer (size, location, industry sub-segment, etc.)? What are their specific needs and requirements?  Which products are the best fit? When do they need them? What is their price point? What are the comparative offerings?  What is their level of knowledge and experience with the product? What other value-added offerings are appropriate?

B2B companies have thrived in the past by having a deep understanding of their customers’ needs and providing excellent personal service. They delivered this high-touch service through direct customer contact with field and service reps. The new B2B customer experience builds on this notion of personal service but augments the personal touch with a digital enabler: data and advanced analytics that deliver deeper knowledge of customer needs and buying behaviors.  The new B2B customer experience means anticipating the information buyers will need to make informed buying decisions, and predicting when, how, and what type of support they will need in the future.

Getting Started

To help you jumpstart your efforts, the eBook drills down on the key enablers of your digital transformation journey:

  • Key elements of the new B2B customer experience
  • Four common barriers to digital transformation using data
  • Your essential B2B analytics toolset

In the final chapter, we spell out the best practices for accelerating your analytics initiatives without taking on too much cost or risk. We hope that you find this insightful and informative.
 

Gartner Conference Recap: Using Data to Map Your Customer Journey

At the recent Gartner Data & Analytics Summit, AI and machine learning got a lot of attention. While those topics are important for mapping your digital transformation long term, customer analytics is still the most relevant today.  According to Gartner, customer analytics continues to rank highest in terms of technology investment for customer experience (CX) projects.  Further, they anticipate that by 2020, more than 40% of all data analytics projects will relate to some aspect of customer experience.

So what can you do now to start to understand the different elements of your customer experience?  Start to think about your customer journey, from initial inquiry to customer support. What makes your customers happy, loyal, repeat purchasers?  You need data to help answer these questions.

Moments of Truth

It’s important to understand your customer as much as or more than your technology. You need to understand your typical buying cycle and customer buying behavior, and a customer journey map can help.  Per Gartner, there are "Moments of Truth" within activity streams that fluctuate as customers move from buying cycle to owning cycle, and back to buying.  These moments are key opportunities to make a strong, lasting impression.

Gartner describes these moments in the context of three major customer activities: Explore, Evaluate, and Engage.  The graphic below shows how the cycle works, with moments of truth preceding the “off ramps” where customers could abandon your product or service after a bad experience.

Source: Gartner


Explore and Evaluate

Customers want transparency and choice. They want to research products and talk to experts to help them make informed decisions.  This is where trust is established, with the customer determining whether they are being manipulated or empowered, or inundated and confused.  Building segmentation data from records of past purchases across your customer base can help you match products and prices appropriate for each customer. It’s not about manipulation; it’s about helping them quickly and easily find the right product or service at a price point that meets their expectations.  The more you know about them, the better you can help them make an informed choice. Using segmentation and customer archetypes jump-starts the process by presenting them with the appropriate products, at a price they are willing to pay, that returns acceptable margins to the business.
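
As a simple illustration of the idea, here is a hypothetical sketch in Python that assigns a prospect to the nearest customer archetype and surfaces that segment’s typical offer. The archetypes, attributes, and crude distance weighting are all invented; a real system would derive its segments from your own purchase records.

```python
# Minimal sketch: match a prospect to the nearest customer archetype
# and surface that segment's typical offer. Everything here is invented.
archetypes = {
    "small_value_buyer":  {"employees": 20,   "avg_order": 500,    "offer": "starter bundle"},
    "mid_market_regular": {"employees": 300,  "avg_order": 5_000,  "offer": "volume pricing"},
    "enterprise_partner": {"employees": 5000, "avg_order": 50_000, "offer": "annual contract"},
}

def nearest_archetype(employees: int, avg_order: float) -> str:
    # Normalize crudely so both attributes contribute to the distance.
    def dist(a):
        return (abs(a["employees"] - employees) / 5000
                + abs(a["avg_order"] - avg_order) / 50_000)
    return min(archetypes, key=lambda k: dist(archetypes[k]))

seg = nearest_archetype(employees=250, avg_order=4_000)
print(seg, "->", archetypes[seg]["offer"])  # mid_market_regular -> volume pricing
```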

Engagement

In the Gartner model, engagement is most prominent in the “Owning” cycle, where customers consume products and services and develop impressions of the overall experience. As the graphic above shows, abandonment can result from a poor experience at a moment of truth. Again, customer analytics can help you understand customer needs and behaviors post-sale, as customers consume or use your product or service.  Using past customer data, you can look for patterns of use and behaviors that may lead to abandonment.  For example, there may be a spike in support calls from a customer followed by a reduction in purchases. Or perhaps there are seasonal considerations for a certain customer segment that create inventory and staffing challenges that need to be anticipated.
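
That support-spike-then-purchase-drop pattern can be flagged with straightforward rules before reaching for anything fancier. Here is a minimal sketch with invented monthly data and arbitrary thresholds:

```python
# Minimal sketch: flag customers whose purchases drop shortly after a
# spike in support calls. Data and thresholds are invented.
import pandas as pd

monthly = pd.DataFrame({
    "month": pd.period_range("2017-01", periods=6, freq="M"),
    "support_calls": [2, 1, 2, 9, 8, 7],
    "purchases":     [10, 11, 10, 9, 4, 3],
}).set_index("month")

# A "spike" is double the trailing 3-month average; a "drop" is below
# 60% of the trailing 3-month average.
calls_spike = monthly["support_calls"] > 2 * monthly["support_calls"].rolling(3).mean().shift(1)
purchase_drop = monthly["purchases"] < 0.6 * monthly["purchases"].rolling(3).mean().shift(1)

# At risk: a purchase drop now, with a call spike in the prior two months.
spike_prior = (calls_spike.shift(1, fill_value=False)
               | calls_spike.shift(2, fill_value=False))
at_risk = purchase_drop & spike_prior
print(monthly[at_risk])
```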

Sources of Customer Data

So where might you find the data to better understand your customers at each leg of the journey?  Gartner suggests the following:

  • Direct feedback from surveys, such as relationship, transactional, or special-purpose surveys.
  • Indirect feedback, such as text, speech, and interaction analytics from customer care.
  • Operational data from CRM systems, call center software, and marketing analytics to infer customer perceptions.
  • Market research, such as marketing department studies gathered to define and understand the target audience.
  • Qualitative research, including focus groups, online research communities, and ethnographic research.

For most B2B firms, the operational data from CRM and transaction systems will likely be the best place to start, followed by direct and indirect feedback from surveys and call centers. These data already exist or can be easily obtained and could help accelerate your journey mapping exercise to identify areas of dissatisfaction or missed opportunities.  

Recommendations

In her session, Maximizing Value along the Customer Journey, analyst Melissa Davis offered up the following recommendations:

  1. Identify high-priority customers – Identify the highest-impact customer segments (likely the top 20% of customers that deliver 80% of your revenue and profitability; a minimal sketch of this 80/20 cut follows the list).  This is your starting point for customer analytics.
  2. Identify high-priority moments – Identify places on the buying journey that disproportionately create or destroy customer loyalty and advocacy.  Look at conversion, abandonment, and churn around these moments.
  3. Identify high-priority investments in customer analytics – Work with LOB leaders and IT to create an inventory of data analytics competencies and develop a road map of key data analytics projects.
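
To ground the first recommendation, here is a minimal sketch in Python of that 80/20 cut, using invented revenue figures: sort customers by revenue and keep those whose cumulative share stays within roughly 80%.

```python
# Minimal sketch: find the smallest group of customers driving ~80% of
# revenue. Customer names and figures are invented.
import pandas as pd

revenue = pd.Series(
    {"Acme": 950_000, "Birch": 420_000, "Cole": 180_000, "Dray": 90_000,
     "Eads": 55_000, "Fenn": 30_000, "Gow": 15_000, "Hale": 10_000},
    name="revenue",
).sort_values(ascending=False)

cum_share = revenue.cumsum() / revenue.sum()
high_priority = revenue[cum_share <= 0.80]
print(high_priority.index.tolist())  # start your customer analytics here
```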

While advanced analytics (AI, machine learning) may be on your road map for the future, make sure you are focused today on customer analytics that will create demonstrable value along the customer journey.