Why Healthcare Providers Need Performance Benchmarking

A benchmark is a measurement that sets the standard for judging performance. In health care, a benchmark helps determine whether a given level of quality meets, exceeds, or falls short of that standard.

Benchmarks are increasingly vital for healthcare providers across the spectrum of care as payers continue to emphasize value-based purchasing and quality standards in their pay-for-performance programs.


The biggest single payer, Medicare, runs its Hospital Value-Based Purchasing program, with dollar incentives for performance above the benchmarks and dollar disincentives for performance below them. On top of this program sits the Hospital Readmissions Reduction Program, which deducts up to 3 percent from a hospital's total Medicare payments when it misses the readmission benchmarks. These penalties can easily add up to millions of lost dollars. Anthem, United, Aetna and Cigna all have similar quality measurement programs with money tied to them.


How can providers stay on top of these developments? The Medicare and commercial payer quality programs publish providers' actual results alongside the benchmarks. Medicare even has a website, www.medicare.gov/hospitalcompare/search.html, that lets you look up actual results and benchmarks for every hospital in the country. But these figures are published years after the care is provided, and even commercial payers lag in reporting actual results. As a provider who must meet these benchmarks to get paid, you need current data on how you are doing so you can act on it and improve toward payer expectations before they report your actual results. Otherwise, it is like practicing medicine while looking in a rearview mirror.


Moreover, managed care companies want to know whether you care about your outcomes. How many providers have lost a contract because they could not report even basic quality results comparing themselves to benchmarks?


The best measures are service-line- or procedure-specific, and the benchmarking should be apples to apples. Measurements should be standardized and comparable to give confidence in the data.




Here are eight classes of benchmark indicators providers should monitor to benefit their practices.


1. Outcomes benchmarks. In surgery and pain management, outcomes are benchmarked based on the patient-reported pain levels before and after an intervention. Hospitals and procedure centers can benchmark:


  • Perioperative pain: percent of patients reporting worse pain immediately after the procedure
  • Post-discharge pain: percent of patients reporting worse pain after returning home
  • Improved pain scores
  • Increased functionality


For some patients, the treatment provides short-term relief but not long-term relief; other patients experience long-term relief but are still in pain immediately after the treatment. There are numerous pain measurement scales, ranging from simple to complex, but the important thing is to use a patient-centered measurement scale so you can compare your results to others'.


New Health Analytics maintains a large database of post-surgical outcomes. Analysis of the database shows that 12 percent of patients report worse pain immediately after a cervical procedure, while around 8 percent of patients report worse pain immediately after lumbosacral procedures. The incidence of pain after returning home is similar between the two.


Providers need benchmarks like these to show managed care companies that they get favorable results for their patients. If you are better than the benchmark, you can take a tougher negotiating stance with the managed care company. Too often, providers merely assert that they are excellent; what really stops a managed care negotiator in their tracks is a provider who can measure and demonstrate that excellence.


2. Procedure benchmarks. Keeping track of procedure benchmarks is useful to understand the efficiency of your physicians and whether you are receiving the appropriate reimbursement for your work.  Hospital outpatient facilities and ambulatory care centers can benchmark:


  • Procedure time
  • Recovery time
  • Length of stay
  • Medicare/private payer reimbursement
  • Facility fee
  • Non-facility fee


It is an old saying, but it is true: if you do not measure it, you cannot manage it. These sorts of indicators can help lower costs and monitor reimbursements. Found money.


3. Patient satisfaction benchmarks. Tracking patient satisfaction is essential for every practice, especially because the ultimate goal is not to conduct a patient visit but to improve a patient's function, mobility and, ultimately, quality of life. To many, patient feedback is the most important outcome measure you can use in clinical practice. A happy patient is one who is likely to return.


After a hospital discharge or ambulatory procedure, you can monitor symptoms such as nausea, vomiting, fever, difficulty urinating, bleeding and signs of infection. Hospitals are increasingly contacting patients with a follow-up phone call at home. It is easy to turn that call into a quick survey to collect complication data and patient satisfaction feedback.


When asking about patient satisfaction, it is best to keep the questions short and focused.  Great questions to ask about patient satisfaction include:


  • Overall quality
  • Registration and admission
  • Pre-op care
  • Preadmission testing
  • Recovery


It is useful to break results down by hospital service line, DRG, or procedure in the ambulatory setting so you can see which processes are the weakest and which are relatively the strongest. This sort of breakdown is not something CMS does with its patient satisfaction surveys.
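Even without a BI tool, this breakdown is easy to compute from raw survey exports. Here is a minimal sketch in Python; the service lines and 1-to-5 "overall quality" scores are hypothetical, purely for illustration:

```python
from collections import defaultdict
from statistics import mean

def satisfaction_by_service_line(responses):
    """Group survey scores (1-5 scale) by service line and average them,
    so the weakest and strongest processes stand out."""
    buckets = defaultdict(list)
    for service_line, score in responses:
        buckets[service_line].append(score)
    return {line: round(mean(scores), 2) for line, scores in buckets.items()}

# Hypothetical survey responses: (service line, overall-quality score)
responses = [
    ("Orthopedics", 5), ("Orthopedics", 4),
    ("Cardiology", 3), ("Cardiology", 2), ("Cardiology", 4),
]
print(satisfaction_by_service_line(responses))
```

The same grouping works for DRGs or procedure codes; only the key changes.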


4. Physician satisfaction. It is also important to measure physician satisfaction, so you can see where your practice is excelling and where you may need improvement. Questions can cover:


  • Scheduling
  • Efficiency of the practice
  • Patient satisfaction from the physician's point of view
  • Anesthesia services
  • Skills of the personnel
  • General appearance of the facility
  • Overall rating


Physician satisfaction surveys can be conducted online or on paper, asking a series of questions about physicians' experience at the hospital or procedure center.


5. Staff satisfaction. You will not have happy patients unless you have happy staff. Retaining employees for as long as possible matters because hiring a new staff member is a drain on practice resources, and keeping turnover low depends on keeping employee satisfaction high. Employee satisfaction can be tracked in the following areas:


  • Work and home life
  • Management and communications
  • Work and learning environment
  • Fairness and effectiveness
  • Quality of job design
  • Work pressure
  • Overall job satisfaction
  • Staff exposure to patient safety issues


The benchmarking data uncovers opportunities for improvement, which in turn leads to action plans that address those deficiencies.


6. Practice management benchmarks. Patient outcomes are the number one priority for many pain physicians, but maintaining practice management norms is a close second. It's important to understand how your practice measures against other practices nationwide to determine whether potential changes to office management are necessary. Basic benchmarking indicators include:


  • Patient volume—number of patients seen per month
  • Total number of patients seen annually
  • Number of days spent doing procedures
  • Number of days spent seeing patients in the clinical office
  • Number of procedures referred to the facility
  • Volume of procedures performed annually


These management benchmarks help do several things. First, they help project financial performance over the next one to two months. Second, they can be a leading indicator of any problems headed your way. If you see the number of new patients drop off, you could be struggling with cash flow a month or two down the road, for example.


7. Financial benchmarks. Comparing your practice to others in your area using financial benchmarks lets you make sure you are not falling outside industry standards financially. Straightforward financial benchmarks include:


  • Cost per case
  • Collection rates
  • Personnel expenses
  • Profitability per case


The profitability of each case is determined by dividing the reimbursement per treatment by the amount of time the case takes to perform, giving a per-hour figure. Do that, and you will see that cervical cases actually generate more payment per hour than lumbar cases, for example.
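As a quick sketch of that calculation (the dollar amounts and case times below are hypothetical, purely for illustration; substitute your own payer data):

```python
def profitability_per_hour(reimbursement, case_minutes):
    """Reimbursement per treatment divided by the time the case takes,
    expressed per hour of room time."""
    return reimbursement / (case_minutes / 60)

# Hypothetical figures for illustration only.
cervical = profitability_per_hour(2400, 45)  # $2,400 case, 45 minutes
lumbar = profitability_per_hour(2000, 60)    # $2,000 case, 60 minutes
print(f"Cervical: ${cervical:,.0f}/hr, Lumbar: ${lumbar:,.0f}/hr")
```

A shorter case at a similar reimbursement wins on a per-hour basis, which is exactly what this benchmark is designed to surface.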


8. Patient drug test benchmarks. One of the biggest issues in medicine today is controlling narcotic use and curbing diversion. Physicians can use benchmarks in this area to understand how their patients and practice fit into the local and national spectrum of drug test results. Tapping into the electronic medical record or the hospital laboratory information system can yield some important measurements:


  • Averages for results such as no abnormality detected
  • Illicit substances found
  • Unprescribed medication found
  • Prescribed drug not found
  • Drug above expected range
  • Drug under expected range


A report from Quest Diagnostics finds that 55-63 percent of patient test results were inconsistent, meaning many patients are misusing prescription drugs and putting their health at risk. Isn't this something any practitioner should know, both for their patients' care and to protect themselves from overprescribing?

Automating your Business with Python

Part 1 of 2: Why Python?

Demanding clients, tightening budgets, looming deadlines, endless data entry, fierce competition, and a desire for innovation. Sound familiar? We are all expected to do things better and faster with smaller budgets and fewer people. Python is a tool that can automate the tedious, repetitive, manual aspects of your business, allowing you to deliver faster and with greater accuracy, freeing your people up to do work that truly adds value, and helping you gain a competitive advantage.

What is Python, and why should you take notice?

According to the official Python site, Python is "an interpreted, object-oriented, high-level programming language with dynamic semantics"[1]. Bored yet? Basically, Python is a programming language that can be used to do just about anything. But the same could be said about dozens of other programming languages: Java, JavaScript, C++, C#, Perl, PHP, Ruby…and the list goes on and on.

So, what makes Python different? Before we go there, let’s talk about how Python is the same as some of the other top programming languages.

  • Python is open-source. No licenses to purchase. Free to use, forever.

  • Python is powerful. It has thousands of standard and third-party modules and libraries that enable you to develop just about any solution you can think of: data analytics, document parsing, browser automation, and everything in between.

  • Python is flexible. It runs on every major operating system: Windows, Linux, and Mac.

  • Python is supported by a massive community. Python is popular and growing fast (more on that later), which means there is a ton of community support and documentation available.

  • Python is scalable and enterprise-capable. This one is important. One misconception is that Python is only good for smaller projects, and that couldn't be more wrong. It's true that Python is great for small projects, but it's also being used all over the world in major companies for large-scale, mission-critical projects. Google, Facebook, Capital One, NASA, Netflix, and Dropbox all use Python. Instagram, one of the most popular mobile apps in the world, with over 400 million daily users, uses Python as one of its core technologies[2].

Now let’s go back to my original question: What makes Python different? Three things stand out to me: speed to solution, efficiency, and popularity. Python is unique in that it has the good qualities of other major programming languages, while also allowing you to deliver solutions faster and more efficiently. This unique combination has led to Python becoming one of the most popular & fastest growing programming languages in the world.

Python is powerful, yet very easy to read and write, which leads to solutions being delivered in less time with fewer lines of code. The official Python website claims that "Python programs are typically 3-5 times shorter than equivalent Java programs." This was also shown to be true in a study[3] by Connelly Barnes, a Senior Research Scientist at Adobe. Comparing eight major programming languages, he found that Python developers solved the problem in the fastest time with the second-fewest lines of code. To illustrate, here is a quick code comparison of the classic "Hello World" example.

[Image: "Hello World" code comparison]
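The contrast is stark even in this trivial case: Python needs only a print statement, while the canonical Java version needs a class declaration and a main method (shown in the comment for comparison):

```python
# Python: the whole program is a print statement.
greeting = "Hello World"
print(greeting)

# The equivalent Java program, for comparison:
#   public class HelloWorld {
#       public static void main(String[] args) {
#           System.out.println("Hello World");
#       }
#   }
```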

To be completely fair, there are times when other languages are better suited, as is explained here -- https://www.python.org/doc/essays/comparisons/ -- but in many cases it’s hard to beat the combination of power, developer efficiency, simplicity, and support that comes with Python. It should also be noted that the inclusion of Python in your stack does not mean that you need to exclude Java, C++, or other languages. Alex Martelli, a well-known Python developer & Google engineer, said that “the very earliest Googlers (Sergey, Larry, Craig, ...) made a good engineering decision: ‘Python where we can, C++ where we must’”[4].

Python is popular and growing FAST. Stack Overflow, the “largest, most trusted online community for developers to learn, share their knowledge, and build their careers”, recently did a study[5] on the growth and popularity of major programming languages. While most languages remained relatively flat in growth, Python has experienced major growth over the past 6+ years.

[Chart: Stack Overflow traffic growth of major programming languages]



Just Google "most popular programming languages" or "fastest growing programming languages", pull up any recent study or article, and almost every single one will list Python near the top. Everyone from hardcore developers to hobby developers to data analysts loves Python, for all of the reasons previously mentioned. This growth, popularity, and excitement around Python means there is a massive community to lean on for support and documentation, and that same community is constantly extending the language, adding new features and libraries all the time.


Why Python?

We’re all looking for the fastest, most efficient, and most effective route to satisfy the needs and solve the problems of our organizations. Python, with its combination of power, efficiency, simplicity, and popularity, can help you accelerate innovation for your own business and for your clients.


In Part 2 of “Automating Your Business with Python”, we’ll provide some specific examples of how we’ve used Python to automate things internally here at EnterBridge, as well as for many of our clients. Stay tuned!


[1] https://www.python.org/doc/essays/blurb/

[2] https://instagram-engineering.com/what-powers-instagram-hundreds-of-instances-dozens-of-technologies-adf2e22da2ad

[3] https://www.connellybarnes.com/documents/language_productivity.pdf

[4] https://stackoverflow.com/questions/2560310/heavy-usage-of-python-at-google

[5] https://stackoverflow.blog/2017/09/06/incredible-growth-python/

Workflow Automation and the Software Technology that Will Automate Business

Businesses today are ripe for automation. Many of the world's major businesses have operated for decades and still rely on manual tasks completed in legacy systems. Workflow automation is often defined as automating manual business tasks, but it does much, much more: it transforms businesses through the implementation of the latest technology and software. The real business benefit comes from the synergy of pairing that technology with the automation of manual tasks.


Historically, workflow automation focused primarily on organizing business tasks to facilitate smooth process completion and task transitions. Recent technology advancements, including artificial intelligence, now allow workflow automation to take on tasks that historically required a human. Workflow automation in business today relies on the newest technology available. Python is one of the fastest-growing programming languages in the world and has become extremely popular for automating workflow processes, due in large part to its ease of use. Python works well for automating web-based processes and can be up and running in minimal time compared to other historically popular programming languages.
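As a small, hypothetical illustration of the kind of manual task Python can absorb: the sketch below reconciles a legacy system's CSV export and flags the invoices a clerk would otherwise check by hand. The file layout and field names are invented for this example.

```python
import csv
import io

# Hypothetical export from a legacy system; in practice this would be a file.
LEGACY_EXPORT = """invoice_id,amount_billed,amount_paid
1001,250.00,250.00
1002,400.00,150.00
1003,90.00,0.00
"""

def unpaid_invoices(export_text):
    """Return the invoice IDs a clerk would otherwise find by eye:
    any row where the amount paid falls short of the amount billed."""
    reader = csv.DictReader(io.StringIO(export_text))
    return [row["invoice_id"] for row in reader
            if float(row["amount_paid"]) < float(row["amount_billed"])]

print(unpaid_invoices(LEGACY_EXPORT))  # follow-up list for the billing team
```

A few lines of standard-library code replace a recurring manual review, and the same pattern scales to thousands of rows.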


The cloud is becoming a vital tool for businesses to reduce infrastructure costs and securely digitize workflow processes. It provides a centralized repository where all processes and procedures can be stored and worked through. Automating workflow in a centralized cloud environment is more cost-effective than trying to automate processes spread across multiple office locations and document types. Technology continues to improve to keep workflow automation cost-effective for businesses.


Workflow automation has countless benefits to an organization, but some of the largest include the following:

  • Error Reduction – Automation reduces the errors caused by human and manual tasks.
  • Cost Reduction – Automation reduces the overall costs to complete a transaction.
  • Visibility – Automation improves transparency over processes and procedures.


Software technology is vital for businesses to obtain the maximum benefit from workflow automation solutions. It can be thought of as a bridge between manual and automated tasks; without that bridge, manual tasks can never be automated. Software technology allows businesses to realize all of the benefits of workflow automation.


Automating the entire process is vital to obtaining the maximum value of workflow automation. This may sound obvious, but businesses often automate only a small part of the overall process under review. The cost-benefit analysis of a workflow automation solution improves as more processes and procedures are automated, and businesses that perform the analysis and due diligence needed to approve and implement a solution see the largest benefit. Businesses today have a wide selection of technology partners that can provide workflow automation. Harness its power in your business today by utilizing the latest software technology available.

Are You a Citizen Robot Builder?

Since the dawn of the computer age (and before!), technology has been used to automate business workflows, be it order entry, inventory management, credit application processing, etc. At first glance, RPA, or Robotic Process Automation, seems like another term for Business Process Automation (BPA). What distinguishes RPA from BPA is that a programmatic robot, or bot, is replacing human interaction with a user interface, such as a form in a browser.

Let’s face it, even in the digital age, there are still times when the programmatic interchange of data between systems is not practical and/or possible.  Traditionally, we would have hired clerks to perform data entry tasks, or asked analysts or other team members to perform these duties.  IT might get involved and spend thousands of dollars creating APIs to allow access to the systems.  At its most basic level, a bot can be created and deployed to perform this action.

And in certain cases when IT cannot or will not create the bot, business team members have taken it upon themselves to create and deploy the bot themselves.  Like the wizardry that used to be performed in Microsoft Access or by using Excel Macros, these bot creators are now automating data entry tasks and removing the human element that was present in the workflow automations of the past.

Citizen Data Scientists have been getting a lot of buzz over the past couple of years, as access to data has been democratized and IT departments have enabled self-serve data analytics.  On a similar note, it appears that a new group is emerging in the workplace, Citizen Robot Builders.  These are typically business team members (generally not IT or traditional developers) that have a penchant for programming and a desire to automate some of the more mundane tasks that their business unit performs.  The proliferation and accessibility of Python has certainly contributed to this, as well as browser automation frameworks such as Selenium WebDriver and Puppeteer.  
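To make that concrete, here is a minimal, hypothetical sketch of the data-entry step such a bot performs. The form field IDs are invented, and the fake driver below stands in for a real browser driver so the logic can be dry-run; the `find_element` / `send_keys` / `click` calls mirror the Selenium WebDriver style, and a real bot would pass an actual WebDriver instance instead.

```python
class _FakeField:
    """Minimal stand-in for a web form field, for dry-running the bot."""
    def __init__(self, log, name):
        self.log, self.name = log, name
    def clear(self):
        pass
    def send_keys(self, value):
        self.log.append((self.name, value))
    def click(self):
        self.log.append((self.name, "clicked"))

class _FakeDriver:
    """Records what the bot would type, without launching a browser."""
    def __init__(self):
        self.log = []
    def find_element(self, by, value):
        return _FakeField(self.log, value)

def submit_record(driver, record):
    """Fill a web form the way a clerk would -- one field at a time --
    then submit, using Selenium-WebDriver-style calls."""
    for field_id, value in record.items():
        field = driver.find_element("id", field_id)
        field.clear()
        field.send_keys(value)
    driver.find_element("id", "submit").click()

# Dry run with hypothetical form field IDs.
driver = _FakeDriver()
submit_record(driver, {"customer_name": "Acme Co", "order_qty": "12"})
print(driver.log)
```

The point is how little code the core of a bot requires; the hard parts, as the questions below suggest, are maintenance, credentials, and governance.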

Bot based automation is no different than any other process automation, at a certain point, IT will want and need to exert some control over the bot creation and management process.  Questions that will likely be raised are:

  • Who is responsible for the maintenance of the bots if the underlying systems change?  For example, if the bot is entering data into a web-based form, what happens when the form layout changes?  Assuming there is a login required to this form, what happens when the login credentials expire?  Who will notice?  How?
  • Where is the source code for the bot?  On a business analyst’s laptop?  Are there variants of the bot that have been created for similar but slightly different purposes?
  • Are there security or compliance issues with how the bot is accessing or updating data?  Are there HIPAA, ePHI or GDPR concerns that must be mitigated (or at least acknowledged)?
  • How are the bots deployed?  Where are they running?  What if multiple bots need to perform a task in a certain sequence, how is that orchestrated?

The larger your organization and the more bots you have deployed, the more important and critical the answers to the above questions become.  The thought of dozens (or hundreds!) of bots running rampant throughout the organization, with no centralized control and monitoring mechanism in place, is somewhat concerning.  In an ideal situation, RPA would be a stop-gap toward a more controlled API driven solution that would likely be easier to maintain and monitor.  But, as with most decisions, the ROI must be justified.

Why Predictions Fail

The new year is upon us, and with the changing of the calendar inevitably comes the raft of predictions for the upcoming year. These great soothsayers opine on everything from economic data, stocks, and consumer trends to the latest fashions. Most of these are riskless predictions — except perhaps for one's reputation — as being wrong brings no financial cost.

Businesses, however, must make predictions all the time, and being wrong can bring real financial cost from a failed project or product launch. So, why do predictions fail?  The primary cause is ignoring available information, leading to overconfidence.

Making Bad Predictions

In his book, Thinking, Fast and Slow, Daniel Kahneman describes how, when teams are charged with making predictions, they often rely on what he calls an "inside view," where the team bases its estimates solely on its "particular circumstances." Curiously, even when individual members of the team possess "outside" knowledge about failure rates of similar projects, they still tend to be overly optimistic about the current situation. In the book, he describes how a group of professors collaborating on a book project were asked at the outset how long they believed it would take to complete. They predicted around two years, and no one even contemplated the probability of failure. After these confidential predictions were tabulated, Kahneman asked the professor with the most experience on such projects what he had learned from similar efforts. The professor pondered the question and responded that 40% failed entirely, and the successful ones took seven years to complete. As most teams do in these situations, they ignored this evidence and dove merrily into the book project — completing it eight years later.

The Planning Fallacy

What is at play here, according to Kahneman, is the Planning Fallacy. This is a condition in which plans or forecasts are:

  • Unrealistically close to the best-case scenario, and
  • Could be improved by consulting the statistics of similar cases

In the book project above, you could imagine a range of outcomes where completion in two years was among the best-case scenarios, yet that is what the team agreed was about right. What's required, according to Kahneman, is an outside view that provides base-rate information from similar situations. Without this information, decision-makers anchor their expectations on the initial forecast, making ill-informed and sometimes spectacularly wrong bets. Leading up to the collapse of the financial markets in 2008, data on mortgage default rates were readily available but promptly ignored; failure was not contemplated, except by the very few who spotted the trends (i.e., deviations from the base rate) and shorted the market.

Finding Your Base Rate

As you make your revenue and budget projections or contemplate funding new product development, begin by challenging your underlying assumptions about the range of potential outcomes. Your initial prediction will serve as the anchor for future adjustments, so if it is wildly optimistic, your adjustments going forward will likely be frequent and substantial, creating financial as well as credibility issues with your team and stakeholders. So, look for similar "cases" that match your situation, and find the data that will help inform your estimates of the range of outcomes. Using this data, note the best-case scenario and challenge yourself not to fall in love with it. Taking an "outside view", compare your historical data (sales, profit margins, product cycles, etc.) to similar organizations and project types, and look for deviations and patterns, not averages.

The point is not to squelch optimism or risk-taking, but to be mindful of your decision-making methods. Take an outside view, avoid the planning fallacy, and seek out your base-rate data.


Photo by Caleb Ekeroth on Unsplash

Reevaluate Your Pricing Model to Improve Cash Flow and Margins

Startups and established firms both deal with cash flow issues. For startups, effectively managing cash can mean the difference between living to fight another day or closing up shop. For others, it could mean funding new product development to leapfrog a competitor.  

With the start of the new year, it may be time to take another look at how you price your product or service to improve margins and cash flow.  Software-as-a-Service (SaaS) tech startups can provide a useful model that shows how tweaks in pricing and terms can have a dramatic impact on cash flow and profit.

Pricing Structure Options

SaaS firms operate using some combination of following three pricing methods:

  1. Linear Pricing (LP) – A pure consumption model where, for example, each transaction or analytics “event” costs $0.10. You pay for what you consume.
  2. 2 Part Tariff (2PT) – In this case, the analytics software has a base platform fee of, say, $10,000, and each analytics event processed by the system costs $0.10. The same consumption model as above, but with a platform fee charged in addition.
  3. 3 Part Tariff (3PT) – This also includes the base platform fee, but it is now $25,000 because the first 150k transactions/events are free. However, each additional transaction/event costs slightly more, at $0.15.
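The three structures are easy to compare side by side. A quick sketch, with the fees and per-event rates taken straight from the examples above and hypothetical traffic levels:

```python
def linear_price(events, per_event=0.10):
    """Pure consumption: pay only for what you use."""
    return events * per_event

def two_part_tariff(events, base=10_000, per_event=0.10):
    """Platform fee plus the same per-event consumption charge."""
    return base + events * per_event

def three_part_tariff(events, base=25_000, included=150_000, per_event=0.15):
    """Higher platform fee, first 150k events included, then a higher rate."""
    return base + max(0, events - included) * per_event

for events in (100_000, 200_000):
    print(events,
          linear_price(events),
          two_part_tariff(events),
          three_part_tariff(events))
```

At 100k events, the 3PT customer pays only the platform fee because the included bundle covers their usage; past 150k, the higher per-event rate kicks in, protecting the vendor at high volume.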

With pricing, you are trying to balance the value delivered (both real and perceived) with a price point that captures that value, whether through a cost-plus model or value capture. Looking at the three SaaS examples above, there is lower risk and clear value in the Linear Pricing model, as the customer has visibility into, and can control, the monthly cost by managing their consumption. The risk for the vendor is highly variable usage that causes support challenges and monthly fluctuations in cash flow.

The 2 Part Tariff helps mitigate this risk by including an upfront platform fee to provide a cushion to guard against potential monthly fluctuations in consumption. 

By including a bundle of “free” transactions in the platform fee in the 3 Part Tariff, you have the flexibility to negotiate the number of transactions you would be willing to provide, without compromising on the platform fee level.  Sales teams like this approach as it gives them the option to offer something of value without creating a potentially unprofitable account.

In the SaaS world, contracts are typically structured per unit of consumption (message, analytics event, telephony minute), per person (user), or as an enterprise license agreement.

Introducing Annual Contracts

Getting customers to move away from a transaction or monthly payment model to an annual contract can have a dramatic impact on your cash flow. Using SaaS firms as an example over a hypothetical 24-month period, you can see in the chart how the various payment options affect your cash position month to month. This chart from Tomasz Tunguz at Redpoint Ventures shows how annual prepay creates a cash cushion from months 1 to 12. The gap jumps again at month 13, when annual renewals hit, dramatically changing the financial condition of the company versus monthly or semi-annual options that stretch out payments.
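The cash-flow gap is easy to sketch with hypothetical numbers: one customer worth $120,000 a year, paying either monthly or via an annual prepay that renews at month 13.

```python
def cumulative_cash(months, annual_value=120_000, prepay=True):
    """Cumulative cash collected from one customer over `months` months,
    assuming renewal on the same terms at month 13."""
    monthly_fee = annual_value / 12
    totals, running = [], 0
    for m in range(1, months + 1):
        if prepay:
            # Annual prepay: a full year lands in month 1 and again at renewal.
            if m in (1, 13):
                running += annual_value
        else:
            running += monthly_fee
        totals.append(running)
    return totals

prepaid = cumulative_cash(24, prepay=True)
monthly = cumulative_cash(24, prepay=False)
print("Month 1:", prepaid[0], "vs", monthly[0])
print("Month 13:", prepaid[12], "vs", monthly[12])
```

Prepay puts a full year of cash in the bank in month 1, while monthly billing takes until month 12 to collect the same amount — the cushion the chart illustrates.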


Positioning for Annual Contracts


To be clear, getting your customer to commit to a year’s worth of payments is no small task, particularly if you operate in a commodity market or your value proposition is not clear and compelling. Price points clearly matter, as your customers have cash flow issues of their own to manage, and are not likely to cut a large check without feeling like there is real value in doing so.

You first need to determine your positioning in the market before you consider changes to pricing. Looking at this through the eyes of startup companies could prove useful. Tunguz provides a summary of the alternative options used by successful startups:

"Startups can choose to price below the market, to gain share and grow quickly (Zendesk, AirWatch); they can choose to price at the market price and differentiate based on product features (Dropbox and Box); or they can charge a premium for their product, which reinforces their positioning as the gold-standard in the sector (Palantir and Workday)."

Once you understand how you want to position yourself, you can then start to reevaluate your pricing and contract terms to see if there is an opportunity to bundle products and services, create more value, and have your customers commit to longer-term agreements that benefit all parties. 


Photo by Didier Weemaels on Unsplash

How Better Customer Experience Drives Down Customer Acquisition Costs

If you ask any CEO their thoughts on the proper amount of spend on marketing and sales, in an unguarded moment, they’d likely say as close to zero as possible. In their ideal world, products sell themselves, word of mouth is the only means you need, and customer acquisition costs are negligible.  No such world exists, but there are certain leverage points to focus on that can help optimize marketing and sales expenditures, resulting in lower customer acquisition costs. The key is creating customer advocates that will sing your praises in case studies and gladly make referrals – starting word of mouth buzz that results in new leads and sales. Research by Satmetrix, creator of the Net Promoter Score (NPS), shows that companies in the top 10% of Promoter percentages enjoy referral rates almost twice as high as those at the bottom.

Before you pull the trigger on an elaborate referral campaign, you need to find the cause of customer happiness (or discontent). To do this, you need to home in on two areas: product fit and customer experience. In other words, how much do customers love your product, and how much do they love working with you? Knowing these two things at a detailed level creates much better leverage in your marketing and sales efforts. The higher the advocacy level, the lower the sales and marketing lift required.

Where it’s Worked

Whether it’s a local restaurant that’s always booked or a “unicorn” technology company with millions of users, you can find examples of where word of mouth (or virality in the online world) has produced real results. 

Chances are you are not going to find the always-booked, super hip restaurant advertising in the local paper — the “in crowd” of foodies simply know where to go. In fact, not only do these restaurants not advertise, some don’t even bother to put signage on the building. But the foodie community has no problem finding them and spreading the word to their extended networks of colleagues, friends, and family.  Now, much to the delight of those businesses receiving the positive buzz, social media has changed the definition of “friend”, creating extended communities of connected people that can make or break a new restaurant with their social posts.

In the technology space, the holy grail for marketers is virality, buzz about your product that spreads like wildfire (or like some killer pathogen). Basecamp, an online project management solution for businesses and individuals, began as an internal project to build tools the team would use themselves and morphed into a product with millions of users. From their website (emphasis added): 

"Today, 10-years after Basecamp first hit the market, nearly 15,000,000 people have worked on a project with Basecamp! And every week, thousands of companies sign up to use Basecamp. We’re so grateful that Basecamp has become a worldwide phenomenon almost entirely through word-of-mouth. Our customers evangelize Basecamp simply because they love Basecamp."

There are a couple of important lessons from the Basecamp story. The first is that they entered a space (project management software) that was well established and very crowded. Microsoft Project had been available for years, and there were countless other enterprise tools for managing projects. But what the folks at Basecamp did differently was to focus on a niche user base, technical and creative types, and build only the essential functionality to meet their needs. They didn’t try to take on MS Project and lard up the product with unnecessary features. They built just what their audience needed, and with the founders' background in web design, focused intently on user experience. Second, they developed a community by publishing a popular blog that they used as a platform to communicate their thoughts on a new way of working. The blog became widely popular and later turned into best-selling books. They had a very distinct point of view, authentic and credible, that influenced the way they ran their company, which helped create a halo effect around the product, spurring goodwill and lots of referrals. Their philosophy was “if they like what we have to say, they’ll probably also like what we have to sell.”

More recently, companies like WhatsApp have taken virality to the extreme, generating hundreds of millions of users worldwide. At the time they were acquired by Facebook in 2014, they had a user base of over 450 million, a staff of only around 50, and zero marketing budget. WhatsApp rode the mobile phone wave around the world, providing a chat option that undercut costly messaging services from wireless providers by using mobile broadband to send messages rather than traditional texts. Like Basecamp, they offered just the essential functionality needed — “simple, secure, and fast” — and used the inherent viral nature of the product to rapidly grow a user base, offering the service free for the first year. Also, their commitment not to display ads inside the product helped burnish their credibility with the community, creating a loyal cadre of fans.

Keep it Simple and Easy

The above examples may represent outlier cases of virality in the tech world, but there are valuable lessons here for businesses in any industry and stage of growth. Two words come to mind when thinking about the customer experience they created: simple and easy. To attack established markets and deep-pocket incumbents, they created simple, easy to use products that met the needs of specific users.  Users could easily acquire, set up, and use the products in a matter of minutes with little or no training or support.  There was no complicated pricing or negotiations that get in the way of decision making; they created a frictionless path to user adoption.

Your product or service is likely more complicated than a single purpose app, but the principles still apply: you need to remove complexity and deliver a simple and easy customer experience. This starts with the first customer interaction with your website or a sales rep, through to delivery and support.

Start with baseline metrics for your customer experience, such as NPS and Customer Effort Score (CES), measured across the customer journey to pinpoint critical gaps. It’s important to look at both product and service to determine whether bloated features or complicated pricing are problems, in addition to uninspired customer service and slow issue resolution.
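As a concrete illustration, here is a minimal sketch of how those two baseline metrics are commonly computed from raw survey responses. The NPS bucketing (promoters score 9–10, detractors 0–6) is the standard Satmetrix definition; the 1–7 CES scale and the sample responses are illustrative assumptions, since scale conventions vary by survey vendor.

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def ces(ratings):
    """Customer Effort Score: mean of 'ease of doing business' ratings.
    Assumes a 1-7 scale, where higher means less effort for the customer."""
    return sum(ratings) / len(ratings)

# Hypothetical responses collected at two journey stages
onboarding_nps = nps([10, 9, 9, 8, 7, 6, 10, 9, 3, 10])  # 40.0
support_ces = ces([6, 5, 7, 4, 6])                        # 5.6
```

Tracking these per journey stage (onboarding, delivery, support) rather than as a single company-wide number is what lets you pinpoint where the experience breaks down.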

Maintain Your Balance

A good way to think about the relationship of product, customer experience, marketing, and sales is maintaining balance, with product and customer experience directly influencing or counterbalancing sales and marketing spend.  As the handy nearby illustration shows, the degree of product and customer experience “fit” will impact how much additional lift will be required (money, people, campaigns) to generate leads. 


Stronger product and customer experience fit creates more loyal customers who become willing advocates, generating low-cost referrals. The word fit is important to note: it’s not just about creating more features, it’s about finding the right match of features and overall experience for a distinct audience. In fact, finding the right fit may involve cutting out or simplifying. Less may be more in this case. But it’s impossible to know until you turn your attention to your customers, understand what makes them tick, and design products and experiences that will make them sing your praises.  

Photo by The Creative Exchange on Unsplash

How to Build Trust in Data Using Storytelling

Why do people flatly reject clear evidence or fail to seek out new information when trying to unravel a problem?  Whether it’s housing bubbles or plunging sales figures portending rough times ahead for a company, smart people can often be rather ignorant of current reality.

Some social scientists believe it is a matter of innate biases that we humans have that prevent us from processing new information rationally and dispassionately. Confirmation bias —which causes people to seek out information that confirms their existing beliefs and ignore any contrary data or opinions —is one of the most commonly occurring examples. Similarly, motivated reasoning by scientists and academics combined with “careerism” and the growing pressure to publish more compelling and sensationalized findings that lack the proper academic rigor, offer more evidence of the pervasiveness of bias.

There is also a lack of trust. Business leaders who sing the praises of data and analytics are still, according to research by KPMG and Forrester, skeptical of their data, with only one third indicating they trust the analytics coming from their business operations.

Tell a Story

You have to wonder why something deemed so critical — using data and analytics to gain a better understanding of the world around us — is so often ignored or outright rejected. Unconscious bias plays a part, but trust is the secret sauce, the essential ingredient for more rational, data-driven decisions. There must be trust in the integrity of the data (is it accurate and complete?) as well as the source of the data (no manipulation, hidden agendas). So, how do you build trust in data and analytics while factoring in the potential bias inherent in how people process information?  

There is one communication technique used for centuries that may hold the key to knocking down the bias and trust barriers: storytelling.  Using a narrative device rather than just facts and figures has proven to be effective at persuading, influencing (yes, manipulating), and imparting knowledge for the simple reason that people are better able to retain information delivered in this manner. Research has shown that more areas of our brain are activated by storytelling, which leads to higher levels of engagement and attention. Research in behavioral science has also revealed that cognitive bias is the result of people using mental shortcuts or simplified rules of thumb when faced with complex issues.

To address the trust element, you must bring the audience (your stakeholders) into the story such that they start to relate to and empathize with the “characters” and the situation. In addition, using data visualization to support your story can have a huge impact, as humans process visual images 60 times faster than words.

Lessons From ‘The Big Short’

You can find stories in film and literature that can help you understand complex issues or topics as well as human behaviors and situations. There is perhaps no better recent example than the film The Big Short, which weaved a compelling tale of the financial crisis driven by the convoluted world of mortgage-backed securities and collateralized debt obligations. As the housing market cratered in 2008, bringing down the entire economy with it, no one was spared. At the time, the average person had little understanding of the causes of the meltdown but clearly felt the impact. In the aftermath of the crisis, there was finger-pointing and recriminations across the board, with public trust in financial institutions, regulators, ratings agencies, and politicians plummeting. They all had their version of the truth, none of which resonated with the average citizen. Two factors were working against gaining a better understanding of what happened and thus determining the proper corrective action: wholesale loss of trust and the inherent complexity of the issue.

The film was useful in unpacking this complicated mess by telling a story through the eyes of people closest to the carnage. There was the contrarian hedge fund manager who successfully shorted the market and made billions, and the investment houses and regulators operating in a state of denial, assuming that housing prices could only go up and that rising foreclosure rates were not a concern. There was also the over-leveraged homeowner facing the realities of negative equity. The status quo was working out well for the financial firms and ratings agencies. They were raking in massive fees and were not motivated to seek out contrarian points of view. Homeowners saw home prices continue to go up and wanted to believe the party would never end. This created the perfect environment for confirmation bias to take hold, where people only seek out data points that reinforce what they already believe. In this case, the belief was that the market could only go up and there was no real systemic risk. It was a toxic mix of blind faith and willful ignorance.

The film had the essential elements of good storytelling: setting the context and the characters and telling the tale in a narrative form or story arc. Good storytelling engages the viewer or listener and makes them relate in some way to the characters and the situations in which they find themselves. In The Big Short, the narrative unfolded in a cascading wave of causation: personal incomes stagnating, loan defaults rising, and prices on mortgage-backed securities dropping, exposing the extremely leveraged positions of traders at Bear Stearns, Lehman Brothers, and many others. As we all know, the story didn't end well, except of course for the contrarian hedge fund managers.   

Context and Causation

Traditional communication efforts regarding data and analytics fall short because they don’t set the proper context and don’t unpack complexity by laying out the trail of causation in narrative form. Too often the result is stakeholders who are tuned out, confused, overwhelmed, or mistrustful — most of whom will fall back on their mental shortcuts, making decisions riddled with biases.


To get your team or organization to begin making smarter, data-driven decisions, storytelling can be a very effective tool. By using this method to inform your decision-making processes, you are better able to engage your various stakeholders by setting the proper context, help them relate to the problem by identifying with the characters and situation, and leave them with a clearer understanding of complex issues. To be clear, like any tool, storytelling can be used to manipulate and deceive. It’s incumbent upon leaders in any organization to establish a culture of trust and candor that undergirds this approach. So, next time you have to present your data, think about the story you are trying to tell. And pay attention to those contrarians in your midst. 


Photo credit:  Paramount Pictures

It's Just Commerce

Guest Post by Matt Gutermuth, Co-founder of empower: Technology Positive

Now that “e-commerce” is more popular than ever, it is the perfect time to retire that term! It has become quite “normal” for consumers to purchase any number of things via electronic means, and they expect the online experience to mirror the in-store one. Therefore, it is time for retailers to stop viewing these experiences as separate, but instead, create an environment where consumers can engage with the retail brand on their own terms, when and where they prefer.

When one talks about e-commerce, does this just mean a new online distribution channel or a new way of doing business? Or both? What role do physical stores play in a future world influenced by our digital economy?  Leaders need to stop thinking about e-commerce as a separate, new thing, and just think commerce — delivered in new ways. Consumers care about choice, value, convenience, and cost. They don’t think about channels or corporate silos.

Like Nike’s old tagline, “Just Do It,” going forward e-commerce should become “Just Commerce.” It’s not just about adding an online presence or eliminating stores; instead, it is about evolving the consumer experience across all channels seamlessly, creating a branded experience for your consumer wherever and however they choose to engage.

Apple Stores: Digital Goes Physical

A leader in the digital revolution, Apple, has almost 500 retail outlets across the globe. Their strategy is to deliver a complete consumer experience that reinforces a brand that is cool, hip, sophisticated; melding design and engineering in ways no other company has. Their stores reflect this. Designed like modern art galleries, the stores display a variety of Apple products like works of art.

They also serve a practical purpose, providing support to consumers by setting up their devices and handling repairs. Surprisingly, recent research indicates that men over 65 spend more on Apple devices than any other demographic. You can imagine that this less tech-savvy group is also more likely to use the store to help set up and learn how to use those devices. Creating hands-on experiences was the motivation for opening the first store back in 2001, years ahead of the launch of the first iPhone in 2007. At the time, the intent was to have consumers test drive the new Mac laptops, which still had a small share of the market compared to PCs. Today, these stores create an environment where consumers can get help setting up their iPhone, iPad, or Mac, while checking out a new watch, or purchasing some cool wireless earbuds.

Even as online commerce exploded, Apple doubled down on the strategy, opening hundreds of new stores. They re-invented the in-store experience by removing cash registers, effectively turning every employee into a consumer-facing brand ambassador with a POS system on their individual iPhone. They took a more expansive view of commerce that included online, stores, and distribution through third parties (Verizon, AT&T, Best Buy). But regardless of the channel, the consumer has the same experience with the brand. Each location serves a specific consumer need and demographic segment, all while the brand experience stays consistent.

Creating a Culture of Just Commerce

Changing consumer preferences for anytime, anywhere access require brands to provide a seamless experience across channels. Unfortunately, many companies are culturally and organizationally set up to fail because they see online and physical stores as separate, rather than complementary or integrated. To bolster their online capabilities — and to counter the Amazon onslaught — retail leaders like Walmart and Target, are expanding their digital presence by adopting a tech startup mentality: opening offices in Silicon Valley and funding tech incubators that are run separately from the rest of the organization. They want to move faster, freeing these new teams to take risks and innovate in ways they believe are not possible within their larger corporate structure.

Store No. 8

Earlier this year Walmart launched their “Store No. 8” outpost in Silicon Valley, designed to accelerate their digital efforts by tapping into the talent and culture of the Valley.  While the name harkens back to their very early attempts to innovate the store experience, this new initiative is heavily influenced by digital:  virtual reality, robotics, and machine learning. Target is also getting into the innovation game, announcing a partnership with the tech incubator TechStars to help nurture startups that will develop breakthrough technologies blending the digital and physical worlds.  

Walmart is betting heavily on their online channel, with several recent acquisitions including jet.com. And they are starting to see results, as their online business grew 63% in the first quarter of 2017. Leaders in both companies are trying to find ways to move faster and be more creative and solution-oriented by building separate organizations. But is this the best way to do it? Having the digital business units remain separate and distinct entities could create separate and distinct cultures that would be noticed by the consumer, thus risking the brand's authenticity. The head of Walmart’s e-commerce operations, Marc Lore, states that "every day, I become more and more convinced about the omnichannel advantage.” One could argue that to create an authentic, seamless consumer experience, the company itself needs to maintain a culture that supports this versus creating division.

What may be lost in these innovation efforts is the consumer. If the desired end state is integrated, omnichannel capabilities, then does the existence of separate organizations help or hurt the cause? There is a risk that by doing so they create a technology-led rather than consumer-led approach to innovation. Having a consumer-driven, consumer-first culture is paramount to achieving breakthrough changes that will create competitive differentiation.  


The leadership challenge going forward for all brands is to figure out how best to integrate their e-commerce teams with the rest of the organization. They can do this by aligning all groups around a set of consumer-first principles that guide their decisions and actions. Managing organizations through dramatic change is about defining a clear vision, mission, and strategy that achieve organizational alignment. 

Delivering an unmatched experience by understanding—at a granular level—the needs, wants, desires, aspirations of your consumers is your purpose. Omnichannel excellence will enable your brand to develop the emotional connection with the consumer that drives loyalty and ultimately, successful business results. The fact is, you must commit to the former to achieve the latter.

Everyone in your organization must take an authentic consumer-centric approach to their individual roles, versus the traditional way of doing things. To set the tone, and to make it easier for team members to understand what a winning omnichannel culture looks like, utilize a simple message that all teams can easily rally around: after all, it’s “Just Commerce!” 

Matthew Gutermuth was formerly President & CEO, Safeway.com, and has held senior executive positions at Sysco and Winn-Dixie. He is the founder of G7 Leadership, inspiring others to be great leaders by sharing more than 25 years of leadership experience helping others navigate change, and is also the co-founder of Empower: Technology Positive, bringing technology solutions to the food industry to enable improved performance. www.empowerpositive.net

Where to Find Hidden Intelligence in Unstructured Data

Producing dashboards and generating reports from structured data has been a standard method for extracting insights from accumulated customer information. While this approach has worked well in industries such as online retail, with every consumer click, swipe, or purchase stored in a relational database, many sectors, such as healthcare, have troves of unstructured data that sit unused.  

Deciphering signal from this enormous pile of noise presents companies with tremendous potential opportunity for better understanding customer behaviors that lead to better customer experiences.

Specifically, healthcare could see a wave of innovations in preventative care and transformative patient experience based on new intelligence from previously untapped unstructured data. With estimates of upwards of 90% of data in unstructured form, this will be the new gold rush in customer analytics.


Sorting Through the Universe of Unstructured Data

So, what exactly is unstructured data, and why is there so much of it? Virtually every person-to-person communication today produces some form of unstructured data — typically in raw text-based form. Text messages, chat, documents, and emails are the primary culprits. Now, with video, audio, and still images, the universe of unstructured data has expanded dramatically, but it comes in forms even more challenging to interpret in a large-scale, automated way.

For example, drones are now used to capture video and still images for everything from homeowners insurance claims to measuring customer traffic patterns at shopping centers and amusement parks by photographing parking lots at various times. The challenge lies in evaluating this mountain of data without requiring human intervention — otherwise, it just won’t scale.   

Medical records present a more mundane but potentially transformational example of the power of unstructured data. A typical patient record likely contains a jumble of hand-written notes and documents that may or may not be held within a “certified” Electronic Medical Records (EMR) system. While EMR systems have been adopted by 90% of office-based physicians, consensus estimates are that 80% of that data is in unstructured form. As a result, it remains largely untapped for more proactive and preventative individual care, or for use across patient populations to inform research and public policy. The reason is that these records are likely stored as scanned image files or PDF documents rather than in a relational database that is easy to search, extract, and analyze. 
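To make that search-and-extract gap concrete, here is a minimal, hypothetical sketch of turning one free-text clinical note into queryable fields. The sample note, field names, and regex patterns are all illustrative assumptions; production EMR extraction would involve OCR and clinical NLP (negation handling, abbreviation expansion), not hand-written patterns.

```python
import re

# Hypothetical free-text note, as it might appear after OCR of a scanned record
NOTE = "Pt seen 3/2. BP 142/91, HR 78. Weight 212 lbs. A1c 7.4."

def extract_vitals(note):
    """Pull a few structured fields out of a free-text clinical note."""
    vitals = {}
    bp = re.search(r"BP\s*(\d{2,3})/(\d{2,3})", note)
    if bp:
        vitals["systolic"] = int(bp.group(1))
        vitals["diastolic"] = int(bp.group(2))
    wt = re.search(r"Weight\s*(\d{2,3})\s*lbs", note)
    if wt:
        vitals["weight_lbs"] = int(wt.group(1))
    a1c = re.search(r"A1c\s*(\d+(?:\.\d+)?)", note)
    if a1c:
        vitals["a1c"] = float(a1c.group(1))
    return vitals

print(extract_vitals(NOTE))
# {'systolic': 142, 'diastolic': 91, 'weight_lbs': 212, 'a1c': 7.4}
```

Once fields like these land in a database, the record becomes searchable across a patient's history and aggregable across populations, which is precisely what scanned PDFs prevent.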

The Opportunity in Healthcare

IU Health is using data analytics to develop a deeper understanding of their patients to improve the patient experience as well as health outcomes. They do this by augmenting their internal data with external data to help spot correlations that could lead to more targeted intervention.  According to Richard Chadderton, senior vice president, engagement and strategy, “It is promising that we can augment patient data with external data to determine how to better engage with people about their health. We are creating the underlying platform to uncover those correlations and are trying to create something more systemic.”  They are looking at correlations like housing density and high disengagement from health to spot areas for early intervention.

By taking this “outside-in” approach, providers like IU Health can capture the full context of individual patient behaviors and outcomes. With better understanding, you can imagine a more consumer-like approach to healthcare that would lend itself to use cases such as:

Proactive and preventative care – Patients and doctors with better longitudinal data, delivered via shared online dashboards, could map trends in vitals, cholesterol, blood sugar, weight, and other factors to anticipate and head off more significant problems. Payers could provide financial incentives for patients to participate, lowering premiums for participation and for hitting certain milestones. Patients with mobile phones could receive alerts and text messages that “nudge” them to continue to monitor their health and adhere to a schedule of preventative care visits that include blood tests and physicals.   

Population-wide data use – By aggregating more granular individual patient data, providers and payers can start to isolate the root causes of broader health problems and develop community-wide preventative programs to educate patient populations and drive behaviors that lead to better overall outcomes. 
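A minimal sketch of the "anticipate and head off" idea above: compare a rolling window of home-monitored readings against the prior window and flag a sudden jump. The choice of weight as the metric, the window size, and the threshold are illustrative assumptions, not clinical guidance.

```python
# Hypothetical sketch: flag a worsening trend in home-monitored weight,
# since rapid weight gain can be an early signal of fluid retention.
# The 3-day window and 2-lb threshold are illustrative values only.
def weight_gain_alert(daily_weights, window=3, threshold_lbs=2.0):
    """Alert if the average of the last `window` readings exceeds the
    prior `window`-reading average by more than `threshold_lbs`."""
    if len(daily_weights) < 2 * window:
        return False  # not enough history yet
    recent = sum(daily_weights[-window:]) / window
    prior = sum(daily_weights[-2 * window:-window]) / window
    return (recent - prior) > threshold_lbs

readings = [180.0, 180.5, 180.2, 181.8, 183.1, 184.0]
print(weight_gain_alert(readings))  # True: ~2.7 lb jump between windows
```

Even a simple rule like this, pushed to a patient's phone as a nudge, illustrates how longitudinal data can trigger early intervention rather than waiting for an acute episode.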

The key is to start to get better data on individual patients by unlocking what’s hidden in unstructured form, uncovering trends at the individual patient level, and discovering correlations at the community level that drive innovations in preventative care.  After all, the best outcome is less healthcare and more healthy people.  

Photo credit: Photo by Crew on Unsplash

Why the Data Revolution Stalled in Healthcare

For consumers, the digital and data revolution is well underway. From your smartphone, you can access everything in your personal and work lives: paying bills, transferring money, or ordering any conceivable product on Amazon—all while sipping that latte at your local café. But what you can’t do is access or manage your health records. That requires a phone call to the doctor’s office, a request form that needs to be faxed, resulting in hard copies that need to be mailed. All accomplished in roughly 2-3 weeks if you’re lucky.

Unless you feel nostalgic for the bygone era of curly paper and the screeching/whooshing noise of the fax machine making a connection (which always elicited a satisfying grin), this way of handling patient data is, let’s just say, suboptimal. The problem goes well beyond consumer convenience, as the same lack of access limits clinicians’ ability to better understand a patient’s medical history. Furthermore, with records locked away in file cabinets, there is no opportunity to anonymize and aggregate data to better understand correlations between treatments or procedures and outcomes across patient populations. Right now, Facebook and Google are capturing data on virtually every online activity and likely have more digitized personal and behavioral data on consumers than any health organization. 

Healthcare Data Gap

In their Age of Analytics study on the adoption of data-driven practices across industry sectors, McKinsey Global Institute (MGI) found that U.S. healthcare lagged other industries in “capturing value from data and analytics.” Not surprisingly, location-based data (GPS) and U.S. retail led the way in value capture (see chart), as turn-by-turn directions and one-click purchasing have now become essential capabilities on smartphones. According to MGI, U.S. retail has been able to capture three to four times the value as healthcare, with GPS presenting an even stronger case, at roughly five to six times.

These sectors were able to leap ahead due to the widespread adoption of standardization and integration of technology. Standard protocols for GPS technology, first established by the military, have been around for decades. The accelerated growth of mobile platforms that drove online retail was made possible by standardization on operating systems (iOS and Android) and on languages and frameworks like Python and Ruby on Rails, which armies of developers used to accelerate development cycles and enable interoperability.

The question remains, how does healthcare catch up?  How does the industry start to capture greater value from digital technologies, data, and analytics?  Let’s start with what’s getting in the way.

Barriers to Adoption

In their research, MGI identifies several barriers to greater adoption of analytics in healthcare, including:

  • Lack of incentives,
  • Difficulty of process and organizational changes,
  • Shortage of technical talent,
  • Data-sharing challenges, and
  • Regulations

Scanning this list, you start to understand perhaps why healthcare lags behind, say, retail in adopting analytics. Let’s face it; healthcare is a beast. It’s highly regulated, decentralized, specialized, complex and fraught with risk (legal, financial, ethical). Purchasing diapers and batteries online is not exactly open-heart surgery.

While direct comparisons are difficult to make, perhaps we can draw some parallels to the underlying conditions within certain sectors, like retail, that drive the wider use of data and analytics to improve decision making and create better customer experiences. Personalization, choice, and consumer control are the hallmarks of successful retailers today — particularly the online, digital-native firms. Beyond the obvious example of Amazon, there are specialty retailers like Warby Parker and Zappos, and prepared food providers like Blue Apron, that combine choice, convenience, and personalization fueled by data (customer profile as well as purchase history). To make this work, consumers today have a tacit understanding with online retailers — they know with every interaction these companies are collecting more information about them. They are fine with this to the extent the companies are using this information to understand consumers’ needs, wants, desires, and preferences and deliver something of value in return. As Google discovered with Gmail ads, when the value exchange becomes unbalanced — i.e., pummeling your inbox with random ads — consumers will rebel.

The fact remains that the digital revolution of the last decade was consumer-focused and consumer-led. The customer was at the center of the action. This is where incentives must be focused. The fundamental difference with healthcare is that historically the consumer has been a bystander, as providers and payers interacted to control the type of care, determine prices, and handle payments. The industry has looked and acted more like traditional B2B: slow to innovate, adopt new technologies, and experiment with radically new ways of doing business. Until the consumer is front and center, informed, empowered, and able to make their own decisions, things won’t change. Healthcare needs to get personal.

Personalized Health Using Data

The proliferation of portable monitoring devices—be they commercially available heart rate monitors or sophisticated diagnostic equipment—will produce vast amounts of new data. The question is, will this just create more “noise” or lead to changes in patient behavior, more effective treatment options, and better outcomes? There are some early signs of progress. Essentia Health instituted home monitoring for patients with congestive heart failure and saw a significant drop in readmission rates (2% vs. the 25% industry average).

The answer to whether data can make a difference lies in the level of actual or perceived value the patient gets out of the data-sharing deal. In a consumer-focused world, it’s all about what’s in it for me—convenience, choice, cost savings. If consumers are going to make an effort to share their data (not to mention overcoming privacy concerns), what do they get in return? What they need first is to feel like they are in control, that they can influence the process. It’s no secret why buying a car and buying a house are two of the most stressful and least satisfying consumer experiences. They involve complicated, consequential decisions with very little control and limited access to information. Online players like TrueCar and Zillow have started to chip away at this, offering greater transparency through access to historical price comparisons, enabling consumers to make more informed decisions.

For the data revolution to take hold in healthcare, this same level of consumer control, driven by data transparency, will need to become a reality. Only then will consumer demand begin to grow such that payers and providers will have to innovate and compete based on publicly available data on quality of care and cost, and private, consumer-controlled patient profiles. The result will be better outcomes for all. 


Photo credit:  CommScope via Visualhunt / CC BY-NC-ND

Customer Metrics that Matter in Distribution

Having satisfied customers is your goal, right? But is satisfaction enough? What about loyalty and advocacy? You want to create advocates that will tell your story, recommend you to friends and colleagues. You may be using Net Promoter Score (NPS) to gauge overall customer satisfaction and loyalty, but you also need detailed feedback that is actionable, that allows you to pinpoint problems across your customer's journey. Also, according to research by the Corporate Executive Board (CEB), customer satisfaction alone can be a poor indicator of customer loyalty. They found that 1 in 5 (20%) of self-described ‘satisfied’ customers said they intended to leave the company in question, and more than 1 in 4 (28%) of the “dissatisfied” customers intended to stay.

So, what does create loyalty?  Reducing customer effort makes a difference. 

From the CEB research: “First, delighting customers doesn’t build loyalty; reducing their effort—the work they must do to get their problem solved—does. Second, acting deliberately on this insight can help improve customer service, reduce customer service costs, and decrease customer churn.”

Using NPS is still essential: it gives you a consistent metric for tracking the degree to which customers are satisfied, based on their willingness to recommend your product or service. This provides a good baseline, but you need to uncover what’s driving your NPS scores. Adding Customer Effort Score (CES) to your arsenal to complement your NPS metrics will give you the causal relationships you’re looking for related to how customer effort impacts satisfaction and loyalty.
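To make these metrics concrete, here is a minimal sketch (not from CEB's research; the survey numbers are hypothetical) that computes NPS and CES from raw responses, assuming the standard 0–10 "would you recommend us?" scale for NPS and a 1–7 ease-of-resolution scale for CES:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def ces(ratings):
    """Customer Effort Score: mean of 1-7 ease-of-resolution ratings
    (higher = less effort reported by the customer)."""
    return sum(ratings) / len(ratings)

recommend = [10, 9, 9, 8, 7, 6, 3, 10, 9, 5]   # hypothetical survey data
effort = [5, 6, 7, 4]
print(f"NPS: {nps(recommend):.0f}")  # → NPS: 20
print(f"CES: {ces(effort):.1f}")     # → CES: 5.5
```

In practice you would segment these scores by journey stage or support channel, which is what lets you pinpoint where effort spikes.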

Advantages of Using Customer Effort Score

In the manufacturing and distribution sectors, CEB’s Lara Ponomareff notes several advantages to using CES for these types of firms:

  1. Practical in Nature - Distributors can make tangible changes to their customer service programs by identifying customer pain points. By gauging effort at various touch points, from initial support call to final issue resolution, managers can isolate process inefficiencies, ineffective policies, or employees who need coaching or counseling.
  2. Subjective - Ponomareff notes that the notion of customer effort is subjective, and in CEB’s research they found that, “effort is actually two-thirds of what we call ‘feel,’ or how I felt about the interaction as a customer, and only one-third is what we call ‘do,’ or the actions and steps I actually had to take.” Closely monitoring and measuring customer service outcomes can help understand and influence perceptions of effort. For example, redirecting customers to a self-service site that has ready answers and sophisticated search may seem like a low-effort alternative for a customer to get a question answered. But after just a few minutes of scanning FAQs and prior posts, and only getting partial answers, the perceived level of effort can quickly build, leading to frustrated customers.
  3. Actionable – The level of effort can vary throughout the stages of the customer journey. These variances can help you pinpoint problem areas.  Your online product discovery, selection and check out may be a breeze, but when a problem occurs that requires escalation to a contact center, the customer’s perceived effort may start to escalate.  Interminable wait times, multiple hand-offs, poorly trained or ill-informed reps can make issue resolution seem rather laborious.  Measuring CES at each point of interaction can help isolate issues and fix problems.  
  4. Reduces cost – Callbacks, escalation, and backlogged customer complaints can be costly, as they require staff to resolve current issues and be on call to address the next wave of problems. By isolating issues using CES, customer service operations can start to use practices that help preempt or anticipate future issues. For example, by tracking issues or questions related to the initial use of a specific product after purchase, companies could send a link to a how-to video that addresses the most common questions related to the product and preempts calls to the contact center in the future.


On the adequacy of customer satisfaction as a loyalty metric, perhaps customer experience consultant Joseph Michelli put it best: “At the end of the day, the customer satisfaction score is very little more than a measure of your competence – your perceived competence – in the mind of a customer.” Reinforcing the competitive imperative facing companies today, he emphasizes that, “If you can’t satisfy 90 percent of your customers 90 percent of the time, you should be in another career field.”

A cautionary warning indeed for those just relying on satisfaction to gauge loyalty.  Adding metrics like CES could yield additional insights that will reduce churn and service expenditures, ultimately making your customers happier and you more competitive. 

Why There is Major Disruption Ahead for The Food Service Industry

Guest Post by Matt Gutermuth, Founder G7 Leadership

From the moment that Julia Child entered our living rooms over 50 years ago, bringing refined French cuisine to the masses, the American palate changed forever. Today, celebrity chefs receive unprecedented exposure to all forms of media (television, web, social etc.), and names like Bobby Flay, Giada De Laurentiis, and Guy Fieri are now as recognizable as superstar athletes and rock stars.

We now have an entire consumer segment of “Foodies” that eat out more often, enjoy trying global fare, and are much more educated about the food that they eat. They want natural, organic, locally sourced, clean labels whether they are enjoying a meal at their favorite restaurant or shopping their neighborhood grocery store. In 2016, for the first time in history, retail sales at US eating establishments surpassed those of grocery stores.  And there has been a steady supply of new restaurants to meet this demand. In 2001 there were 469,018 restaurants in the country.  By 2016, that number had jumped to just over 600,000, an increase of 30%. 

Image: US Census Bureau data on restaurant growth


But if you start to dig a little deeper into the restaurant growth story, there are some troubling signs and legitimate concerns about overcapacity — or dare we say a “bubble.” According to NPD, in 2016 the number of independent restaurants in the US dropped by 3%, and the overall number of restaurants (independent and chain) fell by 1%. While certain segments of food service are maintaining moderate sales growth, NPD data shows that the casual dining and midscale/family dining segments continue to be soft. Visits to casual dining restaurants fell by 4 percent, and midscale/family dining lost 3 percent of its trips during the first quarter of 2017. In addition, consumers are defining “dining out” much differently than they have in the past. Traditional restaurants are losing share to food retailers (Wegmans, Whole Foods, HEB, etc.), convenience stores (Wawa, Sheetz, etc.), and meal kits (Blue Apron, Hello Fresh, Plated, etc.). Many food retailers are now offering restaurant-quality meals that can be consumed on site or brought home at a value price point. In fact, the fastest growing segment of food retail happens to be food service offerings created and sold inside the grocery store. Convenience stores are also beginning to steal restaurant trips with their food service offerings. If you have been inside a Wawa lately, you have probably noticed that it is certainly not your father’s gas station!

We are more fascinated with food than ever, and admittedly the majority of us don’t know how to cook. There are a growing number of online providers like Blue Apron, Hello Fresh, and Plated that deliver restaurant-quality meals with the prep work already done and easy-to-follow instructions, so that even those of us who struggle to boil water can create terrific gourmet meals in our own kitchens. With these meal kits, household celebrity chefs now have other ways to get their gourmet fix without going to a restaurant or a grocery store.

Larger Trends

As we look ahead to the next decade, technology will challenge the status quo at traditional grocery chains and restaurants. Today’s consumer has much different expectations than just 10 years ago (thank you iPhone, Amazon, and Google!). We expect to engage with brands on our terms, the way we choose to, not the way the brand “markets” to us. We expect things now, customized to our liking. Amazon can get you what you want, in some cases within the hour, and Google can provide any answer in seconds. Now that Amazon has entered the food industry directly with their acquisition of Whole Foods, we could see the greatest disruption the food industry has seen in over 50 years.

Emerging Leaders

The lines between food service and food retail are already blurred and will only become more so over time. The daily question we all face, “What do you want to do for dinner?”, was once black and white, with “cooking” representing a trip to the grocery store and “eating out” representing a trip to a restaurant. That daily decision is no longer black and white, as restaurant-quality meals become a larger, more profitable, and growing segment for grocery stores, convenience stores, and meal kit providers. Leading the way is Wegmans, which has unseated both Publix and Trader Joe’s as America’s favorite grocery chain. The research firm CRC projects that five years from now prepared foods will represent 6.7% of grocery store sales, up from just 1.7% five years ago, and predicts that prepared food sales could exceed $65 billion annually. Today, “going out” to eat no longer exclusively means a trip to your local restaurant, but is increasingly likely to mean a Wegmans, Whole Foods, HEB, or any number of other traditional grocery retailers that continue to improve their prepared offerings each and every day. Every dollar spent on prepared meals, and every trip made to “dine out” in traditional grocery outlets, represents a lost opportunity for the food service channel. Food retailers have been successful at growing food service because they meet all the essential elements that drive consumer behavior, which according to NPD’s Warren Solochek include “convenience, quality food, value, and a positive experience.”

Lessons from Walmart

When Walmart entered the grocery business, there was a good bit of skepticism, with established grocery chains scoffing at the notion that a big-box, non-food retailer could possibly be successful selling food. “They don’t know our business” and “Food is much more difficult than TV sets” was the traditional food retail perspective at the time. By the late 1980s, Walmart proved to be a fast learner, launching their first Supercenter in 1987. They are now the largest “grocery store” in the country, with over 21 percent market share of the U.S. traditional grocery industry. Walmart’s mastery of supply chain and logistics, honed over previous decades, enabled them to execute their mission of “Save Money, Live Better” and changed how consumers bought their groceries. They became, in essence, a supply chain and logistics company with stores that sold food. Their inventory management and cost controls continue to be the envy of the industry and create a significant competitive advantage. They have forever changed how products are sourced, transported, and priced on the shelf. Walmart’s size and scale, coupled with their unmatched supply chain and logistics expertise, have put enormous pressure on their traditional food retail competitors and in many ways changed how consumers shop for food.

The Future with Amazon

Perhaps the most disruptive force in the food industry today is Amazon. Arguably, Amazon has the most robust household information of any retailer in the world (brick-and-mortar or online). Consumers today require customization and want personalization, and Amazon is poised to deliver both in a way that other retailers can’t and won’t be able to for some time. Couple their knowledge of the consumer with supply chain and logistics expertise that rivals Walmart’s, and it is not a stretch to suggest that Amazon is a formidable threat. Much of what we heard in the ’80s when Walmart entered the food business, we are hearing again in reference to Amazon’s desire to win in food. Assuming that the acquisition of Whole Foods gets completed, Bezos has already done what most naysayers claimed he couldn’t do — quickly scale a physical food presence across the United States. With over 450 stores (mini distribution centers) located in sought-after locations (affluent neighborhoods), he’s done just that. So, what does this mean for the industry, and which players will be impacted the most? Food retail? Food service? The short answer is all of the above. Amazon’s move into this space is an absolute game-changer, and the impact will be felt across the food landscape (restaurants and grocery stores) for years to come. Amazon will unleash their superior household-level insight along with their supply chain and logistics expertise to once again change how the consumer shops and interacts with food, much like Walmart did in the ’80s and ’90s. In addition to their operational advantages, Amazon doesn’t have to play by the same rules on Wall Street as their traditional competitors do. This may change over time as Amazon continues to grow, but today their profit expectations are very different from other food industry companies trying to compete, freeing them up to take risks that others can’t.

Amazon is betting that they can innovate faster and execute better than incumbents by using their technology, knowledge of the consumer, and supply chain and logistics advantage to change how both consumers and culinarians purchase and interact with food. They have redefined choice and convenience for consumers in the online world, and who’s to say they can’t do the same in food service or food retail? Amazon has already built out local distribution centers that enable same-day delivery of merchandise. How soon will they scale up food delivery to the home? When do they start supplying directly to restaurants? Offering price transparency, convenience, and choice is not easy in the food service world today, but these are becoming an expectation of consumers everywhere. These same consumers are chefs and restaurant owners who will welcome the transparency and ease of doing business that Amazon currently provides. In addition, most of these culinarians are probably already Prime or Amazon Business customers (over 50% of US households are Prime members). Can they deliver the impossible: size, scale, and highly differentiated offerings that are personalized? I would not bet against them!


In the face of this relatively new and formidable threat, coupled with a more educated and demanding consumer, food retailers, restaurants, and the entire ecosystem that supports and supplies them must reevaluate everything that they do. It has always been important to start with the consumer, but in today’s environment the consumer has more control than ever before, and the failure to keep up with them will be devastating. If you are a food company that does not have the consumer front and center (in how you operate every single day, not just in a company slogan or mission statement), you will struggle to compete and survive in this new world.

Amazon operates each day with a “Day 1” mindset. This approach to the business enables them to move quickly and provide consumers with things they didn’t even realize they wanted. A large segment of the food industry is just now building online ordering capability and an omni-channel strategy. Amazon is already there and is close to taking “ordering” completely out of the equation with automated replenishment. Satisfying a very different consumer and competing in this new digital world will require new thinking and bold leadership. Ask yourself: is the product or service you provide fast, convenient, transparent, and easy? If not, that is a gap that Amazon will exploit. Ask yourself: are your consumer performance standards high enough, and are you delivering on those standards at a rate that will enable you to compete in the future? If not, you need to challenge your existing metrics as they pertain to the consumer/customer.

Being “good” is simply not enough when the expectation is great!  The most important first step is the realization that the competitive landscape has shifted dramatically and only those able to adapt and change will survive.  There is some time to react, but the clock is definitely ticking. 

About Matt: Matt was formerly President & CEO, Safeway.com, and held senior executive positions at Sysco and Winn-Dixie. He is now founder of G7 Leadership, inspiring others to be great leaders by sharing more than 25 years of leadership experience to help others navigate change.


Photo credit: Premshree Pillai via Visual Hunt

Machine Learning and Price Optimization

Determining what price to charge for your product or service can at times be deceptively easy: figure out what your competition is doing and either match or beat that price. This approach works fine for commodity markets, where price transparency and comparison are easy. But what happens when there aren’t readily available comparisons? Or when you have “similar” product characteristics, but no access to other information that may have influenced pricing decisions, such as other services included, seasonality, location, etc.? In this case, you often go with a gut feeling or accumulated experience to price products and hope there is adequate demand at your chosen price level.

For most organizations, even if they did have data on all the product characteristics and external factors, they would still struggle to process this information in a way that informs day-to-day pricing decisions. The task of combing through and analyzing large data sets, determining correlations, and assigning weighting factors to various product characteristics and other variables is still beyond the capabilities of most organizations' technology stacks, which are designed for managing supply chains and customers, not high-powered analytics.

Machine learning technology is starting to fill this gap, and traditional companies and startups are changing how pricing is done using smart analytics, processing power, and human intuition to optimize pricing. Let’s take a look at a couple of real-world applications in the insurance and hospitality industries.

Insurance Industry Application

As one of the largest insurers in the world, AXA has massive amounts of data on customer claim histories, and they are putting this to good use to help prevent large loss claims. Every year, 7%-10% of the company’s customers cause an accident, with most involving small claims of hundreds or thousands of dollars. 

However, approximately 1% of customers are involved in “large-loss” cases of over $10,000, and AXA needed a better way to predict, and hence reduce, the number and size of large-loss cases. They had been using a more traditional machine learning technique called Random Forest but were only getting prediction accuracy rates of less than 40%. In the hopes of getting better results, they started using Google’s TensorFlow deep learning solution and saw their prediction accuracy climb to 78%. They were able to do this by tapping into the advanced neural network models that Google had been refining over the years and combining them with the scale of Google's cloud offerings to deliver the computing power necessary to handle the processing load. AXA is now in a position to accurately price risk based on a better understanding of the attributes of policyholders and other factors that lead to large-loss cases.
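The comparison AXA ran can be sketched in miniature. The snippet below uses synthetic data and scikit-learn stand-ins (a Random Forest baseline versus a small neural network), not AXA's actual pipeline or TensorFlow model, on an imbalanced task resembling rare large-loss prediction:

```python
# Hypothetical illustration: comparing a Random Forest baseline to a
# small neural network on an imbalanced binary classification problem.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Roughly 5% positives, standing in for rare large-loss claims.
X, y = make_classification(n_samples=4000, n_features=20, weights=[0.95],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)

results = {}
for name, model in [
    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("neural network", MLPClassifier(hidden_layer_sizes=(64, 32),
                                     max_iter=500, random_state=0)),
]:
    model.fit(X_tr, y_tr)
    results[name] = model.score(X_te, y_te)  # held-out accuracy
print(results)
```

One caution: on data this imbalanced, raw accuracy can flatter a model that simply predicts "no large loss," so a real evaluation would also look at recall on the rare class.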

AirBnB’s Pricing Algorithm

Airbnb’s pricing challenge is a bit more complicated than most, as the users, not the company, are responsible for setting prices. To support hosts' pricing decisions, the company needed to provide the tools and data to help them optimize the price received while maintaining occupancy levels. While conducting user research, Airbnb observed that during the initial sign-up process, when hosts came to the pricing page they immediately began to search for other similar properties. The problem was that not only was this a laborious and time-consuming process, but they often had trouble locating similar properties. They discovered what most people learn when trying to sell their home: it’s tough to find exact comparable properties. They also had to contend with pricing comps across an entire city, spanning multiple neighborhoods. They needed a way to automate this analysis and provide meaningful price guidance to hosts.

So the technical team at Airbnb set their sights on solving two problems: 1.  Automate the property comparison process, and 2. Understand supply and demand dynamics to make timely price adjustments. 

Unlike eBay, where there aren’t any location or time dependencies — you can buy and sell anything from anywhere at any time — lodging is very location- and date-specific. And in the Airbnb model, properties can be as varied and idiosyncratic as the people who own them. To solve for this, Airbnb developed a list of the prime characteristics of properties, applied weightings to each one based on their importance to potential renters, and then ran these assumptions against years of transaction data to model against actual outcomes (i.e., what was the final price).
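A stylized version of this weighting exercise, using hypothetical listing data and plain least squares rather than Airbnb's actual model, might look like the following: fit per-characteristic weights against historical booking prices, then use them to suggest a price for a new listing.

```python
# Hypothetical sketch: estimate price weights for two listing
# characteristics (bedrooms, review score) by ordinary least squares.

def fit_weights(features, prices):
    """Solve the 2x2 normal equations X^T X w = X^T y for two features."""
    a = b = c = d = e = 0.0
    for (x1, x2), y in zip(features, prices):
        a += x1 * x1; b += x1 * x2; c += x2 * x2   # accumulate X^T X
        d += x1 * y;  e += x2 * y                  # accumulate X^T y
    det = a * c - b * b
    return ((c * d - b * e) / det, (a * e - b * d) / det)

# Hypothetical history: (bedrooms, review score) -> nightly price.
history = [((1, 4.0), 170), ((2, 4.5), 235), ((3, 4.8), 294), ((2, 3.9), 217)]
w_bed, w_score = fit_weights([f for f, _ in history], [p for _, p in history])

# Suggested price for a new 2-bedroom listing with a 4.6 review score.
print(round(w_bed * 2 + w_score * 4.6))  # → 238
```

The real system handles many more characteristics (plus location and date effects), but the core idea is the same: learn how much each attribute contributed to realized prices, then apply those weights to new listings.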

Image: Airbnb pricing tool

They were looking at how each variable correlated with price, to understand the key drivers of value and to inform their pricing engine to make better recommendations. They discovered that the number of ratings correlated with higher demand, and that the use of certain types of photos translated to higher prices. Surprisingly, professional photos of living rooms didn’t fare as well as the cozy bedroom shots taken by the owner.

With these new insights, Airbnb was able to provide a more useful pricing tool for hosts that not only allowed them to price their properties based on more comprehensive comparative analysis but also provided dynamic pricing recommendations in response to changing demand. Similar to how airlines handle pricing, hosts get ongoing guidance based on market conditions so they can make adjustments that will drive higher occupancy.


Machine learning augments human decisions by narrowing a set of choices. But just running a “black box” in the background that produces the miraculous answer is not sufficient. In the above examples, insurance agents need to be able to explain the rationale behind auto premium price differences, and rental hosts need to understand why and how price recommendations were determined, to maintain trust and confidence in the information. It’s important to keep in mind that while the machine learns and provides answers, humans still need to explain what it means and why the results should be trusted. The product lead for Airbnb put it best: “We wanted to build an easy-to-use tool to feed hosts information that is helpful as they decide what to charge for their spaces while making the reasons for its pricing tips clear.”

A Closer Look at Einstein, Salesforce's New AI Features

Is there any promise for the use of AI in sales and marketing? In a B2B context? Leading CRM solution provider Salesforce seems to think there is. In the past year, they rolled out AI-enabled enhancements to their cloud-based sales, marketing, and support solutions that are designed to deliver more predictive analysis, helping sales reps identify the most qualified leads, and giving marketers the intel to know who to target with what offer. 

To determine whether there is hope for such solutions or just hype, we’ll take a quick look at the major features of Salesforce’s Einstein AI, review some of the early critiques by the experts, and ponder some of the real-world use cases that might yield breakthrough results. Salesforce has deployed Einstein across their entire suite of solutions, but for brevity’s sake, we’ll focus just on the sales cloud.

Feature overview of Einstein Sales

  • Einstein Lead Scoring: Einstein Lead Scoring models are built specifically for each customer and organization, which ensures that the models are tailored to the business. Einstein Lead Scoring analyzes all standard and custom fields attached to the Lead object, then tries different predictive models like Logistic Regression, Random Forests, and Naïve Bayes. It automatically selects the best one based on a sample dataset.
  • Einstein Opportunity & Account Insights: Sales Cloud Einstein analyzes all the standard fields attached to the Opportunity data in addition to email and calendar data, and then uses machine learning, natural language processing, and statistical analysis to provide sales reps and managers with "Predictions", "Key Moments", and "Smart Follow-Ups."
  • Einstein Activity Capture: This logs historical emails and calendar events from up to six months back for Gmail and up to two years back for Office 365.  It then works in the background to passively capture every email or calendar event sent or received. The captured emails and events are all displayed in an activity timeline, providing a history of the team’s relationship with a customer.
  • Einstein Follow-Ups:  This provides proactive email notifications, letting reps know when a customer needs an immediate response, or set a follow-up reminder.
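The model-selection step described for Lead Scoring can be sketched as follows. This is a hypothetical scikit-learn illustration on synthetic data, not Salesforce's implementation: fit the candidate classifiers named above and keep the one with the best cross-validated score on a sample dataset.

```python
# Hypothetical illustration of automated model selection: try several
# classifiers and pick the best by mean cross-validated accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Stand-in for lead records: 1,000 leads, 15 fields, converted-or-not label.
X, y = make_classification(n_samples=1000, n_features=15, random_state=0)

candidates = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "naive bayes": GaussianNB(),
}
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Because the winning model depends on each customer's own data, re-running this selection per organization is what makes the resulting lead scores "tailored to the business."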

Early critiques

Having lived through many “hype-cycles” over the years, technology buyers tend to react in the same way whenever there is some breakthrough new technology: “so, what problem does it actually solve?” In a recent article on new AI solutions, NextWeb talked about how “AI-powered tools are now helping scale the efforts of sales teams by gleaning useful patterns from data, finding successful courses of action, and taking care of the bulk of the work in addressing customer needs and grievances.” Techcrunch takes a more pragmatic view of Salesforce's AI: “certainly automatic model generation, if it works as described and truly delivers the best models in an automated fashion, is highly sophisticated technology, but in the end, users don’t care about any of that. They want tools that help them do their jobs better, and if AI contributes to that, all the better.” On how to think about AI in the technology solution stack, they noted, “the fact is AI is not a product in the true sense, so much as a set of technologies. We should keep that in mind as we judge these announcements, looking at how they improve the overall products and not at the shiny bells and whistles.”

Possible Use Cases

Complex B2B sales remains a mostly human activity, and any technology deployed to support the process should help augment, not replace, human judgment. If applied correctly, AI could help spot consistent patterns that narrow down a list of highly qualified leads for reps to contact given certain triggers. This is no doubt useful and could drive efficiency, but if the objective is to close larger, more complicated enterprise sales, the most likely use case could be AI that tells reps who to talk to, but not what to say or do next. As we have discussed before, buyers and the buying process are not perfectly rational, and algorithms need good data.

CRM systems can be full of human-keyed data that may be inconsistent, inaccurate, or lack sufficient depth to be meaningful.  Additionally, much of what’s entered can be subjective (close dates, probability of close, deal size) and often overly optimistic. What ultimately matters are customer behaviors: what products did they buy, when did they buy, what did they pay. Using actual prior transaction data for the AI analysis would likely improve relevancy and accuracy of predictions to make marketing and sales more efficient, and more importantly, more productive. 

Lessons from Google Data Centers: “Gaming” Their Way to Better Efficiency

Google data centers consume lots of power. By recent estimates, they have over 2.5 million servers that consumed 4,402,836 MWh of electricity in 2014, equivalent to the average yearly consumption of about 366,903 U.S. family homes. Over the years they’ve had scores of PhDs focused on coming up with solutions to optimize data center efficiency. Then they unleashed machine learning on the machines.

Using the same AI technology that taught itself to play Atari and beat the world champion in Go, Google’s DeepMind machine learning algorithms now control 120 different variables in their data centers, constantly learning what combination of adjustments maximizes efficiency. The result? DeepMind was able to achieve a 15% reduction in overall power usage and a 40% reduction in energy used for cooling, translating into hundreds of millions in cost savings.
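As a toy illustration of the idea (a made-up cost function and simple random hill-climbing, far simpler than DeepMind's neural-network approach), the loop below tunes 120 control "knobs" to reduce a cost, accepting only changes that improve it:

```python
# Toy sketch: tune many control variables toward lower "energy" cost.
import random

def cost(knobs):
    # Hypothetical energy model: each knob has an unknown best setting (0.6).
    return sum((k - 0.6) ** 2 for k in knobs)

random.seed(0)
knobs = [random.random() for _ in range(120)]  # 120 variables, as at Google
start = cost(knobs)
best = start
for _ in range(20000):
    i = random.randrange(len(knobs))
    old = knobs[i]
    knobs[i] = min(1.0, max(0.0, old + random.gauss(0, 0.05)))  # small tweak
    new = cost(knobs)
    if new < best:
        best = new          # keep the improvement
    else:
        knobs[i] = old      # revert the tweak
print(f"cost reduced from {start:.2f} to {best:.2f}")
```

The point of the toy is the mindset: with enough interacting variables, continuous small adjustments guided by a measured objective can find combinations no manual tuning would.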

Commenting on these results, author and MIT professor Erik Brynjolfsson addressed the broader implications: “You can imagine if you take that level of improvement and apply it to all of our systems — our factories, our warehouses, our transportation systems, we could get a lot of improvement in our living standards.”

Apparently, we’ve barely scratched the surface:  According to McKinsey: “while 90 percent of all digital data has been created within the last two years, only one percent of it has been analyzed, across both public and private sectors.” And behemoths like GE are fully on board with advanced analytics, spending $1 billion this year alone to analyze data from sensors on gas turbines, jet engines, and oil pipelines. If they can achieve Google-like results, the implications could be staggering.  

A Thought Experiment

Most organizations don’t have the resources of Google or GE, but they do experience similar problems that could be solved with a better understanding of all the variables that impact performance and a mindset of constant improvement. It’s important to keep in mind that Google already had some of the most efficient data centers in the industry before they unleashed DeepMind on the problem.

Obviously, you can’t snap your fingers and suddenly become Google.  So, perhaps a thought experiment is in order: one where you, for a moment, suspend disbelief, set aside current constraints, and think about what’s possible. With the Google example in mind, in what areas of your organization could you reap the greatest benefit with respect to, for example, production or servicing costs, or the close ratios and customer retention that drive revenue?  What are the key variables that impact each of these areas, and if you had perfect information, what would it tell you? If you come up with, for instance, five variables that impact customer support costs, try to come up with 10 or even 20.  Challenge your team to do the same.  The point is not to engage in some pie-in-the-sky exercise, but to appreciate the level of complexity inherent in any activity within your business, and to start to look for correlations between events, activities, behaviors, and outcomes.
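To make the exercise concrete, here is a minimal sketch of hunting for those correlations: rank candidate variables by how strongly each one correlates with an outcome such as support cost. The variable names and data are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical candidate drivers of monthly customer-support cost.
drivers = {
    "open_tickets":      rng.normal(50, 10, n),
    "new_releases":      rng.normal(2, 1, n),
    "self_service_rate": rng.normal(0.5, 0.1, n),
    "onboarding_score":  rng.normal(7, 2, n),   # actually irrelevant here
}
support_cost = (
    120 * drivers["open_tickets"]
    + 800 * drivers["new_releases"]
    - 5000 * drivers["self_service_rate"]
    + rng.normal(0, 500, n)                     # unexplained noise
)

# Rank candidate variables by absolute correlation with the outcome.
ranked = sorted(
    drivers,
    key=lambda k: abs(np.corrcoef(drivers[k], support_cost)[0, 1]),
    reverse=True,
)
print(ranked)
```

Correlation is only a first pass (it misses interactions and can be confounded), but even this simple ranking forces the team to list candidate variables and test intuitions against data.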

Further, you need to challenge the conventional wisdom in your organization that reinforces the notion that finding the “single cause” of performance issues will produce optimal outcomes, when in fact understanding the broader collection of variables will likely produce better results.  Google identified 120 variables just for data center energy consumption.  How about you?

Digital Transformation: Where are You Now and Where Do You Need to Be?

We hear a lot about digital transformation and disruption, with boards pushing CEOs to “become digital” and completely rethink their business models. Geoffrey Moore provides an interesting framework for thinking about digital disruption as a continuum, or series of steps, with firms having different starting points based on where they are in their life cycle: 1. new entrants incubating and scaling a truly digital business model, or 2. established companies modernizing an already scaled “industrial” model.

Regardless of your starting point, creating and building a strong analytics competency is essential to remain competitive.  His point is that digital is data.  And when we talk about disruption, it’s about how companies use data and analytics to create new business models or services.  

Moore sees competitive firms in the future as those able to read the “signals” from customer data:  

“In the digital economy, such signals live at the intersection of two types of datasets—systems of record, which capture transactional data, and systems of engagement, whose log files capture all the peripheral interactions that occur in and around a transaction.”

Getting to this point requires climbing a series of “stairs” to reach the point of digital disruption. But first you need to figure out where you are now. 

Climbing the Stairs to Digital Disruption

According to Moore’s model, there are five steps that firms must ascend, with each corresponding to their digital IT maturity:  1. systems of record, 2. systems of engagement, 3. engagement analytics, 4. systems of intelligence, and 5. systems of disruption.


Here’s a quick synopsis of each phase:

  1. Systems of Record:  ERP and CRM systems streamline the quote-to-cash process.  Key challenge -- systems are still organized around products, which makes it difficult to get a single view of the customer.
  2. Systems of Engagement:  Mobile applications and omni-channel communications improve customer experience, reduce time to transact, and eliminate disintermediation. Key challenge -- if systems of record are behind in their "accommodation of customer-centricity," according to Moore, “you now have a ‘two-stair’ challenge ahead of you.”
  3. Engagement Analytics:  Dashboards and reports extract insights from Systems of Engagement about customer preferences, market trends, systems inefficiencies, and user adoption.  Key challenge -- at this phase you still have “human-in-the-loop computing” that relies on people being able to "detect patterns and infer relationships.”  Innovation still moves at a “human-centric pace.”
  4. Systems of Intelligence:  Machine learning detects near-invisible correlations, infers causation, enables prediction, and proposes prescriptions, in order to optimize all types of interaction. Key challenge -- you need the right talent to “secure the data science expertise to work the algorithms, and then you need to get access to enormous amounts of data to feed the beast.”
  5. Systems of Disruption:  Systems of Intelligence leverage proprietary insights to disrupt inefficient markets with novel digital services.  Key challenge -- getting through steps 1-4, which ultimately may require a new infrastructure model, a new operating model, and a new business model.

Moore posits that today most established companies operating in more traditional industries (i.e. not the digital natives) are somewhere between systems of record and systems of engagement, with a smaller number of innovators reaching Stage 3 - Engagement Analytics. He warns that established companies need to be firmly at stage 3 by the end of this decade or face a real existential crisis.

The "Two-Stair" Challenge

So, are you facing a two-stair challenge today?  Based on Moore’s framework, the degree of “customer-centricity” you have now in your systems and processes is a good indicator.  Firms that have attained just the systems of record level tend to be more inwardly focused on efficiency and less externally focused on the effectiveness of customer interactions. Readjusting your focus externally and understanding your customer using historic transaction data and the “interactions that occur in and around a transaction” is the key to accelerating your ascent to digital disruption and maintaining competitiveness in the new digital economy.

In our latest eBook: The New Customer Experience: Using Data and Analytics to Drive Digital Transformation, we discuss the key elements of the new B2B customer experience; the four common barriers to digital transformation; your essential analytics toolset; and how to get started down this path using feasibility studies to gauge where you are now and where to invest next in your digital journey. 


Photo via VisualHunt

Using a Journey Map to Improve Customer Experience

The old adage “you never get a second chance to make a first impression” still holds true today. However, the reality is that customers have multiple “first impressions” along their journey, from evaluation to purchase to post-sales support.  And a bad experience at any point can wipe out any goodwill generated to that point. Gartner calls each of these points a “moment of truth” -- a critical decision a customer makes along the journey that can make or break the relationship, driving the customer to abandon the purchase, or perhaps the relationship entirely.

Companies use a variety of customer surveys and tools to try to gauge customer satisfaction and determine problem areas. While an essential part of a company’s toolkit, surveys are just one source of input to include in a comprehensive customer journey map that shows where, when, and how the company dropped the ball. To determine how a journey map might work for you, you need to understand the core elements of a typical map, why they are important, and how you might use them to pinpoint problems and identify opportunities for improvement.

Primary Components of a Customer Journey Map

There is no single type of customer journey map (that would be too easy); there can be many permutations based on what you provide (product or service) and the breadth of your focus (a single customer persona or a complete process). Regardless, there are some common core elements found in all good journey or experience maps.

The folks at Adaptive Path use “Experience Maps” to capture the complete customer experience and identify areas of customer pain and opportunities for improvement.  It starts with establishing guiding principles and includes the journey model, qualitative insights, quantitative metrics, and key takeaways. It’s an “artifact that serves to illuminate the complete experience a person may have with a product or service.”  


Guiding Principles – These principles define the context for the experience or journey map, and the scope of the analysis, be it specific personas or value propositions.  The objective is to gauge at multiple points across the customer journey, how well the customer experience agrees with these guiding principles.  

Journey Model – This is where you document the path the customer takes, including the transitions they have to make between different phases (sales, delivery) and channels (web to phone support).  Here you want to capture not just the steps but illustrate something about the process: what is not working, the scope of the problem (how many customers are affected), the nature of the activity (linear steps or variable), and what systems and tools are involved.


Qualitative Insights – These insights include the “doing” (journey) but also the thinking and feeling—the frame of mind of the customer at any given point in the journey.  They may feel anxious, confused, angry, or disappointed. You also want to understand what they are thinking: “What is the easiest way to get from A to B,” “I want to get the best price but I’m willing to pay more for convenience,” “The answer I’m looking for is not on the website, what now?”

Quantitative Info – Here is where you can use the survey data, web traffic, or abandon rates to understand the source and magnitude of the problem. By including clear metrics on the journey map (survey data in the Rail Europe case), you can quickly pinpoint problem areas. 
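As a minimal sketch of the kind of quantitative analysis a journey map can carry (the step names and counts below are invented, not figures from the Rail Europe case), step-to-step abandon rates quickly surface the worst drop-off:

```python
# Hypothetical visitor counts at each step of a booking journey.
funnel = [
    ("research",  10000),
    ("shopping",   4200),
    ("booking",    1500),
    ("payment",    1300),
    ("post_sale",  1250),
]

# Step-to-step abandon rate; the biggest drop marks the likeliest pain point.
drops = []
for (step, count), (_, next_count) in zip(funnel, funnel[1:]):
    drops.append((step, 1 - next_count / count))

worst_step, worst_rate = max(drops, key=lambda d: d[1])
print(worst_step, f"{worst_rate:.0%}")
```

Placing numbers like these directly on the map, next to the qualitative “thinking and feeling” annotations, is what lets the team agree on where to dig first.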


Takeaways -- The takeaways should guide decisions related to solving the problems identified in the journey mapping exercise:  reducing pain points and taking advantage of opportunities to improve your customer experience.  These bullet points provide a clear summary for your team as to priorities going forward and areas for investment that will deliver measurable value.


A customer journey map can be a valuable tool to help you isolate customer experience challenges. It can also be an unwieldy, tangled mess if you don’t apply some basic structure to the upfront research, construction of the map, and evaluation of key takeaways.  To help keep you grounded and focused, start with the key principles and use them as “guardrails” to keep you on track to better customer insights.


Photo via Visual hunt

Will Algorithms Replace Human Judgement in the B2B Sales Cycle?

With all the talk about advanced algorithms, artificial intelligence, and chatbots one begins to wonder when virtually every B2C or B2B transaction will be automated.  In this utopian (dystopian?) future, the machines will know exactly what you want, buy it for you, and deliver it to you the same day.  But what role will humans play?  Are we to be disintermediated by the machines?  Replaced by algorithms? Future thinker and researcher Andrew McAfee makes the case that algorithms can and do outperform “experts” that rely on accumulated experience and good old human judgement—but only under certain conditions.

Understanding where you can use advanced algorithms will help you think through where to apply investments in analytics and what complementary skills you need on your marketing and sales teams.

Humans vs. Machines

According to McAfee, there is an abundance of evidence indicating that algorithms outperform human experts in their prediction-making prowess.  One research study he cited, which involved a meta-analysis of 136 different studies comparing the prediction accuracy of machine vs. man, showed that in only 8 of the 136 studies the “expert judgments were clearly better than their purely data-driven equivalents.”  He further noted that “Most of these studies took place in messy, complex, real-world environments, not stripped-down laboratory settings.” So, why is this the case? In what situations or conditions do algorithms have the advantage?  And what about human intuition? To answer this, he calls on a bit of theory regarding the ideal conditions for decisions based on judgment and intuition.  The ideal conditions for human judgment include:

  • an environment that is sufficiently regular to be predictable
  • an opportunity to learn these regularities through prolonged practice

In the medical field, you can find examples of expert judgment that fit the above criteria.  McAfee notes that since human biology changes very slowly, medicine meets the first criterion, but gyrating stock markets certainly don’t.  The second condition benefits from fast and consistent feedback loops that promote learning that can be applied to future decisions.  Anesthesiologists working with dozens of patients experience rapid feedback loops -- seeing the effects of their actions -- that help them improve their decisions. Where human intuition and judgment tend to break down is in “noisy,” highly variable environments with large data sets that aren’t easily interpreted.

Implications for B2B Marketing and Sales

For marketing and sales professionals, there are two aspects of the above analysis that impact the effectiveness of algorithms for lead qualification and selling, and that perhaps tilt the scales toward human decisions:  the availability of complete and accurate data to feed the algorithm, and the unknown preferences and biases of real buyers. The quality of the output from an algorithm can be highly dependent upon the veracity of the inputs. If there is missing or incomplete data, or a small sample size is used that skews results, the algorithm could suffer. Additionally, the complexity of the typical B2B sale makes automation with analytics trickier.  In the B2B sale, there are often multiple decision-makers and influencers, they can be less than forthcoming in sharing their intentions, and they face reputational risk when making buying decisions.  All of which requires more handholding, coaching, educating, understanding, and communicating -- none of which is easily automated.


If we apply the ideal conditions for intuitive decision-making listed above to the B2B marketing and sales functions, we can see where the line could be drawn between automated algorithms and human decisions.  The first condition describes an “environment that is sufficiently regular to be predictable,” which would apply to a sales process whereby qualified prospects have a consistent set of pain points and requirements.  The second condition is where the sales team “learns these regularities” and becomes more adept at educating, positioning, and pricing based on this understanding, resulting in faster sales cycles and higher close rates.

The antithesis of this is random leads that have no consistency and high levels of variability, which limits the intuitive decision-making ability of your sales team in determining what is “right” for the customer.  In this context, the job of analytics and algorithms is to eliminate the randomness by combing through data to find patterns that help identify consistent characteristics of qualified leads and deliver them to sales.  It’s not utopian to imagine machines and humans cooperating to make better decisions; you just need to have each focus on the task it is best equipped to handle.
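A toy sketch of that division of labor: the machine mines historical conversion rates per lead attribute and scores new leads, leaving the nuanced conversations to salespeople. The attributes and history below are invented for illustration; this is not a production lead-scoring model:

```python
from collections import defaultdict

# Hypothetical historical leads: (industry, company_size, converted?)
history = [
    ("software", "large", True),
    ("software", "small", True),
    ("software", "large", True),
    ("retail",   "small", False),
    ("retail",   "large", False),
    ("software", "small", False),
    ("finance",  "large", True),
    ("finance",  "small", False),
]

# Pattern mining at its simplest: conversion rate per attribute value.
def conversion_rates(rows, field_index):
    won, total = defaultdict(int), defaultdict(int)
    for row in rows:
        total[row[field_index]] += 1
        won[row[field_index]] += row[-1]
    return {k: won[k] / total[k] for k in total}

by_industry = conversion_rates(history, 0)
by_size = conversion_rates(history, 1)

# Score a new lead as the average of its attributes' historical rates.
def score(industry, size):
    return (by_industry.get(industry, 0.0) + by_size.get(size, 0.0)) / 2

print(round(score("software", "large"), 2))
```

Even a crude scorer like this shows the pattern: the algorithm filters for consistency so the humans can spend their judgment on the conversations that remain.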

Photo credit: MattHurst via Visualhunt.com /  CC BY-SA      

Using Customer Lifetime Value to Create a Data-Driven Culture

Recent research shows that businesses have made some progress with their Big Data and analytics projects, but success is mostly limited to expense reduction initiatives.  Business transformation efforts and new revenue streams continue to lag.

Analytics Projects Still Expense-Driven

The results from a New Vantage survey of Fortune 1000 executives regarding their Big Data projects show that “decrease expenses” was the area with the highest response (49.2%) for “Started and seen value.”  Meanwhile, “Add revenue” and “Transform the business for the future” received the highest responses for “Not started.”  Interestingly, “Establish a data-driven culture” received the highest response (41.5%) for “Started and not seen value.”

The report hints at the potential problem:

“In spite of the successes, executives still see lingering cultural impediments as a barrier to realizing the full value and full business adoption of Big Data in the corporate world.”

If one assumes that Big Data or advanced analytics is a major element of any business transformation that will create differentiation and competitive advantage, then removing the impediments to this transformation is paramount for execs. The key to creating a data-driven culture may lie not in focusing on data per se, but on customers and the value they create for your firm, and the value you deliver. Paradoxically, focusing externally on your customers may be the best way to drive internal cultural change.

Using CLV Metrics to Drive Change

MIT’s Michael Schrage talks about how companies can use customer lifetime value (CLV) to bring a more rigorous, data-driven approach to building long-term customer relationships. Talking about the value of CLV, he noted:

“By imposing economic discipline, ruthlessly prioritizing segmentation, retention, and monetization, the metric assures future customer profitability is top of mind.”

He also notes that CLV is not enough: “While delighting customers and meeting their needs remain important, they’re not enough for a lifetime.” He argues that CLV metrics should measure how effectively “innovation investment” increases customer health and wealth.  In his workshops, he found that clients talked about how customers become more valuable to a company when “they buy more stuff,” “they pay more,” or “they’re loyal to our brand” -- all traditional CLV-type metrics.  He advocates going beyond these measures of value to incorporate more of an “investment ethos” that looks at the customer value created when customers:

  • Share good ideas
  • Evangelize for you on social media
  • Reduce your costs through self-service
  • Introduce you to new customers
  • Share data
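The argument can be grounded in the textbook CLV formula. The sketch below uses the standard infinite-horizon simplification (constant margin, retention rate, and discount rate; all numbers invented) to show how a retention lift from these “investment” behaviors flows straight into lifetime value:

```python
# Textbook CLV under simple assumptions: constant annual margin m,
# retention rate r, and discount rate d.
def customer_lifetime_value(margin, retention, discount):
    # Sum of discounted expected margins over an infinite horizon:
    # CLV = m * r / (1 + d - r)
    return margin * retention / (1 + discount - retention)

base = customer_lifetime_value(margin=1000, retention=0.80, discount=0.10)
# An "investment" lever: referrals and self-service lift retention 5 points.
lifted = customer_lifetime_value(margin=1000, retention=0.85, discount=0.10)

print(round(base), round(lifted))
```

Note how nonlinear the payoff is: a 5-point retention improvement raises CLV by roughly a quarter, which is why behaviors that deepen the relationship deserve a line in the metric.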

By expanding the notion of what constitutes customer value, companies can start to rethink segmentation, pricing, and promotions. It might also educate and better align your employees -- regardless of their job title -- with a complete view of customer value and the importance of measuring and tracking it. This investment view of CLV helps sales understand how new customer introductions create new opportunities; marketing can appreciate how evangelizing on social media drives more leads; product development gets new ideas; and customer support becomes more efficient as a result of greater customer self-service.  Once employees see the potential benefit to them, they just might be more motivated to seek out and use these metrics, thereby creating the data-driven behaviors and decision making that are key to transformation.

Schrage observed in one of his workshops how participants kept interchanging references to the creation of lifetime value as when “we” do something or when “they” (customers) do something. He noted that there were much broader and deeper discussions around how to engage with and invest in customers. And more comprehensive CLV metrics are the method for tracking how well the company is engaging and investing.


Cultural change is, and has always been, a difficult proposition for companies of any size.  Using a broader definition of CLV and the metrics to track it could help align multiple areas of your organization around customer value and jump-start the data-driven cultural change that will drive transformation.  By clarifying what customer value means, how it is measured, and how each employee impacts these metrics, you have a chance of creating a broader sense of purpose -- increasing customer value -- around which your team can rally.