The Internet has revolutionized how, when and from whom we buy products. But many brands are still struggling to create and support a truly unified, creative and technologically effective retail experience. While they can see the parallels in the brick-and-mortar and online buying experiences, they often manage them in very different ways. In my own career, I’ve seen ecommerce fall under the heading of marketing, IT, and finance, and even sequestered as a “specialty” channel of its own.
Why the confusion? Marketers now face a unique mix of the familiar and the unfamiliar when trying to generate online sales. On one hand, there is the traditional focus on the customer, their needs and motivations. On the other, a seemingly unending collection of SaaS tools, sleek (and not-so-sleek) server boxes, and a world of data that can overwhelm a print or TV marketer unaccustomed to folding such detailed behavioral and interaction data into the decision-making process. Beyond this lies a more fundamental reason: the odd place we ecommerce folks often occupy within an organization. Quite simply, we are constantly trying to figure out how to meld the deluge of data with the creative gut feelings any good marketer will tell you are essential for reaching the consumer in a fundamental way.
To be successful, you must bring together the experience and expertise of the traditional marketer with the best components of today’s technology and data. And to optimize this relationship and the success of your ecommerce strategy, testing is going to be key.
Why the Qualitative Matters More Than We Tend to Think
I’ve worked with retailers who truly live by the “Test everything!” methodology. Every major move in their UI, inbound journeys and marketing mix is supported by thorough and statistically significant results. They’ve collected months and years of incremental, provable gains in their online metrics. And if you looked at a history of their quarterly business reviews, you’d see great numbers that paint the picture of a thriving user experience strategy many would find enviable. Sadly, though, visit one of these sites and the beautiful graphs documenting all that success don’t quite add up.
Despite the positive metrics, a good visual designer might look at sites like these and feel that something isn’t right. They may immediately see a very real need to simplify and streamline the experience. You can’t lose sight of the human element to marketing, but how do you resolve this conflict between the data and the marketer’s experience and intuition?
The simpler, more fundamental fix is to consider a basic status that sometimes gets lost in optimization segmentation: new visitors. Retailers often get so caught up in segmenting customers by behavior, channel or individual affinities that they overlook the new-user experience. All those fancy navigation options and personalized touchpoints across the expected shopping path can overwhelm a new site visitor and even drive them away. This isn’t always evident in our reporting, partly because the “new” visitor segment always contains some returning visitors we simply haven’t identified as such.
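To make this concrete, here is a minimal Python sketch of how an aggregate A/B test "win" can hide a decline for new visitors. The numbers and the structure of the results dictionary are entirely made up for illustration; it assumes you can already split test traffic by new vs. returning status.

```python
# Hypothetical A/B test results: (visitors, conversions) per segment.
# All figures are invented to illustrate the reporting blind spot.
def conversion_rate(conversions: int, visitors: int) -> float:
    return conversions / visitors

results = {
    "control": {"new": (8_000, 160), "returning": (2_000, 120)},
    "variant": {"new": (8_000, 144), "returning": (2_000, 168)},
}

for arm, segments in results.items():
    total_visitors = sum(v for v, _ in segments.values())
    total_conversions = sum(c for _, c in segments.values())
    # Aggregate rate: this is what a topline report would show.
    print(f"{arm} overall: {conversion_rate(total_conversions, total_visitors):.4f}")
    for segment, (v, c) in segments.items():
        # Segment rates reveal what the aggregate number hides.
        print(f"  {segment}: {conversion_rate(c, v):.4f}")
```

In this example the variant wins overall (3.12% vs. 2.80%) on the strength of returning visitors, while the new-visitor rate actually drops from 2.00% to 1.80%. That is exactly the kind of result a topline-only report would record as a success.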
The more difficult solution requires you to ask the right questions and create the right tests. Too often, we focus on a specific area of the site experience and ignore other on-page elements not specifically called out during design review. But every test should examine the entire page experience, paying close attention to the number of interaction points (clicks) required of the customer in both the original experience and the test iterations. This way, we ensure our tests target customers at the same point in their site journey by keeping an eye on how many interactions we’ve asked of them while exploring the site.
Data Is King, But Sometimes It Should Take a Backseat
Data should be front and center in both creating and tracking tests, but anyone who has met with success in testing knows that it should only be a guide. If we stick solely to what the data is telling us, we eliminate the opportunity to be creative and try new things. Making decisions independent of the data is often the way to discover the most successful solutions.
I once worked with a marketing team that was very data-driven. Every campaign ideation session included prominently displayed analytics, purchase reports and seasonal trends. As I participated in the first of these sessions, I noticed a tendency for the team to veer away from conjecture or assumptions about what a customer may or may not be doing. At several points, an outside-the-box idea was shot down right away if it wasn’t supported by prior trends and testing data. While being data-focused is important, it’s easy to become dogmatic in that position. In my experience, every truly great test idea comes from someone looking at what has come before and applying new perspectives and suppositions to the task.
How do you strike a balance? Consider banishing data from your creative sessions and replacing it with images or descriptions of what the data represents. For example, instead of talking directly about bounce rates across email click-throughs, discuss who those visitors are likely to be and define their motivations and what they are and aren’t interacting with. Creative sessions bring a variety of people into the room, from digital marketing managers to visual designers to product experts, each with different perspectives and ways of looking at data. By removing the numbers, graphs and charts from the discussion and focusing instead on the customer, their indicated needs, and what their behavior may be telling us, you can speak more universally across disciplines and teams. You’ll also reduce the likelihood that contradictory data will throw cold water on the next great idea.
To learn more about Corra, visit our website.