While KPIs, measurement models, attribution, and optimization remain the go-to frameworks of digital marketing plans, I often get the feeling we lean too heavily on these performance metrics without stepping back to look at the bigger picture. Instead of asking how we can decrease CPA by 10 percent or increase enrollments by changing frequency caps, we need to do our due diligence, expand beyond our silos, and understand how evolving consumer consumption and usage habits are truly impacting our business.
U.S. Internet usage has reached saturation, and with it, so has usage of search engines and email. According to eMarketer, overall Internet penetration came in between 75 and 85 percent (depending on the source) in 2012, with a projected 2.4 percent increase this year. With saturation, consumers are developing a more cultured palate for their online experiences, and their expectations have evolved to determine what is and is not acceptable. The days of championing the glorious measurable results of digital media, with staggering year-over-year increases in performance, are coming to an end, and the economy is not entirely to blame. For example, consumers have become so desensitized to the clutter of banner ads that click-through rates are dismal, and retargeting is now the show pony, saving the day and keeping banners alive. However, with the continual threat of cookie deletion, multiple browsers, and Mozilla's latest move to block third-party cookies by default, “tracking” and “accountability” could have some challenges ahead.
While Internet usage has reached its saturation point, online ad spending continues to show healthy year-over-year growth and is not expected to dip to single-digit growth until 2015, according to eMarketer. With online ad spending reaching over $40 billion in 2013, this combination of more sophisticated, saturated users and ever-increasing advertising spend is leading to online confusion. The hyper-connected consumer we imagined 10 years ago is so outdated a notion that nowadays consumers feel more disconnected than hyper-connected; there is always something they are missing out on, be it one social platform vs. another, a missed promo code or daily/flash deal, or (GASP!) maybe a real-life event outside of their digital world.
As we attempt to take a step back and look at the big picture, here are a few things to consider:
There is never a shortage of opportunities to test and try out. If you are always the “wait and see” follower brand, you will never stand out amongst the clutter and chaos, because consumers become desensitized so quickly. What is new one day becomes standard the next (who doesn’t have a responsive design site?). So, just like dieting, it’s important to keep a healthy balance and practice moderation: maintain a good mix of the tried-and-true and the new and innovative. That keeps you in shape to continually scan the big picture for opportunities to embrace change rather than waiting until it changes you.
This article was originally published on ClickZ on March 13th.
I recently attended the first annual QCon held in New York City. I was fortunate enough to attend some excellent presentations from folks who design, build and operate some of the most heavily visited sites on the Internet: Twitter, Facebook, Etsy, and Netflix to name a few.
While the companies at QCon might all be in different businesses – from social networking to e-commerce to entertainment – they all have a few things in common: fanatical attention to detail, rapid delivery of enhancements, intolerance of wasted effort, a genuine understanding and appreciation of their users, and an intense focus on constant improvement. The end result is that all of their businesses are well regarded by not only their users but also by fellow software developers.
Twitter: It’s all about speed
Twitter is one of the most fascinating companies in the world. In a short amount of time they’ve tackled one of the most fundamentally challenging tasks humans have faced: scaling the delivery of information to a virtually unlimited audience. The challenge of sharing ideas with a wide audience was historically logistical in nature and tended to boil down to the cost of delivery; sharing a message with a few people was easy, but sharing a message with millions of people was difficult, time consuming, and expensive. Today, a message can be shared in virtually real time with millions of people around the world just as easily as sharing the same idea with a few friends in your living room. This is a serious game-changer.
So how does Twitter deliver a tweet from Lady Gaga to 20 million of her followers almost instantly?
Rather than using a single approach and a single set of technologies for all tweets, they define sensible boundaries within the overall system and optimize within those boundaries. Timelines, search, and delivery are built with completely different sets of technologies and treated as separate concerns within the overall Twitter system. They use a blend of technologies such as Ruby on Rails, Java, and Scala to accomplish this. The use of multiple languages is commonly referred to as polyglot programming, and it is becoming a much more widely used technique as applications become more complex, more heavily used, and more depended upon to be available 24/7.
Twitter also uses a few other very simple techniques, like storing data in the same format in which it’s read, to avoid unnecessary processing at runtime. They also do something most engineers haven’t quite mastered: they handle traffic outliers like Lady Gaga differently from regular users like me. When I send a tweet to a handful of people, it is processed using a general-purpose delivery mechanism that is good enough for average users and doesn’t consume too many resources. But a tweet from Lady Gaga is treated as a special event from a technology perspective, because, after all, it needs to be delivered to millions of followers.
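The outlier-handling idea can be sketched in a few lines of Python. This is purely illustrative: the follower threshold is artificially tiny, the in-memory structures stand in for real data stores, and the function names are invented; it is not Twitter's actual implementation. The sketch fans a tweet out to followers' timelines at write time for average users, but stores a celebrity's tweet once and merges it in at read time instead.

```python
from collections import defaultdict

# Hypothetical cutoff between "average user" and "outlier".
# Kept tiny here for illustration; Twitter's real value isn't public.
FANOUT_THRESHOLD = 2

timelines = defaultdict(list)        # follower_id -> precomputed timeline entries
celebrity_tweets = defaultdict(list) # author_id -> tweets held for read-time merge
followers = defaultdict(set)         # author_id -> ids of their followers
following = defaultdict(set)         # user_id -> authors they follow
_seq = 0                             # global ordering for merged timelines

def post_tweet(author, text):
    """Fan out on write for average users; defer outliers to read time."""
    global _seq
    _seq += 1
    tweet = (_seq, author, text)
    if len(followers[author]) < FANOUT_THRESHOLD:
        # Average user: pay the delivery cost now, once per follower.
        for follower in followers[author]:
            timelines[follower].append(tweet)
    else:
        # Outlier: storing one copy beats millions of timeline writes.
        celebrity_tweets[author].append(tweet)

def read_timeline(user):
    """Merge the precomputed timeline with followed celebrities' tweets."""
    merged = list(timelines[user])
    for author in following[user]:
        if len(followers[author]) >= FANOUT_THRESHOLD:
            merged.extend(celebrity_tweets[author])
    return [text for _, _, text in sorted(merged)]
```

The trade-off is the one the talk described: regular tweets cost a little extra work at write time so reads stay cheap, while a single celebrity tweet avoids millions of writes by shifting a small merge cost onto each reader.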
Web applications are often implemented with a one-size-fits-all approach. For sites as heavily visited as Twitter, fanatically measuring usage and adapting to unusual patterns like a tweet from Lady Gaga is critical to keeping the service fast and stable.
Etsy: Optimizing for happiness
Etsy is a remarkable company. Not only do they sell fantastic products from very creative craftspeople, but they also regard the act of writing code for their own site as an act of craftsmanship. Their blog, Code as Craft, is one of my personal favourites.
Rather than customizing a general-purpose e-commerce platform, Etsy built their own platform in-house using a set of simple technologies. Simplicity is what makes Etsy both remarkable and profitable. Rather than falling down the rabbit hole of complexity, by keeping things simple they can rapidly enhance, maintain, and support a site that does some serious business: their merchants sold a combined $538 million of crafts in 2011. Just how rapidly do they deploy enhancements to the site? Over 20 times per day. In a single month, Etsy performs 300-600 upgrades. Rather than waiting months for the next quarterly release, at Etsy an idea can be planned, designed, built, and delivered in an afternoon.
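Deploying 20-plus times a day is only workable when shipping code is decoupled from releasing features, a practice Etsy has written about on Code as Craft using config-driven feature flags. Here is a minimal, hypothetical sketch of that idea in Python; the flag names, percentages, and API are invented for illustration and are not Etsy's actual tooling:

```python
import hashlib

# Hypothetical flag store; in practice this would be a config file
# deployed alongside the code and editable without a full release.
FLAGS = {
    "new_checkout": {"enabled": True, "percent": 10},  # 10% ramp-up
    "old_search":   {"enabled": False},                # code deployed "dark"
}

def is_enabled(flag, user_id):
    """Decide per user, deterministically, so each user sees a stable experience."""
    cfg = FLAGS.get(flag)
    if not cfg or not cfg["enabled"]:
        return False
    percent = cfg.get("percent", 100)
    # Hash user+flag into a 0-99 bucket; the same user always lands
    # in the same bucket, so ramping from 10% to 20% only adds users.
    digest = hashlib.sha1(f"{flag}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# Both code paths ship in the same deploy; the flag picks per request.
def checkout(user_id):
    if is_enabled("new_checkout", user_id):
        return "new checkout flow"
    return "classic checkout flow"
```

Because a "release" is now just a config change, an unfinished feature can be deployed dark in the morning, ramped to a small cohort after lunch, and switched off instantly if it misbehaves, which is what makes afternoon-to-production delivery safe.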
How do they manage such a rapid evolution and keep both users and engineers happy?
* Part 2 in this series will be published next week. Stay tuned!
Data is the hot trend for 2012. With digital efforts representing more of the brand budget, there’s a rush to figure out how to better optimize a fixed budget to drive bigger results. Anyone who has been in the industry long enough knows that regardless of what the data says, looking at it in different ways can usually shed light on a good story. In the competitiveness of marketing and advertising, we sometimes focus on our own performance and how good “we” did with a measurement of performance more reflective of corporate objectives.
According to the Forrester Consulting study “Data-Driven Design,” digital professionals said their most important team performance metrics were the ability to complete projects on time or faster (65 percent) and the ability to complete projects on or under budget (64 percent). Trailing behind were the metrics that truly impact the customer, such as bringing new ideas to market. When teams focus solely on speed and budget efficiency, as these numbers suggest they do, the best, most innovative solution can stay hidden. These numbers only tell the story in black and white.
There is an abundance of data available from industry, business, customer, website, advertising, and social media. The attribution model is in overdrive attempting to crack the code of exactly where you should spend that extra dollar in an effort to increase productivity efficiencies and drive a stronger return on investment (ROI). While there are many types of data, looking at it in silos often leads to more tactical executions or optimizations to improve performance within that channel; however, in order to truly take a step back, you must first look at your objectives and complete an audit of all of your available data. This will allow you to look at the different channels and sources to begin to see a more holistic picture. During the audit, you can peel apart what key performance indicators (KPIs) are available within each source and what other variables could be valuable as well. While closely looking at the numbers, stories tend to emerge around areas of improvement and opportunity.
Additionally, we need to ask the bigger questions. How does this data relate to the overall business objective, the customer experience, or key brand drivers? Or the bluntest question of all: why should we care?
Taking the quantitative data and adding a little qualitative research can be extremely beneficial. While the data may tell you where there are issues with a specific part of a website, digging further through qualitative research or user testing can uncover “why” there is an issue. Qualitative analysis could include focus groups, user interviews, behavioural segmentation, usability testing, eye tracking studies, or ethnography. In order to tell the full story from beginning to end, we must understand the customer journey and take the data plus these additional insights to improve performance. This process doesn’t just hold true for website performance, but all customer touch points where we use data to evaluate performance.
We need to push it further. We need to find the opportunities to shine. Adding “colour” to any project always comes with challenges: holes open up, and it gets messy. Typically, adding this colour starts with asking a few questions. Below are a few ways to add a little chaos; they by no means save money initially, but they will make everyone a little smarter and save time and energy in the long run.
At the end of the day, digging beyond the cold, dry numbers and looking outside the data can be so much more rewarding for employees. If it’s really done the right way, the consumer wins too.
This article was originally published on ClickZ on June 6th.