After reading a recent LinkedIn post by Jeff Haden, "Want to Achieve Lifelong Success? An Army Ranger Says You Need This 1 Trait the Most" (spoiler alert: it's adaptability), something occurred to me. One of the reasons I enjoy solving business problems with analytics is that analytics is all about […]
The rise of analytics and big data presents a once-in-a-generation opportunity for organisations to put themselves at the cutting edge and create a competitive advantage by developing a culture of analytical success. Yet most seem unable to grasp the opportunity that is within their reach.
Eric Hoffer wrote that "In times of change, learners inherit the Earth, while the learned find themselves beautifully equipped to deal with a world that no longer exists." So why in this time of rapid change are our organisations failing to invest in educating their most important asset - their employees?
Organisations should be investing time and money into programs and courses to supercharge their staff for success with analytics in business.
With the recent advances in analytical programs and training, many staff will not yet have received the latest education. Even those with relevant education may not have had the chance to complement their knowledge with industry experience; as a result, they may be missing essential domain knowledge and communication skills.
Three things you can do to invest in your organisation’s biggest asset and secure a competitive advantage:
- Involve staff in classroom training courses and workshops that provide the skills and experience needed to succeed in a rapidly changing environment. Couple this with professional accreditations where available.
- If struggling for initial budget to invest, access complimentary training to demonstrate the value created. For example, at SAS we have made our Programming 1 and Statistics 1 eLearning courses available at no cost, accessible online. Showing the ROI through investing time to develop skills can build support for investment in further education.
- Look out for the full university degree programs being developed around Data Science, Business Analytics and Advanced Analytics, and integrate these into individuals' professional development plans.
Interested in learning more about education programs? Please leave a comment below or tweet me @james_enoch
It doesn’t matter how many resources you have. If you don’t know how to use them it will never be enough.
The New Year brings fresh opportunities and 2014 is an exciting year for SAS users in the Australia New Zealand (ANZ) region to form connections, share ideas and get inspired! At the start of 2013, Hanlie Myburgh, SAS Customer Advocacy Manager for ANZ, announced a new incentive to encourage SAS users to present at their local SAS user group: the Best Presentation Award. This award is given to the presenter with the best presentation content determined by attendee feedback in the evaluation form/online survey. The winner receives an Apple iPod and the chance to win the MAIN PRIZE – a trip to SAS Global Forum 2014 in March, with flights, accommodation and registration expenses all covered by SAS Institute Australia. In 2013 there were 15 best presentation winners who were placed in the random draw for the MAIN PRIZE and on Friday 17th January 2014 Bob Whitehead was awarded the trip to SAS Global Forum 2014. Bob shares his experience in being a presenter at WASUP as follows:
“I have always enjoyed presenting new topics at these events. The opportunity to learn something new and impart that knowledge to other SAS customers is a great experience. The unexpected surprise of winning a trip to the SAS Global Forum in Washington DC this year is obviously fantastic. After almost 30 years of being involved with SAS, I have yet to attend a SAS Global Forum.”
Bob would also like to “Thank SAS Institute Australia for supporting the various user community groups around Australia and providing such a great incentive for people who choose to present papers.”
Keep an eye out for Bob’s presentation on his experience at SAS Global Forum 2014 at the Q2 2014 WASUP meeting in June.
What about you? Have you considered being a SAS user group presenter? There are many discussions and blogs on the personal and professional benefits of presenting at a local, regional or global user group. Some benefits mentioned over the past year are:
- Presenting helps to strengthen presentation and public speaking skills
- It allows us to share information with, and to get advice from, our peers
- It is an opportunity to stimulate ideas through collaboration
- It provides inspiration to others by sharing knowledge
- It allows the presenter to seek guidance and direction in their own SAS knowledge and learning
- It gives back to the community
These comments come from SAS users across the entire ANZ region at all levels of knowledge and experience. I am sure you will agree that presenting at a user group can be encouraging and valuable for all SAS users and students in our left-of-the-date-line community.
With the SUNZ conference in Wellington, New Zealand on the 18th February, and plans underway for the 1st quarter Australia SAS user group meetings, now is your opportunity to share the work that you have done, be recognized by your peers, and be rewarded personally and professionally for achieving your goals. And don’t forget the MAIN PRIZE, a chance to win a trip to SAS Global Forum 2015! If you are interested in presenting at a SAS user group meeting this year, let your SAS ANZ user group committee know before the presenter slots are filled!
If you are unsure about presenting and would like to discuss further with SAS ANZ user group committee members and users, feel free to start a discussion on SANZOC, our online community that complements the local SAS user group meetings within the Australia and New Zealand region through online collaboration, networking and sharing. We look forward to seeing you there…
In my recent post, I talked about how an easy-to-understand, well-communicated strategy can enhance shareholder value and be the catalyst for better leadership. Now I'd like to continue with those themes.
The Intelligent Balanced Scorecard
Strategy formulation and execution is not just about finding one set of measures to track in order to ensure that the organisation is moving in the right direction against its strategic objectives, nor is it about estimating the cause-and-effect relationships between them once and for all.
Strategy formulation and execution is about setting bold goals, continuously monitoring and fine-tuning a set of relevant measures that drive performance, and understanding which initiatives to focus on to boost performance further. No organisation operates in a vacuum, so the balanced scorecard must continuously be updated to suit the current economic and competitive environment.
By applying correlation techniques to the traditional balanced scorecard, management will be able to better understand how strongly the defined objectives relate to each other and whether the defined measures actually drive performance.
In addition to correlation techniques, the balanced scorecard can be further enhanced by leveraging statistical time series forecasting. The aim is to understand how performance-driving measures are expected to perform in the future. This allows for various what-if analyses that can be used to optimise the allocation of strategy execution (StratEx) spend across initiatives and thus maximise the return on StratEx.
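As a minimal illustration (in Python, with made-up quarterly values, not real scorecard data), a simple exponential smoothing forecast of one measure might look like this; production scorecards would use richer time series models:

```python
# Minimal sketch: forecasting a scorecard measure one period ahead with
# simple exponential smoothing. All values below are hypothetical.

def ses_forecast(series, alpha=0.5):
    """Simple exponential smoothing; returns the one-step-ahead forecast."""
    level = series[0]
    for value in series[1:]:
        # Blend the newest observation with the running level
        level = alpha * value + (1 - alpha) * level
    return level

# Hypothetical quarterly "customer satisfaction" scores (0-100)
customer_satisfaction = [72, 74, 73, 77, 79, 78, 82, 84]

next_quarter = ses_forecast(customer_satisfaction, alpha=0.5)
print(f"Forecast for next quarter: {next_quarter:.1f}")
```

A forecast like this per measure is what makes a what-if question ("if readiness keeps improving at this rate, where does satisfaction land?") answerable with numbers rather than intuition.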
Make the traditional balanced scorecard intelligent
Let's look at a simplified example of how to apply intelligence to a balanced scorecard in a financial services organisation.
The organisation has defined four strategic objectives as shown in the figure below. In addition, the performance of each objective is tracked by two measures: "customer satisfaction" (customer perspective) and "human capital readiness" (learning and growth perspective).
By calculating the correlation between the defined measures, management is able to understand to what extent, for example, "human capital readiness" drives "customer satisfaction". A positive correlation between the two measures suggests that increases in "human capital readiness" are accompanied by increases in "customer satisfaction", and the strength of the correlation indicates how closely the two move together - though correlation alone does not prove causation, so the result should be read alongside the strategy map's cause-and-effect logic.
Once management understands to what extent an improvement in "human capital readiness" drives "customer satisfaction", it can determine how much StratEx to allocate to initiatives driving "human capital readiness", and thereby improve "customer satisfaction".
By applying correlation and statistical time series forecasting techniques to the entire balanced scorecard, management is able to understand to what extent measures drive each other's performance and validate the perceived cause-and-effect relationships between strategic objectives. This works not only backwards, as a strategy validation exercise, but also forwards, so that the strategy can be aligned with a changing environment.
By using an intelligent balanced scorecard, management is able to perform what-if scenarios by leveraging their new understanding of how objectives, measures and initiatives drive performance today and how they are expected to drive performance in the future.
Additionally, StratEx can be allocated to the initiatives that are expected to contribute the most to the organisation's performance. Hence, management can focus on what actually drives performance rather than what is perceived to drive it, and in return always be one step ahead of the competition.
By applying business analytics to the balanced scorecard, the chosen measures and cause-and-effect relationships are no longer just a perception but are validated, giving management the advantage of continuously being able to adapt to a changing environment.
The ability to adapt comes from retiring measures that have become obsolete, and re-prioritising precious StratEx to initiatives that support measures with a higher correlation to the measures they are supposed to improve. The result is a fact-based approach to strategy execution.
Understanding how measures relate today, and how they are expected to relate in the future, will answer questions such as why certain measures are more relevant than others, and which areas of the business they impact. Once management has determined what actually drives performance, questions such as what it takes to improve results, how to achieve targets and where to focus can be intelligently answered.
The bottom line: It is time to put the old habit of guesswork aside and take strategy execution to the next level - are you ready?
All organizations aspire to build and maintain a strong and profitable business model that sets them apart from their peers. But very few organizations are able to achieve a level of performance that outperforms everyone in their industry. Contrary to popular perception, membership in the club of high-performing organizations is not handed out for hard work alone; it has to be earned through a carefully planned and executed strategy.
An easy-to-understand and well-communicated strategy can enhance shareholder value, it can become a catalyst for better leadership, it can help create alignment throughout the organization, and it can ultimately be the foundation for better performance. But how do you supercharge your strategy to become a high-performing organization?
Classic Strategy Formulation and Execution
One of the most common ways for boards and management teams to create or update the strategy is to take a fresh look at the organization's mission, values and vision and use these as the foundation for a strategy that they believe will take the organization in the desired direction and to the next performance level.
Once the strategy is in place, management initiates a process wherein the organization's balanced scorecard and strategy map are updated with strategic themes, objectives, measures (KPIs, PIs, RIs, etc.), performance targets and expected cause-and-effect relationships.
Additionally, strategic initiatives to support the execution are defined, and sufficient Stratex (strategic expenses) allocated. Ultimately, the strategy is cascaded down through the organization to ensure proper alignment across business units, support units and employees.
Strategic performance is subsequently tracked by comparing the performance of measures and initiative progress indicators with their targets. Should a measure such as customer satisfaction fall below target, management might launch an initiative with the aim of getting customer satisfaction back on track.
The Challenge with Classic Strategy Formulation and Execution
Conceptually, there is nothing wrong with classic strategy formulation and execution but what if there was a better way of doing it? A way that would enable organizations to better focus on what actually drives performance and understand the actual cause-and-effect relationships between various strategic measures? A way that would enable organizations to better utilize their scarce resources and get more out of their precious Stratex? A way that would assist organizations in becoming a high performing organization?
One of the most common pitfalls in strategy formulation is that organizations pay too much attention to getting all the right measures and cause-and-effect relationships in place up front. The issue is that, because management does not sufficiently understand the linkages and effects of what actually drives performance in their business, they are not able to correctly outline which measures to track and how those measures affect each other, let alone which initiatives to fund in proportion to how much they drive organizational performance.
In many organizations measures seldom change; however, as all organizations have witnessed lately, the economic environment frequently does. If the organization is not able to adapt its strategy to a changing economic or competitive environment it might be heading in the wrong direction. The result is focus on matters that do not drive performance or that add little value to the overall goal of shareholder value creation.
During the global financial crisis, the CEO of a mid-size bank told me in a meeting that "incenting our financial advisers to sell credit cards and thus tracking 'percentage of new customers to whom the bank sold a credit card' used to be a good KPI. Now that the worst credit crisis in decades is upon us, it is a bad KPI. The more credit cards we sell, the higher we drive our bad debt. Any bank or credit union can sell a credit card during a credit crisis, so why should we continue to incent our staff to collect all the high risk debt?"
As seen in the example above, what used to be a good measure (percentage of new credit card customers) can quickly become less so. Not only does the measure no longer drive performance, it erodes it; hence, the perception by management that "percentage of new customers to whom the bank sold a credit card" had a positive cause-and-effect on branch performance has turned negative.
The key question, of course, is whether the bank could in any way have been alerted to the fact that the perceived cause-and-effect relationship had turned sour.
More broadly, could boards and management teams have tools at their disposal to add intelligence to their strategy formulation and execution - so that, rather than relying on perception, they make fact-based decisions on relevant performance-driving measures and cause-and-effect relationships, and allocate Stratex to the initiatives that drive the most value?
The answer is yes: the intelligent balanced scorecard.
In part 2, we will look closer at how the intelligent balanced scorecard can support the board and executive management teams in continuously monitoring and fine-tuning a set of relevant measures that drive performance, and in understanding which initiatives to focus on to boost performance further.
I was first exposed to experimental design in marketing over 15 years ago. It made sense then and it makes sense now. Sadly and somewhat surprisingly, it seems that few organisations have embraced this approach since then. Driving better marketing return on investment is a key part of effective marketing. It’s hard to succeed without being good at optimising offers, channels, price, targeting, timing and so on. And, with budgets shrinking, it’s more than a nice-to-have - it’s a necessity.
Some things we know ahead of time. Others we don’t. For these, in-market testing is essential. Marketers are good at many things. They’re often experts at creating offers and, given the freedom to innovate, they’ll end up with several competing alternatives for marketing the same product or service. Given this degree of choice, what next? Instincts and experience may help trim the list. This can only go so far though; even after ruthless cuts, the final handful may all look attractive. Settling on the final offer structure needs more than a best guess - ideally, it should be a fact-based decision. Who wants to take a chance on a potentially costly "roll-out" without quantifying the risk first?
The only way to get that evidence is through in-market testing. This takes planning. The tests need to happen quickly and not to be too costly, suggesting the need for small trials. “Small” is a relative term though - in balancing cost with accuracy, how big should these samples be? Get it wrong and the results might be unreliable or even totally misleading! The whole point of experimental design is to solve this problem. Given how common and obvious the problem is, it’s astounding that the marketing community has been so slow in applying these techniques. It’s not like this isn’t a real problem, either. I’ll often ask, “why are you continuing with this campaign, when it obviously isn’t delivering returns?”. I usually get a blank look followed by answers such as, "Oh, don’t worry; we’ve always run that campaign since I can remember”.
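To give a feel for the numbers behind "how big should these samples be?", here is a minimal sketch (Python; the 2% baseline and 3% target response rates are hypothetical) of the standard two-proportion sample-size calculation that experimental design formalises:

```python
# Minimal sketch: respondents needed per test cell to reliably detect a
# lift in response rate (normal approximation, two-sided test).
from statistics import NormalDist

def sample_size_per_cell(p1, p2, alpha=0.05, power=0.8):
    """Sample size per cell to detect a shift from rate p1 to rate p2
    at significance level alpha with the given power."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha=0.05
    z_b = NormalDist().inv_cdf(power)          # e.g. 0.84 for power=0.8
    p_bar = (p1 + p2) / 2
    numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p1 - p2) ** 2) + 1  # round up

# Hypothetical: detect a lift from a 2% to a 3% response rate
n = sample_size_per_cell(0.02, 0.03)
print(f"Need about {n} contacts per cell")
```

Note how quickly the required sample grows as the detectable lift shrinks - that is exactly the cost-versus-accuracy balance the paragraph above describes, and why "small" trials still need to be sized deliberately.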
Comments like this show how big the opportunity is. We can all do things better, and for the many groups that get better at 'learning' and taking action, the results can drive real improvements.
Effective testing achieves two things: it helps marketers make informed decisions, and informed decisions are better decisions. The catch is that good testing takes rigour. My feeling is that there’s a great deal of talk around 'test and learn' and 'experimental design' at the moment, yet ironically little serious effort to support it. What makes things worse is that even where testing is done, it’s usually not done anywhere near rigorously enough. Sometimes it’s a lack of focus. Other times it’s because there are too few skilled practitioners in the marketing environment. Usually, though, it’s a case of blinkered vision. Most groups have had the concept of mass-marketing hammered into their culture over the last decade. Given this culture, it’s almost counter-intuitive that adopting smaller, more focused tests or campaigns can actually drive better results and more informed decision making.
The thing is, not testing offers in a controlled way can actually reduce effectiveness. Make the wrong offer and you won’t make a sale. You might even lose a customer. When debating the value of doing in-market tests, it’s important to also consider the cost of not doing the test.
Many groups seem more focused on pushing creative rather than relating to their customers. This may help win awards but it does little to drive better engagement. Good testing has an important role in driving offer relevancy. If a group is seriously interested in improving their effectiveness, they need to give testing methodologies an acknowledged place in their marketing plan, and make sure it happens by allocating budget and resource to testing.
To see where you sit, ask yourself the following questions:
- When faced with campaign choices, how do you know whether they will 'fly' with your customers or not?
- How do you measure your 'customer-centricity'?
- How do you validate your beliefs or judgements about what will or won’t work?
- How do you determine the cost-effectiveness of a campaign prior to launch?
- How much risk are you carrying by going straight from ideation to execution?
Those that ignore testing should consider this: testing is the means by which an intelligent organisation acquires knowledge about customer behaviour. And this, after all, is at the heart of a customer-centric approach to marketing.
Relevancy requires timeliness and personalisation. Simple mass-marketing doesn’t work in a world defined by real-time engagement and “markets of one”. The best of the best have the ability to monitor results in continuous time, to make confident decisions on how to drive real results, and to squeeze every last drop of value out of every tweet they send.
Question: Where do you sit with respect to adopting a rigorous testing approach to support decision making?
Over the past month or so on this blog, we have been talking about big data and high performance analytics (HPA) across the Asia Pacific region. We explored the information management arena, looking into the mindset of today’s CIO, and explaining the reality of how local organisations are leveraging high performance analytics to drive big data value. Organisations across the region are looking for ways to gain new insights from big data in shorter reporting windows, for example, getting to the relevant data more quickly to reveal previously unseen patterns, sentiments and relationships. Delivering that information in real time and improving the time to insights places immense pressure on the office of the CIO. Fortunately, this role is evolving into that of a more strategic business leader with renewed focus and capabilities.
Here is a recap of the HPA coverage:
David v Goliath: high performance analytics levels the big data playing field: How big does big data need to be before it is valuable? A look at where the real value is in high performance analytics.
Better analytics equals better decisions equals better business: It is often said that analytics helps with making better, more accurate and timely decisions. Greg takes us back to the basics of the individual and group decision making process.
Business Analytics is about change. Do something: Three easy steps you can take to drive real change with business analytics.
What do your online conversations really say? Text analytics tells the story: If a picture tells a thousand words, do a thousand words give us a picture? Millions of words are communicated in text conversation every day.
The value of high-performance analytics: The intersection of big data and high performance analytics creates competitive advantage. Evan explores the value in big data and the opportunity cost of ignoring it.
High-performance analytics. So what?: High performance analytics - what is it really about? David Hardoon works to disambiguate ongoing terminology and give focus to the really important objectives that would justify such technology.
Big Data Seats CIOs at the Boardroom Table: The specific big data goals and challenges facing CIOs as they move from a mindset of keeping the lights on to that of a strategic seat at the boardroom table, including the top five challenges facing CIOs in trying to deliver to these business goals.
Culture, the real roadblock to unlocking big data value: SAS Asia Pacific's Vincent Cotte talks about the four recurring trends around culture and the roadblocks to big data value, sharing insights on how organisations need to adopt the right culture in order to drive big data value.
Big data in little Byron: Big data is even in small Australian towns - Natalie describes how the three Vs of big data (velocity, variety and volume) impact us in everyday life, even when we're on holidays!
How analytics is unlocking the power of big data in India: Jaydeep discusses how organisations in India are leveraging analytics on big data to become high performance organisations. We look at examples of how analytics is impacting everything from manufacturing to stock exchanges to telcos.
Three keys for analytics that drive outcomes: Evan discusses the three keys to analytics driving business outcomes: everything you do should be linked to a measurable and valuable outcome; develop an understanding and awareness of the business; and there's room for generalists and specialists.
Solving Business Problems with Dogfights and OODA Loops: The OODA loop concept demonstrates that when military thinking is applied to the decision making process, especially when combined with highly capable people and robust technology, you get sustainable competitive advantage.
Next we’ll be focussing on customer management and intelligence - we will look at how innovators like Globe Telecom from the Philippines and Telstra in Australia are adopting high performance analytics to drive profitable customer experience. Besides some fascinating case studies from across Asia Pacific, we will also take a detailed look at the challenges and explore the skills, techniques and capabilities that can help you drive competitive advantage.
To receive updates automatically, use the Subscribe feature on the right side of the window.
Today was a career highlight for me - while at the SAS Global Forum Executive Conference in Orlando, not only did I get to hear Guy Kawasaki talk about the Art of Enchantment, I got to meet him afterwards and record a short segment from him.
Guy (I can call him that now that we've talked the talk) has an engaging presentation style with ten steps to achieving this enchantment. You can read about them in detail in this post - Guy Kawasaki on enchantment for achieving influence by John Balla. I also wanted to share Guy's golden rule of presentations:
30-point font
What valuable information - next time you are preparing a presentation, think about this and more importantly, follow the rules.
My personal favourite point from Guy's talk was the DICEE acronym, which encapsulates the third pillar, quality - have something good, be it a product or a service. This resonated with me because when I think about ways to tell the SAS story, what we do and how we work with our customers to achieve success, I can see the parallels that Guy draws to illustrate his point. DICEE stands for:
D - Deep
I - Intelligent
C - Complete
E - Empowering
E - Elegant
Guy talks more about this in our interview - enjoy!
Having the time to catch up with what's happening on the marketing front is a real privilege. While I'm giving a presentation later this afternoon on the need for real-time engagement, what I really love about the Marketing Association's Smarter Data conference is listening to some real visionaries.
Getting better doesn't need to be hard. It just means doing things ... better. Campbell Brown should know, too - he's the Marketing Director at GrabOne, a company that's grown from 6 staff to 120 since July 2010 and achieved over 70% market share. Not a bad trajectory!
Daily deals aren't new - back in Australia, we have Catch of the Day. What is new is how effectively GrabOne has managed to not only corner their market but to start expanding into a variety of verticals. Critically though, they didn't get there overnight - they got there by focusing on the foundations. Return on investment. Customer engagement. And, testing and learning what works.
A key message of Campbell's presentation was that everything should be driven by return. Regardless of whether you're trying to engage via social media, electronic direct mail, or online, if you don't know your return from each of these channels, you're lost at sea. This is even more important given how quickly the social media landscape continues to change; paraphrasing Campbell, "adapt or die".
So what does Campbell credit for his success? Some simple lessons:
- Have a robust ability to do A/B testing
- Don't overcomplicate your segmentation
- Don't underestimate the importance of change management
- Make ROI transparent - don't rely on "branding campaigns"
Everything GrabOne does is based on exhaustive A/B testing. They know what works and, more importantly, they drive to know why. That's a critical element in driving outcomes and yet, it's so rare.
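As an illustration of what "knowing what works" means in practice (hypothetical numbers, not GrabOne's), the core of an A/B readout is a two-proportion z-test on the conversion rates of the two variants:

```python
# Minimal sketch: did variant B really beat variant A, or was it noise?
# All counts below are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for the difference
    in conversion rates between two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Variant A: 120 sales from 5,000 emails; variant B: 160 from 5,000
z, p = two_proportion_z(120, 5000, 160, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A small p-value (conventionally below 0.05) is what separates "B won" from "B happened to look better this week" - the difference between a fact-based roll-out decision and a guess.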
Given GrabOne's success, there's an interesting question to ask: How well do you understand the return on investment you get from your marketing spend?