Category: Analytics

5 Steps to Transform your Sales

Today’s business environment is complex, uncertain and fast-moving, and efficiently targeting growth remains a challenge. Salespeople certainly gain valuable instincts and skills from experience, but the right data insights at the right time can make all the difference. Transform sales by organising and segmenting your outlets so that the best opportunities are prioritised: you save time while achieving a higher success rate.

That’s all well and good, but where do you even begin the process of sales transformation?

How To Transform Sales:

1. Data


It always starts with the data. Attaining a good Market Universe data set means better targeting within that universe. Here you have a few options:

  • Buy a data set – these are usually estimates, with low coverage and little detail.

  • Conduct a census – very expensive and not easily repeatable.

  • Use new dynamic data methods (web and big data) – sounds complex, with a reputation for high cost.

    Finding the right data and using it efficiently is not a one-off problem, so we at Sales Align advocate working with meaningful technologies, such as those under the Big Data umbrella, in a purposeful way that gives repeatable results. There are challenges to navigate with these new data methods, from data harvesting and data-set size to enrichment and other preparation issues. So, whether it’s an in-house team or another provider, you do need to work with people who have expertise.

    Whichever of the three options you choose, understand that getting the right data is the first step to transforming sales.

    2. Analysis for Prioritisation


    So now that you have the right data how do you assess your coverage and plan your priorities? What are your key data drivers that help you decide your focus? For example, setting keywords that easily describe the outlet type can help you choose which to target and when. If you have social media data, you can use this to create popularity indicators. Area density could also be a key metric in decision-making.
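As a sketch of how such data drivers might be combined, here is a minimal scoring example in Python. The field names (`keywords`, `social_mentions`, `area_density`) and the weights are purely illustrative assumptions, not a real schema or a Sales Align method.

```python
# Illustrative only: combine keyword matches, a social-media popularity
# indicator and area density into a single priority score per outlet.

TARGET_KEYWORDS = {"bar", "rooftop", "garden"}  # hypothetical keywords

def priority_score(outlet):
    """Weight simple data drivers into one number for ranking."""
    keyword_hits = len(TARGET_KEYWORDS & set(outlet["keywords"]))
    popularity = outlet["social_mentions"] / 100.0  # crude popularity indicator
    density = outlet["area_density"]                # e.g. outlets per km^2
    return 3 * keyword_hits + popularity + density

outlets = [
    {"name": "Cafe A", "keywords": ["garden", "cheap"],
     "social_mentions": 250, "area_density": 1.2},
    {"name": "Bar B", "keywords": ["bar", "rooftop"],
     "social_mentions": 40, "area_density": 3.5},
]

ranked = sorted(outlets, key=priority_score, reverse=True)
```

In practice the weights would be tuned against actual sales outcomes; the point is simply that once the drivers are in data form, prioritisation becomes a repeatable calculation rather than guesswork.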

    With the first two data options (data set or census) you are a bit more limited, as the data is static. This means you can’t see how the data changes across time periods, such as when an outlet grows in popularity on social media. When you know the data is up-to-date and more accurate, you can be more confident acting on the analysis.

    You can do many things with your analysis, whether that’s keeping it offline in spreadsheets (and silos 🙂 ) or in a tool that visualises the data and results. You should ideally find a good method for all stakeholders to interact with the data rather than just leaving it in the hands of the analysts.

    3. Optimise Your Focus with Segmentation


    Custom targeting and segmentation exercises using your data can take your sales prioritisation one step further. You’re probably already engaging in segmentation to some degree, but by using your data to define certain “types”, “segments” and “social media ranks” you can create target outlet lists that fit certain profiles. Additionally, if you want to get sophisticated, start grouping your previously set keywords into themes, e.g. “cheap”, “high-end”, “garden”, and create specific campaigns that align with marketing needs.
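To illustrate the keywords-into-themes idea, here is a small Python sketch; the theme names and keyword lists are invented for the example.

```python
# Illustrative only: map outlet keywords to campaign themes and build
# a target list per theme.

themes = {
    "value":    {"cheap", "budget", "happy-hour"},
    "high-end": {"premium", "cocktail", "fine-dining"},
    "outdoor":  {"garden", "terrace", "rooftop"},
}

def segments_for(outlet_keywords):
    """Return the sorted list of themes an outlet's keywords fall into."""
    kws = set(outlet_keywords)
    return sorted(theme for theme, words in themes.items() if kws & words)

outlet_keywords = {
    "Cafe A": ["garden", "cheap"],
    "Bar B":  ["premium", "rooftop"],
}

target_lists = {}
for name, kws in outlet_keywords.items():
    for theme in segments_for(kws):
        target_lists.setdefault(theme, []).append(name)
```

Each theme then yields a ready-made outlet list for a matching marketing campaign.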

    To truly transform sales, even with all this great insight, it’s important to ensure you have a good process set up for sharing your lists and other information with your team so you can set them into action! Also think about how you monitor the progress of working through your targeted lists and segments.

    4. Keeping Up-to-Date – Look Out For Outlet Churn


    Becoming more aware of outlet churn can quickly make a positive impact on your sales performance. With accurate location information from your data, and frequent updates that inform you of closed and new outlets, you can manage your time more efficiently. Keep an eye on the evolution of churn in different segments, for example when a particular type of outlet grows in popularity on social media. Considering outlets typically churn at 4-8% per annum, this could save you a lot of time and costs.
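The back-of-envelope arithmetic is simple. With a hypothetical universe of 5,000 outlets and the 4-8% churn range above, a few hundred outlet records go stale every year:

```python
# Illustrative only: expected annual closures at a given churn rate.

def expected_closures(n_outlets, annual_churn_rate):
    """Outlets expected to close in a year at the given churn rate."""
    return n_outlets * annual_churn_rate

low = expected_closures(5000, 0.04)   # low end of the 4-8% range
high = expected_closures(5000, 0.08)  # high end of the range
```

Every one of those is a wasted visit if your data isn’t refreshed.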

    5. Actionable Strategy


    The way you combine your experience with certain data-points is crucial in defining your strategic plan. Whether that’s planning how to grow by segment, targeting occasions or optimising call lists, it’s still the people who put this into action. Review the way you connect your strategy, your data and your people.

    With better use of good-quality data in your strategy, and in the way your team interacts with it, you can transform sales. Focusing on the outlets that matter means saving time and costs. Not to mention higher potential sales!

    This article is only an outline of the first steps towards transforming your sales with modern data methods to optimise your sales effectiveness. We have often taken these steps together with our clients, which has given us experience and a curated method. We are happy to share our successes, and if you wish to see how our tool Sales Align comfortably enables all the above, please click here.

    Posted on February 12, 2018 by Danielle Mosimann

    Analytics, Decision Making & Wine

    As our society and economy has evolved, we’ve become accustomed to having an abundance of options in just about any decision we must make.  However, it’s the excessive alternatives we are constantly confronted with that often complicate and delay decision making in our personal and professional lives.  For example, I went out to dinner the other night and wanted to have a glass of wine with my meal.  The waiter handed me a book an inch and a half thick containing their vast array of wine selections. Instead of wading through the pages, I quickly came up with a set of criteria to help me focus and determine my selection.


    To start, white and rosé wines were immediately eliminated. I only drink white wine if I’m eating fish. Since I knew that I wasn’t going to order fish, it was simple for me to eliminate the whites (I ordered a pasta appetizer and beef entrée in case anyone is interested). Rosé isn’t really my thing unless I’m at an outdoor party in the summer and it’s mixed with fruit (à la homemade sangria).

    I then narrowed my selection according to the type of taste and texture I wanted to experience. On this particular night I was in the mood for a smooth, even-balanced, medium-bodied, but not too fruity taste. This criterion narrowed my quest to the great varietals of Pinot Noir and Chianti. Because I had ordered a pasta-based appetizer, my search led me to select a glass of Chianti (this also went great with the wood-fired Tuscan-style bread and homemade olive oil).

    Finally, I assessed the value and cost (this is often where most people start every decision, particularly in business). I selected a $13 glass which was about middle of the road for the Chianti price point range.  Boom! I just solved my wine selection problem in less than a minute using simple qualitative analytics and all I had to do was establish a core set of criteria that fit my personal needs.
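The wine decision above is really just sequential filtering over a set of criteria. A toy Python version (with an invented wine list) makes the structure explicit:

```python
# Illustrative only: a tiny criteria-based shortlist, mirroring the
# wine-selection story. The wines and attributes are invented.

wines = [
    {"name": "House Rose", "colour": "rose",  "body": "light",  "price": 9},
    {"name": "Pinot Noir", "colour": "red",   "body": "medium", "price": 14},
    {"name": "Chianti",    "colour": "red",   "body": "medium", "price": 13},
    {"name": "Chardonnay", "colour": "white", "body": "medium", "price": 11},
]

criteria = [
    lambda w: w["colour"] == "red",    # no fish ordered, so no white (or rose)
    lambda w: w["body"] == "medium",   # smooth and medium-bodied
    lambda w: 10 <= w["price"] <= 16,  # middle-of-the-road price point
]

shortlist = [w for w in wines if all(check(w) for check in criteria)]
```

Each criterion prunes the options, and what survives is a shortlist small enough to decide on in seconds.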

    Businesses should approach decision making in a similar fashion. By establishing a list of factors that matter to your organization today and that will also matter in the future, it will allow you to differentiate yourself amongst competitors and result in continuous growth.  Begin collecting data surrounding these factors, constantly evaluate the outcomes of your decisions and modify/tweak your approaches.  Let’s put this into some context.

    Say for instance you’re an executive at a multinational manufacturer and part of your strategy is to strive for continual efficiency through operations.  You may decide to invest in multiple Business Intelligence (BI) tools in order to meet this strategic initiative. The question then becomes who, where, and how should your dollars be invested to maximize the greatest return? Again, an abundance of alternatives exist.

    In order to solve this problem, the organization may decide to embark on creating a BI roadmap and assess factors that will determine the analytical capabilities of the current operation and where they should go in the future. For instance, the manufacturer may want to assess the availability and timeliness of information: is it delivered to the users when required in order to do an effective job? Drilling down further, you may then assess the information’s relevancy. Does what I receive even matter in the context of my operating unit? If not, the typical follow-up questions become why I continue to receive such information and what solutions exist to resolve the issue. Asking simple yes/no questions such as “does the current technology allow me to view information in real time?” can be just as insightful, particularly for a manufacturing production facility.
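Such an assessment can even be prototyped as data. The questions below are examples only, but they show how a handful of yes/no answers immediately surfaces the gaps:

```python
# Illustrative only: score a simple yes/no BI capability assessment.

assessment = {
    "Is information delivered when users need it?": True,
    "Is the information relevant to my operating unit?": False,
    "Can I view information in real time?": False,
}

score = sum(assessment.values()) / len(assessment)  # share of "yes" answers
gaps = [question for question, ok in assessment.items() if not ok]
```

The `gaps` list is, in effect, the first draft of the BI roadmap.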

    Decision-making doesn’t have to be challenging or scary. If you take the time to set up a repeatable model that fits your needs, subject to regular evaluation and refinement, you can begin to solve simple issues (i.e. what am I going to eat for dinner tonight?) or complex ones (i.e. what new markets should we be competing in during the next 1, 3, or 5 years?) with greater speed and accuracy.

    So, now that you’ve decided that analytical decision making is vital to your personal and professional success, let’s toast over a glass of wine (red preferably)!

    Author: Gabe Tribuiani 

    Posted on September 4, 2014 by Danielle Mosimann

    Analytics, Big Data and BI or: How I Learned To Stop Worrying And Love The Cricket

    One of the challenges of working for a company like AlignAlytics is explaining exactly what it is that one does all day. Nothing scares off a new potential friend quicker than phrases such as ‘data-driven strategy and insight’, accompanied by some vague hand waving, especially if said hand waving usually sends drinks flying. Typically, after several failed attempts at explaining the concepts of customer segmentation and advanced analytics, the standard fallback response is that we spend our days doing reporting & analysis before moving the conversation on to more interesting topics, such as Justin Bieber turning up really late for gigs or the on/off relationship between those two miserable leads from the Twilight movies.

    However, while analytics might appear a foreign concept to many people, the truth of the matter is that it has been part of people’s lives for a long time, even if they didn’t know about it. One particular area of modern life in which analytics is widely used is in sport, specifically in its coverage on television and via digital media. It’s also here that concepts such as big data can be most easily comprehended and explained.

    One particular sport close to the heart of this particular author is cricket, a sport built entirely on large numbers of discrete data points. Every single time a ball is bowled, a huge number of different pieces of information are collected – how fast was the delivery? Where did it land? What shot did the batsman play? Which Australian did it dismiss? This is repeated for every ball bowled in every day of (almost) every match around the world. Before you know it we have a genuine example of this mythical big data that everyone has been talking about.
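As a rough illustration of why this counts as big data, consider the record captured per delivery. The field names here are invented, not a real ball-tracking schema:

```python
# Illustrative only: one record per ball bowled; real systems capture
# far more fields than this.
from dataclasses import dataclass

@dataclass
class Delivery:
    speed_kph: float   # how fast was the delivery?
    pitch_x_m: float   # where it landed, across the pitch
    pitch_y_m: float   # where it landed, down the pitch
    shot: str          # what shot did the batsman play?
    dismissal: bool    # did it take a wicket?

ball = Delivery(speed_kph=140.3, pitch_x_m=0.12, pitch_y_m=6.8,
                shot="pull", dismissal=False)
```

Multiply one such record by the roughly 2,700 deliveries in a full five-day Test, then by every match played worldwide, and the volumes add up quickly.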

    Of course, collecting data for data’s sake can be its own reward – apropos of nothing, nothing impresses a crowd like owning the entire set of classic Doctor Who DVDs – but it’s the interpretation of all this data that is really the key. Hence the proliferation of visual methods on TV and the internet to help commentators or writers provide insight and clarity, such as this example, a pitch map for a specific bowler:

    Stuart Broad Pitch Map

    Image Supplied by © Hawk-Eye Innovations

    Suddenly, and without really thinking about it, we have analytics. And not just that, analytics based on big data. In order to get to those analytics we’ve used specific software to turn our data into something that we can visually comprehend and interpret. And that’s Business Intelligence (BI) software explained at the same time.

    Of course, cricket isn’t the only sport to use these concepts. Football (or soccer as it’s occasionally known in the colonies) is a more recent convert to the idea of big data, albeit in a much more ‘closed shop’ way. The likes of Opta and Prozone provide enormous amounts of data around every single football match in the Premier League (and beyond), with every single pass, shot and run recorded in frightening detail. This data is generally not made available to the public, instead being closely guarded behind closed doors by those football clubs that use it (and largely ignored by those that don’t).

    Recently however, Manchester City made large amounts of this data available, encouraging members of the public to do their own analysis and trying to create an ‘analytics community’ in which ideas could be shared. Whilst it’s possible to argue about their motives for this – why pay for an analytics team when hardcore fans will do it all for free and then you can steal their ideas? – it’s clear evidence of the growing significance of analytics (and big data) across different areas of everyday life.

    To conclude, perhaps the best way to explain what one does all day is to talk about cricket and its approach to big data, analytics and BI. And then, after several hours of explaining the intricacies, such as the difference between the flipper and the topspinner, casually point out that AlignAlytics generally applies these concepts to the marginally less exciting worlds of consumer goods and utilities. We say generally because everyone needs a hobby for their free time, such as tracking Stuart Broad’s Test career over time:

    Bowling average vs Batting Average

    Author: Ashley Michael

    Posted on March 25, 2014 by Danielle Mosimann

    Outlier Reporting and Benefits from Unit Testing in R

    A recent AlignAlytics analysis project, reliant on Big Data processing and storage, required complex outlier reporting using the R statistical-programming language. This open-source software, combined with in-house statistical skills, allowed the team to quickly produce reports that are now the foundation of an on-going strategic analysis programme.

    Unit testing is one part of this story and we hope Peter Rosenmai can continue to share more with us.

    Getting started with unit testing in R

    Unit testing is an essential means of creating robust code. The basic idea is simple: You write tests that the functions you code are required to fulfil; whenever you thereafter make changes to your code, you can run the tests to ensure that your functions all still work.

    Such future-proofing is obviously useful, but unit testing brings other benefits. It forces you to break your code down into discrete, testable units. And the tests provide excellent examples of how your functions should be called. That can be really useful, especially when code commenting is shoddy or out of date.

    Here’s an example of unit testing in the R statistical-programming language using the RUnit package. We have a file main.r in our current working directory. That file contains main(), our top-level function:

     # main.r

     library(RUnit)           # Load in the unit testing package

     source("string-utils.r") # Load in source files

     # Function to run all unit tests (functions named test.*) in all
     # R files in the current working directory
     runUnitTests <- function(){
        cat("Running all unit tests (being functions that begin with 'test.')",
            "in all R files in the current working directory.\n")
        tests <- defineTestSuite("Tests", dirs=getwd(),
                                 testFileRegexp = "^.*\\.[rR]$",
                                 testFuncRegexp = "^test\\..+")
        test.results <- runTestSuite(tests)
        cat(paste0(test.results$Tests$nTestFunc,    " test(s) run.\n",
                   test.results$Tests$nDeactivated, " test(s) deactivated.\n",
                   test.results$Tests$nFail,        " test(s) failed.\n",
                   test.results$Tests$nErr,         " error(s) reported.\n"))
        if((test.results$Tests$nFail > 0) || (test.results$Tests$nErr > 0)){
           stop("Execution halted following unit testing. Fix the above problem(s)!")
        }
     }

     main <- function(run.unit.tests=TRUE){
        if (run.unit.tests) runUnitTests()
        # Your code here...
     }

    The above code loads in from our current working directory the file string-utils.r:

     # string-utils.r

     library(RUnit)  # Load the unit testing package

     # Function to trim leading and trailing whitespace from a string
     trim <- function(str){
        if (class(str) != "character"){
           stop(paste("trim passed a non-string:", str))
        }
        return(gsub("^\\s+|\\s+$", "", str))
     }

     # Unit tests for trim()
     test.trim <- function(){
        checkTrue(trim("  abc ") == "abc")
        checkTrue(trim("a b ")   == "a b")
        checkTrue(trim(" a b")   == "a b")
        checkTrue(trim("")       == "")
        checkException(trim(3), silent=TRUE)
     }

    We run our top-level function using:

     rm(list=ls(all=TRUE)); source("main.r", echo=FALSE); main()

    That line removes all variables from the workspace, creates the functions in the above blocks of code and calls main(). The first thing main() does is call runUnitTests() to run all functions with names that start with "test." in all R files in the current working directory. Those are our unit tests.

    For example, one of those unit test functions is test.trim(), the function shown above that checks that trim() is working as it should. Note how test.trim() not only checks expected return values but makes sure that trim() throws exceptions when it should. And what does trim() do? The examples in the test code should make it clear—which is why I like to keep the unit tests together with the functions that they test.

    The above is the briefest of introductions to a huge topic. I could say a lot more about, for instance, test-driven development, refactoring and code coverage. But my aim here is not that ambitious. If you’re an analyst or a statistician, chances are you haven’t previously heard of unit testing. If that’s the case, I merely wish to suggest that you give the above a try the next time you find yourself coding in R. Unit testing really is worth the effort.

    Author: Peter Rosenmai

    Posted on January 19, 2014 by Danielle Mosimann

    Is BI Software A Big Game?

    Tableau Public, Mark Types & the Classic Amiga/PC/Console Management Title ‘Theme Park’

    After seeing a demo of some of the latest BI (Business Intelligence) software and how it uses mapping or geocoding tools, I was reminded of some of the computer games I grew up playing.

    I’ve recently seen demos of various BI tools that utilise various mapping extensions. The results are really impressive and fantastic at highlighting interesting areas of your business related to a geographical area. However, as I sat through these I began thinking I’d seen this sort of thing before, probably about 15 years ago. They all remind me of the various computer games that have been around for ages – the specific area was top-down strategy, if I remember correctly. Games such as SimCity and Theme Park jump out at me as being of a similar style. You’d have a map of your area of interest – whether it was building a city, running a theme park or conquering the world – you’d deal with scarce resources, make decisions based on these restrictions and await the outcomes to react to. Sound familiar? Maybe this isn’t a surprising observation. The people developing BI tools are from a generation that grew up playing these games – actually many are probably from a generation that plays a more modern equivalent, but I’m sure the point holds. Many managers who use BI tools also grew up using the interfaces of top-down strategy games, so does this mean modern BI software development has been driven by some rather old computer games?

    At AlignAlytics we actually run a workshop program called ‘The Game’ in conjunction with COGNOS. In simple terms, it uses the COGNOS BI platform to give you access to a fictional company’s performance data and the markets it operates in. Utilising its dashboards, you learn about the products the company sells, how the sales are spread geographically and what sort of resources you have available. You then create a strategy, make choices about resource allocation and see how your decisions pan out. You can then forge on with your original strategy or react to the outcomes of your original choices. As I went through the workshop I was reminded of another classic computer game series – Championship Manager. Here you play a football manager who has limited money. You think about the style you want your team to play (your strategy), buy your players, set your tactics and play your match. After this you can continue to tweak tactics or stay consistent with your original plan. Again you see similar themes and interfaces between BI tools and popular computer games.

    None of these comparisons are derogatory to BI software. The two are essentially doing the same thing – trying to give you access to as much information as possible in the simplest, most presentable way. The complication for BI software is that the underlying datasets are often much more complex, so much more attention is paid to the back-end crunching of data than to the front-end interfaces. Recently, however, BI software does seem to have made a leap forward in the standard of front-end dashboards, suggesting that companies now see these polished interfaces as an important way of driving effective decision-making, alongside the back-end tool kit. In the 1994 computer game ‘Theme Park’ the gamer had to build various rides on their empty land and then hire staff, attract customers and run a profitable park. You’d see your various workers walking to sites to fix or clean them, and customers would come and go based on the quality of the park.

    Perhaps this 18-year-old game will give some clues as to where BI software is going. Could management be looking at real-time views of sales reps moving around city maps trying to get to client sites before competitor reps!? As this happens, could analysts already be trying to tweak prices and bundle products into contracts to win the deal!? Would customers be seen leaving in droves because of poor customer service!? All this would make business management sound quite fun, with perhaps the main caveat being that the concept of having 3 lives might not transfer as easily from the game world to the real world, or would it?

    Author: Gus Urquhart

    Posted on August 12, 2013 by Danielle Mosimann