Thursday, December 1, 2011

Hosted Analytics

Hosted IT solutions, or cloud computing, is a technology trend that enables businesses to offload much of their IT infrastructure to a remote location managed by a third party.  Hosted services provide a variety of solutions: databases, file servers, applications and even computing power.

One service that still seems to remain local, however, is analytics.  For the most part, businesses are conducting their analytics in house and keeping the results somewhat centralized.  While business intelligence (BI) and analytics technologies exist that allow businesses to distribute analytics via the web and mobile devices, the price tag on these tools is quite high, making them prohibitive for small to medium-sized businesses to own.

An emerging service to host analytics is starting to gain traction, however.  This is the notion that a third party would host reports, models, dashboards and analytics on web and mobile enabled platforms.  Since development of analytics requires a specialized skill set combining mathematics, IT, economics, business and statistics, these hosted solutions also often come packaged with business analytics consulting services that help transform business needs and overwhelmingly confusing data into insights and actions.  The full solution, therefore, is a service that enables an organization to understand the baseline of their current state, to set goals for their business through predictive and optimization models, and to monitor against these goals on an on-going basis.  All of this is delivered and consumed through web enabled tools.

While this notion, practice and consumption of analytics is not new, the offering of hosted solutions now allows small to medium-sized businesses to get in on the analytics game that was previously enjoyed mostly by larger organizations - all for a fraction of the price that would be required to make hardware and software purchases in the traditional self-hosting model.

The benefit of all this is that hosted analytics tend to be agnostic to enterprise IT platforms.  What this means is that a hosted analytics solution can tap into your ERP, CRM, HRIS, legacy and home-grown systems (whether they are hosted locally or remotely) and gain integrated insights across these disparate platforms.  While some of these platforms do offer analytics modules or components, they often lack the ability to integrate with other systems to gain those cross-platform insights.  Additionally, these platforms are often competing technologies and don't "play nice" with each other when it comes to data integration.  This is why a hosted, independent analytics platform is so important.

While the concept of hosted IT has been around for a number of years now, the notion of hosted analytics is just now being enjoyed by early adopters.  When it catches on with the mainstream like other hosted solutions, it just may change the way companies do business.

Tuesday, November 1, 2011

Small Data and Analytics

We hear a lot these days about "big data":  petabyte sized databases, millions of records, in-memory data processing.  I was at a conference recently where the speaker discussed his 70 BILLION record database!  Several large organizations that deal with data at these volumes have successfully leveraged their "big data" with analytics to better manage their business.

But what about the vast number of businesses that are NOT part of the Fortune 500?  Those that are not capturing massive volumes of data as part of their core business?  Those that have small...data?  Mere gigabytes, perhaps.  Can analytics still benefit them?  The answer is: yes.

One key to leveraging small data for business success is to integrate data from disparate business systems: Google Analytics, SalesForce.com, SAP, legacy databases, etc.  These are the systems that, for many businesses, house their small data and its potential for business insights.

Integration need not happen within the systems themselves, but rather with the data that they generate, or expel, for the purpose of analytics.  Many business intelligence, reporting and analytics tools allow for dynamic and virtual integration of disparate systems, at the point of analysis.  This eliminates the need to monkey with the architecture of the systems themselves, and the need to develop ETL processes, data warehouses and data marts.  The best part of "small data" is that computing power exceeds data volume demands, allowing integrated analytical data sets to be generated dynamically. 
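As a minimal sketch of this point-of-analysis integration, the snippet below virtually joins daily extracts from two hypothetical systems on a shared date key - no ETL, no data warehouse, and no changes to the source systems themselves (the system names and figures are purely illustrative):

```python
from collections import defaultdict

# Hypothetical daily extracts from two disparate systems, keyed by date.
# In practice these might come from a Google Analytics export and a
# SalesForce.com report -- the source systems themselves are untouched.
web_visits = {"2011-11-01": 120, "2011-11-02": 95, "2011-11-03": 140}
crm_sales = {"2011-11-01": 4, "2011-11-02": 3, "2011-11-03": 6}

def integrate_at_analysis(*sources):
    """Virtually join daily extracts on their common date key."""
    joined = defaultdict(dict)
    for name, data in sources:
        for date, value in data.items():
            joined[date][name] = value
    return dict(joined)

analytic_set = integrate_at_analysis(("visits", web_visits), ("sales", crm_sales))
```

The integrated data set exists only at the point of analysis; refresh it by re-running the join against current extracts.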

If performance does become an issue, some business intelligence tools will create stand-alone "extracts" that are mini data marts specific to the analytics at hand.  These extracts are a consequence of the analysis and do not need to be separately designed or optimized.  In addition, scheduled refreshes can be performed such that data stay current and live.

But what of these tools, systems and databases?  Will insights come simply by cross-system integration?  Well...maybe, but probably not.  Unlike big data environments, where analysts can go swimming in the data and come upon insights by thrashing about, small data requires a bit more finesse (not that big data analysts don't have finesse).  Given the relatively small size of "small data", and the somewhat complex nature of how any business defines, measures, characterizes and organizes itself, the number of different ways to combine and view the data begins to dwarf the actual amount of data available - a constraint statisticians call "degrees of freedom".

As such, it is imperative when working with small data to begin with clear and concise business goals and objectives (see previous blog post: Micro-goals).  This focus will help narrow down the perspective applied to small data and will preserve the degrees of freedom necessary for valid, significant and insightful analysis.
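A quick back-of-the-envelope illustration of how degrees of freedom evaporate with small data (the figures below are hypothetical):

```python
# With "small data", residual degrees of freedom (observations minus
# estimated parameters) vanish quickly as more factors are examined.
n_observations = 36  # e.g. three years of monthly data
residual_df = {p: n_observations - (p + 1)  # +1 for the intercept
               for p in (2, 10, 30)}
# Two factors leave 33 degrees of freedom; thirty leave only 5 --
# far too few for valid, significant analysis.
```

Narrowing the business question keeps the factor count small, which is exactly what preserves those degrees of freedom.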

While "big data" does seem to be getting a lot of attention, it is the collection of vast "small data" insights that will will propel change in our businesses and economy.  Let's get started!

Thursday, October 6, 2011

Micro-goals and Rapid Analytics

Let's face it, the future of our economy is uncertain.  With the wild swings in the stock market over the last few months, and the on-going turmoil in Europe, long-range, rigid business planning is not only nearly impossible, it is inadvisable.  Instead, many businesses are in "wait and see" mode, putting them in a position to be reactive to what the market brings.

There is a middle ground, however: micro-goals.  Micro-goals represent short-term targets within the context of a longer-term, but fluid, organizational vision.  For example, if the long-range organizational vision is to double annual revenue over the next five years, the micro-goal would be to grow revenue by 3.5% in the next quarter (3.5% is the quarterly growth necessary to double revenue in five years).  The feasibility of this short-term target is confirmed by baseline analysis of past results, executed with the help of analytic decision models and tracked real-time by business dashboards.  If, at the end of the quarter, the micro-goal is not met - or better, if it is exceeded - then the fluid organizational vision may shift to reflect the latest reality, with analytic decision models revisited to discover what they missed.
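The required quarterly growth is simple compound-growth arithmetic to verify:

```python
# Quarterly growth rate r needed to double revenue over five years
# (20 quarters): solve (1 + r)**20 = 2 for r.
quarters = 5 * 4
required_rate = 2 ** (1 / quarters) - 1  # ~0.035, i.e. about 3.5% per quarter
```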

The methodology for applying micro-goals is to:
  1. Define a flexible and fluid vision and roadmap for the future
  2. Develop a process map and conduct baseline analytics of current practices
  3. Re-engineer the process for short-term improvement, and develop analytic decision models to evaluate the likely impact of changes
  4. Execute the improved process and monitor progress real-time, against decision model benchmarks
  5. Revisit and improve process and decision models, adjust the long-term vision and elevate performance
The key tool that supports this business practice is "rapid analytics".  While the long-term fluid vision draws on the professional judgment of the business owner or manager, it is business analytics that supports the actual execution of micro-goals driving towards this vision.  Traditional "big data", enterprise business intelligence (BI) led by IT will not cut it, however.  While IT plays a crucial role in providing the infrastructure to manage and store the data necessary to feed business analytics, they are not in the best position to deliver "rapid analytics".  Instead, this should be conceived of and generated by the business user and analytics experts - in direct response to their micro-goals.  Like the analytics that they produce, the technology to accomplish this must also be agile, easily deployed, programmable and usable by business users.  These technologies exist today.

Micro-goals do not serve a specific business function, but instead require support from a number of often independent business functions.  As such, data and analytics that support micro-goals must also cut across disparate data systems that tend to exist in stovepipes like their respective business functions.  In order to keep pace with the rapid cycles of micro-goals, data feeding the "rapid analytics" must also be agile, lightweight and flexible.  The best way to accomplish this is not through large data warehouses or data marts, but instead through virtual data stores that dynamically integrate data sources and apply to very specific micro-goals.

This methodology, which leverages business strategy, analytics and technology, keeps the benefit of a long-term, big-picture vision without being blindly committed to unrealistic and unpredictable long-term plans.  Additionally, focusing efforts on the execution of short-term micro-goals makes for a more flexible, agile yet intentional organization that is constantly learning and evolving.

Saturday, September 24, 2011

Drivers of Finite Promotional Periods

Marketing campaigns are a tool used to promote products and services. Campaigns can use various tactics such as email, social media, ads, webinars, direct mail, etc. Marketing campaigns could also be part of an on-going promotion of a brand, or a finite promotion leading up to some discrete event. This article focuses on a scenario when the campaign is directed at a discrete event - in this case a charity running race. The obvious trait of this environment, since the campaign is focused on a discrete event, is that the promotional period is finite.

The race directors utilized email campaigns to past runners to promote their event, as well as Facebook ads and postings on various running websites. These activities drove potential registrants to the race website, and ultimately to register for the race online. While Facebook ads and the running website postings did drive traffic to the website, very little of that traffic converted to registrations. The main drivers of web visits and ultimate registrations were email campaigns to past runners and the time remaining until the event.

The chart below shows the rate of registrations to web visits. It shows that on average there was one registration for every 8 or so web visits (0.13 registrations per visit). This rate was variable, but stayed fairly constant throughout the promotional period.



The magnitude of registrations and web visits did not stay constant during the promotional period, however.  The chart below shows a steep increase in both web visits and registrations as the event date drew near.  This chart is also annotated with the dates when email blasts were sent to past runners.



Careful inspection of the trends shows that there might be a slight lift in the number of web visits in the days following an email campaign.  This lift seemed to be greatest the day after the campaign, and then showed a decreasing impact as more time passed.  Since the influence of days remaining until the event is so strong in this trend, the email campaign lift is difficult to spot.  A statistical analysis was run on these data to isolate the significance of the email campaigns from the effect of days remaining until the event.

This analysis naturally showed that the days remaining until the event were the main driver of website visits.  After correcting for that factor, however, the days after an email campaign still showed a statistically significant lift in web traffic.  This lift was inversely proportional to the number of days passed since the campaign (the fewer days since the campaign, the greater the lift).  The results of this statistical analysis were built into a predictive model that smoothed the noise out of the trend and allowed the race directors to identify email campaign latency and frequency that could increase the number of web visits, and ultimately registrations (for next year's race).  A live and active version of that model appears below.
While there are obviously many endogenous and exogenous factors that drive web visits and ultimate registrations (sales) in a finite promotional period, this model offers a simple perspective on how one of those factors could vary to positively influence outcomes.  Naturally, this model could be expanded to include other factors such that their collective influence may be understood.

Friday, August 26, 2011

Bootstrapping a Marketing Program with Analytics

With the continued pressure of a down economy, marketing executives are driven to not only expend marketing budgets more effectively, but also to ensure business growth from those budgets.  Any request for additional marketing funds requires a solid business case analysis that illustrates a direct impact on revenue growth.

With the explosion of various marketing channels in the last few years (web, mobile, email, social media, etc.) has come a tidal wave of related data. As these marketing channels are usually distinct entities, data related to each channel is also distinct, and often not integrated. While fully integrated marketing channels that track customers across channels from "first touch" to sale may be the Holy Grail of marketing analytics, the reality is that many organizations do not have systems in place that can serve up data this way.

Investment in marketing automation tools can provide part of the solution, but this potentially costly investment can be a tough sell for the marketer that is trying to justify additional marketing funds in the first place!  Leveraging available data across disparate channels, with the right mix of historical data analysis and statistical modeling, can provide a business case for a growth strategy that requires additional marketing spend.

Since common customer identifiers are not available across disparate marketing channels, analytics at this level of detail is not realistic.  Spend justification relies on illustrating relationships between marketing spend and revenue, however.  This relationship need not be defined at the customer level, but instead could be defined across some common time interval such as days, weeks, months, quarters or even years.  Naturally, the more granular the time period, the better the business variability related to marketing spend will be understood.

The example below illustrates this relationship for an online retailer that wished to justify increased sales through an expanded marketing program budget.  This retailer used a multi-channel marketing program that included Facebook, Google AdWords and Search Engine Optimization (SEO) to promote their products.  The chart below shows daily spend on Facebook against ad clicks that led to website visits.  The strong relationship between these two factors indicates that Facebook spend is leading to website visits (naturally, with a pay-per-click campaign).  Similar relationships are seen with AdWords and search.


Given the multi-channel nature of this retailer's marketing program, and the fact that data from Facebook, AdWords and SEO were neither integrated nor tracked at the customer level in their sales system, direct insight from channel spend to sales was not possible.  Instead, this retailer looked at daily website visits (total) against daily sales (total).  The chart below tells the story that higher daily website visit volume leads to higher daily sales.


This insight, while valuable, does not justify an expanded marketing program budget, however.  In order to make the business case for additional marketing spend, within the constraints of unintegrated and customer-blind data, the retailer built a simple marketing forecast model that drew upon the relationships seen between spend and web visits, and between web visits and sales.  This model projected web visits from each source (Facebook, AdWords and SEO) based on the known relationships and multiple future spend scenarios.  It then aggregated projected visits across all three channels and used that result to project sales, again based on the known relationship of visits to sales.

The result was the model below, which provides a sales forecast for two separate growth scenarios.  Both scenarios assume a baseline spend for each marketing channel that is equal to historical levels.  Scenario 1 assumes that spend will not grow from this baseline over the next 15 months.  Scenario 2 assumes 2% monthly growth from the baseline for each marketing channel.  Based on the underlying relationships, the model projects expected sales for each scenario.  The model displayed below is a live, interactive model that allows for creation of various scenarios (try it!).
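A stripped-down version of such a model can be sketched as follows; the per-channel coefficients and spend levels are invented for illustration, standing in for relationships estimated from the retailer's history:

```python
# Hypothetical per-channel relationships estimated from history:
# visits gained per dollar of spend, and sales per site visit.
visits_per_dollar = {"facebook": 0.8, "adwords": 1.1, "seo": 0.5}
baseline_spend = {"facebook": 1000, "adwords": 1500, "seo": 400}
sales_per_visit = 0.13

def project_sales(monthly_growth, months=15):
    """Project monthly sales under a uniform spend-growth scenario."""
    projections = []
    for m in range(months):
        factor = (1 + monthly_growth) ** m
        visits = sum(visits_per_dollar[ch] * baseline_spend[ch] * factor
                     for ch in baseline_spend)
        projections.append(visits * sales_per_visit)
    return projections

scenario1 = project_sales(0.00)   # flat spend
scenario2 = project_sales(0.02)   # 2% monthly growth
```

Extending this to seasonality or additional channels is a matter of adding terms to the visit projection, not changing the structure.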



Although the results are based on a rather simplistic model of a multi-channel marketing program, they do provide a sufficient business case to justify increased spend.  This simple model could be easily adapted to more complex programs, to include more channels or seasonality, for example.

While having well-integrated customer-level data across marketing channels is a preferred starting point for marketing analytics, the reality is that this is an unattainable and costly endeavor for many organizations.  Leveraging available data with common attributes has the very real potential to provide insights that can bootstrap a marketing program for growth and success.


Monday, July 25, 2011

Market Share Models

For any business in a competitive industry, market share is an important measure of success and growth.  Market share, which is the percentage of a business's sales relative to all sales in the industry, indicates how much of the overall market that business controls.  Managing market share is difficult as it depends not only upon an individual business's sales growth, but also on the growth of competitors and the size of the overall market.  The latter two factors are generally beyond the control of the individual business.

There are a number of factors that are within the control of an individual business, however.  These include, but are not limited to: marketing and sales spend allocation, product prices, products offered, product quality, location, etc.  Factors that are not within the control of an individual business, but which still influence market share, are: the number of competitors, the presence of imitation products, the number of years in the market, etc.

Given the number of internal and external factors that can influence market share, improving market share is a costly and complex endeavor.  One tool that can support this effort is a statistical model that helps determine the factors necessary to achieve specific market share goals.  While a tool such as this still relies on the professional judgment of the business line manager, it will help recommend the specific business decisions that could lead to desired outcomes.  Statistical models such as this are based on historical results and evaluate not only the impact of each factor on market share, but also how the interaction between factors can influence market share in different ways.  In other words, marketing spend in one location may have more influence on market share than marketing spend in another location.

One especially competitive industry where models like these have a benefit is pharmaceuticals.  The sample model displayed below illustrates how market share for an individual company's drug can be evaluated across a number of factors that are both in and out of the company's control.  These factors include the size of the drug class relative to all drug classes in the market (class share), the number of competitors, the price per pill, the presence of generics, whether the drug is sold in the US market, the years on the market, the percent of spend on sales, and total spend.

The factors in the model above can be adjusted for the drug in question to determine expected market share.  Factors such as price per pill and sales spend would be within the control of a pharmaceutical company, and thus would directly affect market share.  The model also allows multiple market share scenarios for the same drug such that the relative impact of factors can be evaluated.  The relative difference between the two scenarios is represented with the odds ratio - this statistic indicates how many times greater the scenario 1 market share is than the scenario 2 market share.
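The odds ratio itself is a simple calculation; the sketch below compares two hypothetical share scenarios:

```python
def odds_ratio(share_1, share_2):
    """How many times greater the odds of scenario 1's market share are
    relative to scenario 2's (shares expressed as fractions of the market)."""
    return (share_1 / (1 - share_1)) / (share_2 / (1 - share_2))

# Hypothetical scenarios: a price cut lifts projected share from 10% to 15%.
ratio = odds_ratio(0.15, 0.10)  # scenario 1's odds are ~1.6x scenario 2's
```

Note that the ratio compares odds, not the shares themselves, which is why it exceeds the simple 1.5x ratio of the shares.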

While this particular model is somewhat simplistic, it illustrates how internal and external factors can influence market share, and how these factors can be leveraged to model potential market share outcomes.

Wednesday, June 8, 2011

Managing Business Variation

It could be said that it is not missed goals that kill a business, but rather the inability to anticipate and manage through business variation.  When businesses plan for the future, they often do it based on goals that are either arbitrary, or derived from the "averages" of historical experience.  While historical averages may provide decent directional guidance when it comes to setting business goals, they only tell part of the business cycle story.  Aside from the seasonal trends that a business may consider, a large degree of variation exists throughout the business cycle, due to forces beyond, or directly because of, management control.  Without an understanding of both the magnitude and drivers of business variation, managing business growth and sustainability can be a frustrating endeavor.

The key to understanding the magnitude of business variation is to understand how a business metric of interest, say revenue, varies over discrete time periods.  These periods could be days, weeks, months, quarters or years.  The period size depends upon the size of the business cycle being managed.  For example, if there is a need to manage a revenue goal for the next quarter, understanding the variability of past weekly or monthly revenue will help determine possible revenue ranges for the coming quarter.  As these ranges are derived from statistical distributions (like the bell-shaped curve), they will inform the likelihood that a goal may be met, or not.  If the revenue goal for the quarter is $5M, and the statistical distributions indicate this only happens 5% of the time, there is a good chance that this goal will not be met.
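As a sketch of that calculation, the snippet below fits a normal distribution to hypothetical monthly revenue and asks how often a $5M quarter would occur (a fuller treatment would model the quarterly sum directly rather than scaling monthly figures):

```python
from statistics import NormalDist, mean, stdev

# Hypothetical monthly revenue history, in $M.
monthly_revenue = [1.2, 1.5, 1.1, 1.6, 1.4, 1.3, 1.7, 1.2, 1.5, 1.4, 1.6, 1.3]

# Approximate the quarterly distribution as the sum of three independent
# typical months: mean scales by 3, standard deviation by sqrt(3).
quarterly = NormalDist(mu=3 * mean(monthly_revenue),
                       sigma=3 ** 0.5 * stdev(monthly_revenue))

goal = 5.0  # $5M quarterly goal
chance_of_meeting_goal = 1 - quarterly.cdf(goal)
```

With this history the $5M goal sits far out in the tail, which is exactly the kind of risk a single-point goal hides.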

Of course, nothing in business is static, so relying on pure statistical distribution of past results will not consider business or economic changes that may have caused shifts in business outcomes.  As such, it is important to study and identify the drivers that explain business variation.  The most basic and commonly understood driver of business variation is the seasonal effect that comes with most business cycles.  Other drivers may include factors that are within or out of the management's control.  These may include the strength of the market economy, resource investments, business model changes, supply chain prices, etc.  As these drivers begin to explain business variation, the ability to predict business outcomes for a particular situation becomes more precise.  It will become clear, after considering drivers of past performance, what the range of expected outcomes is for the current environment.  From this, business goals may be set accordingly.

While point estimates and single value goals are simple and easy to understand, they do not communicate business risk that may jeopardize their realization.  By considering business variability and its inherent drivers, business goals may be set in a way that not only considers what is capable in a current environment, but also evaluates the risk that various targets will not be met.  It is by planning for this variation that risks can be mitigated, and surprises can be managed.

Tuesday, May 17, 2011

Analytics for ERP Implementations

The implementation of an Enterprise Resource Planning (ERP) solution is a complicated endeavor.  It requires the coordination of cross-functional teams, huge IT commitments, and the definition or re-engineering of business processes.  Needless to say, this is an undertaking that comes with a substantial price tag.  The right mix of business rules, metadata and analytics, however, can facilitate a successful result.
One of the difficulties encountered in ERP implementations is in defining the relationships between the job responsibilities of individual users and functionality within the ERP system.  This is a business management exercise rather than a technical one.  Most successful ERP implementations are able to clearly define the relationships that exist between individuals and their jobs (i.e. Mary the Accountant), the business roles they perform (i.e. payroll, accounts payable) and the ERP functionality they need to do their job (i.e. check register).

In an organization of reasonable size, these relationships will result in a vast network of complex associations that cross business function and organizational structure.  Many organizations will develop a “jobs” database, separate from the ERP system, to keep track of all these relationships and to help define necessary training and security settings.

The existence of these jobs databases not only provides invaluable information about individual job needs, but also offers a management oversight mechanism to monitor the progress of an ERP implementation.  Specifically, summary analytics provide the ability to understand progress to date, current needs, and potential risks to the implementation schedule.  These analytics are not meant to monitor the technology implementation, but rather the human resource readiness and controls that must be in place to successfully utilize the functionality that an ERP system will provide.

Once the implementation is complete and the system is being utilized, the relationships defined in the jobs database can be integrated with usage data to generate the next level of analytics.  In addition to monitoring usage patterns across a user base, the integrated datasets can also allow for identification of gaps in pre-defined job roles, capabilities and training that were not anticipated during the implementation.  These analytics can in effect "train" the system to constantly evolve the jobs relationships such that they satisfy current business needs.

A sample of analytics that could be performed pre- and post-implementation include:
  - Review business role gaps necessary for specific jobs
  - Identify current and emerging security gaps
  - Identify current and emerging gaps in training
  - Monitor the progress of training
  - Monitor implementation progress to plan

Each ERP implementation will define its own analytics to support implementation oversight.  Starting with a well-defined business rule framework and a mechanism to track progress, however, will provide an information system that supports and clarifies a complex business solution.

Wednesday, March 30, 2011

Inventory Control and Business Analytics

In this age of cost-cutting measures, every aspect of business operations must be considered.  It is in business functions with a high degree of variability that cost control can be a difficult endeavor, yet these functions provide a tremendous opportunity for savings.  One of these business functions is the management of inventory.  Inventory can be a physical or abstract concept.  It can represent real raw materials or products ready for sale.  It can also represent intellectual property or data that may have storage costs and a limited shelf life.  The general management of inventory, however, comes down to this trade-off: If you have more than you can use, then there are storage or disposal costs.  If you don't have enough when a customer comes knocking, then there is lost revenue.  In rare cases inventory has low storage and disposal costs, so excess quantities are kept on hand for future use.  In most cases though, whether due to fads, styles, age, or market forces, inventory perishes and has a carrying cost.

Managing a complex base of inventory is difficult and costly.  One tool that can support this effort, however, is business analytics.  Business analytics offers a framework for lending insight into trends, inefficiencies and opportunities in the management of inventory.  Since inventory control is a highly variable activity, analytics can help in understanding the drivers and magnitude of this variation.

Several different analytic methods are available to provide this insight.  The simplest and most straightforward method for understanding inventory levels is historical reporting and analysis.  This will illustrate periods when inventory was either deficient or in excess, and will clearly show seasonal or business cycle variation.  It can also identify poor or successful inventory decisions that have been made in the past; these can inform better future decisions.

A historical perspective does not necessarily give full insight into current inventory decisions, however.  Real-time monitoring of current inventory levels with key metrics such as Current Inventory, Working Capital, Backlog and End of Life Inventory provides an illustration of where inventory may be actively lacking, in excess or at risk.  Through regular review of these metrics, real-time inventory control decisions may be made to mitigate potential future waste or lost opportunity.

Without insight into future needs, however, current inventory monitoring has limited benefit.  In order to truly stay ahead of demand for inventory, a reliable and accurate inventory forecast is essential for proactive decision making.  The nature of forecasts, however, is that they become less certain the further into the future they predict.  As such, regular update and review of forecasts is required to allow for re-alignment of future inventory requirements.  Proactive inventory management is effectively a bet on future needs.  This bet needs to be balanced with the variability inherent in the inventory forecast: The bigger the forecast uncertainty, the smaller the bet, and vice versa.  The net effect is an optimized inventory program.
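The balancing act described above can be sketched as a simple sizing rule; this is illustrative rather than an optimization (a formal treatment would use something like the newsvendor model), and the demand figures are hypothetical:

```python
def inventory_bet(forecast_demand, forecast_sigma, caution=1.0):
    """Scale the inventory commitment down as forecast uncertainty grows.

    `caution` expresses how costly excess inventory is relative to a
    lost sale; this is an illustrative rule, not an optimized policy.
    """
    return max(0.0, forecast_demand - caution * forecast_sigma)

# A near-term forecast (tight) supports a bigger bet than a distant one (loose).
near_term = inventory_bet(1000, forecast_sigma=50)
far_out   = inventory_bet(1000, forecast_sigma=300)
```

As each forecast is refreshed and its uncertainty narrows, the rule naturally allows the commitment to grow back toward the forecast itself.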

This article just begins to touch on the analytic methods that could support the management of inventory.  The actual techniques will vary depending on business needs and business objectives.  It is these analytic techniques that allow for control of the uncertainty and variation that is so present in inventory management.

Wednesday, March 2, 2011

Social Media Analytics

How can you use social media to improve your marketing reach?  By working with an online marketing strategist, you will be able to define a program that best suits your business.  But how do you KNOW that your social media campaigns are effective?  That is where social media analytics are essential.

Social media analytics have various components, none more important than the others, that are all interconnected, informing and driving your social media campaigns.

At the root of these analytics is, of course, data.  Most data related to social media campaigns are "machine" data (see previous blog post) - data about some human activity that is captured by a machine.  For example, if you were to click on the "Share" or "Like" buttons at the top of this page (go ahead, click them!), Facebook and LinkedIn machines would create data about this blog post: who liked it, when they liked it, what they liked, why they liked it (if they left a comment), and who they know that "liked" before them.  A tremendous amount of information for a single button click!

But what do you do with this information?  How can analytics help?  When thinking about social media analytics, one must consider three questions: What campaigns have you done (that work)?  How are your campaigns doing right now?  How do you expect campaigns to turn out?

The first question, "What campaigns have you done (that work)?", goes beyond basic reporting.  It requires in-depth statistical analysis that ties a particular campaign or marketing channel all the way back to earned revenue (or an equivalent business metric if revenue is not a key business objective).  It is only through this linkage that you will be able to determine how successful a campaign has been.  Naturally, this is easier said than done.  But going through the effort of linking relevant corporate data sources (marketing, sales, accounting) to make this connection provides exponentially more value than simply reporting on arbitrary (and self-fulfilling) key performance indicators (such as Tweets this month).  Despite all the "noise" that comes between a Tweet and a cashed check from a client, the right statistical analysis can tease out those campaigns or channels that contribute to increased revenue.
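One minimal version of that linkage is a regression of revenue on channel activity.  The channel names, activity counts and revenue figures below are invented (and noise-free) purely for illustration:

```python
# Hedged sketch: regressing monthly revenue on campaign activity to
# estimate each channel's contribution.  All data here are made up.
import numpy as np

# Rows = months; columns = activity counts per channel.
activity = np.array([
    [20, 4, 2],   # tweets, blog posts, email sends
    [35, 6, 3],
    [15, 2, 1],
    [40, 8, 5],
    [25, 5, 2],
    [30, 3, 4],
], dtype=float)
revenue = np.array([13.0, 21.0, 9.0, 26.0, 15.5, 18.5])  # $k per month

# Add an intercept column and fit by ordinary least squares.
X = np.column_stack([np.ones(len(activity)), activity])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)

for name, c in zip(["baseline", "tweets", "blog posts", "email sends"], coef):
    print(f"{name}: {c:+.2f} $k per unit")
```

With real (noisy) data the same fit would come with confidence intervals, but the principle is identical: each coefficient estimates a channel's marginal contribution to revenue.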

The next question, "How are your campaigns doing right now?", gets at the proper use of key performance indicators.  Through the statistical analysis of the previous question, you will be able to determine the relative effectiveness of various social media campaigns (on revenue) and work backwards from revenue goals to set smart and informed campaign targets (i.e., how many tweets, blogs and friends do I need to drive my revenue target?).  The collection of metrics that are expected to contribute to revenue should be monitored in concert, against targets or goals, as no single campaign will drive revenue alone.  By the way, revenue should be monitored alongside these metrics as well.  After all, this is what we are ultimately trying to impact, right?
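Monitoring that collection of metrics in concert can be as simple as tracking each one's progress toward its target, with revenue sitting in the same report.  The metric names and targets below are invented examples:

```python
# Sketch: monitoring a collection of campaign metrics against targets,
# with revenue tracked alongside.  Names and targets are invented.
metrics = {
    "tweets":        {"actual": 42,   "target": 60},
    "blog_posts":    {"actual": 5,    "target": 4},
    "new_followers": {"actual": 310,  "target": 500},
    "revenue_k":     {"actual": 18.0, "target": 20.0},  # the metric that matters
}

def kpi_report(metrics):
    """Return each metric's progress toward its target as a fraction."""
    return {name: m["actual"] / m["target"] for name, m in metrics.items()}

for name, progress in kpi_report(metrics).items():
    flag = "on track" if progress >= 1.0 else "behind"
    print(f"{name:14s} {progress:6.0%}  {flag}")
```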

The last question, "How do you expect campaigns to turn out?", requires the modeling of social media campaigns or channels before they are initiated, or while they are in progress.  This will allow you to tweak or turn the dials of a campaign initially or mid-stream to influence the expected (revenue) outcome.  This type of decision support is critical to making smart spending decisions about social media campaigns.  The analytics involved in this effort include predictive modeling of revenue for a particular campaign strategy or group of strategies, marketing resource optimization models that allocate budget and staff in a manner that maximizes (revenue) impact, and adaptive models that will re-chart a course when a campaign gets off track or is derailed by unexpected events.
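A toy version of such a resource optimization model is a greedy budget allocator.  The channels, response coefficients and square-root response curve below are all illustrative assumptions, not a real marketing model:

```python
# Sketch: allocating a fixed budget across channels to maximize expected
# revenue, assuming diminishing returns (square-root response curves).
import math

def allocate_budget(total, channels, step=100.0):
    """Greedy allocation: repeatedly give the next `step` dollars to the
    channel with the highest marginal expected revenue."""
    spend = {name: 0.0 for name in channels}
    expected = lambda name, s: channels[name] * math.sqrt(s)
    remaining = total
    while remaining >= step:
        # Marginal gain of adding one more step to each channel.
        best = max(channels,
                   key=lambda n: expected(n, spend[n] + step) - expected(n, spend[n]))
        spend[best] += step
        remaining -= step
    return spend

# Invented "revenue per sqrt(dollar)" coefficients per channel.
channels = {"facebook": 3.0, "twitter": 2.0, "email": 4.0}
plan = allocate_budget(10_000, channels)
print(plan)
```

With concave response curves like these, the greedy rule lands close to the true optimum (spend proportional to the squared coefficients); a real model would estimate the response curves from campaign history.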

With the right analytics strategy of "looking back, being present and looking ahead", you can take the guesswork and leaps of faith out of your social media campaigns.  Facebook does not have to be a "huge waste of time", as Betty White quipped in her Saturday Night Live monologue.  With social media analytics, it can be a powerful tool!

Wednesday, February 2, 2011

Machine vs. Human Data Capture

It is undeniable.  The volume of data that businesses face today is unprecedented.  The Boston Globe compares this "data deluge" to the flood of books that became available after Gutenberg invented the printing press in the 15th century (Information overload, the early years). 

What differs today is how information is captured.  Most of the data we interact with on a daily basis is created by humans: articles, books, weather reports, emails, text messages, blog posts (like this one), twitter feeds, the list is endless.  In addition, much of the data stored in business databases are also entered and generated by humans: sales prospects, clinical codes, risk evaluations, incident codes, quality assessments, manufacturing results, etc.  Name the business, and likely there is some aspect of their data capture that relies on the effort and judgement of a human.

There is a huge amount of data, though, that is automatically captured as a consequence of human interaction with a machine: user login timestamps, pages visited on a website, TV stations watched, links clicked, files accessed, the location where a photo was taken, etc.  This "data exhaust" holds tremendous analytic value for understanding how humans interact with the world around them and the machines they use.  For a software company, it could reveal which aspects of an application are used the most, and direct redesign.  For a commerce website, it can explain why buyers leave the site before committing to a sale.  Google uses it to determine which search results come to the top of the list for a given set of search terms.  Some smartphone apps even track where you are when they are being used.
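Mining that exhaust usually starts with aggregating raw machine-stamped events into something a human can act on.  The log format and event lines below are invented for illustration:

```python
# Sketch: turning raw "data exhaust" (click events) into a per-page
# usage summary.  The log lines and format are invented examples.
from collections import Counter
from datetime import datetime

raw_events = [
    "2011-02-01T09:15:02|user17|/pricing",
    "2011-02-01T09:15:40|user17|/signup",
    "2011-02-01T10:02:11|user23|/pricing",
    "2011-02-01T10:05:59|user23|/pricing",
]

def page_counts(events):
    """Parse timestamp|user|page records and count visits per page."""
    counts = Counter()
    for line in events:
        ts, user, page = line.split("|")
        datetime.fromisoformat(ts)  # validate the machine-stamped time
        counts[page] += 1
    return counts

print(page_counts(raw_events).most_common())
```

Note that no human judgment entered the capture step; the subjectivity only appears later, in deciding which aggregations matter.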

The benefit of machine data over data captured by humans is that there is no human subjectivity or effort in its capture.  The downside is that it is limited to WHAT machines can capture or are programmed to capture.  As business processes become more automated, however, and information systems more integrated, machine data capture and exchange will become more and more prevalent.  This will open the door to ever more granular and accurate data describing human activities, which in the end is a major key to business growth.  With effort shifting away from (but not eliminating) human data capture, that effort can be redirected toward more and more advanced analytics that deliver important and more focused business insights.

While many businesses are just beginning to leverage and streamline the use of machine captured data, there is still a lot of growth potential in this area.  The future challenge will be to discern what to capture, how to most efficiently capture it, and how to use it for business benefit.

Tuesday, January 4, 2011

Trends in Business Analytics

The field of business analytics is one of ever increasing demand.  Recently, the Boston Globe listed "data modeler" as one of the 11 hot jobs of 2011 (Information overload, the early years).

There is a reason for the attention that business analytics is getting in the business world: due to the prolonged recession and its unclear direction, organizations are being asked to do more with less.  Coupled with that, they are being presented with more and more data about their business that on their own do not present any answers.  Most business managers realize that smart use of their data will help them address the pressures placed on them by the recession.  The problem is that they do not know where to start.  This is where the business analytics professional (or "data modeler", as the Globe calls them) can help.

This article describes five trends in business analytics that are a response to this increased demand:
  1. Marketing effectiveness
  2. Sales forecasts
  3. Operational efficiencies
  4. Cost control
  5. Revenue projections
Marketing effectiveness implies that with ever increasing amounts of data being captured about marketing campaigns (with tools like Eloqua), business analytics will not only be able to identify which campaigns worked and which did not, but will also be able to predict the effectiveness of campaigns before they happen!  This type of predictive analytics will allow for more efficient spending of marketing dollars.

As leads come out of the marketing machine and into the sales pipeline, business analytics will be able to predict sales outcomes with ever increasing accuracy by combining sophisticated statistical models with data stored in CRM tools like Salesforce.com.  In addition, sales forecasts will offer mechanisms that allow sales managers to optimize their sales force, by recommending which opportunities should require focus at any given time.
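A first approximation of such a sales forecast is the probability-weighted pipeline that most CRM tools expose.  The stage probabilities and opportunities below are invented examples, not Salesforce.com data:

```python
# Sketch: a probability-weighted pipeline forecast.  Stage close rates
# and opportunities are invented; real rates would come from CRM history.
stage_prob = {"prospecting": 0.10, "proposal": 0.40, "negotiation": 0.75}

pipeline = [
    {"name": "Acme Corp",  "stage": "proposal",    "value": 50_000},
    {"name": "Widget Inc", "stage": "negotiation", "value": 20_000},
    {"name": "Foo LLC",    "stage": "prospecting", "value": 80_000},
]

def expected_sales(pipeline, stage_prob):
    """Sum each opportunity's value weighted by its stage's close rate."""
    return sum(o["value"] * stage_prob[o["stage"]] for o in pipeline)

print(f"Expected pipeline value: ${expected_sales(pipeline, stage_prob):,.0f}")
```

The same weights also hint at where a sales manager should focus: a large deal in a low-probability stage may move the forecast more than a small deal that is nearly closed.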

Once the sale is made and the product needs to be delivered, operational analytics will leverage data captured in ERP systems to create optimization tools for operations managers.  These models will support varying needs such as resource allocation, inventory control, R&D, quality control and production processes.  These tools will recommend an optimal set of production decisions, at any point in time, that are specific to an individual business.

It is not poor performance that kills a business, but business variability that prevents the correct anticipation of business trends and preventative action.  Using historical business data, organizations will be able to better understand past drivers of business variability so that risk can be better managed in future periods.  This will be driven by precise cost control measures that maintain profitability, yet do not kill productivity.  Conversely it will facilitate preemptive expansion that will allow an organization to be ready to ride a wave of growth when it arrives.

Without an accurate revenue projection, a business is effectively flying blind.  With ever increasing detailed data about revenue earnings, organizations are positioned to make laser-sharp revenue forecasts that will give business insight that was previously seldom enjoyed.  These forecasts will allow managers to understand drivers of their revenue and will suggest what manageable factors are likely to contribute to the largest revenue growth.
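Even a very simple model beats flying blind.  As a hedged sketch, here is a one-step-ahead revenue forecast using exponential smoothing; the monthly figures and smoothing weight are illustrative assumptions:

```python
# Sketch: an exponentially smoothed revenue forecast.  The history and
# the smoothing weight (alpha) are illustrative, not real figures.
def smooth_forecast(history, alpha=0.3):
    """One-step-ahead forecast: weight recent months more heavily."""
    level = history[0]
    for actual in history[1:]:
        level = alpha * actual + (1 - alpha) * level
    return level

monthly_revenue = [100, 104, 98, 110, 115, 120]  # $k per month
print(f"Next month's forecast: {smooth_forecast(monthly_revenue):.1f}k")
```

A "laser-sharp" forecast would of course add trend, seasonality and driver variables on top of this, but the smoothing step illustrates the basic idea of letting recent data dominate.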

While the business needs that business analytics will address over the next few years are not new, the approach to solving them is.  Business analytics is positioned to be a cornerstone of how business is "done" over the next decade, and it will provide a tremendous competitive advantage to those that are early adopters of its methodology.