Time series analysis and its applications form a topic from which practitioners can gain much when addressing process improvement efforts.
I was talking with a Master Black Belt candidate, in the process of being certified, who asked what other statistical tools a practitioner should learn. Since the candidate's company has a lot of time-based continuous data, time series analysis came to mind. I learned these tools as part of my master's degree, not in LSS training, and I have found several good uses for them.
Time series analysis involves a set of tools that examine data in sequential (time) order, allowing time to be used as a factor in the analysis. Generally we do not use clock or calendar time as a factor in the inferential statistical tool set. This is because time is not an independent factor of a population: time values are not randomly sampled from a population. You can never sample the same time value again, and every new observation arrives at a time that does not exist in the historical data set. The same reasoning explains why we do not run a regression with time as the x variable. Knowing the historical time values does not tell us enough about the future behavior of the response. If we used time as x in a regression to predict future performance, we would be extrapolating beyond the data, which is not valid. The proper analysis method is forecasting, which accounts for the increasing uncertainty as time moves beyond our current data.
The most common time series analysis I use in LSS applications is autocorrelation analysis (Stat > Time Series > Autocorrelation in Minitab). In a traditional inferential statistical analysis, we look at the average and the standard deviation; time is never a consideration. In many cases, however, the standard deviation is not a single value but a combination of a smaller inherent-variation component and a component that varies over time. A process mean that varies over time generally shows a cyclic pattern, while variation that changes over time generally does not. By examining process data in the time domain along with the standard analysis methods, we learn more about our process. The autocorrelation function takes a column of data and looks for a time-based correlation: it reports the correlation between the original data and the same data shifted down (lagged) by a series of rows. It shifts the data one row, calculates a correlation, shifts another row, calculates another correlation, and so on until it reaches the requested number of lags. The output is a series of correlation coefficients, one per lag. We look for the coefficients to cycle from positive to negative and back. When we see a significant correlation value, the amount of lag used to get that correlation is the time constant of the process.
An example of this process can be visualized by recording the temperature in an office or home. If we plot the temperature every minute, we will see that the thermostat allows the temperature to drift until it triggers heating or cooling, which causes a quick change, after which the temperature drifts again. Collected together, the data will show a mean and a standard deviation, and the probability plot will probably look a bit odd but may still be acceptable. But we know the process is more than a mean and a sigma. If we applied autocorrelation analysis over a series of heating or cooling cycles, we would see that the period of the temperature cycle (where we have a positive correlation) is the time between thermostat events. By tracking that period, we can see when the process changes, the process being the external heat or cooling load that is causing the room temperature to change.
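The thermostat example can be simulated end to end. Below is a self-contained sketch, with entirely made-up numbers (a 30-minute sawtooth cycle), showing that the lag with the strongest positive autocorrelation recovers the cycle period:

```python
# Hypothetical thermostat trace: the room warms steadily, then cooling
# kicks in and resets it -- a sawtooth with a 30-minute cycle, sampled
# once per minute for 300 minutes. The numbers are illustrative only.
temps = [20.0 + (t % 30) / 30.0 for t in range(300)]

n = len(temps)
mean = sum(temps) / n
denom = sum((v - mean) ** 2 for v in temps)

# Sample autocorrelation at lags 1..60 minutes
r = {
    k: sum((temps[t] - mean) * (temps[t + k] - mean) for t in range(n - k)) / denom
    for k in range(1, 61)
}

# The lag with the strongest positive correlation is the cycle period
best_lag = max(r, key=r.get)
print(best_lag)  # 30
```

Tracking `best_lag` over successive windows of real data is the monitoring idea described above: a shift in the recovered period signals a change in the external heating or cooling load.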
The advantage of the autocorrelation tool is that it identifies the periodicity at which a process cause is changing. We can then examine the process for causes that change with that same periodicity. This review may rapidly eliminate many candidate causes that change either more quickly or more slowly, allowing us to focus on fewer factors. In my experience, this tool often sends us back to brainstorming to list the factors that could change with that periodicity, and I have usually found causes that no one had ever considered related to the problem.
Consider using this tool if you believe the variability has a time-dependent component; you will be surprised how simple the tool is to use and interpret.
Contact Us to set up a time to discuss with Forrest Breyfogle how your organization might gain much from an Integrated Enterprise Excellence (IEE) Business Process Management System implementation for organizational analysis and improvement.