
Drive Efficiency in Your Process: 5 Critical Lean Tools


Lean, also known as “Lean Manufacturing” or “Lean Production,” focuses on maximizing customer value by removing waste and eliminating defects. Lean tools are about understanding the process, looking for waste, preventing mistakes and documenting what you did. 

Let’s look at five Lean tools used in process improvements, what they do, and why they’re important. Companion by Minitab can help you get started leveraging the tools of Lean and other continuous improvement methods to thrive in your business. These tools are even more powerful if you can share and collaborate with your team, so we encourage you to try Companion’s online dashboard reporting capabilities as well.

1. Voice of the Customer (VOC) Summary

Voice of Customer (VOC) Summary form using Companion

Collecting the Voice of the Customer is a critical step when implementing Lean. The VOC Summary provides a way to capture important data, so you can act on it to meet key business goals and improve customer relations.

A basic tenet of Lean is to understand customer needs and design the process to fully meet (or, ideally, exceed) customer expectations. Using the VOC Summary tool will help you understand the key customer issues and convert those issues into critical customer requirements.

2. Process Flow 

Process flows, also known as process maps, help you to understand and communicate the activities, or steps, in a process. They also help you to see the relationship between inputs and outputs in a process, identify key decision points and uncover rework loops.

Because this easy-to-visualize method allows people to clearly see a process, it is most effective to build process maps with the team working in the process. The creation of a process flow makes waste visible (bottlenecks, delays, storage, rework, etc.) and shows the best opportunities for improvement.

 

3. Value Stream Maps (VSM)

A value stream is the collection of all activities, both value-added and non-value-added, that generate a product or service required to meet customer needs. Value Stream Maps extend the usefulness of process flows by adding more data (beyond x’s and y’s): material and information flow, operating parameters, defect rates, lead times, and so on. A current state value stream map identifies waste and helps your team to envision an improved future state.

4. Five Whys

5 Why's form using Companion

Lean is about understanding why things are done the way they are. Often, things are done improperly, at the wrong time, or skipped altogether, resulting in process problems. Use the Five Whys tool to determine the root cause of a problem. By repeating the question “Why?” you can uncover the problem’s root cause, see the relationships between different root causes, and identify steps to prevent the problem from happening again.

The real root cause should always point to a process that does not work well or does not exist. Most of the time people do not fail; processes do.

5. 5S Audit: Sort, Set in Order, Shine, Standardize and Sustain

5S Audit form using Companion

5S is a team-based set of tools that systematically and methodically organize the workplace. A clean, well-ordered workplace improves efficiency and eliminates waste.

The fifth step, Sustain, is one of the hardest steps to accomplish. It’s akin to losing the weight and keeping it off. Sustaining requires maintaining the gains of process improvements on a regular basis.

Without it, old habits resurface, and the workplace falls into disarray. One of the advantages of Companion’s online dashboard reporting is the visibility into everyone’s progress. Everyone can see the benefits and be encouraged to keep it up.


Step 1: Pinpoint Waste. Step 2: Deal with It. 5 More Critical Lean Tools


Dramatic cost savings. Lead time and inventory reductions. Improved transactional processes.

Although Lean has its roots in manufacturing, nearly every industry and type of organization around the world can benefit from it. A little while back, we reviewed 5 Critical Lean Tools that are a great way to get started implementing Lean. Today, let’s continue with 5 More Critical Lean Tools. 

1. Kaizen

Kaizen in Companion by Minitab

Kaizen is a method for accelerating process improvement projects. While originally developed for manufacturing, Kaizen is used extensively in a variety of industries and is a valuable technique for the process improvement practitioner. A Kaizen event is a focused, dedicated 3- to 5-day event to drive process improvements. Employees are pulled from their daily duties to participate, and solutions are implemented immediately. Leaders use the Companion Roadmap™ above to better plan and implement a Kaizen event.

 

2. Waste Analysis

Waste Analysis by Operation form in Companion by Minitab

In Lean, waste is anything in a process that is unnecessary and does not add value from the customer’s perspective. The purpose of Lean is to identify, analyze, and eliminate all sources of waste, such as defects or excessive inventory.

Use Companion by Minitab’s Waste Analysis by Operation form to document the types of waste at each process step and to quantify and color-code the degree of the waste. The Waste Analysis activity is most effective when performed by multiple observers, both within and outside of the process being examined.

 

3. Quick Changeover

Quick Changeover QCO-SMED form in Companion by Minitab

Quick Changeover is a method of analyzing your processes to reduce the time, skilled resources, or materials needed for setup. Initially this technique was applied in the manufacturing environment, specifically to tools and dies. More recently the approach has been applied to a wide variety of workflows (including transactional processes) that require a quick reset, such as turning over hospital beds and operating rooms or unloading and loading airline passengers. Quick Changeover involves identifying process steps and assigning each to one of two categories:

  • Internal - must be done while the process is stopped, and
  • External - can be done while the process is running, either before or after performing the setup. 

Use the Quick Changeover (QCO-SMED) form to compare the internal and external components of process changeover, or setup, for both current and improved states. By implementing Quick Changeover, organizations can reduce internal setup time, which reduces non-productive process time and enables more setups, smaller run batches, and improved flow. The secondary goal is to reduce total setup time to free up labor.
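To see how the two categories translate into downtime, here is a minimal Python sketch (not a Companion feature; the step names and times below are invented) that totals changeover time by category. Only the internal total keeps the process stopped, so that is the number Quick Changeover works hardest to shrink.

```python
# Hypothetical changeover steps: (name, minutes, category).
# "internal" = must be done while the process is stopped;
# "external" = can be done while the process is running.
steps = [
    ("gather tooling",       12, "external"),
    ("stop line",             2, "internal"),
    ("swap die",             18, "internal"),
    ("calibrate",             9, "internal"),
    ("stage next materials", 15, "external"),
]

internal = sum(t for _, t, kind in steps if kind == "internal")
external = sum(t for _, t, kind in steps if kind == "external")

print(f"Downtime (internal): {internal} min")       # drives lost production
print(f"Parallel work (external): {external} min")  # done while running
print(f"Total setup effort: {internal + external} min")
```

Converting internal steps to external ones (or shrinking them) is what moves the current state toward the improved state on the form.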

 

4. Line Balancing

Companion by Minitab’s Line Balancing - Process Map or Line Balancing – Value Stream Map form

Line balancing is a technique for “equalizing” a set of process steps to smooth the time required to accomplish them. When process steps are not balanced, one or more constraints or bottlenecks may be present.

The goal is to eliminate non-value-added tasks, combine tasks and closely balance the remaining steps so that all meet the rate of customer demand. Use Companion by Minitab’s Line Balancing - Process Map or Line Balancing – Value Stream Map form to compare the cycle time for multiple operations on a process map or value stream map against the takt time (time required to meet customer demand).

This analysis is useful when you try to balance either a cell or a sequential series of process steps. It highlights the waste of waiting.
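Takt time itself is a simple ratio: available production time divided by customer demand. The short Python sketch below (with invented numbers; this is not Companion functionality) computes takt time and flags any step whose cycle time exceeds it as a candidate bottleneck.

```python
# Illustrative line-balancing arithmetic.
available_minutes = 450   # production time in one shift
customer_demand = 300     # units required per shift

takt_time = available_minutes / customer_demand   # 1.5 minutes per unit

# Hypothetical cycle times per step, in minutes per unit.
cycle_times = {"cut": 1.2, "weld": 1.8, "paint": 1.4, "pack": 0.9}

for step, ct in cycle_times.items():
    status = "over takt -> bottleneck" if ct > takt_time else "within takt"
    print(f"{step}: {ct:.1f} min vs takt {takt_time:.1f} min ({status})")
```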

 

5. Standard Work

Companion by Minitab’s Standard Work Combination Chart form

Standard work establishes a set of work procedures that provide the best and most reliable methods and sequences for processes. Standard work clarifies the process, documents the best way to do a job, ensures consistency, expedites employee training, and provides a baseline for further improvement.

Use Companion by Minitab’s Standard Work Combination Chart form to show the manual, machine and walking time associated with each work element. The output graphically displays the cumulative time as manual (operator-controlled) time, machine time and walk time. Looking at the combined data helps to identify the wastes of excess motion and waiting.

You can visit Companion by Minitab online help to watch videos, learn more about these tools and download the free 30-day trial to try them yourself. Comment here or let us know on Facebook or Twitter if you have any questions or other areas you would like us to cover.

Snowy Statistics: 2018 Winter Weather and Analyzing Boston's Record 2015 Snowfall with Histograms


As we start off 2018, our eyes are on the winter weather, specifically low temperatures and snowfall. After 2015-2016’s warmest winter on record and Chicago breaking records in 2017 with no snow sticking to the ground in January or February, our luck might have run out. We shall see, though. The Old Farmer's Almanac is reporting that 2017-2018 winter temperatures will be colder than last winter’s.

If you live in the United States, you might know the winter of 2014-2015 was one for the record books. In fact, more than 90 inches of snow fell in Boston in the winter of 2015! Have you ever wondered how likely of an occurrence this was?

Dr. Diane Evans, Six Sigma Black Belt and professor of engineering management at Rose-Hulman Institute of Technology, and Thomas Foulkes, National Science Foundation Graduate Research Fellow in the electrical and computer engineering department at the University of Illinois at Urbana-Champaign, also wondered. They set out to explore the rarity of the 2015 Boston snowfall by examining University of Oklahoma meteorologist Sam Lillo’s estimate of the likelihood of this event occurring. Below I’ll outline some points from their article, A Statistical Analysis of Boston’s 2015 Record Snowfall.

Meteorologist’s Analysis of Boston’s Historic Snowfall in The Washington Post

Following this historic snowfall of 94.4 inches in a 30-day period in 2015, Lillo analyzed historical weather data from the Boston area going back as far as 1938 to determine the rarity of this event.

Lillo developed a simulated set of one million hypothetical Boston winters by sampling with replacement snowfall amounts gathered over 30-day periods. Eric Holthaus, a journalist with The Washington Post, reported that Lillo’s results indicated that winters like the 30 days of consecutive snowfall from January 24 to February 22, 2015 should “only occur approximately once every 26,315 years” in Boston:

Snowfall Image

To assess Lillo’s findings, Evans and Foulkes obtained snowfall amounts for a specified Boston location from 1891 to 2015 via the National Oceanic and Atmospheric Administration (NOAA) for comparison with his simulated data.

Recreating the Simulated Data

On March 15, 2015, the cumulative Boston snowfall of 108.6 inches surpassed the previous Boston record of 107.6 inches set in the winter of 1996. In the figure below, a graphical display of Boston snow statistics from 1938 to 2015 illustrates the quick rise in snowfall amounts in 2015 as compared to the record-setting snowfalls of 1996, 1994, and 1948:

Snowfall Image 03

Also included in the figure is the annual average Boston snowfall through early June. The final tally on Boston’s brutal snowfall in 2015 clocked in at 110 inches!

The dashed rectangular region inserted in the graphic highlights the 30 days of snowfall from January 24 to February 22, 2015, which resulted in 94.4 inches of snow. In order to obtain hypothetical 30-day Boston snowfall amounts, Lillo first generated one million resampled winters by:

... stitching together days sampled from past winters. A three-day period was chosen, to represent the typical timescale of synoptic weather systems. In addition, to account for the effect of long-term pattern forcing, the random selection of 3-day periods was weighted by the correlation between consecutive periods. Anomalies tended to persist across multiple periods, such that there’s a better chance that a snowier than normal three days would follow a similarly snowy three days. This is well observed (and in extreme cases, like this year), so it’s important to include in the simulation.

After generating the one million resampled winters, Lillo recorded the snowiest 10-period stretches, i.e., 30 days, from each winter. Percentile ranges of the resampled distribution were compared to the distribution of observed winters to check the validity of the simulated data. In simulating the winters’ snowfalls in this manner, Lillo had to assume that consecutive winters and winter snow patterns within a particular year were independent and identically distributed (IID). Evans and Foulkes recognize that these assumptions are not necessarily valid.
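For readers who want to experiment with the idea, here is a simplified Python sketch of this style of block resampling. It uses fabricated daily snowfall data and omits Lillo’s correlation weighting between consecutive 3-day periods, so it illustrates the mechanics rather than reproducing his result.

```python
import numpy as np

rng = np.random.default_rng(2015)

# Stand-in for the historical record: daily snowfall (inches) for each
# past winter, faked here with a skewed distribution for illustration.
n_winters, days_per_winter = 77, 120
history = rng.gamma(shape=0.3, scale=1.2, size=(n_winters, days_per_winter))

def simulate_winter(history, block=3):
    """Stitch one hypothetical winter together from 3-day blocks sampled
    (with replacement) from past winters. Lillo additionally weighted the
    selection by the correlation between consecutive blocks; this sketch
    samples blocks independently."""
    n_w, n_days = history.shape
    days = []
    for start in range(0, n_days, block):
        w = rng.integers(n_w)                    # pick a random past winter
        days.extend(history[w, start:start + block])
    return np.asarray(days)

def snowiest_30_days(winter):
    # Maximum total over any 30 consecutive days (ten 3-day blocks).
    csum = np.concatenate(([0.0], np.cumsum(winter)))
    return (csum[30:] - csum[:-30]).max()

maxima = np.array([snowiest_30_days(simulate_winter(history))
                   for _ in range(10_000)])
print(f"P(30-day snowfall >= 94.4 in) ~ {np.mean(maxima >= 94.4):.1e}")
```

With independent blocks and mild fake data, the estimated probability comes out essentially zero, which hints at why the clustering of snowy periods matters so much in Lillo’s full simulation.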

Because they could not obtain Lillo’s simulated data (for their own Sigma Level calculations they used actual historical data), Evans and Foulkes applied a digitizer to the graphical display of Boston snow statistics above to create a “copy” of his data for further analysis.

Once they had data values for “Maximum 30-day snowfall (inches)” and “Number of winters,” they added them to Minitab and created histograms of the snowfall amounts with overlaid distribution fits to identify reasonable distributions for the data:

Minitab Histograms
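To try a similar fit outside Minitab, the hedged Python/SciPy sketch below overlays several candidate distributions on a histogram. The snowfall values here are simulated stand-ins, not the digitized data from the article.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Simulated stand-in for the digitized 30-day-maximum snowfall amounts.
snow = np.random.default_rng(1).gamma(shape=4.0, scale=8.0, size=125)

fig, ax = plt.subplots()
ax.hist(snow, bins=20, density=True, alpha=0.5)

x = np.linspace(snow.min(), snow.max(), 200)
for dist in (stats.gamma, stats.lognorm, stats.weibull_min):
    params = dist.fit(snow)                  # maximum-likelihood fit
    ax.plot(x, dist.pdf(x, *params), label=dist.name)

ax.set_xlabel("Maximum 30-day snowfall (inches)")
ax.set_ylabel("Density")
ax.legend()
plt.show()
```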

For more on how they used Minitab distributions to fit the snowfall data and how they determined Sigma Levels for the 2015 Boston snowfall using Lillo’s data, check out the full article here.

The Journey Towards Industry 4.0


The convergence of widespread low-cost sensors, the cloud, and greater compute power has brought together a multitude of connected devices that can monitor, collect, exchange, analyze and deliver insight like never before. Industry 4.0 (or the Industrial Internet of Things) is transforming industries. This is especially true in manufacturing, where many organizations are investing in ‘smart manufacturing’ and machine learning tools to make significant improvements. Whether the goal is to decrease labor costs, reduce product defects, shorten unplanned downtime, improve transition times, or speed up production, the core technologies of machine learning align well with the complex problems manufacturers face on a day-to-day basis. As more manufacturers are discovering, the paradigm of performance is shifting: after years of squeezing costs out of supply chains and operations, they are realizing that further cost cuts will only hurt the customer. The way to achieve real efficiency gains is to make their factories more agile and responsive.

Machine Learning

To go beyond the hype, leaders need to understand the challenges and look at ways they can successfully implement the tools that will deliver real value. The journey towards Industry 4.0 is complex. At its foundation is data, but that data often lives across multiple relational and non-relational systems. Whilst innovations in storage and managed services have improved the capture process, accessing and understanding the data still poses a significant challenge. As a result, demand is growing for advanced analytical and machine learning tools that can help unlock the value of the data. In a recent Minitab webinar on Big Data and Machine Learning, participants were asked how ready they were to handle data management and algorithm processing: 24% of respondents stated they had no specific tool or infrastructure in place, and 34% said they needed to adapt their current structure.

Data Management and Algorithm Processing

This is where solutions such as Minitab’s Salford Predictive Modeler (SPM) play a significant role. Its highly accurate and ultra-fast engines provide users with automated modelling solutions to help quickly and accurately find actionable predictions and patterns in large and often complex data, allowing manufacturers to make better decisions across the board. Complex problems that once took months to solve can now be answered in minutes.

As the domain of manufacturing becomes more complex and dynamic, data-driven approaches that find highly complex and non-linear patterns in data are becoming ever more critical. Connected devices and systems are radically altering the nature of manufacturing, offering new opportunities for extremely focused control and monitoring, with self-configuring automation heralding a step-change in productivity. It is becoming essential that organizations find new tools to transform raw data into models that can be applied to prediction, detection, classification, regression and forecasting, so that operations become more efficient and organizations gain real competitive advantage.

Learn how Minitab’s Salford Predictive Modeler can unlock the value of your data, and enable you to maximize the potential of Industry 4.0. The highly accurate and ultra-fast engines of  CART®, MARS®, TreeNet® and Random Forests® will help you to make better decisions faster.

Salford Predictive Modeler Contact Us

Analyze in One Click with These 4 Tricks Using Minitab Macros


Get started automating your analyses with Minitab Macros

At Minitab, we want our users to focus their time on drawing sensible conclusions from their data that they can use to resolve business problems or take advantage of opportunities. However, with more and more sources of data available, you often spend more time getting data ready for analysis and less time interpreting it.

Here are four ideas that demonstrate how Minitab macros deliver “one-click analysis” for the repetitive part of any analysis project. If you’re interested in seeing macros in action, sign up to watch our live webinar March 29: Tips from the Experts: Fast-track Your Data Analysis with Basic Macros.

 

1) Creating a Customized Chart

The charts produced directly from the Minitab menus may not look exactly the way you want them to. For example, take the plant growth data set in the boxplot example below. The default settings produce a chart that looks like this.

 

Creating a Customized Chart - Boxplot example

However, you might want it to look more like this.

Creating a Customized Chart - Boxplot example with a new look

It is easy to edit the original boxplot to look like this, but if you had to do it for a large number of charts each month, it would become a real burden. Instead, you can create a Minitab Exec – the simplest type of automation in Minitab – to do this quickly with one click.
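The same one-click idea carries over to any scripting environment. As a rough analogue of what such an Exec automates (this is illustrative Python/matplotlib, not Minitab Exec syntax, and the data frame and column names are invented), you could wrap your styling choices in a single reusable function:

```python
import matplotlib.pyplot as plt
import pandas as pd

def styled_boxplot(df: pd.DataFrame, value: str, group: str, path: str):
    """Apply the same custom styling to every boxplot with one call."""
    ax = df.boxplot(column=value, by=group, grid=False)
    ax.set_title(f"{value} by {group}")
    ax.figure.suptitle("")           # drop the default grouped-by super-title
    ax.set_ylabel(value)
    ax.figure.savefig(path, dpi=150)
    plt.close(ax.figure)

# One call per chart, however many charts you need each month, e.g.:
# styled_boxplot(growth_df, "Growth", "Treatment", "growth_boxplot.png")
```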

 

2) Getting External Datasets Analysis Ready

Nowadays we often get data from a number of sources, and before we can do any analysis we need to import the data and get it ready, which might involve subsetting, sorting, transposing, recoding and creating new variables.

I had this problem with the data I used for my blog post Sunny Day for a Statistician. The data from my solar panel system came in 50 monthly CSV files, each looking like this.

Sunny Day for a Statistician solar panel data

I needed all this data to be in one file, so instead of combining the files manually one at a time, I wrote a macro that did the following (see the Python sketch after this list for the equivalent logic):

  1. Asked me for the start date and end date of my analysis.
  2. Determined how many files were needed and built the file name for each month.
  3. Read in the monthly files, ignoring the first 7 header rows.
  4. Stacked the individual worksheets into one worksheet.
  5. Extracted the year and month from the date, ready for my analysis.
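A rough pandas equivalent of those five steps might look like the sketch below. The file-naming scheme, the “Date” column, and the folder path are all assumptions for illustration; the real macro works on Minitab worksheets.

```python
from pathlib import Path
import pandas as pd

def load_solar_months(folder: str, start: str, end: str) -> pd.DataFrame:
    """Stack monthly CSV exports into one analysis-ready table."""
    months = pd.period_range(start, end, freq="M")     # step 2: file list
    frames = []
    for m in months:
        path = Path(folder) / f"{m}.csv"               # e.g. "2014-06.csv"
        frames.append(pd.read_csv(path, skiprows=7))   # step 3: skip 7 header rows
    stacked = pd.concat(frames, ignore_index=True)     # step 4: stack
    stacked["Date"] = pd.to_datetime(stacked["Date"])  # assumes a Date column
    stacked["Year"] = stacked["Date"].dt.year          # step 5: extract year...
    stacked["Month"] = stacked["Date"].dt.month        # ...and month
    return stacked

# Step 1 (the date range) becomes the function arguments:
# data = load_solar_months("solar_csvs", "2014-01", "2018-02")
```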

 

Sunny Day for a Statistician solar panel data after Minitab Macro

Now that I have this macro, I can quickly add data to my historical worksheet of solar data, ready to repeat my analysis. I could even extend the macro to do the analysis too!

 

3) Analysis Using a Method or Formula not in Minitab

Minitab has an extensive library of methods and formulae that covers most analyses. However, what if you want to use a different methodology or formula? Many users assume they have to switch to other software that lets them do calculations from first principles.

However – provided you know what you want and have the required formulae and methods – you can use a macro to complete the analysis. You can find more than 100 examples of analyses that can be completed using macros in the Minitab Macro Library.

 

4) Add Your Macros to Your Minitab Menu Bar

Did you know that you can customize the Menu Bar in Minitab? I have used the Tools > Customize dialog box to create my own menu item called “My 1-Clicks.” You will see it has the three items corresponding to the three tricks above.

Add Your Minitab Macros to Your Menu Bar 

So now, all I need to do to complete any of these analyses is click the corresponding menu item.

 

I hope these tricks have given you some ideas for improving the efficiency of analysis in your organization. If you want to find out more on how to create these, join our live webinar, Tips from the Experts: Fast-track Your Data Analysis with Basic Macros on March 29. We will send out a recording and the presentation afterward in case you miss it or would like to review the information later on or share with colleagues.

Trimming Decision Trees to Make Paper: A Hands-on Machine Learning and Root Cause Analysis Exercise


Are you headed to Lean and Six Sigma World in Las Vegas April 3-4? In Beyond the Buzzwords: Application of Machine Learning in Lean Six Sigma, Minitab statistician Charles Harrison and I will share even more information on modern-day machine learning techniques. Learn more about Minitab at Lean and Six Sigma World.

 


 

As we collect more and more observational data from our processes, we may need new tools to provide meaningful insights into this information. You can add modern-day machine learning techniques alongside traditional statistical tools to analyze, improve and control your processes.

Don’t worry if you’re not familiar with machine learning and Classification and Regression Trees (CART). I’ll walk you through an example below and then give you step-by-step instructions.

 

Finding the Root Cause of Excessive Variation in a Pulp Bleaching Process

You can quickly detect the root cause of an out-of-control or out-of-specification process condition using the tree-based machine learning methods in Minitab’s Salford Predictive Modeler (SPM) alongside traditional control charts.

Consider a paper manufacturer that needs to use current process data to determine which factors are contributing to excessive variation in its pulp bleaching process. An Individuals chart created in Minitab indicates that the process is highly unstable, which in turn results in an unacceptable defect rate.

Paper Bleaching Individual's Chart
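To reproduce the flagging step outside Minitab, the limits of an Individuals chart follow a standard formula: a center line at the mean, with control limits at the mean plus or minus 2.66 times the average moving range. Here is a minimal numpy sketch (the variable names are hypothetical):

```python
import numpy as np

def i_chart_limits(x):
    """Standard Individuals-chart limits: mean +/- 2.66 * average moving range."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Flag points below the lower control limit as the binary response, e.g.:
# lcl, cl, ucl = i_chart_limits(brightness)
# response = (brightness < lcl).astype(int)  # 1 = out of control, 0 = in control
```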

To begin investigating the root cause of the excessive variability in this process, you might start with a Binary Logistic Regression in Minitab, where the response variable is one if the point falls outside the lower control limit and zero otherwise. Unfortunately, for these data, the unusual patterns in the residual plots below indicate that the binary logistic regression model may not be adequate.

Paper Bleaching Residual Plots

 

The CART Approach

CART is a decision tree algorithm that works by creating a set of yes/no rules that split the response (Y) variable into partitions based on the predictor (X) settings. Using the CART feature in SPM, I see that one of my predictor variables – Production – is a large contributor to a point falling outside the lower control limit. 

Using CART in SPM, I see Production is a large contributor to a point falling outside the lower control limit.

 If production rate <= 91.76, then the estimated probability of the process being out of control is relatively high (33%). If production rate > 91.76, then the process is likely in statistical control.

The Minitab graph below explains why this rule works. The CART model finds the vertical line corresponding to production rate that best separates the Response = 0 (in control) from the Response = 1 (out-of-control) group.

Minitab Production Rate graph
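As an illustration of the same idea with open-source tools (scikit-learn here as a stand-in for SPM’s CART engine, with invented data built to mimic the 91.76 split), a depth-one decision tree recovers a single yes/no rule:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)

# Invented stand-in data: out-of-control points cluster at low production rates.
production = rng.uniform(80, 110, size=500)
p_out = np.where(production <= 91.76, 0.33, 0.02)
response = (rng.random(500) < p_out).astype(int)   # 1 = out of control

X = production.reshape(-1, 1)
tree = DecisionTreeClassifier(max_depth=1).fit(X, response)  # one yes/no split
print(export_text(tree, feature_names=["Production"]))
```

With data like these, the learned threshold lands near 91.76, with a much higher estimated probability of being out of control on the low-production side of the split.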

I can continue growing the CART tree to eventually find more causes of the excessive variability in this process. Once I’ve narrowed the problem down to the vital few X’s, I can put controls in place to reduce the chance of the process drifting out of control, resulting in the process improvement shown in the Minitab Individuals Chart with Stages shown below.

Brightness Before and After

 

The focus on analytics and data-driven decisions within most organizations should not present a threat. It’s a great opportunity for all of us. Experience in data analysis is not only valuable to your job. It’s becoming crucial. Consider augmenting your current analytics tool kit with some additional machine learning tools developed specifically for the problems that occur using large, observational data sets.

Ready to try for yourself?

Try it Now

 

 

Understanding Monte Carlo Simulation


If you collect and analyze real data for a living, the idea of using simulated data for a Monte Carlo simulation sounds a bit odd. How can you improve a real product with simulated data? In this post, we’ll find out.

Companion by Minitab is a software platform that combines a desktop app for executing quality projects with a web dashboard that makes reporting on your entire quality initiative effortless. Among the first-in-class tools in the desktop app is a Monte Carlo simulation tool that makes this method extremely accessible.

The Monte Carlo method uses repeated random sampling to generate simulated data to use with a mathematical model. This model often comes from a statistical analysis, such as a designed experiment or a regression analysis.

Suppose you study a process and use statistics to model it like this:

Regression equation for the process

With this type of linear model, you can enter the process input values into the equation and predict the process output. In the real world, however, the inputs won’t each be a single fixed value, thanks to variability. Unfortunately, this input variability causes variability and defects in the output.

To design a better process, you could collect a mountain of data in order to determine how input variability relates to output variability under a variety of conditions. However, if you understand the typical distribution of the input values and you have an equation that models the process, you can easily generate a vast amount of simulated input values and enter them into the process equation to produce a simulated distribution of the process outputs.

You can also easily change these input distributions to answer "what if" types of questions. That's what Monte Carlo simulation is all about. In the example we are about to work through, we'll change both the mean and standard deviation of the simulated data to improve the quality of a product.
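As a concrete, hedged illustration (the transfer equation, input distributions, and spec limits below are all invented, since in practice the model comes from your own analysis), a Monte Carlo run takes just a few lines of Python:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000   # one million simulated parts

# Invented input distributions and transfer function (yours would come
# from a regression or designed experiment).
temp = rng.normal(loc=350.0, scale=4.0, size=N)
pressure = rng.normal(loc=12.0, scale=0.5, size=N)
output = 5.0 + 0.4 * temp - 2.1 * pressure + rng.normal(0.0, 1.5, size=N)

lsl, usl = 115.0, 125.0   # illustrative spec limits
print(f"Defect rate: {np.mean((output < lsl) | (output > usl)):.2%}")

# "What if" question: tighten the pressure control (smaller std. dev.).
pressure2 = rng.normal(loc=12.0, scale=0.2, size=N)
output2 = 5.0 + 0.4 * temp - 2.1 * pressure2 + rng.normal(0.0, 1.5, size=N)
print(f"With tighter pressure: {np.mean((output2 < lsl) | (output2 > usl)):.2%}")
```

Changing a mean or standard deviation and re-running is exactly the kind of “what if” exploration described above.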

Today, simulated data is routinely used in situations where resources are limited or gathering real data would be too expensive or impractical.

 

Want to learn more? Read the full article explaining Monte Carlo simulation methods and walking through an example in Quality Digest. Plus, sign up for our webinar on April 19.

Webinar April 19. Save Your Seat.

What is User-centered Design and why is it important?


User-centered Design

Minitab’s R&D department uses modern design principles, including a user-centered design approach – adopting the Nielsen Norman Group’s (NN/g) methods and processes. NN/g is the world’s leader in all things usability. Hoa Loranger, their vice president, summarized their philosophy best: “UX without users isn’t UX.”

For Minitab, this means putting our users at the center of our design processes. From our early needs analysis to ideation workshops, from wireframing to usability testing, we want to verify at every step of the design process that we are making something that our users really want and can easily use.

 

The Minitab Design Process

The Minitab Design Process

What We’re Doing

We have always valued customer input. We recently changed our approach to testing – conducting smaller tests throughout a project cycle rather than conducting large usability tests at the end of the design lifecycle. This allows us to maximize opportunities as well as identify and correct issues as they come up.

We also retooled our interviewing process to better unearth users’ general attitudes about features and services, and the feedback gathered in our post-interview analyses has proven invaluable. We started using a broader array of study types. Using a combination of card sorts, tree tests, and surveys, we have been able to collect more nuanced data about our designs, resulting in faster, more valuable design iterations.

What We’re Planning Next

We are investigating more ways to bring user feedback into our process:

Participatory Design

In participatory design sessions, we will sit down with our customers, discuss design ideas, and even collaborate on the development of wireframes and working prototypes.

Ethnography Studies

There is only so much that you can learn from a scripted usability study; in the end, the best information comes from watching a customer do their day-to-day work. That shows us not just whether the software is usable, but which specific features are being used, what is being overlooked, where the gaps are, and where the pain points are. We want to see our customers doing their own work.

In the coming weeks and months, we’ll be reaching out to customers to see who might be interested in collaborating with Minitab designers. If you are interested or would simply like to learn more about Minitab’s design process, you can meet the design team this fall at Minitab’s annual Insights Conference – join us in Scottsdale, Arizona September 12-14 for three days of sharing wisdom, networking and inspiration against the backdrop of the beautiful Southwest. Head to info.minitab.com/insights to learn more and register.

 

Minitab Insights 2018 conference September 12-14 in Scottsdale, AZ

