Photo by Shadman Sakib / Unsplash

In mid-October 1900, Sioux City, Iowa, then one of the hundred biggest cities in the US with roughly 40,000 people, welcomed the life of W. Edwards Deming. A path starting in Iowa and culminating in a PhD in mathematical physics from Yale at the age of 28 was merely the beginning of his life.

To say Deming was critical to modern quality practices in business is a lot like saying Richard Feynman was critical to nuclear physics or Henry Ford was critical to the assembly line. Both of them pushed their fields to towering new levels, but did so on the shoulders of giants. Ford didn't produce the first automobile, nor was he the first to use repeated assemblies in business, and Feynman diagrams were based on research by Stueckelberg and have since evolved into other diagrams. Similarly, Deming's application of quality improvement still echoes today.

A brief, bulleted list of Deming's successes with quality:

  • Helped reboot the Japanese economy after World War II
  • Developed sampling techniques for the US Census (still in use in the 2020 Census)
  • Accidentally launched the Total Quality Management movement
  • Championed "continuous improvement"
  • Gave us the greatest quote: "In God we trust. All others must bring data"

You may notice a lack of "Plan Do Study Act" in there. That's because Plan-Do-Study-Act, or PDSA, is part of continuous improvement, something to which Deming heavily attributed his successes. Although the PDSA (and Plan-Do-Check-Act) cycles are often attributed to Deming, he in fact credited them to Walter Shewhart as the "Shewhart Cycle". Deming notably improved its accessibility and expanded on it, but much like Feynman or Ford, he stood on the shoulders of giants. Deming explains this thoroughly in his 1993 book, The New Economics. Still, Deming is definitely the catalyst who not only brought the cycle to the forefront, but applied it across several industries and countries.

So, what is continuous improvement? And PDCA? PDSA? Just buzzwords that people higher up in an organization throw around to sound smart? Ways consultants can try to sell you something? Resume padding, much like the unnecessary application of "blockchain" to everything? Sometimes, honestly, yes. It gets abused or turned into something far grander than it is, because at its core, it's very simple: it just requires planning, measuring results, and, most importantly, a willingness to apply what you measure.

Let's break down the big pieces.

What is Continuous Improvement?

Honestly, this is way easier as a core concept and way harder in execution than most people believe. Given a situation, take what you learn and improve it. Simple, right? Kids do it all the time - they learn that adding water to sand makes it pack better so they can build a sandcastle. They add too much? It melts away. Not enough? It blows away. They experiment until they can build a sandcastle on a beach, often sorting it out in an hour or less of experimenting.

Well...that's not exactly accurate. That's trial-and-error, and that's what many people believe continuous improvement is. Try something, discard the failure or keep the success, and move on. It's easy to do but inefficient at scale and speed. Imagine trying to print a book. You can't just print out 1,000 pages at a time on your home printer for 1,000 copies, glue them into construction paper, and call it good. You'll need to make them last, and you'll need the manuscript to fit on the page. What about an ISBN? If you've ever had to publish and needed an ISBN, you know that is not a trial-and-error nightmare you want to fight (and especially not at cost). What about actually binding with glue? Which glue works best? Does it melt the paper? Does the paper hold up to repeated use? Wait, does the size need to be different for libraries and bookshelves? The amount of trial and error to publish a book would end up being wildly expensive and time consuming, and if you are as graceful as I am, your cat would end up with various pages accidentally glued to her.

Boudicca would not appreciate me trying bookbinding

Continuous improvement is really about taking an existing process and making it incrementally better. Major overhauls can be an outcome of continuous improvement, but those are few and far between, because the hallmark of a great continuous improvement study is data. By using the data, you can apply small changes that produce measurable differences in outcomes. Rather than guessing at paper or dredging a beach, you use data to guide your path.

I should note that you don't, technically, need a framework to do continuous improvement. You could, in fact, just wing it. However, much like using an assistant or software to do your taxes, or studying the official study guide before a licensure exam, having a framework makes it significantly easier. An additional benefit of the PDSA cycle is that it is used in many industries, from manufacturing to education - in fact, the PDCA case study used by the American Society for Quality is that of the Pearl River School District. This makes it easier to communicate and learn outside of silos and to get varying viewpoints not often available to us.

The PDSA (PDCA) framework

I'll note early that I will usually refer to it as the PDSA cycle, but please note that the differences between "Study" and "Check" are, by and large, semantic. Deming himself preferred "study" because of the English subtext: a "check" is quick, but a "study" requires more time and review. Use what your industry prefers, but know that they are similar enough for most purposes.

The Plan-Do-Study-Act framework has four-and-a-half parts to it. Four of them are pretty obvious, and you'll see why I dedicate a half part to the fifth.

Diagram by Karn G. Bulsuk

This part is pretty straightforward, but it is a cycle; one piece leads into another. It's not a checklist you finish once - it keeps going over and over. Each piece of it is a bit straightforward too, at a high level.
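To make the loop concrete, here's a minimal sketch in Python. All function names and dollar figures here are invented for illustration (they are not part of any real framework): each pass plans a measurable target, runs the experiment, studies the gap, and folds what worked back into the standards.

```python
# Hypothetical sketch of PDSA as a loop; numbers are invented.

def make_plan(standards):
    # Plan: pick a measurable target slightly above the current baseline.
    return {"target_sales_m": standards["sales_m"] + 30}

def run_experiment(plan):
    # Do: enact the plan and record the outcome (hard-coded here:
    # we come up $10m short of the target).
    return {"sales_m": plan["target_sales_m"] - 10}

def study(plan, results):
    # Study: compare the outcome to the target and note the gap.
    return {"gap_m": plan["target_sales_m"] - results["sales_m"],
            "sales_m": results["sales_m"]}

def act(standards, findings):
    # Act: if we beat the old baseline, adopt the new number as the
    # standard for the next cycle; otherwise keep the old standard.
    if findings["sales_m"] > standards["sales_m"]:
        return {"sales_m": findings["sales_m"]}
    return standards

def pdsa(baseline, cycles=3):
    standards = dict(baseline)
    for _ in range(cycles):
        plan = make_plan(standards)          # Plan
        results = run_experiment(plan)       # Do
        findings = study(plan, results)      # Study
        standards = act(standards, findings) # Act
    return standards

print(pdsa({"sales_m": 70}))  # {'sales_m': 130} - each cycle raises the standard by $20m
```

Note that `act` is what makes this a cycle rather than a one-off experiment: the improved result becomes the new baseline that the next plan builds on.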


Step 1 - Plan

The planning part is what really differentiates this from trial and error. With the existing information you have, make a plan that will change that data. This will become the measurable outcome, which means, by definition, it needs to be measurable. Although you should get specific, you can start off easier at the beginning, as long as the outcome is measurable.

Imagine you are a movie producer working on a sequel. Your first movie, Movie, made $70m in ticket sales in the first two months, and your fans are giving it an average rating of 4.5/5. You might think "people spend more" is an outcome, but it isn't measurable. You would be better served by "ticket sales are $100 million". The more specific you can get, the better, such as "ticket sales are $50m in the first two weeks, and another $50m in the six weeks after".

Step .5 - Observe

Here's the secret sauce - sometimes you don't have data yet. This is the "half step" I referred to above: the "observe" step. Before the first movie, you had no idea what your fans would do. You could infer from other studies, but you would want to set potential metrics and observe the outcomes - by recording them. When you need an "observe" step, you will consume so much information that parsing it into something valuable will be a struggle, but it will be worth it for your plan. In fact, in the movie industry, you can predict how much money a movie will make from factors like tentpole (opening weeks) performance, customer satisfaction, and ticket sales. Although your final outcome is "making more money", you will usually want to focus on those contributing factors. The more often you do the cycle, the more precise your changes can become.

However, for the sake of the example, let's say that to get ticket sales $30m higher, you run a marketing campaign that drops the price of the first movie to cost in stores, encouraging people to buy it and watch it. You now profit nothing on the first movie, but in this way you should be encouraging people to see the second one.


Step 2 - Do

Honestly, don't be too surprised here - enact your plan.

Record the data, record the anomalies, and, if possible, record why the anomalies happened. The biggest pitfall to avoid while enacting your plan is making too many changes. Stay the course. This is why your plan should, usually, consist of small changes that you can iterate on, or do, quickly. If you regularly test something, you can make changes and record them quickly.

So, in our movie example, the price for Movie goes down to $5 at retailers and then your movie, Movie 2, hits the big screen.


Step 3 - Study

Now to do the real work - we look at the outcomes and learn what we can. Let's say you make $90m on your film: $10m short of your goal, but $20m higher than the last one. The real sleuthing is figuring out why.

Did sales of Movie grow significantly? Did your release coincide with events, such as holidays or another Keanu Reeves movie? Were people boycotting your movie because an actor did something inappropriate? Taking surveys is a common tactic for studying, but we can also go through a trove of other information.

Let's say that while studying, you learn that Ram Charan put out a movie in India and your international sales slumped because that market watched his movie instead. It could be that you would've hit your $100m, or higher, with your technique of dropping Movie to cost, had that external factor not been there. You could then further isolate your study to see if the growth in US-based sales was different from South Asian sales and whether that was truly the issue. If your investigation finds that, yes, Ram Charan's release took $10m of your potential sales, you can conclude that your marketing technique probably worked, and you can add it to your repertoire.
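As a toy illustration of that kind of isolation (all regional figures here are invented, not real box-office data), you can segment both films' sales by region and compare the growth:

```python
# Hypothetical ticket sales ($m) by region for the two films.
movie_1 = {"us": 45, "south_asia": 15, "other": 10}  # totals $70m
movie_2 = {"us": 65, "south_asia": 10, "other": 15}  # totals $90m

# Growth per region between the two releases.
growth = {region: movie_2[region] - movie_1[region] for region in movie_1}
print(growth)  # {'us': 20, 'south_asia': -5, 'other': 5}
```

Every region grew except South Asia, which supports the theory that the external release, not your pricing move, caused the shortfall.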


Step 4 - Act

Here's where you prepare to do the cycle again.

Movie 3 starts to go into production, and as you look at your release planning, you now know to check international movie releases to spot potential weaknesses or strengths. Luckily, Ram Charan isn't dropping another blockbuster the same year as you, and you carefully study your other major international markets for similar potential issues before landing on a solid release weekend.

This phase of "acting" means taking your plan and adding it to your standards. As you continue to improve, you continue to raise your base standards. From now on, you consider international movie releases when releasing your films, and you now observe metrics such as sales in a given geographical region. In fact, looking back at your data, you can't help but notice your July sales bumped in the Nordic regions. Maybe there is a plan there to keep moving the needle up on sales?


Another thing to always consider is that this is continuous. You're never done. Whether you are trying to improve your high school's ACT scores or the speed with which you deal with a critical software bug, you can find ways to improve. No one is ever done. In fact, there are times you slide and need to spend more time studying, and sometimes you even need to adjust your standards. However, it never needs to stop, and as you practice continuous improvement, with the PDSA framework or another, you'll find it something you naturally apply to your life. If you want, I can give you way too much data on my smoked meats cooking and their appreciation by friends, family, and the dog (who is an outlier, because she always appreciates the smoked meats). If you have a cool application of the PDSA (or PDCA) cycle, let me know - I'll gladly get all of them into an examples post, including my secret to excellent smoked pork belly.

Marty Henderson

Marty is an Independent Consultant and an AWS Community Builder. Outside of work, he fixes the various 3D printers in his house, drinks copious amounts of iced tea, and tries to learn new things.
Madison, WI