Not every process solution fits within the guidance of the process improvement methodologies we know. Sometimes, the best solution to the challenge comes from doing what our methodologies would decidedly not do.
Rules are important, especially when it comes to governance, but they cannot possibly anticipate or address every single scenario that might come along. Sometimes the best or right thing to do is to break the rules.
Here is an example to settle the argument. Suppose you are stopped at a traffic light and an emergency paramedic vehicle needs to get through. It may not be your right-of-way to move into the intersection, but you are blocking the emergency vehicle's path. The best thing to do is to verify that cross traffic has stopped and enter the intersection to let the emergency vehicle through. It's against the normal rule, but right for the immediate need.
Every once in a while process solutions will break the rules of the best methodologies as well. I’d like to share two examples that I ran into this week. From these, I’d like us to remember for the future that when we run into a difficult process improvement, or when our best ideas don’t seem to take our metrics in the right direction, the best solution may be to break the normal rules.
I was sharing experiences with a friend and peer who told me a story that is not necessarily unique, but that echoes numerous other challenges in my own experience. His story takes place in a blood bank.
At the time, he was a line supervisor for the blood processing function on the night shift. His team’s job was to process recently collected blood and prepare it for distribution and use. The rule was that every member of the team was to have three 10-minute breaks and one 30-minute “lunch” during the course of the shift.
The breaks were staggered so that during any one person's break, the rest of the team could keep the process running. This ensured that the process continued without stops and maximized the manpower available at any given time throughout the shift.
On paper it makes perfect sense. However, my friend broke the rules and reset the break schedules for his shift and team. Without going into detail about the process, it is important to understand that once certain phases of the process have begun, they cannot be halted until the entire process phase is complete (another driver for the staggered breaks). However, between two phases, there is a distinct switch from one process system to another. It is a change point between processes and a safe place to allow a pause in the movement of product.
My friend reset the break schedule so that the entire team took one 35-minute lunch break when their product run had completed the first phase, and a 25-minute break after the entire run had completed the second phase. The night shift’s productivity consistently exceeded that of the day shift, which ran a larger team.
Within a few weeks, word got around to the management that something was up. Complaints were made that the night shift wasn’t following the rules. Some of it was curiosity concerning how the smallest shift was processing the most product. Management decided to investigate.
After management asked my friend why he thought it was OK to break the rules, he referred them to the numbers. Some members of management decided to stay and watch the night shift work one night. Afterward, he was free to proceed as he saw best, with no more arguments.
Any of us could sit down with scratch paper and draft a process productivity model showing that an un-halted process running at 80 to 100 percent manpower throughout the entire shift should produce the most output, so why did my friend's solution work better? By the way, he had no formal process improvement training of any kind at the time; he made the decision based on leadership intuition.
I have seen the same phenomenon numerous times in other scenarios, and I believe that I have some insight to answer the question as to why. I welcome comments from others if anyone would like to share.
I believe the model with shared breaks and pauses in the process works better because people are not machines. As living, thinking, feeling organisms, our energy levels and productivity depend a great deal on our moods and attitudes, which are affected by morale. Also, our internal reset may not happen on the schedule that is convenient to the process.
A 10-minute break, alone, to use the restroom and get a drink, may not effectively refresh our energy levels for the next push through the process. However, taking a break twice as long, with work friends or colleagues, with a little time to sit down and share a laugh, might refresh us better. Working assembly-line-style shifts is more like running a distance race than running a set of sprints.
A 10-minute break is not something we look forward to very much, or at least not as much as a longer, more meaningful break. When we are motivated to get to that break, our energy level for the production process is greater, and we are naturally inspired to drive a little harder to reach the break a little sooner. It's a basic reward scenario.
Similarly, the process is less stressful and requires less energy when 100 percent of the team is working it at the same time (assuming it is properly resourced). This means that the team members can maintain a higher energy level and better morale for a longer period of time and the productivity of the process is higher, for longer. In this example, that difference was enough for the intermittent model to exceed the continuous model, even in spite of the breaks.
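The comparison can be sketched as a toy model. Everything here is illustrative, not data from the blood bank: the shift length, team size, and especially the morale lift are assumed numbers chosen to show the shape of the argument. Both schedules give each person the same 60 minutes of break, so the person-minutes worked are identical; the only lever left is the per-person rate, which is exactly where the "people are not machines" effect lives.

```python
SHIFT_MIN = 480   # assumed 8-hour shift
TEAM = 5          # assumed team size
BREAK_MIN = 60    # total break time per person (same under both schedules)

def output(team, work_min, rate):
    """Total output = people x working minutes x per-person rate."""
    return team * work_min * rate

# Staggered breaks: the process never stops, everyone works at the base rate.
staggered = output(TEAM, SHIFT_MIN - BREAK_MIN, rate=1.0)

# Synchronized breaks: the line pauses, but the full team always works
# together, and (the key assumption) morale lifts the per-person rate.
MORALE_BOOST = 1.08   # hypothetical 8% lift; the whole result hinges on this
synchronized = output(TEAM, SHIFT_MIN - BREAK_MIN, rate=MORALE_BOOST)

print(round(staggered), round(synchronized))  # 2100 2268
```

The point of the sketch is that a spreadsheet model which treats the rate as a constant will always pick the staggered schedule; only an experiment can tell you whether a morale lift like the hypothetical 8 percent actually exists.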
I think there is more, but the point can be made. When productivity hinges on the performance of people, we are often better off designing the process around maximizing the morale and energy of the people, than we are modeling the process as if people are machines with a consistent, infallible output. We are not machines.
Unfortunately, because we are not machines, simple manpower calculations are invariably flawed, and our models will rarely predict actual performance. The best way to find the highest-efficiency process that is sustainable over time is to experiment. Let your intuition be your guide rather than calculations and models, as my friend did. Chase the pain and eliminate it, and chances are the process improves.
Coincidentally, my friend was telling me his story just after he and I had finished with my second example for discussion. It takes place in a small cafeteria-style kitchen. My friend and I had volunteered in the kitchen to help out the regular staff.
This lunch kitchen, which I have volunteered in many times, is a Lean expert's nightmare. The food is ordered several days in advance according to anticipated customer demand and prepared, for the most part, entirely before lunch service. Human behavior, particularly how many people will buy lunch on a given day, is nearly impossible to model with any reliability. Therefore, food is often thrown away, and sometimes one meal or another runs out before everyone has had a turn to get lunch.
The Lean solution would be to minimize the amount of food prepared at any given time and to prepare and serve food to order. Food that was ordered into inventory but not used would be used the next day, and a minimum inventory for each service would be maintained on a daily basis. That would eliminate waste.
The Lean model could be implemented, and with a little head-scratching I might even make it work without increasing the kitchen staff, though that's unlikely. The equipment in the kitchen would need to be completely overhauled, but that is not uncommon when redesigning a production line around a Lean model. Still, I would not recommend even entertaining the idea, even if I could show a return on investment for the kitchen. Why wouldn't I present a solution?
Let me share some basic data and see if you don't see the answer without me stating it. The price for the standard lunch is $2.05. Yesterday, we served 256 lunches in 50 minutes. The normal staff is three employees: one operating the cash register, and two serving preset meals (four choices) and restocking the salad bar between waves of customers. That includes pulling more batches of prepared food from ovens, warmers, and refrigerators, while rapidly serving up trays as fast as customers can select and grab one.
As I stated, customers arrive in waves, but averaged over the service window, that's roughly one customer served every 11.7 seconds. On some days the throughput can be as high as 280 customers; on others, as low as 190, though it is usually between 220 and 250. The Lean process that we could all design to eliminate wasted food could not possibly provide that throughput with only one full-time and two part-time employees.
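The pace figures above are simple arithmetic on the numbers given, and they can be checked in a few lines. The 190 and 280 endpoints are the day-to-day range from the paragraph above; nothing here is new data.

```python
# Back-of-the-envelope check of the lunch-line service rate.
lunches = 256
service_minutes = 50

seconds_per_customer = service_minutes * 60 / lunches
print(round(seconds_per_customer, 1))  # 11.7

# The day-to-day range (about 190 to 280 customers) brackets the pace.
slow_day = service_minutes * 60 / 190   # ~15.8 seconds per customer
busy_day = service_minutes * 60 / 280   # ~10.7 seconds per customer
```

Even on the slowest day, a made-to-order process would have under 16 seconds per customer to take an order, cook, plate, and collect payment with three staff.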
The cost of wasted food is far less than the expense of additional or alternative equipment and more staff. To put that statement in perspective: for special occasions, the kitchen will offer a bigger meal, such as barbecue or a Thanksgiving-style dinner. On those occasions, the kitchen will serve as many as 500 meals in around 65 minutes. Doing that requires 4 employees and 8-10 volunteers (12-14 operators), six times the serving space, and twice the food holding/staging space and equipment.
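The scaling cost hiding in those special-occasion numbers is worth making explicit. As a rough comparison (assuming 13 operators, the midpoint of the 12-14 quoted above), per-operator throughput actually drops sharply when the kitchen scales up:

```python
# Hypothetical per-operator throughput, using the figures quoted above.
normal_rate = 256 / (50 * 3)     # meals per operator-minute, normal day, 3 staff
special_rate = 500 / (65 * 13)   # meals per operator-minute, special day,
                                 # assuming 13 operators (midpoint of 12-14)

print(round(normal_rate, 2), round(special_rate, 2))  # 1.71 0.59
```

Each added operator buys less throughput than the last, which is part of why extra staff and equipment cost more than the wasted food they would save.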
If I haven’t dropped enough clues as to why the Lean methodology and focus on waste elimination isn’t the right solution, I’ll just say it. The kitchen and customers have only 65 minutes to serve and eat food in six different waves. On-time delivery is the driving need. The best solution for the challenge is the one that enables 280 customers to be served in 50 minutes. There isn’t time to allow customers to order and wait for food. The food must be waiting for the customer.
The challenge is to minimize the time the food waits so that it is as fresh and warm as possible. The kitchen does this very, very well, but at the expense of few options, and pre-staged (batch inventoried) production items.
I have no doubt that a Lean expert could come in and design a way to eliminate wasted food. Six Sigma experts could drive themselves nuts trying to address the variation in customer demand by figuring out the driving factors of customer decisions and mapping out meal selections that minimize demand variation (and would probably watch sales fall off as customers got bored with the same menu repeated over and over).
Neither of those methods, following their own rules, could produce a better solution for this kitchen than the one it has, which breaks almost every Lean rule and might not accommodate the menu changes the Six Sigma solution would drive. Yes, I've tried to think of one, and after two years of volunteering regularly I have yet to devise a practical alternative. It's not easy to get two servers and one cashier to serve four meal choices every 11 seconds.
If the reader is experienced in Lean or Six Sigma, then I will admit one concession to put the reader’s mind at ease. When we apply the Lean or Six Sigma methods to the customer process of selecting and paying for lunch, and not to the kitchen’s process of preparing lunch, then we design exactly the process the kitchen has. If we focus the method on the primary need, minimizing the customer’s time in process in this case, the methods work and we don’t need to break the rules.
Until I made that concession, I bet most of us agreed that the solution did break the rules. It's a fine point, and we can choose whichever side of the argument we prefer. Either way, these examples give us some easy things to look for to know when we need to either break the rules or refocus our methods.
The most obvious is when we try using the rules and they don't work. We apply the rules and our performance measures move in the wrong direction. That's a clue that we either have a problem outside the normal rules' purview or didn't focus on the right problem. A word of warning, though: if our metrics are not focused on the correct problem, we might improve a metric, believing we have made things better, when actually we have not.
The second thing to look for is people-dependent performance. In my experience, when productivity and performance hinge significantly on the performance of people, purely logical and mathematical process models will lie. Instead, make it a habit to challenge those models with experiments that address the people's needs more than the apparent process needs, and see whether addressing those needs doesn't improve efficiency.
The third thing to look for is whether the challenge involves the enemy of the methodology. For example, Lean's enemy is waste. Six Sigma's enemy is variation. Total Quality's enemies are complacency and defective output. We have the wrong method for the problem if we cannot solve it by eliminating that method's enemy. In the kitchen example above, the enemy is not wasted food; at least, that is not the enemy of the primary need.
Since the process is very good at meeting its primary need, eliminating waste or variation or defects may not be important. If we do decide to try to further reduce those costly elements, we must be sure that we do not jeopardize the primary performance, and we may need to break the rules to ensure the primary element’s performance is maintained. In this case, we must accept some thrown away food or defects in the form of customers who do not get their first choice of meals.
As you proceed to improve processes, look for the human elements that might not fit the normal rules, or for conflicts between the process problem and your method's enemy of focus. If those don't warn you and you still end up with improvements that don't work, consider that your improvement rules might not be suited to that particular problem. Step back and see whether breaking those rules yields a better solution.
Stay wise, friends.
If you like what you just read, find more of Alan’s thoughts at www.bizwizwithin.com.