Often we think that increasing the speed of a process means compromising quality, that faster is sloppier. But flow achieves just the opposite: it generally improves quality. We show one defective computer, with an X on the monitor; that one failed to turn on at the test stage. In the large-batch approach, by the time the problem is discovered there are at least 21 parts in process that might also have that problem, and if the defect occurred in the base department, it could take as long as 21 minutes to discover it in the test department. With one-piece flow, by contrast, when we discover a defect there can be only two other computers in process that also have it, and the maximum time it will take to discover the defect is two minutes from when it was made. The reality is that in a large-batch operation there are probably weeks of work in process between operations, and it can take weeks or even months from the time a defect was caused until it is discovered. By then the trail of cause and effect is cold, making it nearly impossible to track down and identify why the defect occurred.
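The arithmetic behind the 21-minute and 2-minute figures can be sketched in a few lines. This is an illustration, not something from the text: it assumes three stations (base, monitor, test), each taking one minute per computer, with work transferred either in batches of ten or one piece at a time; the function name and parameters are hypothetical.

```python
def first_unit_detection_minute(batch_size: int, stations: int = 3,
                                cycle_min: int = 1) -> int:
    """Minute at which the last station (test) finishes the first unit.

    With batch transfer, each upstream station works through its whole
    batch before passing anything downstream, so the first unit waits
    batch_size minutes at each of the (stations - 1) upstream stations
    before the test station spends one final minute on it.
    """
    upstream_wait = (stations - 1) * batch_size * cycle_min
    return upstream_wait + cycle_min


# Batch of 10: a defect made at the base station in minute 1 surfaces
# at minute 21, and by then the base station has started 21 units that
# might share the same defect.
print(first_unit_detection_minute(batch_size=10))  # 21

# One-piece flow: the same defect is caught at minute 3, only two
# minutes after it was made, with just two other computers in process.
print(first_unit_detection_minute(batch_size=1))   # 3
```

The point of the model is that detection latency scales with the transfer batch size times the number of downstream stations, which is why shrinking the batch to one shrinks the latency from 21 minutes to 2.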
The same logic applies to a business or engineering process. Let individual departments do the work in batches and pass those batches to other departments, and you guarantee major delays in getting work done. Excessive bureaucracy will creep in, governing the standards for each department, and many non-value-adding positions will be created to monitor the flow. Most of the time, projects will sit waiting for decisions or action. The result will be chaos and poor quality. Take the right people who do the value-added work, line them up, and flow the project through them, with appropriate meetings to work on integration, and you will get speed, productivity, and better-quality results.