“If you want something new, you have to stop doing something old.”
The last 70 or so years have seen the development and codification of organisation-wide processes to drive sustained improvement. Generally, these improvement processes have been called “total quality”1 and are built on four core principles:
- Focus on work processes
- Analysis of process variability
- Management by data
- Learning and continuous improvement2
Over the years these tools and techniques have been collected together and applied in most organisational settings3. The ability to use them is integral to many job roles across an organisation and has come to be seen as part of everyone’s job. So, what we see in the various task listings – and in particular in the competency listings – is the capability to make improvements. Key here is the nature of the improvement capability that can be built into everyday operational roles, which depends on the scale and cycle time of the variances being managed4. Relatively small-scale, short-cycle variances can be accommodated into most roles fairly easily. Proficiency in making improvements at speed grows with practice5, i.e. through a mix of learning-by-doing over time and acquiring a deeper understanding of the process. Hence, over time, the ability to handle a greater range of variances increases, and the process itself changes as a result of that deeper process understanding.
The longer-run cycles of variance can be captured in several different ways. First, through a group process in which learning and improvements are pooled and captured, and a series of projects is defined and progressed either by the work group itself or by specialist groups. Second, the work group may seek to redefine the whole process, or substantial parts of it – possibly taking a Business Process Re-engineering (BPRE) approach – and so create a new process. In both cases, new work tasks are defined and (re)allocated. A task-level database is often used to capture data about each task together with its supporting tools, techniques and technology. One of the real challenges is relating the tasks to a work process and supporting workflow analysis.
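The idea of a task-level database linked to a work process can be sketched in code. The following is a minimal, hypothetical illustration – the class names, fields and example tasks are assumptions for the sake of the example, not part of any system described in the text – showing how tasks, their supporting tools and their position in a workflow might be related so that simple workflow analysis becomes possible:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: all names and fields here are illustrative assumptions.
@dataclass
class Task:
    task_id: str
    description: str
    tools: list[str] = field(default_factory=list)  # supporting tools and techniques

@dataclass
class WorkProcess:
    name: str
    tasks: list[Task] = field(default_factory=list)  # ordered: the workflow sequence

    def add_task(self, task: Task) -> None:
        self.tasks.append(task)

    def workflow(self) -> list[str]:
        """Return the ordered task sequence, i.e. a simple workflow view."""
        return [t.task_id for t in self.tasks]

    def tasks_using(self, tool: str) -> list[str]:
        """Relate a tool or technique back to the tasks that depend on it."""
        return [t.task_id for t in self.tasks if tool in t.tools]

# Illustrative process with made-up tasks
process = WorkProcess("order-fulfilment")
process.add_task(Task("T1", "Check order", tools=["checklist"]))
process.add_task(Task("T2", "Pick items", tools=["pick list", "scanner"]))
process.add_task(Task("T3", "Dispatch", tools=["scanner"]))

print(process.workflow())              # ['T1', 'T2', 'T3']
print(process.tasks_using("scanner"))  # ['T2', 'T3']
```

Even a toy model like this makes the challenge visible: the task records are easy to capture, but the value comes from the ordering and cross-references that tie them to the process as a whole.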
- Feigenbaum, A.V. (1956) “Total quality control”, Harvard Business Review, 34 (6), 93-101; Crosby, P.B. (1979) Quality is Free. McGraw-Hill, New York; Deming, W.E. (1982) Quality, Productivity and Competitive Position. MIT, Cambridge; Ishikawa, K. (1985) What is Total Quality Control? The Japanese Way. Prentice-Hall, New York; Juran, J.M., Gryna, F.M. and Bingham, R.S. (eds) (1974) Quality Control Handbook. 3rd Edition. McGraw-Hill, New York
- Hackman, J.R. and Wageman, R. (1995) “Total quality management: empirical, conceptual and practical issues”, Administrative Science Quarterly, 40 (2), 309-342
- E.g. in the UK National Health Service, 75 tools and techniques have been brought together in their own internal handbook (The Handbook of Quality and Service Improvement Tools, 2010, 320 pages) compiled by the NHS Institute for Innovation and Improvement.
- Buchanan, D.A. and McCalman, J. (1989) High Performance Work Systems: The Digital Experience. Routledge. 227 pages (see Chapter 5, 76-86, Stuffing modules)
- Rosenberg, N. (1982) Inside the Black Box: Technology and Economics. Cambridge University Press. 304 pages (see Chapter 6, 120-143, Learning by using)