Date: 2024-12-21 Page is: DBtxt003.php txt00009620
LinkedIn Dialog
Burgess COMMENTARY

How do we solve the problem of a domino effect on the activities of companies in a number of separate business sectors where individual businesses within their zones of activity are either unaware of, unconcerned with or not responsible for the others' behaviour? This is what has happened in just two massive-expense cases – the just-ending banking debacle, which has cost more than any other business crisis since the Wall Street crash of 1929, and the on-going health care holocaust, which is already affecting more than a quarter of the world's population.

Technically it's a KM problem of experiential NON-learning and poor decision-making on the part of managers and experts who are making what appear to be good decisions for their own companies but whose combined outputs are punishingly damaging at a much wider level. I call it the domino effect of daisy-chain companies in separate business sectors. It's the butterfly effect gone mad. My KM answer is surprisingly unemployed and so logical that its no-show is astonishing. What's yours?

Go to https://biggernumbers.wordpress.com/yesteryear/

Comments (1)
Peter Burgess
Founder/CEO at TrueValueMetrics developing Multi Dimension Impact Accounting
A good read, and thought provoking. I could not agree more that we should be able to learn from history and this should help us not to make the same mistakes again, and I agree that the churn in employment is an issue that could easily result in the weakening of the enterprise.
But we should also be careful about trying to make history fit the present and future when, for a variety of reasons, things have changed. It has often been said that soldiers fight the current war, the way they wished they had fought the last war ... and something like that happens in corporate management.
Making the best use of Knowledge should be a no-brainer ... but very few people use much of the available knowledge, and especially knowledge outside their comfort zone. Experience helps people to link with relevant knowledge ... so that all fits together.
But the biggest constraint to good performance is a lack of understanding of the metrics that are actually used to measure performance and those that ought to be used to measure performance. The idea that profit and GDP growth are important measures of performance on their own is remarkably silly ... but these are central to almost all policy conversation. Where are the equally powerful numbers about quality of life and the way the planet is being compromised?
So yes ... a good piece to read ... but only a part of a much bigger picture that needs to be understood as well.
The Death of Wisdom: Why Our Companies Have Lost It – and How They Can Get It Back

The Answer ... History is dead! Long live history! The missing link to KM, EL and decision making …..

There is a curious convention following the death of some British and Danish Monarchs. The eight-word proclamation has read: "The King is dead. Long live the King", last properly used in the UK in 1936 when King George V was succeeded by King Edward VIII, who later abdicated and became the Duke of Windsor. Coming from a tradition started in 1422 in France, it signifies continuity, stability and the expected wisdom that comes from Monarchical experience. Unfortunately, such virtues are not reflected in British business and – strangely – many other places.

My purpose in mentioning this quaint ritual is to highlight how a major institution acknowledges the importance of continuity, and how conspicuously that acknowledgement is missing from business education. Continuity is the element of experience that allows experiential learning to take place seamlessly, while the observable hole in business education is the discipline known as business history. As the memoir of how we did business, it is the only portable, reliable mouthpiece that can provide experience comprehensively and – incidentally – cheaply. It is a subject that is widely neglected – and one that could provide an answer to the huge problem of experiential NON-learning presented by the banking crisis and the developing health catastrophe (see https://biggernumbers.wordpress.com/growth-how/).

In addition to being titanic, these examples are particularly difficult to address managerially because poor decision-making spans several business sectors that operate independently and whose combined effects are inter-related and larger than them all by a huge margin. In both cases, individual businesses within their speciality activity were, and/or still are, either unaware of, unconcerned with and/or not responsible for other sectors' behaviour. In health care, for example, the soil sector, food producers, food manufacturers, doctors and the pharmaceutical industries all operate independently of each other, and ALL have contributed to the problem now harmfully affecting the health of more than a quarter of the world's population.

The soil scientists have given farmers a nutrient-deficient way of increasing their production. This critical shortfall has encouraged a food manufacturing industry to use chemical additives as substitutes but – would you believe it – the nutrients used are not all absorbable by the human body. Competition has also encouraged the food manufacturers to introduce into their products sugar, salt and trans-fats – the manufacturers call these additions 'bliss factors' to sell their wares – many of which have hosted a range of opportunistic diseases to fill doctors' surgeries and hospitals. To this double whammy comes the pharmaceutical industry, whose experts then skilfully invent pills for all reasons, extending the life of many, expensively. Metaphorically, the result in the First World has been generations of elderly people dying unkindly on their Zimmer frames. In the Third World, they just die.

However one assesses the outcome of this domino effect of daisy-chain companies in separate business sectors, many major decision-makers and their experts are making poor decisions, remedial actions around which would normally fall into the disciplines of KM, EL and decision-making.
Conventionally, these disciplines are more easily applied to individual companies. But for the evidence-based decision-making that takes their consequences to another level, the established practices are deficient; they don't normally address the bigger issues and, anyway, at a sector level, it's none of their business …. The overview of business history WILL do the job. It will give the next generations of businessmen the perspectives that the current generations lack. Perhaps, then, they'll be able to apply their wares more responsibly.

I know brevity is King when it comes to busy-busy managers but indulge me a few more paragraphs to illustrate how resistant to change the main characters in this dramatic tragedy have been. Over the past half decade I've been preoccupied with how to better apply KM to really big issues, the banking misdemeanours and the healthcare holocaust being the inducement. I've argued that the Cinderella discipline of KM should be able to handle a straightforward problem like experiential non-learning, however it fits in the business pyramid.

When I first took an interest in KM, the discipline was still nine years away from being conceived in its modern format at a Boston conference in 1993. Then, all I was aware of was that so many managers and employees had little historical awareness of their products, their companies or their industry. Without this, they were unable to benefit from any hindsight they might otherwise have acquired so, given that the newly-arrived flexible labour market was stripping companies of their unique organisational memory, I took to offering them a service to produce their corporate histories in a form that was readable and suitable for both induction and in-house learning. It wasn't an easy sell in an environment where the fourth generation of 'educated' businessmen and women were emerging from the newly created business schools.

In quite short measure, the corporate history industry died on its feet, no thanks to the imposed departure of older, experienced hands in favour of waves of new bloods who, literally, thought they knew better. They didn't, because the incidence of repeated mistakes escalated, wheels were re-invented and other lessons went unlearned, all reflected in their more difficult-to-maintain productivity growth scores. For me, it opened up another opportunity, this time to introduce the oral debriefing of important, low-tenure employees to companies who were finding themselves in various stages of corporate amnesia. It was an attempt to give to business what US historian Studs Terkel gave to social history after the invention of the tape recorder. In the same tradition, the US academic Professor Allan Nevins persuaded many US educationalists to introduce oral history as a tool for serious scholarship.

In the process of a near-dead demand for corporate histories and my launching of oral debriefing, I came across the academic world of business history. It, too, was leaving its heyday, with fewer companies wanting to embark on worthy scholastic tomes with handful-sized audiences. As a teaching discipline, the wider subject of business history was unknown except at Harvard Business School, where all first-year students were compulsorily exposed to the subject (apparently only temporarily, as the speciality was dropped in the 2000s). In the UK, the academic stars of the field took flight to the US and Japan, and the discipline locally also almost died on its feet.
Its cousin, economic history, additionally took a dive as it was reduced or subsumed into social history curricula. Over all this time my own specific efforts to persuade academia to introduce corporate history and its bigger brother, business history, fell on deaf ears. To the suggestion, for example, that business schools trying to educate students to enter, say, the textile industry should put several corporate histories of textile companies on their reading list, the typical response was: "We know better." I came away with the feeling that business schools and the business history community were – if I'm being kind – kind of resistant to outside ideas and especially change.

Twenty years later I am of the same opinion towards an even smaller community of business historians, but with the belief that their ownership of a discipline could yet provide a solution to a management problem that has cost – and is costing – us more than any other. For one, it could provide missing perspectives and recall the precedents and vital awareness of past business activity to help avoid tomorrow's banking and health catastrophes, which – as history confirms – will recur without adequate defensive measures. In banking, for example, there are at least a dozen precedents in recent history that could have helped avoid the 2008 crash and provide the evidence to apply workable solutions. Then there are all the other unaddressed business-related fiascos ….

To achieve continuity in a short-tenure jobs environment – which even top decision-makers and experts experience – I would prefer that corporate connectivity come from a good helping of business history to support locally-relevant and conventional KM, EL and decision-making processes. This means that business history needs to be introduced as a prominent feature of business education and training. In many other professions, for example in music, architecture, art, soldiering and politics, there is a generic component of their history in their education. So why not business? That business history is not already an established learning tool – as general history is elsewhere in the academic biosphere – has always been a puzzle. Business is, after all, our flagship breadwinner.

But what if business historians don't, won't or can't put their toes into KM, or KM practitioners don't, won't or can't put their toes into business history? Given the scale and importance of nothing happening, there's always the scarier option – the politicians. Someone has to do it! But hang on, historical precedent points to one other alternative. In the early 1970s, when Japan was on its productivity roll, the then Crown Prince instructed that every faculty of business and commerce should have its own business historian. Perhaps Monarchy does have a role in the business world? What's Buckingham Palace's email address? Ma'am, c.c. The Prince of Wales. Will you front up? Respectfully, on behalf of British business, Knowledge Management. Discuss ….

This is a companion Paper to The Problem on https://biggernumbers.wordpress.com/growth-how/

Growth, how? The Problem

How do you explain this biggest-ever example of experiential NON-learning? And all given to us by EXPERTS in their fields ….. The story goes like this. Doctors know that the more robust the human immune system, the better a person can fight ALL – literally all – illness.
For a fully-functional immune system, the body requires a comprehensive nutritional diet. Yet the nutritional content of most diets falls short of what is necessary, thanks to soil-depleting farming and nutrient-depleting food refining, giving us so-called 'sterile' food. The problem is compounded by the ubiquitous addition of sugar, salt and trans-fats – the manufacturers call these additions the 'bliss factors' to sell their wares – many of which give rise to their own illnesses (obesity, diabetes, high blood pressure, etc) that are drowning health care systems all over the world. Along with other maladies such as HIV and TB – even Ebola – it's a picture that would suggest that, even though medical technology has ensured we're living longer, we've never been more plagued by illnesses kept at bay by costly pharmacology.

In a sensible world, this would have provided decision-making alerts at the outset but NO …… Instead of directly addressing the problem by fixing the food chain, scientists introduced food additives. But – get this – most of the chemical additives used are not fully absorbable by the human body, leaving the immune system vulnerable twice over. It's a food and health strategy that has been stubbornly promoted for years, costing more – both in terms of cash and unnecessary health impacts – than if the food chain had not been compromised in the first place. It's a bizarre contradiction, as if decision-making has taken leave of its senses.

Behind this seemingly intractable scenario, some questions. Are their individual group objectives really more to do with production than health? Do such big issues automatically compromise good decision-making? Are the experts who make these decisions not aware of the effect of their determinations, or are they exempt from normal decision-making processes? Or have specialisation, impressive technological advances and the separation of the many cog-businesses blinded decision-makers to their 'intended' shared outcome? Cog-businesses, otherwise called business silos, are those companies operating independently within the wider fields of food production and health care.

Following two recent occurrences, I am coming down on the side of the latter, which has blurred the grander goals of the larger group. The first is my reading of a book that has had the same effect on me as Rachel Carson's 1962 book 'Silent Spring', which blew the lid off synthetic pesticides and much of environmental science. The second was a conversation I had with the owner of one of our LinkedIn groups who thought a post I had published had a tenuous relationship with his speciality subject.

First, the book. In what seemed like a perfectly ordinary personal memoir, physician Geoffrey Douglas's beguiling volume 'Life is a Fatal Illness' (Olympia Publishers, London, 2015) outlines one of the biggest examples of experiential NON-learning I've come across, the other most recent being the just-past sub-prime banking debacle.
Trashing the supposed 'informed' decisions of experts ranging from soil and food scientists to health officials and organisations like UNICEF, the World Health Organization (WHO), the World Food Programme (WFP), NGOs (non-governmental organisations) and Médecins Sans Frontières (MSF) – even national governments trying to altruistically support sustainable development – he writes: "The truth is that health professionals may be knowledgeable within their narrow area of expertise, but this does not mean that they know best, and they are often wrong."

Douglas, an Oxford-trained doctor, recently knighted, has the distinction of diagnosing the first case of HIV in Swaziland, a country that now has the world's highest rate of the disease. His reflections around food production and health care are brutal, bizarrely predicated on a working lifetime almost entirely in Third World Africa, but with many of his observations having a perceptive relevance to First World medicine. They include:

* Modern farming methods have conspired to maximise yields at the expense of nutrient content.
* The art of medicine has been usurped by pharmacology's response to a foisted mindset of a 'pill for every ill'.
* More drugs are dispensed for the management of disease than for its cure.
* Even when a person consumes adequate calories and protein, if they lack one single nutrient – or a combination of vitamins and minerals – their immune system is compromised.
* Fortifying or supplementing a defective diet with micronutrients in the form of chemical isolates is now commonplace, despite a plethora of scientific evidence that they are poorly absorbed, rarely act in the body in the way intended and, in some cases, may even be toxic.
* Vitamin and mineral deficiencies account for an estimated 7.3% of the global disease burden, making them the most widespread health problem in the world.
* Malnutrition compounded by repeated bouts of infectious disease causes an estimated 3.5 million preventable child deaths annually.
* What has been done to our soil and our food in the name of progress is nothing short of criminal.
* And much, much more.

Don't the experts know any better? And if not, how come they're making decisions of such import and impact that contradict better evidence? And given that much of it is historical, it confirms that not much experiential learning is taking place. To mix a metaphor with business terminology, the experts have, literally, put more of their technological energy into downstream activities. Had they done the upstream right first, the cart would be travelling, as it should, behind a much healthier horse.

The LinkedIn group owner's coincidental complaint that my post was tenuous came next. My response was that there was a close association between Knowledge Management (KM), Experiential Learning (EL) and decision-making, which were referenced in my post. I further suggested that business history and corporate history were similarly part of the wider discipline. Although I didn't mention it then, I usually go even further: that the 'package' also has specific relevance to Human Resources (HR) and, not unimportantly, business education. Neither has a good handle on either KM or EL, the former needing to incorporate overlooked tacit knowledge into its orbit and the latter required to adapt its 'one-size-fits-all' teaching of decision-making to include similarly neglected organisation-specific intelligence.
And this, because the flexible labour market has removed employee continuity and knowledge sharing from organisations' traditional decision-making practice. Although the issue of being tenuous on a dedicated LinkedIn group might appear to be unimportant, it did provide me with the idea that related groups were probably equally resistant to each other's subjects. If so, my suggestion is that the separation and specialisation of individual cog-businesses – just like the separation and specialisation of business disciplines – are depriving them of importantly related and wider knowledge and experience. Furthermore, their separation encourages competition that, whilst healthy in most situations, imposes a culture of secrecy, non-communication and defensiveness, even when it comes to critical issues of public health.

At a different level, the silo effect within health care can be clearly seen in the fact that doctors are not generally knowledgeable about nutrition, a focus that is absent from most physicians' education. Yet, even with the oft-made suggestion that better-nourished patients always respond better to dispensed drugs, I am personally aware of a collegial conversation with a large pharmaceutical company that elicited the surprising information that no such clinical study has ever been undertaken to support this single most effective sales stratagem for their products. Is there a better example of cog-business disconnect?

As such, their non-alignment, whilst valuable for their own development, would be helping to disassociate them from awareness of food production and health care's loftier aspirations, which would account for the Douglas suggestion that the experts don't always know best. This, incidentally, would be in addition to the organic knowledge loss – known as corporate amnesia – that typically emanates from high staff turnover in today's highly flexible labour market. Together they contribute to the marked disconnect with the wider picture and the absence of any proper Experiential Learning, which, for exactly the reason that the marketplace and workplace have changed dramatically, should be closely integrated with the process of decision-making.

With this scenario dogging food production and health care – and coincidentally the rest of the similarly fragmented sectors in other areas of commerce and industry – it is clear that the teaching of decision-making has not evolved to the same extent as the more diverse and unrecognisable workplace. An explanation, perhaps, for these sectors – just like banking – being too big for their constituent parts to be allowed to fail? If so, they've been allowed to develop at the expense of much of society, irrespective of the health impacts, creating an iceberg-like reality where both rich and poor nations are starving through nutrient deficiency. It's fashionably called Hidden Hunger, which affects an estimated 2 billion souls worldwide (The World Food Programme, 2007), fully 28% of the world's population.

As current practice insinuates, I suggest that the stand-alone approach to individual cog-businesses flaws the way we do business, just as at least one KM group wants to ignore EL alongside history, HR and decision-making. Of course, each cog-business and their management disciplines should have the opportunity to develop optimally, even be encouraged to argue their own corner.
But no thanks to the way the evidence for making decisions is generally short-changed in today's new marketplace and workplace, the specialisation and separation of the component parts of the evidential mix make it more difficult for the left hand to know what the right hand is doing – or rather, should NOT be doing. With a generous helping of hubris, all this is not helped by traditional lead decision-makers who, by the very nature of others' fast-moving specialisations, are often unqualified to broker the right outcomes and whose seniority often goes unchallenged. Consequently the real reason for the speciality domain's existence – better overall decision-making – gets disengaged. Just like EL without KM, without history, without HR and without good decision-making. The way we're taught to make decisions has to adapt, if only to keep billions of us out of doctors' surgeries and hospitals, the cost of which raises the inevitable question: Is it sustainable?

The companion Paper to this article – The Answer – is at https://biggernumbers.wordpress.com/growth-how/

+ The author notes that he has a family member in the food industry. His own overlapping interest in nutrition and health care is coincidental and professionally associated with management concerns around corporate amnesia and experiential learning.

Dave and Ed (the UK's Prime Minister and opposition leader) should stop pretending they can stop immigration to Britain. Read who CAN ….

Immigrants are taking jobs from British workers. They are prepared to work for less money. They are doing jobs that British workers are not prepared to do. They burden public services such as housing, health care and schools, and much, much more. These accusations, fuelled by floods of illegals and lawful others from EC countries, have been tossed around for years, and are now politicised to the extent that the issue had a hand in determining the current government and may well define the shape of the next administration. All the main political parties have had to address the matter directly but there is one aspect of the debate that gets little or no public attention; it's seemingly politically incorrect and therefore too hot to voice too loudly. It's that too many British workers are less productive than equivalent foreign workers, a charge that brings with it some electorally unacceptable allusions that cast aspersions on many British workers' abilities and their commitment to work. Said differently, the unarticulated supposition is that many immigrants simply make better investments in the workplace.

However one measures it, productivity is first and foremost a business issue. The fact that it has been hijacked by politics through immigration reflects the reality of a quite separate outcome that is one of those examples of the 18th Century economist Adam Smith's invisible hand of an unintended consequence. Essentially, if the occupational output of British workers were higher, fewer 'more competitive' operatives could (or would want to) crowd the workplace, making politicians, many of whom are seemingly trigger-ready to blame the less responsible constituency, the WRONG party to help solve the problem. Controversial as this thesis will seem, it is an argument that confirms the classic economic model of how supply and demand operates in a healthy market economy that is now EC-franchised. As such, it is business that has to step up, take causal responsibility and do something more – much more – about it themselves.
And – not un-coincidentally – FOR themselves, for higher productivity would automatically provide higher profitability. For immigration is the symptom, not the cause. While most of immigration's ire comes from the trades quarter of industry, there have also been significant imports of the tertiary professions; but even though their combined percentage presence in the UK is still relatively small, the strident issue still goes to the heart of the British workplace and the nation's underlying wealth.

The assertion that British workers are less productive than many of their foreign counterparts is not a fairy tale. Nor is it new, the so-called 'British disease' having long historical roots. It refers to the country's low industrial productivity and the frequent labour strikes that plagued Britain in the 1960s and 1970s, when labour productivity lagged the US's by 50% and West Germany's by 25%. Improvement began in the 1980s when the UK started to employ technology and introduce competition policies. Comparatively, though, productivity has continued to drag its feet behind its major competitors. This can be easily substantiated through international statistics. In fact the data has been evident since WW2, with spikes. Throughout much of this time Government ministers have been quietly concerned, the latest distressing performance showing that worker output per hour fell a humiliating 3% during the recession period 2007-2013, the worst outcome among the Group of Seven leading high-income countries (Office for National Statistics, October 17, 2014). In the same period the equivalent output in the US rose by 7.6%, in Canada by 2.7% and in France by 1.3%. Even in Italy, it fell by just 1.3%, and at a time when the British economy has been recovering strongly.

With the figures indicating that the productivity shortfall is variable across individual industry or skill groups, it is still a conundrum that many of the UK's serious newspapers report is puzzling the experts big time, no doubt aggravated by other statistics that say the average British worker works longer hours than operatives across the European Union (Office for National Statistics, December 8, 2011). Their confusion must deepen even further when the same research shows that the country with the shortest hours worked is Denmark, which also shares near-top status in the world productivity league.

The measure of efficiency

Widely misunderstood in the UK (puzzling for an otherwise educated and sophisticated industrial economy), productivity and productivity growth are key to defining underlying national wealth and individuals' living standards. Their meaning can be better understood by the imagery underlying the classic question of how many people it takes to change a light bulb; productivity is the measure of efficiency, anything less being unnecessarily more expensive and less competitive. In the UK's case, lower productivity broadly explains why many developed countries are more prosperous, as it inhibits employers' ability to improve salaries.

The seriousness of the UK's productivity deficiency has been spelled out by Jamie Murray, the man who headed the independent Office for Budget Responsibility's forecasts of the UK economy's supply potential before he moved to Bloomberg Economics in London. If weak productivity growth continues alongside the current austerity drive, he says, it will take until 2028 – a decade later than expected – to restore balance to the public finances ("Ten more years of borrowing if …", The Telegraph, December 15, 2014).
This is one of those estimates that blur the sensibilities so, on the basis of the stated productivity rate against our main competitors and Mr Murray's forecast, I invite one of you actuaries out there to calculate the number of unnecessary man-hours that will have to be worked to pay off the public deficit. Or even the sterling value of the additional cost. My prediction is that the numbers will astonish. It's expensive to be unproductive!

It is instructive that most of the historic effort directed at improving this widespread malady has come from Government-initiated macro policies such as improving infrastructure, giving tax breaks to research, continuing to make available better education/training and then privately cheering when sterling's exchange rate goes southwards because it encourages exports. In the private sector there have been a rising number of foreign takeovers, the British car industry being a prime example of productivity turnaround with the introduction of non-British – among them Japanese, German, American, Chinese and Indian – management, all using the same British workers as before. These examples are more broadly validated by official statistics showing that average value added per foreign-owned business was higher than for UK-owned businesses, regardless of employment size; indeed the 1% of such companies in Britain contributed 28% of the business economy (Office for National Statistics, 2013). There are good examples of British companies enjoying good productivity, John Lewis being one, but all exceptions aside, the wider problem has remained, with the long-time Government-initiated fiscal approaches to productivity improvement being more sticking plaster than cure.

The introduction of foreign management is instructive in one fundamental sense: it has made British workers more productive, which suggests where the problem may lie, at least in part. If so, this raises the question of what else to do – continue the customary top-down strategies, which would confirm that the problem continues to be considered exclusively monetary, or also directly address the underlying administrative and cultural deficit, which would mean that part of the answer is down to workers themselves? Whilst the latter would help to explain why it would be uncomfortable to accuse politically-active voters of being slouches, less capable than their overseas peers or, now, the ultimate insult, of being racist, there is another, albeit linked, party that has yet to rise to the occasion.

Managers have to step up

Consider the words of the late management guru, Peter Drucker: "It is only managers – not nature or laws of economics or governments – that make resources productive." (Managing in Turbulent Times, 1980). On this basis, there is an unexploited bottom-up remedy for low productivity, which can be justified by the previously mentioned unarticulated characteristic of the British worker that the poorer productivity statistics confirm. This is that Britain's eight million managers and their 30 million subordinates are not particularly good at learning from experience, specifically their own employers' experience, a trait that is supported – certainly anecdotally – by the pandemic of difficult-to-calculate costs of repeated mistakes, re-invented wheels and other unlearned lessons that litter the workplace and the deafening promises of "We must learn the lessons".
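For what it is worth, the shape of the calculation invited above can be sketched in a few lines. This is a minimal, purely illustrative back-of-envelope in which every input (workforce size, hours worked, the productivity gap and the value of output per hour) is an assumption rather than a sourced statistic:

```python
# Purely illustrative back-of-envelope of the calculation invited above.
# Every input is an assumption chosen for illustration, not a sourced figure.

uk_workforce = 30_000_000        # assumed number of people in work
hours_per_year = 1_700           # assumed average annual hours per worker
productivity_gap = 0.16          # assumed shortfall versus trend/competitors
output_per_hour_gbp = 30.0       # assumed value of output per hour, in pounds

total_hours = uk_workforce * hours_per_year

# Output = hours x rate, so matching the same output at a rate reduced by the
# gap requires hours / (1 - gap) instead of hours.
extra_hours = total_hours / (1 - productivity_gap) - total_hours

# Sterling value of the output forgone each year at the assumed hourly rate.
lost_output_gbp = total_hours * productivity_gap * output_per_hour_gbp

print(f"Unnecessary extra hours per year: {extra_hours / 1e9:.1f} billion")
print(f"Annual output forgone: £{lost_output_gbp / 1e9:.0f} billion")
```

Even with these rough assumed figures the answer lands in the billions of hours and the low hundreds of billions of pounds a year, which is the order of magnitude the author predicts will astonish; a proper actuarial version would substitute sourced values for each input and set the total against the public deficit and Mr Murray's 2028 horizon.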
These corporate missteps, more like a clumsy ballet, contribute generously to the wider cost of wasted productivity; in 2005 Proudfoot Consulting put the number at 7.5% of GDP. Adjusted for inflation, this percentage would total £113 BILLION for the four quarters to end-September 2013, more than the total revenue expenditure of the National Health Service for roughly the same period. If, then, experiential NON-learning is such a high contributor to low productivity, experiential learning must be an answer, an approach that, in order to make a difference, requires the alliance of managers and education, both of which have overlooked this huge area of productivity shortfall.

It concerns a defined and important area of knowledge loss arising from the actively pursued flexible labour market, which has been in train for around 40 years, coincidentally the exact period of time over which most of the Government's remedial measures started out in earnest. Flexible working's known advantages aside, short tenure and jobs disruption have been responsible for constantly disconnecting individual employers from the unique knowledge and experience that follows the rolling generations of exiting employees, including managers, who are being replaced at the average rate of 20% a year, higher in the US. Without this important component of their own intellectual capital, new bloods – even remaining employees – cannot fully experientially learn. All that transpires is that employers become wholly dependent on the unfamiliar knowledge and experience of replacement staff, displacing the accepted way most progress occurs in organisations – organically, i.e. from the building of one experience on another. Without access to institutions' own knowledge and experience, generations of managers and their employees have only half the evidence with which to work. Such short measure, the result of what's called corporate amnesia, is no friend of good decision-making or productivity – however competent the replacements.

It's the brain drain on an industrial scale

At the wider national level, it was (and still is) known as the brain drain when highly trained people emigrated. Now the same brain drain is functioning on an industrial scale without knowledge having to leave the country, and with nary a peep from anyone in authority – until now. Vince Cable, MP, the UK's Business Secretary, has at last conceded that the flexible labour market may, indeed, be "too flexible" and that it "was" contributing to low productivity (Resolution Foundation, May 13, 2014). Then, just as 2014 turned, the Confederation of British Industry (CBI) agreed that the role of productivity rested with "business on the ground" and that some outdated assumptions around flexible working should be "challenged", albeit in the context of issues like the work/life balance and child care (A Better Off Britain, CBI Report, December 30, 2014). Instructively, the report does venture into the political arena, but only by referring to low productivity's effect on wage restraint. It says bluntly that politicising the Low Pay Commission through calls for a higher minimum wage would do more harm than good and it suggests that government needs better data and understanding of the realities of productivity.
Raising the separate idea that being a productivity straggler needs even more evidential support, it adds: "This is not a political challenge – a social partnership is not what we need – it's one about helping with cold, hard facts on productivity trends, challenges and sectoral patterns to raise understanding and build consensus."

My thesis on the iceberg-like downside of short-tenure working is not intended to diminish the many other, more conventional theories for the UK's lowly productivity performance, such as references to the country's short-termism (Kay Review, 2012), its investment ratio-to-GDP (data set 1980 to 2018, International Monetary Fund, May 2013) or even the effect of management attachment to bonus payments based on share price performance ("The Road to Recovery", Andrew Smithers, Wiley, 2013). My knowledge-loss argument might also appear to fall down through the apparent contradictions that there are more productive nations with higher percentage rates of wasted productivity than the UK (Germany, the US) as well as greater staff turnover (the US), but the point here is that these countries STILL manage to be more productive than the UK. It's a quality that comprises the informal ability to better learn from experience, which Britain's immigrants carry with them and which is further enhanced by the fact that they necessarily need to be more productive to survive outside their home environments. The US, a nation of recent immigrants and the stellar productivity achiever, is a practical example of this. But overall, there IS a consensus that the UK does have significantly poorer management skills (Department for Business, Innovation and Skills report, July 2012).

By way of the deafening echoes of institutions needing to learn their own and others' lessons and the ill-disposed part of learning from experience being seemingly innate, there IS a British explanation. I have witnessed a curious and widely-based attitude that persists around managements' disparagement of business/corporate history, making such an outlook resolutely cultural. It's an observation that can be confirmed by the decimation of the UK's academic business history activity, the widespread falling-off of economic history's use in business schools, the uncoordinated use of others' case studies and the poor teaching/appreciation/application of history generally. Strange for a country that reveres everything historical except for the way it makes its living ….!

Then, alongside the commonplace promotional stratagem that the best cook in the kitchen makes the best manager, experience has shown that constant organizational change is always difficult, so one of the best ways to make it happen is to change bodies on the ground. It is a strategy that fits rather well with the flexible labour market's easy-switch HR facility (and which, inconveniently, has been taken up with gusto by employees as a way of boosting their salary and experience). It also happens to be a vote of no-confidence in in-situ staff and imported training, which – in a context where Experiential Learning (EL) is important – becomes equally explicable where remembered experiences are absent and/or where individuals are not taught how best to benefit from hindsight.

More management muddle

This is associated with another – this time a seemingly altruistic – belief that just because someone leaves a company, the knowledge they take with them doesn't disappear.
It will be used elsewhere, so there is no real loss; what goes around, then, will come around – my riposte being, yes, eventually, but for immediate application, only if you rehire the departee. Generosity is then compounded by the mistaken belief that the imported experiences of replacement employees will automatically substitute; institutions, unhappily, don't take account of new replacements' unfamiliarity with their new employers' unique and often subtle circumstances. In actuality, whether the change process is conducted by 'oldies' or new bloods, being mindful of the individual corporate past allows change to be coordinated within a more familiar, and therefore more relevant, context.

Management muddle doesn't stop there. Their understanding of how one best learns from experience gets further confused by the reality that however employees exercise the discipline – informally via unreliable and/or absent organizational memory (OM) or formally through considered reflection – EL can only work in today's flexible labour market if individual institutions capture their unique knowledge and experience before it walks out of the front door. In today's British workplace, few institutions bother to share or capture this most valuable constituent of their intellectual capital with the intent of passing it down their churning generations. With such evidence being so important to the quality of decision-making – and therefore productivity – its oversight as a huge source of quality knowledge by commerce, industry and education is puzzling in the extreme.

In truth, the compounded forfeiture of institute-specific knowledge over the modern era of flexible working is astronomical, with explanations varying from decision-makers themselves being part of the flexible working community ("we don't have to care any more") to a proprietary culture that discourages the sharing ("it's my knowledge and mine alone") of a product paid for by host organisations and created within their walls. It's an attitude that flexible working itself spawned after the era when it was unusual to have more than one or two employers in a working lifetime. Revisiting this pre-1980s culture – in-house knowledge sharing, not long employer tenures – would be the first stage in helping to solve the UK's productivity problem. Such is the state of the workplace in the UK and, not un-coincidentally, of many of those other low-productivity countries which also sport high labour turnover and education systems that exclude business-related history as an evidential source of knowledge.

So, what to do? HERE'S THE ACTION PLAN ….

In the UK, the complement to the top-down Government approach is to upgrade existing Knowledge Management (KM) processes to provide in-situ managers, their subordinates – and importantly new appointees – with a FULL supplement of employers' hard-won and expensively-acquired knowledge and experience that would otherwise go walkabout. This could include specialised oral debriefings of the non-explicit 'how' of know-how, the unique and more practical non-technical component of how an organisation gets things done (otherwise called tacit knowledge) that is normally unwritten, unspoken, unshared and, of course, portably capable of going AWOL. To enable knowledge sharing in the first place, it would be practical for employers to include the obligation of knowledge capture in their employment contracts with staff.
Then, alongside a dedicated knowledge capture programme, employers would need to include in their training courses formal Experiential Learning (EL) processes to enable managers and their subordinates to better apply hindsight. This would help to fill in the short- and medium-term knowledge gap of organisations, leaving the long-term gap to be filled by the most useful of all induction tools – a well-constructed corporate history. At an even wider level, the same corporate history, along with more modern business history, could be introduced into the curricula of business education, so new entrants to commerce and industry could be familiarised with their expected business-related activity over the next 40-odd years.

The urgency

For another verbal image of the problem and the solution, it's a way of getting from "A" to "B" without going via "Z" – and, I argue, the more obvious way to improve a rooted productivity-averse culture. Peter Drucker, the management visionary who puts the responsibility for productivity firmly on the shoulders of management, also said back in the 1980s that the urgency of the productivity challenge was great. The country that did this first, he said, would dominate the twenty-first century economically. The fact that the issue of productivity in the UK has moved out of business into immigration and then to politics suggests a level of carelessness undeserving of a long-tolerant society. If so, is this not a penalty for the world's oldest industrial economy squandering its in-built inheritance that, if it was more proactively captured and applied, could have given – and could still give – British workers their innate competitive edge?

History confirms that British business is falling short in how we earn our living. Accordingly, it is time for managers to find the time to raise the important issue of productivity above the defensive approaches of reflex cost cutting, stick-wielding target setting and engrained short-termism. If the experts are no less puzzled at what's happening in the UK, perhaps a frustrated Top-Down could do things to encourage Bottom-Up? Or Bottom-Up might take the initiative to do what Bottom-Up should be doing anyway? Without it, the UK may well get its GDP to grow but the cost of it would only continue to make more productive and better experiential learners – i.e. "those foreigners" – more attractive. After all, managers will just be doing their job. The bigger question is: Are they up to multi-tasking to make our own nationals more productive? The 21st Century is already a decade and a half old, so Peter Drucker's 30-year-old prognostication is a little late for starters – but the millennium is still young. More pressing is the wider political fallout of the impending general election …..

More dark days for UK productivity and STILL nothing of substance is being done about this elephant in the room. Here's one unconsidered explanation for the problem – and the solution….

In his worrying critique of the UK's productivity decline since 1999 ("Hope for the best on productivity, but prepare for the worst", November 13, 2014), Martin Wolf, the influential economic commentator of the Financial Times, paints the picture darkly. In the second quarter of 2014, output per hour was a staggering 16% lower than if the trend rate of increase from the start of 1999 to the end of 2007 had continued. The picture during the 2007-2013 period – the really dark days – was worse.
Output per hour fell a humiliating 3%, the worst performance among the Group of Seven leading high-income countries. In the same period US output per hour rose by 7.6%, in Japan by 4.8%, in Canada by 2.7%, in Germany by 1.8% and in France by 1.3%. Even in Italy it fell by just 1.3%, confirming that hard times improved competitors' work rates while the UK's got worse, a conundrum that many of the UK's serious newspapers suggest is puzzling the experts. At best, does this not indicate that something is dramatically wrong with the way we're doing business?

By way of clarification, Martin Wolf's reference to productivity is a two-edged sword. One edge is the output produced per unit of labour, usually reported as output per hour worked or output per employee. Productivity growth – the other edge – is the increase in output not attributable to inputs such as labour, capital and natural resources, and is driven by technological advances and/or improvements in efficiency, the latter being the decisive factor through which decision-making drives profitability and competitiveness. It's the supervisory end of the management foil.

For the UK's lower productivity, consider this still widely UNACKNOWLEDGED explanation that falls squarely into the important area of efficiency: that UK workers across the board, and that includes its decision-makers, are less capable of learning from experience, particularly their own employers' experiences. Then consider the following underlying reason that doesn't appear to be affecting other countries' workers as much: that the flexible labour market's continual workplace displacement and disruption, the absence of knowledge sharing between the generations and the questionable ability to experientially learn at the individual organisational level have imposed an extensive NON-learning culture. It is arguable that this toxic combination is even responsible for much of the UK's productivity decline since flexible working was introduced in the 1980s, the compounded impact of which has inflicted widespread corporate amnesia, where the ability to organically progress – i.e. build one experience on another – has been severely compromised. The bottom line is that stop-start at the rate of the UK's workplace turnover, plus its inability to better experientially learn, has soured both productivity and productivity growth.

Because the rate at which productivity rises is the chief determinant of the standard of living, Martin Wolf properly warns that lower worker output has big consequences. Yet, he notices, politicians, policy makers and business are largely ignoring the implications of this collapse. If one were needed, such an implication would be the not insubstantial price of wasted productivity. With experiential NON-learning contributing greatly, its cost across a selection of OECD countries – most of which are G7 and include the UK – is between 5.9% and 9.7% of GDP (source: Proudfoot Consulting, 2005). Because of the subsequent recession, the impacts today are likely greater, confirmed by the number of times those immortal words "We must learn the lessons" echo around the workplace.

That's the effect at the macro level. Consider the outcome at the coal face by imagining EVERY job in a company, including top executive positions, changing every four-to-five years and those same individuals taking with them the academic estimates of up to 90% of the organisation's acquired knowledge and experience ….. Oh I'm sorry, it's already happening. And it's been going on for more than 30 years!
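The compounding at work in that coal-face picture can be made concrete with a toy calculation. The tenure and the 90% figure echo the estimates quoted above; the hand-over rates are assumptions chosen purely for illustration:

```python
# Toy illustration of how organisation-specific knowledge compounds away under
# staff churn. Hand-over (retention) rates are assumptions, not measurements.

tenure_years = 4.5      # the four-to-five-year job tenure mentioned above
period_years = 30       # roughly how long flexible working has been in train
cycles = period_years / tenure_years   # complete staff changeovers in the period

for passed_on in (0.9, 0.5, 0.1):   # share of unwritten knowledge handed over each cycle
    remaining = passed_on ** cycles
    print(f"Hand over {passed_on:.0%} per changeover -> "
          f"about {remaining:.2%} of the original organisational memory left")
```

On these assumed numbers, even a generous 90% hand-over at every changeover leaves only about half of the original organisational memory after 30 years, and anything approaching the quoted loss of up to 90% per departure leaves effectively nothing, which is the corporate amnesia the article goes on to describe.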
To more efficiently offset the effects of high workplace discontinuity is not rocket science. Two things are necessary. Firstly, businesses and other types of organisation need to ensure that they capture their more important exiting knowledge and experience before it walks out of the front door. To be comprehensive, this should include the organisation's non-explicit know-how – it's called tacit or cognitive knowledge – that typically doesn't get recorded in the institution's data banks. It is the non-technical "how" of getting things done, what has been called "operacy" or "techne" (Greek for "skill"). Buried in actual experience, much of it is implicit, ambiguous, certainly esoteric, and acquired largely by experience that is functional and, in its most instructive forms, context-, co-worker- and institution-specific. This unique constituent of intellectual capital can be described as the grease gun that lubricates what's in those data banks. Without it, there's little effective sharing – and even less opportunity for learning. It's an activity that falls squarely into the emergent discipline of Knowledge Management (KM).

After capture, new bloods then need to be able to APPLY their predecessors' knowledge and experience (and their own) to their new employers' new circumstances and environment. This is the discipline known as Experiential Learning (EL). What typically happens in existing classrooms is a one-size-fits-all approach to decision-making that overlooks the current employer's past experience. In truth, the capture of an employer's OWN knowledge and experience allows decisions to be made using more relevant and contextual evidence that is already adapted to its USP. Ipso facto, workplace discontinuity becomes more seamless and the knowledge gap is both closed and improved – as should be the decision-making skill set once business education teaches how to better apply the enlarged evidential base.

Knowledge loss is not only pertinent to top decision-makers. Its effects are visible at many other corporate levels. In one example from my own experience, the secretary of a departmental head in a major pharmaceutical company was a big-time knowledge owner and knowledge sharer ahead of her departure.

In actual fact, both business and business education have neglected the downside effects of flexible working for more than 30 years, forgetting that institutions are ALL different and believing that the imported experiences of replacement employees will substitute. They are and it doesn't. However smart those replacement employees are, they typically have only half the available evidence with which to work. Belatedly, there is at least some recognition of the problem, although no evidence of how it is to be addressed. Earlier this year, Vince Cable, MP, the UK's business secretary, conceded that the flexible labour market may be "too flexible" and that it was indeed contributing to low productivity (Resolution Foundation, May 13, 2014). Given this admission, I would want to alert him that his substantial efforts to encourage employment are then at risk. Without productivity, surely he must be aware that competitive pressures will just make his Cable-initiated employees – and more – uneconomic. Martin Wolf has some sensible stratagems for Government, politicians and the independent Bank of England but the headline above his challenging commentary is hardly optimistic.
My suggestion is that while the policy makers ruminate, hopefully not for another 30 years, business and business education should address the problem themselves. The late management guru Peter Drucker said it better: "It is only managers – not nature or laws of economics or governments – that make resources productive."

At last, someone in the British Government has admitted that our flexible labour market is not all it's cracked up to be. So, what to do?

It's taken several decades to admit and I suppose late is better than never, but the warning by Vince Cable, our Secretary of State for Business, Innovation and Skills, that the flexible labour market "may" be too flexible at least flags up awareness of the problem that, in his words, "undermines the incentive to be more productive." (http://www.theguardian.com/politics/2014/may/14/british-labour-market-too-flexible-vince-cable)

The problem with his understanding – that the flexible labour market has led to wage restraint – is that he's completely missed the other, related and more impactful, reason for our productivity shortfall. It is the endemic phenomenon known as corporate amnesia, the disappearance of organisation-specific knowledge and experience that walks out of the front door every time an employee moves on. In today's walkabout workplace, this is happening on average EVERY four to five years in many developed countries, even quicker in the US. It's evident across the corporate hierarchy, even at the top of the ladder. Few companies make any effort to capture and share their departing knowledge and experience, thereby having to depend almost entirely on the unfamiliar practice of external appointees.

Whilst incoming knowledge is not necessarily all bad, the extent of employee churn is such that it has affected a fundamental paradigm of learning and progress. What business has done is upturn the universal dynamic of how best to grow – by the building of one experience on another. Simply put, internal organic has largely been replaced by external non-organic. Decision-making is being based on someone else's practice, without much consideration for one's own unique environment, circumstances and tried-and-tested way of doing things. Compound the effect since the early 1980s – when flexible working started out (now around six complete staff changes) – and the retained level of an organisation's own unique knowledge and experience is homeopathic. The result? The pandemic of repeated mistakes, re-invented wheels and other unlearned lessons that litter the modern workplace – and slower progress and lower productivity. It is arguable that the increase in skills that flexible working was supposed to bring to the table is inconspicuous.

What Mr Cable is overlooking is that flexible working is going to be extremely difficult to rein in. After being in place so long, it is now an embedded cultural norm in the workplace and will react exactly like the proverbial super tanker. Underlying discontinuity will remain disruptive and institute-specific knowledge loss will continue – unless the mentioned problems of corporate amnesia and experiential learning are also addressed. For the former, organisations will need to ensure that one of the most valuable elements of their intellectual capital – their knowledge and experience – is captured and shared with transiting generations of employees. For this, Knowledge Management needs to up its game. Alongside this, the same employees will need to know how best to experientially learn.
As a collaborative effort, it needs a champion. What about you, Mr Cable? Government has been responsible for encouraging the flexible labour market. Why should it not have a hand in solving the problems it helped to create?

OK, UK efforts to awaken the recovery seem to be going well, but what's being done to tackle the 'elephant in the room'? The 'elephant' is errant productivity, without which the only way competitiveness, sales, profit and growth can survive is through a higher cost of living at home and, so that we can export, a lower currency exchange rate. The labour market is recovering, but the latest international comparison by the Office for National Statistics shows that the day-to-day output of British workers has dropped by more than in almost all the other G7 nations since the financial crisis. The UK beats only Italy in terms of productivity changes since 2007, with output per employee still dropping. Officialdom is said to be baffled. In truth, productivity has been a British bugbear for much longer.

So why are British workers unproductive? I'm sure correspondents will come up with a whole range of explanations but, I maintain, not the one that is ACTUALLY encouraged, largely unacknowledged and – accordingly – unaddressed. Simply put, British employers are not very good experiential learners, especially at learning from their own knowledge and experience, sadly evidenced by the deafening chorus throughout commerce and industry of "We must learn the lessons …". And the underlying reason for this is tucked away in how employers and the business education system deal with the iceberg-like effects of that modern workplace phenomenon, the flexible labour market.

Bottom line, flexible working has given us short jobs tenure. Remember when employees had one or two employers in their working lifetime? Well, the average today is around eight. In the US, for example, the median tenure across ALL the Fortune 500 companies is just three years and eight months. That's across the board – all grades, including managers. Even allowing for the fact that not everyone leaves at the same time, this means that each company has an institutional memory that does not go back much beyond 2009. Getting back to the bottom line, all this churn has given employers a discontinuous and incomplete institutional knowledge base that constrains much of the ability to learn from their own, unique, experience. Given that most progress is organic – i.e. organisation-specific and dependent on the building of one experience on another – this has delivered an increased rate of repeated mistakes, re-invented wheels and other unlearned lessons, all of which contribute to lost productivity. For an indication of what this is costing several developed countries, the number that Proudfoot Consulting puts on this delinquent figure is between 5.9% and 9.7% of GDP, and that was before the latest crisis (the estimate relates to 2005).

The knowledge-loss component of this I call corporate amnesia. Without an intimate awareness of one's employer's own knowledge and experience, experiential learning is restricted to the imported knowledge and experience of replacement employees. Their contribution to the decision-making process becomes the business equivalent of a seedbed of otherwise healthy plants WITHOUT any suitable top-dressing.
For top-dressing read good Knowledge Management (KM) and its handmaiden Experiential Learning (EL), the mechanisms through which employers' hard-won and expensively acquired knowledge and experience can be inherited – and then applied – by rolling generations of employees. In truth, employers do little to adequately capture the valuable intellectual capital that walks out of their front door on a regular basis, while business education's contribution to teaching decision-making largely overlooks the formal discipline of experiential learning. While short tenure and productivity pressures affect ALL economies that practise flexible working, it will be the better experiential learners that inherit the mantle of being progressive. The UK has moved from recession to recovery, but not from recovery to strength. The elephant needs attention; otherwise a lot of good work – and pain – will be wasted.

The importance of being "self-sustaining"

Even the UK's incoming Governor of the Bank of England, Mark Carney, has had his say on the subject, albeit more obliquely, by counselling that growth should be "self-sustaining". By this he was saying that the quality of production should not be exclusively dependent on the Government measures that have tried to make things easier for commerce and industry. By implication, productivity also had to be self-sourced to ensure the quantity and quality of output. Using imagery, individual businesses had to get from 'A' to 'B' in their endeavours without going via 'Z'. Although none has mentioned it specifically, the causal inference was that decision-making needed to be enhanced. For Barroso, President of the European Commission, the call to his 27-country constituency was to improve their competitiveness[3], which both Cameron and Merkel (speaking more about her EC partners than Germany itself) almost immediately echoed[4] as an urgent priority. With productivity and competitiveness intimately conjoined, their unexplained reasoning is simple: if commerce and industry can become more productive, said businesses will automatically become more competitive. If they're more competitive, then the opportunity arises to sell more. Ipso facto GROWTH.

Ever since the dark days of April 2007, when New Century Financial, which specialised in sub-prime mortgages, kicked off the credit crunch and the subsequent recession, the main defensive reaction has been limited to top-down monetary measures such as massive quantitative easing, low interest rates, austerity and currency manipulation, the latter being the easiest and foremost reply to most downturns since WW2. As such, coal-face productivity's importance has been given only cursory attention, and not only in this latest recession, as evidenced by the gradual decline in post-WW2 productivity growth among OECD countries[5] and new-blood Carney's late insistence that growth should be self-sustaining. Simply argued, in export-orientated economies lower-value currencies enable more attractive prices and increased demand. It is inconveniently overlooked that imports, the flip side of currency-manipulated cheaper exports, instantly become more expensive, fuelling inflation and, ultimately, embedding a similar cycle of unemployment and economic decline.

Growth is still absent

After six years of the credit crunch, with commerce and industry echoing the public sector's austerity alongside an understandable reluctance to invest, growth – the main objective of all these strategies – is still mysteriously absent in the UK and EC.
And the prospects for any sort of further top-down stimulus look bleak; the outgoing Governor of the Bank of England has warned that there are limits to what can be achieved via general monetary stimulus in any form on its own[6]. Sir Mervyn had earlier put another spanner in the works by suggesting that all those countries struggling to find growth would revert to type and start cutting the value of their currency[7]. In truth, the spectre of a currency war is already in train, with Japan, Europe, the US and Britain – all countries desperately looking for growth – joining battle.

Since King's speech in early December (2012) and by the time this article was written (end-April 2013) five months later, the gyrations have been noticeably mercurial. Sterling fell against both the US dollar and the Euro. In just the first month of 2013, sterling's descent against the Euro was 7.5%, greater than in the whole of the previous year. Over the five-month period, the US dollar also slipped against the Euro, while the Japanese Yen depreciated against Sterling, the US dollar and the Euro. The effect over the three-month period to mid-February, for example, was that it became 20% cheaper to do business in Japan than in the US. Over a slightly longer period, working in Europe became a third more expensive than in Japan. These machinations are complicated, but it is arguable, for example, that sterling's fall against the Euro in the opening period of 2013 helped prevent the UK from falling into a possible third quarter of recession. But whilst this confirms the traditional historical reaction to economic hardship, the reality for countries that devalue is that exports become more competitive only until the other struggling countries parrot the exchange differential. And whereas this might provide hope that things are improving, however temporarily, it camouflages the already-stated reality: that commerce and industry don't have to concentrate very hard on Mr Carney's call for growth to be self-sustaining. The British Chancellor of the Exchequer George Osborne's more recent call for the Bank of England's new Financial Policy Committee to prioritise short-term growth[8] will likely further disincentivise the sharp end of business from depending more on its own resources. In truth, devaluing one's currency is not only artificial but the lazy way of achieving growth – and, pointedly, one huge indicator of industry and commerce's poor decision-making.
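The arithmetic behind these 'cheaper to do business in' comparisons, and behind the suspicion that competitive devaluations largely cancel out, can be sketched in a few lines of Python. The exchange rates and percentage moves below are invented round numbers chosen only to show the mechanics, not the actual 2012-13 quotes:

    # Illustrative only: invented rates and moves, quoted in US dollars per unit of currency.
    usd_per_gbp = 1.60      # assumed: one pound buys $1.60
    usd_per_yen = 0.0125    # assumed: one yen buys $0.0125

    # Suppose the yen falls 15% against the dollar while local yen prices stay put.
    usd_per_yen_after = usd_per_yen * (1 - 0.15)
    invoice_yen = 1_000_000
    before = invoice_yen * usd_per_yen        # $12,500 to settle a 1m-yen invoice
    after = invoice_yen * usd_per_yen_after   # $10,625 for the same invoice
    print(f"Doing the same business in Japan is now {1 - after / before:.0%} cheaper in dollars")

    # The flip side for the devaluing country: once sterling falls 10% against the
    # dollar, a $100 import that used to cost £62.50 costs about 11% more in pounds.
    import_before = 100 / usd_per_gbp
    import_after = 100 / (usd_per_gbp * (1 - 0.10))
    print(f"Import bill up {import_after / import_before - 1:.1%}")

    # And if a rival simply parrots the exchange differential, the cross rate, and
    # with it the competitive position, ends up back where it started.
    eur_per_gbp = 0.80                                     # assumed starting cross rate
    after_uk_move = eur_per_gbp * 0.90                     # sterling engineered 10% lower
    after_retaliation = 1 / ((1 / after_uk_move) * 0.90)   # euro then pushed 10% lower
    print(f"Cross rate before: {eur_per_gbp:.4f}, after both moves: {after_retaliation:.4f}")

The last line prints the same cross rate it started with, which is the 'deck chairs' point taken up later in the piece: mutual devaluation restores the status quo while leaving everyone with dearer imports along the way.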
The black hole of experiential learning

So, while the American experience of having higher productivity would appear to indicate the pathway to competitiveness and growth, what's the actual way to speed up the treadmill? Within the obvious circumstance that decision-making has to be improved, there is a huge industry that addresses this task, ranging from time management to just-in-time inventory management, serviced by vehicles such as business academia, management consultancy, conferences and business books. Curiously, though, there is one dedicated discipline, arguably the most effective at improving productivity, that is widely ignored. It is the formalized process of experiential learning (EL), self-evidently the ability to learn from experience. Within this discipline – an offshoot of Knowledge Management – there are two separate but interconnecting arms. The one that is used relatively widely is benchmarking, the practice of comparing one's own processes within a peer group and applying perceived improvements.

The other, the widely disregarded practice, is learning from one's own experiences, the more useful of the two because they are already tried and tested in one's own special circumstances. In any event, the reality of most progress is that it occurs organically and incrementally, a path that has become virtually impossible in today's workplace, home to the biggest-ever change in employment practice. That change is the flexible labour market that, for the past 35 years, has given business the phenomenon of short jobs tenure. While it has had definite advantages for employers – the ability to accommodate changing circumstances at short notice – there is one iceberg-like downside. Employees – including top decision makers – have been changing their paymasters on average every four to five years, taking with them up to 90% of their employers' distinctive knowledge and experience[9] and leaving only their paper trail. What actually disappears is the vital tacit knowledge component of their tenure – the organisation-specific how of know-how – that is typically unrecorded and, usually, just as important as the remaining explicit data and information. In truth, the level of such memory that is retained within the organisation and available for organic and incremental progress has been reduced to homeopathic levels. With such corporate amnesia has come the inability to learn from one's own experiences and the pandemic of repeated mistakes, reinvented wheels and other unlearned lessons that litter the workplace.

So, what does it cost?

That managerial skills are less than optimal is not in doubt. This judgment comes straight from the horse's mouth – managers themselves[10]. Their own assessment – in this case by senior British managers and/or those in direct board-report positions in companies turning over more than £200 million a year – is that an astonishing one in four of their decisions is wrong. According to the study, the rate in the financial services sector is even higher – nearly one in three. With an average 20 'business critical' decisions taken by each manager every year, the financial impact of which is computed to be worth an average £3.4 million, this equates to a wrong determination every eight weeks by each of an average 33 decision-makers in every organisation. The research was undertaken in 2004, a boom year. Because the figures are self-reported, it is arguable that they are conservative. Ironically, the failure rate is an outcome that managers would likely not tolerate among their vocational subordinates.
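As a quick back-of-envelope check on the scale these self-reported figures imply (the study's own route to its eight-week headline isn't reproduced here, so treat the frequencies as approximate), the arithmetic runs as follows:

    # The inputs are the figures quoted above; the arithmetic is only a sketch.
    decisions_per_manager_per_year = 20      # 'business critical' decisions, per the study
    decision_makers_per_organisation = 33    # the quoted organisational average

    for sector, error_rate in (("all sectors", 0.25), ("financial services", 1 / 3)):
        wrong_per_manager = decisions_per_manager_per_year * error_rate
        weeks_between_wrong_calls = 52 / wrong_per_manager
        wrong_per_organisation = wrong_per_manager * decision_makers_per_organisation
        print(f"{sector}: ~{wrong_per_manager:.1f} wrong calls per manager a year "
              f"(one every ~{weeks_between_wrong_calls:.0f} weeks), "
              f"~{wrong_per_organisation:.0f} a year across an average organisation")

On these raw numbers, an average organisation is absorbing well over a hundred wrong 'business critical' calls a year, before any allowance for the average £3.4 million of financial impact the study attaches to these decisions.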
The cost is enormous. Another management consultancy has done the sums[11], finding that in 2005, a boom year, the cost of wasted productivity in a selection of five OECD countries varied from 5.9% to 9.7% of GDP. In the UK, for example, the figure was £120 billion (yes, billion), nearly £20 billion more than the entire National Health Service budget in 2012, while in the US the figure was $888 BILLION. The UK figure gets corroboration – and additional weight – from the 2013 estimate that poor decision-making is responsible for £120 billion of wasteful spending in just the public sector[12]. This equally astonishing estimate, details of which are laboriously listed by The Taxpayers' Alliance, an influential pressure group and think tank, "could wipe out the UK's budget deficit without closing a single hospital, firing a single teacher or disbanding a single regiment". A corresponding assessment by the European Central Bank found that Whitehall wastage was even higher – £137 BILLION. These are titanic amounts of unaddressed opportunity that, in truth, knock the stuffing out of individual company – and national – output. Any of the figures would suggest, nay confirm, that modern industry and commerce's ability to compete has been severely compromised.

There is one further aspect of this subject that is relevant to improving productivity. It involves the traditional way decision-makers are taught how to make their determinations. As a general rule, it's a one-size-fits-all process; thus a "trained" decision-maker supposedly becomes a 'man for all seasons' who can, notionally, do the job whatever the environment. The fact that productivity growth has been declining since WW2 among OECD countries, a period in which formal business education has never been more widely available, suggests that the way decision-making is being taught is less than effective. The additional fact that workplace continuity is as rare as hens' teeth, leaving organisations with little of their own 'memory' with which to work, must also challenge the conventional wisdom that lots of other employers' experience is necessarily beneficial. For this there is a possible explanation, also to do with memory, albeit memory of a personal nature: individuals are inherently susceptible to recollections that are short, selective and defensive, all of which degrade the ability to be objective. This raises the question of whether the new smorgasbord of experience is working as it should. As it stands, walkabout employees, including managers, lack their employer's intimate knowledge and experience, which is unique and already tailored to its distinctive environment. Without it, they lack the necessary familiarity with which to marry their own know-how to their new employer's individual experience. My own conclusion is that they can only become truly productive through effective, genuine and formal EL. It goes without saying that the discipline has to be a dedicated subject in business education, with decision-making taught as something less multifunctional and more subject-specific. As such, it behoves an employer to be responsible for providing new employees with a detailed awareness of its own prior knowledge and experience that goes beyond the paper-trail content of its sophisticated databases and, importantly, for ensuring that new employees have the ability to apply both this and their own knowledge and experience in the cause of better decision-making.

Where experiential learning is different

Genuine EL is a separate ability from the usual decision-making delivery; it involves a necessary reflective component that traditional business education explicitly short-changes. Its application draws on two tried-and-tested assets – institution-specific experience, which organisations allow to walk constantly out of their front doors in the name of the flexible labour market, and the experience of individual decision-makers, typically imported from other employers; the former is usually inaccessible, the latter subject to short, selective and defensive memory recall. Whilst little can be done about the latter, acquiring the former provides an evidential base greater than would otherwise exist.
The process of EL

Bearing the footprints of Ivan Pavlov, most famous for his learning experiments with dogs, of Alfred Binet, the pioneer of the intelligence test, and even of Albert Einstein, who described Binet's work on experience and knowledge as "so simple that only a genius could have thought of it", EL is the widely unaddressed black hole of business education. Its most modern champion is David Kolb, Professor of Organizational Behavior in the Weatherhead School of Management at Case Western Reserve University, whose reflective model of EL is acknowledged as the most advanced. Why its importance as a dedicated learning tool has taken so long to be introduced into business academia and commerce and industry is truly a puzzle.

In the methodology that I've adapted to the short-tenure character of the flexible labour market – I've labelled it Experience-Based Management (EBM)[13] – the practical application is not rocket science, involving two broad constituents: a capture module and a learning module. Because of the short-tenure nature of the flexible labour market, the former involves the collection of the relevant knowledge and experience of key employees before it walks out of the front door. For short- and medium-term memory, the easiest and most cost-effective way of collection is through detailed oral debriefings, done either as end-of-tenure interviews, specific project-related interviews or regular annual interviews. Long-term memory can be captured in the form of the traditional corporate history, constructed as a learning tool rather than in its usual format as a public relations document. Once these are in place, the latter module – how to learn from experience – can take over: the process whereby individuals apply their employer's tried-and-tested knowledge and experience, alongside their own, to arrive at decisions that achieve continuity. The methodology includes the creation of a 'lessons audit' for easy roll-down through the generations.

EBM's six-stage learning cycle [Figure: EBM's six-stage learning cycle]

For EL to happen, there are a number of mindsets that have to change.
+ Because of the changed working environment, both academics and commerce/industry need to stop seeing their knowledge as exclusively employee-resident. Institutional revolving doors and musical chairs have given knowledge a fleeting character that prescribes value to the employer only if it is resident within the organisation and available for itinerant employees to apply.
+ To overcome innate managerial defensiveness about personal and corporate performance, employers need to encourage cultures that support objective reflection without penalty. Specifically, they need to see EL as less of a threat and more of an opportunity by demonstrating a corporate maturity that extends beyond the defensive posture of the insecure. On that basis, failure can be delayed success rather than an event that risks repetition.
+ Industry/commerce should accept that higher productivity is as much an issue for management as it is for coal-face employees. In truth, high-skill employees cannot compensate for poor decision-making from above.
+ And finally, business academics need to admit the wider definition of EL into their orbit, and historians also have to see themselves as knowledge practitioners. History provides experience cheaply!
Only then will Mark Carney's prescription for self-sustaining growth be achievable.
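As a purely illustrative sketch of what the capture module and a 'lessons audit' might look like in practice (the field names, categories and example record below are my own assumptions, not a prescribed EBM schema), a few lines of Python are enough to show the principle: debriefing records that keep the tacit 'how' alongside the paper trail, stored so that later generations of employees can search them.

    # Illustrative sketch only: an EBM-style capture module as a searchable
    # "lessons audit" of debriefing records. Names and fields are assumptions.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class DebriefingRecord:
        employee: str
        role: str
        occasion: str            # "end-of-tenure", "project" or "annual" interview
        captured_on: date
        context: str             # the organisation-specific circumstances
        paper_trail: str         # the explicit, already-recorded "what"
        tacit_knowhow: str       # the unrecorded "how" of getting it done
        lesson: str              # what a successor should apply or avoid
        tags: list[str] = field(default_factory=list)

    class LessonsAudit:
        """A minimal store that later generations of employees can search."""
        def __init__(self) -> None:
            self._records: list[DebriefingRecord] = []

        def add(self, record: DebriefingRecord) -> None:
            self._records.append(record)

        def search(self, keyword: str) -> list[DebriefingRecord]:
            keyword = keyword.lower()
            return [r for r in self._records
                    if keyword in f"{r.context} {r.tacit_knowhow} {r.lesson}".lower()
                    or any(keyword == t.lower() for t in r.tags)]

    audit = LessonsAudit()
    audit.add(DebriefingRecord(
        employee="A. Departing Manager", role="Head of Distribution",
        occasion="end-of-tenure", captured_on=date(2013, 5, 10),
        context="Seasonal demand spike handled with a temporary third shift",
        paper_trail="Six-week third shift documented in the operations log",
        tacit_knowhow="The union rep expects an informal word two weeks before any rota change",
        lesson="Start the informal consultation before publishing the rota",
        tags=["scheduling", "industrial relations"]))

    for hit in audit.search("scheduling"):
        print(f"{hit.employee} ({hit.occasion}): {hit.lesson}")

The point of the sketch is the separation the text insists on: the explicit paper trail and the tacit know-how are captured as distinct fields, so the latter no longer disappears with its owner.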
Is Sir Mervyn King right on this 'deck chairs on the Titanic' issue?

The outgoing governor of the Bank of England predicted that all those countries struggling to find growth would start cutting the value of their currency. If a country devalues, its exports become cheaper. Ipso facto that country will theoretically be able to sell more, triggering growth. But if that country imports a lot – and most modern industrialised countries do – domestic prices will eventually rise, depressing growth. Of course the net balance will depend on how much the currency is cheapened and the proportion of its imports but …. If every one of the contenders for growth devalues – as Sir Mervyn is expecting – won't all their prices do the same, restoring the status quo? There might be some marginal benefits depending on the extent of the mentioned variable factors, but is this just the equivalent of moving the deck chairs on the Titanic?

Instead of fooling around with fancy macro policy, might it not be better to concentrate more fully on improving underlying productivity? Simply stated, if the prices of goods and services are reduced without the artificial means of currency devaluation, wouldn't growth be better grounded? Over the last four years – actually a lot longer – the achievement of real productivity has been widely ignored. Have a look at the stats for ROI, productivity growth and the cost of wasted productivity, the latter totting up figures between 5.9% and 9.7% of GDP. It's a widely unaddressed factor of production and shameful for supposedly sophisticated industrialised countries with all that experience behind them. Wasn't it the late Peter Drucker who said: "The urgency of the productivity challenge is great. The country that does this first will dominate the twenty-first century economically"? And is not Sir Mervyn's prediction the last refuge of the desperate? His – and all those other rocket scientists' – advice should be productivity, productivity, productivity.

There is no question that in the business of productivity, conventional business education has not served us well. It can't even be controversial to conclude that what we've received in the way of dedicated business education in the past 50 years has given us low productivity, declining productivity growth and a huge cost when one puts a hard figure on the bottom line. Converting the stated researched percentages of GDP into hard cash, wasted productivity cost the US $888 billion, Germany €203 billion, the UK £120 billion, France €92 billion and Spain €84 billion in 2005. And this was in a good year, before the current recession. It is pertinent to underline that these numbers represent WASTED effort and cost – and are therefore avoidable. Given the unrelenting southward profile of these metrics over so many years, the bizarre thing is that this clearly defined area of business dysfunction has been largely unaddressed by industry, commerce and their supposed champion, business education. Strip most of these figures out of our production costs and imagine …. It is unarguable that these three constituencies have to wake up and smell the coffee.
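Those hard-cash conversions are easy to reproduce or sanity-check. The sketch below simply reverses the piece's own UK numbers rather than drawing on national accounts, so the GDP it prints is the figure implied by the article, not an official statistic.

    # Illustrative only: backing a GDP figure out of the article's own numbers.
    uk_waste_gbp = 120e9     # £120 billion of wasted productivity (Proudfoot, 2005)
    uk_waste_share = 0.075   # the 7.5% of GDP the piece attaches to that figure

    implied_gdp = uk_waste_gbp / uk_waste_share
    print(f"Implied UK GDP: £{implied_gdp / 1e12:.2f} trillion")   # roughly £1.6 trillion

    def waste_in_cash(gdp: float, waste_share: float) -> float:
        """Cash cost of wasted productivity, given a GDP figure and the wasted share."""
        return gdp * waste_share

    print(f"Check: £{waste_in_cash(implied_gdp, uk_waste_share) / 1e9:.0f} billion wasted")  # 120

Backing the implied GDP out of the quoted percentages is a useful sanity check whenever cash amounts and shares of GDP are quoted side by side.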
Robinson and Pink's observations are surely valid, but may I suggest another area of huge oversight – experiential learning, self-evidently the ability to learn from experience. Not the facility to repeat prior practice but to APPLY the tried-and-tested experience of the past. In its formalised format, it is a distinct and separate discipline that needs to be married to the actual experiences of individual institutions, made more relevant in today's flexible labour market because acquired knowledge – hard-won and expensively paid for – is constantly walking out of the front door. Discontinuity promotes experiential NON-learning, the very scourge of lost productivity.

Growth, growth, growth. But oh, so elusive. What else to do…

"Britain's economy shrinks anew, flirts with 'triple dip'" scream the newspapers. What's the answer? Growth, the new religion that appears to be so elusive. Are we missing a trick? Yes. In the UK, the annual cost of wasted productivity is £120 billion. Yes, £120 billion, equal to 7.5% of our GDP. And that was in a good year, 2005, before the current recession (Proudfoot Consulting). A reduction in this figure would go a long, long way to helping to trigger growth. How? By cutting the pandemic of repeated mistakes, reinvented wheels and other unlearned lessons that contribute to the waste. And how does one do that? With the skill known as experiential learning, self-evidently the ability to learn from experience. As a formal discipline it's widely absent from business education and from the way employers make their day-to-day decisions. This is made more difficult by the modern workplace, where the actively encouraged flexible labour market ensures short jobs tenure and which, in turn, lets an organisation's institution-specific knowledge and experience continually walk out of the front door; without this unique corporate asset comes the inability to learn from one's own hard-won and expensively acquired experience.

The established myth is that replacements who have wider exposure to the workplace will adequately compensate for the loss of one's own knowledge and experience. Not so. There is academic research indicating that skills are not widely transferable to new environments, which gets corroboration from the continual decline in productivity growth over the last 40 years. However qualified, new blood still has to learn the ropes and its new employer's unique experience. In any event, most progress comes about organically, the building of one experience on another, whether that be the improvement on a success or the reversal of a failure. Take away the unique experience and there are no recognisable building blocks!

So, outside of the fiscal area of government activity and short of reversing the flexible labour market, what can YOU do? Champion better experiential learning, the most effective key to productivity. Nobody else is doing it, with everybody expecting government to be responsible for the recovery. It needs a loud voice. The late Peter Drucker said it better: "It is managers, not nature, economic laws, or governments, that make resources productive." It's not rocket science. Business educators, who – quite astonishingly – overlook this central skill, need to be encouraged to teach the future generation of workers how to do experiential learning properly. Alongside this, employers need to be made aware that their own specially acquired knowledge and experience is essential for organic growth and therefore worth capturing before it walks out of the front door. Without the ability to learn from one's own experience, progress is necessarily slow – it even reverses in some cases – as the already-mentioned decline in productivity shows.
This discourse boils down to the following: if modern business can get from 'A' to 'B' without going via 'Z', said businesses would automatically become more competitive. If they're more competitive, then the opportunity arises to sell more. Ipso facto GROWTH. To attention-grab, it is often useful to mix a few more metaphors: to help cure our shared migraine, smell the coffee. I am sure the Prime Minister will want to know how HIS imagery of the roasted beverage can be demonstrably fair trade and tax efficient.

References:
[1] http://www.oanda.com/currency/historical-rates/
[2] Global Competitiveness Report, http://www.weforum.org/issues/global-competitiveness
[3] http://europa.eu/rapid/press-release_SPEECH-13-218_en.htm
[4] http://www.bbc.co.uk/news/uk-politics-22119096
[5] Groningen Growth and Development Centre and the Conference Board, Total Economy Database, http://www.ggdc.net
[6] http://uk.finance.yahoo.com/news/boe-inflation-report-sir-mervyn-113853159.html
[7] http://www.telegraph.co.uk/finance/financialcrisis/9736265/Mervyn-King-raises-spectre-of-currency-wars-in-2013.html
[8] http://www.ifaonline.co.uk/ifaonline/news/2265160/chancellor-tells-fcp-to-focus-on-shortterm-growth
[9] D. Bonner, American Society for Training & Development, 2000.
[10] UK Business Decisiveness Report, Capgemini, August 2004.
[11] Proudfoot Consulting, 2005.
[12] Bumper Book of Government Waste, June 2013, http://www.taxpayersalliance.com/
[13] http://www.businessexpertpress.com/books/knowledge-management-death-wisdom-why-our-companies-have-lost-it-and-how-they-can-get-it-back

Posted May 10, 2013 by waytoogo

3 responses to "Growth, how?"

In my opinion there is another factor to consider – there is no pressure for a person born into the System in the UK to work, period. It's not a case of work or starve, which is the case with most immigrants. It's a lot easier to just ride on the Welfare bus and never get your hands dirty. There are families that have generations that have never worked a day in their life and manage to live just fine. A lazy culture has developed in what used to be the working class. Immigrants will risk death for the opportunity to work in the UK to feed themselves and their families, and work like a Trojan day and night until they too can take advantage of the welfare system and become like everyone else.
Ray, February 3, 2015 at 8:05 am

Ray, your immigrant argument is well made. The US, a nation of migrants, confirms it, but your 'welfare culture' observations are too complicated to be as black and white as you suggest. Yes, welfare is generally generous and there are lazy people/families around, but the bulk of British workers work the longest hours in Europe and when THEY emigrate, they automatically join your immigrant theory and BECOME very productive. While this does suggest that the wider SYSTEM is flawed, my own reasoning puts much of the responsibility for any misuse on our home-grown managers and business educators. And, dare I say it, to a lesser extent on our politicians. However politically incorrect this may be, if they were ALL better decision makers in their various disciplines, workers would be more productive. And better productivity would help to override many of our more endemic problems. I believe it's as simple as that, and the generous welfare system, which reflects a commendable intention by a caring nation to be supportive of deserving people, would be able to be better applied. Thank you, Ray.
I welcome the debate.
waytoogo, February 3, 2015 at 11:55 am

So real, one wonders why it is not called common sense!!!!
sid kabaso, April 1, 2015 at 3:22 am

Buy the book today and get 15% off. Just use the code KRANSDORFF when you click on the cover here.