From: John Conover <john@email.johncon.com>
Subject: Re: The Standard: Warning: Recession Ahead
Date: 7 May 2001 00:17:25 -0000
And what about the dynamics of the dot-com industry, as mentioned below:

   It's not so much that static methodologies, (like MBA/MBO,) don't work in organizing high entropy dynamic systems, (which the dot-coms are/were,) as it is that they lead to irrational, unattainable expectations.

and:

   As approximate numbers, (taken from the average of many industries on http://www.johncon.com/ndustrix/,) to exploit the lack of predictability in the US GDP and its constituent industrial markets, the short term predictability of a few months to a 70% accuracy means that about (2 * 0.7) - 1 = 40% of a company's assets should be at risk in the short term, in WIP, capital expenditures, R&D, etc.

If the short term management horizon is a month, (which it has to be to get a maximally optimal 2X growth per year,) then, on average, a management paradigm, (prediction, and execution,) would be obsolete where 1 / sqrt (t) = 0.5, or about four months-i.e., half would be obsolete in less than four months, half would last longer. About 30% of the paradigms would have to be changed by the end of the second month, 42% by the end of the third month, and half by the end of the fourth month.

That's how fast the executive suite has to move the company-which is a continual struggle in management trials and tribulations. It's what it takes to sustain a 2X annual corporate growth. Quarterly planning isn't a viable option, (which is the typical MBA/MBO planning cycle-which, although workable in old economy smokestack/assembly line industries, isn't applicable to contemporary service industries.) To achieve 2X growth requires a managed, (as opposed to an administered,) solution. And that's where the dot-coms went wrong.

John

BTW, the distributions, (they are from fractal dynamics,) have very "fat tails". (They have nothing to do whatsoever with Gaussian/Normal distributions, and are a completely different family.) All is not always over by the end of four months, however.
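As a quick check of the arithmetic above, a minimal sketch in Python, assuming the (2 * p) - 1 at-risk rule and the 1 / sqrt (t) paradigm-survival fraction exactly as quoted:

```python
from math import sqrt

# At-risk fraction from short-term predictive accuracy p, per the
# (2 * p) - 1 rule quoted above:
p = 0.70
at_risk = 2 * p - 1
print(f"assets at risk: {at_risk:.0%}")            # 40%

# Fraction of management paradigms still valid after t months is
# taken as 1 / sqrt(t); the complement is the obsolete fraction.
def obsolete(t_months: float) -> float:
    return 1.0 - 1.0 / sqrt(t_months)

for t in (2, 3, 4):
    print(f"month {t}: {obsolete(t):.0%} of paradigms obsolete")
```

Running it reproduces the 40%, ~30%, ~42%, and 50% figures; the same 1 / sqrt (t) expression gives about 2.9% survival at t = 1200 months.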
What are the chances that a management paradigm will last a century of months? It's about 3%, (1 / sqrt (1200) = 2.9%.) Those are the companies in industries that become the mainstay of the GDP. About 3% of the industries in the last century, (including oil, rail, automotive, electronics, etc.,) achieved that status. The natural evolution of companies and industries is a log-normal distribution; where the 3%, or so, dominate the GDP, (like 70% of it, or so.) They end up generating almost all the wealth.

John Conover writes:
> As another interesting sidebar on project management, many of the projects attempted in the dot-coms were executed under MBA/MBO methodologies.
>
> Unfortunately, about 84% failed, (that's the not-so-pretty industry numbers for the IT projects-failure meaning over budget, over schedule, or both; re: The Standish Group's Jim Johnson, http://www.standishgroup.com/msie.htm, who authored http://www.scs.carleton.ca/~beau/PM/Standish-Report.html, http://www.garynorth.com/y2k/detail_.cfm/1088, and http://www.systemcraftllc.com/standishGrp.html.)
>
> Why are the numbers like that?
>
> If project milestone completion, c, is considered a linear function of resource allocation, (i.e., c = kt,) and the project's decision process has sufficiently high entropy, then the real milestone completion would be closer to c = sqrt (kt).
>
> In other words, the ratio of the projected milestone completion to the real would be sqrt (kt); and for all t, the average project would have less than sqrt (kt) milestones finished, for one standard deviation of the time, (i.e., 84.1% of the projects would be less than sqrt (kt) finished-where they should be kt finished-and 15.9% would be greater than sqrt (kt) finished.)
> Or, 1 / 0.841 = 1.19 standard deviations would be better than on schedule, or on budget, which is 11.7%; and the remaining 88.3% would be over budget, in one way or another, (which is less than a 5% error compared with the Standish metrics on the IT industry, as a whole.)
>
> It's not so much that static methodologies, (like MBA/MBO,) don't work in organizing high entropy dynamic systems, (which the dot-coms are/were,) as it is that they lead to irrational, unattainable expectations.
>
> So, what would be rational expectations?
>
> One should square the resource allocation requirements obtained by linear estimates to get realistic budgetary and schedule requirements.
>
> John
>
> BTW, you have to keep the projects small, and simple, too. Complex projects have resource allocation issues that explode-the square function starts rising very rapidly on complexity.
>
> How many complex IT projects have succeeded, (The Standish Group uses ten million bucks as the benchmark for a complex system)?
>
> In the history of IT, none.
>
> John Conover writes:
> > As a sidebar, VCs usually want an investment to run about 5 years, then the company IPOs, and the VC makes lots of money-and so do the employees, (sound like a Ponzi scheme? It is.)
> >
> > What's the VC industry's success rate?
> >
> > The industry numbers are about one in nine, (the remaining eight get redeployed, reorganized, or re-something'ed into Chapter 7 and 11, which is a popular end game with the dot-coms these days.)
> >
> > What's the probability of a company's strategy and marketplace lasting at least 5 years = 60 months, without having a negative month where it couldn't meet the payroll?
> >
> > The deviation of the distribution of the GR of all companies at the end of 5 years is about 1 / sqrt (60) = 0.13, or about 15.8% of the companies will have done at least 87% of what they were supposed to do.
> > So, we need 1 / 0.87 standard deviations = 1.14, which corresponds to 0.127, or about one in 8-or about a 10% error from the empirical metrics published by the industry associations.
> >
> > Call it about one in ten, which is the number I used, below, in the statement:
> >
> >    ... We would, also, expect about a 1 in 10 survival rate, (e.g., any company that has less than 10% market share, e.g., a 90% chance of failure, won't make it,) and we are seeing those numbers. ...
> >
> > So, one probably shouldn't invest in companies with less than a 10% market share, (or do business with them, either,) unless there is a compelling reason to do so, since only about 1 in 10 will survive at least 5 years.
> >
> > The one in ten number is consistent with what has been published about the other industrial "bubbles" of the 20'th century, (specifically, both the radio and television set industries, of the 20's and 50's, the electronic game industry of the mid 70's, the CB radio industry of the early 80's, and the software industry of the late 80's-and now the dot-com industry.)
> >
> > John
> >
> > BTW, the number is consistent with the other "bubbles" in the distant historical perspective, too, including tulips in 17'th century Holland, and the South Sea "bubble" of the 18'th century. As a generality, about 90% of the investors lost 90% of their investment, and the remaining 10% increased their wealth by about 10X, on average, in each of the "bubbles" throughout the last 5 centuries. Such things are Ponzi, (or pyramid,) schemes, and executive management should know better than to participate and run companies like a grand industry sanctioned Ponzi scheme, (that goes for the investors, too.)
> >
> > Interestingly, according to MBA dogma, the executives in the dot-coms did everything right.
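The two tail probabilities quoted above-the 11.7% of IT projects that come in on schedule, and the roughly one-in-eight 5-year survival of VC-funded companies-can be reproduced from the standard normal CDF; a minimal sketch, using only the stdlib's erf:

```python
from math import erf, sqrt

def phi(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# IT projects: planned completion c = k*t, realized c = sqrt(k*t).
# Within one standard deviation of time, 84.1% of projects lag plan:
behind = phi(1.0)
print(f"behind schedule: {behind:.1%}")        # ~84.1%

# The text's step: 1 / 0.841 ~ 1.19 standard deviations; the upper
# tail beyond that is the on-schedule/on-budget fraction:
on_time = 1.0 - phi(1.0 / behind)
print(f"on schedule/budget: {on_time:.1%}")    # ~11.7%

# VC investments: 60 months => deviation 1/sqrt(60) ~ 0.13; the
# survivors are the upper tail beyond 1 / (1 - 0.13) deviations:
sd = 1.0 / sqrt(60.0)
survive = 1.0 - phi(1.0 / (1.0 - sd))
print(f"5-year survival: {survive:.1%}")       # ~1 in 8
```

The last figure lands near 12.5%, i.e., about one in eight, within rounding of the one-in-nine industry number quoted above.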
> > The MBA methodology of corporate organization is not new, and it works well for what it was designed to do-efficiently manage an industrial assembly line process through Taylor's time-motion analysis. Extrapolating the methodology as a management paradigm is a large leap of faith, however, and business schools have to shoulder a large part of the blame for the dot-com "bubble" for not including modern concepts in the curriculum.
> >
> > But the young executives that operated the dot-com "bubble" have a lot of historical company. None other than Sir Isaac Newton lost a fortune in the South Sea "bubble" of 1720:
> >
> >    ... Sir Isaac Newton, scientist, master of the mint, and a certifiably (sic) rational man ... sold his 7,000 pounds of stock in April for a profit of 100 percent. But something induced him to reenter the market at the top, and he lost 20,000 pounds, ... [prompting him to say] ... "I can calculate the motions of the heavenly bodies, but not the madness of people."
> >
> > Re: Harvard Magazine, http://www.harvard-magazine.com/issues/mj99/damnd.html, and search forward for the word "Newton".
> >
> > John Conover writes:
> > > Looking at the valuation of a well managed company, (i.e., one where management knows what it's doing-knows what has to be done, and knows how to execute on it, optimally, e.g., grow 2X per year, for at least 10 years, before it starts its demise,) and if it starts with a million bucks of GR the first year, (a reasonable value,) then IPOs in year 5, (with a GR = 2^5 * 10^6 = 32 million-a reasonable number from the historical perspective,) then in year 10 it would have a GR of 2^5 * 32 * 10^6 = about one billion bucks.
> > > If the company was floated on an initial million bucks investment, (i.e., it was not profitable only in the first year,) it would be a thousand X increase in valuation over the decade, (assuming that the capitalization vs. GR remained constant, which was 3X in the 20'th century, averaged over all companies-I'm using a conservative 1X.) Compare that with the 100X the best dot-coms delivered, (and then didn't.)
> > >
> > > Now assume that management is less efficient. It grows the company at only 10 percent less than optimal, at 1.9X per year-everything else remaining the same. At the end of the decade, the company's GR would be 613 million, or its valuation/capitalization would be almost 40% less. About half, for a consistent 10% inefficiency in management.
> > >
> > > Suppose the management of the company putzed around and didn't execute on engaging the marketplace for a year, then ran optimally. At the end of the decade, the company's GR would be 512 million bucks, or its valuation/capitalization would be about half, for a 10% delay in execution.
> > >
> > > So, roughly, every delay or inefficiency by management of 10% ultimately costs the company 50% of its valuation/capitalization.
> > >
> > > John
> > >
> > > BTW, it's not a linear relationship. Another inefficiency/delay of 10% doesn't cost another half billion bucks again. It cuts it in about half again.
> > >
> > > Note that a delay of one year costs a 50% market share in a competitive environment, so the market would shake out, ultimately, at a 2:1 market share ratio between a well run company, and a not so well run one, who has a 30% market share. In other words, a loss of 20% market share would spell the ultimate demise as the cost for the management mistake. It wouldn't make it through the decade.
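The compounding in the valuation scenarios above is easy to tabulate; a minimal sketch, using the scenario's own numbers ($1M starting GR, 2X and 1.9X annual growth, a one-year delay):

```python
def gross_revenue(growth: float, years: int,
                  start: float = 1e6, delay_years: int = 0) -> float:
    """GR after compounding `growth` per year, less any delayed years."""
    return start * growth ** (years - delay_years)

optimal = gross_revenue(2.0, 10)                 # 2^10 * $1M ~ $1,024M
sloppy = gross_revenue(1.9, 10)                  # ~ $613M: almost 40% less
late = gross_revenue(2.0, 10, delay_years=1)     # 2^9 * $1M = $512M: half

for label, gr in (("optimal 2X", optimal), ("1.9X", sloppy),
                  ("2X, 1 year late", late)):
    print(f"{label:>16}: ${gr / 1e6:,.0f} million")
```

The 40%-less and about-half figures follow directly: 613/1024 is about 0.60, and 512/1024 is exactly 0.50.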
> > > John Conover writes:
> > > > While we are on the subject of the dot-coms, (which is an interesting case study-it is/was capitalism and free-market'ism at its finest-no barrier to entry, insignificant infrastructure requirements, ample investment or access to capital, and a choice of thousands for consumers-what more could anyone want?) and completely mismanaged by our finest young executives with their recent MBAs in tow. (Although entropic methodologies are taught as a core competency requirement in all university financial curricula, they are not taught in business school.)
> > > >
> > > > As an example, what is the chance that a company with 1% market share will ever replace an industry leader with 50%?
> > > >
> > > > The answer is 2%, (you just divide the two numbers-it's the gambler's ultimate ruin scenario-and if it is an entropic system, it has to be that way.)
> > > >
> > > > What that means is that, as an investor, one has to expect returns of at least 50X to justify investing in companies with a 1% market share, no matter how good the concept or vision is.
> > > >
> > > > We would, also, expect about a 1 in 10 survival rate, (e.g., any company that has less than 10% market share, e.g., a 90% chance of failure, won't make it,) and we are seeing those numbers.
> > > >
> > > > What's the duration of time one would expect for such a company to be in business?
> > > >
> > > > About 50 * (50 - 10) = 2,000 days, or about 7.9 years at 253 business days per year. (And, we have seen that, too; many of the dot-coms that are failing were started as early as 1994, or so.)
> > > >
> > > > That's why so many dot-coms are failing.
> > > >
> > > > It is called management mistakes.
> > > > John
> > > >
> > > > BTW, compare this to a company with a competent executive staff that put the company in biz early in the industry's development, started with, and maintained, a 50% market share. Such a company would be expected to last about 100 * (100 - 50) = 5,000 days = 19.7 years, (at 253 business days per year.) The empirical metrics on the history of US commerce in the 20'th century say about 22 years, (i.e., about a 10% error in the probability estimate.)
> > > >
> > > > Only GE survived the entire century, BTW.
> > > >
> > > > But nothing is eternal in entropic systems. What's the chance of a company in a leadership position eventually failing?
> > > >
> > > > It's a virtual certainty. That's the way capitalism works.
> > > >
> > > > John Conover writes:
> > > > > As approximate numbers, (taken from the average of many industries on http://www.johncon.com/ndustrix/,) to exploit the lack of predictability in the US GDP and its constituent industrial markets, the short term predictability of a few months to a 70% accuracy means that about (2 * 0.7) - 1 = 40% of a company's assets should be at risk in the short term, in WIP, capital expenditures, R&D, etc.
> > > > >
> > > > > Note that it is a conceptual framework, and just says what has to be done-not how to do it. The implementational paradigm is evaluation of "what if we do?", and "what if we don't?" scenarios, with more attention paid to the latter, (since you get more for better mitigation of risk than picking winners.)
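The market-share arithmetic quoted above is the gambler's-ruin formula; a minimal sketch, reproducing the numbers exactly as the text states them:

```python
# Chance that a 1%-market-share company ever displaces a 50% leader,
# per the gambler's ultimate-ruin argument: divide the two shares.
p_displace = 0.01 / 0.50
required_return = 1.0 / p_displace
print(f"chance of displacing the leader: {p_displace:.0%}")        # 2%
print(f"return needed to justify the bet: {required_return:.0f}X") # 50X

# Expected time in business, using the text's n * (N - n) fair-game
# duration arithmetic, converted at 253 business days per year:
for name, days in (("sub-10% share startup", 50 * (50 - 10)),
                   ("50% share leader", 100 * (100 - 50))):
    print(f"{name}: {days:,} days ~ {days / 253:.1f} years")
```

The 2,000 and 5,000 day durations work out to about 7.9 and 19.8 years, matching the text's 7.9 and ~19.7 year figures to rounding.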
> > > > > In the long term, (i.e., more than a few months,) mitigate risk through product diversification, (no less than 8 product lines, 10 being about right, 12 probably too many,) and shuffle investment capital around in the product lines to avoid leptokurtic behavior in the corporate P&L, (i.e., as a first order approximation, the GR generated by each product line should be about equal, as a management paradigm; as a better approximation, the GR generated by each product should be proportional to the average divided by the deviation of the marginal increments in the GR. Likewise for the investment of capital.) That, also, defines investment in new product areas. The long term plan is to put about 2% of the company at risk, per business day, (or, cumulative, about 40% per month,) which is consistent with the short term strategy.
> > > > >
> > > > > The numbers would support a corporate growth of slightly more than 2X a year, and would maximize growth, while at the same time, minimizing risk exposure.
> > > > >
> > > > > Whether such growth could be managed and executed in the long run is an entirely different issue. (It's tough.)
> > > > >
> > > > > John
> > > > >
> > > > > BTW, there are other solutions for the same numbers-but they aren't optimally maximal. It is possible to grow a company faster than 2X a year-for awhile. It's a "coffin corner" solution, that will exhibit high growth, followed by a crash, (no matter what growth metric is used.) About 2X is the maximum SUSTAINABLE growth with those numbers. (Does the dot-com industry ring a bell as a case in point? Many of those companies exhibited growth of 2X in a single month-for awhile.
> > > > > The amount of the company placed at risk was far larger than the optimal 40% per month-most of it in capital investment. Glory has its price.)
> > > > >
> > > > > John Conover writes:
> > > > > > Interestingly, the lack of predictability in the US GDP is exploitable.
> > > > > >
> > > > > > As a mathematical expediency, most corporate strategies are divided into short term and long term, (short term being a few months, long term being more.)
> > > > > >
> > > > > > For long term strategies, the size of the window of predictability is regarded as zero-meaning that the US GDP is treated as an entropic system that behaves randomly. Strategies are developed that fit into a framework determined by the average, (the potential gain,) and the standard deviation, (the risk,) of the marginal increments of the US GDP or industrial market-usually using monthly or quarterly data.
> > > > > >
> > > > > > Short term predictability is an inefficiency, (at least in the sense of the efficient market hypothesis,) and can be exploited by adjusting operations to near term GDP/market anticipations faster than anyone else, by using the likes of JIT techniques to minimize WIP risk, etc.
> > > > > >
> > > > > > The potential advantage is quite significant.
> > > > > >
> > > > > > John
> > > > > >
> > > > > > BTW, what would happen if every company that contributes to the US GDP did that? Not much, except that it would grow faster. The US GDP would become entirely entropic, (i.e., unpredictable,) and would be efficient, and fair. It would be stable, (in the sense that it could exist like that, forever.) Further, if the average of the marginal increments equaled the variance, it would be maximally optimal.
> > > > > >
> > > > > > Unfortunately, it is a politically inexpedient solution.
> > > > > > Such concepts as monetary/fiscal policy to effect consistency in full employment, etc., (which has been the paradigm of the last seven decades in all industrialized countries,) would have to be discarded. Perceived lack of influence over economic issues is not an expedient political platform.
> > > > > >
> > > > > > (BTW, full employment is not necessarily optimal. It has, in general, an unsustainable cost. Maximal sustainable employment is when the GDP's average of its marginal increments equals the variance. However, maximal employment does not, necessarily, mean full employment.)
> > > > > >
> > > > > > John Conover writes:
> > > > > > > In case you are curious, the big US economic recessions since Independence happened in 1819, 1833, 1837, 1857, 1873, 1893, 1929, (using the GNP/GDP numbers, which are not necessarily coincident with the stock market numbers, as far as downturns go,) and, (possibly-we don't know yet,) 2000.
> > > > > > >
> > > > > > > Note that this is the first generation in US history that has not had to endure a famine/depression, (at least yet,) and our perspective does not include how ugly they really are.
> > > > > > >
> > > > > > > John
> > > > > > >
> > > > > > > BTW, the numbers are interesting. To make predictions-like in the attached-the US GDP must be a deterministic system. Finding a mechanism that gives zero-free paths representing those numbers is a formidable proposition. If that can't be done, then predictions can not be made.
> > > > > > > In NLDS systems, (of which the US GDP is certainly one,) the influence of the past on the future deteriorates rapidly-meaning that a small window into the past can be used to predict a small window into the future, and that is the best that can be done. The size of the window for the US GDP is, at best, a few months, to a 70%, or so, accuracy.
> > > > > > >
> > > > > > > Unfortunately, the prevailing wisdom is now that fiscal/monetary policy can not be used to influence the fluctuations in the US GDP; the opposite assumption was the paradigm of the past seven decades, and has since been abandoned.
> > > > > > >
> > > > > > > http://www.thestandard.com/article/0,1902,24243,00.html

--
John Conover, john@email.johncon.com, http://www.johncon.com/