Throughput Accounting is many things: an accounting system, a financial application and, most importantly, the decision-making arm of Dr Eli Goldratt’s Theory of Constraints. A deep understanding of the Five Focusing Steps is essential to being successful with Throughput Accounting. More on this in a subsequent article.
Throughput Accounting first saw the light of day around 1984, but it was only in 2019 that the AICPA (the biggest Financial Accounting body in the world) included the topic of Throughput Accounting in its syllabus.
Today, many misapplications – some would say flawed mental models – persist in the use of Financial Accounting and Management Accounting. Financial Accounting is meant for external reporting, while Management Accounting’s focus is internal. The two are distinct and operate on separate bodies of knowledge and different paradigms. The line that separates these two sciences is blurred by the practice of absorption costing in Cost Accounting. More on this turn of events as we proceed.
In the Accounting world, Financial Accounting is the 900-pound gorilla in the room. For markets to operate smoothly, it is essential for external stakeholders to have a comparable understanding of firms’ performances over time as well as within and across industries and, finally, for the taxman to collect his dues. For these reasons, the CPA profession (or CA in certain countries) operates within a legal and regulated framework. Management Accounting is represented by other certifying bodies (CMA, CIMA, etc.) who do not have the legal authority to audit Financial Statements, which is a task reserved for the CPAs. Management Accounting is inclusive and will use any tool, process or way of working that can help decision-making such as all accounting flavors, Lean, Six Sigma, KPI, balanced scorecards, and the like.
Throughput Accounting rightfully belongs under the large and welcoming umbrella of Management Accounting in order to align motivation, behavior and performance.
The discussion that follows is in accordance with thought leaders in the field of Throughput Accounting regarding decision-making. It often diverges from the perspective of traditional accounting, where cost information is of paramount importance. That same information is of no use in Throughput Accounting when it comes to decision-making.
For this reason, this article will highlight five of the most common flawed mental models that are part of the fabric of society and serve to impair our judgment. To fully explore and understand this subject, we will tackle all topics in the context of physical goods that we manufacture and sell. The nuances and subtleties will be easier to grasp as a result.
Subsequent articles will address novelties in Throughput Accounting applied to Knowledge Work, which is invisible by nature.
As a reminder, a constraint is the bottleneck – possibly one out of many – that prevents an organisation from reaching its goal.
Cost Accounting is paradoxical: it is found in both Financial and Management Accounting. According to Steven M. Bragg, Cost Accounting is meant – and designed primarily – to support Financial Accounting reporting. For Debra Smith, Cost Accounting is a technical, hybrid process designed to satisfy generally accepted accounting principles (GAAP).
Cost Accounting is, for all intents and purposes, Financial Accounting’s best friend, as it feeds it the carrying cost of inventory – the ‘Holy Grail’ of Financial Accounting – in order to assess profits.
The chaos that warranted the birth of formal and rigorous Financial Accounting Standards – circa 1916 in the US railroad industry, and more forcefully after the Great Depression – was in part due to the lack of guidelines or consistent approaches for using unit costs to tally fairly reported profits from revenues. Revenues are always easy to reconcile, as they are contemporaneous with sales. The actual matching of expenses to sales is much harder to assess, as costs are unknown and unknowable – hence the need for Cost Accounting to support Financial Accounting!
Cost Accounting does have practical uses for managerial purposes – medium- to long-term capacity planning is one – but curiously enough, its most distinguishable functionality, absorption costing, has found a niche in the field of Management Accounting for no valid reason, as it is largely useless for daily operational issues. This ‘invasion’ is an oddity, as the ‘cost per number’ – historical by nature – is moot in daily decision-making. Yet Cost Accounting occupies more than 50% of the curriculum offered in Management Accounting.
Take the example of the ‘cost per call’ in a new call center. The landline requires a cash outlay – a CO$T – of $1000 per month. In the first month, 100 calls are made; the COST per call is therefore $10. The following month, 1000 calls are made; the COST per call has dropped to $1!
Is this knowledge actionable? Does this type of allocation have any managerial value for any decision? The charge will remain $1000 per month no matter the number of calls.
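The fallacy is easy to demonstrate in a few lines of code. This minimal Python sketch, using the numbers above, shows the allocated unit COST swinging with volume while the actual CO$T – the cash outlay – never moves:

```python
# The 'cost per call' illusion: a fixed CO$T divided by a volume figure.
MONTHLY_LINE_COST = 1000  # the only real cash outlay, $ per month

def cost_per_call(calls_made: int) -> float:
    """Absorption-style allocation of a fixed cost over call volume."""
    return MONTHLY_LINE_COST / calls_made

print(cost_per_call(100))   # month 1: 10.0 -> '$10 per call'
print(cost_per_call(1000))  # month 2: 1.0 -> '$1 per call'
# The cash outlay is $1000 in both months; the 'unit cost' tells us
# nothing actionable for day-to-day decisions.
```

The allocated figure changes tenfold while nothing about the underlying economics has changed.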
We also often hear of the ‘COST per meeting’, where twenty people sit in a room for a day. Again, all the salaries are already paid for, along with the meeting room; no incremental cash outlay occurs. The ‘COST per meeting’ falls into the same category as ‘cost per number’.
No ‘cost per number’ metric involves an incremental cash outlay, and all of them are irrelevant for operational, day-to-day decision-making. (These ‘cost per number’ figures are of great significance, and essential, in Financial Accounting and Cost Accounting. Yet the cost per product is irrelevant to decision-making as far as Throughput Accounting is concerned – and therefore, so is the notion of profit per product!)
‘Cost per number’ figures are still used for decision-making today, but as early as 1916, W. Hooper of the Interstate Commerce Commission had this to say:
“There is no difference of opinion, however, as to the fact that no one basis for division or no general rule for a division of these expenses can be made which will conform to the facts, even approximately, under all conditions.”
In the brick-and-mortar economy of the physical flow of goods, Cost Accounting (read: absorption costing used for Financial Accounting) is fit for purpose and reflects reality for the carrying cost of products in only three scenarios, the first two occurring only rarely:
Throughput Accounting is clear on the issue of fixed costs and Operational Expenses: they are predictable, do not change with sales or production, and therefore do not need to be allocated or ‘absorbed’ to products transiting in inventory to arrive at a ‘cost per unit’. All inventories in Throughput Accounting – raw materials, work in process and finished goods – are carried at purchase price or TVC (Totally Variable Costs, calculated as purchase price plus freight-in.)
Throughput Accounting is more concerned with the incremental impact (delta) of decisions upon the following variables in the precise order presented below:
Δ(TH) – Throughput = Sales – TVC (Totally Variable Costs)
Δ(I) – Investments/Inventory = all the money tied up in the system
Δ(OE) – Operational Expenses = all period expenses, excluding TVC, required to turn Investment into Throughput
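As an illustration only – the figures below are invented assumptions, not from the article – the three deltas can be evaluated for a single decision, such as accepting an incremental order on existing capacity:

```python
# Illustrative deltas for one decision: accept an extra order of 50 units
# at $40 each, with TVC (purchase price + freight-in) of $15 per unit.
# All numbers are assumptions for the sake of the example.

def throughput(sales: float, tvc: float) -> float:
    """TH = Sales - Totally Variable Costs."""
    return sales - tvc

delta_th = throughput(sales=50 * 40, tvc=50 * 15)  # 2000 - 750
delta_i = 0   # no new equipment or extra inventory carried
delta_oe = 0  # existing capacity absorbs the order; no new period expenses

print(delta_th, delta_i, delta_oe)  # 1250 0 0 -> the order adds $1250
```

No unit-cost allocation appears anywhere: only the incremental cash effects of the decision matter.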
To get in harmony with Throughput Accounting, one would be well advised to think like a financial analyst! Marginal, direct and variable costing have a lot in common with Throughput Accounting, except for the flawed mental model regarding labor costs, which we cover next.
There are two tragedies in traditional accounting regarding the treatment of labor as a variable cost. One is human and the other financial. We need to thank Dr Goldratt on both counts for exposing and resolving those two flaws.
The first part is tragic, and demonstrates Dr Eli Goldratt’s contribution to a more humane workplace through his handling of labor costs. In the early 1900s, workers were paid on a piece-rate basis: if the chair you built was not acceptable, you were simply not paid.
In that context, labor was indeed a variable cost. Today, knowledge workers are paid no matter what – full-time employees and consultants alike. The proper accounting treatment is therefore to consider labor a fixed cost, like capacity. In that regard, traditional accounting’s treatment of labor as variable is perfectly in line with the 1900s, not the 2020s!
Today, in the Western world, this handling of labor costs as variable has dire consequences. Variable costs are always subjected to more managerial scrutiny than fixed costs such as capacity. Idle machinery at night is totally acceptable, even though the interest on its financing continues to accrue while the plant is closed.
Capacity can therefore be idle, in a wait state, in a blocked state, and so on, and no one gives it a second thought. Such is the nature of fixed costs. Knowledge workers often find themselves in exactly those states – idle, waiting, blocked, or stuck in feedback delays. Note the kinship between Knowledge Work and capacity: the analogy is striking.
But so much for the humane aspect. Let’s address the financial and cash flow incongruity insofar as project planning is concerned. No one likes to leave money on the table. But we do so all the time by ignoring projects that are profitable and replacing them with projects that are less attractive.
When we budget for a project, we must find the benefits (revenues) and match the proper costs in order to arrive at a profit to rank and prioritize our various undertakings.
Remember the ‘cost per number’ fallacies we discussed earlier? The ‘cost per project’ is often blurred by the human resource COST – not CO$T – that we associate with it. Throughput Accounting is concerned with the incremental impact, the marginal effect, of decisions. No matter what you choose to do, or not to do, labor expenses are incompressible and therefore not pertinent to marginal decision-making.
Let’s say that you have, on a permanent basis, ten full-time employees and ten consultants, each consultant on a five-year contract. Each employee requires a cash outlay of $100,000 per year and each consultant, $200,000. Let it be understood that, no matter what, three million dollars will be spent on labor. The notion of variable cost is totally absent from this scenario.
Including a single penny of this money to create a COST for a project will falsely raise the bar and serve no economic purpose.
The proper approach to measuring absolute profitability is incremental Net Profit. In Throughput Accounting it is appraised according to this formula:
Δ Net Profit: ΔNP = Δ(TH) Throughput – Δ(OE) Operating Expenses
Using the workforce already in place will in no way increase Operating Expenses; treating it as a project cost is a mistake.
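To make the distortion concrete, here is a hedged Python sketch using the ten-employee/ten-consultant scenario above; the two projects and their figures are invented purely for illustration:

```python
# Two hypothetical projects ranked two ways. Labor is a fixed pool that is
# spent regardless of the decision, so it never enters delta OE.
LABOR_POOL = 10 * 100_000 + 10 * 200_000  # $3,000,000 either way

projects = [
    # (name, delta Throughput, actual delta OE, labor COST allocated to it)
    ("A", 500_000, 0, 450_000),       # staffed entirely from existing people
    ("B", 300_000, 100_000, 50_000),  # needs extra outside spending (real OE)
]

for name, d_th, d_oe, allocated in projects:
    np_ta = d_th - d_oe                  # Throughput Accounting: dNP = dTH - dOE
    np_costed = d_th - d_oe - allocated  # traditional 'fully costed' profit
    print(name, np_ta, np_costed)
# A: 500000 vs 50000, B: 200000 vs 150000 -> the costed view ranks B first
# and leaves $300,000 of incremental Net Profit on the table.
```

The allocated labor COST raises the bar for the project that uses existing staff, inverting the ranking even though the $3M labor pool is spent in every scenario.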
One of my favorite managerial quotes is from Albert Allen Bartlett and has nothing to do with management.
"The greatest shortcoming of the human race is our inability to understand the exponential function.”
Accountants of all stripes do not envision capacity as a simple flat curve. Yet, in reality, capacity is always a flat line, and accountants will go above and beyond the call of duty to shape this function into anything but a horizontal one. Cost Accounting’s obsession with changing the behavior of fixed costs is a cause of flawed decision-making. Common sense dictates that once you have acquired something at a CO$T, you use it as you deem fit and necessary, without hesitation.
But not for capacity. Not for accountants. There has to be double counting: Cost Accounting assigns an additional COST value upon each usage of capacity that has already been paid for, raising the bar at every attempt you make to use existing capacity.
In Throughput Accounting, data relevancy is king: a transaction ceases to be a concern once the economic activity completes in the current period. For Cost Accounting – and Financial Accounting – data accuracy is mandatory, and the historical details of a transaction persist during and after the economic undertaking, as transactions have no predetermined cut-off for post-financial recording.
To set the record straight, I like the following logic paraphrased from Reginald Thomas Lee Sr regarding capacity:
Let me explain using the following three illustrations, where Cost Accounting (on the left) always tries to change the behavior and shape of the capacity function, while Throughput Accounting (on the right) accepts the true nature of fixed costs, irrespective of sales, production or other subterfuges.
Figure 1 – The more I produce, the lower the unit COST. This ignores the fact that Operating Expenses are totally predictable and fixed per period.
Figure 2 – Each unit sold must ‘carry’ its fair share of fixed costs. If it fails to do so, it will simply not be produced and will not generate additional Throughput! This, despite the fact that fixed costs are predictable and seldom change in relation to sales volume or production!
Figure 3 – The more units produced, the more you need to pay for capacity. This COST illusion is in no way related to CO$T. There are no reasons why you should not use capacity that has already been paid for!
Figure 3 warrants further explanation by providing another illustration.
In Cost Accounting, the product selling floor appears at the top left corner of the above picture. At budget time, after you have ‘guessed’ your sales volume for the year, all the costs are allocated according to some arbitrary driver (hours worked, square footage, units of production, etc.)
Even once you have reached your budgeted sales volume for the year, the hurdle remains: every single item sold must still clear the line drawn in the sand by the required contribution margin and cover its share of fixed costs.
Throughput Accounting paves the way for TH generators: any selling price above TVC can be molded to target markets. In the Cost Accounting world, the product selling floor is off the mark and can truly frustrate the Sales and Marketing departments!
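A small sketch, with assumed numbers, contrasts the two selling floors – the Throughput Accounting floor at TVC versus an absorption-costing floor inflated by an arbitrary fixed-cost allocation:

```python
# Two pricing floors for the same product. Assumed numbers.
TVC = 12.0             # purchase price + freight-in, per unit
ALLOCATED_FIXED = 8.0  # arbitrary fixed-cost share per unit (a COST, not a CO$T)

def generates_throughput(price: float) -> bool:
    """Throughput Accounting floor: any price above TVC adds Throughput."""
    return price > TVC

def clears_cost_accounting_floor(price: float) -> bool:
    """Absorption-costing floor: price must also absorb allocated fixed costs."""
    return price > TVC + ALLOCATED_FIXED

offer = 15.0  # a discounted offer from a new market segment
print(generates_throughput(offer))          # True: adds $3 of Throughput per unit
print(clears_cost_accounting_floor(offer))  # False: the COST floor rejects it
```

The COST floor turns away an offer that would have added real cash Throughput, since the allocated fixed costs are incurred whether or not the sale is made.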
Whether we like to admit it or not, we behave as if we manage in an unconstrained world, where all parts of a system carry equal weight. This is in harmony with Cost Accounting: each part is responsible for optimizing itself and, as if by magic, the sum of all local optima is carried forward to the bottom line.
The Theory of Constraints has a generous view of capacity and grants it three attributes: Productive capacity (to feed the constraint), Protective capacity (to protect the constraint from starving) and, above and beyond, the welcome Excess capacity that provides slack, idleness and other significant managerial benefits. Lean and traditional accounting accept Productive capacity at par but tolerate neither Protective nor Excess capacity, as both are considered wasteful and must be eliminated. The mere presence of slack or observed idleness is chastised.
Let’s consider the below value stream from a cold, detached managerial angle. Each step of the value stream is associated with a geometric shape that depicts its capacity. The BUILD phase has the lowest capacity (it is where a ticket spends the most time compared with the other phases) and is the identified constraint.
Traditional management would dictate that the entire value stream be managed and that all non-productive capacities be eliminated. In that scenario, we would have not one but four constraints and live in an unpredictable, unstable, and costly environment.
Under the lens of the Theory of Constraints, we would be satisfied with simply keeping an eye on the BUILD phase as it dictates the Throughput of the system. All Throughput Accounting decisions should be geared in light of the Five Focusing Steps.
The choice is clear: manage the entire system or focus on the one part that dictates Throughput. Operational excellence is greatly simplified when you have unbalanced capacity and the constraint governs the process.
Many Kanban schools of thought reckon that constraints in Knowledge Work move all the time and that managing them is fruitless. Constraints in the physical world are visible and can be managed and kept in place at will by acting on the three levers of Throughput Accounting: (T) – Throughput (sales), (I) – Investments, and (OE) – Operational Expenses, in that order.
It is important to note that, in the following illustration, the traditional accounting lens is inverted. It puts emphasis first on reducing Operational Expenses, then shrinking Investments and lastly, augmenting sales to increase profits. This is the opposite of Throughput Accounting, where we argue that there is only so much cutting that you can do before making yourself obsolete!
The fact that work is invisible in Knowledge Work is not an impediment to strategic Constraints management. Let me illustrate with this trivial example: if you wish UAT to be the constraint, then just put one resource in this phase as opposed to dozens in each of the Architecture, Design and Build phases. I guarantee that the constraint will freeze and never leave UAT!
All that is required is that the non-constraint phases have excess capacity above and beyond that of the constraint. Visible or invisible work is simply not an issue. We all have ways – especially in Kanban – of gathering metrics pertaining to capacity, queues, and constraint awareness. These include Lead Time, WIP ageing, Backlog Size, Queue Size, Throughput, Flow Efficiency, Order to Cash cycle, Touch Time, Wait Time, etc.
Nevertheless, let us build a better case and see where pinning a constraint (or not) makes sense according to 1) (OE) - Operational Expenses, 2) (I) - Investments and 3) (T) – Throughput. (The same rules apply for physical goods.)
First, (OE) – Operational Expenses. It is fair to assume that Operational Expenses are directly linked to operational complexity. Usually, the further upstream we are in Knowledge Work, the deeper we enter the ‘fuzzy front end’ – the requirements as expressed by the client, which the technical team must materialize. Of all places, this is not where one would want a constraint, where capacity is kept to a minimum. On the contrary, wherever we face upstream operational complexity, capacity should be plentiful. Slack and idleness belong here, and the Architecture and Design gates in our illustration deserve the protective and excess capacities reserved for them, even if this capacity must be harvested downstream from UAT!
Next, let us assume that (I) – Investments, meaning all the equipment and software for running our business, are cost-prohibitive. This clearly impacts the BUILD phase in the earlier illustration. We know that the non-constraints must have excess capacity; thus, putting the constraint at the BUILD phase and keeping capacity to a minimum at that point makes sense. Doing otherwise would require beefing up the BUILD phase’s capacity so that it could support a constraint elsewhere, and having slack at the most expensive spot is ill-advised. When you are undecided about where to put a constraint, think in terms of slack: you do not want slack at the constraint, as it must be kept busy at all times!
Lastly, (T) - Throughput considerations as a constraint. The further you are downstream, the closer you are to the market and on the verge of cashing in. For example, Amazon with its ‘one click’ feature wants to 1) make life easy for you, 2) confirm the inventory availability of your cart, 3) bill you, and 4) ship ASAP. All parts of the system are subordinated to the ‘one click’ constraint and they all have plenty of capacity across the value stream to get you out the door in no time! Amazon’s ‘one click’ never waits for upstream resources.
The presence of a constraint has an impact on how we treat transaction and coordination costs. When we are away from the constraint, idle capacity is plentiful. At those locations there is plenty of slack, and transaction and coordination costs have no impact on the Throughput of the system; that slack should be put to use to increase capacity. Increasing capacity at the non-constraints is totally in line with exploiting the constraint. By increasing capacity through existing operating expenses, a non-constraint becomes better equipped to ‘sprint’ upstream of the constraint should the system require it. Alternatively, increased capacity downstream of the constraint can improve lead time and delivery-due-date performance.
Coordination and transaction costs do, however, have an impact at the constraint. All activities matter at the constraint. Here, it is of paramount importance to guarantee that these activities impact financial Throughput positively. A CO$T is either defined as 1) a cash outflow – never the case with existing operating expenses that are used in transaction/coordination costs – or 2) a lost opportunity for Throughput (cash inflow), which is more likely here if the constraint is not optimized and kept busy at all times. Calling in your top architect or specialist for daily ritual meetings is an example where coordination and transaction costs are anything but meaningless: they squander the constraint’s capacity and forgo Throughput.
Planning, scheduling, orchestrating, working in parallel, and handling big batches are significant transaction and coordination activities, which are essential to having the constraint run at full throttle. These activities are less of a concern at the non-constraint points, since slack and idleness are in abundance and cherished there.
Agile organisations wishing to adhere to the precept of ‘maximising the amount of work not done’, should minimise managerial oversight at the non-constraints.
Financial Accounting treats Inventory as an asset. The more inventory, the shinier the Balance Sheet and the better looking your Income Statement will appear to the detriment of your Cash Flow position. Window dressing and the doctoring of Financial Statements are easily achieved by manipulating inventory levels.
Cost Accounting and Financial Accounting have these innate and perverse properties:
The astute reader will quickly decipher the underlying motivations as to why top management might be tepid in adopting Throughput Accounting. The apprehended, immediate negative impact of reduced inventory levels on the Income Statement is deemed a primary impediment to embracing Throughput Accounting. And this will happen regardless of the quality of your actions or decisions: when inventory decreases, so will reported Income.
Reconciliation of traditional accounting with Throughput Accounting is straightforward and should never be a reason not to use both models, which is the optimal scenario.
Stellar financial performance systems have one thing in common: their quest to minimize Inventory. (In Agile parlance, we can equate Inventory to WIP – Work In Process. The lower the amount of WIP, the better the flow and performance of the system.)
Manufacturing pioneer Henry Ford built his high-performance system in part by minimising WIP. His assembly lines featured homogeneous parts, very small and restrained workstation space to prevent inventory from piling up, and a simple automobile design with few features. We all remember Ford’s famous quote regarding the Model T: ‘You can have any color that you want as long as it is black’.
In post-war Japan, where supplies were scarce, Taiichi Ohno – father of the Toyota Production System – demonstrated brilliance by adapting to the situation and embracing small batch sizes and frequent replenishments. He pursued this approach after a memorable visit to a US supermarket where he observed the practice of filling shelves many times per day, if required, with just enough supplies. High item variability was a factor that Ford didn’t have to worry about. Letting go of the practice of long runs to reduce unit costs was contrary to common practice at the time and is a tribute to Ohno’s vision.
As Ford limited workstation space to curb WIP accumulation, and Ohno embraced small batch sizes and frequent replenishments to tame inventory levels, along came Dr Eli Goldratt – developer of the Theory of Constraints – who chose to replenish on the time axis with his Drum-Buffer-Rope construct. As opposed to Just In Time, where inventory buffers sit at every gate, Goldratt placed a physical Buffer at the constraint, sized as the equivalent of the time it would take to empty it. The Rope function ensured that materials upstream found their way to the constraint in due time, once a replenishment signal was triggered by the pace of the Drum – the beat of work performed at the constraint.
While Ford and Ohno excelled in the physical and manufacturing world, Dr Eli Goldratt’s applications covered many topics that resonate with Knowledge Work, like his Marketing application and CCPM for project management.
We could not leave this discussion on reducing Inventory (WIP) in Knowledge Work without mentioning Little’s Law and its impact on Agility, especially with Kanban where it is used in a probabilistic manner.
The Theory of Constraints is also a consumer of Little’s Law and uses it in a deterministic fashion for operations management.
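Little’s Law itself takes one line to state. The sketch below uses illustrative numbers; the law relates long-run averages in a stable system:

```python
# Little's Law: average Lead Time = average WIP / average Throughput.
def lead_time(avg_wip: float, throughput_per_day: float) -> float:
    return avg_wip / throughput_per_day

# A team that finishes 5 items per day:
print(lead_time(avg_wip=60, throughput_per_day=5))  # 12.0 days
print(lead_time(avg_wip=20, throughput_per_day=5))  # 4.0 days
# Cutting WIP from 60 to 20 cuts average lead time threefold: faster
# feedback loops and less time-decay on waiting deliverables.
```

At constant Throughput, every unit of WIP removed comes straight off the lead time.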
We will revisit Little’s Law for Knowledge Work and its impact on the Five Focusing Steps in another article, but not before making the following observations on the merits of reducing WIP in Knowledge Work. The passage of time – which correlates directly with high WIP – makes a mockery of our planning and estimating efforts. It has a direct decay effect on all deliverables that spend too much time awaiting treatment. The piling up of deliverables at the non-constraints – architecture, design, written code, etc. – must be subordinated to the pace of the constraint, were it located at the UAT phase, for example.
The speed of feedback loops is key in Agile: the faster the answer, the more agile we become. Someone once introduced me to this metric: agility = 1 / the time it takes to get an answer. It gets the message across, and low WIP supports it.
There are two ways to cut WIP in Knowledge Work. The Lean approach dictates that someone in authority cut the amount of WIP in the system, creating a chain reaction: when WIP is reduced, cooperation is forced, and as collaboration grows, so does further WIP reduction – a self-reinforcing feedback loop. The Agile approach ignites the process from the other end: it begets cooperation first, which reduces WIP, which begets more collaboration, and so on. Those are the only two routes to cutting WIP: Lean or Agile.
Another kind of Inventory that impacts (OE) – Operating Expenses must be kept in mind, and it has to do with the depth of your bench.
A good Business Analyst or Architect takes a long time to ramp up; it may take up to a year to appreciate their value. Even when these resources are abundant relative to current demand for their skills, it is imperative to keep them on hand, as business domain knowledge is slow to amass.
On the other hand, a shortage of DBAs or SYSADMINs can easily be mitigated with the ad hoc hiring of resources, as knowledge of the business domain is orthogonal to their performance. The quality of a DBA can be assessed at interview time with little risk. A good Business Analyst, on the other hand, will have to understand the business and take the time required to do so.
While I was learning the accounting profession, it seemed every Management Accounting book was written by Charles T. Horngren. If memory serves, he wrote no fewer than twenty books on the topic and is considered the pioneer of Management Accounting.
In 1993, he stated as much, breaking with tradition and all his previous teachings:
“The criterion for maximizing profits when one factor limits sales is to obtain the greatest contribution to profit for each unit of the limiting or scarce factor. The product that is most profitable when one particular factor limits sales may be the least profitable if a different factor restricts sales. When there are limitations, the conventional contribution or gross margin-per-sales-dollar ratios provide an insufficient clue to profitability.” – The Measurement Nightmare, Debra Smith, p. 21
To this day, this revelation has yet to materialize in the business world!
The following heuristics can also be of use to put Throughput Accounting into practice:
To conclude, all CPA jurisdictions are governed by laws within geographical regions of a country: states in the USA, provinces in Canada. Those laws stipulate that CPAs must help their clients increase their wealth, in addition to providing fiscal, accounting and auditing services.
We can only recall Dr Eli Goldratt’s own words: ‘Let me help you make more money now and in the future’.
The future is now.
- Daniel Doiron, CPA
I love systems, and I enjoy measuring improvement while embracing teamwork that actually works and finds its way to the bottom line! Throughput Accounting helps me build Unity of Purpose within teams and organisations.
I am bringing Throughput Accounting to CPAs, CxOs and Agile Centers of Excellence.
For more on Throughput Accounting for Knowledge-Work, visit https://www.agileagonist.com/
 A long time ago, both Taiichi Ohno of Toyota and Dr Eli Goldratt refused to mince words about Traditional Accounting. They did not want accountants around, deeming them public enemy number one of productivity. That did not help the adoption of Throughput Accounting within the traditional CPA profession.
 Bragg, Steven M. ‘Constraint Management: A Financial and Operational Guide’. AccountingTools Series. 3rd edition
 Smith, Debra. ‘The Measurement Nightmare’. Page IX
 Smith, Debra. ‘The Measurement Nightmare’. Page 55
 The use of ‘standard costing’ in numerous industries to charge clients for work – car repairs for example – has great managerial benefits
 Smith, Debra. ‘The Measurement Nightmare’. Page 45
 The term ‘CO$T’ in this article implies a cash outlay. The term ‘COST’, on the other hand, simply depicts a Cost Accounting construct arrived at by allocating fixed costs to products or services (fixed costs that have already been paid for and do not entail a second cash outlay).
 Hooper, W. The Accounting System Prescribed for Railroads by the Interstate Commerce Commission. The Annals of the American Academy of Political and Social Science. Vol. 63, National Industries and the Federal Government (Jan. 1916, pp 222-231)
 Doiron, Daniel, CPA. Throughput Accounting class for CPA & CxO at www.agileagonist.com 2021
 In Knowledge Work, the concept of TVC (Totally Variable Costs) – meaning costs that vary 1:1 with sales – seldom applies
 Du Plooy, Etienne. ‘Throughput Accounting Techniques’. Page 63
 From the book: ‘Lies, Damned Lies, and Cost Accounting: How Capacity Management Enables Improved Cost and Cash Flow Management’.
 Inspired from Sproull, Robert and Nelson Bruce. ‘Epiphanized’. Page 280
 Balancing FLOW is not done by balancing CAPACITY! FLOW issues are self-revealing in a low WIP system!
 Ricketts, John A. ‘Reaching The Goal’. IBM Press. Page 14
 Smith, Debra. ’The Measurement Nightmare’. Page 92
 When goods go into inventory, one debits (increases) an asset account instead of an expense account, making the Income Statement look better. Continuously growing inventories will always delay the recognition of expenses on the Income Statement. The value of inventories decays with the passage of time, and ‘garbage’ on the Balance Sheet will eventually have to be written off, reporting a loss on the Income Statement!
 A third reason why Throughput Accounting is not ‘embraced’ is that Cost Accounting systems cost a fortune. Having this costly data and information must surely have value in decision-making. It rarely does. It must be stressed again that Financial Accounting is mandatory as it allows the economy and the financial markets to communicate and operate in an orderly fashion.