Clause 61: The Pushback Blog

Because ideas have consequences


Freedom from Choice


Moral autonomy has been a central feature of Western thought since the Enlightenment. In the history of philosophy, the concept is usually considered to have been developed by Kant and further refined by Mill. Given the deep pluralism of belief begun with the Reformation and the Enlightenment emphasis on rational inquiry, I believe the development of a morally autonomous individual was a logical outcome.

Where personal autonomy is the ability to choose one’s own actions, whether moral or not, moral autonomy is the ability to conduct one’s own inquiry into moral behavior and determine for oneself the morally correct course of action. Individualism would not be possible without moral autonomy. Separate from these is political autonomy, which, since it concerns politics, applies to groups: a group having political autonomy can set its own political course.

Moral autonomy requires some discussion of the selfhood of the person involved. A key question is: Does an authentic self really exist apart from the society in which the person lives? A person who answers in the negative will likely emphasize belonging and relationships above individuality and autonomy. At the extreme end of this view, moral autonomy would not even make any sense.

Moral autonomy is also impracticable, if not unthinkable, in a clan-based society like Afghanistan. The individual who attempted to assert his autonomy would put himself outside the protection of his clan. He would be a target for other clans and anyone who wanted someone to pick on. His life would necessarily be solitary, poor, nasty, brutish and short.

Although Western thought has been very far-reaching, it is not universal. It has critics both inside and outside of Western nations. Furthermore, we now have a large number of people in the West who are unaware of the advantages that Western thought has conferred upon them and are not prepared to defend it.

Beyond this, there are tides of opinion that cause moral autonomy to be viewed differently through the decades. The 1930s, for example, were very collectivist years, and autonomy was under attack almost everywhere. Since 1960, moral autonomy has made an uneven comeback in the West, galloping forward in some areas while advancing fitfully and tentatively in others. Being aware of the history, one cannot simply extrapolate the continued advancement of moral autonomy without reversal into the future.

MacIntyre’s Objections

Irving Babbitt quoted a joke from the 1920s asserting that everyone would ultimately have to become either a Marxist or a Roman Catholic. Alasdair MacIntyre has done both, starting as a Marxist but later converting to Roman Catholicism and ultimately taking up a Thomist approach. MacIntyre is considered a very important communitarian thinker.

His first major work was After Virtue (1981), wherein he asserted that the liberal Enlightenment project had failed, and had done so necessarily, not accidentally. While After Virtue was primarily a criticism of where the Enlightenment had gone wrong, it provided hints of what he would substitute for it. Later writings, particularly Dependent Rational Animals (1999), advanced MacIntyre’s positive communitarian program.

Justice and Moral Anarchy

… modern politics cannot be a matter of genuine moral consensus. And it is not. Modern politics is civil war carried on by other means …
After Virtue, p. 253.

MacIntyre sees modern liberal individualism as having descended into emotivism, in which there is no ground for agreement among partisans with competing moral claims. Although we engage in rational argument to persuade others of the correctness of our viewpoint, there is no shared moral foundation, no mutually agreed-upon premises, to which persuasive argument can appeal.

Although the moral claims are advanced by persons, the partisans claim that their arguments are impersonal and even universal.

Yet if we possess no unassailable criteria, no set of compelling reasons by means of which we may convince our opponents, it follows that in the process of making up our own minds we can have made no appeal to such criteria or such reasons. If I lack any good reasons to invoke against you, it must seem that I lack any good reasons. Hence it seems that underlying my own position there must be some non-rational decision to adopt that position. Corresponding to the interminability of public argument there is at least the appearance of a disquieting private arbitrariness. It is small wonder if we become defensive and therefore shrill.
After Virtue, p. 8.

A community attempting to reach political decisions in this way cannot do so on a moral basis, because its members cannot achieve agreement upon premises. Resolution of disputes must therefore in the end be a matter of which side has the stronger will and is prepared to exercise the least restraint in making its will prevail upon others not so minded.

An honest assessment of the events of the past year, at the very least, leads me to believe that the above is an accurate rendering of what we have come to.

The Telos

MacIntyre has a very direct writing style. Chapter 5 of After Virtue is titled, “Why the Enlightenment Project of Justifying Morality Had to Fail.” At root, he claims it had to fail because it disputed the idea of an ordained human purpose, a telos. A telos exists outside of human choice. It imposes ethical obligations on all persons, “just because you live here.” You don’t get to choose whether or not to morally accept it. You can always refuse to honor its demands, but you will be morally less of a person because of your refusal, and good people will shun you.

The assertion of a human telos is a direct attack on moral autonomy. Or, if you prefer, it is equally true the other way around: the assertion of moral autonomy is a direct attack on a human telos. The latter is the more historically correct, because that is one of the consequences of the Enlightenment. It logically follows from the Reformation: once there was no longer one monolithic authority — namely, the Roman Catholic Church — to interpret the telos, who was going to be in charge of the interpretation?

Dissent in Communitarian Societies

How does a communitarian society, which rejects individual autonomy, turn back when it starts to go wrong? The events of the twentieth century have demonstrated that organizations at all levels and scales, from clans to religious movements to commercial enterprises to political entities, are fully capable of going astray. To avoid this issue is to engage in philosophical negligence; it is simply bad risk management. There must be a framework for individuals to dissent from the decisions of the community on moral grounds and seek to have these decisions reconsidered.

The show trials in the Soviet Union in the 1930s were considered remarkable because the defendants willingly acknowledged their own guilt. Why did they do that? Why did they not defend themselves? Solzhenitsyn wrote that they had gone to a moral place from which they could not defend themselves.

And what did Bukharin fear most in those months before his arrest? It is reliably known that above all he feared expulsion from the Party! Being deprived of the Party! Being left alive but outside the Party! And Dear Koba [Stalin] had played magnificently on this trait of his (as he had with them all) from the very moment he had himself become the Party. Bukharin (like all the rest of them) did not have his own individual point of view. They didn’t have their own genuine ideology of opposition, on the strength of which they could step aside and on which they could take their stand. Before they became an opposition, Stalin declared them to be one, and by this move he rendered them powerless. And all their efforts were directed toward staying in the Party. And toward not harming the Party at the same time!
These added up to too many different obligations for them to be independent.
The Gulag Archipelago, vol. 1, p. 414; italics in original.

Without moral autonomy, it was not possible for any of the accused Communists to have an individual point of view, at least not in ethical terms. Without moral autonomy, who were they to oppose the community, even when the community demanded that they sacrifice themselves to it?

This behavior was not confined to communists. I have previously cited the example of Hindenburg. Once German citizens really accepted “the conviction that the subordination of the individual to the good of the community was not only a necessity but a positive blessing,” they did not have a moral leg to stand on when that community chose racist and exploitative collectivists to lead them.

Virtues and Autonomy

The assertion of virtues with a prior moral claim upon all persons can only be squared with moral autonomy if all persons would somehow converge on the acceptance of these virtues. This was part of the great Enlightenment project. Kant hoped to resolve this with the categorical imperative, which American progressive education simplified to, “What if everybody thought that way?” He hoped that all moral and thinking persons, no matter their starting point, would be able to use this to reason their way to a common moral understanding. Kant both underestimated the potential scope of deep moral pluralism and failed to reckon with the ability of people to rationalize.

The discovery of a telos, a higher human purpose, to which all persons could assent without compromising moral autonomy would be a worthwhile project, and I wish success to anyone who undertakes it. However, after all these years of life, study and experience, I doubt that it can be achieved. Where does this leave us?

I hear and acknowledge MacIntyre’s criticisms of Enlightenment inquiry and moral autonomy, but I am deeply skeptical of his program to address them. Individual moral autonomy is a supreme achievement of Western civilization. It is our front-line defense against mass movements that would lead us lemming-like to our destruction.

Supplementary Links

This discussion only scratches the surface of issues involving moral responsibility and the political consequences thereof. For the reader with a deeper interest in the subjects discussed, here are some leads. Bear in mind that any of these will have been written by a person with a point of view.

Internet Encyclopedia of Philosophy

Stanford Encyclopedia of Philosophy


Written by srojak

January 1, 2017 at 12:42 pm

Consumption, Investment and Speculation


With the recent Nobel Prize announcement, previous interviews with co-laureate Robert Shiller have been recycled, such as an interview from February 2013 in which he challenges the received wisdom of a home as an investment.

When is an activity an investment, and when is it not? And if it is not an investment, what is it? Although Shiller makes valid points, the answers are not entirely clear-cut.


It is obvious from talking to people that the differences among consumption, investment and speculation are not well understood. So our starting point has to be a clarification of what each of these is.

Current Accounts

Start with the current activities of any person or group. The entity is producing and consuming wealth in order to stay in existence.


People need food, clothing and shelter in order to live. These are forms of wealth that, to varying degrees, we use up in the course of living. Food has the shortest lifetime; we eat it and it is gone. Clothing lasts a little longer, but it has very little residual value.

Periodically, there is public discussion around the assertion that “70% of Gross Domestic Product (GDP) comes from personal consumption.” As Robert Higgs points out, it is easy to make an unwarranted leap from that assertion to a conclusion about what it signifies. There is no agreed-upon norm for what proportion of the economy personal consumption ought to be.

The assertion itself is open to question, as Michael Mandel explained in 2009. A percentage is a ratio, and as is usual with statistics based on ratios, the denominator is where the slipperiness is. More about this shortly.
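Mandel’s denominator point can be illustrated with a toy calculation. All figures below are invented for illustration; they are not actual national accounts data. The key is that GDP nets imports out of the total, while consumer spending on imported goods still counts inside personal consumption, so the headline ratio can overstate consumption’s share of domestic production:

```python
# Toy illustration (all figures invented) of why the denominator matters
# when quoting "consumption as a share of GDP".
# Expenditure identity: GDP = C + I + G + (X - M). Imports are subtracted
# from the total, but consumer spending on imports is still inside C.

C = 14.0   # personal consumption, including spending on imported goods
I = 3.5    # private investment
G = 3.8    # government spending
X = 2.5    # exports
M = 3.8    # imports

gdp = C + I + G + (X - M)
share = C / gdp
print(f"C/GDP = {share:.1%}")            # the familiar headline figure

# Strip out an assumed imported portion of consumption (here, 2.0)
# to get a rough "domestic consumption" share of the same denominator:
domestic_share = (C - 2.0) / gdp
print(f"Domestic C/GDP = {domestic_share:.1%}")
```

With these made-up numbers the headline ratio comes out near the oft-quoted 70%, while the domestically produced portion is noticeably smaller; the point is the sensitivity of the ratio, not the particular values.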


Few people discuss surplus, but it is the secret sauce of prosperity. Surplus is production less consumption. In order to have anything left over to store up and get richer, one has to be producing more than one consumes; this is called capital formation.
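The arithmetic of capital formation can be sketched directly. This is a toy model with invented figures, not data about any real entity:

```python
# Toy sketch (invented figures): capital formation as accumulated surplus.
# Surplus = production - consumption for each period; the running total
# is the capital stock.

production  = [100, 105, 110, 108]   # wealth produced each period
consumption = [ 90,  98, 104, 112]   # wealth consumed each period

capital = 0
for p, c in zip(production, consumption):
    capital += p - c   # surplus (possibly negative) adds to capital stock
    print(capital)
```

Note that the final period runs a deficit, consuming more than it produces, and the capital stock is drawn down accordingly.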

The surplus produced by a business is called retained earnings. Typically, accountants back into retained earnings: it is what is left on the balance sheet after liabilities and the other components of equity, such as paid-in capital, which are sources of funds, are subtracted from total assets, which are uses of funds.
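A minimal sketch of backing into retained earnings, using an invented balance sheet (the figures are illustrative only):

```python
# Toy balance sheet (all figures invented).
# Accounting identity: Assets = Liabilities + Equity,
# where Equity = Paid-in capital + Retained earnings.
# Retained earnings is the figure accountants "back into":

total_assets    = 500_000   # uses of funds
liabilities     = 300_000   # source of funds: lenders
paid_in_capital = 120_000   # source of funds: investors

retained_earnings = total_assets - liabilities - paid_in_capital
print(retained_earnings)
```

The residual is the accumulated surplus the business has retained rather than paid out.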

In order to have any wealth to work with above the subsistence level, somebody has to either be running a surplus now or have formed and preserved capital from a prior surplus. Borrowing in order to invest presupposes that the lender has capital as a result of a prior surplus, or there would be nothing to lend. If a couple buying a house had enough money saved to pay cash for it, they would not need a mortgage. This is usually not the case. Whoever supplies the wealth for the mortgage must have accumulated the capital in order to lend it.

The problem with GDP is that it really doesn’t measure production at all, but spending. We look at what was spent and assume that the wealth must have been produced in order to be spent. However, there is mounting evidence, in the form of international capital flows, that this assumption is being violated for the United States.


A store of wealth which can be used to fund other wealth-producing activities is called capital stock. Modern economic activity depends on the existence of capital stocks that can finance other activities. I will begin by discussing these in terms of a for-profit business, and then switch to examine what changes when applying the terms to the activities of an individual person.

Paddy Hirsch created the Whiteboard series, a well-developed set of explainers covering complicated topics in economics, business and finance (although with some biases). His discussion of lending is representative of how the topic is usually presented: from the perspective of the bank. While the bank’s perspective is important, the demand for capital is even more important, and is essential to understanding the nature of business activity.

When you start a business, your payables are due now but your income will arrive mañana, maybe. Your suppliers and employees don’t want to wait until you get paid before they get paid. The rent and electric bill are due immediately whether you have money in the till or not. You, the business person, have to front the money and bear the risk of ownership. Thus, you start out in a cash flow bind. Capital provides the funding to pay the bills now and see the business through until the income arrives later. As such, capital is a factor of production and is entitled to its share of income, which is called interest and dividends.


A lender makes capital available at relatively low risk for a fixed return, the interest on the debt. Lenders typically expect not to have their principal at risk, and loaned money has a legal claim senior to that of invested money. The lender relinquishes some upside in return for this.


Investors provide capital in return for an ownership interest. Investors do expect to have their principal at risk, but they also expect greater returns. At least in theory, the management of the business has significant accountability to the investors. This is less so in a publicly traded company, which is another topic of discussion. An investor with a significant position in a private company will have a presence on the board of directors and will have transparency into the operations of the business.


People often describe themselves as investors when they are really speculators. A person who takes an equity position in a business, not because he understands how the business makes money, but because a stranger on an airplane told him to, is really a speculator.

Upper-class people whose social norms discourage them from direct involvement in business affairs are often speculators and targets of swindlers. An example was provided in an en passant remark in the third series of Downton Abbey. The Earl was casting about for a source of income to keep the estate going, and he hit upon: “I hear of schemes every day that will double whatever’s put into them. There’s a chap in America — Charles Ponzi!” The reviewer in Television Without Pity described this as “super-cheap,” but I think it captures the propensity for such people to fall for speculative schemes without being overly preachy about it.

Speculation is inherently risky. The speculator embraces risks that are not even visible to him in the expectation that a greater fool will come along. Public policy tolerates speculators, who magnify market movements, because they also absorb risk.


Where does this leave the holder of common stock in a publicly traded company? Well, it depends on the purpose. A day trader is a speculator, pure and simple; there is not enough time for him to understand the business activities of the companies in which he is taking a position. His motivation is to take advantage of an opportunity that exists based on his expectation of what others will pay him for his stock in the immediate future.

A person who wants to be an investor rather than a speculator would have to know something about the business of the company whose stock she is buying in order to make an effective business judgment about it. She would be able to read the financials and understand the entire package line by line, including the notes. She would often still encounter challenges obtaining transparency into the business; this leads into corporate governance issues that are beyond the scope of this post.

The issues are further muddied because we commonly describe activities such as stock purchasing as “investments” when they may be in reality highly speculative. We discuss investments as if the mere fact that stock was purchased establishes that the purchaser is investing. We also misapply the term to describe speculation in metals. Then we take the same terminology and apply it in a parallel way to real estate.

The Non-Business Perspective

The foregoing definitions are from the perspective of a commercial business which produces wealth and is a going concern as long as it is financially solvent. How do these definitions apply to living persons, whose existence is not contingent on financial results?

Think about paying cash for a new car, with a service life of 5 years. A business that purchased a vehicle as part of its operations would expense part of the value of the car every year, matching this expense against the revenue generated by using the car in each year. The cash out the door at purchase would show up in the statement of cash flows in year one, but the car would be carried on the balance sheet as a depreciating asset.
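The accrual treatment described above can be sketched with straight-line depreciation. The purchase price, salvage value and service life below are invented for illustration:

```python
# Straight-line depreciation sketch (invented figures): a $30,000 car,
# $5,000 salvage value, 5-year service life. Accrual accounting matches
# a slice of the cost against each year's revenue instead of expensing
# the whole purchase price in year one.

cost = 30_000
salvage = 5_000
life_years = 5

annual_expense = (cost - salvage) / life_years
book_value = cost
for year in range(1, life_years + 1):
    book_value -= annual_expense   # asset carried at declining book value
    print(f"Year {year}: expense ${annual_expense:,.0f}, "
          f"book value ${book_value:,.0f}")
```

At the end of the service life the book value has been run down to the salvage value, mirroring the way the owner uses up the vehicle.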

An individual person is usually on cash-basis accounting. Nevertheless, the person who uses the car to commute to work is able to match some of the depreciating value of the car to income produced by its use in every year. At least part of the value of the car paid for up-front would then represent an investment, because it is amortized over a period of years to participate in income-producing activities. It also provides value in logistical functions such as getting to the grocery store. The owner of the car runs down the value of the vehicle, not in a malicious or destructive way, but by using it over its service life.

One could argue that a $20,000 car can perform these functions just as well as can an $80,000 car. If we were to use the $20,000 car as the baseline, a person who paid $80,000 cash up-front for a luxury car would be making a $20,000 investment and consuming the remaining $60,000. This is not a value judgment; the person choosing the luxury car has reasons that are perfectly valid to her, and I do not propose to criticize them. The important point is to recognize that not all of the expenditure in purchasing a long-lived asset is necessarily investment. I am not saying, “investment good, consumption bad.”
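The split above can be written down directly. The $20,000 baseline comes from the text’s example, and the baseline itself is an assumption about what price buys the purely functional service:

```python
# Sketch of the investment/consumption split from the example in the
# text. The baseline (what a purely functional car costs) is itself
# an assumption.

def split_purchase(price, baseline):
    """Portion up to the baseline counts as investment in the functional
    sense used here; anything above it is consumption."""
    investment = min(price, baseline)
    consumption = max(price - baseline, 0)
    return investment, consumption

print(split_purchase(80_000, 20_000))
print(split_purchase(20_000, 20_000))
```

For the $80,000 luxury car this yields a $20,000 investment and $60,000 of consumption; for the baseline car, the entire outlay is investment under this convention.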

We can extend the metaphor by imagining a person paying $100,000 cash for a limited-production collector car, in the hope that the car will be worth more in the future. This would be speculative behavior; the value of the car in the future will be determined not by the use of the car by the owner but by the future demand for the car and the survival of other cars of its kind. The purchaser may represent that he is making an investment here, but it is really mere speculation.

Real Estate

I chose the car example both because it is easy to visualize and because it ties back into Shiller’s argument. What about “investing” in real estate?

Components of the Real Estate Purchase

When you buy a single-family house, you get the house and the land. The structure is, as Shiller says, a depreciating asset — it has a longer life than a car, but it depreciates nevertheless. For example, the expected service life of a well-constructed asphalt shingle roof is 15-20 years, although wind and hail can damage it and reduce the life.

Also, notice the introduction of the term “well-constructed.” The investor has to make sure that the roof really is well-constructed, possibly by personal supervision, either by the owner or a delegate. The speculator doesn’t care much beyond outward appearance; her only interest is what she can get someone to pay for the house.

Not only does the house wear out, but it ages in other ways. Interior décor goes out of style and requires updating or the house looks “dated” and the market value is impaired.

One would expect land to appreciate in value, because we can’t make more land (fills in coastal cities can’t keep up) and we are constantly making more people who want to live on it. However, several factors influence this. Land can be less desirable because it is less productive than neighboring land, or because it is in an area where demand is lower. When the conditions of a neighborhood (South Bronx) or the economics of a region (Detroit metro) deteriorate, demand for land there will go down. Thus, even land is not certain to appreciate in value over time.

I will exclude from this discussion townhouses and condominiums, which carry minimal or no land with the deed. These do not change the discussion except to increase the emphasis on the structure, which is the depreciating asset.

Home Ownership Considered

The headline is naturally sensationalized to get your attention: “Robert Shiller Destroys The Idea Of Investing In A Home.” Does he really? We need to understand the benefits and costs of home ownership.

Personal Benefits of Home Ownership

There are several benefits that a home owner obtains from owning his residence:

  • Control: You can do what you want with the house. Subject to local ordinances and association restrictions, you can hang items, paint and even move walls if you see fit. You can redo the kitchen, choose the appliances and arrange the storage to suit yourself.
  • Permanence: You do not have the uncertainty of leases, changes in landlord/management and deterioration of service. You can count on living there as long as you want to, provided you can pay the mortgage.
  • Privacy: Many people who would be renting apartments can qualify for a single-family house with greater separation from the neighbors.
  • Credibility: Outside of areas that are predominantly populated with renters, such as Manhattan, a homeowner has credibility as a citizen that a renter does not.
  • Tax shelter: Both mortgage interest and real estate taxes are deductible on Schedule A. This helps reduce the operating cost of owning compared with renting.

Notice that most of these are intangible benefits. It would be next to impossible to rigorously assign financial values to them.

Social Benefits of Home Ownership

Watch the movie It’s a Wonderful Life. It was directed by Frank Capra, who had made the Why We Fight series to explain the necessity of American involvement in World War II. Capra really knew how to drive his point home. It’s a Wonderful Life is a paean to home ownership. He even gives the viewer a look at the alternative universe where most people don’t own their own homes: the town has become a pit, with dime-a-dance joints, pawn shops and prostitutes.

Frank Capra was not alone in the belief that a community of owners would be a better place to live than a community of renters. It has been tacit policy for decades to subsidize owners through tax relief and by the creation of quasi-government entities to influence home financing and socialize financing risk. These latter entities are commonly known as Fannie Mae and Freddie Mac.

We believed that a nation of homeowners would build stronger communities than would a nation of renters, because property owners would have a stake in their community. To an extent this is true, although there is not an exact correlation; it is possible to find locales with high levels of owner occupation that are not all that desirable.

When owners start speculating on real estate, there can be problems: they overextend themselves and have nothing left to allocate for the upkeep of the community. However, without the community, the asset value of the property itself is at risk. Jack Knuepfer, chairman of the Board of Commissioners of DuPage County, Illinois from 1978 to 1990, summarized the problem this way in a 1990 interview:

It’s difficult keeping up with growth because when people move to a new house, they generally put every nickel that they have into it and maybe a nickel that they don’t have. And then, all of a sudden, they find out they have to pay for highways and schools and everything else, and that’s sometimes the straw that breaks the camel’s back. They want the lifestyle but would like someone else to pay for it.

At that time, you could drive around DuPage County at night and look at the see-through houses: the owners had made themselves so house-poor that they couldn’t afford window treatments, so you could see from the road clear through the house if the lights were on.

Drawbacks of Home Ownership

A house is a very illiquid asset. It ties up a significant amount of personal wealth in an asset that is not quickly or easily converted back into cash. People have been known to overextend themselves and become house-poor; they have the personal wealth, but most of it is tied up in the house.

If the homeowner wants to relocate to another part of the country, the house must be sold. If the local economy is doing badly, the homeowner can be stuck there if he owes more on the secured debt than the house is currently worth.

Don’t Spook the Cattle

Because of the way real estate pricing fluctuates, it has aspects of a Keynesian beauty contest: what matters when you go to sell a house is not what you think it is worth, but what you think other people will think it is worth.

In the face of a perceived deterioration in the surroundings, the value of the property will fall. The deterioration does not have to be real; all that matters is that enough people believe that the neighborhood is falling apart to, in effect, make it so. Even if one homeowner wants to keep his head in these conditions, the fear of sticking around too long while others pull out can be overwhelming.

Blockbusting is a practice where unscrupulous real estate agents take advantage of racial fears and prejudices of homeowners to create a stampede of panic selling. Blockbusting practitioners turned over a substantial portion of the housing in East Cleveland in the late 1960s, leading to the decline of the city. The problem for the city was not the fact that blacks were moving in, but the rate of turnover and the fire-sale prices leading to reduced valuations and a diminished tax base. Moreover, many white residents who fled took their businesses with them, causing further attrition to the economy and finances of the city.

Real Estate as Consumption

Consider a couple who purchase a newly constructed home, move in and live there for the next thirty years. As soon as the home is finished, the meter starts running on the service life of the various components. As they live there over the course of the years, they are in effect consuming the asset. If they did no maintenance or improvements on it, they would reduce it to its salvage value.

By living in the house, they obtain the shelter benefit. They avoid having to pay rent for a place to live. One could argue that they are paying rent to themselves, and that is the origin of imputed rent. Some countries in Europe tax this as income, which is another topic to take up another day.
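The imputed-rent idea reduces to a simple calculation. The figures below are invented, and the treatment of carrying costs is a simplification:

```python
# Toy sketch of imputed rent (all figures invented). The owner-occupier
# "pays rent to himself": the shelter benefit equals the market rent
# avoided, net of costs the owner still bears out of pocket.

market_rent_per_month = 1_500
annual_imputed_rent = market_rent_per_month * 12

# Taxes, insurance and maintenance a renter would not pay directly:
taxes_insurance_maintenance = 6_000

net_shelter_benefit = annual_imputed_rent - taxes_insurance_maintenance
print(net_shelter_benefit)
```

Countries that tax imputed rent treat something like the gross figure here as income to the owner-occupier; the netting of costs varies by jurisdiction.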

If this hypothetical couple were to have made a significant down payment, so that they had equity in the house from day one, they would have made an investment in the accrual accounting sense: they would have paid up front for an asset (or at least part of one) that they would depreciate through use over subsequent years. However, this is not the same as the general understanding of the term investment.

Real Estate as an Investment

What does investing in real estate look like? How do we tell the difference between investing and speculating?

  • A person who purchases a house to rent it is an investor. The investor seeks an income stream in the form of rents as return on the investment. The investor may engage the services of property managers, tradespersons and various others to manage and maintain the property; this does not modify the situation. The investor doesn’t have to put her own personal sweat into the property in order to be an investor.
  • A person who purchases a house planning to live in it for two years and sell it at a profit is a speculator. He may have projections based on past performance of real estate in the area; the future results may not meet those expectations.
  • What about a person who purchases a house needing work, fixes it up and flips it? This falls in a gray area. To the extent that he depends upon his improvements to realize value, he is an investor. To the extent he is counting on future market conditions to be above current market conditions, he is speculating.

To an extent, we need to be able to see inside the soul of the person buying the house to distinguish between investment and speculation.

Closing the Loop

Return now to the interview with Robert Shiller. The title of the article is, “Robert Shiller Destroys The Idea Of Investing In A Home,” but it should really be, “Robert Shiller Destroys the Idea of Speculating in a Home.” Shiller takes the idea of investing in real estate at face value, considers the financials and shows that the home buyer motivated by asset speculation is taking on a lot of risks.

However, the total package as described above is more than just the asset price of the house. For the owner-occupier who wants control over the premises, doesn’t want a landlord or is motivated by the other, less tangible benefits, home ownership still makes sense. Such a person should maintain the house, avoid large renovations justified by the rationale that “it’s an investment” and look for ways to diversify holdings so that more liquid assets are available if cash is needed.

Written by srojak

October 25, 2013 at 10:51 pm

Positive and Normative Economics


In economics, we make a distinction between the positive study, which is descriptive, and the normative study, which is prescriptive. It is a distinction that you may hear early on in an introductory class and never revisit through a whole degree program. However, it is worth revisiting in order to understand the limitations of economics. It also helps explain some of the behaviors we make jokes about:

If you stacked all the economists in the world end to end, they still wouldn’t reach a conclusion.

Ask five economists and you’ll get five different explanations; six if one went to Harvard.
— Edgar Fiedler

An economist is an expert who will know tomorrow why the things he predicted yesterday didn’t happen today.
— Laurence J. Peter

Economics is the study of allocation of scarce resources. It is typically considered to discuss money, but time is also a scarce resource and the rules of economics can be applied. We also use economics to figure out how to make scarce resources less scarce. One cannot do much to make more hours in a day, but one can do a lot to make wealth more abundant. One can also screw up and make wealth less abundant.

Positive Economics

Positive economics concerns itself with how people, firms and nations actually do make economic decisions. It is fact-based and testable.

Microeconomics, which is the study of individual people making decisions for themselves and the firms for whom they act as agents, is by nature more testable than macroeconomics, which is the study of entire national economies. It is very difficult to construct a control group for an entire nation. The nation either took an action or it didn’t; there is no apples-to-apples comparison available to the same nation in the same circumstance making an alternative choice.

Because positive economics is all about the facts, it does not attempt to say what we should do. The most positive economics can offer as guidance is to understand the costs of an action; if you do this, you give up being able to do that.

Normative Economics

Normative economics attempts to advise us on how we should make economic decisions, and what those decisions ought to be. This is where we attempt to apply norms to the science of economics to obtain better results. But where do the norms come from? The norms have to be informed by ethics, which is larger than economics.

Any economist has an ethical position from which she obtains her norms and applies them to economics. When Robert Barro, Paul Krugman or Greg Mankiw advise what we should do about an economic problem, their preferences are informed by their ethics. Even the determination of whether or not an economic condition is actually a problem to be solved, such as income inequality, requires principles from outside of economics.

Without agreement on philosophical and ethical questions, the most we can do in economics is concur with others. Concurrence is useful in building effective political coalitions, but it is more limited than agreement. We have to recognize the difference.

It would be really helpful if economists would do full disclosure on their ethical principles so that we could recognize where they are coming from when they make their pronouncements, but usually they do not. We have to interpret their works in order to infer their beliefs.

Efficiency
Economic discussions often feature arguments from the point of view of efficiency. Many of these presentations are attempts to provide normative arguments justified solely in economic terms. To the extent that this is going on, it doesn’t succeed as a justification.

Efficiency is not an end in itself. We do all sorts of things every day that are inefficient, especially where risk management is involved. Having locks on your doors is inefficient; it would take less effort to dispense with carrying and finding keys. Backing up computer data is inefficient. The system of checks and balances built into the US Constitution is enormously inefficient — by design: the framers did not want a government that was efficient at ordering citizens around.

If you are going in the wrong direction, doing so efficiently is not a justification. Arguments based solely on efficiency invite a prior question: before deciding whether we can do this more efficiently, should we be doing it at all?

The Is-Ought Problem

One of the great projects of the Enlightenment was to identify a set of naturally and objectively observable facts and derive normative principles from them, without resort to any outside assertions or dogmas. Europe had just been through a series of bloody wars of religion, where subgroups within Christianity had tortured and murdered those who did not worship Christ as did the people in power. Enlightenment thinkers hoped to find a way to start with observations of facts that everyone could agree upon and then, without resorting to anyone’s specific faith or interpretation, derive moral principles that we could use to build a just and ethical society. Such a society would wield authority with legitimacy, because all citizens could see that the rules were derived only from objective facts accessible to all. It would not depend on faith (or lack of it), belief or subjective feeling of the people in power.

This was a great aspiration, and I truly wish it could have been achieved. In fact, it is not possible. In 1739, David Hume wrote that normative statements cannot be validly derived solely from positive statements of observable facts. Hume was correct. Yet even he shrank from the full implications of his theory:

Those who have denied the reality of moral distinctions, may be ranked among the disingenuous disputants; nor is it conceivable, that any human creature could ever seriously believe, that all characters and actions were alike entitled to the affection and regard of everyone. The difference, which nature has placed between one man and another, is so wide, and this difference is still so much farther widened, by education, example, and habit, that, where the opposite extremes come at once under our apprehension, there is no skepticism so scrupulous, and scarce any assurance so determined, as absolutely to deny all distinction between them. Let a man’s insensibility be ever so great, he must often be touched with the images of Right and Wrong; and let his prejudices be ever so obstinate, he must observe, that others are susceptible of like impressions. The only way, therefore, of converting an antagonist of this kind, is to leave him to himself. For, finding that nobody keeps up the controversy with him, it is probable he will, at last, of himself, from mere weariness, come over to the side of common sense and reason.
— David Hume, An Enquiry Concerning the Principles of Morals, 1751

Hume believed that there were implicit limits to what people could morally disagree upon, that there was some natural common ground where we all would agree on moral principles. The 20th century has been a rather conclusive demonstration to the contrary. Hume hadn’t met really energetic believers such as those that emerged after 1800, who don’t get weary of being ignored in their own lifetimes and even seek out extreme positions in order to provoke a reaction. If Hume’s antagonist is left to himself, he may circulate his ideas like viruses until they find men of action who will attempt to put them into practice. The Nazi extermination camps and the Soviet Gulag were created and operated by people who believed in the moral correctness of what they were doing.

So where does this leave us? The problem of the legitimacy of norms appears more intractable than the Enlightenment thought leaders had believed. The first call to action is for all of us to understand where we are being summoned to go. When presented with normative prescriptions, we must demand transparency into the norms that underpin them. This is a lot of work, but the alternative is to risk being led by high-sounding words into a moral place we never meant to go, doing things we never really wanted to do in pursuit of ends we do not ultimately find acceptable.

Written by srojak

August 10, 2013 at 11:37 am

Liberal Capitalist Democracy

leave a comment »

When words have no defined meanings, it is hard to hold an intelligent conversation. It should really be no surprise that our national conversation is so shrill and so inconclusive, when we can’t even agree on what the words mean.

Liberal
The words liberal and liberty have a common root, the Latin liber, meaning free. In late medieval England, a liberty was a piece of land in possession of a lord or abbot, into which royal officials such as sheriffs may not enter without permission of the possessor [Roberts and Roberts, A History of England, vol. 1, p. 339]. By 1630, the meaning of the word liberty had been broadened to mean freedom for anyone from domination by the king and court. A person who supported the rights of persons against the divine right of the monarch became a liberal.

The word was also used by opponents to include persons who did whatever they wanted without moral restraint, seeking to imply that those who would defy the will of the king today would act without any limits tomorrow. However, by 1700 there was general understanding that such a person was not a liberal, but a libertine.

In nineteenth century terms, the opposite of a liberal was a reactionary, someone who reacted to threats to established order and privilege. Liberals sought freedom of speech, assembly and worship, and to end slavery and involuntary servitude. Liberals wanted all people (well, initially all men, but the program did broaden to include women over time) to have the freedom to choose their residence, occupation and avocations.

In the United States in the 1930s, political activists who were not at all liberal adopted the term for themselves in order to give themselves the appearance of continuity with existing political traditions. They sought to paint themselves as the heirs of liberalism and their opponents as the party of reaction. The success model for this was Lenin: in 1903, after a split within the Russian Social Democratic Labour Party, he and his followers began calling themselves bolshevik (majority) and their opponents menshevik (minority), even though it was quite the other way around. Lenin and his followers had lost the vote, were marginalized within the party and ultimately sought exile abroad. However, through endless repetition, they succeeded in being known to history as Bolsheviks.

Irving Babbitt had seen this coming: in his 1924 book Democracy and Leadership, he devoted an entire chapter to “True and False Liberals.”

It is a matter of no small importance in any case to be defective in one’s definition of liberty; for any defect here will be reflected in one’s definition of peace and justice; and the outlook for a society which has defective notions of peace and justice cannot be regarded as very promising. [p. 262]

The history of the twentieth century has demonstrated the wisdom of this observation.

Most people who we identify as liberals are not really liberals at all. They are prepared to sacrifice liberal goals such as individual self-determination and rule of law to ends that they consider to be higher priorities. This in itself does not invalidate their programs, but we should not mistake them for liberals. It confuses everyone’s thinking.

Capitalist
Capitalism was named by Marx and primarily defined by its detractors. It seems that those who understand capitalism go out and make money, while those who do not understand it write content complaining about how unfair the system is.

Partly as a result of the ideological conflicts during the Cold War, capitalism has typically been identified with private property, but not all economic systems featuring private ownership of property are inevitably capitalist. Manorialism, which was the economy of feudalism, included private property but also enforced servitude and restricted economic growth. Fascist states have typically allowed private ownership of property, but restricted how the individual could make use of it. You can have all the headaches of owning and caring for property; we’ll just tell you what you can and cannot do with it.

Capitalism could not take root until ordered conditions such as rule of law and enforcement of agreements over time — contracts — had been firmly established. Even today, in countries where one cannot count on enforcement of a contract, economic development is next to impossible.

The deployment of capital necessitates risk. Lenders have relatively low risk; they are senior to investors in their legal rights to recover their money. Investors have greater risk than lenders, and expect greater rewards when successful.

Owners have the greatest risk of all. Being an owner just means that you get paid after everyone else is paid, if there is anything left to pay you with. If not, you get the losses.
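The seniority ordering above can be sketched as a toy liquidation waterfall. The `liquidate` helper and every number in it are illustrative assumptions of mine, not anything from the post:

```python
# Toy liquidation waterfall: claims are paid strictly in order of seniority.
# Lenders are paid first, investors next, and owners keep only the residual.

def liquidate(assets, lender_claim, investor_claim):
    """Distribute liquidation proceeds by seniority; owners get what is left."""
    to_lenders = min(assets, lender_claim)        # senior claim paid first
    remaining = assets - to_lenders
    to_investors = min(remaining, investor_claim) # junior claim paid next
    to_owners = remaining - to_investors          # residual claim, paid last
    return to_lenders, to_investors, to_owners

# A firm with $100 of senior debt and $50 of investor claims:
print(liquidate(200, 100, 50))  # healthy: (100, 50, 50), owners keep the upside
print(liquidate(120, 100, 50))  # distressed: (100, 20, 0), owners wiped out first
print(liquidate(80, 100, 50))   # insolvent: (80, 0, 0), even lenders take a loss
```

The third case shows why lenders still carry some risk, and the middle case shows why owners, as residual claimants, absorb losses before anyone else does.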

A person who can form capital and manage risks in its use can obtain rewards far greater than a hard-working person who takes no risks. This is the part that Marx, who was wedded to the labor theory of value, did not comprehend. Marx, along with many others, could not understand why a hard-working laborer should be rewarded less than a man who sent his money out to work for him. The answer is that you can find a thousand persons who are willing to work hard for every one person who is willing to take risks. However, without the person who is willing to take risks, you don’t get the benefit of capital. At best, you have people piling their surplus up and storing it under the mattress. At worst, you have a peasant society, where people eat their surplus when times are good and starve en masse when times are bad.

It is human nature to try to fob the risk off to someone else but keep the reward. However, this breaks capitalism. An economy that allows this is not capitalist, but something rather different. Theodore Lowi noticed this decades ago:

Privileges in the form of money or license or underwriting are granted to established interests, largely in order to keep them established, and largely done in the name of maintaining public order and avoiding disequilibrium. The state grows, but the opportunities for sponsorship and privilege grow proportionately. Power goes up, but in the form of personal plunder rather than public choice. It would not be accurate to evaluate this model as “socialism for the rich and capitalism for the poor,” because many thousands of low-income persons and groups have profited within the system. The more accurate characterization might be “socialism for the organized, capitalism for the unorganized.”
— Lowi, The End of Liberalism, 2nd ed. (1979), pp. 278-279.

Lowi developed his characterization of the new political economy this way:

Permanent receivership would simply involve public or joint public-private maintenance of the assets in their prebankrupt form and never disposing of them at all, regardless of inequities, inefficiencies, or costs of maintenance.
— Lowi, p. 279.

The enterprise in question need not be on the verge of bankruptcy or a candidate for liquidation. It could simply be large enough to represent a risk of dislocation to the economy if it were to collapse: too big to fail.

This could be called anticipatory receivership suggesting that the policy measures appropriate for the concept give the government a very special capacity to plan. Permanent receivership can be extended outward to include organizations that are not businesses. If there are public policies which are inspired by or can be understood in terms of this expanded definition, then we have all the elements of a state of permanent receivership.
— Lowi, pp. 279-280.

This is not capitalism at all. There is no creative destruction; it is a goal of policy to avoid destruction in any form. There is no risk, provided you are included in an approved group. There is no profit-and-loss discipline. And all organizations, including but not limited to business enterprises, become “public-private partnerships” directed to obtain public policy goals. Properly understood, this is a species of corporatism:

A U.S.-style corporate state has arrived unsung, unheralded and almost never mentioned. The emergence of corporatism has to do with the parallel emergence of Big Labor, Big Agriculture, Big Business, Big Universities, Big Defense, Big Welfare and Big Government, all operating in a symbiotic relationship. It also has to do with the growth of modern social policy, with the government assuming a great role in the management of the economy, with the greater emphasis on group rights and group entitlements over individual rights, and with the growth of a large administrative-state regulatory apparatus.
— Howard J. Wiarda, Corporatism and Comparative Politics, p. 147.

Main Street may be mostly capitalist, but Wall Street and K Street are solidly corporatist.

Democracy
There is a material difference between a democracy and a republic. Although many people use the terms interchangeably, they are not in fact synonymous.

In The Federalist #10, James Madison makes clear that a republic can offer safeguards against mob rule that a democracy cannot.

A republic, by which I mean a government in which the scheme of representation takes place, opens a different prospect, and promises the cure for which we are seeking. Let us examine the points in which it varies from pure democracy, and we shall comprehend both the nature of the cure and the efficacy which it must derive from the Union.

The two great points of difference between a democracy and a republic are: first, the delegation of the government, in the latter, to a small number of citizens elected by the rest; secondly, the greater number of citizens, and greater sphere of country, over which the latter may be extended.

Neither Madison nor most of his contemporaries — possibly excepting Jefferson — saw direct democracy as a desirable outcome. This viewpoint was not limited to southern planters:

It is a besetting vice of democracies to substitute publick opinion for law. This is the usual form in which masses of men exhibit their tyranny. When the majority of the entire community commits this fault it is a sore grievance, but when local bodies, influenced by local interests, pretend to style themselves the publick, they are assuming powers that belong to the whole body of the people, and to them only under constitutional limitations. No tyranny of one, nor any tyranny of the few, is worse than this. All attempts in the publick, therefore, to do that which the publick has no right to do, should be frowned upon as the precise form in which tyranny is the most apt to be displayed in a democracy.
— James Fenimore Cooper, The American Democrat (1838), p. 71.

The framers of the Constitution sought safeguards to prevent the tyranny of the mob. They divided the government into distinct branches that could block the initiatives of the others. They also set specific limits on what each branch could do.

The framers also divided power between the federal government and the governments of the sovereign states. This balance was an unfortunate casualty of the Civil War. The Secession Crisis discredited the concept of states’ rights, and there was a subsequent erosion of state power. In 1913, the 17th Amendment was ratified, replacing the election of senators by state legislatures with direct election by the citizens of the states. This, together with the increasing cost of running a statewide election campaign, has turned the Senate from a legislative body representing the states and a counterweight to the House of Representatives into an American House of Lords. The attempt by Caroline Kennedy to obtain the seat from New York being vacated by Hillary Clinton in 2008 was symptomatic of this state of affairs.

Even Irving Babbitt must be questioned; after all, he titled his book Democracy and Leadership, not Republicanism and Leadership. What did he really want? He clearly did not support egalitarian democracy:

If we go back, indeed, to the beginnings of our institutions, we find that America stood from the start for two different views of government that have their origin in different views of liberty and ultimately of human nature. The view that is set forth in the Declaration of Independence assumes that man has certain abstract rights; it has therefore important points of contact with the French revolutionary “idealism.” The view that inspired our Constitution, on the other hand, has much in common with Burke. If the first of these political philosophies is properly associated with Jefferson, the second has its most distinguished representative in Washington. The Jeffersonian liberal has faith in the goodness of the natural man, and so tends to overlook the need of a veto power either in the individual or in the state. The liberals of whom I have taken Washington to be the type are less expansive in their attitude toward the natural man. Just as man has a higher self that acts restrictively on his ordinary self, so, they hold, the state should have a higher or permanent self, embodied in institutions, that should set bounds to its ordinary self as expressed by the popular will at any moment. The contrast that I am establishing is, of course, that between a constitutional and a direct democracy.
— Babbitt, pp. 272-3.

More properly, it is the contrast between a republic and a democracy.

A direct democracy is, in fact, a sentimentalist fantasy. Each citizen must allocate her time among the demands of citizenship and the other interests and occupations she may have. Some citizens will make the economic decision to relinquish participation in governance, delegating their voice to others and accepting the results. There can never be an effective direct democracy, because even if everyone can participate, not everyone will. This is not alleviated by technology; it is a natural consequence of the different priorities and time allocation decisions of the citizens. As Madison believed, a republic in which citizens are represented by those who have chosen to commit their time to the responsibility is the only practical approach to self-government.

The sentimentalist also ignores the possibility that some citizens may not view political responsibility as a good at all. Anyone who is out in the world paying attention knows some persons who would rather relinquish power to others than have to take responsibility for their own decisions. Such persons are easily led, and their scope for malignant effects on the body politic is much greater in a democracy than in a republic.

Before calling for reforms to increase democracy, we must review whether democracy is something we really want. Our predecessors who founded this country did not, and there is no evidence that they were wrong.

Written by srojak

May 26, 2013 at 11:02 pm

Fiat Money

leave a comment »

Here is a simple but effective presentation on fiat money:

The problem with a gold standard is that the monetary base (roughly, the quantity of money in circulation) can’t expand to keep up with the economy. Then there is too little money chasing the available wealth, which depresses the economy.

The problem with fiat money is that the monetary base can be expanded by the government to be anything. It need have no relation to the actual economy. Then there is too much money chasing the available wealth, which inflates the economy.
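Both failure modes can be seen in the textbook quantity-of-money identity, M × V = P × Y (money stock times velocity equals price level times real output). The sketch below is a toy illustration of that identity, not something from the presentation itself, and all of the numbers are made up:

```python
# Toy illustration of the quantity-of-money identity M * V = P * Y.
# Solving for the price level: P = M * V / Y.

def price_level(money, velocity, real_output):
    """Price level implied by the identity, holding velocity fixed."""
    return money * velocity / real_output

# Gold standard: money stock pinned at 100 while real output grows.
for y in (100, 110, 121):
    print(round(price_level(100, 2.0, y), 3))
# Prices fall as the economy grows (2.0, 1.818, 1.653): deflation.

# Fiat money: money stock grows faster than real output.
for m, y in ((100, 100), (120, 110), (145, 121)):
    print(round(price_level(m, 2.0, y), 3))
# Prices rise (2.0, 2.182, 2.397): inflation.
```

The identity itself says nothing about which variable adjusts; the post’s point is that under gold the numerator is stuck, while under fiat it is unconstrained.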

Written by srojak

May 23, 2013 at 3:59 am

Three Forces that Will Shape American Life

leave a comment »

There are three forces that will determine our future. They are essential to our lives, and they are often misunderstood or underappreciated. They are ownership, accountability and risk.

Ownership
The term knowledge worker is generally credited to Peter Drucker, who was writing about it as early as 1959. There were knowledge workers before this: a military officer is a commander, but has also been a knowledge worker since at least Napoleonic times. Lawyers have always been knowledge workers. However, it was in the mid-twentieth century that knowledge work began to proliferate.

If you want to make your commanding officer look like an absolute fool, do exactly what he tells you to.
— Military precept

In order for a person to be effective in knowledge work, the person must be engaged. A person waiting to be told what to do will be a failure as a knowledge worker. For the person to be engaged, the person must feel that her contribution is wanted. You can’t have it both ways: you can’t have an engaged worker hanging on your every command. Without ownership of the task, engagement falls away.

Why is it that I always get the whole person when what I really want is a pair of hands?
— Attributed to Henry Ford

There has long been a tension in organizations over knowledge work. By its very definition, it does not admit detailed supervision. If the manager has to review every decision of the worker, then two are doing the job of one. The manager who attempts this will have a very small span of control. However, there is a conflict with the centripetal forces of the organization: status, privilege and the need for control.

The tension between ownership and control has never been fully resolved. Some management teams achieve a modus vivendi that allows their enterprise to succeed for a while, but ultimately the balance cannot be maintained. The organization either hardens into a controlling environment, or the lack of control causes it to burst at the seams.

That’s just how it always went with one of these new Silicon Valley hardware companies: once it showed promise, it ditched its visionary founder, who everyone deep down thought was a psycho anyway, and became a sane, ordinary place.
— Michael Lewis, The New New Thing, p. 47.

And sane, ordinary places get the same, predictable results. Disengaged people hang around sane, ordinary places; engaged people leave at the first opportunity.

The problem also exists in the public space. The orthodoxy we learned in seventh grade civics says that power derives from the consent of the governed. Passive consent or active consent? It’s a harder question than it looks. From the evidence of recent experience, you can fool many more of the people much more of the time than we are comfortable with. But, having fooled them, you can’t credibly turn around and ask them to take responsibility for the results.

Accountability
You can find material praising accountability, but it is really more selective than commonly understood. Second-person accountability — accountability for you — is always a Good Thing. First-person accountability — accountability for me — is a trickier subject.

Clearly, when the outcome is positive, the person responsible is in favor of accountability. However, when the outcome is unfavorable, it becomes an orphan. There are always extenuating circumstances why I shouldn’t be held accountable for the bad outcome.

The American education establishment has shown thought leadership here, promoting the idea that only they can evaluate their own results. Only the professionals are equipped to hold themselves accountable. From this you make a living?

The problem with inadequate accountability is that no learning takes place. If no one is accountable, everyone does the same things they always did and hopes for a different outcome. “The definition of insanity is doing the same thing over and over, hoping for a different result.”

Risk
The story of our society over the past hundred years is the story of the attempt to remove risk from everyday life. It has not worked, and it has resulted in epic boredom. It has also caused people to no longer understand risk and how to manage its tradeoffs. The country has become risk-averse to a level that cannot be sustained, that cannot allow it to function. We have also fixated on spectacular hazards with remote probabilities while being blind to less consequential hazards with higher probabilities.

Our modern middle class is the descendant of an older gentry composed of independent farmers, small businessmen, self-employed lawyers, doctors and ministers.
— Barbara Ehrenreich, Fear of Falling, pp. 78-79.

As Ehrenreich notes, independent farmers, small businessmen and self-employed lawyers faced business risk. The new middle class that has arisen since mid-century largely consists of a credentialed salariat. This cohort is highly risk-averse but often does not recognize the risk it actually incurs. When I hear someone talking about buying stock in his own employer at market price, I know I am in the presence of someone who does not naturally think about risk.

Some nations have sought planning capacity by socializing production — one or more basic industries. Some may try planning by the socialization of natural resources, while others may socialize the delivery of central services. Some may socialize banks while others seek to socialize the distribution of goods. The United States has so far skirted all these alternatives in favor of socialization of its most valuable resource: risk.
— Theodore J. Lowi, The End of Liberalism (2nd ed.), p. 289.

At a macro level, the elimination of risk had produced, by 1960, a producer-centric economy with only the fiction of profit-and-loss discipline at the large corporate level. This ultimately resulted in bailouts for corporations such as Lockheed (1971) and Chrysler (1980). If the corporations were allowed to go under, where would all the workers go?

The economy can still be characterized as “socialism for the organized, capitalism for the unorganized,” as Lowi did in 1979. Patches of capitalism are observable in technology, where business mutates too fast for public policy and administration to keep up. The core of the economy, however, is managed with the goal of socializing risk. That is great for people who fail, but not so great for the people who have to subsidize them.

Reducing risk through public policy, by regulation and underwriting, protects the underperforming and incompetent. It rewards them for holding the economy hostage. This approach suffers the same deficiencies as avoidance of accountability: what incentive does anyone have to learn, to do better next time?

These three basic forces — ownership, accountability and risk — are out in the wild having real, if unseen, effects on everyday life. If they remain improperly understood, they will upturn the carefully crafted society that has been built in partial ignorance of them.

Written by srojak

May 11, 2013 at 10:22 pm

Material Scarcity

leave a comment »

Currently, there is an extensive debate going on both here and in Europe as to whether austerity or some form of pump-priming is called for to rectify the problems in the economy. Both sides have persuasive arguments, yet seem to be talking past one another. There seems to be a basic difference that is not being addressed. Without discussing it and resolving it, any agreement on conclusions is at best accidental and temporary.

Meanwhile, you can plug the term post-scarcity into the search engine of your choosing and find all kinds of material purporting to explain how we live in an age of abundance, and scarcity is a thing of the past. The people writing this content are often articulate and, one infers, intelligent. So why do they think this?

To answer these questions, we have to go back in history. We must return to the birth of macroeconomics and the creation of the world as we have come to know it.

Urbanization and the Great Depression

Prior to about 1850 here, or 1800 in England, the workings of the economy were very simple. Most people were engaged in the production of food. There were few factors of production available to produce anything other than food. Manufactured goods were relatively expensive compared to what they are today. The pre-industrial economy was basically a subsistence agricultural economy with relatively little surplus. Agricultural production was heavily dependent on labor, and that labor needed most of the product for its own maintenance. When conditions adversely affected food production, such as bad weather, famine ensued.

Through deliberate capital formation and risk management, the West crawled out of subsistence during the 1800s. As manufactured goods became cheaper, there was more scope to substitute capital for labor. The released labor moved to the cities and became available to produce more manufactured goods, and a virtuous cycle began. By 1920, there were more Americans living in towns and cities than on farms. This was not thought possible one hundred years earlier.

But around 1930, an inexplicable disaster arrived. Economic activity just stopped, and few people really understood why. Worse, it seemed that any rational action by any participant just made the situation worse. When people reacted to the uncertainty of future incomes by cutting spending, production or lending, these actions only accelerated the problem.

But Who Was Confident Enough to Commit to Car Payments?

The government took to putting up posters such as the one above, exhorting people to buy consumer goods. But when their own incomes were uncertain, they didn’t dare make that kind of commitment. They pulled back, as did factory owners and lenders. In fact, the pro-cyclical policies of the Federal Reserve turned a bad business downturn into an epic depression.

At the height of the Depression, it seemed as if everything had come unglued. George Orwell captures the spirit and feeling in this often-quoted passage from The Road to Wigan Pier:

We walked up to the top of the slag-heap. The men were shovelling the dirt out of the trucks, while down below their wives and children were kneeling, swiftly scrabbling with their hands in the damp dirt and picking out lumps of coal the size of an egg or smaller. You would see a woman pounce on a tiny fragment of stuff, wipe it on her apron, scrutinize it to make sure it was coal, and pop it jealously into her sack. Of course, when you are boarding a truck you don’t know beforehand what is in it; it may be actual ‘dirt’ from the roads or it may merely be shale from the roofing. If it is a shale truck there will be no coal in it, but there occurs among the shale another inflammable rock called cannel, which looks very like ordinary shale but is slightly darker and is known by splitting in parallel lines, like slate. It makes tolerable fuel, not good enough to be commercially valuable, but good enough to be eagerly sought after by the unemployed. The miners on the shale trucks were picking out the cannel and splitting it up with their hammers. Down at the bottom of the ‘broo’ the people who had failed to get on to either train were gleaning the tiny chips of coal that came rolling down from above—fragments no bigger than a hazel-nut, these, but the people were glad enough to get them.

That scene stays in my mind as one of my pictures of Lancashire: the dumpy, shawled women, with their sacking aprons and their heavy black clogs, kneeling in the cindery mud and the bitter wind, searching eagerly for tiny chips of coal. They are glad enough to do it. In winter they are desperate for fuel; it is more important almost than food. Meanwhile all round, as far as the eye can see, are the slag-heaps and hoisting gear of collieries, and not one of those collieries can sell all the coal it is capable of producing.

The scene defied explanation. There was labor available to produce, but no labor was wanted. The means of production were there, but were lying unused and decaying. The raw materials were there, but there was no effort to extract them, for they could not be sold. What happened? More importantly, how could we make it stop and return to conditions of prosperity?

Demand Management

Intelligent, caring, earnest people looked at this situation and concluded that the basics of existence had changed: we were no longer constrained by the supply of wealth, but by the demand for it. They reasoned that, with the onset of urbanization and mechanization, we were now living in a time of material abundance, not scarcity.

This notion was not entirely new at the time. Catchings and Foster had done influential work in the 1920s developing the theory of underconsumption. Once the economy had reached a level where there was sufficient food, clothing and shelter for most people, would they become sated and just stop consuming? If they did, would the economy seize up? Could that be the underlying cause of the Depression?

Demand for wealth, not supply, was now seen as the bottleneck. As such, all the truisms of the past were inverted. The problem of economics would not be production, but distribution. Say’s Law had said that supply creates its own demand; in this new world, demand would create its own supply. Create the demand and the wealth to meet it would come from somewhere.

There was no effective competing theory available. Andrew Mellon’s liquidationism was manifestly unacceptable; these were real people — and real voters — living in Hoovervilles. The economic and political establishment was largely uninterested in the Austrian School.

Although FDR never got the economy out of the hole until war production began in earnest in 1940, he was seen by the public to be Doing Something. Even people who lived through it, and knew that unemployment did not really return to normal until the war began, still remembered FDR as the leader who saw us through the Depression. The grief captured on the newsreels when he died was genuine.

During the 40s and 50s, the new demand-based approach seemed to work. America outproduced its enemies in World War II. After the war, instead of a ruinous inflation and reversion to widespread unemployment that had been forecast, America experienced unprecedented prosperity.

As we entered the sixties, economic experts in government were confident that they could fine-tune the economy and smooth out the business cycle. Other thought leaders looked forward to a time when we would end poverty and include all Americans in prosperity.

So What Went Wrong?

There were several incorrect assumptions and faulty expectations baked into this cake; they shall be subjects of their own articles, as detailed treatment of them is outside the scope of this post. Among them:

  • The fact that most of the developed world outside America had been reduced to pre-industrial conditions by the war was overlooked. Even Britain, which won the war, was impoverished by the necessity of beating plowshares into swords.
  • The Johnson Administration tried to implement a guns-and-butter policy, launching the Great Society programs and conducting the Vietnam War.
  • Political vote-buying accelerated, making ever-greater promises to larger numbers of people.
  • There was a general hubris among the population, having overcome the Depression and won the War, leading to an exaggerated sense of what could be accomplished.

However, a fundamental problem that underpins many of the above is the fallacy of material abundance. Demand is not the bottleneck. No matter how affluent people become, their wants will still exceed the resources available to satisfy them.

The thinkers and would-be planners had thought that the masses would content themselves with having their basic material needs met, and demand would have to be stimulated. But that is not the way it played out:

They were heading out to the suburbs — the suburbs! — to places like Islip, Long Island and the San Fernando Valley of Los Angeles — and buying houses with clapboard siding and pitched roofs and shingles and gaslight-style front-porch lamps and mailboxes set up on lengths of stiffened chain that seemed to defy gravity, and all sorts of other unbelievably cute or antiquey touches, and they loaded these homes with “drapes” such as baffled all description and wall-to-wall carpeting you could lose a shoe in, and they put barbecue pits and fish ponds with concrete cherubs urinating into them on the lawn out back, and they parked twenty-five-foot-long cars out front and Evinrude cruisers up on tow trailers in the carport just beyond the breezeway.
— Tom Wolfe, “The Me Decade and the Third Great Awakening”, Mauve Gloves & Madmen, Clutter & Vine (1976)

There was no underconsumption. There was no problem with demand. As long as the ability to pay, to provide value in exchange for value, was present, demand would take care of itself.

By the 1970s, we had the curious phenomenon of stagflation, a portmanteau of stagnation + inflation, which Keynesian orthodoxy preached was impossible. This was an early warning of troubles to come, and provoked some rethinking in economics. It barely rippled out to public policy.

In response to stagflation and the oil price shocks of the time, there was a brief notional flirtation with a return of focus to supply, captured in the school commonly known as supply-side economics. However, this school had limited objectives, arguing for lower tax rates for the good of the collective — overall increases in growth and government receipts — rather than justifying them on an individualist basis. The mere recognition that supply had an active influence on the market was radical enough as it was. There was little rethinking of the macroeconomic focus on demand.

More recently, there was the downturn of 2001-2002. At the time, it was recognized that consumers were carrying the economy. How were they doing it? They were dissaving. They were increasing their debt levels. They were withdrawing equity from their homes, through second mortgages, home equity lines of credit and cash-out refis. This had to end sometime, and ultimately it did.

The consumer couldn’t go the distance, but consumers don’t have a printing press. What about the government? The support of demand through government debt will ultimately fail as well. In FY 2012, Federal debt service was 5.5% of federal spending. Of total federal spending, only 81.8% was covered by revenue [based on data from]. Thus, revenue can’t keep up with spending and interest, the interest rolls into the balance due, and the interest compounds. Eventually, unless something gives, debt service will devour the federal budget.
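The compounding described here can be sketched numerically. A minimal simulation follows; the revenue coverage ratio echoes the FY 2012 figure above, but the 5% interest rate, the starting debt level and the thirty-year horizon are hypothetical round numbers, not forecasts:

```python
# Sketch of the debt spiral described above. The revenue coverage ratio
# echoes the FY 2012 figure in the text; the 5% interest rate and the
# starting debt level are hypothetical round numbers, not forecasts.

def debt_service_shares(spending, revenue, debt, rate, years):
    """Roll each year's deficit into the debt and track the share of
    total outlays consumed by interest."""
    shares = []
    for _ in range(years):
        interest = debt * rate
        outlays = spending + interest
        deficit = outlays - revenue
        debt += deficit          # the deficit rolls into the balance due
        shares.append(interest / outlays)
    return shares

shares = debt_service_shares(spending=100.0, revenue=82.0, debt=100.0,
                             rate=0.05, years=30)
print(f"interest share of outlays, year 1:  {shares[0]:.1%}")
print(f"interest share of outlays, year 30: {shares[-1]:.1%}")
```

Because the deficit itself includes the interest, the interest share of outlays rises every year; under these made-up parameters it climbs from under 5% to over 40% within three decades.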

The most powerful force in the universe is compound interest.
— attributed to Albert Einstein

Eating the Seed Corn

A fundamental but often-forgotten assumption behind macroeconomics is that the overwhelming majority of consumers are also the producers. They are producing the increased wealth, a share of which they take home as wages. These wages support the ability to satisfy the demand in the marketplace.

There will be a small minority of people who do not produce. Police are an example. Enforcing the law is very necessary, but it is not a wealth-creating activity. The economy has to create enough surplus wealth to pay for the enforcement of its laws and preservation of the security of its members.

An economy has to generate sufficient wealth to replace depreciated capital (such as equipment that wears out and has to be retired) and pay for governance and security functions (law enforcement, firemen, military) that do not create wealth. Any charitable transfers to people who cannot produce for themselves must be subtracted here as well. If the wealth produced by the economy after these subtractions is declining, the nation as a whole cannot maintain its standard of living. This is the current situation of the United States. We have been eating the seed corn for years.
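The arithmetic here can be illustrated with a toy model. In the sketch below (all figures hypothetical), output depends on the capital stock, so consuming more than the economy nets after overhead shrinks the capital base — which is what eating the seed corn means:

```python
# Toy model of "eating the seed corn". All numbers are hypothetical
# illustrations, not measurements of any real economy.

def output_path(capital, productivity, overhead, consumption, years):
    """Each year the economy produces output from its capital stock,
    pays for non-wealth-creating functions (overhead) and consumption,
    and reinvests the surplus -- or draws down capital if it is negative."""
    path = []
    for _ in range(years):
        output = productivity * capital
        surplus = output - overhead - consumption
        capital += surplus       # negative surplus = eating the seed corn
        path.append(output)
    return path

healthy = output_path(capital=1000, productivity=0.2, overhead=50,
                      consumption=130, years=10)   # reinvests a surplus
seed_corn = output_path(capital=1000, productivity=0.2, overhead=50,
                        consumption=160, years=10)  # consumes more than it nets
print("healthy output trend:  ", [round(y) for y in healthy])
print("seed-corn output trend:", [round(y) for y in seed_corn])
```

With the same starting capital, a modest difference in how much is consumed rather than reinvested compounds into diverging output paths within a decade.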

Macroeconomics is in such theoretically bad shape that no one really knows how much wealth is out there or how much we produce. Few seem to care. Traditionally, we have backed into it with measures such as Gross Domestic Product (GDP). However, GDP should really be called Gross Domestic Expenditure. We count up how much we spent and assume that roughly that much wealth has been created. As the debt-financed economy of 2001-2007 demonstrated, it ain’t necessarily so.
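The gap between counting expenditure and counting wealth creation can be made concrete with a hypothetical two-year comparison; the figures and the split between wage-financed and debt-financed spending are invented purely for illustration:

```python
# Expenditure-side accounting counts a dollar of debt-financed spending
# exactly like a dollar backed by current production. Figures invented.

def measured_gdp(wage_financed, debt_financed):
    """GDP as Gross Domestic Expenditure: both kinds of spending count."""
    return wage_financed + debt_financed

early = measured_gdp(wage_financed=95, debt_financed=5)
late = measured_gdp(wage_financed=90, debt_financed=15)
print(f"measured GDP: {early} -> {late}")        # spending rose...
print("production-backed spending: 95 -> 90")    # ...while wealth creation fell
```

Measured "growth" of five points masks a decline in production-backed spending, which is the sense in which the debt-financed economy of 2001-2007 flattered the statistics.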

The Material Qualifier

All the things of the physical world, including wealth, are subject to scarcity. These rules do not apply to things of the mental world, such as human energy, human attention, affection and ideas. The mental world is ruled by plenitude.

Consider human energy. Within limits, human energy becomes increasingly available as one expends more of it. This is witnessed by the saying, “If you want something done, give it to a busy person.” The more one does, the more one is capable of doing, until one reaches some boundary limit. This can be observed in everyday life.

Repent, the End Is Near

Common sense tells us that we cannot spend our way to prosperity, but macroeconomics was born in defiance of common sense.

If something cannot go on forever, it will stop.
— Herbert Stein (1916-1999)

The use of debt to prop demand up above a level that wealth creation cannot support cannot go on forever. It will stop, and it will stop in a very messy manner. Long-term decisions people have made, such as career investments, will be overturned. Plans that had seemed safe and prudent for decades will suddenly be exposed as reckless and foolhardy. There will be suffering, destroyed dreams, lost years.

The idea that we live in a time of material abundance is a delusion. It is a dangerous delusion, hazardous to anyone who imbibes it. No matter how much wealth exists in a community, people will always want more. They will not become sated on necessities; they will make luxuries into necessities. Except in times of extraordinary upheaval, such as an economic contraction, demand for wealth will always outrun supply.

An industrial economy is a very complex creature, a giant with feet of clay. It can survive some incredible stresses, but smaller forces acting “with the grain” can bring it down in a way that is very difficult to undo. This was the experience of the Depression.

Macroeconomics needs a comprehensive overhaul. There is currently no recognition of the foundation of prosperity in wealth creation. The discipline can be changed before its advice leads the economy to slam into a wall, or after, when the evidence becomes too obvious for any but the truest of believers to ignore.

Written by srojak

May 10, 2013 at 2:44 am