Saturday, July 23, 2022

When Good Advice Is Ignored: Economic Policy During the Ford Administration

David Stockman was elected to the U.S. House of Representatives in November 1976 and took office in January 1977. Prior to that, he’d worked since 1970 as a congressional staffer. Stockman was first elected to office in the same election which ended the electoral career of President Gerald R. Ford.

Ford had become president upon the resignation of President Richard Nixon. Ford had been Nixon’s vice president. Before becoming vice president, Ford had been elected, and then many times re-elected to the House of Representatives. Both Ford and Stockman had spent most of their lives in Michigan, and had represented that state in Congress.

Stockman would go on to have a multifaceted and famous career after the November 1976 election, ultimately serving as Director of the Office of Management and Budget from January 1981 to August 1985 in the administration of President Ronald Reagan.

The political dynamics which influenced monetary, fiscal, and budgetary policy during Stockman’s tenure at the Office of Management and Budget (OMB) were eye-opening to him; they ultimately ended his political career and — some observers might suggest — turned him into a bit of a bitter cynic. Stockman learned that political interests — i.e., elected officials doing what they need to do in order to get reelected — will usually trump the prudent advice of serious economists.

In the case of the Reagan administration, a simple formula — cut taxes and cut government spending — seemed to gain widespread agreement and approval, until the time came when specific and real budget cuts had to be identified. At that point, legislators worked to make sure that their pet projects — “set asides” and “pork barrel” items — were not going to be cut. With each legislator defending some expenditure, no expenditure was left to be reduced.

The predictable result of taking the first step — tax cuts — without taking the second step — spending cuts — was an increasing deficit. Stockman knew that deficits are harmful, and the plan had been to avoid them. But the political reality was that no legislator would embrace a spending cut that affected his electoral base. The tax cuts during the Reagan administration fueled economic growth and wage increases for the working class, but in the long run, the national debt increased during those same years, laying the foundation for problems in the future.

In hindsight, Stockman saw that this bitter experience in the 1980s was foreshadowed by President Ford’s similar experience in the 1970s. Stockman writes:

After assuming the presidency in August 1974, Gerald Ford had started off on the right foot. As a fiscally orthodox Midwestern Republican, he had been frightened by the recent runaway inflation and repulsed by the insanity of the Nixon freeze and the ever-changing wage and price control rules and phases which followed. Ford had also been just plain embarrassed by Nixon’s five straight years of large budget deficits.

President Nixon had hoped to fix the inflation problem by regulation: wage and price controls. Retailers could not set the prices of the goods on their shelves: the government did. Employers could not decide how much to pay their workers: the government did. The result was shortages of consumer products and the development of numerous workarounds designed to help businesses sidestep the government regulations. The inflation problem continued unabated.

President Ford, when he took office, saw the error of Nixon's policies and determined not to repeat the folly of Nixon’s economic regulation and Nixon’s deficits.

Ford’s first approach was to try to be serious about disciplined budgeting and to eliminate wasteful spending. The government’s budget for any one year should be the amount needed, and no more, to carry out the legislated tasks given by Congress to the executive branch.

Stockman recalls Ford’s attempt to instill sobriety into the Congressional budgeting process:

So for a brief moment in the fall of 1974, he launched a campaign to get back to the basics. Ford proposed to jettison the notion that the budget was an economic policy tool, and demanded that Washington return to the sober business of responsibly managing the spending and revenue accounts of the federal government.

Because he understood that a balanced budget and the avoidance of deficits were among the primary responsibilities of the government, President Ford was even willing to consider a temporary tax increase. To obtain a responsibly balanced budget, spending cuts must always be the primary method, but occasionally tax increases must be employed as well. If managed correctly, such tax increases would bring the budget under control and ultimately make room for tax cuts in the future.

The trends and fashions of politics can change from decade to decade. Ford was a conservative midwestern Republican. In the 1970s, it was thinkable for him to advocate for a tax increase. In later decades, conservative midwestern Republicans would often be known for opposing tax increases. Stockman’s narrative continues:

To this end, he called for drastic spending cuts to keep the current-year budget under $300 billion. He also requested a 5 percent surtax on the incomes of corporations and more affluent households to staunch the flow of budget red ink. At that point in history Ford’s proposed tax increase was applauded by fiscal conservatives, and there was no supply-side chorus around to denounce it. In fact, Art Laffer had just vacated his position as an underling at OMB.

One of the changes after the 1970s was that the government budget came to be viewed, in later decades, as a tool for repairing the economy. Changes in the levels of taxation and spending were seen as ways to tweak the national business climate. In the 1970s and in earlier decades, the government’s budget was understood to be simply the funding for the government to carry out whichever tasks were essential and necessary as legislated.

Stockman explains how Ford attempted to redirect the executive branch and the legislative branch toward a responsible budgeting process:

In attempting to get Washington off the fiscal stimulus drug, Ford was aided immeasurably by the fact that Schultz had vacated the Treasury Department and had been replaced by Bill Simon. The latter was from a wholly different kettle of fish.

Both Congress and the entire bureaucratic apparatus of the executive branch, however, had momentum in a direction different from President Ford’s goals. This momentum came from a decade of ever-increasing government spending, apathy about growing debt and deficits, and policies which viewed taxation as a way to steer the economy rather than a way to raise revenue to fund legitimate government tasks. That decade spanned the Johnson and Nixon administrations. Although in many ways opposed to each other, Johnson and Nixon had similar economic views.

Rather than let a deregulated business environment find its way to an optimum and natural equilibrium, Johnson and Nixon had continuously tinkered with regulations, thinking that the government knew best.

The momentum of bad economic policies for a decade proved to be the brick wall against which Ford’s common sense would crash; Treasury Secretary Bill Simon fared no better:

Bill Simon’s militant crusade within the Ford administration for the old-time fiscal religion and unfettered free markets was consequently short-lived. To be sure, his advocacy was not the run-of-the-mill Republican bombast about private enterprise. In speeches and congressional testimony, Simon offered consistent, forceful, and intelligent opposition to all forms of federal market intervention designed to stimulate the general economy or boost particular sectors like housing, agriculture, and energy.

The vertical separation of powers — between city, county, state, and federal governments — provided a clear example. The federal government has no responsibility to pay for the bad decisions of city governments. Yet when President Ford made a principled stand against using taxpayer dollars to pay for a city government’s bad spending, his reasoning was not universally accepted, as David Stockman reports:

In famously telling New York City to “drop dead” in its request for federal money, Gerald Ford betrayed a fundamental sympathy with his treasury secretary’s approach to fiscal rectitude. Yet the economic wreckage left behind by the Nixon abominations soon overwhelmed Ford’s best intentions.

The mess left by Johnson and Nixon was too big to fix without pain, and too many people were not willing to accept the pain: not Congress, not the media, not the lobbyists, not the labor unions. While many businesses were willing to ride out the bumps of a free market to get back to a healthy economy, some businesses relied on cronyism-type relationships with the government. Instead of offering good products at competitive prices, such businesses relied on governmental regulations to tilt the marketplace in their favor: they did not want a truly free market.

Under political pressure, President Ford changed his policies against his own better judgment. The new policies, accompanied by similarly designed legislation from Congress, were nearly the opposite of Ford’s original position: they moved away from a truly free market and toward a cronyism between a few businesses and the government, and they implemented tax cuts not because the cuts were calibrated to the federal government’s obligations and responsibilities, but because they were part of an economic stimulus tactic, as David Stockman describes:

As the US economy weakened in the winter of 1974-1975, the Ford administration reversed direction at the urging of businessmen like OMB director Roy Ash and big business lobbies like the Committee for Economic Development. In the place of October’s tax surcharge to close the budget gap, Ford proposed in his January 1975 budget message that Congress enact a $16 billion tax cut, including a $12 billion rebate to households designed to encourage them to spend.

Although President Ford knew better, political realities forced him to allow a type of “crony capitalism” to gain ground against “free market capitalism.” Instead of the marketplace being a level playing field for free and fair competition between businesses, a few businesses were favored over all the others, leading to higher prices, shortages, rising unemployment, falling wages, and falling standards of living for the working class.

And although he knew better, Ford was pressured into allowing tax policy to be used as a way to attempt to tweak the economy, instead of being used in a straightforward way to gather the necessary revenue.

As usual, the legislators ignored budget cuts, which can reduce the deficit and fight inflation. Cutting taxes without reducing spending leads inevitably to economic downturns and necessarily to increased deficits and debt, which in turn fuel inflation and unemployment. While inflation may cause nominal wages to rise, it causes real wages to fall, and the standard of living falls with them. David Stockman quantifies the situation:

At length, Congress upped the tax cut ante to $30 billion. It also completely ignored Ford’s plea to make compensating spending cuts of about $5 billion.

Tax cuts may seem like generosity from the Congress, but they are salutary only if spending cuts accompany them. The generosity is an illusion because, for the consumer, any benefit from the tax cuts will be negated by inflation, shortages, and falling real wages.

Political presentations about tax cuts or tax rebates can seem like generosity — but this is an illusion, because all of that money belonged to the citizens in the first place. A tax cut is not a gift, it’s merely the government taking less from the people than it did the year before. By analogy, a victim of crime would not consider it to be ‘generosity’ if a burglar decided to steal only a computer instead of a computer and a smartphone.

A tax cut is not a gift from Santa Claus. It is merely the government’s decision to confiscate less wealth from people than it might have otherwise seized:

Then in the fall of that year (1975) the Ford White House escalated the tax stimulus battle further, proposing a tax reduction double the size of its January plan. This led to even more tax-cut largesse on Capitol Hill, a Ford veto, and then a final compromise tax bill on Christmas Eve 1975. Senate finance chairman Russell Long aptly described this final resolution as “putting Santa Claus back in his sleigh.”

Just as the 1974/1975 economic trouble was beginning to self-correct, the government’s efforts to fix it kicked in. These unnecessary measures simply drove the economy back away from equilibrium and caused more trouble:

Thus, the 1969-1972 cycle repeated itself: by the time the big Ford tax cut was enacted, the inventory liquidation had run its course and a natural rebound was under way. So the 1970s second round of fiscal stimulus was destined to fuel a renewed inflationary expansion, and this time it virtually blew the lid off the budget.

A single bad action simultaneously drove inflation and increased the deficit. Free-market thought, by contrast, would have had the government stand down from any contemplated intervention and allow the organic forces of buying and selling to nudge the economy back toward equilibrium.

With the horizontal separation of powers, it is always questionable whether one should ascribe credit or blame to the President or to the Congress. In most cases, the glory or the shame is shared. President Ford could have resisted interventionist tendencies more. Congress could have done the same. Neither branch of government behaved perfectly. Ford’s instincts were probably better than Congress’s, and even if he’d done exactly the correct thing in each case, Congress still had the ability to override his veto.

Political factors having nothing to do with economics — mainly Nixon’s Watergate scandal — played into Congress’s ability to override Ford’s veto. The Watergate scandal had also brought candidates into Congress who had a significant tendency to spend — there being no direct link between Watergate and excessive spending, but such are the dynamics of electoral politics, as David Stockman depicts them:

Despite Ford’s resolute veto of some appropriations bills, his red pen was no match for the massive Democratic congressional majorities that had come in with the Watergate election of 1974. Their urge to spend would not be denied, as attested by the figures for budget outlays.

The problems which confronted President Ford between August 1974 and January 1977 can be traced back to a meeting in August 1971. President Nixon called the meeting at Camp David, and invited most of his important economic advisors. It was at this meeting that the concept of wage and price controls became a policy. The results were disastrous, and even after Nixon was gone, and his wage and price controls were no longer in effect, the ripple effects of that meeting continued to plague the economy:

Federal spending grew by 23 percent in fiscal 1975 and then by more than 10 percent in each of fiscal 1976 and 1977. All told, federal outlays reached $410 billion in the Ford administration’s outgoing budget, a figure nearly double the spending level in place six years earlier when Nixon hustled his advisors off to Camp David.

The history of Ford’s economic policies is in some ways a tragedy. President Ford was a lonely advocate for common-sense measures: reduce spending and deregulate the economy. He was defeated, not by economic theorists who had better equations and graphs, but by the political reality that Congressmen like to make themselves popular by spending government funds — funds which in actuality belong to the people.

It is no coincidence that in February 1976, across the Atlantic, Margaret Thatcher made her famous comment about governments which spend “other people’s money.”

Once expensive government programs have been put into place legislatively, it is difficult for anyone to stop them. A president cannot overturn legislation. Congress can overturn its own legislation. A court can nullify congressional legislation if such legislation violates the Constitution. But a president is powerless in such cases — by design: this is the separation of powers in action, as David Stockman explicates:

This was ironic in the extreme. Ford was a stalwart fiscal conservative who went down to defeat in 1976 in a flurry of spending bill vetoes. But the massive increase in entitlement spending enacted during the Nixon years, particularly the 1972 act which indexed Social Security for cost of living increases just as runaway inflation materialized, could not be stopped with the veto pen. In fact, the specious facade of the Nixon-Schultz full-employment budget provided cover for a historic breakdown of financial discipline.

The reader will be aware that both David Stockman and President Reagan were, as policymakers and elected officials, sometimes controversial. There are various interpretations of the narrative about Stockman’s years in the Reagan administration.

What is uncontroversial, however, is the basic narrative of economic policies and economic conditions during the Ford administration. The events of the Ford administration in some ways foreshadowed Stockman’s career in the Reagan administration.

The axiom to be learned is this: the pressures of electoral politics will often overrule serious economic thought.

Tuesday, June 14, 2022

Promoting Public Health and Economic Justice: The Single-Family Dwelling

Home ownership has often — always? — been a part of the “American Dream,” whatever the American Dream may be. But now it’s becoming clear that it’s also a vehicle for creating economic equity and for helping people stay healthy.

Since March 2020, it’s become clear that people who live in freestanding houses are not only less likely to test positive for COVID, but demonstrate better overall health by a number of metrics. The structure of a freestanding home reduces virus transmission.

Apartments, condos, townhouses, row houses, and other homes which have shared crawlspaces, attics, and walls create paths for airborne pathogens. If a neighbor’s cooking can be smelled, then particles and vapors are being communicated from one living space to another. A virus can easily be among those things transmitted.

By contrast, a neighbor might cough and sneeze in one house, but if the next house is separated by several feet of grass, trees, and breeze, the chance of SARS-CoV-2 transmission is nearly nonexistent. Freestanding single-family homes demonstrated their health benefits during the pandemic.

In addition to saving lives, however, home ownership is an important instrument in the effort to achieve an aspect of societal equity. Families who own a house are economically more resilient. Children who grow up in a single-family dwelling do better in school, are less likely to run afoul of the police, and are less likely to be obese.

Some observers have asked whether home ownership might be a substitute for affluence in general. Could it be that the benefits attributed to home ownership are actually simply the benefits of wealth?

Further analysis, however, reveals that the smallest and most humble single-family dwelling yields both the health and economic benefits which the grandest condo cannot give. A very modest house reduces virus transmission and bestows educational and social benefits, while a lavish upscale flat in an urban center does not.

Both for reasons of public health, and for reasons of social equity, zoning boards and local city councils should encourage the construction of single-family dwellings more than condos and townhouses. This would especially benefit ethnic and racial demographic groups who are traditionally underrepresented in home ownership.

If local governments encourage “affordable housing,” but that housing isn’t freestanding houses, then those demographic groups will not be able to access the full benefits of home ownership. If society can increase the percentage of people who own a single-family dwelling on its own piece of land, then that is a step forward for equity, equality, and justice.

Wednesday, June 8, 2022

LBJ’s Big Mistake: How Johnson’s Great Society Significantly Slowed Economic Progress for Black Americans

“Stagnant rates of increase in black prosperity” have troubled America for decades, writes Ben Shapiro. In the immediate wake of the 1863 Emancipation Proclamation and the end of the U.S. Civil War in 1865, it seemed that African-Americans were set to enjoy the opportunities of the economy.

In bits and pieces, Black entrepreneurship did indeed produce successes. But this growth could have flourished more, had the government not stood in the way. State and local governments which were not the result of free and fair elections took power in some places in the late 1870s.

For about a decade after the war’s end, the Reconstruction Era offered political and economic freedom to African-Americans, albeit imperfectly. Both in business leadership and in elected public offices, Blacks had high-profile roles.

In the late 1870s, that changed. Blacks lost many of the advancements they’d gained, as the Democratic Party reasserted itself. In some places, elections were neither free nor fair, and the Democrats gained control of cities, counties, and states. They enacted legislation and adopted policies designed to reduce opportunities for Blacks. After a decade of achievement, African-Americans lost ground, and “government involvement” was “to blame,” notes Shapiro.

Black economic fortunes were at a low point for several decades, until the administration of Theodore Roosevelt at the turn of the century, and the administrations of Warren Harding and Calvin Coolidge in the 1920s.

President Theodore Roosevelt began a new trend when he invited Booker T. Washington to dine at the White House in October 1901. This was a signal of new openness to African-Americans. Theodore Roosevelt’s forward movement was interrupted by the election of Woodrow Wilson in 1912.

Wilson, who assumed office in March 1913, undid much of the access which Roosevelt had created for Blacks. Wilson imposed harsh segregation in government departments which had been desegregated and integrated prior to 1913.

Happily, Warren Harding and Calvin Coolidge picked up Theodore Roosevelt’s trend and carried it further. The 1920s were a time when African-Americans again advanced, both in business and in higher education. President Coolidge became the first president to deliver a commencement address at a Black college when he spoke at Howard University in 1924.

Again, sadly, civil rights were put on hold. The direction given by Theodore Roosevelt at the beginning of the century was paused by Franklin Roosevelt in the middle of the century. While Franklin Roosevelt gave lip service to civil rights in order to obtain votes from African-American citizens, he took no action on their behalf. Instead, he insisted on segregation in the U.S. military during WW2. Eisenhower defied FDR’s orders and allowed Black and White troops to work together.

In this up-and-down narrative, the next up was Eisenhower’s presidency in the 1950s. Working together with Vice President Richard Nixon and with Martin Luther King, Jr., President Eisenhower drove two landmark pieces of legislation through Congress: the 1957 Civil Rights Act and the 1960 Civil Rights Act. Eisenhower also sent the U.S. military, in the form of the 101st Airborne Division, to Little Rock, Arkansas, where the Democratic Party’s Governor Faubus was determined to deny African-American children the right to attend school. Eisenhower made Faubus into an irrelevance and made sure that the Arkansas schools were desegregated and integrated, giving Black children major opportunities.

After the benefits of the Eisenhower years, African-Americans experienced another downturn during the 1960s. President Johnson inflicted a series of programs on Black Americans. These detrimental and even racist policies were lumped together under the heading of “The Great Society,” as Ben Shapiro notes:

In essence, the Great Society drove impoverished black people into dependency. In 1960, 22 percent of black children were born out of wedlock; today, that number is over 70 percent. The single greatest indicator of intergenerational poverty is single motherhood. As Thomas Sowell writes, “What about ghetto riots, crimes in general and murder in particular? What about low levels of labor force participation and high levels of welfare dependency? None of those things was as bad in the first 100 years after slavery as they became in the wake of the policies and notions of the 1960s.”

Presented as beneficial to African-Americans, these programs were not produced out of a sincere desire to help them or to promote justice. President Lyndon Johnson was an unrepentant racist. Behind the scenes, he routinely referred to Blacks using hateful and inappropriate epithets. He openly spoke to friends about how his programs were designed to deceive Black voters into supporting Johnson’s political party. But in public, he proclaimed himself a friend of the African-Americans.

The Great Society programs of President Lyndon Johnson, touted as a sort of reparations-lite by Johnson allies, actually harmed the black community in significant ways that continue to play out today. According to former Air Force One steward Ronald MacMillan, LBJ pushed the Great Society programs and civil rights bill out of desire to win black votes.

Johnson’s propaganda was designed not only to trick Black voters, but also to mislead the public into believing that his “Great Society” was having a meaningful impact, when in fact it was not, as historian Thomas Sowell writes:

Despite the grand myth that black economic progress began or accelerated with the passage of the Civil Rights laws and “War on Poverty” programs of the 1960s, the cold fact is that the poverty rate among blacks fell from 87 percent in 1940 to 47 percent by 1960. This was before any of those programs began. Over the next 20 years, the poverty rate among blacks fell another 18 percentage points, compared to the 40-point drop in the previous 20 years. This was the continuation of a previous economic trend, at a slower rate of progress, not the economic grand deliverance proclaimed by liberals and self-serving black “leaders.”

The 1964 Civil Rights Act was an example of President Johnson’s duplicity. While he’d opposed the 1957 Civil Rights Act and the 1960 Civil Rights Act, and had attempted to weaken both of those laws by attaching damaging amendments to them, he suddenly presented himself as a supporter of civil rights and promoted the 1964 Civil Rights Act. In reality, however, the 1964 bill was different from the two previous ones: it proposed that the government intervene in the business practices of individuals and companies, effectively limiting civil rights rather than expanding them.

Johnson’s malignant racism hid behind his rhetoric, and a segment of the voting public believed his propaganda. Yet the Black experience during the Johnson administration was one of slowed economic progress, not the grand deliverance that his programs promised.

The next energizing step forward for civil rights, after the dreary oppression imposed by Lyndon Johnson, came during the presidency of Gerald Ford. President Ford had a longstanding friendship with Judge Willis Ward. Ward was a jurist who’d played football with Ford at the University of Michigan. Together, Willis Ward and Gerald Ford exemplified how civil rights could be promoted in practical situations.

The history of civil rights in the United States is not one of continuous and uninterrupted growth. It has been a narrative of ups and downs and ups, from 1863 to the present. It is a narrative with heroes and villains. Like the stock market, despite repeated downturns, the long-term trend has been upward. Civil rights continue to thrive in the United States.

Wednesday, April 6, 2022

The Role of Revenue in Policy: Why to Tax, or Not

The role of taxation in American political thought changed significantly during the last decade or two of the twentieth century and during the first decade or two of the twenty-first century. Prior to that time, the primary role, if not the only role, of taxation was to generate revenue.

During the earlier decades of the twentieth century, there had been some effort to use taxation to guide behavior, e.g., so-called “sin taxes” like those placed on tobacco. But these examples were a small part of tax policy, and a small part of tax revenue.

The big change came when the idea gained popularity that tax cuts could be used to stimulate consumer spending. Whether implemented as simple tax cuts or as “stimulus payments,” the government hoped to energize the economy as citizens spent the extra money. The government failed, however, to correspondingly cut spending, so the increased debt and deficit partially negated any stimulating effect.

David Stockman, who was the Director of the Office of Management and Budget from January 1981 to August 1985, explains:

Until then, conservatives had generally treated taxes as an element of balancing the expenditure and revenue accounts, not as an explicit tool of economic stimulus. All three postwar Republican presidents — Eisenhower, Nixon, and Ford — had even resorted to tax increases to eliminate red ink, albeit as a matter of last resort after spending-cut options had been exhausted.

While it is true that tax cuts sometimes stimulate the economy, and can even be used to increase revenue according to the Laffer curve, most leaders prior to 1975 did not think of cutting taxes for that purpose. The stimulating effect was seen as an incidental byproduct of tax cuts, not the goal of tax cuts.
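
As a purely illustrative aside (a toy model, not drawn from Stockman’s book or from any actual tax data), the Laffer-curve idea can be sketched numerically: if the taxable base shrinks as the rate rises, total revenue first climbs and then falls, so there is some intermediate rate beyond which higher rates collect less money.

```python
# Toy sketch of the Laffer-curve idea. All numbers are hypothetical,
# chosen only to show the shape of the curve, not to model any real tax system.

def revenue(rate, base_at_zero=100.0, elasticity=2.0):
    """Stylized revenue: rate times a tax base that erodes as the rate climbs."""
    base = base_at_zero * (1.0 - rate) ** elasticity
    return rate * base

if __name__ == "__main__":
    for pct in range(0, 101, 10):
        rate = pct / 100.0
        print(f"rate {pct:3d}%  ->  revenue {revenue(rate):5.1f}")
    # With these made-up parameters, revenue peaks at a rate of about 33%;
    # past that point, raising the rate further shrinks total revenue.
```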

Monday, January 17, 2022

Income Inequality: Why It’s Not as Bad as the Media Thinks, and Why the Numbers Are Misleading

A famous phrase — often but uncertainly attributed to Mark Twain — refers to the increasing evils of “lies, damned lies, and statistics.”

No matter who said it first, it’s true that numbers are used, misused, and abused, especially in political debates. In the early twenty-first century in the United States, debates about the nature, existence, and extent of so-called “income inequality” have made generous use of statistics.

These numbers demand examination. How does one quantify income? There are numerous ways. But income is not the only way to measure economic well-being, and perhaps not the most accurate way. Some economists point out that measuring consumption, as opposed to income, is a truer measure of one’s standard of living. Edward Conard writes:

Consumption is a more relevant measure of poverty, prosperity, and inequality. University of Chicago economist Bruce Meyer and the University of Notre Dame economist James Sullivan, leading researchers in the measurement of consumption, find that consumption has grown faster than income, faster still among the poor, and that inequality is substantially less than it appears to be.

Consumption is, after all, a measure of the items which constitute a standard of living: clothing, food, housing, transportation, etc.

Income and consumption are two variables which can increase or decrease independently of each other, as Edward Conard notes:

Measures of consumption paint a more robust picture of growth than proper measures of income.

Misleading income measures assume tax returns — including pass-through tax entities — represent households. They exclude faster-growing healthcare and other nontaxed benefits. They fail to account for shrinking family sizes, where an increasing number of taxpayers file individual returns. They don’t separate retirees from workers. They ignore large demographic shifts that affect the distribution of income.

It may well be a mistake to think of “income inequality” in simplistic terms as “the gap between the highest earners and the lowest earners,” as Ben Shapiro reports. There can be low earners whose standard of living is higher than the standard of living of high earners. A simple example is retirees, whose earnings may be low, but whose standard of living is supported by a lifetime of saving and investing.

More to the point, as noted above, a low earner may receive health insurance worth thousands of dollars, and therefore have a higher standard of living than someone whose nominal income is greater.

Income gaps have reliably “widened and narrowed over time”, Ben Shapiro explains, and there is no “correlation between levels of inequality of outcome and general success of the society or individuals within it.” Income inequality at any one point in time is misleading, because it is a continuously changing variable. Income inequality between various social classes is also misleading, because mobility means that individuals are constantly moving in and out of the various classes.

It’s quite possible for income inequality to grow while those at the bottom end of the scale get richer. In fact, that’s precisely what’s been happening in America: the middle class hasn’t dissipated, it’s bifurcated, with more Americans moving into the upper middle class over the past few decades. The upper middle class grew from 12 percent of Americans in 1979 to 30 percent as of 2014. As far as median income, myths of stagnating income are greatly exaggerated.

What is now called “income inequality” should be understood as an often transient condition. The fact that one person earns more and another person earns less is evil only if those individuals are irreversibly locked into those conditions. But in fact, most American wage-earners are in a position of mobility: they can work their way up, and earn more in the future.

Well-intentioned but mistaken efforts to “eliminate income inequality” lead to the unintended consequence of freezing individuals at certain income levels and reducing chances for advancement.

Income inequality exists everywhere, and “social justice” destroys personal liberty and exacerbates inequality.

Attempts to “eliminate income inequality” actually ossify inequality. Only the fluid system of a free market creates chances for individuals to move up in terms of their incomes.

Monday, January 3, 2022

Enslaving the Free Market: The Era of the “Bailout”

In September 2008, Lehman Brothers (officially Lehman Brothers Holdings, Inc.) filed for bankruptcy. The firm came to an end as part of an event known as the “subprime mortgage crisis.” Both executive policies and Congressional legislation gave rise to this event, or, more precisely, this series of events.

The policies and legislation in question encouraged or required financial institutions to give loans and mortgages to customers who were manifestly unable and unfit to repay. This money was lent primarily for the purpose of buying houses. The inevitable and foreseeable result was a wave of defaults and foreclosures: individuals and families unable to pay their monthly amounts.

As the number of defaults and foreclosures increased, the banks and other institutions who’d lent the money, and who now were unable to get it back, were left with little or nothing and went bankrupt. The number of lenders going bankrupt grew, and the size of the institutions going bankrupt grew. The pattern culminated in the bankruptcy of Lehman Brothers.

Lehman Brothers was a major company. It had more than 26,000 employees and over $600 billion in assets.

Naturally, some observers were surprised, shocked, or worried. They assumed that the bankruptcy of a major company would cause trouble for the economy. They were wrong. They had forgotten the economic principle of “creative destruction.”

It is easy to assume that, if a large corporation goes bankrupt, this will create problems like unemployment, inventory shortages, etc.

This assumption ignores the fact that when a business fails and collapses, it creates opportunities for new businesses which are better, more effective, more efficient, bigger, more profitable, and more adapted to the marketplace. The end of an old business creates space for a new business.

Metaphors may be useful in understanding this concept: One demolishes an old building in order to construct a new, better, larger building; one cuts down some old trees in a forest in order to plant younger and healthier trees.

A bankruptcy can create short term dislocation, like temporary unemployment and a dip in the stock market. In the long run, however, it can create more jobs than it destroyed, and better-paying jobs with better chances for advancement, resulting in a net increase in prosperity. In many scenarios, workers who are laid off eventually find employment at higher wages than the job they lost. The stock market, likewise, will not only recover from a downtick, but eventually go even higher in the wake of bankruptcy.

This principle is associated with a broad variety of economists: Joseph Schumpeter, Karl Marx, Werner Sombart, etc.

Adherence to this principle would have directed that the government, in the wake of the Lehman Brothers collapse, should have refrained from any intervention, and allowed the next two companies in line to go bankrupt as well: Goldman Sachs and Morgan Stanley. Had they gone bankrupt, their workers would have found new jobs at higher wages, the stock market would have recovered from a drop and gone on to new highs, and general prosperity would have increased.

Sadly, various elected and appointed leaders in government forgot this basic principle — or never knew it to begin with.

Congress responded with legislation, primarily the Emergency Economic Stabilization Act of 2008, which created the Troubled Asset Relief Program (TARP). This legislation provided hundreds of billions of dollars to companies which were in danger of going bankrupt. In addition to Goldman Sachs and Morgan Stanley, other companies soon asked for help, including Citigroup, Chrysler, American Express, and many others. The money given to these businesses came from two sources: either it was confiscated from ordinary American citizens by means of taxes, or it was borrowed, and ordinary American citizens will be required to repay this money by means of taxes.

Instead of allowing these corporations to go bankrupt — and only a few of them would have done so; the others simply asked for the money and got it — the TARP legislation kept them alive, but allowed them to remain inefficient and irresponsible. Had they gone out of business, new and more productive companies would have arisen in their places.

The end result was higher taxes for ordinary people, and debts which will have to be repaid with even higher taxes.

This colossal misjudgment was made possible by government officials who were ignorant of basic economic principles, or who ignored them, or who forgot them.

David Stockman was a U.S. Congressman and later Director of the Office of Management and Budget. Concerning TARP, he points out that many government officials understood why it was wrong:

Certainly President Eisenhower’s treasury secretary and doughty opponent of Big Government, George Humphrey, would never have conflated the future of capitalism with the stock price of two or even two dozen Wall Street firms. Nor would President Kennedy’s treasury secretary, Douglas Dillon, have done so, even had his own family’s firm been imperiled. President Ford’s treasury secretary and fiery apostle of free market capitalism, Bill Simon, would have crushed any bailout proposal in a thunder of denunciation. Even President Reagan’s man at the Treasury Department, Don Regan, a Wall Street lifer who had built the modern Merrill Lynch, resisted the 1984 bailout of Continental Illinois until the very end.

As David Stockman shows, this was not a “Democrat” issue or a “Republican” issue. It was an issue about the basic principles of economics. A free market must be allowed to find its own way to an equilibrium.

The women and men who promoted TARP were in many cases people of good will: they legitimately wanted to help. But even well-intentioned governmental interventions in a free market economy are harmful.

The economy organically works toward an equilibrium, and at that equilibrium point lies maximal prosperity for all. The blind forces of demand and supply distribute better wages and higher standards of living to everyone in the marketplace, from the smallest to the largest.

Statist intervention in markets can only prevent the economy from achieving the best results for everyone.

Friday, November 26, 2021

The Mint and the Pandemic: Making Coins

When the pandemic struck the world in March 2020, it was clear that it would have significant economic impacts. Exactly what those impacts would be was, however, at that time, not always clear.

One of the less obvious effects was a shortage of circulating coins in the United States. Billions of coins existed, but they were either in businesses which were temporarily or permanently closed, or they were in people’s homes, and with millions of people under “lockdown,” those coins weren’t circulating.

The types of transactions which often involve coins were particularly hard-hit: people used credit cards more than cash, made more purchases online, and casual foot traffic in city centers was sparse to non-existent. Buying a newspaper, a cup of coffee, or a candy bar while walking downtown — once an ordinary part of daily life — quickly became rare.

To keep the economy alive, the U.S. Mint ramped up production to replace the coins which were frozen in idle cash registers or in homes. Writing for American Banker magazine, Jon Prior reports:

To help push more coins into circulation, the U.S. Mint last year boosted production to levels not seen since 2017.

The Mint’s two facilities in Denver and Philadelphia churned out 14.8 billion coins for circulation in 2020, up 26% from less than 12 billion the year before, according to data the agency provided to American Banker.

The effort was part of a plan between the government, coin collection companies, retailers and banks to cure a shortage in tills across the U.S. as in-person spending slowed in the early months of the pandemic and online and card transactions soared. The sudden scarcity of change was one of the unseen economic side-effects of the coronavirus pandemic, but that boost in production, combined with increased economic activity in recent months, means that coin circulation is finally returning to normal, industry officials say.

The extra production by the Mint happened at a time when maintaining normal production levels was already a challenge. The increased mintages represent a heroic effort.

In raw numbers, for example, the total number of nickels produced in 2018 was 1,256.4 million; in 2019 it was 1,094.89 million, but the pandemic in 2020 pushed the mintage to 1,623.1 million. The increase in output was shared by both the Denver mint and the Philadelphia mint. Those are the only two mints in the United States which produce circulating coins. Smaller mints in San Francisco and West Point produce coins only for investing and collecting, not for retail circulation.
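
A quick back-of-the-envelope calculation, using only the nickel figures cited above (a minimal sketch; the numbers come from the paragraph itself, not from any additional Mint data), shows how sharp the 2020 jump was:

```python
# Year-over-year change in nickel mintages, using the figures cited above (millions of coins).
nickels = {2018: 1256.4, 2019: 1094.89, 2020: 1623.1}

for year in (2019, 2020):
    prior = nickels[year - 1]
    change = (nickels[year] - prior) / prior * 100
    print(f"{year}: {nickels[year]:,.1f} million ({change:+.1f}% vs. {year - 1})")

# Output: roughly -12.9% for 2019 and +48.2% for 2020,
# i.e., production dipped before the pandemic and then jumped by almost half.
```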

Similar increases were achieved in the production of the dime and quarter.

On the other hand, coins deemed less essential to commerce — the penny, the half-dollar, and the dollar coin — saw production levels in 2020 similar to, or slightly lower than, the previous two years, as resources were directed to the more urgently needed coins.