This week’s letter examines the proliferation of lawyers in America and how they are reducing our economic productivity. In grade school civics class, we were taught that America is a nation of laws; that no one is above the law. Since the 1960s we have become a nation of competing rights, not laws. An army of lawyers stands ready to argue the cause of any business or advocacy group with access to sufficient funds. Those who can afford the legal bills can lengthen legal proceedings against them for a decade or more. Conflicts over land use hamper infrastructure projects and housing reform.
In 2018, Steven Brill, author of Tailspin and many other books, wrote an article in Time magazine titled “How Baby Boomers Broke America.” Brill is a Yale-educated lawyer who founded Court TV several decades ago. Brill noted that the best and brightest among us, particularly those in the financial and legal professions, have become part of a protected class. They are shielded from the laws that govern the rest of us, the unprotected class. The professional class claims to have the public’s best interest at heart, but it often acts to protect itself first at the expense of the public interest and social mobility.
In 1951 there were 220,000 lawyers for 155 million people in America, according to the American Bar Association (ABA). That represented a ratio of one lawyer to 700 people. In the 1960s and 1970s, Congress passed much social and environmental legislation that left the actual rulemaking up to lawyers at federal and state agencies. During the 1970s, businesses hired many lawyers to thwart the impact of this new legislation. By 1984, the number of lawyers had tripled to 664,000 for a population of 237 million, a ratio of one lawyer to 357 Americans. In an annual address to the ABA that year, Chief Justice Warren Burger remarked on this worrisome trend, warning that society would be overrun by hordes of lawyers. By 2018, there were 1.1 million lawyers for 315 million people in America, the highest number of lawyers per capita in the world. Just five years later, there are now 1.3 million lawyers, a ratio of one lawyer for every 255 people.
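The per-capita ratios above follow directly from the ABA counts. A quick sketch of the arithmetic, using the figures as cited in this letter:

```python
# Rough check of the lawyer-per-capita ratios cited above (ABA counts).
def people_per_lawyer(lawyers: int, population: int) -> int:
    """Rounded number of people per lawyer."""
    return round(population / lawyers)

# 1951: roughly one lawyer per 700 people; 1984: roughly one per 357.
ratio_1951 = people_per_lawyer(220_000, 155_000_000)
ratio_1984 = people_per_lawyer(664_000, 237_000_000)
```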
With the advent of Johnson’s Great Society in the 1960s and the environmental legislation that soon followed, the burden of regulation grew heavy. Large companies hired lawyers to discover and develop loopholes that created a legal safe harbor from the regulatory machine. Burdened by regulation, smaller companies became less efficient, making them less competitive. Wage gains that might have gone to workers instead went to the accountants, lawyers, government fees and insurance premiums that protected business owners from the fines and liabilities of the new regulations. Larger companies, able to wield more legal power per dollar of revenue, absorbed their smaller competitors, giving larger companies greater pricing power.
In 2021, the American Bar Association listed 175 members of Congress with law degrees, a third of the 535 members of the House and Senate. By design, bargaining or incompetence, Congress writes laws in imprecise language, leaving it up to the legal staff of executive agencies and the courts to determine what Congress meant. There is a public outcry against rule by unelected bureaucrats and judges, but in an evenly divided electorate, those unelected officials protect the minority of 49 from the abuses of the majority of 51. Computer algorithms enable a slim majority in a state to gerrymander voting districts to give one party representative power that enfeebles the 49% who belong to the other party. Those who control the democratic process control the power.
The growing adoption of computer technology in the late 1980s inspired the hope that automation would reduce the need for lawyers. Instead, compliance and regulatory work has increased each year. A 2017 CNBC article speculated that Artificial Intelligence (AI) might replace lawyers. It’s doubtful that lawyers would allow that to happen. They write the rules that protect them from the rules, including the rule of competition. John Dingell, former Congressman from Michigan, once said, “If I let you write the substance and you let me write the procedure, I’ll screw you every time.” Like an infestation of grasshoppers in a field of plants, too many lawyers diminish the productive vitality of our economy.
This week’s letter is a prediction that house price growth will decline to near zero in the coming few years, based on historical trends of price growth and the 30-year mortgage rate. The pattern is similar to that in the late 1970s and mid-2000s. In each case the Fed kept its key interest rate below the annual rate of home price appreciation to achieve broad economic growth. In each case that accommodating monetary policy helped fuel a bubble that led to a severe recession when the economy corrected.
This week the National Association of Realtors (NAR) reported another drop in existing home sales, the fourth drop in the past five months. At the same time, the Commerce Department reported that new single family home sales in July were up 31% over the same month last year. At first glance, that seems excessive but this past quarter was the first positive annual gain in single family home sales since the second quarter of 2021. Existing homeowners are interest rate bound to their homes until mortgage rates come down. New homes are filling the inventory gap.
Residential investment, which includes new homes and remodeling costs, contributes only 3-5% to GDP, according to the National Association of Home Builders; the share varies from year to year. Homebuilders rely on the crystal ball predictions of the banking industry for financing. Homeowners’ remodel plans depend on the growth in home equity and the interest rates available for financing. The pandemic shifted consumer preferences toward improving existing homes: new home sales decreased but remodeling increased. In this recovery period, the opposite has occurred. Home Depot has reported two consecutive quarters of negative sales growth, the first time since the housing crisis 15 years ago.
Let’s look at two previous periods when monetary policy was a major contributing factor to a subsequent decline in home prices and a recession. In the chart below (link to FRED chart is here), the red line is the average 30-year mortgage rate. The green line is the annual change in a broad home price index. As soon as the green line gets above the red line, homebuyers are making more in price appreciation than they are paying in interest, a form of arbitrage. That signals that monetary policy is too accommodating. The dotted line in the graph is the effective federal funds rate (FRED Series FEDFUNDS). Mortgage rates follow the Fed’s lead. In the mid-2000s, home price growth, the green line in the graph, rose above the red mortgage interest line. As in the late 1970s, the Fed was watching other indicators and was slow to raise interest rates.
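The arbitrage condition, home price appreciation outrunning the mortgage rate, can be written as a simple screen. This is a minimal sketch with illustrative numbers, not actual FRED data:

```python
# Flag years in which annual home price appreciation exceeded the average
# 30-year mortgage rate -- the "arbitrage" signal described above.
def arbitrage_years(price_growth: dict, mortgage_rate: dict) -> list:
    """Years where appreciation outpaced the mortgage rate (both in %)."""
    return sorted(
        year for year, g in price_growth.items()
        if g > mortgage_rate.get(year, float("inf"))
    )

# Illustrative mid-2000s-style numbers, not real series values.
price_growth = {2003: 7.0, 2004: 9.5, 2005: 11.0, 2006: 6.0, 2007: 1.5}
mortgage_rate = {2003: 5.8, 2004: 5.8, 2005: 5.9, 2006: 6.4, 2007: 6.3}
hot_years = arbitrage_years(price_growth, mortgage_rate)  # [2003, 2004, 2005]
```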
The period between the mid-1980s and the financial crisis is called the Great Moderation. From the end of the 1982 recession until the late 1990s, the Fed kept its key interest rate (dotted line) higher than home price appreciation and lower than the 30-year mortgage rate, a moderating balance. Since 2014, home price growth has been above the 30-year mortgage rate. When this latest period of arbitrage unwinds, the effects will disturb the rest of the economy. When will that moment come?
Asset bubbles leave an economy vulnerable to shocks. In an interconnected global economy, disturbances from malinvestment can cascade through one prominent economy to test the strength of institutions and businesses in other countries. The U.S. financial crisis demonstrated that process. The foundations of companies like AIG and Goldman Sachs, thought to be financial fortresses, cracked and threatened a collapse that would bring other large companies down with them.
One of the roles of a central bank is to curb the heady expectations that fuel asset bubbles. In a 1993 paper John Taylor introduced a rule, now called the Taylor rule, to guide the Fed’s setting of interest rates. His rule was based on the actual decisions that had guided Fed policy during the decade that followed the severe 1982 recession, part of a period called the “Great Moderation.”
In their textbook on money and banking, Cecchetti & Schoenholtz (2021, 498) describe the rule succinctly: Taylor fed funds rate = Natural rate of interest + Current inflation + ½ (Inflation gap) + ½ (Output gap). I’ll leave the full equation in the notes at the end. This policy rule was meant as a guideline, so the equals sign should probably be read as an approximately-equals sign. John Taylor originally used 2% as the natural rate of interest. To simplify the calculation and clarify the relationships, the authors present a simple scenario. If the inflation rate is 2% and the target inflation rate is 2%, then there is no inflation gap. If real (i.e. inflation-adjusted) GDP growth is 2% and potential output growth is also estimated to be 2%, then there is no output gap. I’ll note the calculation in table format below:
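The scenario above can be checked in a few lines. A minimal sketch of the rule as stated, with both gaps at zero:

```python
# Taylor rule as described above: all inputs in percent.
def taylor_rate(natural_rate, inflation, target_inflation,
                growth, potential_growth, alpha=0.5, beta=0.5):
    """Guideline fed funds rate per the Taylor rule."""
    inflation_gap = inflation - target_inflation
    output_gap = growth - potential_growth
    return natural_rate + inflation + alpha * inflation_gap + beta * output_gap

# 2% natural rate, 2% inflation at a 2% target, growth at potential:
# both gaps vanish and the guideline rate is simply 2% + 2% = 4%.
rate = taylor_rate(2.0, 2.0, 2.0, 2.0, 2.0)  # 4.0
```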
The Congressional Budget Office (CBO) estimates potential GDP based on a full utilization of the economy’s resources. Here’s a screenshot of the two series since the financial crisis. Real potential GDP is the red line. Real actual GDP is the blue line. The financial crisis in 2007 – 2009 had profound and persistent effects on our economy. The graph is drawn on a log scale to show the difference in percent. As a guideline, the gap for 2012 is about 3%.
I propose using the inflation in house prices as a substitute for the inflation part of the calculation. I’ve included the equation in the end notes. Presumably, home price growth implicitly includes the neutral rate of interest so I exclude that from this alternative measure. The price of a home includes a decades-long stream of owner equivalent rent priced in current dollars. It incorporates estimates of housing consumption and long-term wealth accumulation. Home prices include evolving community characteristics and public investment like the quality of schools, parks, transportation, employment and personal safety. They are a broad market consensus. This particular series is compiled quarterly but follows the trend of the monthly Case & Shiller National Home Price Index, giving the central bank timely home price trends.
Fifty years ago, Alchian and Klein (1973) proposed that central banks include asset prices in their formulation of monetary policy. They wrote that a composite index of many types of assets would be an ideal measure but difficult to calculate. A broad stock index like the S&P 500 would capture the current price of capital stock, but stocks can overreact to interest rate changes by the central bank (pp. 180, 183). The S&P 500 index is relatively volatile, with a 10-year standard deviation of 14.88%; the 30-year figure is 15%. The home price index is stable, with a 40-year standard deviation of just 4.74%, slightly above the 4.07% deviation of the federal funds rate itself.
During the early years of the Great Moderation, this alternative policy rule approximated the interest rate policy that the Fed adopted. In the graph below, the alternative rule is in red and the actual Fed funds rate in blue. Notice the sharp divergence just before the 1990 recession. In the aftermath of the Savings and Loan crisis, the annual growth in home prices fell from 7% in 1987 to 2.5% in the fall of 1990. This was below the 4% long-term average of home price growth, signaling a call for a more accommodating monetary policy. The Fed did not recognize the economic weakness until it was too late and the economy went into a mild recession. For several years following the recession, the labor market struggled to regain its footing, and this slow recovery contributed to President George H. W. Bush’s defeat in his 1992 re-election bid.
The employment slack of the first half of the 1990s might have been lessened by a monetary easing. In the second half of that decade, the alternative rule called for a tighter monetary policy, which would have curbed the enthusiasm in the stock and housing markets. The divergence between the alternative rule and the actual Fed funds rate grew as the housing bubble developed. By the time the Fed started raising interest rates in 2004-2005, it was too late.
I will finish up this analysis with a look at the past decade. The alternative rule and the Taylor rule would have called for a higher policy rate. Persistent low rates helped fuel a growing price bubble in the housing market. The pandemic accentuated that trend. High home prices have contributed to unaffordable housing costs in popular coastal cities, sparking a surge in homelessness.
Exiting an asset bubble is painful. Expansion plans are put on hold. As investment decreases, hiring growth declines and unemployment rises among those most vulnerable in the labor force. Withholding taxes decline, reducing revenues to state and federal governments who must carry the additional burden of benefit programs that automatically stabilize household incomes.
Housing costs constitute 18% of the core price index that the Fed uses to gauge inflation but account for 40% of core price inflation. Because housing is a major component of household expenditures, home prices can act as a stable measure of inflation. Home prices capitalize the future flows of those expenses. Persistently low interest rates can distort those calculations, promoting malinvestment and an asset bubble. This alternative rule incorporates that signal into policymaking and should help the Fed make more timely course changes before the disturbances spread throughout the economy.
Keywords: Savings and Loan Crisis, Financial Crisis, Inflation, Federal Funds Rate, Taylor Rule, Home Price Index
(1) FFR = NRI + πt + α(πt – πt*) + β(γt – γt*), where πt is the annual change in the Personal Consumption Expenditures price index (FRED Series PCEPI) and πt* is the target inflation rate. NRI is the natural rate of interest, set at 2.0%. γt is the natural log of real GDP (FRED Series GDPC1) and γt* is the natural log of real potential GDP (FRED Series GDPPOT). The α and β coefficients express the degree of concern and should add up to 1. If inflation is more of a concern, then α would be higher than ½. If output is more of a concern, β would be more than ½.
(2) Alternative Taylor Rule: FFR = hpi + α(hpi – avg30(hpi)) + β(γt – γt*), where hpi is the annual percent change in the All-Transactions House Price Index (FRED Series USSTHPI) and avg30(hpi) is the 30-year average of the hpi.
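Endnote (2) translates directly into code. A minimal sketch with illustrative inputs, not actual FRED values:

```python
# Alternative rule per endnote (2): home price inflation (hpi) replaces the
# inflation terms and the natural-rate term is dropped. All inputs in percent.
def alt_taylor_rate(hpi, hpi_30yr_avg, growth, potential_growth,
                    alpha=0.5, beta=0.5):
    """Alternative policy rate per endnote (2)."""
    return hpi + alpha * (hpi - hpi_30yr_avg) + beta * (growth - potential_growth)

# Illustrative inputs: 7% home price growth against a 4% long-run average,
# with a 1-point negative output gap.
rate = alt_taylor_rate(7.0, 4.0, 1.0, 2.0)  # 7 + 0.5*3 + 0.5*(-1) = 8.0
```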
Alchian, A. A., & Klein, B. (1973). On a correct measure of inflation. Journal of Money, Credit and Banking, 5(1), 173. https://doi.org/10.2307/1991070.
Cecchetti, S. G., & Schoenholtz, K. L. (2021). Money, Banking, and Financial Markets. McGraw-Hill.
This week’s letter is about the formation of a price consensus between buyers and sellers. I’ll introduce a different perspective that might help us understand broad price changes. Visually oriented readers familiar with economics and statistics can listen to this letter and mentally picture these ideas as they walk the dog. However, I’ll present several graphs to illustrate the perspective.
Anyone who has taken an economics course has been introduced to a supply-demand diagram. Quantity is on the x-axis and price is on the y-axis. The lines may be curved or straight. The intersection of the demand and supply lines is the equilibrium price, the long-term average. A capitalist economy promotes change, and the supply-demand diagram is a visual aid for understanding how price and quantity respond to shifting conditions. Students learn how the supply and demand curves respond to changes in income, to better production technology, and to price changes in other kinds of goods. That simple diagram demonstrates responses to government policies like taxes, transfer programs, price controls like apartment rents, and agricultural price supports.
The dotted line represents demand after a period of time; time is the one component missing from this 2-dimensional graph. While it pictures the formation of an equilibrium price, it does not emphasize the broad price consensus that forms between buyers and sellers. To picture that, let’s draw a probability distribution of sales at various prices and quantities. I’ll exchange the quantity and price axes so that price is on the horizontal x-axis and quantity is on the vertical y-axis.
I’ll redraw the chart, setting the average price to $0 with a short range of prices above and below that average. The equilibrium price is just the long-term average price. The chart below highlights the narrow consensus over price between consumers (blue line) and suppliers (orange line). Rising prices induce more suppliers to enter the market. Declining prices attract more buyers. The supply and demand lines are curved, representing the number of sales taking place at each price level. The total number of units sold is 10,000.
Let’s consider a garden tool whose average price is $30. We will see some customers willing to pay $34, or $4 above that average price, but there are few of them. Likewise, there are few suppliers willing to sell at a price of $26, a price that is $4 below the average price.
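That consensus can be sketched as a normal distribution of transactions around the $30 average. The $2 standard deviation below is an assumption for illustration, chosen so that $34 buyers and $26 sellers sit in the thin tails:

```python
import math

# Expected units sold in a $1-wide price bin, assuming 10,000 total units
# distributed normally around a $30 average with an assumed $2 spread.
def sales_at_price(price, mean=30.0, sd=2.0, total_units=10_000, bin_width=1.0):
    """Approximate units transacted near a given price."""
    density = math.exp(-((price - mean) ** 2) / (2 * sd ** 2)) \
        / (sd * math.sqrt(2 * math.pi))
    return total_units * density * bin_width

# Plenty of sales at the $30 consensus, few at $4 above it.
at_consensus = sales_at_price(30.0)  # ~1995 units
at_34 = sales_at_price(34.0)         # ~270 units
```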
To show price and quantity dynamics, the normal distribution graph is not as flexible or as simple as the conventional supply-demand diagram. The normal distribution chart can be viewed as a spread of prices over time, the third dimension. Just imagine that wedge of blue is a piece of pie so many weeks or months thick. Seeing price as a probability distribution does reflect a buyer’s reality in the sense that we prefer to shop with approximate prices in mind. Monthly surveys conducted by the BLS tell us that the prices of two categories – food and energy – are volatile, making it difficult for us to anchor a price expectation. These are the prices we encounter frequently when we fill up our cars, pay our utility bills and shop at the grocery store.
The graph is similar to a 2-dimensional triangle and is missing a critical component – time. The depth of a slice of pie can represent time periods. Demand operates on a shorter time scale than supply, an idea central to the analysis of Alfred Marshall, the economist who developed the supply-demand graph we use today. It’s a thinner wedge of pie.
Imagine that the average of a weekly tank of gas is $40. The blue pie of the normal distribution in the graph is sliced into 100 vertical strips that statisticians call “bins.” Imagine that every one of those bins is a week and the center, the zero point, is an average of gas prices over two years. That is the time scale of demand. The time scale of supply is thicker, perhaps four times as long in some industries. The chart below shows the 2-year (demand) and 4-year (supply) averages of weekly gas prices for the past 25 years. Current 2-year average prices are at a historical peak. These prices are not adjusted for inflation. Adjusted for inflation, gas prices have declined in the past 40 years. After adjusting for fuel efficiency, gas prices are comparable to those in the 1950s.
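The 2-year and 4-year averages are just trailing moving averages at different window lengths. A sketch on synthetic weekly prices (a stand-in for the real weekly gas price series):

```python
# Demand and supply time scales as rolling averages: a 2-year (104-week)
# window for demand and a 4-year (208-week) window for supply.
def moving_average(series, window):
    """Trailing moving average; None until the window fills."""
    out = []
    for i in range(len(series)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(series[i + 1 - window:i + 1]) / window)
    return out

# Five years of synthetic, gently rising weekly prices.
weekly_prices = [3.00 + 0.001 * w for w in range(260)]
demand_view = moving_average(weekly_prices, 104)  # 2-year scale
supply_view = moving_average(weekly_prices, 208)  # 4-year scale
```

The longer supply window reacts more slowly to the same price path, which is the point of the thicker "slice of pie."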
The supply chain, including the banks that fund it, must look far into the future. Each one of the bins in the normal distribution chart could represent a month, not a week, forming an eight-year average. No one could have foreseen a pandemic that interrupted global production. Coming out of the pandemic, businesses responded to low interest rates and anticipated a surge in demand. In an 18-month period from the fall of 2020 to the spring of 2022, private investment increased by 22%. The inflation that erupted in the spring of 2022 was a combination of growth in short-term demand and long-term supply investment. As soon as the Fed began raising interest rates, the surge in private investment leveled off.
The normal distribution chart helps us see price as a probability distribution dispersed over time. Any chart that reminds us of pie is a useful and welcome analytical tool.
This week’s letter explores the free market and the two laws that rule our lives. Advocates for a laissez-faire market quote Adam Smith’s mention of the invisible hand in The Wealth of Nations, or WON. In a free market, individuals pursuing their own self-interest unintentionally promote the general welfare, a positive outcome of the Law of Unintended Consequences.
Smith was particularly concerned with what I will call the Law of Intended Consequences. Under the guise of acting in the public interest, individuals furthered their own self-interest at the expense of the public welfare. In Part I of WON, Smith spent several chapters documenting many examples of collusion between business owners and merchants, labor guilds and city magistrates to further the gains of a small minority at the expense of the majority. This included price supports, price fixing, protective trade restrictions and the granting of monopolies through licensing. The only solution was a system of governance that promoted a general law and order with as few laws as possible.
The free market encourages a set of problems that subtract from the general welfare. Individuals pursue the most gain with the least cost. We want to buy low and sell high. We tout the principles of equality, but more often choose to maximize our own welfare. Transportation is most affected by this trait. Railroad, truck and airline carriers would prefer to supply the shorter distance routes which generate the most profits at the least cost. Without regulation and cross-subsidy, long distance routes that connect local or regional markets are underserved. This cripples the formation of a national transportation system. Although Adam Smith died a few decades before the introduction of railroads, he compared shipping goods by water to land based transportation by horse drawn wagon (Chapter 3). The former was far more profitable and explained why the “art and industry” of cities and towns close to water improved at a faster pace than inland communities.
This free market mechanism of the invisible hand fostered densely populated cities whose crude sanitation promoted epidemics of disease. In 1800 London had a population density averaging 30,000 people per square kilometer, a density more than twice as high as present-day New York City. The rich could afford a wagon and horse for transportation and moved to the outskirts of a city to escape the filth, smoke and disease of congested cities. The poor died prematurely. That was the invisible hand at work, subject to the same Law of Unintended Consequences.
In an ideal world, public laws would strike a balance between the laws of intended and unintended consequences. However, the very making of public law invokes the Law of Intended Consequences. Elected representatives tend to serve narrow ideological or geographical constituencies that are aligned with a representative’s own welfare. That is not a condemnation of their self interest but a description of the difficulty an elected body faces when trying to pass any law that claims to serve the public welfare.
In Article I, Section 8 of the Constitution, the framers limited Congress’ lawmaking authority to specific powers and those that promoted the general welfare. To James Madison, the main architect of the Constitution, that wording was clear. It meant only those laws that supported a broad public welfare like the common defense. Richard Henry Lee, one of the anti-Federalists suspicious of centralized authority, protested that the general welfare could include “every possible object of human legislation,” as Michael Klarman (2016) quoted in his account of the making of the Constitution. Lee was worried that a strong central government could expand its power to tax for any reason that it deemed to be in the general welfare. A small class of people or a central government could argue that their welfare was the general welfare.
People in difficult circumstances clamor for a piece of the tax purse. Pharmaceutical manufacturers argue that a liberal extension of profit-protecting patent rights will promote more drug development and advance the general welfare. Advocates of trickle-down economics champion laws that promote lower taxes and fewer regulations, arguing that business owners will spread the wealth to working families. This is the collusion between private industry and lawmakers that Adam Smith documented 250 years ago. Our motivations and machinations do not evolve.
The welfare of the individual and that of the public will always come into conflict. There is an inherent weighting we attach to each person’s welfare, and each of us gives greater value to our own. In Part II, Chapter 3 of The Theory of Moral Sentiments, Smith remarked that we get more upset over the loss of the tip of a finger than we do over the loss of millions of lives if China were to be swallowed up by an earthquake. We cannot agree on society’s maximum welfare, or ophelimity, because we use different weighting coefficients to measure welfare. Lawmaking is a compromise between competing calculations of interest, both individual and public.
A laissez-faire market, like a pure white paint, is not efficient. A bit of black or umber tint mixed into a white paint base gets a wall covered in fewer coats and the tint is not noticeable. Each participant in a free market gains from cheating so some regulation is necessary as an incentive toward self-policing. We argue over how much regulation to mix into the free market base. We have different personal convictions, values and tastes, ensuring that our disagreements will persist.
This week’s letter contains some idle thoughts on the exchange of money. Imagine a visitor from outer space who observes the exchange of cash for a $5 ice cream cone at a store. The buyer starts eating the ice cream. The store clerk puts the piece of paper in a black box, the cash drawer. It is clear to the visitor that the buyer has received something useful. What is not clear is what the store clerk received in exchange. The visitor has learned that creatures throughout the universe give up something in response to a reward or a threat. If the piece of paper was a threat, the clerk did not seem alarmed when the buyer offered the piece of paper across the counter. The visitor reasons that the paper is an energy packet which the clerk will consume later. Perhaps the clerk can put the paper into water and it will become food.
The visitor from outer space has realized an essential aspect of money. It contains potential energy that can be released immediately and exchanged for a good or service. It’s like an electrical capacitor ready to deliver an energy packet. We call that exchange energy purchasing power. Immediate release means that money has no maturity period, or a maturity of zero. A one-year CD cannot be spent because it has a maturity of one year. To spend it, I need to wait until the year is over or convert the CD to cash and receive a reduced amount, a penalty for early withdrawal.
The charge on that money capacitor can increase or decrease. We use the terms deflation and inflation for the increase and decay of money’s purchasing power. Inflation measures the percent of purchasing power lost over a period of time and that percentage lost cannot be more than 100%. In other words, a $1 bill cannot lose more than $1 of purchasing power. There is no theoretical limit to deflation, the increase in money’s purchasing power.
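The decay of that charge compounds geometrically, which is why purchasing power erodes but never goes negative. A small worked example at an assumed constant inflation rate:

```python
# Purchasing power as a decaying charge: at a constant inflation rate,
# a dollar's purchasing power erodes geometrically toward, but never
# below, zero.
def purchasing_power(initial, annual_inflation, years):
    """Remaining purchasing power after compounding inflation."""
    return initial / (1 + annual_inflation) ** years

# At an assumed 5% inflation rate, a dollar keeps about 61 cents of
# purchasing power after a decade -- a loss, but never the full 100%.
left = purchasing_power(1.00, 0.05, 10)  # ~0.614
```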
We use the terms yield and discount rate for the increase and decrease of money’s nominal value at some future time. These rates of change in purchasing power reveal another aspect of money – an exchange of time. What is time? A container of probabilities, and probabilities involve risk. Instead of consuming the ice cream cone, let’s say the buyer left the ice cream cone in a freezer for a period of time. What is the chance that some random power outage occurs and the ice cream melts while the freezer is off? When the freezer is opened, will the cone of ice cream be intact and ready to enjoy, or will it be a discardable mess of ice cream protoplasm? Readers will note the analogy with Schrödinger’s thought experiment about a cat in an unopened box containing a life-threatening amount of radioactive material.
A lottery winner must decide between a series of payments and a lump sum payment that is far less than the nominal amount of winnings. The difference between now consumption and future consumption is called the discount rate, which includes a rate of risk that the winner does not live to receive all the payments. Social Security allows people to claim benefits at age 62 but the monthly payments are substantially lower than someone who waits until full retirement age. The discount rate is about 8% per year, almost the average annual return on the stock market. The risk is that a retiree dies prematurely and leaves money on the table, so to speak.
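The lottery trade-off above is a present-value calculation. A sketch with illustrative figures; the 8% discount rate echoes the Social Security example, and the payment amounts are invented:

```python
# The lump sum equals the present value of the payment stream at some
# discount rate, which bundles time preference and mortality risk.
def present_value(payment, rate, years):
    """Present value of `years` equal annual payments at `rate`."""
    return sum(payment / (1 + rate) ** t for t in range(1, years + 1))

# A $1M-per-year, 20-year stream discounted at 8% is worth far less
# than its $20M nominal total.
lump_sum = present_value(1_000_000, 0.08, 20)  # ~$9.82M
```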
There is no objective or time-invariant value in an exchange of energy. The exchange value of time varies by person and circumstance. In case of death or injury, the insurance industry calculates an average annual loss of income, or purchasing power, multiplied by an average estimate of years remaining in a person’s life. Ken Feinberg worked pro bono as the special master of the 9/11 Victim Compensation Fund set up by Congress to resolve and expedite the many lawsuits that victims and their families would bring against the airlines. Feinberg agreed that an objective measure like years lost cannot compensate for the subjective loss of a life to a victim’s family. Money cannot measure life.
The exchange of money for goods and services connects buyers and sellers to a tree of information. Transactions take two forms: those that are recorded or “witnessed” by a third party and those that are not. If a buyer uses a check or credit card to buy an ice cream cone, a bank records the exchange of money for ice cream. If the buyer uses cash, only the merchant and the buyer have a record. The merchant records a sale and the buyer gets a receipt if they ask. The merchant reports the sum of sales to a government agency for sales tax and income purposes. In the case of a food-borne disease, the USDA will investigate the many branches of that tree, searching the vendors, distributors, packagers and growers to isolate the source of contamination.
The exchange of ice cream for money is a connection to a tree of production. The manufacture and storage of the ice cream involved a network of power plants to generate electricity, machinery and labor for production, as well as natural and artificial ingredients. The buyer has the purchasing power in her pocket because of some past exchange of energy as part of a production process.
The sidewalk and street outside a store connects a buyer to a tree of organizational authority. Some public entity had the ability to gather the resources to build that infrastructure. The use of those public facilities might have cost the ice cream buyer 30 to 40 cents in sales tax. Over decades metropolitan areas have attracted people because cities offer an economic bargain of public benefit for relatively small cost.
The exchange of ice cream for money connects the buyer and seller to a network of rules and expectations of behavior. The clerk won’t sing the Star Spangled Banner when she receives the money. She won’t do a magic trick with the ice cream. The buyer will not give a dramatic reading of the serial number on the $5 bill before handing it to the clerk. Exchange requires a tacit cooperation, an agreement to follow the rules.
As a child I learned that there was electricity inside the walls. The air I walked through and breathed was full of radio waves in addition to light and sound waves. There was literally music in the air, captured and translated by a portable radio. Today our cell phones interpret the microwave radiation emitted by hundreds of nearby cell towers. Our thoughts and senses are connected to each other as we walk through the information stream. The ultimate source of every bit of that information is the effort of another human being. For the monthly cost of a cell phone bill, we have access to that effort, that infrastructure, the taxing and regulatory authority that supports that exchange. The exchange of money connects our efforts, yet talking about money usually introduces dissension. We disagree about the distribution of money, the priorities of public spending and the principles of taxation. Then we blame money instead of ourselves.
This week’s letter is a proposal for an alternative measure to guide the Fed’s monetary policy. In 1978, Congress passed the Full Employment and Balanced Growth Act, which gave the Fed a dual mandate, assigning equal importance to price stability and full employment. The Canadian central bank, by contrast, has a hierarchical mandate with price stability as the priority. As with most Congressional mandates, the legislation left it up to the agency, the Fed, to determine what price stability and full employment meant. The Fed eventually settled on a 2% inflation target. Its working notion of full employment varies between 95-97% and hinges on inflation.
For its measure of inflation, the Fed relies on the Bureau of Labor Statistics (BLS), which conducts monthly surveys of consumer expenditures. The BLS compiles a CPI based on its price surveys of hundreds of items. The Fed prefers an alternative measure based on the Consumer Expenditure Survey, but the weakness in both measures is the complexity of the methodology and the inherent inaccuracy of important data points.
According to the BLS, housing costs account for more than a third of the CPI calculation. Twenty-five percent of the CPI is based on an estimate of the imputed rental income that homeowners receive from their home. This estimate is based on a homeowner’s response to the following question: “If someone were to rent your home today, how much do you think it would rent for monthly, unfurnished and without utilities?” How many owners pay close attention to the rental prices in their area? The BLS also surveys rental prices, but tenants have six- to 12-month leases, so these rental estimates are lagging data points. The BLS tries to reconcile its survey of rents with homeowners’ estimates of rents using what it admits is a complex adjustment algorithm.
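To see why the shelter estimate matters so much, here is a minimal sketch of how component inflation rates combine into a headline figure. The weights and rates are hypothetical round numbers for illustration, not actual BLS values.

```python
# Headline inflation as a weighted sum of component inflation rates.
# Weights and rates below are illustrative, not actual BLS figures.
components = {
    #  name: (index weight, year-over-year inflation)
    "shelter":   (0.33, 0.06),   # roughly a third of the index, per the BLS
    "food":      (0.14, 0.04),
    "energy":    (0.07, 0.02),
    "all_other": (0.46, 0.03),
}

headline = sum(weight * rate for weight, rate in components.values())
shelter_share = components["shelter"][0] * components["shelter"][1] / headline

print(f"headline inflation: {headline:.3%}")
print(f"shelter's share of headline: {shelter_share:.0%}")
```

With these made-up numbers, shelter alone drives nearly half of the headline figure, so any error in the imputed-rent estimate propagates heavily into the index.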
The BLS regards the purchase of a home as an investment, not an expenditure, so it must make these convoluted estimates of housing expense. There is a simpler way. Buyers and sellers capitalize income and expense flows into the price of an asset like a house. The annual growth in home prices would be a more reliable and less complex measure of inflation. Federal agencies already publish monthly price indexes based on mortgage data, not homeowner estimates and complex methodology. An all-transactions index includes refinancings as well as purchases. Bank loan officers have a vested interest in monitoring local real estate prices, so their knowledge is an input to the calculation of a home’s value when an owner refinances.
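The capitalization logic described above can be sketched in a few lines. The rent figures and the capitalization rate here are hypothetical; the point is only that income and expense flows imply a price.

```python
# Buyers and sellers capitalize a flow of net income into an asset price:
# price = net annual income / capitalization rate.
def capitalized_value(annual_rent: float, annual_expenses: float,
                      cap_rate: float) -> float:
    """Price implied by capitalizing net income at a given cap rate."""
    return (annual_rent - annual_expenses) / cap_rate

# Hypothetical house: $24,000 gross rent, $6,000 expenses, 6% cap rate.
price = capitalized_value(24_000, 6_000, 0.06)
print(f"implied price: ${price:,.0f}")  # → implied price: $300,000
```

Because the price already embeds the market’s view of rents and expenses, tracking prices sidesteps the survey-and-reconcile machinery the BLS uses.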
The Federal Housing Finance Agency (FHFA) publishes the All-Transactions House Price Index based on the millions of mortgages that Fannie Mae and Freddie Mac underwrite. From 1990 to 2020, home prices rose by an average of 3.5% per year. A purchase-only index that excludes refinances rose almost 3.9% per year during that period. As an aside, disposable personal income rose an average of 4.6% per year during that period.
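Average annual growth figures like these come from a compound-growth calculation between two index readings. The index levels below are hypothetical, chosen so that the result matches the roughly 3.5% average the letter cites.

```python
def annualized_growth(start_level: float, end_level: float, years: int) -> float:
    """Compound annual growth rate between two index readings."""
    return (end_level / start_level) ** (1 / years) - 1

# A hypothetical index at 100 in 1990 rising to about 281 by 2020
# corresponds to ~3.5% compound annual growth over 30 years.
rate = annualized_growth(100.0, 281.0, 30)
print(f"{rate:.2%}")  # ≈ 3.50%
```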
The Fed does not need authorization from Congress to adopt an alternative measure of inflation to guide monetary policy. As its strategy for price stability, the Fed could set a benchmark of 4%-5% home price growth, near the 30-year average. If house prices are rising faster than that benchmark, monetary policy is too accommodative and the Fed should raise rates. Since the onset of the pandemic, home prices have risen 11% per year, three times the 40-year average. This same growth marked the peak of the housing boom in 2005-2006 before the financial crisis. The Fed did not begin raising interest rates until the spring of 2022. Had it used a home price index, it would have reacted sooner.
The annual growth in home prices first rose above 4% in the second quarter of 2013. The Fed kept interest rates near zero until 2016, helping to fuel booms in both the stock market and the housing market. Since 2013, house price growth has stayed above 4% annually, contributing to a surge in homelessness. Let’s look at several earlier periods when using home prices as a target would have indicated a different policy to monetary policymakers at the Fed.
In 1997, the annual growth of home prices rose above 4% and remained elevated until the beginning of 2007, when the housing boom began to unravel. By 2001, home prices had risen almost 8% over the previous four quarters, yet the Fed lowered its benchmark federal funds rate from 5.5% to just 1% by 2004. The Fed was responding to rising unemployment and a short recession following the dot-com bust. Near the end of that recession came 9/11. By lowering rates, the Fed pushed capital that had fled the stock market into the housing market, where investors took advantage of the spread between low mortgage rates and high home price growth.
In 2004, home price growth was over 8% and accelerating. Had the Fed been targeting home prices, it would have acted sooner. Instead, the Fed waited until the general price level began rising above its 2% target. Over 2004-2006, the Fed raised rates by 4 percentage points, but it was too late to tame the growing bubble in the housing market. In 2005, home prices grew by 12% but began responding to rising interest rates. By the first quarter of 2007, home price growth had declined to just 3.3%.
The Fed styles itself as an independent agency crafting a monetary policy that is less subject to political whims. However, the variance in its policy reactions indicates that the Fed is subject to the same faults as fiscal policy. When Congress is gridlocked, the Fed feels greater pressure to react, and its reactions help fuel the booms and busts in asset markets. Let’s turn to the issue of full employment.
The condition of the labor market is gauged by two surveys. The employer survey measures the change in employment but misses much self-employment. The household survey captures demographic trends in employment and measures the unemployment rate. The BLS makes a number of adjustments to reconcile the two series. The collection of large datasets and the complex adjustments needed to reconcile separate surveys naturally introduce error.
The labor market has experienced large structural changes in the past several decades. Despite that, construction employment remains about 4.5-5.5% of all employment, so it is a representative sample of the condition of the overall market. Declines in construction employment coincide with or precede a rise in the unemployment rate. In the past 70 years, construction employment has averaged 1.5% annual growth. During the historic baby boom years of the 1950s and 1960s, the growth rate averaged 2%. The Fed might set a target window of 1.5%-2.5% annual growth in construction employment. Anything below that would warrant accommodative monetary policy. Anything above that would indicate monetary tightening. In 1999, the growth rate was 7%, confirming the home price indicator and strongly suggesting that fiscal or monetary policy was promoting an unsustainable housing sector boom.
If the Fed had adopted these targets, what would its current policy be? The FHFA releases its home price data quarterly. The growth in home prices has declined in the past year but was still 8.1% in the first quarter of 2023. However, the S&P National Home Price Index tracks the FHFA index closely, and it indicates a slight decline over the past four quarters. Growth in construction employment has leveled off at 2.5%, within the Fed’s hypothetical target range. The combination of these two indicators would signal a pause in interest rate hikes. This week, the Fed continued to compound its policy mistakes and raised interest rates another quarter point.
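The two-indicator rule sketched in these paragraphs can be written down explicitly. The target bands come from the letter; the function name, the tie-breaking logic (both indicators must agree before tightening or easing), and the sample readings are my own illustrative assumptions.

```python
def policy_signal(home_price_growth: float, construction_emp_growth: float) -> str:
    """Combine the letter's two hypothetical targets into one signal.

    Home price growth benchmark: 4%-5% a year.
    Construction employment growth band: 1.5%-2.5% a year.
    Tighten or ease only when both indicators agree; otherwise pause.
    """
    hp_high = home_price_growth > 0.05
    hp_low = home_price_growth < 0.04
    ce_high = construction_emp_growth > 0.025
    ce_low = construction_emp_growth < 0.015
    if hp_high and ce_high:
        return "tighten"
    if hp_low and ce_low:
        return "ease"
    return "pause"

# Readings like those cited for early 2023: FHFA home price growth of 8.1%,
# construction employment growth leveled at 2.5%.
print(policy_signal(home_price_growth=0.081, construction_emp_growth=0.025))
# → pause
```

With home prices still elevated but construction employment inside its band, the mixed reading yields the pause the letter argues for.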
This week’s letter is about prices and two dynamic values, a use value and an exchange value. These two values can help us compare assets if not goods. I’ll review a short history of thinking on price and value. How does the passage of time affect different types of assets? Lastly, how sensitive are some assets to investor temperament?
The insights of prominent thinkers in the past can inform our perspective. Richard Cantillon (1680-1734) was a financier whose keen understanding of human exuberance enabled him to make a fortune in the stock market bubbles of the South Sea and Mississippi System. He argued that there was an intrinsic value to a commodity that was the sum of the inputs, land and labor (capital was included in land). The ratio of supply and demand as well as “humors and fancies” explained the variance between market price and intrinsic price. In a well-organized society, the market price and the intrinsic price tracked each other closely.
Writing a few decades later, Adam Smith refined the classification of prices further. A market price included the rent of the land, the worker’s wages and a capitalist’s profit. A natural price was the average of market prices, the price a customer expected to pay when going to market. Finally, there was an exchange price, a measure of purchasing power. Writers of that time distinguished between commodities, or subsistence goods, and goods of an artisanal nature, affordable only to the middle and upper classes.
In Book 1, Chapter 4, Smith distinguished the two meanings of the word value. The first was a value in use, the “utility of some particular object,” whose value is consumed. Utility depends on the person, their circumstances and preferences and cannot be measured. The second is a value in exchange, the “power of purchasing other goods.” Commodities like a pound of corn have both a use value and an exchange value but Smith made it clear that the use value of a commodity does not anchor its exchange value. He noted that many goods which have a high use value like water have a low exchange value, and those with a high exchange value like diamonds have little or no use value. Smith spent the following three chapters exploring the connection between exchange value and price.
As he compared standards of living in different ages and countries, from neighboring France to the American colonies, Smith was looking for a yardstick, a standard of measure. Economic institutions today compile extensive price and income indexes to compare prices across time and countries. Smith had limited manpower – himself. He chose a laborer’s toil as “the only standard by which we can compare the values of different commodities at all times, and at all places.” He was careful to note several caveats. It was “difficult to ascertain the proportion between two different quantities of labor” and the “real price of labor is very different upon different occasions” and in more advanced societies. Regardless of prices or the value of gold and silver in England and the American colonies, he could compare the purchasing power of laborers in each country doing similar work.
Smith’s grand thesis was that greater specialization of labor increased productivity and fostered economic progress. Within this framework, people would more frequently exchange their labor rather than consume the goods their labor produced. For Smith, labor was an “exchangeable value,” not some value inherent in a commodity. He used it to construct a measure of purchasing power. Almost a century later, Karl Marx would distort this yardstick of purchasing power into a qualitative claim that the labor input to a commodity was the intrinsic value of the commodity. Anything above that value was an exploitation of workers by capitalists, according to Marx.
Let’s extend this analysis to assets, which I will divide into two types: those that derive an exchange value from ongoing operations and those that don’t. Ongoing operations can be likened to a use value because something is consumed in that operation, a depreciation. There is an explicit or imputed flow of income whose discounted value influences the market value of stocks and bonds. Time-sensitive financial instruments like stock options act like insurance and are very much anchored by ongoing activity and the expectations formed from those operations. The market value of real estate may rely on scarcity, like a collectible, but the scarcity aspect contributes to expectations of future income that the real estate can earn. Therefore, its market value is also anchored by operations.
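The discounting of an income flow mentioned above can be made concrete with a short sketch. The cash flows and discount rate are hypothetical.

```python
def present_value(cash_flows: list[float], discount_rate: float) -> float:
    """Discount a stream of future annual cash flows back to today."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

# Hypothetical asset paying $100 a year for five years, discounted at 5%.
pv = present_value([100.0] * 5, 0.05)
print(f"${pv:.2f}")  # → $432.95
```

An asset with no such flow, a collectible, has nothing to feed into this calculation, which is why its market value floats on scarcity and sentiment instead.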
Collectibles are an asset without any ongoing operation. They derive their market value from their scarcity or uniqueness. A painting may bring pleasure in the viewing but the enjoyment of that pleasure does not consume the painting. Time, yellowing and dust may introduce a depreciation expense but time usually increases the market value of the painting. Money can be a collectible but only if it is rare. Digital currencies behave very much like collectibles but there is nothing to hang on a museum wall. For traders, the chief attraction of crypto is the possibility of future trading gains. Unlike stocks, crypto does not represent ownership in operating profits. Unlike bonds, crypto is not a purchase of someone’s debt. Unlike real estate, crypto does not generate any cash flows from its use value.
Some assets with little ongoing use value have volatile valuations because their chief use value is the hope of future trading profits to the holders of the asset. Their use and trading values can collapse suddenly as though they were a time-sensitive financial asset. Being alert to that imminent collapse helped Richard Cantillon make a fortune. Investors in such assets must remain nimble.
This week’s letter is about the impact of the internet and digital technologies on our laws. New technologies introduce new connections between people and institutions. What was once separate when a law was written becomes joined under the new technology. Advocacy groups emerge to pressure policymakers to shape new laws and regulations that incorporate a recent technology.
Over twenty years ago, the increasing volume of internet sales not subject to sales tax challenged the meaning of tax nexus. This was a retailer’s physical presence within a state that required it to collect sales tax on purchases and remit those collections to the state. The Sales Tax Institute has a nice explainer on the various types of nexus and the history of court decisions on the topic. Several characteristics of cryptocurrency have challenged policymakers and legal interpretations.
On Thursday, the stock of the Coinbase exchange shot up 25% on the hope that it will prevail in its attempt to operate outside SEC regulations, those that govern conventional market exchanges and securities brokers. A month ago the agency had charged Coinbase with operating as an unregistered securities exchange. On Thursday, a district court judge in a case involving another blockchain ruled that crypto assets were not securities in many cases. The judge cited the Howey test, a 1946 Supreme Court decision that set forth the characteristics that define a security. This past March, an article at Coin Telegraph explained the history of the Howey test. The key phrase in the Howey decision is that an investment contract is “an investment of money in a common enterprise with profits to come solely from the efforts of others.” Is crypto a “common enterprise” whose profits come “solely” from the efforts of others?
The judge’s ruling strengthens the argument by some that crypto has many characteristics of a collectible, which is not considered a security. Classifications rely on shared as well as distinguishing characteristics. Anomalies challenge classification the way that the platypus challenged biologists’ definition of a mammal. Advocates of crypto claim that it is a trustless system of exchange but let’s examine that claim.
A barter transaction between two people is the only exchange that does not involve a third party in some way. Trust is an implied intermediary in an exchange between two parties. Crypto or cash, there is some third party involved in a transaction. Our cash may read “In God We Trust,” but our trust is really in the Federal Reserve, an agency of the U.S. government. Crypto exchanges have a fiduciary duty to the owners of the coins the exchange holds. That fiduciary duty invites government regulation, and it will be a battle of advocacy groups to shape the laws that create those regulations. This week Coinbase may be confident that it will prevail against SEC regulation, but there will be an ongoing effort to impose some regulation to protect owners of crypto. As policy is shaped over the coming years, crypto owners can expect that crypto exchanges will experience similar abrupt revaluations.
This week’s letter is about the digital currency proposed in a 2008 paper penned by someone or some group writing under the pseudonym Satoshi Nakamoto. The two problems that Nakamoto’s proposal targeted have been mitigated by other means. That may explain why there has not been wider adoption of digital currencies as a transaction medium.
Although the idea of a digital currency has gained a fervent following among the public, the public was not the intended audience. As electronic payments became a greater share of global payments, chargebacks plagued both merchants and banks. By 2010 electronic payments were 8% of global payments. Within a decade, they were an estimated 16%, according to a report from McKinsey and Company, the consulting firm.
The proposal was a method to “make non-reversible payments for non-reversible services,” as Nakamoto stated in the introductory paragraph. I won’t dive into the details of the chargeback-to-transaction ratio, but it was a much bigger problem in 2008. The ratio was still 3.76% in 2017 but had fallen to 1.52% by 2021, according to Midigator. I will refer interested readers to two reports published by Midigator, a subsidiary of Equifax, the credit reporting company: the 2022 report on chargebacks and a 2020 explainer of the Rapid Dispute Resolution system used by participating banks and merchants.
The second problem the paper addressed was the minimum amount set by banks and merchants for debit and credit card transactions. In 2008, a common minimum was $5. Today, many merchants will process a transaction of less than $1. Apple routinely charges customers $0.99 per month for additional cloud data storage.
Cash payments solve a verification problem in transactions between strangers. Cash is a non-reversible exchange of money for goods or services. Transactions without cash involve some form of I.O.U. – a check or debit card, or a credit card. The customer gets the good or service. How can the merchant trust the customer’s I.O.U.? This requires a verification process of the customer’s identity and a commitment by a third party like a bank to pay the I.O.U.
The marginal cost to send one more email or one more HTTP request to a web server is nearly zero. The spread of email in the 1990s exposed millions of people to dishonest actors who could send thousands of emails at little cost. A small number of computers could send thousands of counterfeit HTTP requests to a server to overwhelm its resources in a denial-of-service (DoS) attack. A countermeasure was to require proof of honest intent by having the computer show some proof of work. An honest agent would do the proof of work to gain access to the server. Such a scheme would frustrate a dishonest agent trying to make repeated attempts to overwhelm a server’s resources.
Nakamoto proposed a currency based on a proof-of-work system rather than trust in a central agency. In a peer-to-peer network, verification is done by consensus. If each voter were defined as an IP address, a malicious actor could simply accumulate many IP addresses and then legitimize a fraudulent transaction. To combat that problem, Nakamoto proposed that a voter be defined by CPU, not IP address: one CPU, one vote. Any actor who could amass enough computing power to defeat the system would find it more profitable to use that power to honestly mint new digital coin.
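A toy sketch of proof of work illustrates the asymmetry that makes the scheme useful. This assumes nothing about Bitcoin’s actual block format; it simply searches for a nonce whose SHA-256 hash, combined with a message, starts with a set number of zero hex digits.

```python
import hashlib

def proof_of_work(message: bytes, difficulty: int) -> int:
    """Find a nonce whose hash with `message` has `difficulty` leading
    zero hex digits. The search grows ~16x harder per extra digit."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(message + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(message: bytes, nonce: int, difficulty: int) -> bool:
    """Verification takes a single hash, however long the search took."""
    digest = hashlib.sha256(message + str(nonce).encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = proof_of_work(b"hello", 4)   # expensive to find...
print(verify(b"hello", nonce, 4))    # ...cheap to check → True
```

The honest agent pays the search cost once; anyone can check the answer instantly. That cost is exactly what makes flooding a server, or rewriting a transaction history, uneconomical.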
Digital currencies have evolved beyond their original purpose. Nakamoto’s currency was designed to combat flaws in electronic payment systems that presented problems for merchants and bank intermediaries. Since 2008 the industry has developed other methods to reduce or resolve chargebacks and fraud. Meanwhile, Nakamoto’s proposal has become a favored security for some investors. Its adoption as a collectible-like investment has introduced a pricing volatility that subverts its original purpose as a non-reversible payment, a digital form of cash.
This week’s letter is about lying and some types of lies, the key part they play in our society and the steps we take to uncover and counter lying. This week’s picture by David Clode on Unsplash is that of a butterfly wing, not an owl.
Like many animals, humans survive by signaling. A growl, the curl of a tail or a frown on a face are forms of signaling. Lying is a signaling tool that we use to get something we want. We may want protection from some threat so we lie. We may want approval from others so we lie. Lies come in several colors. Lies told for some social or public purpose are called white lies. In the 1969 Frazier v. Cupp decision, the Supreme Court ruled that the police could make false statements to gain a confession. These are known as blue lies. There are red lies told to hurt someone or their reputation. We tell green lies to gain some financial advantage.
People are so good at lying that public agencies and private companies spend billions per year to prevent fraudulent claims. As Jennifer Pahlka (2023) noted in her book Recoding America, policymakers will spend far more than $1 to prevent paying out $1 on a fraudulent claim, even if it means that legitimate claimants have to wait longer to receive their benefits. Policymakers are rarely as responsive to people’s needs as they were during the pandemic. In the rush to serve millions of people laid off during the pandemic, state unemployment offices were flooded with fraudulent claims. According to an indictment filed in May, recently elected Congressman George Santos (R-NY) was one of those who made a fraudulent claim with New York State, receiving $24,000 while already employed.
Politicians like Santos attract news coverage but it is not clear that they are any more dishonest than the population in general. It is true that those who are uncomfortable with lying are cautioned not to run for office. Lying is an accepted tactic to confound the opposition, gain a policy foothold or some electoral advantage. Within a democracy, the electoral process is a competition of lies and boasts that political scientists call branding. There is no “truth in advertising” standard that politicians must adhere to when they run for office. It is the voters who must beware when they “buy” whatever ideological concoction a politician is selling. The voters are supposed to act as a giant sieve, straining out the fabrications, the incompetent and crackpots. It is not a perfect system.
The Bitcoin algorithm was designed to “crowd-source” property claims, spreading the verification process to the many nodes in a historical transaction chain. Yet the 15-year history of Bitcoin and other digital currencies has been punctuated with episodes of large-scale fraud. According to a Justice Dept. investigation, Russian hackers with unauthorized access to the server of the Bitcoin exchange Mt. Gox stole most of the bitcoin stored there over a three-year period beginning in 2011. Customers eventually learned that most of their bitcoin had disappeared. Dishonesty 1, digital security 0.
Forms of digital communication like email allowed scammers to send a lie for a fraction of a penny, far below the cost of bulk mailing. The Federal Trade Commission reported that consumers lost $8.8 billion to fraudsters in 2022. Money transfers, both legitimate and criminal, happen with the flip of some ones and zeroes. Digital currencies can be stolen at far less personal risk than holding up a physical bank so it is surprising that more crypto is not stolen each year. CNN reported an estimate that $3.8 billion worth of digital currency, most of it DeFi, not Bitcoin, was stolen in 2022. That’s just 0.45% of the $840 billion in the market cap of cryptocurrencies at the end of 2022, as tracked by Coin Gecko. The technology that underlies digital currencies could be adapted to verify other forms of transactions.
A century from now, we may put digital currencies in the same historical bucket with the worthless stock certificates of hundreds of railroads and mines issued in the 19th and early 20th century. History is littered with broken dreams destroyed by deceit. Dig down to the ideological foundations of digital currency, however, and there is an enduring idea that will outlast whatever the current form of digital currency trading and transfer. That idea is as old as the Constitution – checks and balances. Money is information and information is power. Unless that power is checked, it accrues into an autocratic regime or an economic monopoly. Digital currency represents a yearning for a check on the accumulation of economic and political power. That idea will not go out of fashion.
Pahlka, J. (2023). Recoding America: Why Government Is Failing in the Digital Age and How We Can Do Better. Metropolitan Books, Henry Holt and Company.