In chapter 3, we argued that variations in unemployment in the twentieth century are largely explainable in terms of changes in “the adjusted real wage”: workers’ compensation, measured in purchasing power, relative to their productivity or output. Unemployment will tend to rise with increases in the adjusted real wage, which in turn can reflect rising money-wage rates, falling prices, or falling productivity of labor. Similarly, a fall in the adjusted real wage tends to reduce unemployment, ceteris paribus. Falling adjusted real wages can result from one or a combination of three factors: falling money wages, rising prices, or rising labor productivity. Several variants of the basic model were developed, and the one used here is taken from table 3.3, which suggests unemployment in a given year depends on changes in the components of the real wage over the past six years, plus the adjusted real wage itself lagged seven years.
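The arithmetic of the adjusted real wage can be sketched in a few lines of code (this is an illustration of the definition only, not the regression specification from table 3.3; the function name and the sample figures are ours):

```python
def adjusted_real_wage(money_wage, price_level, productivity):
    """Adjusted real wage: real compensation relative to output per worker.

    It rises (tending to push unemployment up) when money wages rise,
    prices fall, or labor productivity falls; it falls in the opposite cases.
    """
    real_wage = money_wage / price_level
    return real_wage / productivity

# Hypothetical figures: a 5 percent price deflation, with money wages and
# productivity unchanged, raises the adjusted real wage by about 5.3 percent.
base = adjusted_real_wage(money_wage=100.0, price_level=1.00, productivity=50.0)
after_deflation = adjusted_real_wage(money_wage=100.0, price_level=0.95, productivity=50.0)
print(round(after_deflation / base - 1, 3))  # → 0.053
```

The example makes the symmetry concrete: deflation or a productivity decline raises the ratio just as a money-wage increase would, which is why the chapter treats all three channels as components of one variable.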
Table 4.2 indicates the actual unemployment rate by year for the 1900–1929 period, as well as the unemployment rate predicted by our adjusted real wage model. The model does reasonably well in forecasting major shifts in the unemployment rate. In no year did the actual unemployment rate deviate from the rate predicted by our model by more than three percentage points (e.g., 7.0 percent vs. 4.0 percent). In 70 percent of the years, the deviation of the predicted unemployment rate from the actual rate was less than one percentage point. The model does a remarkably good job of predicting major increases in unemployment. The first major episode of moderately high unemployment comes after the panic of 1907. Unemployment rose from 2.8 to 8.0 percent, while the adjusted real wage model predicts a rise from about 0.5 to over 7.7 percent. Similarly, in 1914 and 1915, unemployment again rose to the 8 percent range for two years, up from the 4.3 percent level prevailing in 1913. Our model predicts a rise to a peak rate slightly over 8.8 percent in 1915, very near the actual rate.
TABLE 4.2 ACTUAL VS. PREDICTED UNEMPLOYMENT RATES FOR THE U.S. 1900–1929
The major downturn of the first three decades was the 1920–21 depression. Unemployment was abnormally low in 1919, 1.4 percent, rising to 5.2 percent in 1920 and 11.7 percent in 1921 before falling to 6.7 percent in 1922 and 2.4 percent in 1923. The wage model actually predicted negative unemployment in 1919. In the most important forecasting error for this era, the model predicts a rise in unemployment in 1920, but to only slightly over 2.3 percent. Yet the model accurately explains the huge rise in unemployment in 1921 (predicting a 12.1 percent rate), and a significant decline in unemployment in 1922 and 1923 (the model shows unemployment rates of 8.6 and 2.1 percent, respectively, for these years).
In the halcyon days from 1923 through 1929, unemployment varied modestly, ranging between 1.8 and 5.0 percent; our statistical explanation indicates values between 2.1 and 4.3 percent, not a very large discrepancy. In short, variations in unemployment during the first three decades of the century are well explained by movements in money wages, prices, and labor productivity.
Interpreting the Unemployment Experience, 1900–1919
Before 1913, price data are very crude, with the consumer price index reported only to the nearest whole number. Thus, our interpretation of the proximate determinants of the first major episode of rising unemployment in 1908 must be somewhat cautious owing to data limitations. The evidence is, however, that a combination of price deflation and a fairly sharp productivity decline led to a rise in the adjusted real wage, more than offsetting a modest (1.1 percent) decline in money wages. Real wages rose in a period of falling labor productivity, pushing the adjusted real wage up fairly sharply. This, in turn, led to a near tripling in the unemployment rate.
The unemployment rate returned fairly quickly to a range near equilibrium in 1909. The real wage-enhancing deflation ended. More importantly, labor productivity rose over 7 percent, more than the rise in money wages, leading to a decline in the adjusted real wage.
The 1907–8 surge in unemployment triggered by rising adjusted real wages was unquestionably in large part a result of the shock to prices. Our regression estimates suggest that falling prices had the direct effect of raising the unemployment rate by about two percentage points, approximately 40 percent of the observed increase. The evidence on wholesale prices suggests that the true decline in prices was in fact probably greater than our inadequate consumer price data indicate, so the role of price shocks may be even greater. The more than 8 percent drop in the stock of money from May 1907 to February 1908 resulted at first from gold outflows but after October 1907 reflected the banking panic that developed in New York, with the resultant rise in depositor fear and a shift from deposits to currency. Between May 1907 and February 1908, bank deposits declined nearly 10 percent while currency holdings rose almost 6 percent.9
The story of rising unemployment in 1908 is highly consistent with the standard neoclassical, the Austrian, and also the monetarist interpretations of the period. Austrians emphasize the discoordinating effects of price shocks, the monetarists the link between money changes and prices, and the neoclassical economists the importance of relative prices. While there are differences in perspectives on some issues among these groups, all provide useful insights into the developing disequilibrium in the labor market and its resolution.
While there may have been economists who believed that the unemployment of the era would be solved in time by price and wage adjustments, they were certainly not outspoken in expressing themselves. By contrast, a number of prominent Americans with an activist, Keynesian-style economic philosophy spoke up forcefully. In a speech at Cooper Union, William Jennings Bryan implied that the government should serve as the “employer of last resort,” guaranteeing jobs to those needing one.10 Even Theodore Roosevelt, in speeches made during the 1908 presidential campaign, seemed to lend his support to the principle of “maintaining the prosperity scale of wages in hard times.”11
Still, the prevailing atmosphere was not one of clamor for government intervention. For example, the Nation, even then a liberal periodical, showed some skepticism about expanding public-works employment: “there lies the ever-present danger in ‘making work.’ The work is badly and expensively done, and what is really ‘made’ is an addition to the ranks of the unemployed.”12
The 1914–15 downturn usually gets less attention among economic historians, but from an unemployment perspective it was a more significant episode than that arising from the panic of 1907. In terms of the components of the adjusted real wage, the 1914–15 downturn resulted almost entirely from a sharp productivity shock. Money wages moved very little in either year; the same is true of prices, which rose very slightly; thus, real wages changed very little. A massive productivity shock, however, led to a very severe decline in output per worker in 1914. Output per man hour fell by 6.5 percent in 1914, the largest productivity decline observed in the twentieth century. The cause of the decline is something of a mystery, although conventional wisdom probably would attribute it at least in part to the outbreak of war. It is not clear, however, how the war specifically affected productivity, particularly since it began only in August. The productivity decline almost certainly does not reflect changing capital stock per worker (“Smithian” productivity change), as the evidence does not show a decline in the capital-labor ratio, although good annual data on this score are not available for this period.
Was the decline the result of “Keynesian” productivity movements? Specifically, did a decline in autonomous spending lead to falling output, while