Friday, October 30, 2009
Along with the news that real GDP growth improved from -0.7 percent in the second quarter to 3.5 percent in the third quarter, the Bureau of Economic Analysis (BEA) released detailed National Income and Product Account tables yesterday, which received little comment in the press today. These tables make it very clear that the $787 billion stimulus package had virtually nothing to do with the improvement. Of the 4.2 percentage point improvement, more than half (2.36 percentage points) was due to firms cutting inventories at a less rapid pace, which has nothing to do with the stimulus. (For the details, look at BEA’s Table 2, which shows that the contribution of inventory investment increased from -1.42 to 0.94, a swing of 2.36 percentage points.)
What about the other components of GDP? In particular, what about government spending, which was supposed to be a big part of this stimulus? Government spending was a negative factor, subtracting 0.9 percentage points from the change in GDP growth.
Automobiles and parts contributed 1.15 percentage points to the quarterly improvement, but as today’s release of monthly data shows, that was an unsustainable temporary blip: up in August and down in September because of cash for clunkers. Here is how BEA put it today: “Purchases of motor vehicles and parts accounted for most of the decrease [in real consumption] in September and for most of the increase in August, reflecting the impact of the federal CARS program (popularly called “cash for clunkers”). The program, which provided a credit for customers who purchased a qualifying new, more fuel efficient auto or light truck, ended on August 24, 2009.” And the latest consumption and income data in today's release reveal no noticeable impact of the temporary tax rebates and one-time payments on consumption, as John Cogan, Volker Wieland, and I had earlier shown.
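For readers who want to check the arithmetic, here is a minimal sketch that simply differences the contributions quoted above; the only figures it uses are the ones cited in the post (the -1.42 and 0.94 inventory contributions from BEA's Table 2 and the move from -0.7 to 3.5 percent growth).

```python
# Back-of-the-envelope check of the contribution arithmetic quoted in the post.
# The figures are the ones cited above (BEA Table 2, second vs. third quarter of 2009).

growth_q2 = -0.7    # real GDP growth, second quarter (percent, annual rate)
growth_q3 = 3.5     # real GDP growth, third quarter

inventory_contrib_q2 = -1.42   # contribution of inventory investment, Q2 (percentage points)
inventory_contrib_q3 = 0.94    # contribution of inventory investment, Q3

improvement = growth_q3 - growth_q2                              # 4.2 percentage points
inventory_swing = inventory_contrib_q3 - inventory_contrib_q2    # 2.36 percentage points

print(f"Improvement in growth: {improvement:.2f} percentage points")
print(f"Swing in the inventory contribution: {inventory_swing:.2f} percentage points")
print(f"Share of the improvement due to inventories: {inventory_swing / improvement:.0%}")  # about 56%
```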
Tuesday, October 27, 2009
Ending Government Bailouts As We Know Them
Fears of potential damage from the failure of a large financial institution have created a bailout mentality in which the U.S. government has committed many billions of dollars, intervened in the operations of scores of private firms, and caused excessive risk-taking. A new policy is needed. Two proposals were considered in testimony at the House Judiciary Committee a few days ago. Michael Barr of the U.S. Treasury and David Moss of Harvard supported a proposal to create an FDIC-like resolution regime for any financial firm viewed as too big or complex to fail. Testimony by David Skeel of Penn and me criticized that approach as institutionalizing the bailout process seen during the crisis and supported alternatives in which the failing firm would go through a bankruptcy process designed specifically for financial firms. Look forward to more analysis of this important topic in the coming weeks.
Saturday, October 24, 2009
And the Answer Is…Productivity
I teach Economics 1 with an “audience response system” similar to the ones you see on TV game shows. Think of the “Lifeline” on “Who Wants to be a Millionaire?” Each student in the lecture has a little hand-held transmitter. They press the keys on the transmitter to give their opinions on issues or answers to questions. Their answers come directly into my laptop computer and are immediately projected in a bar chart on the screen, creating an opportunity for discussion.
The question on the right generated a good discussion this week. I asked students to respond A through E at the start of the lecture, which was about labor productivity and wages. Later in the lecture I presented and explained the chart below, which shows that the best answer is B. Productivity growth is highly correlated with compensation growth over time, as predicted by basic economic theory, and leaves relatively little for A, C, D, or E to explain. But before seeing the graph, many students guess another answer, and I suspect most people are surprised that there is so little to explain after you take productivity into account.
In the chart, labor productivity (output per hour of work) and compensation (wages plus fringe benefits per hour of work) pertain to the nonfarm business sector in the United States. Compensation is adjusted for inflation by dividing by the price of nonfarm business output, which corresponds with the output measure. In the past few years the consumer price index (CPI) has grown faster than the price index for nonfarm business output. So if you adjust compensation by the CPI rather than the price index for nonfarm business as in the chart, compensation per hour falls slightly below the productivity line in recent years, but the basic story over the long haul is similar.
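Here is a minimal sketch of the deflator point made above, using made-up index numbers rather than actual BLS series: real compensation per hour is nominal compensation divided by a price index, and when the CPI rises faster than the nonfarm business output deflator, CPI-deflated compensation slips a bit below the productivity line.

```python
# Illustrative sketch of the deflator choice discussed above.
# The index numbers below are made up for illustration; they are not BLS data.

nominal_comp_per_hour = [20.0, 26.0, 33.0]   # hypothetical dollars per hour over three periods
output_price_index    = [1.00, 1.20, 1.45]   # hypothetical nonfarm business output deflator
cpi                   = [1.00, 1.25, 1.55]   # hypothetical CPI, rising faster in later periods

real_comp_output_deflated = [c / p for c, p in zip(nominal_comp_per_hour, output_price_index)]
real_comp_cpi_deflated    = [c / p for c, p in zip(nominal_comp_per_hour, cpi)]

# When the CPI outpaces the output price index, CPI-deflated compensation grows more
# slowly and slips below the output-deflated series, as described in the post.
print([round(x, 2) for x in real_comp_output_deflated])   # [20.0, 21.67, 22.76]
print([round(x, 2) for x in real_comp_cpi_deflated])      # [20.0, 20.8, 21.29]
```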
Friday, October 23, 2009
Despite claims, data continue to show small impact of stimulus
Debate about the impact of the $787 billion stimulus continued this week. “Thanks largely to the Recovery Act,” Larry Summers argued, “we have walked a substantial distance back from the economic abyss and are on the path toward economic recovery.” Yet the latest data from the Department of Commerce continue to show that only an insubstantial part of this distance was due to the stimulus. The table shows the latest Department of Commerce estimates of the contributions of consumption, investment, net exports, and government spending to the improvement in GDP growth from the first to the second quarter. Growth improved by 5.7 percentage points (from -6.4 percent to -0.7 percent). Private investment was by far the major source. Government spending contributed 1.9 percentage points, but more than half of that was defense spending, which was not part of the stimulus. The table is an update of information reported in my Wall Street Journal article of last month with John Cogan and Volker Wieland. This one-page brief provides more details and also shows that direct spending from the stimulus contributed only 0.3 percentage points of the 5.7. We will learn more when the Department of Commerce releases data for the third quarter next week, but so far their data are very clear that the stimulus is having a negligible impact.
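Using only the figures quoted in this post, a quick calculation puts the shares in perspective:

```python
# Shares of the 5.7 percentage point improvement in GDP growth, using the figures quoted above.

improvement = 5.7          # percentage points (from -6.4 percent to -0.7 percent)
government_contrib = 1.9   # percentage points, all government spending
stimulus_direct = 0.3      # percentage points, direct spending from the stimulus

print(f"Government spending share of the improvement: {government_contrib / improvement:.0%}")    # 33%
print(f"Direct stimulus spending share of the improvement: {stimulus_direct / improvement:.0%}")  # 5%
```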
Friday, October 16, 2009
Speaking of Monetary Policy Rules
This was another week with a lot of commentary on the Taylor Rule, and I am grateful to Jon Hilsenrath of the Wall Street Journal for suggesting an interview with me on the subject and posting it on Wednesday. He raises many good questions.
A few days earlier Paul Krugman wrote a piece using an estimated version of the Taylor rule rather than the actual normative rule I proposed back in 1992. He was trying to make the case that the Fed should keep the interest rate at zero for two more years. As explained in this Bloomberg News op-ed piece, I disagree with using estimated policy rules this way because it causes past mistakes to be repeated.
On Tuesday David Altig used another estimated policy rule. He was writing about the causes of the financial crisis. He used an estimated policy rule to argue that the very low interest rate set by the Fed in 2002-2005 was not an inappropriate deviation from a policy that worked well in the 1980s and 1990s, contrary to what I and others have argued. But his estimated version of the Taylor Rule uses the most recent federal funds rate (no matter what it is) to determine what the current federal funds rate should be at each Fed meeting. Such an approach is circular, assuming in essence that the Fed follows itself, which makes it difficult to evaluate when policy is good and when it is not. Yesterday David Beckworth wrote an article clearly explaining the problem with such an approach.
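To see the distinction concretely, here is a small sketch contrasting the original 1993 rule with a stylized "estimated" rule that smooths toward the most recent federal funds rate; the smoothing weight and the sample inputs are assumptions for illustration only, not anyone's actual estimated equation.

```python
# Sketch contrasting the original Taylor (1993) rule with a stylized "estimated" rule
# that smooths toward the most recent federal funds rate. The smoothing weight (0.8)
# and the sample inputs are illustrative assumptions, not estimates from any study.

def taylor_rule_1993(inflation, output_gap):
    """Normative rule: funds rate = inflation + 0.5*gap + 0.5*(inflation - 2) + 2."""
    return inflation + 0.5 * output_gap + 0.5 * (inflation - 2.0) + 2.0

def smoothed_estimated_rule(inflation, output_gap, lagged_rate, rho=0.8):
    """Stylized estimated rule: a weighted average of the lagged rate and the 1993 rule."""
    return rho * lagged_rate + (1.0 - rho) * taylor_rule_1993(inflation, output_gap)

# Illustrative inputs: 2 percent inflation, a zero output gap, and a recent funds rate of 1 percent.
inflation, gap, lagged = 2.0, 0.0, 1.0
print(taylor_rule_1993(inflation, gap))                  # 4.0: what the 1993 rule prescribes
print(smoothed_estimated_rule(inflation, gap, lagged))   # 1.6: pulled toward whatever the rate already is
```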
Thursday, October 15, 2009
Golden Balls and Duopoly: Shocking or Predictable?
Want to see an amazing illustration of how the game theory model of duopoly works? Watch this video from the British TV show Golden Balls. In the classic case of duopoly, two firms (call them Sarah and Steve) produce and sell a good in a single market. Each firm has the choice of charging a high price or a low price. A simple case is shown in the payoff table at the right. If Sarah and Steve cooperate and both charge a high price, they split the higher profits. Each gets 50. But if Sarah undercuts Steve and charges a lower price, she will steal all the customers from Steve. She gets 100 and he gets 0 profits. Or Steve might undercut Sarah, and then he gets 100 and she gets 0. If they both charge a lower price, they both lose out. Of course, consumers benefit from the low price, which is why this is an important economic issue. The game of Golden Balls has the exact same structure. I showed this video in lectures today. It was a hit.
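For anyone who wants to check the logic of the payoff table, here is a minimal sketch of the two-player game described above. The 50, 100, and 0 payoffs come from the post; the 20/20 payoff in the both-low-price cell is an illustrative assumption standing in for "they both lose out."

```python
# Pricing game described in the post. Each entry is (Sarah's payoff, Steve's payoff).
# The 50, 100, and 0 figures are from the post; the 20/20 cell is an illustrative
# assumption representing "they both lose out".

payoffs = {
    ("high", "high"): (50, 50),
    ("high", "low"):  (0, 100),   # Steve undercuts Sarah
    ("low",  "high"): (100, 0),   # Sarah undercuts Steve
    ("low",  "low"):  (20, 20),
}

strategies = ["high", "low"]

def best_response_sarah(steve_price):
    return max(strategies, key=lambda s: payoffs[(s, steve_price)][0])

def best_response_steve(sarah_price):
    return max(strategies, key=lambda s: payoffs[(sarah_price, s)][1])

# A pair of prices is a Nash equilibrium if each price is a best response to the other.
equilibria = [(s1, s2) for s1 in strategies for s2 in strategies
              if best_response_sarah(s2) == s1 and best_response_steve(s1) == s2]
print(equilibria)   # [('low', 'low')]: both undercut, even though ('high', 'high') pays each more
```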
Monday, October 12, 2009
A Teachable Moment
The awarding of the Nobel Prize in economics is always a teachable moment. This year’s award is no exception. It recognizes research on “economic governance” and goes to Elinor Ostrom for her work on “the commons” and to Oliver Williamson for his work on “the boundaries of the firm.” Both Ostrom and Williamson focus on the interactions between people outside the usual market mechanisms, an important topic to teach in the first lecture or the first chapter of Principles of Economics. The Nobel Prize Committee web page provides an excellent summary of their contributions with many examples. Ronald Coase originated research on this subject and won the Nobel Prize for it back in 1991. I think it is important to note how Ostrom and Williamson build on Coase’s work in different ways. What do they teach us?
Williamson’s research teaches us to recognize when transactions will take place within a firm and when they will take place in markets. He significantly extended Coase’s insights on reducing transactions costs by delineating the advantages of such within-firm interactions when mutual dependence between people is high. The predictions of his theory are testable and have been confirmed in many empirical studies.
Ostrom's research teaches us that “market failure” due to externalities or public goods of the kinds illustrated in Garrett Hardin’s famous "tragedy of the commons" example can be resolved by genuinely engaged individuals working together, and that government intervention may therefore not be needed to solve such market failures. Indeed she finds that individual arrangements frequently achieve better results than government intervention. In this way she too builds on the work of Coase.
Sunday, October 11, 2009
Fuori Strada
To Prevent Bubbles, Don’t Create Them
In their widely cited Wall Street Journal column last week, Ian Bremmer and Nouriel Roubini argue that to prevent asset price bubbles in the future the Fed should focus on “properly calculating asset prices and the risk of asset bubbles according to the Taylor rule, an important guideline central banks use to set interest rates.” Central bankers such as Bill Dudley and Kevin Warsh of the Fed and Mark Carney of the Bank of Canada also propose that asset prices be factored into interest rate decision criteria such as the Taylor rule. Adding asset prices to the Taylor rule would be a big change because the rule does not now incorporate asset prices, and much research, including Ben Bernanke's research ten years ago, shows it shouldn’t.
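To make the proposed change concrete, here is a small sketch of what adding an asset price term to such a rule could look like; the 0.2 weight on house price inflation and the sample inputs are purely illustrative assumptions, not any particular proposal mentioned above.

```python
# Sketch of what "adding asset prices" to an interest rate rule could mean. The 0.2
# weight on house price inflation and the sample inputs are illustrative assumptions.

def taylor_rule(inflation, output_gap):
    """Standard rule without asset prices: inflation + 0.5*gap + 0.5*(inflation - 2) + 2."""
    return inflation + 0.5 * output_gap + 0.5 * (inflation - 2.0) + 2.0

def asset_augmented_rule(inflation, output_gap, house_price_inflation, weight=0.2):
    """Variant that also reacts to house price inflation in excess of overall inflation."""
    return taylor_rule(inflation, output_gap) + weight * (house_price_inflation - inflation)

# Illustrative inputs: 2 percent inflation, a zero output gap, house prices rising 10 percent a year.
print(taylor_rule(2.0, 0.0))                 # 4.0 percent
print(asset_augmented_rule(2.0, 0.0, 10.0))  # 5.6 percent: the augmented rule calls for a higher rate
```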
The rationale for the proposed change is that the sharp run-up in housing prices, which led to the financial crisis, was caused by interest rates being too low for too long. If central banks had taken account of housing price inflation they would have raised interest rates earlier—so the story goes. They would have stopped the bubble before it got so big, or burst it at a time when the burst would not have caused so much damage.
I agree that the Fed held interest rates too low for too long, and I provided evidence of this at the summer 2007 Jackson Hole conference. But the problem was not that the Fed ignored the housing boom. The problem was that it caused it. Look at the nearby chart from The Economist. It shows the Taylor rule without any asset prices and the actual interest rate. Clearly interest rates were too low. By deviating from the rule and keeping interest rates too low, the Fed caused the acceleration in housing prices. If the Fed had simply conducted monetary policy as it had in the 1980s and 1990s, we would likely not have had the housing boom.
Even putting aside the problems of identifying asset bubbles, pointed out by Donald Luskin, or the danger of creating collateral damage by doing so, adding asset prices to the equation would not address the real problem. Saying that adding asset prices to the central bank’s rule would prevent bubbles is like saying that requiring hikers in the forest to carry cell phones to call the fire department will prevent the damage from forest fires they start. By the time they call and the fire trucks arrive, the heat and flames will have caused tremendous damage. Far better to prevent hikers from starting fires in the first place. Far better for central bankers not to create bubbles in the first place.
Friday, October 9, 2009
Taking Stock at the Fed
Yesterday and today, three economists (Andrew Levin, Chris Erceg and Mike Kiley) who work on the staff of the Federal Reserve Board graciously hosted a big gathering of monetary economists from around the world. The get-together was held on the top floor of the Fed’s office building in Washington, DC, overlooking the Mall. Its purpose was to take stock of key research developments in monetary theory and policy over the past few years. For example, John Williams of the San Francisco Fed and I flew in from California and reviewed recent research on monetary policy rules. All the papers will eventually be published in a new Handbook of Monetary Economics.
The meeting demonstrated how completely wrong Paul Krugman is about recent developments in economics, at least as he portrayed the subject in the New York Times Magazine last month. This was not an all-efficient-markets meeting. The talk from start to finish was about market imperfections: price rigidities, deadweight losses due to market power, and imperfect information, all of which arise in monetary economics. If anything, there was too much focus on market distortions. Overall I saw tremendous progress documented at the meeting. The presentation by my Stanford colleague Pete Klenow and his coauthor Ben Malin, for example, reviewed the impressive volume of empirical research on firm price-setting decisions using new BLS data sets. Their discussant Marty Eichenbaum pointed to even more of this kind of research, which solidifies and bolsters the type of monetary theory that has been developed in recent years.
But if there has been so much progress in monetary economics, then why did we have the financial crisis? I argued that it was the policy, not the economics, which got off track. When the policy implications of the research were followed by policy makers, we had good economic performance, as in the period called the Great Moderation. When policy got off track, the Great Moderation ended in the financial crisis and Great Recession. I am hoping that policy will get on track again and we will have Great Moderation II.
Tuesday, October 6, 2009
The Price System in Action
Want to understand what really goes on behind the scenes of the supply and demand model? Read this wonderfully clear essay by Russell Roberts. It explains how prices provide information, coordinate, and motivate decisions, with many more details than in the summaries of Adam Smith and Milton Friedman in my October 3 post.
Saturday, October 3, 2009
Two Masters Speak on Power of Markets
The "free market" lectures--we had them last week at Stanford--are my favorites in Economics One. I wear this Adam Smith tie, give a short summary of Smith's writings, read his story of the woolen coat from the pages of his Wealth of the Nations, and then we all watch Milton Friedman's two-minute pencil lecture on YouTube, where you can see that he also wore an Adam Smith tie. We then dive into the technical explanation of how with competitive markets the price system leads to an efficient allocation of resourses and production. I only wish Smith were on YouTube. Next week we consider monopoly, which is not a story of efficiency.
Friday, October 2, 2009
A Beautiful Model, A Clear Prediction
The supply and demand model, which students learn in the first week of Economics One, is a beautiful, powerful tool for investigating real world issues like the minimum wage, the subject of tomorrow's Wall Street Journal editorial. The model's prediction is crystal clear, as this little diagram from my lectures shows: a minimum wage causes unemployment, especially for young unskilled people, just as the Journal argues. Of course the size of the impact depends on the steepness or elasticity of labor demand. So empirical research by economists like David Neumark and Bill Wascher cited in the Journal is essential. Their research is described in this box from my Principles of Economics book. The research shows the impact to be quite significant. But as is so often the case in economics, not all economists agree, so the box also describes some contrary findings by David Card and Alan Krueger. I side with Neumark and Wascher in this debate, but you can read the editorial, look at the diagram, read the box, and draw your own conclusions.
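Here is a minimal sketch of the diagram's logic with made-up linear supply and demand curves (all numbers are illustrative): a wage floor above the market-clearing wage reduces employment and creates a surplus of job seekers, and a flatter (more elastic) demand curve makes the job loss larger.

```python
# Illustrative linear labor market; all parameter values are made up for this sketch.
# Demand:  L_d = a - b * w   (employers hire fewer workers at higher wages)
# Supply:  L_s = c + d * w   (more people seek work at higher wages)

def labor_market(a=100.0, b=4.0, c=10.0, d=5.0, wage_floor=None):
    market_wage = (a - c) / (b + d)                      # wage where demand equals supply
    wage = market_wage if wage_floor is None else max(market_wage, wage_floor)
    employed = a - b * wage                              # employment is set by demand at a binding floor
    seeking = c + d * wage
    return wage, employed, seeking

w, employed, seeking = labor_market()
print(f"No floor: wage {w:.2f}, employment {employed:.0f}")                                    # wage 10.00, employment 60

w, employed, seeking = labor_market(wage_floor=12.0)
print(f"Floor of 12: employment {employed:.0f}, unemployed surplus {seeking - employed:.0f}")  # 52 employed, surplus 18

# With a flatter (more elastic) demand curve (b = 8, a = 140 so the no-floor equilibrium is unchanged),
# the same wage floor causes a larger drop in employment.
w, employed, seeking = labor_market(a=140.0, b=8.0, wage_floor=12.0)
print(f"More elastic demand: employment {employed:.0f}, unemployed surplus {seeking - employed:.0f}")  # 44 employed, surplus 26
```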