Relevant and even prescient commentary on news, politics and the economy.

Obamacare hasn’t killed full-time jobs, either

When we last looked at Obamacare as an alleged “job-killer,” Matt Yglesias had just pointed out that 2014, the first full year of insurance on the exchanges, was also the best year for job creation since 1999. But recently a non-blogging friend reminded me of a related anti-Obamacare meme: the idea that employers have been cutting workers’ hours below 32 per week so they would not have to provide them with health insurance. His argument was, logically enough, that this would mean a loss of full-time jobs.

As with so many other anecdotal Obamacare horror stories, this one does not stand up to even simple inspection. Just like total job creation, full-time job creation (the BLS threshold is 35 hours/week, not 32, by the way) has increased rapidly since December 2013, just before exchange insurance went into effect. Not only that, part-time employment has fallen slightly. The Bureau of Labor Statistics’ monthly “Employment Situation” (Table A-9 in both cases) tells the tale.

Date            Full or Part Time   Not Seasonally Adjusted Jobs   Seasonally Adjusted Jobs

December 2013   Full-time           116,661,000                    117,278,000
July 2015       Full-time           123,142,000                    121,589,000
Change                              +6,481,000                     +4,311,000

December 2013   Part-time           27,762,000                     27,372,000
July 2015       Part-time           26,850,000                     27,265,000
Change                              –912,000                       –107,000

I included both seasonally adjusted and not seasonally adjusted data for completeness’ sake, but when we are comparing a summer month to a winter month, surely the seasonally adjusted figures are the correct ones to use. For those of you keeping score at home, then, full-time jobs have increased by 4.3 million since Obamacare exchange insurance went into effect, whereas part-time jobs have fallen by 107,000. Neither of these fits the anecdotes of workers being shunted from full-time to part-time work to avoid providing insurance. This increase in full-time work has come in the span of just 19 months, an average of over 226,000 new full-time jobs per month.
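
As a quick check of the arithmetic, here is a minimal sketch using the seasonally adjusted figures from the table above (the numbers are hard-coded, not pulled live from BLS):

```python
# Seasonally adjusted employment from BLS Table A-9, in thousands (figures from the table above).
full_time = {"Dec 2013": 117_278, "Jul 2015": 121_589}
part_time = {"Dec 2013": 27_372, "Jul 2015": 27_265}

full_time_change = full_time["Jul 2015"] - full_time["Dec 2013"]  # +4,311 (thousand)
part_time_change = part_time["Jul 2015"] - part_time["Dec 2013"]  # -107 (thousand)

months = 19  # December 2013 through July 2015
print(f"Full-time: {full_time_change:+,} thousand ({full_time_change / months:,.0f} thousand/month)")
print(f"Part-time: {part_time_change:+,} thousand")
```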

Of course, it’s theoretically possible that sophisticated statistical controls might uncover a hidden negative relationship: that we’d have even more full-time jobs than we do if the exchanges hadn’t gone into effect. Even if that were true, it’s obvious that everything else going on in the Obama economy is having a much bigger effect on full-time employment, so there’s no justification for using the epithet “job-killing” on the off chance that it’s true.

Cross-posted from Middle Class Political Economist.

Comments (6)

Is Effective Demand showing the limit of the Business Cycle… again?

Effective Demand is basically a demand limit upon the business cycle. Wouldn’t it be great if it could be determined? Then we would know where the limit of a business cycle is. Well, maybe we can determine effective demand.
A simple equation for the Effective Demand Limit relates labor share to the utilization of labor and capital. Labor share represents the Effective Demand limit.

EDL = non-farm business labor share * 0.762 - capacity utilization * (1 - unemployment rate)

EDL will want to stay above zero, such that labor share * 0.762 (the first term) stays above the utilization of labor and capital (the second term).
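
As a minimal sketch of the calculation (the input values below are illustrative assumptions, not figures from the actual data series):

```python
def effective_demand_limit(labor_share, capacity_utilization, unemployment_rate):
    """EDL = labor share * 0.762 - capacity utilization * (1 - unemployment rate).

    All inputs are decimal fractions, e.g. a labor share index reading of 97 becomes 0.97.
    """
    return labor_share * 0.762 - capacity_utilization * (1.0 - unemployment_rate)

# Purely illustrative inputs: labor share index 97, capacity utilization 78%, unemployment 5.3%.
print(round(effective_demand_limit(0.97, 0.78, 0.053), 4))  # ~0.0005, i.e. close to the limit
```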

Here is the graph of the data. (link)

Recessions are shaded in gray. The plot falls to near zero (the x-axis) just before each recession. EDL has again hit the same level that it hit twice before the 2008 recession.

I do not see a recession yet. Some are saying that we are close. Anyway, if a recession were to form again while the plot stays above zero, the equation would show an unusual consistency in predicting the limit of business cycles.

Comments (4)

The Confederate Ideology: "At this cost the system is maintained."

by  Sandwichman  

(Image caption: Cornell students leaving Willard Straight Hall)
“We presume that the citizens of Virginia are much like the ‘rest of mankind,’ and under ordinary circumstances have as much nerve as falls to the lot of common humanity. But they have long lived under the shadow of a great terror. Each slaveholder keeps a grim skeleton in his social closet, which may start into life at any moment. The ‘demon of hate’ which his life of wrong and outrage has invoked, haunts him night and day. He listens for the roar of the slumbering fires of the volcano upon whose sides he sleeps, and every sound that hurtles through the air, every footfall behind him, makes him fancy that the avenger is on his track.” — Frederick Douglass, “The Reign of Terror in the South”

The sub-sub-title to John Ellis Cairnes’s eloquent The Slave Power described the 1862 book as “an attempt to explain the real issues involved in the American contest.” This blog post is an attempt to explain the real issues involved in the (too) long-enduring contest over “political correctness.” It comes to the conclusion that it is pretty much the same real issue as Cairnes identified. The spectre of political correctness emanates from the “grim skeleton in [America's... capitalism's] social closet, which may start into life at any moment.”

Undoubtedly, the “political-correctness police” exact a tremendous toll on the psyches of White Americans and have been doing so for several decades. To put all that torment in perspective, one is advised to read Alexander Cockburn from 1992, “Bush & P.C. — A conspiracy so immense…”; Lewis Lapham from 2004, “Tentacles of Rage: The Republican Propaganda Mill, a brief history”; and Martin Jay from 2010, “Dialectic of Counter-Enlightenment: The Frankfurt School as scapegoat of the lunatic fringe.”

Comments (19)

Global Volatility, Domestic Markets

by Joseph Joyce

Unlike the global financial crisis of 2008-09, the current disruption in the financial markets of emerging market nations was anticipated. The “taper tantrum” of 2013 revealed the precarious position of many of these nations, particularly those dependent on commodity exports. The combination of a slowdown in Chinese growth, collapsing stock prices and a change in the Chinese central bank’s exchange rate policy indicated that the world’s second-largest economy has its own set of problems. But global volatility itself can roil financial markets, and good fundamentals may be of little help for a government trying to shelter its economy from the instability in world markets.

The importance of global (or “push”) factors for capital flows to emerging markets was studied by Eugenio Cerutti, Stijn Claessens and Damien Puy of the IMF. They looked at capital flows to 34 emerging markets during the period of 2001-2013, and found that global factors such as the VIX, a measure of anticipated volatility in the U.S. stock market, accounted for much of the variation in flows. Not all forms of capital were equally affected: bank-related and portfolio flows (bonds and equity) were strongly influenced by the global factors, but foreign direct investment was not.

Cerutti, Claessens and Puy also investigated whether the emerging markets could insulate themselves from the global environment with good domestic macro fundamentals. They reported that the sensitivity of emerging markets to the external factors depended in large part upon the identity of a country’s investors. The presence of global investors, such as international mutual funds in the case of portfolio flows and global banks in the case of bank finance, drove up the response to the global environment. The authors concluded: “…there is no robust evidence that “good” macroeconomic (e.g., public debt, growth) or institutional fundamentals (e.g., Investment Climate and Rule of Law) have a role in explaining EM different sensitivities to global push factors.”

Comments (0)

Marking Beliefs to Market, Stan Fischer edition

Brad DeLong Friday morning:

I cannot help but note strong divergence between the near-consensus views of Fed Chair Janet [Yellen]‘s and Fed Vice-Chair Stan [Fischer]‘s still-academic colleagues and students that tightening now is grossly premature, financial markets’ agreement with the hippies as evidenced by the ten-year breakeven, commercial-banker and wingnut demands for immediate tightening, the extraordinarily awful performance since 2007 of not all but the average regional Fed president as revealed in the transcripts, and the Federal Reserve’s strong predisposition to an interest-rate liftoff soon.

Prolix but accurate, and with the strong implication that Yellen and Fischer Know Better, but are constrained by their cohort.

Stan Fischer, Saturday morning (via Mark Thoma, whose presentation is more accurate and informative):

[B]ecause monetary policy influences real activity with a substantial lag, we should not wait until inflation is back to 2 percent to begin tightening.

As I said before—to the apparent dissatisfaction of those who want to be polite losers or believe that “well, they’re saying the right things now” is redemptive and not damning—who of the Sensible Technocrats is worth the trouble of paying attention to when they have a chance to do something in government and make a point of forgetting everything they have learned?

Comments (3)

How Many Equations Should There Be in Macroeconomic Models?

Recently a very old debate among macroeconomists has been reopened (this happens from time to time). Paul Romer decided to discuss a key conference held in 1978 (yes, really). Some (including me) think that is about when the profession took a wrong turn, largely following Robert Lucas. But in the discussion up until about yesterday, it was agreed that macroeconomics was in a bad way in 1978 and needed to change. Romer particularly criticized a paper presented by Ray Fair at the conference.

This has provoked Ray Fair* to start blogging. I think it is quite important to read his post (so does Mark Thoma). Fair is very unusual, because he works at a university (some small place called Yale) yet he stuck with the approach started by Jan Tinbergen and especially by Jacob Marschak and colleagues at the Cowles Commission (then at U Chicago) which was criticized by Lucas. I will follow Fair by calling it the CC (for Cowles Commission) approach. Notably, the approach was never abandoned by working macroeconomists, including those at the Fed and those who sell forecasts to clients who care about forecast accuracy, not microfoundations.

Insert: This post is long. The punchline is that I think a promising approach would be to combine CC models with a pseudo prior that a good model is not too far from a standard DSGE model. This is the sort of thing done with high-dimensional VARs using the so-called Minnesota prior.
End insert.

There are (at least) two key differences between the CC approach and models developed later. First, the old CC models did not assume rational expectations. This has been the focus of the discussion, especially as viewed by outsiders. But another difference is that the old models included many more variables and, therefore, many more equations than the newer ones. The model presented in 1978 had 97 equations. This post is about the second difference — I don’t believe it makes sense to assume rational expectations, but I won’t discuss that issue at all.

With his usual extreme courtesy, Simon Wren-Lewis noted advantages of the old approach and, as always, argued that both old and newer approaches are valuable and should be explored in parallel.

I have to admit that I don’t intend to ever work with a model with 97 separate equations (meaning 97 dependent variables). But I think that one fatal defect of current academic macroeconomics is that it has been decided to keep the number of equations down to roughly 7 (New Keynesian) or fewer (RBC).

I will start by discussing the costs of such parsimony.

1) One feature of pre-2008 DSGE models which, it is agreed, just won’t do is that they assumed there was only one interest rate. In fact there are thousands. The difference between the return on Treasury bills and on junk corporate bonds was one of the details that was ignored. The profession’s response to 2008 has been to focus on risk premia and how they change (without necessarily insisting on an explanation which has anything to do with firm-level micro data). Here I think it is agreed that the pre-2008 approach was a very bad mistake.

2) As far as I know (and I don’t know as much as I should), a second omission has received much less attention. Standard DSGE models still contain no housing sector. So the profession is attempting to understand the Great Recession while ignoring housing completely. Here, in particular, the old view that monetary policy affects output principally through residential investment isn’t so much rejected as ignored (and quite possibly forgotten).

3) Similarly, there are no inventories in models which aim to match patterns in quarterly data. I teach using “Advanced Macroeconomics” by Romer (David, not Paul or Christina). He notes that a major component of the variance in detrended (or HP-filtered) output is variance in detrended inventory investment, then writes no more on the topic. He is about as far from Lucas as an academic macroeconomist (other than Fair) can be. Assuming no inventories when trying to model the business cycle is crazy.

4) In standard models, there is one sector. There is no discussion of the distinction between goods and services (except, now, financial services) or between capital goods and consumption goods. In particular, it is assumed that there are no systematic wage differentials such that a given worker would be pleased to move from the fast-food sector to the automobile-manufacturing sector. Again, the micro-econometric research is completely ignored.

5) A lot of standard academic DSGE models assume a closed economy.

6) No one thinks that the wage and price setting mechanisms assumed in either RBC or NK models are realistic. They are defended as convenient short cuts.

7) It is assumed that there are no hiring or firing costs (or unions which object to layoffs). Similarly, the assumptions about costs of adjusting capital are not ones that anyone considered until it was necessary to make them to reconcile the data with the assumption that managers act only to maximize shareholder value.

8) Oh yes, it is assumed that there are no principal-agent problems in firms.

9) It is assumed that markets are complete even though they obviously aren’t, and general equilibrium theorists know the assumption is absolutely key to standard results.

10) It is assumed that there is a representative agent even though there obviously isn’t, and general equilibrium theorists know the assumption makes a huge, gigantic difference.

This means that most of the topics which were the focus of old business-cycle research are ignored, as are most post-1973 developments in microeconomics.

Before going on, I have to note that when each of these assumptions is criticized, special-purpose models which relax the extreme assumptions are mentioned (sometimes they are developed after the criticism). But policy is discussed using the standard models. The assumptions are immune to evidence, because no one claims they are true, yet their implications are taken very seriously.

What benefit could possibly be worth such choices? That is, what is wrong with a macroeconomic model with too many equations? One problem is that complicated models are hard to understand and don’t clarify thought. This was once a strong argument, but it is not possible to intuitively grasp current DSGE models either.

One reason to fear many equations is the experience of working with atheoretic vector autoregression (VAR) models, which were developed in parallel with DSGE. In VARs the number of parameters to be estimated is proportional to the square of the number of equations, while the number of observations of dependent variables grows only in proportion to the number of equations. More equations can therefore imply more parameters than data points. Even short of that, large VAR models are over-parametrized: they fit excellently and forecast terribly. Seven equations are clearly too many; a 97-equation VAR just couldn’t be estimated. The CC approach relied on imposing many restrictions on the data based on common sense. A 97-equation DSGE model is, in principle, possible, but ideas about which simplifying assumptions should be made are, I think, based in large part on the assumptions which must be made to estimate a VAR.
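
As a rough sketch of that parameter arithmetic (the lag length and sample size below are my own illustrative assumptions, not numbers from the post):

```python
def var_parameter_count(n_equations, n_lags):
    # Each equation has an intercept plus a coefficient on every lag of every variable.
    return n_equations * (1 + n_equations * n_lags)

n_periods = 200  # roughly 50 years of quarterly data (assumed)
for k in (7, 97):
    params = var_parameter_count(k, n_lags=4)
    observations = k * n_periods  # one observation of each dependent variable per period
    print(f"{k:3d} equations: {params:6,} parameters vs {observations:6,} observations")
```

With four lags, the 97-equation system has roughly twice as many parameters as observations, which is the sense in which such a VAR just couldn’t be estimated.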

If there are many dependent variables but each is explained by an ordinary number of independent variables, each of which is instrumented by a credible instrument, then there shouldn’t be a problem with over-fitting. The fact that other equations are estimated elsewhere in the model does not cause a spuriously good fit for an equation which doesn’t include too many parameters itself.

However, there is another cost of estimating a lot of parameters. Parameter estimation error makes forecasts worse at the same time as it makes the in-sample fit better. In the simplest cases, these two problems cause identical gaps between the in-sample fit and the out-of-sample forecast. The second problem is absolutely not eliminated by making sure each equation is well identified.

But there is a standard approach to dealing with it. Instead of imposing a restriction that some parameter is zero, one can use a weighted average of the estimated parameter and zero. This is a Stein-type pseudo-Bayesian estimator.

I will give two examples. In the now-standard approach, it is assumed that residential investment is always exactly proportional to non-residential investment. In the old approach, residential and non-residential investment were considered separately. In the pseudo-Bayesian approach, one can estimate an equation for the growth of log total investment, estimate an equation for the growth of log residential investment minus the growth of log total investment, then multiply the coefficients of the second equation by a constant less than one.

In another example, one can assume that inventory investment is zero (as in standard DSGE models) or estimate net inventory investment as a function of other variables. Adding half the fitted net inventory investment to the standard DSGE model might give better forecasts than either the now-fashionable or the old-fashioned model.
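
A minimal sketch of that Stein-type weighted-average shrinkage (the coefficient values below are made up for illustration, not estimates from any actual model):

```python
import numpy as np

def shrink_toward_zero(estimated_coefs, weight):
    """Weighted average of the estimated coefficients and zero (the DSGE-style restriction)."""
    return weight * np.asarray(estimated_coefs)

# Hypothetical OLS estimates for a net-inventory-investment equation (illustrative only).
ols_coefs = np.array([0.80, -0.30, 0.15])

# weight = 1.0 keeps the unrestricted CC-style equation,
# weight = 0.0 imposes the standard DSGE restriction (no inventory equation),
# weight = 0.5 is the "add half the fitted net inventory investment" compromise above.
for w in (1.0, 0.5, 0.0):
    print(w, shrink_toward_zero(ols_coefs, w))
```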

This is the standard approach used with high-dimensional VARs. I see no reason why it couldn’t be applied to CC models.

I see Wren-Lewis has a new post which I must read before typing more (I have read it and typed the same old same old, so you probably don’t want to click “read more”).

Comments (10)

Real Household Net Worth: Look Out Below?

In my last post I pointed out that over the last half century, every time the year-over-year change in Real Household Net Worth went negative (real household wealth decreased), a recession had either started, or was about to.  (One bare exception: a tiny decline in Q4 2011, which looks rather like turbulence following The Big Whatever.) Throughout, click for source.

The problem: we don’t see this quarterly number until three+ months after the end of a quarter, when the Fed releases its Z.1 report for the preceding quarter. The Q2 2015 report is due September 18.

But right now we might be able to roughly predict what we’re going to see four+ months from now, in the report on our current quarter, Q3, which ends September 30. We’re a bit over a month from the end of the quarter, and we have some numbers to hand.

The U.S. equity markets are down roughly 7% year-over-year (click for source).

Total U.S. equities market cap one year ago was about $20 trillion.

So a 7% equity decline translates to a $1.4-trillion hit to total market cap, which goes straight to the left-hand (asset) side of household balance sheets, because households ultimately own all corporate equity — firms issue equity, and households own it (at one or more removes); people don’t issue equity in themselves, and firms don’t own people (at least not yet). It’s an asymmetrical, one-way ownership relationship. (Note: yes, the Fed accounts for household net worth on a mark-to-market basis.)

Total household net worth a year ago was $82 trillion. The $1.4 trillion equity decline translates to a 1.7% decline in household net worth.

Meanwhile household liabilities over the last four quarters have been growing at a fairly steady rate just above 0.2% per year. There’s no reason to expect a big difference in Q3.

This suggests a 1.9% decline in household net worth over the last year, based on the equity markets alone. (My gentle readers are encouraged to add numbers for real estate and fixed-income assets.) Add (subtract) 1.5% in inflation over that period, and you’re looking at something like a 3.4% decline in real household net worth, year over year.
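
The back-of-envelope arithmetic above, as a minimal sketch (all figures are the text’s approximations):

```python
equity_market_cap = 20e12      # total U.S. equities market cap a year ago, ~$20 trillion
equity_decline = 0.07          # ~7% year-over-year market decline
household_net_worth = 82e12    # total household net worth a year ago, ~$82 trillion

equity_hit = equity_market_cap * equity_decline       # ~$1.4 trillion
nominal_decline = equity_hit / household_net_worth    # ~1.7% of net worth
nominal_decline += 0.002                              # liabilities growing a bit over 0.2%
real_decline = nominal_decline + 0.015                # plus ~1.5% inflation over the period

print(f"nominal decline ~{nominal_decline:.1%}, real decline ~{real_decline:.1%}")
```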

Unless the stock market rallies by 10% or 15% before the end of September ($2–3 trillion, or 2.5–3.5% of $80 trillion net worth), it’s likely we’ll see a negative print for year-over-year change in real household net worth when the Fed releases its Z.1 in early December of this year. And we know what that means — or at least we know what it’s meant over the last half century.

You heard it here first…

Cross-posted at Asymptosis.

Comments (9)

Predicting Recessions The Easy Way: Monetarists, MMT, And The Money Stock

I have a new post up that has implications for stock-market investment, so I decided to try posting it over at Seeking Alpha, where they’re paying me a few tens of dollars for the post (plus more based on page views — not much luck so far).

The post argues that year-over-year change in Real Household Net Worth has been a great predictor of NBER-designated recessions over the last half century. (It’s either 7 for 7, or 8 for 7, over 50+ years, depending on the threshold you use.) If you were following this measure, you would have gotten out of the market on March 6, 2008, avoiding a 50% drawdown over the next twelve months.

But the post goes farther, offering a somewhat monetarist economic explanation but using total household net worth as the measure of the “money stock.” Short story: if households have less (more) money, they spend less (more). Not exactly a radical behavioral economic assertion.

If you’re wondering how recent days’ market events have caused billions (trillions?) of dollars to “disappear,” and are pondering how to think about that, you might find it an interesting read.

Cross-posted at Asymptosis.

Comments (36)