Friday, July 25, 2014

Money is green



When I was a kid I collected aluminum cans from my neighbors, crushed them, and took them to a local scrapyard where I sold them for $0.40 per pound. I can still remember being all sticky after crushing all those soda and beer cans. Profits motivated me to recycle at an early age, but I had no idea that I was participating in such a massive, global industry:

In 2012, the seven thousand or so businesses that constitute the U.S. scrap recycling industry were responsible for transforming 135 million metric tons of recyclable waste into raw materials that could be made into new stuff. That's 135 million tons of iron ore, copper ore, nickel, paper, plastic, and glass that didn't have to be dug out of the ground or cut out of a forest. (44) . . . 
The global scrap industry . . . created a multibillion-dollar sustainable business model that stands as one of globalization's great, green successes. (85) . . . 
In 2012 Americans exported almost 22.3 million tons--or roughly 40.5 percent--of the used paper and cardboard they harvested. Of that, the majority went to China in shipping containers that otherwise would have crossed the Pacific Ocean carrying nothing more than air. It was joined by millions of tons of recycled metal and plastic, all of which went--like that paper--on what amounted to the unused portion of a round-trip ticket from China to the United States, paid for by American consumers eager for Chinese-made goods. One way or another, the boat is going back to China, and the fuel to send it there is going to be burned, whether or not the ticket is paid for. So anybody--or anything--hopping on that boat is getting what amounts to a carbon-neutral boat ride to China. . . . Of course, the same cannot be said of the weekend recycler who drives the recycling down to the local county dropoff, burning gas all the way. (87)

This is from Adam Minter's Junkyard Planet (2013), a really fun read (discussed more by Adam Ozimek here; Ozimek is how I heard about the book). It is a story about the global scrap industry, which is truly massive and serves as a reminder that markets are reasonably good at using resources efficiently, including natural resources. We all know the caveats about issues with the commons, which Minter discusses in some detail (and Ozimek mentions some of those in the context of this book).

One of the key insights from the book is that the things you probably think of when you hear the word "recycle"--the little blue bins and your selfless efforts to see that they make it to the recycling plant--are tiny compared with the massive profit-making enterprise that accounts for the bulk of recycling activity. Good intentions don't get us nearly as far as profit motives. When Minter asks a guy who runs a Christmas-tree-recycling business why people get into it, the reply is "People wanted to make money. That's all." The result, broadly speaking, is that "by the time a load of Chinese trash arrives at a landfill, very little that's reusable or recyclable is left in it" (27). And you can probably get rich if you can come up with a way to refine and sell the remaining stuff that is currently not considered to be reusable or recyclable.

The scrap industry isn't just about recycling; it also has mechanisms for reusing. Minter describes the industry (mostly Chinese) that buys broken or unused electric motors, fixes them, and resells them, and mentions this interesting idea: "The motors that used to drive U.S. [manufacturing] industry are being exported to China, refurbished, and used to drive Chinese industry" (111). There are similar markets for computer equipment, cars (of course), and lots of other stuff.

This book is a really good read. It has made me think more about entrepreneurship, how the movement of shipping containers is complicated by trade imbalances, development, and even my latest hobby, specificity. I'm not quite finished, so I may have more to say about it later.

Tuesday, July 22, 2014

"An important need to restructure"

It is clear that in an economy with an important need to restructure and hire workers in new sectors, unemployment is the result of the positive match surpluses that result from appropriability. . . . Thus, unemployment is an equilibrium response of the economic system, and it serves to restrain the bargaining position of workers in the presence of appropriability. This preserves the profitability of investment. . . . Periods of adjustment and intense gross hiring require high transitional unemployment to prevent surges in shadow wages. Unemployment keeps shadow wages at a level that makes job creation pay off.

This is from page 212 of Ricardo Caballero's Specificity and the Macroeconomics of Restructuring (2007), which I have found very useful. Here are charts of gross output and employment by sector in the vicinity of the Great Recession:




I'm just thinking; I don't have anything else to say about these right now.

Monday, July 14, 2014

Housing, finance, and the macroeconomy

That is the title of a new NBER working paper by Morris A. Davis and Stijn Van Nieuwerburgh, and I think it is going to be a chapter in something--the Macro Annual, a handbook, or something like that. In my view, these two have done some of the best work on housing; see Davis' stuff here and Van Nieuwerburgh's stuff here.

The paper is really, really good.

It is a survey of the macro literature on housing. The sections of the paper are

  • stylized facts
  • housing and the business cycle
  • housing over the life cycle and in the portfolio
  • housing and asset pricing
  • the housing boom and bust and the Great Recession
  • housing policy

There's something for everyone (well, almost; I'll discuss below). Most of the sections describe a simple model that characterizes the literature on that topic, discussing the model's interesting implications and shortcomings. The paper covers a lot of ground, so it doesn't lend itself well to summarizing. Go read it if you want your mind blown.

Here's a slice:

Housing is not only an important asset in the portfolio, it also has several features that make it different from investments in financial assets. First, it is illiquid in the sense that changing the quantity of housing may take time and/or require incurring substantial transaction costs. Second, it is indivisible: A limited assortment of types and sizes are available for purchase at any time (including a minimum size). Third, home ownership and housing consumption are typically intimately linked. Most households own only one home and live in the house they own. Fourth, housing represents the main source of pledgeable capital against which households can borrow. Investment in housing is much more leveraged than investments in other financial assets and the value of owned housing limits the amount of leverage in households' portfolios. Fifth, housing is tied to a particular labor market: People usually live where they work. (24)

I wonder how many people realize just how weird housing is. In a previous post I wrote this:

For most people, an owned house is a massively concentrated, highly leveraged, totally undiversified bet on one asset class (real estate) in one geographical region. It's a long-term bet on the local labor market and natural environment. It may be a long-term bet on the owner's job match or occupation. The home purchase includes a bundle of local amenities--school district, voting district, neighbors, public administration, commute, etc.--and the new owner is making a bet about the outlook for that bundle as well. 
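The leverage in that bet is worth making concrete. A minimal sketch of the arithmetic, with made-up numbers (a stylized price and down payment, not data): with 20 percent down, a 10 percent fall in the house price wipes out half of the owner's equity.

```python
# Stylized leverage arithmetic (made-up numbers): a 20%-down buyer is
# levered 5-to-1, so house-price moves hit equity five times as hard.
price = 300_000
down_payment = 0.20
equity = down_payment * price     # 60,000 of equity at purchase
loss = 0.10 * price               # a 10% price decline: 30,000
new_equity = equity - loss
print(new_equity / equity)        # 0.5 -- half the equity is gone
```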

The issue of concentration and asset class is an obsession of mine. Say Davis and Van Nieuwerburgh:

Renters and owners choose substantially different portfolios of financial assets, highlighting that conclusions drawn about optimal portfolio allocations over the life-cycle from models that do not include a rental/own housing choice may be misleading. (36)

It drives me crazy that when I read books about personal investing they rarely (if ever) mention housing as part of the portfolio allocation problem. Another point I've made in previous posts is that buying a house is buying a lifetime stream of rental inventory. That is, rather than paying for housing services as they arrive, like renters do, owners buy a massive flow of services all at once. Tenure is a pretty complicated decision! From the paper's discussion of tenure models:

Although housing is risky, driving down demand, current housing is a hedge against future housing demand shocks since price changes of housing units in the same market are correlated. . . The hedging demand dominates its inherent risk. . . . When households expect to increase their holdings of housing in the future, they buy a bigger home today in response to an increase in house-price uncertainty. If, instead, households expect to down-size in the future, they reduce their holdings of housing today in response to an increase in house-price uncertainty. (35)
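The stream-of-services point above can also be put in numbers. A minimal sketch using a Gordon-growth present value, with made-up rent, discount, and growth figures (none of these come from the paper): an owner is effectively pre-purchasing a growing stream of rents, whose price is rent / (r - g).

```python
# Present value of a perpetual, growing rental stream (Gordon growth).
# All parameter values below are illustrative assumptions.
def pv_rental_stream(rent, r, g):
    """Price of a claim on rent growing at rate g, discounted at rate r."""
    assert r > g, "discount rate must exceed rent growth"
    return rent / (r - g)

# $12,000/year rent, 6% discount rate, 2% rent growth:
print(round(pv_rental_stream(12_000, r=0.06, g=0.02), 2))  # 300000.0
```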

What's missing?

My dissertation focuses on a question that is not covered in this paper: housing as collateral for entrepreneurship. Early in grad school, I was looking through some firm dynamics data and noticed that young firms were hit particularly hard by the Great Recession. Then I noticed that both housing and young firm activity started collapsing in 2006, leading the NBER recession date by between 9 and 21 months. I hypothesized that the collapse in the value of housing collateral could lead to a decline in entrepreneurship via a collateral channel. There are now some empirical papers finding suggestive evidence that housing and young firm activity are related (and I have some related empirical work in progress). See a summary here, see also here. This topic has received a lot more attention recently.

In 2012 I did some informal interviews with a handful of bankers. They all told me that housing collateral is important for young firms, particularly brand new ones. The bankers use earnings history to make decisions about many small business loans, but new firms have no earnings history. They must have collateral, and for many entrepreneurs a house is the only collateral lying around. One banker told me that once house prices started falling, he shut down lending to new businesses entirely. Others said that they started significantly discounting the value of housing collateral and tightened loan-to-value ratios. In short, the anecdotal evidence suggested that housing collateral mattered a lot for lending to young businesses.

I built a DSGE model to investigate the topic. In the model, there is a corporate sector but households can engage in production if they want. People can own or rent houses, and owned houses can be used as collateral for any kind of borrowing (including capital rental for your business). In early versions of the model, I took the house price as exogenous. That route taught me the limitations of partial equilibrium reasoning: when house prices receive no feedback from housing investment decisions, things can get pretty wacky. If people think house prices are about to fall, everyone can sell their house (or eat it, if possible), rent, and wait for things to bottom out, relying on cash from the sale to secure ongoing borrowing.* More broadly, general equilibrium matters for thinking about entrepreneurship. The opportunity cost of starting a business is often a wage, so you can get more entrepreneurship (at the extensive margin, at least) by doing things that destroy the labor market (which is consistent with some evidence; see Robert Fairlie's stuff). So the supply-side financial frictions that affected large firms can have an ambiguous effect on entrepreneurship generally. Housing collateral, on the other hand, directly affects firms whose balance sheets are tied to households.

You might think that building a model where entrepreneurs need collateral and housing happens to be collateral is like assuming the result. But the model doesn't have to deliver the result that lower home values reduce entrepreneurship. People could respond to the lower house prices by holding more housing (which is what happens, e.g., if housing enters utility Cobb-Douglas style), or by holding more financial savings. But, quantitatively, these options aren't enough. In model experiments, a lower house price is associated with less entrepreneurial activity. This happens despite the fact that there is an unconstrained corporate sector that can make up for lost output from missing entrepreneurs, so wages are not decimated and aggregate demand need not fall dramatically. So I can isolate the effect of housing on entrepreneurship without confounding it with a bloodbath in the rest of the economy. The paper isn't totally finished, but I think the model is generating interesting results.

Who cares? It matters if recessions that are accompanied by (or preceded by) housing sector collapse are likely to also see a collapse in young firm activity, since young firm activity is large compared with net job flows. A labor market is likely to recover from shocks more quickly if firm dynamics are robust; we don't want to have to rely on old firms as a group to generate labor market recoveries, since the gross job creation of expanding old firms is typically matched by gross job destruction of shrinking ones.

So I think its role as business collateral is another reason to care about housing.


*My partial equilibrium version of the model did teach me things, though. An exogenous house price lends itself to the interpretation of price as a technology parameter. A low house price means you can convert a few consumption/capital goods into lots of housing. In this sense, falling house prices are somewhat similar to rising TFP! This generalizes to the endogenous house price case, more or less, particularly if aggregate housing supply is somewhat inelastic. A lot of people are inclined to see falling house prices as all doom and gloom, but it's actually really good for people buying houses, and like any technology shock it can have positive spillovers for other people as well (though probably not positive on net, for homeowners). The technology interpretation also helps the case that model results for entrepreneurship are robust, since the lower price is making people better off in other ways.

Tuesday, July 8, 2014

Productivity and reallocation

A few weeks back, Robin Harding wrote a nice FT piece on the "productivity puzzle." Yesterday Ryan Avent wrote a really nice note about the notion that productivity growth is very unpredictable. This stuff got me thinking.

Always remember that aggregate concepts are, well, aggregated. Aggregate productivity growth can be usefully divided into (a) productivity growth within businesses and (b) the failure and exit of low-productivity businesses and the creation of high-productivity businesses. Foster, Haltiwanger, and Syverson (2008) found that in manufacturing, about a third of productivity growth comes from establishment entry and exit (the fraction is likely higher in retail).* Another reason to focus on entry is that young firms invest proportionally more in R&D (Acemoglu et al. 2013). Further, even among incumbent businesses, the effect of firm-level productivity improvements on aggregate productivity depends on the degree to which innovating firms gain resources and non-innovators lose them; the role of reallocation in aggregate productivity growth is therefore huge (Acemoglu et al. say it's 80 percent).
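A toy shift-share decomposition makes the within-versus-reallocation distinction concrete. The numbers below are made up for illustration (they are not from the papers cited): aggregate productivity is an employment-share-weighted average of firm-level productivity, so its change splits into a within term, a reallocation term, and a cross term.

```python
def aggregate(prod, share):
    # Employment-share-weighted average productivity
    return sum(p * s for p, s in zip(prod, share))

prod0, share0 = [1.0, 2.0], [0.5, 0.5]  # period 0: two incumbents
prod1, share1 = [1.1, 2.2], [0.3, 0.7]  # period 1: both improve; labor shifts to the leader

within  = sum((p1 - p0) * s0 for p0, p1, s0 in zip(prod0, prod1, share0))
realloc = sum((s1 - s0) * p0 for s1, s0, p0 in zip(share1, share0, prod0))
cross   = sum((p1 - p0) * (s1 - s0)
              for p0, p1, s0, s1 in zip(prod0, prod1, share0, share1))
total   = aggregate(prod1, share1) - aggregate(prod0, share0)

print(round(total, 2), round(within, 2), round(realloc, 2), round(cross, 2))
# 0.37 0.15 0.2 0.02 -- here reallocation contributes more than within-firm growth
```

Entry and exit slot into the same accounting as extra terms (entrants' shares times their productivity gap, and likewise for exiters).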

It turns out that there is lots of productivity heterogeneity among businesses, with Chad Syverson (2011) finding that the 90th percentile (in terms of productivity) is twice as productive as the 10th percentile (in manufacturing, within 4-digit industries).** In some senses, this fact reflects unrealized potential productivity growth. In a frictionless world, unproductive businesses always fail and productive ones always grow. In the real world, the correlations we need are still there, but things may be changing.

In short, productivity growth doesn't only depend on new technology. It also depends on the allocation and reallocation of resources--labor, capital, etc. If the reallocation machine breaks down, we might not capture all the gains of innovation (depending on the reasons for the breakdown). On the other hand, we can survive a technology growth slowdown if we get better at putting resources where they can be used most productively.

As a side note, recall the Caliendo et al. paper that found that eliminating shipping costs could boost aggregate productivity by as much as 50 percent. I think there are reasons to be optimistic about our ability to effectively reduce distance: the growth of services, 3D printing technology, technologies that can address the last-mile problem, big data, and so forth. But letting these things work means letting resources be allocated to the firms that are pushing them.


*Note that the data used here track establishments, not firms. I'm being loose with language in this post.
**This paragraph and the one preceding it borrow heavily from the literature review contained in some joint work I have with Haltiwanger, Jarmin, and Miranda, forthcoming. And yeah, I have some justified Impostor Syndrome regarding that project.

Saturday, July 5, 2014

Excess water demand in California


WSJ reports:

About 60 California cities and agencies have imposed mandatory water-use cutbacks, some as high as 50%. In many cases, the rules are enforced by charging higher fees for excess usage. In others, inspectors are deployed to crack down on scofflaws. . . . 
Among the most aggressively monitored locales is the state capital, Sacramento. . . . 
This year, the city cut outdoor watering to two days a week from three. Because only about half its homes have water meters to measure use, Sacramento must rely on inspectors to help enforce the rules. 
A team of 40 inspectors working for the city's Department of Utilities investigate complaints. Sacramento, a city of 475,000, had received 7,604 water-use complaints as of June 18, said city spokeswoman Jessica Hess. 
The city and other water districts, meanwhile, are offering carrots along with sticks, paying residents to replace their turf lawns with drought-resistant vegetation.

The state of California does not have enough water to meet demand. One way to eliminate the excess demand is to raise water prices. If there are externality issues, stick a tax on it. This isn't so different from "charging higher fees for excess usage." But for the most part, municipalities have instead opted for the hodgepodge of costly and overlapping remedies described above. Sometimes prices are actually raised, but the timing screws up the incentives. A lot of people seem pretty upset by the restrictions--one wonders if they'd be willing to pay more if they could have more water. The guy with the landscaping business is a pretty good example of Bastiat's "unseen" costs of policy.
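The price-mechanism point is just supply and demand. A minimal sketch with a made-up linear demand curve and a fixed water stock (none of these numbers describe California): at the administered price there is excess demand, and a single price increase clears the market with no inspectors at all.

```python
# Toy water market: fixed supply, downward-sloping demand (made-up numbers).
def demand(p, a=100.0, b=2.0):
    return max(a - b * p, 0.0)

supply = 60.0                      # fixed stock available this season
p_administered = 10.0
print(demand(p_administered) - supply)  # 20.0 -- excess demand at the low price

# Find the market-clearing price by bisection.
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if demand(mid) > supply:       # still excess demand: price must rise
        lo = mid
    else:
        hi = mid
print(round(0.5 * (lo + hi), 2))   # 20.0 -- the price that clears the market
```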

A couple of water districts are actually letting prices do some work (from ABC):

Two water districts and a pair of landowners in the heart of the state's farmland are making millions of dollars by auctioning off their private caches. . . . 
In California, the sellers include those who hold claims on water that date back a century, private firms who are extracting groundwater and landowners who stored water when it was plentiful in underground caverns known as water banks.

This makes me wonder if the state is actually open to flexible prices, but cities aren't.* Or perhaps utilities are afraid of "price gouging" fights.** In any case, while cities are piling on unenforceable rules, asking neighbors to tattle, and hiring bureaucrats to drive around busting people, some landowners took steps to actually alleviate the drought by storing some water in hopes of selling it. It was a brave move. I wonder how many others throughout the state would have taken similar steps if they thought they could count on a price mechanism.

I don't mean to oversimplify. This is an epic public policy problem with a lot of complicated details. But it does seem likely that water is typically underpriced in the West.

I have several other water posts here.


*If I understand correctly, water prices are set by this organization. My assumption is that utilities request price changes and the Commission makes a decision. I would like to know more about this.
**I once asked about this in a comment section at California Water Blog. A commenter responded with the suggestion that for utilities to obtain approval of rate increases, they may have to open their books to regulators--so they are hesitant to do so. My prior is that their books are already fairly open to regulators, but this could be an interesting hang-up.

Wednesday, July 2, 2014

I use Fortran; it may be right for you, too (or not)

I don't know who would care to read this post. Maybe econ grad students.

Aruoba and Fernández-Villaverde have a nice paper comparing programming languages for solving a standard DSGE model. They have provided a nice public good here, and the paper is worth a look for economists who code. The headline finding is that C++ and Fortran are still the fastest, and (somewhat surprisingly) C++ is slightly faster.

I originally used Matlab for my dissertation model. It was taking a long time to solve, and people in my department finally convinced me to switch to Fortran.* The most time-intensive parts of my solution algorithm take about one tenth of the time they took in Matlab. Other parts got even bigger speedups.

A lot of people give me a hard time about Fortran and tell me I should switch to Python or something similar. The reason I won't do that is clear enough from the paper. Python is, by all accounts, a very intuitive and versatile language. But my model can sometimes take 24 hours to solve, and even multiplying that by two or three would be very costly. To calibrate (or estimate) a model, one must solve it many times. Also, other people in my department use Fortran (it's pretty popular in macro), so there are some nice agglomeration returns. Fortran is very common in scientific computing, so there is a large library of algorithms you can take off the shelf (see, e.g., Numerical Recipes). It's a really easy language to learn--in fact, it's fairly similar to Matlab.

A common critique of Fortran (voiced by the first commenter here) is that, these days, hardware is cheap and programmers are expensive--so easier, more versatile languages are best. That's probably true in much of industry, particularly things like web design. But for tasks that require serious number crunching, and in an academic world with limited resources, hardware is still a binding constraint (and grad student labor--i.e., mine--is cheap). I've been solving my model on 180 processors. A lot of people don't have access to that kind of hardware (until a few months ago, I couldn't use more than 18). Furthermore, there are diminishing returns to parallelization: above 180, I get basically no speedup from adding workers. So I'm not even sure that better hardware could offset Fortran's speed advantage in my case. (Right now, other people in my department are probably wishing I would quit using 180 processors...).
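The diminishing returns to parallelization are what Amdahl's law predicts: if a fraction s of the run is inherently serial, the speedup on n workers is bounded above by 1/s no matter how many workers you add. The serial fraction below is a made-up illustration, not a measurement of my code.

```python
def amdahl_speedup(n, serial_frac):
    """Speedup on n workers when a fraction serial_frac of the work
    cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n)

# With even half a percent of serial work, speedup is capped at 1/0.005 = 200:
for n in (18, 180, 1800):
    print(n, round(amdahl_speedup(n, serial_frac=0.005), 1))
# Going from 18 to 180 workers helps a lot; 180 to 1800 adds much less.
```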

If you are doing representative agent models, the speed differences between languages are probably irrelevant. In that case, you probably care more about ease of use and applications other than the number crunching, like making charts. Fortran is pretty bad in this department--I dump all of my output into Matlab and make charts there, and I've been meaning to move those codes over to Python or R so I won't be so reliant on license stuff. But if you plan to only do those kinds of models, Fortran is probably not the right choice. Use Dynare, which is awesome.

If you are planning to solve models with some nontrivial heterogeneity, you need to choose your language carefully. In case you don't know: in a model in which agents differ over a state space, equilibrium prices don't just fall out of a first-order condition. You have to solve for them. The usual way is to guess prices, obtain policy functions, add up everyone's choices, check market clearing, and guess again. While a rep agent model only requires you to find policy functions once, a het agent model requires you to do it many times while you search for the right prices. (A nice side effect of solving models this way is that you get to see partial equilibrium results while it solves.) Computing time grows exponentially with the number of heterogeneity dimensions you have, due to the curse of dimensionality. Also, the more prices you have to find, the longer it will take (here's a tip: constant returns to scale technology makes factor prices move in lockstep, so knowing one implies the other). When I went from needing to find one price to needing to find two, it more than doubled my computation time.
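The guess-prices-and-iterate loop described above can be sketched in a few lines. The inner step (solving every agent's policy function and aggregating over the distribution) is where all the computing time goes in a real model; here it is replaced by a stand-in closed form so the outer loop runs, and every functional form and parameter is an illustrative assumption, not any particular model.

```python
def capital_demand(r, alpha=0.36, delta=0.08):
    # Firm FOC with Cobb-Douglas technology and labor normalized to 1:
    # r + delta = alpha * K**(alpha - 1), solved for K.
    return ((r + delta) / alpha) ** (1.0 / (alpha - 1.0))

def aggregate_capital_supply(r):
    # Stand-in for the slow step: "obtain policy functions, add up
    # everyone's choices". Upward-sloping in r, as household savings
    # typically are.
    return 4.0 + 40.0 * r

# Outer loop: guess the price, check market clearing, update by bisection.
lo, hi = 0.001, 0.10
for _ in range(60):
    r = 0.5 * (lo + hi)
    excess = capital_demand(r) - aggregate_capital_supply(r)
    if excess > 0:   # demand exceeds supply: the interest rate must rise
        lo = r
    else:
        hi = r
print(round(r, 4))   # the market-clearing interest rate
```

With more prices (say, a wage and an interest rate) the bisection becomes a multidimensional root-finding problem, which is part of why each extra price is so costly.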

This stuff matters because I think some of the most interesting work being done in macro right now is the empirical stuff based on micro data. To me, heterogeneity is what makes macro interesting. The theories that have to go with the rich micro data are often going to require hard computational work.


*I'll save commenters some time and simply note that I've already heard the one about how you used Fortran in college in the 1970s. It is somewhat funny that this language is still in wide use in scientific computing; but it's also not a huge surprise, since doing floating point calculations over and over again doesn't require the latest bells and whistles. We're not trying to build Instagram here. Also, modern Fortran is a pretty different language from Fortran 77 (the standard was last updated in 2008).

Monday, June 30, 2014

Most firms are small, but most people work at large firms

Note: Firm size bins are not uniform in scope

The chart above shows that it can sometimes be misleading to say that "X% of US businesses have characteristic Y" (an example). Around 55 percent of US firms are in the smallest size category (less than 5 employees), but only 5 percent of US workers are employed at those firms. Over 70 percent of workers are employed by firms with at least 50 employees. Furthermore, the share of employment accounted for by large (and old) firms has been steadily growing.
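The chart's point in miniature, with made-up numbers chosen only to match the qualitative pattern (these are not the actual BDS figures): the share of firms in a size bin and the share of employment in that bin can be wildly different.

```python
# Toy firm-size distribution (illustrative numbers, not official data).
size_bins = ["<5", "5-49", "50-499", "500+"]
n_firms   = [3_300_000, 2_100_000, 500_000, 100_000]
avg_emp   = [2, 15, 120, 3_000]   # average employees per firm in each bin

emp = [n * e for n, e in zip(n_firms, avg_emp)]
firm_share = [n / sum(n_firms) for n in n_firms]
emp_share  = [e / sum(emp) for e in emp]

for b, fs, es in zip(size_bins, firm_share, emp_share):
    print(f"{b:>7}: {fs:5.1%} of firms, {es:5.1%} of employment")
# The smallest bin holds a majority of firms but a sliver of employment.
```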

I don't own this image; I found it on the Internet.