All you need is cash

Published by Anonymous (not verified) on Tue, 18/02/2020 - 8:00pm in

Andreas Joseph, Christiane Kneer, Neeltje van Horen and Jumana Saleheen

Financial crises affect firm growth not only in the short-run, but even more so in the long-run. Some firms permanently gain while others lose, and cash is a crucial asset to have when the credit cycle turns. As we show in a new Staff Working Paper, having cash at hand allows firms to continue to invest during the crisis while industry rivals without cash have to divest. This gives cash-rich firms an important competitive edge that not only benefits them during the crisis but lasts well beyond the crisis years.

But first, how are cash holdings – a firm’s deposit to asset ratio – actually distributed across sectors and firms? Figure 1 provides some insight. It looks at cash holdings just prior to the crisis, but these patterns have not changed much over time. Each dot represents an industry in the UK and shows how much cash firms in that industry have on average (horizontal axis) and how much these holdings vary within that industry (vertical axis). A striking fact stands out. Firms’ cash holdings differ greatly not only across but also within narrowly defined industries. This means that at any given moment in time some firms in an industry will have lots of cash at hand while others have very little.

Figure 1: Variations in cash holdings by industry (2006)

Notes: This figure plots the correlation between mean and standard deviation of the cash holdings of UK firms at the 4-digit industry level. Cash holdings are defined as deposits over total assets and measured in 2006.

Well, does this matter? Maybe not. When the economy is doing well, firms’ cash holdings do not make much of a difference. This is roughly demonstrated in the top panel of Figure 2. Here we first rank firms in each industry according to the size of their cash holdings compared to other firms in that industry in the year 2000. Red means little cash compared to one’s industry rivals while green means lots of cash. Next, we track investment for these firms over time, i.e. the growth of their fixed assets, such as buildings, machines, office equipment, patents etc. There is hardly any relationship between firms’ cash holdings and their investment between 2001 and 2007: both cash-rich and cash-poor firms invested during this period. We now repeat this exercise, but measure cash in 2006 instead. That is, we rank firms according to the size of their cash holdings relative to their rivals just prior to the start of the global financial crisis. The picture changes dramatically, as seen in the bottom panel of Figure 2. While firms with cash continued to invest throughout the crisis, cash-poor firms were shrinking their fixed assets. Perhaps more surprisingly, this divergence in investment behaviour became even more pronounced during the recovery period. Cash thus seems a crucial asset to have when the credit cycle turns.

Figure 2: Investment high vs low cash firms: pre-crisis and crisis period

Notes: These figures plot the average fixed asset growth for firms in each percentile of relative cash within the 90 percent interquartile range. In panel A average fixed asset growth is tracked over the period 2001-2007 and in panel B over the period 2007-2014. Fixed asset growth is defined as the log difference between 2001 and year 2001+j (pre-crisis period) and between 2007 and 2007+j (crisis period). Relative cash is calculated by subtracting from the firm’s cash holdings its industry mean and dividing the difference by the industry standard deviation; it is measured in 2000 for the pre-crisis period (panel A) and in 2006 for the crisis period (panel B). Industry mean and standard deviation are determined at the 4-digit level.
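The relative-cash normalisation and log-growth measure described in these notes are easy to sketch. The snippet below uses made-up numbers for a single hypothetical 4-digit industry; the actual analysis works with firm-level deposit-to-asset data.

```python
# Illustrative calculation of the "relative cash" measure described in the
# figure notes, for one hypothetical 4-digit industry. All numbers are made
# up; the paper works with firm-level deposit-to-asset ratios.
import math
import statistics

# Hypothetical cash holdings (deposits over total assets) in 2006
cash_holdings = {"firm_a": 0.05, "firm_b": 0.20, "firm_c": 0.35}

# Relative cash: (firm's holdings - industry mean) / industry standard deviation
mean = statistics.mean(cash_holdings.values())
std = statistics.stdev(cash_holdings.values())
relative_cash = {firm: (c - mean) / std for firm, c in cash_holdings.items()}

# Fixed asset growth between 2007 and 2007+j: log difference of fixed assets
fixed_assets_2007, fixed_assets_2014 = 100_000, 110_000  # hypothetical, GBP
growth = math.log(fixed_assets_2014) - math.log(fixed_assets_2007)
```

A firm one standard deviation above its industry mean gets a relative cash of +1, one standard deviation below gets -1, which is what the red-to-green colour ranking in Figure 2 captures.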

Why does cash matter?

Keeping cash idle might be expensive during normal times, but when a financial crisis hits having cash at hand can positively affect firm investment for several reasons. First, cash provides a firm with an internal source of funds when earnings decline and it becomes more difficult to borrow from banks. Second, cash preserves its value when asset prices drop and can serve as high-quality collateral that a firm can pledge to raise external funds. Third, a firm with cash does not have to increase its cash holdings for precautionary reasons and can use its funds for investment instead.

Thus, firms with ample cash at hand can more easily continue to operate, replace fixed assets that have depreciated and even seize profitable investment opportunities when they come along despite the crisis. Their cash-starved rivals by contrast have to forgo investment opportunities, may be forced to shrink their fixed assets and may even struggle to survive. As a result, an investment gap between cash-rich and cash-poor firms opens up.

This brings about a shift in competition dynamics. As cash-rich firms grow their fixed assets, their productive capacity expands. At the same time the productive capacity of cash-poor firms shrinks. During the recovery phase, when demand returns and credit conditions improve, cash-rich firms thus have more capacity to meet this demand and can subsequently reinvest their earnings, increasing their capacity further. Cash-poor firms, on the other hand, have difficulties catching up with their cash-rich rivals and see their positions weaken further. As a result, the investment gap between cash-rich and cash-poor firms that opens up during a crisis period is amplified during the recovery period. This explains the growing divergence between “green” and “red” firms observed in the bottom panel of Figure 2.

Who needs cash most?

While Figure 2 shows some very striking patterns, it is important to make sure that these differences are caused by differences in pre-crisis cash holdings and not by other factors. This is especially important as consumers reacted strongly to the crisis and were buying fewer products. In our analysis we therefore control for a large number of firm characteristics that might also explain how much a firm can invest when the credit cycle turns, as well as for economic conditions in the region where the firm is located and changes in demand and productivity affecting each industry. Even after accounting for all this, we find a strong cash effect.

In numbers, cash-rich firms grew their fixed assets by 4 percentage points more between 2007 and 2009 – the depth of the crisis – than their cash-starved industry rivals. By 2014 this number had tripled to 12 percentage points – a big difference. This effect was present both for firms whose cash holdings fluctuated a lot over time and for firms whose cash holdings were very stable. The cash effect was unique to the crisis and post-crisis recovery period and was not present in the tranquil period that preceded the global financial crisis. This suggests that the tightening of credit conditions played an important role in driving the cash effect.

So who benefits most? Not surprisingly, it’s the young and small firms. During a financial crisis banks are more likely to cut lending to young and small firms. Having access to cash should thus be especially advantageous for these firms. Indeed, we find that a young firm (a business that is less than 10 years old) with cash invested 15 percentage points more over the period 2007-2014 than a young firm without cash, and a small firm with cash invested 19 percentage points more than a small firm without cash.

As an illustrative example, let’s compare two hypothetical small coffee shops somewhere in the UK with different cash holdings before the start of the crisis. Let’s say that both have equipment, such as coffee machines, grinders, furniture, computers etc., worth £100,000. Using our estimates, the coffee shop with high levels of cash will, by 2014, have grown its equipment to a value of roughly £110,000, while the cash-poor shop’s equipment will only be worth just over £90,000. In other words, the cash-rich business owner can replace their coffee machines with the latest models and even buy an extra machine to expand the business. The other one instead has to scale down and keep old machines running for longer. Which one would you be more likely to go to for your morning latte? Very likely, the first one.
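The back-of-the-envelope arithmetic behind this example is straightforward. The growth rates below are rough readings of the figures quoted above, not exact estimates from the paper:

```python
# Rough arithmetic for the coffee-shop example. The +/-10% growth figures
# are approximate readings of the numbers quoted in the text, not exact
# estimates from the paper.
initial_equipment = 100_000  # both shops start with £100,000 of equipment

cash_rich_2014 = initial_equipment * 1.10  # grows to roughly £110,000
cash_poor_2014 = initial_equipment * 0.90  # shrinks to roughly £90,000

# Gap in cumulative growth, in percentage points of initial equipment:
# in line with the ~19pp cash effect for small firms over 2007-2014
gap_pp = (cash_rich_2014 - cash_poor_2014) / initial_equipment * 100
```

The roughly 20-percentage-point gap between the two shops is what the small-firm estimate of 19 percentage points translates into at this scale.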

Indeed, we find that cash-rich firms, especially the young and small ones, were able to capture market share from their cash-poor rivals during the crisis and even more so during the recovery phase. And these firms were also able to generate greater profits over time. When you have more and better coffee machines, you are able to serve more customers and can poach them from your competitors.

To conclude

A financial crisis not only impacts firms in the short-run, but also in the long-run as some firms permanently gain while others permanently lose. Having cash at the onset of a crisis gives a firm an important competitive edge during the crisis that it can further exploit during the recovery phase. A liquid balance sheet when the credit cycle turns is thus an important determinant of firms’ long-term growth after a crisis – and a factor largely neglected by economists and policy makers.

Andreas Joseph works in the Bank’s Advanced Analytics Division, Christiane Kneer works in the Bank’s Financial Stability Strategy and Risk Division, Neeltje van Horen works in the Bank’s Research Hub and Jumana Saleheen works at the CRU Group.

If you want to get in touch, please email us at or leave a comment below.

Comments will only appear once approved by a moderator, and are only published where a full name is supplied. Bank Underground is a blog for Bank of England staff to share views that challenge – or support – prevailing policy orthodoxies. The views expressed here are those of the authors, and are not necessarily those of the Bank of England, or its policy committees.

The birds, the bees and the Bank? The birth-rate channel of monetary policy

Published by Anonymous (not verified) on Fri, 14/02/2020 - 8:00pm in

Fergus Cumming and Lisa Dettling

Children are expensive. Swings in families’ cash-flow can therefore move the dial on decisions about whether and when to have a baby. For mortgaged families with an adjustable interest rate in 2008, the sharp fall in Bank Rate amounted to a windfall of around £1,000 per quarter in lower mortgage payments. In this post we show that people responded to this cash-flow boost by having more children. In total, we estimate that monetary policy increased the birth rate in the following three years by around 7.5%. That’s around 50,000 extra babies.

Who received the cash-flow shock?

In a new paper we employ administrative data covering the universe of births and mortgage originations in the UK to explore how the dramatic fall in interest rates in the Great Recession influenced households’ decisions to have a baby. In 2008, around half of UK families of child-bearing age were mortgaged home-owners. Of those, about half had mortgages that were directly tied to Bank Rate. The other half had mortgages with an initial fixed-rate period, which would reset to an adjustable rate sometime over the next few years.

Thus, when the Bank of England lowered its policy rate by 4.5 percentage points during 2008 and 2009, about a quarter of families’ mortgage payments fell immediately, another quarter of families’ payments would fall at some point over the next couple of years, and the other half of families would never be affected (see Figure 1). For families with an adjustable rate, interest rate pass-through was sizeable and swift, lowering their mortgage payments by around 42 percent, or 7.5% of their take-home pay. Our paper uses these pre-determined differences in mortgage choices across local authorities and age groups as a “natural experiment” to examine how monetary policy affects families’ fertility decisions over the next three years.
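To see how a rate cut of this size can translate into payment falls of this magnitude, here is a sketch using the standard annuity formula for a repayment mortgage. The balance, rates and term are hypothetical, chosen only to land near the pass-through figures quoted, not actual loan terms from the paper:

```python
# Illustrative pass-through of a Bank Rate cut to mortgage payments.
# All loan terms (balance, rates, term) are hypothetical assumptions,
# chosen only to land near the pass-through figures quoted in the post.

def monthly_payment(balance, annual_rate, years):
    """Standard annuity formula for a repayment mortgage."""
    r = annual_rate / 12      # monthly interest rate
    n = years * 12            # number of monthly payments
    if r == 0:
        return balance / n
    return r * balance / (1 - (1 + r) ** -n)

# Assume a £150,000, 25-year tracker whose rate falls from 6.0% to 1.5%
# as Bank Rate is cut by 4.5 percentage points over 2008-09
before = monthly_payment(150_000, 0.060, 25)
after = monthly_payment(150_000, 0.015, 25)

payment_drop = (before - after) / before   # proportional fall in payments
quarterly_saving = (before - after) * 3    # cash-flow windfall per quarter
```

Under these assumed terms the payment falls by just under 40 percent, worth around £1,100 a quarter – the same ballpark as the roughly 42 percent fall and £1,000-per-quarter windfall cited in the posts.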

Figure 1: Spatial variation in adjustable rate mortgagors

Sources: ONS, PSD and own calculations.

Our results indicate that a 1 percentage point reduction in Bank Rate – which decreased mortgage payments by 10 percent on average – leads to a 5 percent increase in the birth rate among families on an adjustable-rate mortgage. For the population as a whole, this is equivalent to a 2 percent increase in the UK birth rate. In aggregate, our estimates imply that the loosening of monetary policy in late 2008 and 2009 led to around 15,000 extra babies being born in 2009. In the paper we show that the effects were larger for people with lower incomes or more debt.

Mortgaged families across the pond in the United States were not so lucky. The prevalence of long-term fixed rate mortgages meant that most households saw few immediate benefits of looser monetary policy, at least in terms of their mortgage payments. Although aggregate birth rates rose in the UK over the period we study, in the US there was actually a Great Recession “baby bust”. Figure 2 shows birth rates in the US and UK, with grey recession bars, and the path of monetary policy shown by the dashed black line. In both countries, birth rates begin to fall almost as soon as the unemployment rate begins to rise, but in the UK that trend is reversed once Bank Rate begins to drop, and families keep more of their take-home pay for themselves.

Figure 2: Birth rates in the UK and US in the Great Recession

Note: Solid lines are seasonally adjusted quarterly birth rates by age-group dated to the quarter of conception and expressed per 1,000 women. The dashed line shows the path of Bank Rate (left column) and the Federal Funds Rate (right column). The grey bar indicates the period in which unemployment increased from its trough to its peak in each country during the Great Recession.

Sources: ONS and Bank of England (left column) and NCHS, Census, BLS and Federal Reserve Board (right column).

We show that in the absence of the cut in Bank Rate, declining employment and house prices would have otherwise led to a decline in birth rates in the UK. In other words, the fertility-stimulus effects of UK monetary policy were sufficiently large to outweigh the headwinds of the recession.

All else equal

Of course, we need to show that the drop in interest rates caused a change in UK fertility decisions, particularly since there was a lot going on in the Great Recession. We therefore control for time fixed effects, so that our estimates are net of any changes in economic conditions that affected everyone, regardless of their housing status. We also control for local house prices and unemployment rates to allow for differences in local economic conditions. Essentially, our strategy leverages the fact that all families in a particular age group and local authority are similarly exposed to changing economic conditions, but some families transitioned to an adjustable rate at some point after Bank Rate fell. Cash-flow shocks therefore exhibited variation across both space and time. But it is important that these repayment reductions were indeed a “shock”. Luckily, survey evidence from the summer of 2008 indicates that just 10 percent of households expected rates to fall at all in the coming year.

It is hard to know whether our estimates represent a shift in the timing of births, or in the total number of children. Given the recentness of the Great Recession, we won’t know for sure for another decade or so. But there is some suggestive evidence in favour of a permanent boost to the population. First, birth rates among older women beyond 2010 did not obviously dip down, which might be expected if women were bringing forward their child-bearing plans. Second, survey evidence on interest rate expectations, combined with the sustained low-interest-rate environment, suggests that the shock we study was perceived to be permanent. Standard models suggest this should lead to more babies per woman.

Policy implications

Our paper provides new evidence on one channel of monetary policy transmission to the real economy (though specifically influencing birth rates is not something the Bank aims to do as it falls far outside its remit!). Children are expensive, so a change in birth rates plausibly has spill-over effects on consumer spending. In addition to food, clothing, and other daily necessities, many consumer durables purchases (such as a larger vehicle) are prompted by the addition of a child. Indeed, estimates indicate that the average cost of raising a child during their first year is almost £11,000. If families previously saved this money, the additional 15,000 babies in 2009 could have led to up to £130 million in additional spending in 2009 alone. And although this is surely an upper bound, it does not factor in the costs of raising children beyond their first year (the total cost of raising a child is around £230,000). Moreover, our estimates do not capture some of the indirect effects of monetary policy on birth rates such as through employment, wages or house prices.

Our results have implications beyond the impact on aggregate demand. Fluctuations in cohort and school class sizes have meaningful effects on children’s educational attainment and future labour market outcomes. Meanwhile, parenthood affects labour supply decisions and could have knock-on effects for household income and productivity. Changes in birth rates also eventually feed into population dynamics and dependency ratios, which affect the transmission of monetary policy. And there is evidence that changes in dependency ratios can alter the natural rate of interest.

Monetary policy is often thought to operate with long and variable lags, though few economists have considered the link between interest rates and birth rates. During the Great Recession the birth-rate channel of monetary policy actually operated swiftly as people began having more children soon after rates were slashed (that is, about nine months later).

Fergus Cumming works in the Bank’s Monetary Policy Outlook Division and Lisa Dettling works at the Federal Reserve Board.


Degrowth Toward a Steady State Economy: Unifying Non-Growth Movements for Political Impact

Published by Anonymous (not verified) on Thu, 06/02/2020 - 2:02am in

By Brian Czech and Riccardo Mastini 

Limits to Growth and the Environmental Movement

No later than the 1960s, scholars wrote in rigorous terms of the limits to economic growth. Europeans such as E.F. Schumacher, Americans including Herman Daly, and European-born Americans (most notably Nicholas Georgescu-Roegen and Kenneth Boulding) set the stage for later studies in ecological economics and sustainability science. Their scholarship, supplemented by the population focus of Paul Ehrlich and the modeling approach of Donella Meadows and coauthors (for the Club of Rome), resonated with ecologists and opened the eyes of millions of concerned citizens worldwide.

The “limits to growth movement” was allied in effect with the environmental movement of the 1960s and early 1970s. As indicated by the events of the first Earth Day in 1970, the environmental movement had a global aspect and was a major political phenomenon in many countries. It too had its progenitors. In the USA Rachel Carson, Barry Commoner, and David Brower were in the vanguard, and limits to growth were in their academic DNA. They were essentially “economists of nature” who were steeped in the concept of carrying capacity.

The cumulative movement—limits to growth and environmental protection—was characterized by a rapidly mounting concern over destructive economic activity. The critique of growth was therefore accompanied by skepticism about the behavior of corporations. In Europe, especially, the sustainability of capitalism itself was called into question, with or without Marxist leanings.

Although the critique of growth was focused on and in capitalist countries, astute observers noted an obsession with economic growth in socialist and communist countries as well. At the time, the most profound example was the Soviet Union. The Cold War, after all, was waged in terms of GDP, as described in meticulous detail by Robert Collins in More: The Politics of Economic Growth in Postwar America.

While the cumulative movement had some tangible successes, these were primarily of a regulatory nature for specific environmental protections, including clean air policies and the establishment of national parks in the UK and France. Meanwhile in the USA, the Clean Air Act and the Clean Water Act were passed, and the Environmental Protection Agency was established to give the legislation teeth. The National Environmental Policy Act also helped to prevent the “sneaking” of environmentally devastating projects into the federal budget without copious public review and discussion.

The Cold War

The Cold War score was kept in GDP and was, therefore, highly unsustainable. (Image CC BY-SA 4.0, Credit: Carlos3653)

Little, on the other hand, was done to actually check the rates of economic growth in Europe or the USA. In fact, virtually nothing was done explicitly to that effect, and hardly anyone aside from Herman Daly even called for it in policy terms.

Perhaps the closest thing to macroeconomic reform was the Endangered Species Act of 1973. In the preamble, the 93rd American Congress found and declared that “species have been rendered extinct as a consequence of economic growth and development…” and went on to provide strict protections for threatened and endangered species. In essence, the Endangered Species Act was an implicit (and unintended, for most legislators) prescription for a steady state economy, albeit a steady state with a long list of species dangling from one last twig on the tree of life (see Czech and Krausman 2001).

The alternatives to growth were always obvious, starting with the opposite of growth; that is, recession, shrinkage, or “degrowth.” In between the two opposites was stability, equilibrium, or what Daly called the “steady state economy.”

Daly vs. Georgescu-Roegen: Less a Debate than a Different Frame of Time

When Daly started advancing the steady state economy as the sustainable alternative to growth, Georgescu-Roegen protested, as he had described in magnificent detail the unrelenting forces of entropy, which eventually brings down any economy on Earth as the sun runs out of hydrogen. But Daly acknowledged as much. Indeed Daly’s steady-state economics was born out of insights derived largely from Georgescu-Roegen, who was Daly’s Ph.D. advisor at Vanderbilt.

The contrast between Daly’s steady-state emphasis and Georgescu-Roegen’s entropy focus was hardly a political debate with policy implications. Instead it was theoretical and philosophical, applying primarily to the longest of long terms, not policy-relevant planning terms. Daly’s favorite metaphor of a long-term economy was a candle. The candle must first be lit, then will burn, and eventually must die out. The candle’s “production” can approximate a steady state for all but the lighting and the dying.

Unfortunately, however, the global economy was starting to look like a Roman candle with a suddenly vulnerable wick. Non-renewable resources—or “natural capital” stocks—were being liquidated, and the economy would have to recede to a level sustainable with renewable resources. This was a matter of common sense, yet the laws of thermodynamics were required to refute the notions of neoclassical economists who believed in perpetual substitutability of resources in an ever-growing economy.

There was a sort of middle ground: Within limits, additional mastery over the use of renewables could take up some of the slack as non-renewables were liquidated. Also during this adjustment phase, recycling of non-renewables would still be economic. An emphasis on efficiency has found renewed vigor with visions of a “circular economy.”

A Sustainability Slogan for the 21st Century: Clear, Accurate, and Policy-Relevant

Our focus for the current purposes, however, should be less on the technics of growth, degrowth, or the steady state economy, and more on the political common ground of degrowth and steady-state movements. The predominantly European “degrowthers” and the predominantly American (and Australian) “steady staters” would all have more cachet, influence, and success if they were united in their efforts to topple economic growth from the pedestal of politics and policy.

Our unified slogan ought not be simply “steady state economy” or “degrowth,” but rather “Degrowth Toward a Steady State Economy.” The slogan is perfectly clear, charts a path, and readily rolls off the tongue. It passes the test for effective slogans.

The vast majority of tips on communications, rhetoric, and marketing come from the context of business. While we can’t reduce social movements and statesmanship to salesmanship, the basics of effective slogans would seem to apply in all scenarios. Consider for example the “5 Tips for Writing an Effective Slogan” described by Dan Smith of Business Insider.

Smith’s tips 1 and 2 overlap substantially. Tip 1 is, “Highlight a key benefit. The point of a slogan is to differentiate your product or brand from that of your competitors, while also underscoring the company’s general mission.” Tip 2 is, “Explain the company’s commitment… differentiate the company from other competitors.”

How could we possibly explain our commitment more clearly with a handful of words? “Degrowth Toward a Steady State Economy.” This is our vision of sustainability, including environmental protection, economic sustainability, and peace among nations. As for differentiation, in calling for a clear alternative to growth, how could we be more differentiated from Wall Street, the World Bank, and most governments of the world, each of which competes for the macroeconomic vision of the 21st century?

Tip 3. “Keep it short. Slogans should never be longer than a sentence and ideally should hit the sweet spot between six to eight words.”

“Degrowth Toward a Steady State Economy” weighs in at precisely six words comprising eleven syllables.

Tip 4. “Give them a rhythm, rhyme, and ring. A slogan longer than a single word should fulfill at least two of these three criteria.”

Well, there’s only so much you can do with a topic as heavy and demanding as limits to growth. We’re not selling paper towels here (the example provided in Smith’s article). Given the scope of the topic, it’s a relief that “Degrowth Toward a Steady State Economy” contains no problematic phonetics and causes no tongue-twisting. Also, in the context of discussions, articles, or media coverage, after the slogan has been introduced it can be referred to with the shorthand, “degrowth toward a steady state,” which rolls off the tongue more readily yet. For those so inclined, even rhyming is not out of the question. It isn’t difficult to imagine the late Kenneth Boulding quipping, “Degrowth toward a steady state—do it ‘fore it’s way too late.”

Tip 5. “Stay honest. When writing a slogan, it’s extremely easy to get carried away; however, it’s imperative that the slogan accurately reflects the business. In other words, hyperbole is extremely discouraged.” 

How could we be more honest about what “business” we’re in? We’re offering the sustainable alternative to growth, not some dishonest oxymoron such as “green growth” or “sustainable growth.” Nor are we exaggerating with, for example, “degrowth toward Heaven on Earth,” or “degrowth for infinite ecstasy.” We are advocating, quite clearly, for degrowth toward a steady state economy. Why not call it precisely that?

Disharmony Between North American and European Sustainability Advocates?

Herman Daly and Serge Latouche

Herman Daly (left) and Serge Latouche (right), champions of the steady-state and degrowth movements, respectively. (Left Image: credit by Herman Daly; Right image: Image CC BY-SA 3.0, Credit: Niccolò Caranti)

One wonders why “Degrowth Toward a Steady State Economy” hasn’t proliferated already among degrowthers and steady staters. Certainly the connection got off to a promising start in 2002. That’s when Herman Daly and Serge Latouche were honored side by side in Rimini, Italy, each with a Medal of the Italian Government for their groundbreaking work in steady-state and degrowth economics, respectively.

At CASSE, we use “degrowth toward a steady state economy” a lot, especially in speeches and social media, helping to empower the degrowth movement along with steady-state economics. The slogan works perfectly fine in academic articles as well (see for example O’Neill 2012, Sapinski 2015). In 2018 the nascent DegrowUS adopted the mission statement, “Our mission is a democratic and just transition to a smaller, steady state economy in harmony with nature, family, and community.” Yet the phrase “steady state economy” seems glaring in its absence from the European scene today, even in English-speaking venues. We can think of several potential reasons, and here we hypothesize briefly about two.

Might it be, ironically, that Americans from broader sustainability circles are largely responsible? Many elder Americans, especially, still have Cold War sensitivities, whereby the phrase “steady state economy” evokes thoughts of the Soviet Gosplan, the central economic planning apparatus of the Soviet era. Such sensitivities may be largely subconscious, as several generations of Americans were essentially “programmed” into fear or loathing of the Soviet Union and, by association, central planning of economic activity. Self-aware scholars and sustainability leaders, while themselves long past the Cold War, may strongly suspect—perhaps correctly—that much of the American philanthropy community (which tends to be elderly by its nature) would not cotton to the phrase.

Avoidance of the phrase “steady state economy” for fear of being politically marginalized (and losing out on grant money in academia and the non-profit sector) is understandable, but it hasn’t been helpful for advancing the steady state economy, much less degrowth, in politics and policy. If only American leaders in environmental protection, economic sustainability, and international diplomacy had spent some time sharpening their steady-state rhetoric over the past five decades, “steady state economy” would be far closer to vernacular. Only when explicit discussion of the steady state economy is in the vernacular can we expect American policy reforms conducive to degrowth toward a steady state economy.

The second hypothesis pertains to a small but vocal group of Marxists from several continents who have stubbed a collective toe on the work of Herman Daly. Daly has acknowledged the relative efficiency of markets for allocating a very specific and limited set of goods; namely “rival and excludable goods” (basically the small stuff such as boots and tin cans), and definitely not public goods and services (the big stuff such as environmental protection and national defense). Daly has also proposed solutions that entail tightly regulated market mechanisms, such as cap-and-trade systems conducive to sustainable scale, just distribution, and efficient allocation. Furthermore, Daly and generations of students, including textbook co-author Josh Farley, have recognized in detail the types and sources of market failure, even among the widget sectors (see for example Daly and Farley 2010).

Despite Daly’s careful, nuanced, and discerning assessment of markets and market-like mechanisms, the handful of vocal reactionaries seem to view him as an apologist for laissez-faire capitalism! This incredibly ironic misinterpretation of Daly’s life and work has furthermore led additional folks to overlook, ignore, or even object to steady-state economics itself, the highlight of which is, of course, the steady state economy as macroeconomic goal. Steady-state economics might be the biggest baby to ever be tossed with any bathwaters.

Degrowth conference

2014 Degrowth Conference, Leipzig University. Hundreds of degrowthers have signed the CASSE position on economic growth at degrowth conferences since the beginning of the movement. (Image CC BY-SA 3.0 DE, Credit: Eva Mahnke)

To the extent that sustainability advocates are misled into thinking of Daly—and even all of steady-state economics—as a capitalist enemy instead of a perfectly natural ally, it cripples the collective non-growth movement.

Coming Full Circle

Whenever a question arises about the macroeconomics of sustainability, it behooves us to consider the three basic alternatives: growth, degrowth, and the steady state economy. Neither growth nor degrowth is sustainable in the long run. This is most obvious in the case of degrowth. Meanwhile, the full body of work by Herman Daly, CASSE, and our many friends and colleagues in ecological economics (not always well-represented in Ecological Economics) makes it obvious enough regarding growth as well. This leaves the steady state economy as the sustainable alternative.

But what if—as indeed is clearly the case—the present economy has already grown too large for sustainability, much less optimality? (Think especially of American, European, and global economies.) Well, that brings us full circle:

No later than the 1960s, scholars wrote in rigorous terms of the limits to economic growth. Europeans such as E.F. Schumacher, Americans including Herman Daly, and European-born Americans (most notably Nicholas Georgescu-Roegen and Kenneth Boulding) set the stage…


Literature Cited

Collins, R.M. 2000. More: The politics of economic growth in postwar America. Oxford University Press, Oxford, U.K. 299pp.

Czech, B., and P.R. Krausman. 2001. The Endangered Species Act: History, conservation biology, and public policy. Johns Hopkins University Press. 212pp.

Daly, H.E., and J. Farley. 2010. Ecological economics: principles and applications. Second edition. Island Press, Washington, DC. 544pp.

O’Neill, D.W. 2012. Measuring progress in the degrowth transition to a steady state economy. Ecological Economics 84:221-231.

Sapinski, J.P. 2015. Climate capitalism and the global corporate elite network. Environmental Sociology 1(4):268-279.


About the Authors

Brian Czech, Ph.D., is the founder and executive director of the Center for the Advancement of the Steady State Economy. He is the author of three books, Supply Shock, Shoveling Fuel for a Runaway Train, and The Endangered Species Act, as well as more than 50 academic journal articles. He served as a conservation biologist in the headquarters of the U.S. Fish and Wildlife Service from 1999 to 2017 and as a visiting professor of natural resource economics in Virginia Tech’s National Capital Region.


Riccardo Mastini is a Ph.D. candidate at the Autonomous University of Barcelona, where he specializes in ecological economics and political ecology. He is a member of the academic collective Research&Degrowth. Previously with Friends of the Earth Europe, he is also CASSE’s Barcelona Chapter Director.

The post Degrowth Toward a Steady State Economy: Unifying Non-Growth Movements for Political Impact appeared first on Center for the Advancement of the Steady State Economy.

#RethinkMoney – The Greatest Lie Ever Told (Probably)…#TaxAndSpend

GIMMS is delighted to have permission from blogger Duncan Poundcake to reblog his article which was originally posted here


So what have we learned from the General Election of 2019?

Mainly the familiar cry of:

”How will you pay for it?”

”Labour ‘broke the bank”…

”Labour left a note saying  – We have spent all the money”…

Nothing very new in that. We have heard it on a loop for nearly 10 years from many Politicians, Policy Makers, Think Tanks, Economists, the Press and right-wing influencers, that:

  • For Her Majesty’s Government (HMG) to spend is a very bad thing to do.
  • HMG is at the largesse of the Tax Payer and is unable to spend for public purpose. HMG must either – a: Tax and/or b: Borrow before it spends.


  • There is an undefined and finite amount of Sterling that can ever be available in the economy.
  • Once this Sterling threshold has been reached, HMG must borrow back this Sterling from the private sector, to fund its spending.

Even Labour, with its £400bn spending bill, tells us: Tax, Borrow and Spend is the order of the day.

Unfortunately, yet again, Labour miss an opportunity and tell us the polar opposite of the reality…

1. The UK has ALWAYS been a Sovereign Fiat Currency Issuer

HMG has ALWAYS been able to create £s at will, but there have been numerous times when, by circumstance or design, it has been limited in how many Fiat £s it could create.

Since 1971, the UK has been a Sovereign Fiat Currency Issuer, without restriction. In layman’s language, HMG:

  • Has the legal monopoly on the creation (Issue) of its OWN currency.
  • Can create (Issue) Sterling at will, from thin air, with zero impediment.
  • Everyone else is a Currency User.

So why does everyone tell you otherwise?

Time to travel in the Monetary TARDIS…

2. A little bit of History repeating – The Gold Standard (Again):

Image by PublicDomainPictures from Pixabay


Over much of the 20th Century, the UK, US and other developed nations have been on and off variations of the ‘Gold Standard’.
In stark comparison to the economics of the last 40 years, when the Americans and British created the ‘Gold Exchange Standard’ in 1944, their focus was:

  • To avoid trade deals which impoverished lesser trade partners.
  • An attempt to control flows of speculative financial capital.

The latter, in particular, had wrecked the global economy prior to the Great Depression, the outcome of which was seared into their collective memories:

  • A global depression,
  • Mass unemployment.
  • The rise of Fascism in Europe and Communism as a response.
  • Global War.
  • Millions Dead.

Post-War planners aimed to prevent a repetition of the competitive currency devaluations of the 1930s, but without forcing debtor nations to shrink their industrial bases in order to attract financial speculators with high interest rates.

British economic sage, John Maynard Keynes…

Image: John Maynard Keynes, 1st Baron Keynes, by Walter Benington for Elliott & Fry, 1920s © National Portrait Gallery, London (NPG x90117)

again fearful of repeating the mistakes that led to the Great Depression and the carnage that followed, was the primary mover behind Britain’s proposal that Trade Surplus nations should be forced to use their trade surplus for good, or lose it for good:

  • Either import from debtor nations
  • Build factories in debtor nations
  • Donate to debtor nations.

The U.S. opposed Keynes’ plan and proposed creating the International Monetary Fund (IMF) with enough financial clout to counteract destabilising flows of speculative finance. However, in contrast to the modern IMF, the fund would counteract these speculative flows automatically, with no political strings or agendas attached. An honest broker.

History demonstrates that on almost every point where the USA objected, Keynes was to be proved right.

3. Bretton Woods… 

The U.S. Secretary of the Treasury, Henry Morgenthau, Jr., addresses the delegates to the Bretton Woods Monetary Conference, July 8, 1944 (Credit: U.S. Office of War Information in the National Archives).


In 1944, at Bretton Woods, the Allies met to plan a Post-War world and, reflecting the collective conventional wisdom of the time, preferred to do this by regulating a system of fixed exchange rates, indirectly disciplined by binding the USD to Gold at a fixed price per ounce.
This system relied on a regulated market economy with:

  • Strict controls on the values of currencies.
  • Speculative international financial flows stopped by channelling them through Central Banks. #Capital Controls
  • The intention being to direct international flows of investment.
  • A focus on using capital to build useful things that created jobs or benefited the public purpose, rather than on financial speculation in the markets.

Interestingly, it was US planners who coined the phrase ‘Economic Security’, surmising that a liberal international economic system would enhance post-war peace and keep Communism at bay. This came from a belief that the causes of both World Wars lay in ‘Economic Discrimination’ and trade wars. The main culprits were the trade and exchange controls of Nazi Germany and the ‘Imperial Preference System’, under which current or former members of the British Empire were given special trade status, prompting German, French, and American protectionist policies.

*US Planners were shrewd enough to recognise that to keep Capitalism popular, taxpayers and workers needed to see a benefit from it and to feel their lives improving, rather than risk the alternative, Communism. To ensure this, regulated Capitalism was the solution, and the irony is that we have the Cold War to thank for this Golden Age.*

In stark contrast to today, Bretton Woods participants agreed that a liberal international economic system ALSO required governmental intervention.

Following the economic turmoil of the 1930s, the management of economies had become the main activity of governments, which took on increasing responsibility for the economic well-being of their citizens. This had proved to be largely successful and popular. Employment, stability, and growth were the order of the day. In turn, the role of government in the national economy would continue. The Welfare State, which grew out of the Great Depression, had created a popular appetite for governmental intervention in the economy, and it was Keynes who made it clear that Government intervention was required to counter market failures.

Enter the era of State Capitalism…

Members of the Gold Standard agreed to closely regulate the production of their currencies to maintain fixed exchange rates, with a bit of wiggle room either side, the express aim being to make international trade easier. This was the foundation of the U.S. vision of a post-war world, Free Trade:

  • Lowering tariffs.
  • Maintaining a balance of trade, via fixed exchange rates, that assisted Capitalism.
  • Reducing barriers to trade and capital flows.
  • Reviving the Gold Standard (again), using the USD as the world’s reserve currency.
  • Preventing Governments from messing around with their currency supply, as they had between the wars.
  • Governments would be required to monitor the production of their currency and refrain from manipulating its price.

4. Tax & Spend & Borrowing…

It is important at this point to remind ourselves that HMG was still a Fiat Currency Issuer but, up until 1971, had voluntarily limited its ability to create its own currency.

So following Bretton Woods, from 1944 until 1971, Sterling was ‘Convertible On Demand’ into Gold. This required HMG to have lots of Gold stashed away at the Bank of England (BoE) just in case anyone wanted to convert their pot of Sterling into Gold. Indeed, once upon a time, you could walk into the Bank of England with your notes and it was obliged to pay you in Gold.

Like all liabilities, it was worked out on risk. HMG surmised that only a small percentage of the public would ever demand their Sterling be converted into Gold at any given time, so it only had to hold a limited amount of Gold in reserve, just in case. Fractional Gold Reserve Central Banking, if you will.
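That fractional-reserve arithmetic can be sketched with some purely hypothetical numbers (the function name and figures below are illustrative only, not actual BoE practice):

```python
# Illustrative sketch of fractional gold-reserve arithmetic.
# All names and numbers are hypothetical, not actual BoE figures.

def required_reserve(liabilities, expected_conversion_rate, safety_margin=2.0):
    """Gold needed to honour expected conversion demands, with a safety multiple."""
    return liabilities * expected_conversion_rate * safety_margin

# If £1,000m of convertible liabilities are outstanding and at most 5% are
# expected to be presented for conversion at any one time, even a 2x safety
# margin means holding only a fraction of full gold backing:
print(required_reserve(1000, 0.05))  # far less than the full 1,000
```

The point of the sketch is simply that the reserve needed scales with expected redemptions, not with the full stock of liabilities.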

However, because of the rules of the Gold Standard, HMG Currency Issuing (Spending) would be constrained by the amount of gold in the BoE vault.

The other issue HMG was acutely aware of was that spending Sterling for Public Purpose was, in reality, spending the Gold it had in the BoE. The Gold never left the BoE, but the promise of convertibility:

  • Limited how many £s could be spent at any one time.
  • Meant that how many £s could be spent at any one time was dictated by how much Gold was held in reserve.

So if HMG wanted to spend more, it had to:

  • Find more Gold, to allow it to create more Fiat £s, or
  • Recoup Fiat £s from the private sector, i.e. TAXPAYERS – BEFORE it could spend more. Welcome to… ‘Tax and Spend’.

Now to protect all that Gold in the BoE from a profligate Government, just creating Fiat £s to spend, they had a few tricks up their sleeve…

How could a Sovereign Currency Issuing Government, such as HMG with a self-imposed brake (The Gold Standard) on how many £s it can create and issue, spend more £s than it was allowed to create?

The Solution?

BORROWING BACK Fiat £s from the taxpayers’ savings – to spend again – rather than creating and issuing additional new Fiat £s, which might exceed the back-up supply of Gold. The plan being:

  • Get taxpayers to exchange their £ savings for Sovereign Gilts, Treasury Bonds or similar instruments that pay interest.
  • Taxpayers still benefit from HMG spending MORE £s each year than it intends to collect back in tax, allowing them to continue building their wealth in £s.


*One ingenious demonstration of this was the infamous ‘ERNIE’, invented by a Bletchley Park codebreaker in 1956 for Premium Bonds, offering taxpayers another way to save outside of banks or building societies. Which, of course, was not its main purpose: Premium Bonds were just another way to recoup £s from taxpayers without actually taxing. Recycled money.*


And this is exactly how HMG ran Government spending up until the point Richard Nixon suspended US involvement in the Gold Standard in 1971, due to the spiralling cost of the Vietnam War – US Government spending was outstripping its Gold supply – and the US became a Sovereign Fiat Currency Issuer, without restriction.

*As Keynes had predicted in 1944, eventually the USA found itself in the inherent paradox of the Gold Standard:

1. It was required to be the World’s Reserve Currency and, as per the Bretton Woods agreement, keep USDs flowing outwards to keep global trade moving.

2. However, this put a restraint on its ability to spend inwards, domestically.

A large percentage of its Gold Reserves had to be set aside to cover outward flows of USDs, restricting the USDs available to be created for domestic Public Purpose – which at the time of Nixon was Johnson’s ‘Great Society’ project.

Between 1944 and 1971, the US saw its share of total world Gold Reserves shrink from 65% to 22%. The market speculated that the US had so many USDs in circulation that, with its dwindling Gold Reserves, it would be unable to convert them to Gold. The dollar depreciated, inflation went up and unemployment followed suit. With the spiralling spending requirements of the Vietnam War, Nixon saw the solution as suspending convertibility to Gold and going 100% Fiat. No restrictions to USD creation.*

The Gold Standard was effectively dead. The US no longer converted USDs into Gold, other nations bailed out in 1973, and the Gold Standard was officially buried in 1976. The UK followed suit. However, the system for creating and issuing HMG money DID NOT CHANGE and, as I write in 2019, nearly 50 years later, the Government still operates its finances as if it were on the Gold Standard:

  • HMG continues to sell Gilts, Treasury Bonds and Premium Bonds
  • So it can ‘borrow back’ £s from taxpayers
  • To spend MORE than it collects in taxation.

The one upside of this was, and is, a form of Corporate Welfare: exchanging £s for Government IOUs, with the interest received adding to private savings and wealth.

So we have ended up in a position where, in reality, since 1971 HMG has not been revenue-constrained when it comes to spending for Public Purpose, yet it continues to use a monetary system that behaves as if it were still on the Gold Standard.

To reiterate, for clarity, Her Majesty’s Government:

1. Is No Longer on the Gold Standard.

2. Is Not required to convert £s into any commodity to spend.

3. Is Not required to use taxpayers £s to spend.

4. Does not need to borrow or recoup Taxpayers £s savings to spend.

Yet NO Government since 1971 has changed the Gold Standard system to reflect the powers of a Fiat Currency Issuer. So HMG continues to tell us that it needs to:

  • Sell Gilts, Treasury Bonds, Premium Bonds, the Lottery etc.
  • Use the proceeds – taxpayers’ private savings and wealth – to allow it to spend more than it collects in taxes.

Now the rub with this is the interest, or payouts, HMG needs to make to the holders of these instruments, all of which is added to the National Debt. So to pay for it, the Government needs to issue even MORE Gilts, Treasury Bonds etc. to cover the interest payments. Ad infinitum…
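The compounding spiral described above can be shown with a toy calculation (the figures are hypothetical, purely to illustrate the mechanism):

```python
# Toy illustration: debt rolled over by issuing new gilts to cover interest.
# All figures are hypothetical.

def debt_after(years, principal, interest_rate):
    for _ in range(years):
        principal *= (1 + interest_rate)  # new issuance covers the interest due
    return principal

# £100 of 'debt' rolled over at 5% for a decade grows to roughly 163:
print(round(debt_after(10, 100.0, 0.05), 1))
```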


A quick reality check via Quantitative Easing (QE) has shown us that if you were lucky enough to own some of the £454bn of Gilts and Corporate Bonds HMG bought back, then you have become very rich indeed… unlike HMG, which is supposedly falling ever deeper into debt.

Which is a complete MYTH – one that has created 50 years of confusion and a convenient smokescreen for those who see Government as a problem.

Even the BoE concurs: “Read my lips. No new taxes”… see ‘Does the Bank of England print money?’ on YouTube.

5. Enter, Stage Right…AUSTERITY:

Now if you believe all this, unwittingly or otherwise, there is a logic in thinking that an ever-increasing National Debt is unsustainable and the ONLY solution is to substantially REDUCE Government spending and pay down the debt.

However, knowing that Gold Standard limitations no longer apply, HMG has created a solution to a problem that does not exist and, ironically, created a further problem on top of the original one, which never existed in the first place. Think IMF Crisis, 1976.

The National Debt, and the convoluted machinations of issuing ‘debt’ and accounting for it as a brake to stop HMG from issuing more Fiat £s than it could guarantee with Gold, are relics of history. Some would consider this insistence on clinging to an economic fossil to be stupidity, or perhaps a sign of something far more deliberate…

It is, of course, a legal requirement, and not unreasonable, to expect HMG to keep track of its spending. Hence the much-vaunted DEFICIT:

1. The gap between £s out and £s in.

2. A balance sheet of the Fiat £s HMG has decided to spend into the economy but not redeemed in taxation.

When HMG spends, this allows taxpayers to keep £s. When the Government reduces spending, it forces taxpayers to spend out of their savings, REDUCING private wealth. The less the Government pays for, the more you have to use your savings and income.


Questions to be answered…

1. If HMG fell out of the Gold Standard in 1971…

2. Which resulted in the £ no longer being required to be convertible to Gold…

3. Why do we still account for fiat government spending for public purpose as if we were on the Gold Standard?

4. Is it just welfare for taxpayers to exchange their fiat £s into Government Savings Instruments, that pay interest?

As QE has shown us, HMG Debt Instruments are not distributed equally across all taxpayers but are bought by a wealthy private and corporate elite.

Perhaps the most mind-blowing thing for taxpayers to get their heads around is HMG’s ability to PAY OFF the National Debt – at ANY TIME – by purchasing all Debt Instruments in exchange for Fiat £s. Hello Japan…

So far from being ‘Fiscally Prudent’, reducing the Deficit and running Government finances like a household – only spending what is received in taxation – has the real-world outcome of impoverishing taxpayers and their well-being.

6. The solution?

A fundamental shift and an education of all taxpayers and the political establishment, to understand that:

As long as labour and sustainable resources are available, Government spending is not only good and positive but absolutely essential for the economy and the democratisation of wealth.

The ONLY limitations HMG has to spending for Public Purpose are:

1. The physical resources available.

2. The labour available.

3. Its own aspirations.

4. The taxpayers’ willingness to learn and to see through the 1% and their cheerleaders across politics, the media and society, who have used the confusion around Government money creation, spending and taxation for the purpose of wealth extraction and power.

History demonstrates that the Tax and Spend myth has resulted in dire and far-reaching consequences.


In 1976, when HMG went to the IMF claiming to have ‘run out of money’, in return for $2bn Healey was required to introduce Austerity measures – a precursor to the economics of Margaret Thatcher and, latterly, Neo-Liberalism.

There was an alternative, proposed some three years before, yet thanks to Wilson, Healey & Callaghan’s refusal to listen to Tony Benn, history unfolded the way it did and Healey capitulated to Hayek and his Neo-Liberals, who have spent the following 42 years capturing the state, media and democracy in the UK – and beyond – for their own benefit.

Oh and by the way, Britain never did go ‘Bust’, no matter what Mr Roberts writes…
















The post #RethinkMoney – The Greatest Lie Ever Told (Probably)…#TaxAndSpend appeared first on The Gower Initiative for Modern Money Studies.

Build-your-own fancharts in R

Published by Anonymous (not verified) on Tue, 19/11/2019 - 8:00pm in

Andrew Blake

Central banks the world over calculate and plot forecast fancharts as a way of illustrating uncertainty. Explaining the details of how this is done in a single blog post is a big ask, but leveraging free software tools means showing how to go about it isn’t. Each necessary step (getting data, building a model, forecasting with it, creating a fanchart) is shown as R code. In this post, a simple data-coherent model (a vector auto-regression or VAR) is used to forecast US GDP growth and inflation and the resulting fanchart plotted, all in a few self-contained chunks of code.

Why use R? Built for statisticians, it’s becoming increasingly popular with economists. R has a number of obvious advantages: it is free to use, it is well supported by a highly engaged user community, can produce stellar graphics, and allows you to disseminate your work through easy-to-build interactive apps which can be publicised through your own blog. However, R is famously hard to get into and for many – me included – making good use of the tools that comprise the tidyverse can make using it a lot easier. This set of highly-integrated packages is designed as a seamless way to manipulate and visualise data. Let’s build some fancharts.

The code

First load some libraries:

library(quantmod)  # For data access
library(tidyverse) # For data manipulation and plotting, used throughout



US data is easy to get. Below, the getSymbols command from the library quantmod is used to download quarterly values of year-on-year GDP growth and the CPI index (as of 5 November, 2019) from the US Fed’s data repository FRED.

series = c('A191RO1Q156NBEA', 'CPALTT01USQ661S') # FRED codes for US GDP growth and CPI
Growth = getSymbols(series[1], src='FRED', auto.assign=FALSE)
CPI = getSymbols(series[2], src='FRED', auto.assign=FALSE)

The next bit of code stores the data series in Data along with the date, calculates the annual inflation rate and then keeps only what’s necessary.

Data = inner_join(tibble(Date=time(Growth), Growth=coredata(Growth)),
tibble(Date=time(CPI), CPI=coredata(CPI)), by=c("Date")) %>%
mutate(Inflation=100*(CPI/lag(CPI,4)-1)) %>%
select(Date, Growth, Inflation) %>%
drop_na() # Drop missing obs to balance dataset

All good modellers inspect their data, so plot the series using ggplot2, a key part of the tidyverse.

centre_colour = c("seagreen","tomato") # Colours for time series/centre of fancharts
tail_colour = "gray95" # Colour for the tails, used later but defined here
pivot_longer(Data, cols=-Date, names_to="Variables", values_to="Values") %>%
ggplot() +
geom_line(aes(x=Date, y=Values, group=Variables, colour=Variables), size=1.1, show.legend=TRUE) +
scale_colour_manual(values=centre_colour) +
theme_minimal() +
theme(legend.title = element_blank()) +
labs(title="US GDP growth and CPI inflation", x="", y="",
caption=paste0("Source: FRED series ", paste(series, collapse=", ")))

Chart 1: US GDP growth and CPI inflation

These look fine. Let’s build a model.


Excellent packages exist to estimate VARs (such as vars) but the point is to do it from scratch. Algebraically a VAR with m lags is:

Y_t = \beta_0 +\sum_{i=1}^{m} \beta_i Y_{t-i} + \varepsilon_t

where Y_t  is a vector of growth and inflation in each period. Clever use of the tidyverse creates (and names) the required lags of each variable in a similar fashion to Answer 3 to this question on stackoverflow and a constant.

m     = 4  # maximum lag in VAR
Datal = Data %>%
pivot_longer(cols=-Date, names_to="Names", values_to="Values") %>%
mutate(lag_value=list(0:m)) %>%
unnest(cols=lag_value) %>%
group_by(Names, lag_value) %>%
mutate(Values=lag(Values, unique(lag_value))) %>%
ungroup() %>%
mutate(Names = if_else(lag_value==0, Names,              # No suffix at lag 0
paste0(Names, "_", str_pad(lag_value, 2, pad="0")))) %>% # All other lags
select(-lag_value) %>%      # Drop the redundant lag index
pivot_wider(names_from=Names, values_from=Values) %>%
slice(-c(1:m)) %>%          # Remove missing lagged initial values
mutate(constant = 1)         # Add column of ones at end

Now select the lagged values (those with a suffix) and constant as explanatory variables and the rest (except for the date) as dependent ones using a regular expression match. These are put in the matrices X and Y respectively.

s = paste(paste0(str_pad(1:m, 2, pad="0"), "$"), collapse="|")
X = data.matrix(select(Datal, matches(paste0(s,"|constant"))))
Y = data.matrix(select(Datal, -matches(paste0(s,"|constant|Date"))))

The VAR is easy to estimate by solving for the unknown \beta’s using:

(bhat = solve(crossprod(X), crossprod(X,Y)))

##                    Growth    Inflation
## Growth_01      1.11860225  0.113957757
## Growth_02     -0.17202036 -0.094868237
## Growth_03     -0.07633554  0.006027886
## Growth_04     -0.06501444  0.052454614
## Inflation_01  -0.22304645  1.273950043
## Inflation_02   0.17785249 -0.327281242
## Inflation_03  -0.08725688  0.122644763
## Inflation_04   0.09025376 -0.110628806
## constant       0.74673184 -0.077816987
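For reference, `solve(crossprod(X), crossprod(X,Y))` is just the least-squares estimator written via the normal equations, since `crossprod(X)` is X′X and `solve(A, b)` solves Ax = b:

```latex
(X'X)\hat{\beta} = X'Y
\quad\Longrightarrow\quad
\hat{\beta} = (X'X)^{-1}X'Y
```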

A nice feature of calculating bhat this way is that it automatically labels the output for ready interpretation. An econometrician would spend some time evaluating the statistical model, but let’s just press ahead.


Simulating the model to calculate the forecasts and the forecast error variances is done in a loop. A first-order representation of the VAR works best, with the small complication that the parameters need to be re-ordered.
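In symbols, the loop below iterates the standard companion-form recursions; a sketch using the same names as the code (A, B, cnst, and the error variance v), where the state stacks the current and lagged values of Y:

```latex
\begin{aligned}
Z_t &= \mathrm{cnst} + A\,Z_{t-1} + B\,\varepsilon_t,
\qquad Z_t = \bigl(Y_t',\, Y_{t-1}',\, \ldots,\, Y_{t-m+1}'\bigr)'\\
\hat{Z}_{k+1} &= \mathrm{cnst} + A\,\hat{Z}_k,
\qquad P_{k+1} = A\,P_k\,A' + B\,v\,B'
\end{aligned}
```

The forecast variances of growth and inflation sit in the first nv entries of the leading diagonal of P, which is exactly what `diag(P)[1:nv]` extracts.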

nv    = ncol(Y) # Number of variables
nf    = 12      # Periods to forecast
nb    = 16      # Periods of back data to plot, used later

v     = crossprod(Y - X %*% bhat)/(nrow(Y)-m*nv-1)            # Calculate error variance
bhat2 = bhat[rep(seq(1,m*nv,m),m) + rep(seq(0,m-1), each=nv),] # Reorder for simulation
A     = rbind(t(bhat2), diag(1,nv*(m-1), nv*m))                # First order form - A
B     = diag(1,nv*m,nv)                                        # First order form - B
cnst  = c(t(tail(bhat,1)), rep(0,nv*(m-1)))                    # First order constants
# Simulation loop
Yf     = matrix(0,nv*m,nf+1)                # Stores forecasts
Yf[,1] = c(t(tail(Y,m)[m:1,]))              # Lagged data
Pf     = matrix(0,nv,nf+1)                  # Stores variances
P      = matrix(0,nv*m,nv*m)                # First period state covariance

for (k in 1:nf) {
  Yf[,k+1] = cnst + A %*% Yf[,k]
  P        = A %*% P %*% t(A) + B %*% v %*% t(B)
  Pf[,k+1] = diag(P)[1:nv]
}

At the end Yf contains the forecast levels of each variable and Pf the forecast error variances.


There are packages to plot fancharts too. The fanplot package actually has Bank of England fancharts built in but not in the tidyverse, although for the tidy-minded there is ggfan. But it isn’t hard to do it from scratch.

Each forecast fanchart is built up of five shaded areas, with the darkest shade representing an area expected to contain the outcome 30% of the time. Two lighter adjacent areas are the next 15% probability bands below and above the central area, and then another 15% probability bands outside these are shaded lighter still. The edges of these bands are forecast quantiles, evaluated using the values below. Starting at the bottom, each selected forecast quantile is a lower edge of a polygon and next higher quantile the upper edge. The upper coordinates need to be reversed so the perimeter lines join to make the right side of the polygon. Creating this series for each polygon and each variable is done in the code segment below in the curly-bracketed bit {bind_rows(…)}. And as everything in a single data frame is convenient, a last step binds in the historical data.

qu     = c(.05,.2,.35,.65,.8,.95)  # Chosen quantiles ensures 30% of the distribution each colour
nq     = length(qu)
fdates = seq.Date(tail(Data$Date,1), by="quarter", length.out=nf+1) # Forecast dates

forecast_data = tibble(Date     = rep(fdates, 2),
                       Variable = rep(colnames(Data)[-1], each=(nf+1)),
                       Forecast = c(t(Yf[1:nv,])),
                       Variance = c(t(sqrt(Pf)))) %>%
  bind_cols(map(qu, qnorm, .$Forecast, .$Variance)) %>%         # Calculate quantiles
  select(-c("Forecast", "Variance")) %>%
  {bind_rows(select(., -(nq+2)),                                # Drop last quantile
             select(., -3) %>%                                  # Drop first quantile
               arrange(Variable, desc(Date)) %>%                # Reverse order
               rename_at(-(1:2), ~paste0("V",1:(nq-1))) )} %>%  # Shift names of reversed ones
  pivot_longer(cols=-c(Date, Variable), names_to="Area", values_to="Coordinates") %>%
  unite(VarArea, Variable, Area, remove=FALSE) %>%              # Create variable to index polygons
  bind_rows(pivot_longer(tail(Data,nb), cols = -Date, names_to="Variable", values_to="Backdata"), .)

That’s pretty much it. Shaded rectangles made using geom_rect indicate the forecast region, the filled polygons plotted using geom_polygon define the different bands and historical data is added using geom_line. A bit of formatting, apply facet_wrap() and we’re done.

# Band colours 'ramp' from the centre to the tail colour
band_colours = colorRampPalette(c(rbind(tail_colour, centre_colour), tail_colour),
                                space="Lab")(nv*nq+1)[-seq(1, nv*nq+1, nq)]

ggplot(forecast_data) +
  geom_rect(aes(xmin=Date[nv*nb], xmax=max(Date), ymin=-Inf, ymax=Inf), fill=tail_colour, alpha=.2) + 
  geom_polygon(aes(x=Date, y=Coordinates, group=VarArea, fill=VarArea)) +
  scale_fill_manual(values=band_colours) +
  geom_line(aes(x=Date, y=Backdata, group=Variable, colour=Variable)) +
  scale_colour_manual(values=centre_colour) +
  scale_x_date(expand=c(0,0)) +
  theme_minimal() +
  theme(legend.position="none") +
  facet_wrap(~ Variable, ncol=1) +
  labs(title="Forecasts of US GDP growth and CPI inflation",
       subtitle=paste("Quarterly data, annual rates of change, VAR with", m, "lags"),
       caption=paste("Source: FRED series", paste(series, collapse=", ")), x="", y="")

Chart 2: Forecasts of US GDP growth and CPI inflation


These look great but customisation is easy – why not see how BBC style graphics look? (Answer: they look better without the shaded rectangles.) Try experimenting with the model too – see what effect changing the maximum lag has, or maybe add another variable. And use different data. The ONS makes UK data available for download in JSON format. A bit more code is needed than with FRED – this excellent blog post will get you up and running in no time. This blog post will do the same using the DBnomics data portal, which gives access to a ton of data from all over the world.

Central bank forecasts are based on considerably more sophisticated models than the one used here, with much more data and above all incorporating expert judgment. But if this post has whetted your appetite, the code chunks should be enough to get you started. So get copying, pasting, running! (Or just download the code here.) Oh, and this post was completely written in R Markdown. Because you can.

Andrew Blake works in the Centre for Central Banking Studies.

If you want to get in touch, please email us at or leave a comment below.

Comments will only appear once approved by a moderator, and are only published where a full name is supplied. Bank Underground is a blog for Bank of England staff to share views that challenge – or support – prevailing policy orthodoxies. The views expressed here are those of the authors, and are not necessarily those of the Bank of England, or its policy committees.

Bitesize: How far do people move house?

Published by Anonymous (not verified) on Thu, 17/10/2019 - 7:00pm in

Fergus Cumming and Alastair Firrell

When moving house, people often don’t move far. Many will be commuting to the same job or don’t want their kids to change school. But plenty of people do move long-distance when they sell one house and buy another.

Using transaction data on residential mortgages we estimated the distribution of house move distances for 2018. Although England and Wales have good Land Registry data, it is often difficult to understand the links in the chains. We find the distribution of move-distances is remarkably spread out compared to other estimates. The median mortgagor move is 95km: that’s York-Sheffield, or London-Oxford. Nearly a quarter move over 200km.

Figure 1: Number of mortgaged moves in each 20km distance bucket

What’s driving this? The chart below shows that the highest-income households are likely to move shorter distances, perhaps because they are less financially constrained in their employment or spending choices. A regression controlling for important characteristics suggests that, for every £1,000 increase in annual household income, people tend to move about 750m less, all else equal.

Figure 2: Mortgagors with the highest incomes move shorter distances more often and longer distances less often

Even the poorest mortgagors tend to be less financially constrained than the population at large. But it’s not all about income: there are regional differences too.
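To get a feel for the size of the income effect reported above, here is a minimal Python sketch on synthetic data (not the transaction data used in the post), with the true effect set to -0.75km per £1,000 so that OLS should recover roughly the 750m figure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Synthetic data (illustrative only): income in £000s, move distance in km
income = rng.uniform(20, 150, n)
# Build in a true effect of -0.75km per extra £1,000, around a 95km baseline
distance = 95 - 0.75 * (income - income.mean()) + rng.normal(0, 30, n)

# OLS of move distance on income (with a constant)
X = np.column_stack([np.ones(n), income])
beta, *_ = np.linalg.lstsq(X, distance, rcond=None)
print(f"estimated effect: {beta[1] * 1000:+.0f}m per extra £1,000 of income")
```

The coefficient on income, scaled by 1,000, is directly comparable to the "750m less per £1,000" reading in the text.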

This approach has limitations. We only capture mortgage-to-mortgage transactions, so no moves to or from rental, or cash buyers. Therefore we see less about young people (e.g. finding early-career jobs) and older people (e.g. retirement home choices). Our matching algorithm cannot always uniquely identify movers; we take the minimum distance of possible matches. Even so, there are a remarkably high number of
long-distance movers.

Fergus Cumming works in the Bank’s Monetary Policy Outlook Division and Alastair Firrell works in the Bank’s Advanced Analytics Division.


What does population ageing mean for net foreign asset positions?

Published by Anonymous (not verified) on Fri, 11/10/2019 - 7:00pm in

Noëmie Lisack, Rana Sajedi and Gregory Thwaites

How sound is the argument that current account balances are driven by demographics? Our multi-country lifecycle model explains 20% of the variation in observed net foreign asset positions among advanced economies through differences in population age structure. These positions should expand further as countries continue to age at varying speeds.

Persistent current account surpluses and deficits have made the headlines over the past few years. While many have highlighted the role of policy actions that can potentially lead to imbalances, structural characteristics can also explain large external positions in line with countries’ fundamentals. One such characteristic is the population age structure and its evolution. For instance, Germany cites its relatively fast-ageing population as a potential explanation for its large surplus. In 2018, the IMF revamped, among other things, the way that demographics are taken into account in its External Balance Assessment framework.

Is the link between demographics and a country’s external balance theoretically sound? How quantitatively important is this mechanism for explaining observed current accounts, and the resulting accumulation of net foreign asset (NFA) positions, in advanced economies?

Asymmetric ageing across countries

In a previous blog post, we described the effects of ageing in advanced economies highlighting that, as populations get older, wealth and capital increase and hence interest rates decline. This demographic trend is common to all advanced economies, but different countries are ageing at different speeds. While the old-age dependency ratio – the ratio of over 65 year-olds to 20-64 year-olds – is projected to rise above 60% on aggregate, this number reaches 75% in Japan, against 55% in the US, by 2100 (Figure 1).

Figure 1: Old-age dependency ratios across advanced economies

Source: UN Population Statistics (projections based on median-fertility scenario).

Theoretical implications for capital flows

How do these differences affect the current accounts of these open economies? Assuming no frictions in capital flows, one global interest rate would prevail ensuring that household wealth equals capital in the global aggregate economy. However, that will not necessarily imply that country-level domestic wealth and capital are equal. Capital flows can take place if a country’s domestic wealth differs from its capital stock. Put differently, the global interest rate determines the financing cost for firms and hence the capital that they demand; whether all this capital is supplied domestically will depend on the ageing of domestic households relative to the aggregate.

Concretely, consider the US, which is ageing more slowly than the average. Since ageing is putting less upward pressure on domestic US savings, the global real interest rate is below the interest rate that would hypothetically arise were the US a closed economy, all else equal. Hence, at the global interest rate, US household wealth is below US firms’ desired capital level. This leads to capital inflows into the US, as foreign households supply capital to US firms, and hence to a negative NFA position for the US. The opposite is true for a country ageing faster than the average, such as Japan or Germany, which would experience capital outflows leading to a positive NFA position.
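The intuition can be sketched with a deliberately stylised two-country example in Python (all numbers are hypothetical, not model outputs): households in the faster-ageing country supply more wealth at any given interest rate, a single global rate clears the world market, and each country's NFA is simply its wealth minus its capital stock.

```python
# Stylised two-country sketch with hypothetical numbers: a fast-ageing
# country ("JP") supplies more household wealth at any interest rate than
# a slow-ageing one ("US"); firms' capital demand is identical in both.
ageing = {"JP": 1.2, "US": 0.8}          # faster ageing -> more desired wealth

def wealth(country, r):                  # household wealth supplied at rate r
    return 100 * ageing[country] + 500 * r

def capital(r):                          # firms' capital demand at rate r
    return 150 - 500 * r

# The global rate clears the world market: total wealth = total capital
# 200 + 1000r = 300 - 1000r  =>  r* = 0.05
r_star = 0.05
assert abs(sum(wealth(c, r_star) for c in ageing) - 2 * capital(r_star)) < 1e-9

for c in ageing:
    nfa = wealth(c, r_star) - capital(r_star)   # + net lender, - net borrower
    print(c, f"NFA = {nfa:+.1f}")
```

The fast-ageing country ends up with a positive NFA and the slow-ageing one with a negative NFA, and the two positions sum to zero, as world capital flows must.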

From ageing asymmetries to external positions

To quantify this effect, in our working paper, we develop a multi-country overlapping-generations model, solved separately for each country. All countries are considered symmetric, except for differences in their population age structure over time. Firms and households in each country take the global interest rate path as given to decide how much capital to use and how much to save, respectively. This setup gives us a model-implied NFA position over time for each advanced economy, which is driven only by demographic differences.

Comparing the 2015 model-implied NFA/GDP ratio to the data gives a broad idea of the relevance of demographics for external positions. The variation in model-implied NFA positions across advanced economies captures around 20% of the variation observed in the data (Figure 2). Demographic differences thus play a significant role in determining NFA positions, though naturally there are other important factors as well. The model clearly does not capture the large liabilities of Portugal, Ireland, Greece and Spain, which in 2015 were driven by cyclical and fiscal factors, nor the large assets of Norway, driven by their position as an oil exporter. The fitted line is also slightly shallower than the 45-degree line, showing that the model tends to imply larger NFA positions than in the data, reflecting the existence of capital flow frictions in the real world that are not captured in the model.

Figure 2: Model-implied vs. observed net foreign assets (%GDP)

Note: NFA is Net Foreign Assets, grey line is the 45-degree line. Source: IMF IFS.

Expanding future external positions

Using the UN projections for demographic trends, our model can also give us predictions for external positions in the future. For this exercise, we capture the degree of ageing using the high-wealth ratio (HWR) – the ratio of over 50 year-olds to 20-49 year-olds. This ratio is most relevant for the effect of ageing on household wealth, because it measures the ratio of households in the ‘high-wealth’ period of life relative to those in the ‘low-wealth’ period.

We consider how the model-implied NFA positions change as countries age (Figure 3). In line with the intuition laid out above, countries with more advanced ageing have higher NFA positions and countries with less advanced ageing have larger negative NFA positions. Going from 2015 to 2030, the HWR rises in all countries, as they all age. As this happens, the model predicts an increasing dispersion of NFA positions as countries continue to age at different speeds. We would therefore expect a large amount of capital flows between countries and widening external positions in the future.

Figure 3: Model-implied net foreign assets vs high-wealth ratio, 2015 and 2030

Note: High-Wealth Ratio: ratio of over 50 year-olds to 20-49 year-olds. The fitted lines use all 23 advanced economies in our sample.

Overall, large NFA positions do not necessarily mean large imbalances, after taking into account country-specific structural factors. Indeed, our model shows that demographics can be a significant factor driving persistently large external positions. Based on the model, the impact of population ageing is expected to persist and even expand over time, as population age structures are expected to further diverge.

This post was also published in French on the Banque de France’s blog Bloc-notes Éco.

Rana Sajedi works in the Bank’s research hub, Noëmie Lisack is a research economist at the Banque de France and Gregory Thwaites is Research Director at Founders Pledge.


Perceiving risk wrong: what happens when markets misprice risk?

Published by Anonymous (not verified) on Wed, 02/10/2019 - 6:00pm in

Kristina Bluwstein and Julieta Yung

Financial markets provide insightful information about the level of risk in the economy. However, sometimes market participants might be driven more by their perceptions than by any fundamental change in risk. In a recent Staff Working Paper we study the effect of changes in risk perceptions that can lead to a mispricing of risk. We find that when agents over-price risk, banks adjust their lending policies, which can lead to depressed investment and output. On the other hand, when agents under-price risk, excessive lending creates a ‘bad’ credit boom that can lead to a severe recession once sentiment is reversed.

Risk perceptions and the term premium

Risk perceptions are often highly subjective and notoriously hard to quantify, yet can have sizable effects on the economy. To measure movements in bond markets that are not driven by changes in fundamentals, we turn to something called the term premium. The term premium is the compensation that investors require in order to hold a bond that exposes them to duration risk instead of holding a consecutive series of short-term assets for the same amount of time. As such, movements in the price of a long-term bond can reflect (i) changes in the ‘expectations’ component, as we learn new information about the state of the economy, or (ii) changes in the compensation for duration risk, the term premium component.
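As a rough sketch of that decomposition (all numbers here are hypothetical, and we use the common approximation that the expectations component of a long zero-coupon yield is the average of expected future short rates):

```python
import numpy as np

# Hypothetical expected path of the short rate over the next 10 years (%)
expected_short = np.array([0.5, 0.75, 1.0, 1.25, 1.5, 1.75, 2.0, 2.0, 2.0, 2.0])
observed_10y = 2.2                       # assumed observed 10-year yield (%)

# Expectations component: (approximately) the average expected short rate
expectations = expected_short.mean()     # = 1.475
term_premium = observed_10y - expectations
print(f"expectations: {expectations:.3f}%, term premium: {term_premium:.3f}%")
```

Anything in the long yield not explained by the expected short-rate path is, by this accounting, compensation for duration risk.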

Figure 1 shows the term premium corresponding to a 10-year U.S. Treasury yield, as estimated by the New York Fed. The figure indicates that the term premium varies over time, and has generally been positive over the past six decades, with an average of around 1.6%. However, the term premium has been trending downward since the mid-1980s, with a value close to zero in the most recent years prompting questions on whether duration risk is being mispriced by the markets.

Importantly, the term premium is highly correlated with other measures of financial market risk, interest rate volatility, economic policy uncertainty and inflation uncertainty, and can therefore serve as a useful indicator of financial market participants’ risk perception.

Figure 1: The U.S. Term Premium

What do the data say?

To understand whether risk perception shocks can affect not only bond markets but also economic activity, we collect monthly data on bank lending (loans to the private and public sector) and the economy (output, inflation and interest rates), and using a vector autoregressive model we explore the role of term premium shocks. We find that a 90 basis-point increase in the term premium is associated with a 2% decline in short-term commercial loans granted by banks (see Figure 2). This effect is even more persistent than the response of output, with the median response only returning to the steady state after roughly five years. Interestingly, banks increase their long-term lending to the public sector by more than 4% in response to the same shock, suggesting a portfolio rebalancing strategy following heightened risk perceptions.

Figure 2: Effect of a Term Premium Shock on Growth (based on empirical model)

(a) Short-Term Loans to the Commercial Sector

(b) Long-Term Loans to the Public Sector

(c) Annual Output Growth

Although we establish a relationship between the term premium and the economy, a caveat of this analysis is that it remains agnostic on the mechanisms leading to the effects we observe in the data. In order to investigate the channels through which risk perceptions impact bank lending and the economy, we need a structural model to help quantify the different mechanisms at play.

What does the model say?

There are currently only a few structural models that allow us to combine banking, asset pricing, and expectations. We therefore develop a new dynamic stochastic general equilibrium (DSGE) model to study the role of risk perceptions in shaping banks’ decisions to tighten bank lending, while maintaining econometric tractability. In this framework, households, entrepreneurs, banks, the central bank and the government make choices to achieve their objectives (maximise utility, stabilise the economy, increase profits). Figure 3 illustrates how all sectors in the economy interact with one another, providing us with a simplistic but intuitive representation of the channels through which financial markets influence the economy and vice versa.

Figure 3: Bank Lending in the DSGE Model

Our model has three important features. First, we introduce a more realistic financial sector, in which short-term rates are set by the central bank according to inflation and economic growth conditions, while long-term bonds issued by the government are priced by investors following general asset pricing rules. Banks take this into account when choosing how much and at what cost to lend to the private or public sector, influencing the availability of credit in the economy.

Second, we introduce a feedback loop between the financial sector and the economy via the term premium. That is, when agents suddenly perceive more/less risk in the economy, they misprice their compensation for risk exposure which leads to a change in the term premium. That change in the term premium influences the pricing of long-term bonds, thus affecting financial markets. We remain agnostic about what drives the unexpected shock in risk perception that allows the term premium to change. However, pessimistic expectations about the future or lower risk tolerance could lead to an over-pricing of risk in the form of a higher term premium. The opposite holds if agents under-price the actual underlying risk due to eg exuberant expectations, which would translate into a lower term premium.

Third, our model can explain salient features of both the economy and the financial sector, a notoriously difficult task given that the economy is slow-moving while financial markets are more volatile. Our model therefore captures important dynamics that describe the way in which the economy operates at an aggregate level.

We find that when financial market participants ‘panic’, ie their perception of risk increases without any change in macroeconomic fundamentals, banks restrict their lending to the private sector and increase their holdings of long-term government bonds, as previously identified in the data (see Figure 4). However, we are now able to identify the mechanisms that link risk perceptions to bank lending, and ultimately the economy.

Figure 4: Effect of a Risk Perception Shock (based on theoretical model)

(a) Short-Term Loans to the Commercial Sector

(b) Long-Term Loans to the Public Sector

(c) Annual Output Growth

In this model, when perceived risk increases, investors seek insurance against bad outcomes and over-price risk. Therefore, their compensation for risk (the term premium) increases, pushing long-term interest rates higher. Elevated perceived risk increases the price at which banks are willing to provide loans to the commercial sector, as they themselves are indifferent between lending short-term to the private sector or long-term to the government. Unlike the government, the private sector needs to provide collateral when it borrows money from the bank. Therefore, with higher costs of borrowing, fewer people can afford to take on a new loan and the total quantity of short-term loans to the private sector drops. As a consequence, investment, and hence real economic activity, drops (Figure 4.c).

Policy implications

Understanding bank lending and the factors that influence the availability of credit is of particular interest for financial stability and monetary policy transmission throughout the business cycle. Using our model, we can simulate different types of credit booms and find that a ‘bad’ credit boom, ie a boom driven by agents under-pricing risk in the economy, is less supportive of economic growth. A ‘good’ credit boom, ie one that is driven by strong economic fundamentals, however, allows for higher investment and consumption, and once over, remains supportive of economic growth.

Kristina Bluwstein works in the Bank’s Macroprudential Strategy and Support Division and Julieta Yung works at Bates College.


Great Expectations: the economic power of news about the future

Published by Anonymous (not verified) on Wed, 11/09/2019 - 6:00pm in

Silvia Miranda-Agrippino, Sinem Hacioglu Hoke and Kristina Bluwstein

Can shifts in beliefs about the future alter the macroeconomic present? This post summarizes our recent working paper, where we combine data on patent applications and survey forecasts to isolate news of potential future technological progress, and study how macroeconomic aggregates respond to it. We find news-induced changes in beliefs to be powerful enough to enable economic expansions, even though different economic agents process these types of news in very different ways. A change in expectations about future improvements in technology can account for about 20% of the variation in current unemployment and aggregate consumption.

Beliefs-driven business cycles

Over the years, different generations of economists have entertained the idea that beliefs, and beliefs alone, could be powerful enough to set the economy on a path to expansions and recessions solely determined by expectations of rosy times being eventually fulfilled, or otherwise. This fascinating view, often referred to as the news-driven business cycles hypothesis, allows economic booms to arise just because economic agents collectively expect some of the fundamental drivers of growth, such as technology, to improve in the future, having received some signal – or news – about it. This wave of optimism sustains more consumption, more investment, and more employment, eventually improving economic conditions in advance of the anticipated technological progress. If, however, technology turns out ex post not to have improved at all, the disillusion, and consequential realization that the boom was in fact disconnected from fundamentals, plants the seed for a subsequent economic contraction. But is this mechanism at all plausible?

Measuring technology news

One of the reasons why the answer to this question has remained elusive, despite the many contributions in the field (see e.g. Beaudry and Portier, 2006, Barsky and Sims, 2011 for alternative views), is that these types of news are spectacularly hard to measure in practice. This has led many to rely heavily on assumptions, sometimes strong, motivated by economic theory, and mostly statistical in nature, in an attempt to isolate the fluctuations in macroeconomic aggregates that could be ascribed to such news. And, as is often the case, different assumptions have led to different conclusions. In our paper, we approach the question from a different and novel perspective. We are able to dispense with the assumptions used in earlier studies by relying instead on a direct measure of technological news that we construct starting from a new dataset of patent applications filed at the US Patents and Trademark Office since 1981. Proceeding in this way grants us two critical advantages. First, it enables us to remain open about the consequences of such news without imposing any patterns on their transmission. Second, it allows us to evaluate to what extent the restrictions imposed in earlier studies find any empirical support.

The reasoning behind our approach is quite intuitive. By their very nature, patent applications constitute a potential promise of future technological improvements. This has long been recognized in the literature, see e.g. Shea (1998). What is new is that we further acknowledge that today’s patents can themselves be the result of past news, or of other concurrent economic factors, and that they measure technological news only with some error. To account for all these aspects, we construct our direct measure of technological news – or an external instrument for the identification of news shocks in econometric parlance – by removing from patent applications the component that correlates (a) with expectations about the macroeconomic outlook that were formed prior to the application filings, as well as (b) with other contemporaneous monetary and tax policy changes that may influence current economic conditions and thus, in turn, individuals’ and firms’ decisions. The resulting instrumental variable is associated with large increases in indices that measure the expected economic importance of technological innovations, and in turn correlate strongly and positively with forward citation counts, a measure of their scientific value.
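To make the residualisation step concrete, here is a stripped-down sketch on synthetic data (not the actual patent or survey series): project patent applications on previously-formed forecasts and contemporaneous policy changes, and keep the residual as the news instrument.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200

# Synthetic stand-ins for the real series (illustrative only)
lagged_forecasts = rng.normal(size=T)    # forecasts formed before the filings
policy_changes = rng.normal(size=T)      # contemporaneous policy shifts
true_news = rng.normal(size=T)           # the unobserved news component

# Patent applications load on all three, plus measurement error
patents = (0.6 * lagged_forecasts + 0.3 * policy_changes + true_news
           + 0.2 * rng.normal(size=T))

# Project patents on the predictable components and keep the residual
X = np.column_stack([np.ones(T), lagged_forecasts, policy_changes])
beta, *_ = np.linalg.lstsq(X, patents, rcond=None)
instrument = patents - X @ beta

# By construction the instrument is orthogonal to the controls,
# but it still tracks the underlying news
print(np.corrcoef(instrument, true_news)[0, 1])   # close to 1
```

The orthogonality to prior expectations and policy variables is exactly what lets the residual be treated as unanticipated news.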

How does news affect productivity…

Armed with our instrument, we set out to study how the aggregate economy reacts to news about technological improvements that may occur sometime in the future. Technological news induces a slow but steady increase in aggregate productivity (Figure 1). The shape, timing and significance of the response are fully consistent with the well-documented fact that technology diffuses slowly through the economy, that it does so following an S-curve, and that while there seem to be some initial positive spillovers, news does not essentially alter productivity for the first few years. This result can be used to ‘test’ the plausibility of the common assumption made in earlier studies that current productivity does not react to news, which constrains the first point in the figure to equal zero.

Figure 1: Response of aggregate Total Factor Productivity to a technology news shock

Note: TFP is the Fernald (2014) series adjusted for inputs utilization. Impulse response function to a technology news shock identified using our external instrument. Maximum response normalized to 1 percentage point. Results are based on a Bayesian Vector Autoregression estimated on quarterly US data and with 4 lags over the sample 1971:1-2016:12.

…and the rest of the economy

Despite the long time that it takes for news to translate into actual meaningful technological improvements, the expectation that it may happen pushes up consumption, investment and, consequentially, output within just a few quarters (Figure 2). While it is clearly the case that the news that we are measuring turns out to be true on average (i.e. TFP eventually rises, see Figure 1), the large asynchronicity in the timing of the peak responses in Figures 1 and 2 suggests that it is beliefs, rather than the actual future improvement in TFP, that play a crucial role in driving the business cycle expansion at short horizons. Our estimates show that they account for close to a fifth of the variation in consumption and employment over periods of 2 to 8 years, the usual definition of a business cycle. In this sense, our results support the view that changes in beliefs can generate business-cycle-type fluctuations.

Figure 2: Response of output and components to a technology news shock

Note: The figure reports the impulse response functions of real GDP, real consumption and real investment to a technology news shock identified using our external instrument. Results are based on a Bayesian Vector Autoregression estimated on quarterly US data with 4 lags over the sample 1971:1-2016:12.

Not everyone in the economy, however, processes these types of news in the same way (Figures 3 and 4). News can be thought of as signals about the future, surrounded by noise. The extent to which economic agents update their forecasts about the future after having received such signals depends on their information processing capacity. The more they are equipped (or indeed willing) to filter out the noise, the more their behavior – an expression of their updated forecasts – will reflect the revealed signal rather than be determined by current economic conditions (see e.g. Coibion and Gorodnichenko, 2015).
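That updating rule is standard signal extraction. As a minimal sketch (the numbers are purely illustrative): the weight an agent places on a noisy signal shrinks as the perceived noise grows.

```python
# Toy signal-extraction rule: a forecast is updated towards a noisy signal
# with a gain that depends on how informative the signal is perceived to be
def updated_forecast(prior, signal, prior_var, noise_var):
    gain = prior_var / (prior_var + noise_var)   # weight on the signal
    return prior + gain * (signal - prior)

# An attentive agent (low perceived noise) moves most of the way...
print(updated_forecast(0.0, 1.0, prior_var=1.0, noise_var=0.25))  # 0.8
# ...an inattentive one (high perceived noise) barely moves
print(updated_forecast(0.0, 1.0, prior_var=1.0, noise_var=4.0))   # 0.2
```

Agents applying different gains to the same news is one simple way to rationalise the very different responses across sectors described below.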

The stock market is quick to price in the expected innovations, and welcomes the news with a buoyant attitude. More generous expected valuations, bigger expected dividends, and higher expected profits are all presumably part of what matters in the response of the stock market.

The central bank, presumably concerned primarily with the gradual fall in aggregate prices, counteracts the expected deflation by shifting towards an accommodative stance that leads to a decline in nominal short-term interest rates. And it does so, one could argue, despite the fact that future expected productivity gains are typically associated with a higher natural rate of interest. In the paper we show how this also translates into a contraction in risk (term) premia, which offers a potential amplification channel for the effect of news.

What really seems to matter for consumers is instead current labor market conditions. Firms shifting towards new technologies, or equivalently temporarily holding up and diverting investments, reduce total hours worked initially. In the paper, we show that wages also shrink over the same period. Both these contractions are tiny, and do not last more than a couple of quarters. Nonetheless, they bite hard on consumers, and are sufficient to worsen their projected outlook and to erode significantly their confidence in the short term. Consumers, as it happens, only seem to believe what they see.

Figure 3: Response of aggregate prices, the monetary authority, and the stock market to a technology news shock

Note: The figure reports the impulse response functions of the GDP deflator, the nominal short-term interest rate and the stock market index (Nasdaq) to a technology news shock identified using our external instrument. Results are based on a Bayesian Vector Autoregression estimated on quarterly US data and with 4 lags over the sample 1971:1-2016:12.

Figure 4: Response of labor market and consumer expectations about unemployment and index of consumer confidence to a technology news shock

Note: The figure reports the impulse response functions of hours worked, consumer expectations about unemployment one year hence, and the University of Michigan consumer confidence indicator to a technology news shock identified using our external instrument. Results are based on a Bayesian Vector Autoregression estimated on quarterly US data and with 4 lags over the sample 1971:1-2016:12.


Besides the direct application to technology news, our work offers valuable insights on the different speeds at which different economic agents – firms, market participants, policy makers and consumers – incorporate signals about the future into their decision making. In turn, this has potentially more general implications for, among other things, the study of aggregate behavior and the design of policies whose implementation relies on the active management of aggregate expectations.

Silvia Miranda-Agrippino and Sinem Hacioglu Hoke work in the Bank’s Monetary and Financial Conditions Division and Kristina Bluwstein works in the Bank’s Macroprudential Strategy and Support Division.


Houses are assets not goods: taking the theory to the UK data

Published by Anonymous (not verified) on Fri, 06/09/2019 - 6:00pm in

John Lewis and Fergus Cumming

In yesterday’s post we argued that housing is an asset, whose value should be determined by the expected future value of rents, rather than a textbook demand and supply for physical dwellings. In this post we develop a simple asset-pricing model, and combine it with data for England and Wales. We find that the rise in real house prices since 2000 can be explained almost entirely by lower interest rates. Increasing scarcity of housing, evidenced by real rental prices and their expected growth, has played a negligible role at the national level.

To infinity and beyond…

A standard framework for pricing assets is the “Dividend Discount Model”. Just as the equilibrium value of a tulip bulb should be the (net present discounted) value of the future flowers it produces, for a house the value is given by rents. More formally:

P_t=R_t+\sum\limits_{\tau=1}^{\infty}\frac{E_t (R_{t+\tau})}{\prod_{T=t+1}^{t+\tau}(1+\rho_T)}

We observe current rents (the first term on the right-hand side) and the path of expected rents (the numerator). For the denominator, the discount rate is the expected future interest rate (observed from yield curves) plus an estimated constant risk premium. Given all this, we can compute what the model says prices should be. Sure, this misses many other things (credit constraints, tax changes etc), but it’s just meant to be a simple model to illustrate the magnitudes of some of these channels rather than a definitive assessment of over/under valuation, or capturing all relevant factors. And it isn’t a forecasting tool.
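A minimal sketch of this pricing formula, with hypothetical inputs (the actual model uses observed rents, survey-based rent expectations and yield-curve discount rates plus an estimated risk premium, and the infinite sum rather than a truncation):

```python
def ddm_price(current_rent, expected_rents, discount_rates):
    """Dividend Discount Model price: current rent plus the sum of
    expected future rents, each divided by the cumulative product of
    (1 + discount rate) up to that horizon."""
    price = current_rent
    cumulative_discount = 1.0
    for expected_rent, rho in zip(expected_rents, discount_rates):
        cumulative_discount *= 1.0 + rho
        price += expected_rent / cumulative_discount
    return price

# Hypothetical example: annual rent of 10,000 growing at 2% a year,
# a flat 5% discount rate, truncated at 100 years.
rents = [10_000 * 1.02**tau for tau in range(1, 101)]
price = ddm_price(10_000, rents, [0.05] * 100)
print(f"{price:,.0f}")
```

With constant rents and a flat rate the sum collapses to a standard annuity formula, which makes the function easy to sanity-check.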

Lower interest rates raise asset prices by increasing the present value of future cash flows. These effects can be powerful, especially when interest rates are already very low. To see this, suppose a contract pays you a pound coin every year forever. The first 20 pound coins are discounted by the prevailing expectations of future interest rates at the appropriate points on the yield curve, and then assume the discount rate is constant at some other value after that. How much would this contract be worth at different points in time?
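The thought experiment can be sketched as follows; the flat rates here are illustrative stand-ins for actual yield curves:

```python
def coin_value(first20_rates, long_rate):
    """Value today of GBP 1 a year forever: the first 20 coins are
    discounted at the (here flat) curve rates, and the rest is a
    perpetuity at long_rate, valued at year 20 and discounted back.
    (With a real curve the tail would be discounted using the product
    of the curve's forward rates rather than the final rate alone.)"""
    first20 = sum(1.0 / (1.0 + first20_rates[t]) ** (t + 1) for t in range(20))
    tail = (1.0 / long_rate) / (1.0 + first20_rates[-1]) ** 20
    return first20 + tail

# With a flat rate r everywhere the stream is a simple perpetuity
# worth exactly 1/r, so the value rises sharply as rates fall:
for r in (0.07, 0.04, 0.01):
    print(f"flat {r:.0%}: GBP {coin_value([r] * 20, r):.2f}")
```

At a flat 7% the stream is worth about £14.29 – close to the £14.20 the post reports for May 1997 – while at a flat 1% it is worth £100.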

The powerful role of discounting rates….

The purple lines on the chart below show the UK forward yield curve each month from January 1999 (darkest) to the present (lightest). Loosely, each curve is the expected annual rate of return on benchmark assets over each of the next 20 years. July 2019’s is red; August 2008’s is in green; and May 1997, the month the Bank of England was granted operational independence, is in blue.

The dots on the charts below decompose the valuations of the coins using these three yield curves, assuming the interest rate in year 20 persists for all future years, and no risk premium. The darker bars show the value contributed by the first 20 coins, the lighter ones the value of the rest.

As rates fall, the value of the coin stream increases from £14.20 in May 1997 to £21.40 in August 2008. The subsequent fall in interest rates to the July 2019 yield curve generates a further near-quadrupling in value to almost £80. The bulk of this rise occurs via dramatic increases in the value of the coins that arrive in more than 20 years’ time.

The chart below explores that sensitivity further. Each line shows the value of the coins assuming a given yield curve for the first 20 years, and then a range of values for the rate after that. The actual value of the 20-year rate at each point is shown by the dots (i.e. the rate used for the light bars above).

In May 1997, coins arriving in the far future are not worth very much because 20+ years of discounting at 7% erodes most of their value. So the blue line is fairly flat: shift your assumption about long rates and the value of the coin flow is virtually unchanged.

In August 2008, it’s a similar story. But fast forward to the July 2019 yield curve and a 1pp change in discount rates beyond 20 years can make an enormous difference to prices.
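The asymmetry comes straight out of the perpetuity arithmetic: the coins arriving after year 20 are worth 1/r_long at year 20, discounted back to today at the curve rate. A sketch with illustrative flat rates (not actual yield curves):

```python
def tail_value(r_curve, r_long):
    """Value today of GBP 1 a year from year 21 onwards: a perpetuity
    at r_long, valued at year 20, then discounted back at r_curve."""
    return (1.0 / r_long) / (1.0 + r_curve) ** 20

# Raising the post-20-year rate by 1pp barely matters in a high-rate
# world but makes an enormous difference in a low-rate one:
for r in (0.07, 0.01):
    print(f"curve {r:.0%}: {tail_value(r, r):.1f} -> {tail_value(r, r + 0.01):.1f}")
```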

This is a problem because ultra-long run interest rate expectations are difficult to measure and not easily captured by financial market instruments. And over the decades very long yields can move around a lot. So in our model below, we switch off the ultra-long run interest rate channel completely by fixing the long-run discount rate to a constant rate of 3.8% (the 2000-2018 sample average) beyond 20 years. That means changes in expected future interest rates up to 20 years ahead (but no further out than that) can affect prices. It’s simple, but captures the belief that investors don’t make large revisions to their ultra-long run interest rate expectations.

What does the model say about house prices in England & Wales?

The black line is the model’s estimated value of average house prices assuming the annual discount rate reverts to its sample average after 20 years. For comparison, the gold line shows the case if the prevailing 20-year rate is extrapolated forward for the rest of time. The gap between the two is fairly small until 2014, when 10+ year rates really started to fall. The red line shows how actual prices evolved, re-based to the same units as the other lines. The red line starts at 70, indicating that actual prices were about 30% lower than the model’s benchmark in January 2000. Overall, the cumulative price growth between 2000 and 2018 matches the model quite closely, though in individual years actual prices sometimes diverge significantly from it.

The drivers of change

The coloured bars below decompose the predicted nominal growth of the black line above into the different components (details in the appendix).

First up, the grey bars show the role of CPI inflation. If house prices rose at the same rate as goods in general, they’d have risen by 50% since 2000. So what explains the remaining 60pp of real house price growth?
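As a back-of-envelope check on that 50% figure, compounding over two decades does the work (the 2.3% average annual inflation rate below is our illustrative assumption, not a number from the post):

```python
# Roughly 2.3% average annual CPI inflation, compounded over the 18
# years from 2000 to 2018, yields about a 50% rise in the general
# price level (the 2.3% figure is an illustrative assumption).
annual_inflation = 0.023
cumulative = (1 + annual_inflation) ** 18 - 1
print(f"{cumulative:.0%}")
```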

Rising real rents (pink bars) only account for a very small amount. Yesterday’s post argued that scarcity of housing should show up in rising rents, so this suggests lack of supply has had very little role to play (similar to Ian Mulheirn’s recent paper). That doesn’t say anything about scarcity relative to other countries, but it does imply that housing hasn’t really got significantly scarcer over the past two decades.

The tiny maroon bars show that the role of expected future rental growth has been negligible. Admittedly, our model takes survey expectations of rental growth 6 months ahead and then assumes it reverts to long-run averages after two years, so it’s hard for this channel to show up much. But we can also cross-check this against actual rents – if rising prices were driven by the belief that rental growth would be permanently stronger than in the past, those expectations weren’t borne out over the sample period.

By far the largest contributor is the lower discount rate (green bars), which accounts for almost all real house price rises since 2000. We completely shut down any role of interest rates beyond 20 years. That’s probably an overly harsh assumption (it’s probably unrealistic to think rates suddenly ping back to our 3.8% constant), but even with this crude way of shutting down discounting effects at long horizons, you can still generate effects that match the observed 60% rise in real house prices.

What about geographical differences?

Even if the aggregate model suggests it’s mainly about lower interest rates, this cannot explain any geographical variation: risk-free rates are the same across the whole country. But all the other variables in our model are available at regional level. So we did the same exercise for the nine English regions and Wales. We group them into four geographical blocs based on similarity of results.

In “The North” (North East + North West + Yorkshire and the Humber), real rents have been declining, pulling down on house prices by about 20% over the sample period.

In “The Middle” (East Midlands + West Midlands + East of England + Wales), real rents have exerted a smaller pull, tapering to near zero by the end of the period.

By contrast, in “The South” (South East + South West), real rents have pushed up on prices, by around 8pp by 2018.

London has seen a similar but smaller contribution from real rents, though the overall magnitude of actual price rises is higher.

So the role of rents in explaining house prices is relatively small in all regions, and the apparent greater scarcity in the “South” has in aggregate terms been offset by less scarcity elsewhere, with little effect on aggregate prices.


In levels terms, house prices are about in line with our model’s estimates, as is the overall rise seen since 2000. The model attributes this primarily to CPI inflation and lower interest rates, even though our approach shuts down the interest-rate channel beyond 20 years. It also says that relative scarcity of housing has played almost no role at the national level since 2000, though it has pushed in opposite directions in different regions.

John Lewis works in the Bank’s Research Hub and Fergus Cumming works in the Bank’s Monetary Policy Outlook Division.
