
Inequality Through the Ages

June 16, 2017


Economists are often interested in inequality as a modern phenomenon. They collect evidence on the distribution of wealth between the rich and the poor, both in the present and over the past two or three centuries (largely since the advent of industrial capitalism). This is important for evaluating and monitoring present-day levels of inequality, for learning about the historical causes and consequences of inequality, and for examining the effects of inequality on economic performance, for example in the form of gross domestic product (GDP) growth.

In a seminar at Pompeu Fabra University (UPF) on 24 May, Prof. Peter Turchin (University of Connecticut, Complexity Science Hub Vienna) invited his audience to consider a broader view. He began by arguing that, over approximately the past 10 million years, human structural inequality has followed a zig-zag pattern. In the first stage, the strongly hierarchical nature of the groups formed by our ancestral primates is likely to have led to high degrees of structural inequality, which remained the case until more recognisable forms of human society emerged.


Source: Presentation by Prof. Peter Turchin, 24 May 2017

Approximately 200,000 to 100,000 years ago (depending on one’s definition of “human”), a large part of humanity was organised into foraging bands, and by 10,000 years ago, into small farming communities. These societies would have been more egalitarian than the social groups of their ancestral primates, due to their increased requirement for cooperation and relatively flat social structure. However, such egalitarian groups rarely grew beyond a typical size of several hundred or at most a few thousand individuals. One explanation for this is that humans can only maintain face-to-face cooperation with around 100 to 200 individuals, and therefore effective cooperation broke down once egalitarian groups grew too large.

To overcome this threshold, human societies required hierarchy. Specifically, adopting a hierarchical structure means that each individual needs to maintain face-to-face links with only his superior and his subordinates, creating a societal unit that can be scaled up indefinitely. Such hierarchical structures, combined with surplus resources generated by advances in agriculture and private property rights, allowed humans to form chiefdoms and archaic states numbering millions of individuals in the past 10,000 years. Due to their hierarchical nature, these societies were also characterised by higher levels of structural inequality, which is evident from a historical record of slavery, human sacrifice, unequal rights for commoners and elites, deification of rulers, and large wealth disparities.
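
Turchin's scaling argument can be made concrete with a little arithmetic: if each member of a hierarchy maintains face-to-face ties only with one superior and a fixed number of direct subordinates, total group size grows geometrically with the number of levels. A minimal sketch (the span of control of 100 is an illustrative assumption, roughly the face-to-face limit mentioned above):

```python
def hierarchy_size(branching: int, depth: int) -> int:
    """Total members of a hierarchy in which each node has `branching`
    direct subordinates, down to `depth` levels below the top."""
    return sum(branching ** level for level in range(depth + 1))

# Each person maintains at most 1 superior + `branching` subordinates,
# yet total size grows geometrically with depth: three levels below a
# ruler with a span of control of 100 already exceed a million members.
for depth in range(1, 5):
    print(depth, hierarchy_size(100, depth))
```

This is why a society of millions needs only a handful of hierarchical levels, while a flat, face-to-face group stalls at a few hundred members.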

When we look at modern societies, two important differences with these archaic states stand out. First, in many instances modern nation states are even larger than the societies described above, with tens or even hundreds of millions of members. Second, although present societies do exhibit varying levels of economic inequality, the severe forms of structural inequality described above have largely disappeared. Moreover, the explicit aim of many modern government structures is to benefit the public at large, for example by codifying human rights and democratic ideals. This raises an important question: how do such pro-social norms become dominant in human societies?


Source: Presentation by Prof. Peter Turchin, 24 May 2017

Prof. Turchin emphasises that the ultrasocial behaviour required to sustain societies of many millions comes at a significant evolutionary cost to the individual members of those societies. For example, volunteering for military service involves a large sacrifice of one individual’s chances of survival for the benefit of genetically unrelated individuals. In view of this, he proposes that the rise of ultrasocial norms can only be explained by an evolutionary mechanism operating between societies.

According to Prof. Turchin, the turning point came with the advent of the Axial age approximately 3,000 years ago. In part due to advances in technology — including the use of horses to travel longer distances, and the increased use of composite bows and iron — military competition between societies intensified. In this environment, the largest and most cohesive societies were likely to prevail, for example because mustering a large army is a collective action problem that requires a very high degree of intrasocietal cooperation.

This meant that evolutionary pressures favoured the selection of societies with prosocial cultures, including those with norms and institutions that constrained rulers in order to promote the public good. This period also saw the gradual disappearance of many structural forms of inequality as societies grew, including human sacrifice, the deification of human rulers, and eventually slavery. At the same time, new world religions, whose central messages often emphasised prosocial norms, started to spread.

Two opposing forces were therefore at play. On the one hand, a society expanding in size needs to increase the depth of its hierarchy to accommodate more individuals, which tends to increase structural inequality. On the other hand, competition between societies favours more cohesive and cooperative societies with lower levels of inequality. With the advent of the Axial age, military pressures meant that the latter force began to dominate the former, ultimately yielding the (relatively) prosocial societies much of the world lives in today.

This hypothesis generates predictions that can be tested against alternative theories. For example, opposing theories could hold that inequality only started to decline in the modern age instead of following a zig-zag pattern over millions of years, that mass religion generates inequality through oppression instead of being prosocial, or that military conflict destroys cooperation and decreases social scale instead of promoting ultrasocial norms. With a view to distinguishing between such rival hypotheses, Prof. Turchin is involved in building a global historical database of cultural evolution, Seshat, with the aim of collating data from diverse sources on the sociopolitical organisation of human societies from the earliest times up to the present.

Ultimately, research undertaken in this field is likely to provide important insights for the inequality debate in economics, as well as other economic issues. For example, if they are correct, the arguments summarised above have implications for development theory and the mechanics of how individual nation states become more successful, prosocial societies. They also have implications for the cooperation required between nation states to address global issues such as climate change.


Turchin, P.  (2015) Ultrasociety: How 10,000 Years of War Made Humans the Greatest Cooperators on Earth. Beresta Books.

Could post-Brexit uncertainty have been predicted?

May 26, 2017

By Cox Bogaards, Marceline Noumoe Feze, Swasti Gupta, Mia Kim Veloso


Almost a year since the UK voted to leave the EU, uncertainty remains elevated, with the UK's Economic Policy Uncertainty Index at historical highs. With Theresa May's snap general election just under two weeks away, the Labour party has narrowed the Conservative lead to five percentage points, which, combined with weak GDP data released yesterday showing growth of only 0.2 per cent in Q1 2017, has driven the pound sterling to a three-week low against the dollar. Given the potentially large repercussions of market sentiment and financial market volatility on the economy as a whole, this series of events has further emphasised the need for policymakers to implement effective forecasting models.

In this analysis, we contribute to ongoing research by assessing whether the uncertainty in the aftermath of the UK’s vote to leave the EU could have been predicted. Using the volatility of the Pound-Euro exchange rate as a measure of risk and uncertainty, we test the performance of one-step ahead forecast models including ARCH, GARCH and rolling variance in explaining the uncertainty that ensued in the aftermath of the Brexit vote.


The UK's referendum on EU membership is a prime example of an event that fuelled financial market volatility and wider uncertainty. On 20th February 2016, UK Prime Minister David Cameron announced the official date of the referendum on whether Britain should remain in the EU, in what was widely seen as one of the biggest political decisions taken by the British government in decades.

HM Treasury's (2016) assessment of the immediate impacts suggested that "a vote to leave would cause an immediate and profound economic shock creating instability and uncertainty", and that in a severe shock scenario the sterling effective exchange rate index could depreciate by as much as 15 per cent. This was echoed in responses to the Centre for Macroeconomics' (CFM) survey (25th February 2016), in which 93 per cent of respondents agreed that the possibility of the UK leaving the EU would lead to increased volatility in financial markets and the broader economy, expressing uncertainty about the post-Brexit world.

Echoing these views, the UK's vote to leave the EU on 23rd June 2016 indeed had significant currency impacts, including a devaluation of the pound and greater volatility. On 27th June 2016, the pound fell to $1.315, its lowest level against the dollar since 1985 and below its "Black Wednesday" value of 1992, when the UK left the ERM.

In this analysis, we assess whether the uncertainty in the aftermath of the UK’s vote to leave the EU could have been predicted. Using the volatility of Pound-Euro exchange rate as a measure of risk and uncertainty, we test the performance of one-step ahead forecast models including ARCH, GARCH and rolling variance. We conduct an out-of-sample forecast based on models using daily data pre-announcement (from 1st January 2010 until 19th February 2016) and test performance against the actual data from 22nd February 2016 to 28th February 2017.
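
The out-of-sample exercise described above can be sketched as a walk-forward loop: at each date, forecast the variance of the next day using only the data available up to that point, then score the forecast against the realised squared return. A minimal sketch, with a synthetic series standing in for the GBP/EUR log returns and the 5-day rolling variance as the forecaster (a fitted ARCH or GARCH forecaster would slot in as `forecast_fn`):

```python
import numpy as np

def one_step_eval(returns, split, forecast_fn):
    """Walk-forward evaluation: for each t >= split, forecast the variance
    of r_t using only data up to t-1, and record the realised squared
    return r_t^2 (a standard, if noisy, volatility proxy)."""
    forecasts, realised = [], []
    for t in range(split, len(returns)):
        forecasts.append(forecast_fn(returns[:t]))
        realised.append(returns[t] ** 2)
    return np.array(forecasts), np.array(realised)

# Baseline forecaster: variance of the most recent 5 daily returns.
rolling5 = lambda history: history[-5:].var()

rng = np.random.default_rng(42)
r = rng.normal(0, 0.01, size=300)   # placeholder for GBP/EUR log returns
fcst, real = one_step_eval(r, split=250, forecast_fn=rolling5)
print(fcst.shape, real.shape)       # 50 out-of-sample forecast/outcome pairs
```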

Descriptive Statistics and Dynamic Properties

As can be seen in Figure 1, the value of the Pound exhibits a general upward trend against the Euro over the majority of our sample. The series peaks at the start of 2016, and begins a sharp downtrend afterwards.  There are several noticeable movements in the exchange rate, which can be traced back to key events, and we can also comment on the volatility of exchange rate returns surrounding these events, as a proxy for the level of uncertainty, shown in Figure 2.

Figure 1: GBP/EUR Exchange Rate


Source: Sveriges Riksbank and authors’ calculations

Notably, over our sample, the pound reached its lowest level against the Euro at €1.10 in March 2010, amid pressure from the European Commission on the UK government to cut spending, along with a bearish housing market in England and Wales. At that point the pound was still recovering from the financial crisis, during which it had been severely affected, almost reaching parity with the Euro at €1.02 in December 2008 – its lowest recorded value since the Euro's inception (Kollewe 2008).

However, from the second half of 2011 the pound began rising against the Euro as the Eurozone debt crisis unfolded. After some fears of a new recession due to consistently weak industrial output, by July 2015 the pound hit a seven-and-a-half-year high against the Euro at €1.44. Volatility over this period remained relatively low, except in the run-up to the May 2015 UK general election.

However, Britain’s vote to leave the EU on 23rd June 2016 raised investors’ concerns about the economic prospects of the UK. In the next 24 hours, the Pound depreciated by 1.5 per cent on the immediate news of the exit vote and by a further 5.5 per cent over the weekend that followed, causing volatility to spike to new record levels as can be seen in Figure 2.

Figure 2: Volatility of GBP/EUR Exchange Rate


Source: Sveriges Riksbank and authors’ calculations

As seen in Figure 1, the GBP-EUR exchange rate series is trending for the majority of the sample. This may reflect non-stationarity, in which case standard asymptotic theory would be violated and shocks would be infinitely persistent. We conduct an Augmented Dickey-Fuller test on the exchange rate and find evidence of non-stationarity, so we proceed by creating daily log returns in order to de-trend the series. Table 1 summarises the first four moments of the log daily returns series, which is stationary.
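
The de-trending step can be sketched as follows; the rates below are hypothetical placeholders for the GBP/EUR series, and the Augmented Dickey-Fuller test itself is available as, e.g., `statsmodels.tsa.stattools.adfuller`:

```python
import numpy as np

# Hypothetical daily GBP/EUR closing rates (illustrative values only).
rates = np.array([1.20, 1.21, 1.19, 1.22, 1.23])

# Log daily returns de-trend the level series: r_t = ln(P_t) - ln(P_{t-1}).
# Unlike the trending level series, the returns fluctuate around zero.
log_returns = np.diff(np.log(rates))
print(log_returns.round(5))
```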


Table 1: Summary Statistics


Source: Sveriges Riksbank and authors’ calculations

The series has a mean close to zero, suggesting that on average the pound neither appreciates nor depreciates against the Euro on a daily basis. There is a slight negative skew and significant kurtosis – almost five times that of the normal distribution, whose kurtosis is three – as depicted in the kernel density plot below. This suggests that the distribution of daily GBP-EUR returns, like many financial time series, exhibits fat tails: a higher probability of extreme changes than under the normal distribution, as would be expected.
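
The fat-tail diagnosis rests on the fourth moment: a kurtosis well above the normal benchmark of three signals a higher probability of extreme moves. A sketch with simulated heavy-tailed returns (a Student-t with five degrees of freedom standing in for the actual GBP-EUR returns):

```python
import numpy as np

rng = np.random.default_rng(0)
r = rng.standard_t(df=5, size=10_000)  # heavy-tailed draws, like FX returns

m = r.mean()
s = r.std()
skew = ((r - m) ** 3).mean() / s ** 3
kurt = ((r - m) ** 4).mean() / s ** 4  # a normal distribution gives 3

# Heavy-tailed data produce kurtosis well above 3.
print(round(skew, 2), round(kurt, 2))
```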

To determine whether there is any dependence in our series, we assess the autocorrelation in the returns. Carrying out a Ljung-Box test using 22 lags, as this corresponds to a month of daily data, we cannot reject the null of no autocorrelation in the returns series, which is confirmed by an inspection of the autocorrelograms. While we find no evidence of dependence in the returns series, we find strong autocorrelations in the absolute and squared returns.

The non-significant ACF and PACF of the returns, combined with the significant ACFs of absolute and squared returns, indicate that the series exhibits ARCH effects. This suggests that the variance of returns changes over time and that there may be volatility clustering. To test this, we conduct an ARCH-LM test using four lags of returns and find that the F-statistic is significant at the 0.05 level.
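
The ARCH-LM test can be sketched directly: regress squared returns on their own lags and compare T·R² with a χ² critical value (the analysis above reports the equivalent F-statistic version). A minimal implementation on simulated ARCH(1) data:

```python
import numpy as np

def arch_lm(returns: np.ndarray, lags: int = 4) -> float:
    """Engle's ARCH-LM statistic: regress r_t^2 on its own `lags` lags.
    LM = T * R^2 is asymptotically chi-squared with `lags` degrees of
    freedom under the null of no ARCH effects."""
    r2 = returns ** 2
    y = r2[lags:]
    X = np.column_stack([np.ones_like(y)] +
                        [r2[lags - k:-k] for k in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r_squared = 1 - resid.var() / y.var()
    return len(y) * r_squared

# Simulated ARCH(1) returns (hypothetical, for illustration only):
rng = np.random.default_rng(1)
h, r = 1.0, []
for _ in range(2000):
    h = 0.5 + 0.5 * (r[-1] ** 2 if r else 0.0)
    r.append(np.sqrt(h) * rng.standard_normal())

# Values above the chi-squared(4) 5% critical value of 9.49 reject
# the null of no ARCH effects.
stat = arch_lm(np.array(r))
print(round(stat, 1))
```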


For the in-sample analysis we proceed using the Box-Jenkins methodology. Given the evidence of ARCH effects and volatility clustering from the ARCH-LM test, and the absence of leverage effects (in line with economic theory for exchange rates), we estimate models that can capture these features: the ARCH(1), ARCH(2) and GARCH(1,1). Estimation of the ARCH(1) suggests low persistence, as captured by α1, and relatively fast mean reversion. The ARCH(2) model generates greater persistence, measured by the sum of α1 and α2, though still not as large as that of the GARCH(1,1) model, measured by the sum of α1 and β, as shown in Table 2.

Table 2: Parameter Estimates


We proceed to forecast using the ARCH(1), as it has the lowest AIC and BIC in-sample, and the GARCH(1,1), which has the most normally distributed residuals, no dependence in absolute levels, and the largest log-likelihood. We compare their performance against a baseline 5-day rolling variance model.
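
The two kinds of forecast can be sketched as simple recursions; the parameter values below are illustrative placeholders, not the estimates from Table 2:

```python
import numpy as np

def garch11_forecast(returns, omega, alpha, beta):
    """One-step-ahead conditional variances h_{t+1} = w + a*r_t^2 + b*h_t.
    Setting beta=0 gives the ARCH(1) case."""
    h = np.empty(len(returns) + 1)
    h[0] = omega / (1 - alpha - beta)      # seed at the unconditional variance
    for t, r in enumerate(returns):
        h[t + 1] = omega + alpha * r ** 2 + beta * h[t]
    return h[1:]

def rolling_variance(returns, window=5):
    """Baseline: variance of the previous `window` daily returns."""
    return np.array([returns[t - window:t].var()
                     for t in range(window, len(returns))])

r = np.array([0.1, -0.2, 0.15, -0.05, 0.3, -0.25, 0.1])
h = garch11_forecast(r, omega=0.01, alpha=0.09, beta=0.89)
rv = rolling_variance(r)
print(h.round(4))
print(rv.round(4))
```

The β term is what lets a GARCH shock fade slowly: yesterday's conditional variance feeds into today's, whereas an ARCH(1) forecast reverts as soon as the large squared return drops out.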

Figure 3 plots the out-of-sample forecasts of the three models (from 22nd February 2016 to 28th February 2017). The ARCH model is able to capture the spike in volatility surrounding the referendum; however, the shock does not persist. In contrast, the effect of this shock in the GARCH model fades more slowly, suggesting that uncertainty persists for a longer time. However, neither model fully captures the magnitude of the spike in volatility. This is in line with Dukich et al.'s (2010) and Miletić's (2014) findings that GARCH models are not able to adequately capture the sudden shifts in volatility associated with shocks.

Figure 3: Volatility forecasts and Squared Returns (5-day Rolling window)


We use two loss functions traditionally employed in the volatility forecasting literature: the quasi-likelihood (QL) loss and the mean-squared error (MSE) loss. QL depends only on the multiplicative forecast error, whereas MSE depends only on the additive forecast error. Of the two, QL is often preferred, as the bias of the MSE is proportional to the square of the true variance, while the bias of QL is independent of the volatility level. As shown in Table 3, the GARCH(1,1) has the lowest QL, while the ARCH(1) and rolling variance perform better on the MSE measure.
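
The two losses can be written down compactly. The QL form below is one common parameterisation, which is zero when the forecast matches the proxy; squared returns serve as the (noisy) volatility proxy, and the forecasts are hypothetical:

```python
import numpy as np

def mse_loss(proxy, forecast):
    """Additive error: depends on the difference proxy - forecast."""
    return np.mean((proxy - forecast) ** 2)

def ql_loss(proxy, forecast):
    """Multiplicative error: depends only on the ratio proxy / forecast,
    so it is insensitive to the overall volatility level."""
    ratio = proxy / forecast
    return np.mean(ratio - np.log(ratio) - 1)

# Squared returns as the volatility proxy; hypothetical variance forecasts:
proxy = np.array([1.0, 4.0, 0.25, 1.0])
fcst = np.array([1.0, 2.0, 0.5, 2.0])
print(mse_loss(proxy, fcst), round(ql_loss(proxy, fcst), 4))
```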

Table 3: QL & MSE


Table 4: Diebold-Mariano Test (w/ 5-day Rolling window)


Employing the Diebold-Mariano (DM) test, we find that the DM statistics are not significant under either the QL or the MSE loss: neither the GARCH nor the ARCH model performs significantly better than the 5-day rolling variance.
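
A simplified DM statistic for one-step-ahead forecasts can be sketched as below. It assumes the loss differential is serially uncorrelated under the null (longer horizons require a HAC long-run variance instead of the plain sample variance), and the per-period losses are hypothetical:

```python
import numpy as np

def diebold_mariano(loss_a, loss_b):
    """DM statistic for equal predictive accuracy of two forecasts.
    d_t = loss_a_t - loss_b_t; the statistic is the standardised mean
    differential, compared with standard normal critical values."""
    d = np.asarray(loss_a) - np.asarray(loss_b)
    t = len(d)
    return d.mean() / np.sqrt(d.var(ddof=1) / t)

# Hypothetical per-period losses for two competing forecasts:
la = np.array([0.5, 0.7, 0.6, 0.9, 0.4, 0.8])
lb = np.array([0.6, 0.6, 0.7, 0.8, 0.5, 0.7])
dm = diebold_mariano(la, lb)
print(round(dm, 3))  # |DM| < 1.96: no significant difference at the 5% level
```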



In this analysis, we tested various models to forecast the volatility of the Pound exchange rate against the Euro in light of the Brexit referendum. In line with Miletić (2014), we find that despite accounting for volatility clustering through ARCH effects, our models do not fully capture volatility during periods of extremely high uncertainty.

We find that the shock to the exchange rate resulted in a large but temporary swing in volatility, which did not persist as long as the GARCH model predicts. In contrast, the ARCH model has very low persistence: while it captures the temporary spike in volatility well, it quickly reverts to the unconditional mean. To the extent that exchange rate volatility measures risk and uncertainty, we might have expected the outcome of Brexit to have a long-term effect on uncertainty. However, we observe that exchange rate volatility after Brexit does not appear significantly higher than before. This suggests either that uncertainty does not persist (unlikely) or that Pound-Euro exchange rate volatility does not fully capture the uncertainty surrounding the future of the UK outside the EU.



Abdalla S.Z.S (2012), “Modelling Exchange Rate Volatility using GARCH Models: Empirical Evidence from Arab Countries”, International Journal of Economics and Finance, 4(3), 216-229

Allen K. and Monaghan A. (2016), "Brexit Fallout – the Economic Impact in Six Key Charts", Guardian News and Media Limited, 8 Jul. 2016. Web. Accessed: March 11, 2017

Brownlees C., Engle R., and Kelly B. (2011), “A Practical Guide to Volatility Forecasting Through Calm and Storm”, The Journal of Risk, 14(2), 3-22.

Centre for Macroeconomics (2016), “Brexit and Financial Market Volatility”. Accessed: March 9, 2017.

Cox, J. (2017) “Pound sterling falls after Labour slashes Tory lead in latest election poll”, Web. Accessed May 26, 2017

Diebold F. X. (2013), "Comparing Predictive Accuracy, Twenty Years Later: A Personal Perspective on the Use and Abuse of Diebold-Mariano Tests"

Dukich J., Kim K.Y., and Lin H.H. (2010), "Modeling Exchange Rates using the GARCH Model"

HM Treasury (2016), “HM Treasury analysis: the immediate economic impact of leaving the EU”, published 23rd May 2016.

Sveriges Riksbank, “Cross Rates” Web. Accessed 16 Feb 2017

Taylor, A. and Taylor, M. (2004), “The Purchasing Power Parity Debate”, Journal of Economic Perspectives, 18(4), 135-158.

Van Dijk, D., and Franses P.H. (2003), “Selecting a Nonlinear Time Series Model Using Weighted Tests of Equal Forecast Accuracy”, Oxford Bulletin of Economics and Statistics, 65, 727–44.

Tani, S. (2017), “Asian companies muddle through Brexit uncertainty” Web. Accessed: May 26, 2017

#ICYMI on the BGSE Data Science blog: Prediction as a Game

May 18, 2017

Prediction as a Game

by Davide Viviano ’17

In this article we provide a general understanding of sequential prediction, with particular attention to adversarial models. The aim is to provide theoretical foundations for the problem and discuss real-life applications…

#ICYMI on the BGSE Data Science blog: RandNLA for LS (Part 2)

May 4, 2017

Randomized Numerical Linear Algebra for Least Squares – Part 2

by Robert Lange ’17

In today's article we are going to introduce the Fast Johnson-Lindenstrauss Transform (FJLT). This result will be the foundation of two very important concepts which speed up the computation of an ε-approximation to the LS objective function and the target vector…

See also Part 1 of this post

10 Years, 5 Leading Economists: The BGSE Anniversary Roundtable (Part 2)

April 25, 2017

Students with Professors Alvin Roth and Christopher Sims

This is the second of two posts reporting on the roundtable discussion that took place on Friday, 31 March as part of the BGSE’s 10th Anniversary Celebrations. The first post focused on the contributions of the first three speakers,  Prof. Richard Blundell, Prof. Matthew Jackson and Prof. Anne Krueger. This post discusses the contributions of the last two speakers, Prof. Alvin Roth (Nobel Laureate 2012) and Prof. Christopher A. Sims (Nobel Laureate 2011) on  “The Practical Influence of Economic Research”.

Prof. Alvin Roth, Stanford University

Prof. Roth began by talking about the practical influence of his own research on matching markets. As Prof. Roth explained, these are markets where prices do not succeed in bringing the demand side and supply side of the market together. For example, while in a commodity market the price determines who will produce the good (those firms that can make profits at the given price) and to whom they will sell it (those with a willingness to pay greater than the price), the same cannot be said for the placement of students in public schools or the placement of new doctors in their first hospital.

Prof. Roth set out how economists’ research in matching markets played a crucial role in the design of more efficient school assignment systems. One problem in assigning learners to public schools, for example, is that parents may be incentivised not to list their first choice of school at the top of the list they submit to the relevant education authority, because they believe that their children’s chances of being assigned to the first choice is slim, and so they list their second or third choice at the top. However, in order to assign children to schools efficiently, the authority needs to know what parents’ true preferences are, and in this case economists can help to design the market such that parents have an incentive to reveal their true preferences.

Similarly, the college admissions process typically involves some congestion, since prospective students apply to many colleges but accept a place at only one. This means colleges face repeated rounds of admission decisions once they are informed that students who were accepted will not take up their places. A more efficient system would have prospective students submit their preferences over all colleges in advance and then be placed in a single round using a central assignment mechanism.
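
The single-round central assignment described here is typically implemented with a deferred-acceptance (Gale-Shapley) mechanism, in which students propose down their preference lists and colleges tentatively hold their best applicants. A minimal sketch with hypothetical students and colleges:

```python
def deferred_acceptance(student_prefs, college_prefs, capacity):
    """Student-proposing deferred acceptance (Gale-Shapley).
    student_prefs: dict student -> ordered list of colleges (best first).
    college_prefs: dict college -> ordered list of students (best first).
    capacity: dict college -> number of seats.
    Returns a stable assignment dict student -> college."""
    rank = {c: {s: i for i, s in enumerate(prefs)}
            for c, prefs in college_prefs.items()}
    next_choice = {s: 0 for s in student_prefs}
    held = {c: [] for c in college_prefs}
    free = list(student_prefs)
    while free:
        s = free.pop()
        if next_choice[s] >= len(student_prefs[s]):
            continue                      # s has exhausted their list
        c = student_prefs[s][next_choice[s]]
        next_choice[s] += 1
        held[c].append(s)
        held[c].sort(key=lambda x: rank[c][x])
        if len(held[c]) > capacity[c]:
            free.append(held[c].pop())    # reject the least-preferred applicant
    return {s: c for c, students in held.items() for s in students}

match = deferred_acceptance(
    student_prefs={"ana": ["X", "Y"], "ben": ["X", "Y"], "eva": ["Y", "X"]},
    college_prefs={"X": ["ben", "ana", "eva"], "Y": ["ana", "eva", "ben"]},
    capacity={"X": 1, "Y": 2},
)
print(match)
```

A key property of this mechanism (and the reason it recurs in Prof. Roth's work) is that it produces a stable match and makes truthfully reporting preferences a dominant strategy for the proposing side.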

Aside from designing more efficient matching markets, Prof. Roth also believes that economists can play an important role in gathering or producing evidence for the policy changes they advocate. As an example, Prof. Roth mentioned economic experiments undertaken in the United States (US), which convinced US policymakers to adopt a new mechanism for assigning new doctors to their first hospital, whereas those policymakers used to be reluctant to accept these theoretical arguments or experimental evidence from other countries.

Finally, Prof. Roth also voiced support for Esther Duflo’s view that economists have an important role to play as “plumbers”. This involves analysing and finding solutions for the ways in which markets do not perform well, due to some unforeseen practical obstacles not accounted for in the relevant economic model. In this respect, Prof. Roth talked about economists’ design of a matching market for kidney donors. Often friends or family members are unable to donate a kidney to a patient due to incompatibility of the donor and recipient, but this problem could be solved through a cross-matching mechanism whereby a donor-recipient pair is matched with another donor-recipient-pair in order to achieve compatibility.

However, in practice doctors often end up rejecting such matches for medical reasons in unforeseen ways. Economists designing this market are therefore faced with a practical problem to solve, namely how to elicit doctors’ preferences in such a way that once a kidney is matched to a patient, the doctor does not ultimately reject the organ for the operation.

Prof. Christopher A. Sims, Princeton University

Prof. Sims' contribution was concerned with developments in his own field of monetary policy, including the 2008/2009 financial crisis. He began by pointing out that many central bank employees (including the heads of central banks) graduated with PhDs in economics, and therefore economics is certain to have a practical influence on monetary policy. In this respect, Prof. Sims believes that the effects of the financial crisis would have been far more serious if Ben Bernanke (and other policymakers) had not relied on the lessons of the Great Depression of the 1930s in implementing dramatically expansionary policies in response to the crisis.

In order to provide further context for his discussion of the financial crisis and the policy response in the United States, Prof. Sims noted that the policy prescriptions of monetarism had been a mistake, and that instead of aiming only at a constant growth rate of money, monetary policy should use interest rates as a tool to smooth fluctuations in economic activity. According to Prof. Sims, this shift in focus did indeed result in increased stability in the period leading up to the financial crisis.

Nevertheless, the overwhelming majority of economists, and in particular those in charge of monetary policy, did not predict the financial crisis, and the question has been asked why they were not able to do so. However, according to Prof. Sims, the role of economists in this instance can be compared to that of seismologists. In the same way that seismologists cannot predict earthquakes, but can only analyse them and provide very broad guidelines on when they might be likely to occur, recessions are inherently unpredictable, and economists should not be expected to predict them in advance.

A further question addressed by Prof. Sims is why the Fed chose to keep interest rates so low in the years leading up to the financial crisis. According to Prof. Sims, the Fed did have a rationale in adopting this policy, namely that it recognised the risks of falling into the kind of trap experienced by Japan since the 1990s, where the economy experiences low growth and the central bank is unable to provide monetary stimulus due to the zero lower bound on the interest rate. Nevertheless, it remains an open question whether the Fed should have better weighed these risks against the risk of creating a crisis through an extended period of low interest rates.

A further interesting possibility raised by Prof. Sims, although he did not address it conclusively,  is whether the post-monetarist stability leading up to the crisis paradoxically served to increase its severity. Here Prof. Sims again used an earthquake metaphor: if engineers design structures that are more resistant to earthquakes, this could allow a society both to construct more buildings, and to construct taller buildings that hold more people. In this way the original technology designed to protect the society against earthquakes could increase the expected damage of a very serious earthquake. In the same way, increased economic stability could have contributed to the deregulation and expansion of the financial sector which amplified the effects of the financial crisis.

Ultimately Prof. Sims expressed the view that economics as a field cannot carry the blame for an event such as the financial crisis. Which monetary policy is appropriate at a given moment is a hard problem, and economists should not carry the blame for attempting to tackle such problems, which are important to analyse, even if they are not completely successful in solving them.

Alum Charlie Thompson (ITFD ’14) uses data science to build a virtual Coachella experience

April 20, 2017

ITFD alum Charlie Thompson ’14 is an R enthusiast who enjoys “tapping into interesting data sets and creating interactive tools and visualizations.” His latest blog post explains how he used cluster analysis to build a Coachella playlist on Spotify:

“Coachella kicks off today, but since I’m not lucky enough to head off into the California desert this year, I did the next best thing: used R to scrape the lineup from the festival’s website and cluster the attending artists based on audio features of their top ten Spotify tracks!”

source: Charlie Thompson



Read the full blog post on his website
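
Charlie's clustering step (done in R with the Spotify API in the original post) can be sketched with a minimal k-means on hypothetical audio features:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centre, then recompute centres.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

# Hypothetical audio features per artist: [danceability, energy]
X = np.array([[0.9, 0.8], [0.85, 0.75], [0.2, 0.3], [0.15, 0.25]])
labels = kmeans(X, k=2)
print(labels)  # the two high-energy artists end up in the same cluster
```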

Charlie shares a bit of his background on his website:

Currently an Analytics Specialist at a tech startup called VideoBlocks, I create models of online customer behavior and manage our A/B testing infrastructure. I previously worked as a Senior Data Analyst for Booz Allen Hamilton, where I developed immigration forecasts for the Department of Homeland Security. I also built RShiny applications for various clients to visualize trends in global disease detection, explore NFL play calling, and cluster MLB pitchers. After grad school I worked as a Research Assistant in the Macroeconomics Department of Banc Sabadell in Spain, measuring price bubbles in the Colombian housing market.

I have an MS in International Trade, Finance, and Development from the Barcelona Graduate School of Economics and a BS in Economics from Gonzaga University. For my Master’s thesis I drafted a policy proposal on primary education reform in Argentina, using cluster analysis to determine the optimal regions to implement the program. I also conducted research in behavioral economics and experimental design, using original surveys and statistical modelling to estimate framing effects and the maximization of employee effort.

Read more about Charlie on his website

10 Years, 5 Leading Economists: The BGSE Anniversary Roundtable (Part 1)

April 17, 2017


On Friday, 31 March, the BGSE played host to a number of Nobel laureates and other leading academics from around the world as part of its 10th Anniversary Celebrations. The first event of the weekend was a roundtable discussion with five eminent academic guests about “The Practical Influence of Economic Research”. This post highlights some of the main points to come out of the contributions of the first three speakers: Prof. Richard Blundell, Prof. Matthew Jackson and Prof. Anne Krueger.

Prof. Richard Blundell, University College London

With the help of attendant BGSE staff, Prof. Blundell overcame a minor hiccup with his microphone to speak on the practical influence of his research in the microeconomics of public policy and tax reform, and argued that the evidence economists present can have an important impact on government policy. As an example, he referred to the Mirrlees Review, which was produced under the auspices of the UK's Institute for Fiscal Studies (IFS), and published in 2011, with the aim to "identify the characteristics of a good tax system for any open developed economy in the 21st century, assess the extent to which the UK tax system conforms to these ideals, and recommend how it might realistically be reformed in that direction."

According to Prof. Blundell, the Mirrlees Review has been successful in identifying flaws in the UK tax system (and those of other countries), such as effective marginal tax rates that decrease over certain ranges of income levels, and that differ across different sources of income, such as earned income, self-employment income and dividend income.  Tax benefits for low-income members of the population also tend to be unnecessarily complex and difficult to understand. These aspects of developed economies’ tax systems carry particular weight in the context of increased inequality and decreasing incomes at the lower end of the income distribution.

Prof. Blundell also argued that the Mirrlees Review has had some success in addressing these flaws, referring to the fact that a number of UK lawmakers have accepted some of its core proposals, and that the Review has been widely translated and distributed around the world.

Prof. Matthew O. Jackson, Stanford University

Prof. Jackson started his presentation with a question that would be referred to a number of times by other speakers in the contributions that followed: what is (and what should be) the role of economists in society? Prominent economists have offered different definitions of their role since the inception of the field, variously likening the profession to those of artists, ethicists, story-tellers, scientists, engineers and, most recently, plumbers. Prof. Jackson focused mainly on the contrasting characterisation of economics as story-telling (as propounded by Robert Lucas) or as a science.

According to the story-telling view, economists deliberately work in an “unrealistic”, simplified world in order to tell us useful things about the real world using the power of imagination and ideas. In contrast, seeing economists as scientists doing the same kind of work as, for example, physicists, would imply that economists are engaged in discovering universally applicable laws of how markets work, and how firms and consumers make decisions. Ultimately Prof. Jackson highlighted how many of the most exciting recent advances in economics appear to fit better with the characterisation of economists as engineers or plumbers, such as recent developments in market design and development policy.

Prof. Jackson concluded by pointing out the potential practical implications of his own research on economic and social networks, and how modern technological tools can help us to better understand such networks. As an example, he referred to a figure produced using Python that illustrated growing partisanship in the US Senate: drawing connections between senators who voted for the same legislation, it showed how the number of connections crossing party lines between Democrats and Republicans had declined over time.
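The idea behind such a figure can be sketched in a few lines of Python. This is a minimal illustration, not Prof. Jackson's actual analysis: the roll-call votes and party labels below are invented for the example, and the real figure was built from historical Senate voting records.

```python
# Illustrative sketch of the Senate-network idea: senators are nodes,
# and two senators are linked when they vote the same way on a bill.
# All data here is hypothetical, purely for demonstration.
from itertools import combinations

# Hypothetical roll-call votes: bill -> {senator: "yea"/"nay"}
votes = {
    "bill_1": {"A": "yea", "B": "yea", "C": "nay", "D": "yea"},
    "bill_2": {"A": "nay", "B": "yea", "C": "nay", "D": "nay"},
}
# Hypothetical party affiliations (D = Democrat, R = Republican)
party = {"A": "D", "B": "D", "C": "R", "D": "R"}

# Count agreement links, split into same-party and cross-party ties;
# a shrinking cross-party share over time would signal rising partisanship.
same_party, cross_party = 0, 0
for roll in votes.values():
    for s1, s2 in combinations(roll, 2):
        if roll[s1] == roll[s2]:
            if party[s1] == party[s2]:
                same_party += 1
            else:
                cross_party += 1

print(same_party, cross_party)  # 2 same-party ties, 4 cross-party ties
```

Repeating this count for each Congress and plotting the cross-party share over time yields the kind of declining trend the figure displayed.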

Prof. Anne Krueger, The Johns Hopkins University

Prof. Krueger hiakrueger.jpgghlighted two ways in which economists exercise practical influence, namely by providing evidence that influences policy, and by providing blueprints to follow when change happens too fast for appropriate evidence to be gathered.

Regarding the former avenue of influence, Prof. Krueger’s points were in line with those made by Prof. Blundell. Her most important example was India’s use of a small-scale industry (SSI) reservation policy for more than 60 years, through which the Indian government reserved the production of certain goods for firms below a specific employment threshold. Economists ultimately produced convincing evidence that this policy prevented firms in the reserved industries from exploiting economies of scale, thereby rendering them uncompetitive relative to producers of the same goods in other countries. According to Prof. Krueger, this economic evidence helped to convince the Indian government to scale back and ultimately do away with its SSI reservation policy, to the benefit of Indian businesses in the affected industries.

Prof. Krueger made a similar argument concerning the cost of protecting employment through import constraints, referring to a US example in which the cost to consumers of the higher prices caused by import protection was many multiples of the employment income that protection saved. She argued that by attaching dollar figures to these costs, economists could influence lawmakers to adopt better policies.

Finally, Prof. Krueger referred to the 2008 financial crisis as an instance where economists had formulated a blueprint for responding to a rapidly changing situation, partly based on research on the Great Depression. Prof. Krueger argued that this blueprint, which among other things called for large monetary and fiscal stimulus, had probably prevented a more serious recession following the crisis. As a further example, she mentioned the Mexican sovereign debt crisis of 1982, and argued that the structural reforms proposed by economists as a blueprint following the crisis have helped Mexico to achieve a better economic position than it otherwise might have.

Continue with Part 2, which covers presentations by Prof. Alvin Roth and Prof. Christopher Sims.