Media and behavioral response: the case of #BlackLivesMatter

Economics master project by Julie Balitrand, Joseph Buss, Ana Monteiro, Jens Oehlen, and Paul Richter ’19

source: Texas Public Radio

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects. The project is a required component of all Master’s programs at the Barcelona GSE.


We study the effects of the #BlackLivesMatter movement on the law-abiding behavior of African-Americans. First, we derive a conceptual framework to illustrate changes in risk perceptions across different races. Second, we use data from the Illinois Traffic Study Dataset to investigate race ratios in police stops. For identification, we apply a linear probability OLS regression on media coverage as well as an event study framework with specific cases. We find that the number of black people committing traffic law violations is significantly reduced after spikes in media coverage and notable police shootings. In the latter case, we further find that the effect holds for an approximately ten-day period. We argue that these observed changes in driving behavior are a result of updated risk beliefs.

Game Tree. Balitrand et al.


Beginning with our model, we show that media-related changes in risk perceptions cause a change in the proportion of people committing crimes. Using this model, we further predict that this change differs across racial groups. More specifically, it predicts that Blacks became more cautious in order to decrease the chance of a negative interaction with the police. Whites, on the other hand, are predicted not to change their behavior, since the violence in media coverage is not relevant to their driving decisions.

In order to test our model, we develop a hypothesis testing strategy that allows us to disentangle police actions from civilian decisions. By considering the proportion of stopped people who are black at nighttime, we completely remove any effect caused by changes in policing intensity and bias. Instead, we create a testable hypothesis that focuses only on the differences in behavior between racial groups.

To test this hypothesis, we use a linear probability model along with traffic data from Illinois. We test the hypothesis using both an event study approach and media intensity data from the GDELT Project. Both approaches confirm our model’s predictions at high significance levels. We therefore show that Blacks became more cautious in response to these events, compared to other racial groups. In addition, our robustness check on the total number of stops supports the claim that non-blacks do not respond significantly to media coverage of police brutality toward Blacks. This leads to the conclusion that the expected proportion of Blacks breaking traffic laws goes down in response to coverage of these events.
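Mechanically, the identification idea boils down to regressing a binary race indicator per stop on a media-intensity index. A minimal sketch, with entirely made-up numbers (not the Illinois or GDELT data) and a single regressor so closed-form OLS suffices:

```python
# Minimal sketch of a linear probability model (LPM): regress a binary
# outcome (stopped driver is Black: 1/0) on a media-intensity index.
# All numbers below are illustrative, not from the actual data.

def ols_simple(x, y):
    """Closed-form OLS for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# media intensity (coverage index, scaled) and race indicator per stop
media = [0.1, 0.2, 0.5, 0.9, 1.0, 0.3, 0.7, 0.8]
black = [1,   1,   1,   0,   0,   1,   0,   0]   # 1 = stopped driver is Black

a, b = ols_simple(media, black)
print(round(b, 3))  # negative slope: Black share of stops falls with coverage
```

In the paper's actual specification the regression would of course include controls and many more observations; the sketch only shows the shape of the test.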

An implicit assumption in our model is that as media coverage goes to zero, Blacks revert to their original level of caution. To test this, we looked at three-day intervals following each media event. We showed that after approximately 10 days the coefficients are no longer significant, indicating that the media only caused a short-term change in behavior. Since this was a robustness check, and not a main focus of our model, we did not investigate it further. It is an interesting finding and warrants future analysis.

On a final note, we want to address the type of media we use for our analysis. Our model section considers media in a general sense. This can include, but is not limited to, social media platforms such as Twitter and Facebook, as well as more traditional media platforms such as television and print newspapers. All of these sources cover police brutality cases at similar intensities. We use TV data for media intensity, since it affects the broadest demographic and therefore best represents the average driver’s exposure to the topic. Media with different audience age profiles might affect different demographics more or less; for example, social media may have a greater effect on younger drivers than older drivers. We believe this topic warrants further analysis, in addition to the topic of the previous paragraph.

Authors: Julie Balitrand, Joseph Buss, Ana Monteiro, Jens Oehlen, and Paul Richter

About the Barcelona GSE Master’s Program in Economics

Brexit, digital money, and (Super) Mario, oh my!

Fall 2019 roundup of CaixaBank Research by Barcelona GSE alumni


It’s time once again to check in with Barcelona GSE Alumni who are now Economists and Senior Economists at CaixaBank Research in Barcelona. As part of their duties, they regularly publish working papers and reports on a range of topics. Below are some of their latest contributions.

(If you’re a Barcelona GSE alum and you’re also writing about Economics, Finance, or Data Science, let us know where we can find your stuff!)

The «sense and sensibility» of the ECB’s communication

Gabriel L. Ramos ’19 (Finance) and Adrià Morron ’12 (Economics)

Communication is one of the most powerful monetary policy tools. For this reason, CaixaBank Research has developed an index to measure the sentiment of the ECB’s statements. Our ECB sentiment index shows a strong correlation with euro area economic activity indicators and foresees changes in the reference interest rate. The index notes a significant deterioration in ECB sentiment between late 2017 and Q3 2019 and shows how geopolitical uncertainty has affected the ECB’s view of the economic outlook.

The United Kingdom’s potential for Spain after Brexit

Javier Ibañez de Aldecoa ’18 (Economics) with Claudia Canals and Josep Mestres Domènech

In this article, we analyse the extent to which it will be more difficult for Spanish companies to establish relations for international expansion with the United Kingdom following Brexit. We use the CaixaBank Index for Business Internationalisation (CIBI), which classifies foreign countries according to the potential for internationalisation they offer for Spanish companies, and we analyse the impact of the four Brexit scenarios put forward by the Bank of England.

The e-monetary policy of the new digital economy

Adrià Morron ’12 (Economics) and Ricard Murillo ’17 (International Trade, Finance and Development)

Digital technologies permeate the debate on the future of the economy. Monetary policy and its main vehicle, money, are no exception. More and more products are sold over the internet and cash is used less and less. This new digital economy creates new demands on the financial sector and digital money emerges as a new means of payment that appeals to consumers. How does all this affect monetary policy? What can central banks do (and what are they doing) about it?

The farewell of (Super) Mario Draghi

Adrià Morron ’12 (Economics)

Mario Draghi ends his eight-year mandate at the ECB on October 31, leaving the central bank at the cutting edge of monetary policy. Under Draghi’s leadership, the ECB has offered significant support to the recovery of the euro area. However, the latest measures have raised doubts over the margin for action and effectiveness of monetary policy. Christine Lagarde, with a less technical profile but a vision of continuity in monetary policy, will take over in a sombre economic environment in which signs of fragmentation between ECB members have appeared.

Source: CaixaBank Research

Why we need to discuss gender in a different way

Economics ’18 alumni Eva Schoenwald, deputy chair, and Iakov Frizis, editor-in-chief, of the Women in Economics Initiative

Originally posted by the authors on Women in Economics

Recent years have seen significant improvements in female representation in the workplace. Information campaigns, feminist associations, female employment quotas and a rising number of female role models all contribute to an improved gender balance in Western European and US workplaces.

Despite this progress, we remain far from achieving gender balance in the workplace. A significant contributor to the reform slowdown is the emergence of diversity fatigue and inclusion backlash among many companies trying to implement more gender inclusion in the workplace. It becomes increasingly clear that we need to find a way to redefine popular gender discourse if we wish to deliver more inclusion. 

According to the 2018 Global Gender Gap Report, current projections place the closing of the gender gap at 108 years from now. Yet success stories of female economists such as Esther Duflo, Christine Lagarde and Laurence Boone make it easy to cast data aside. They often let us forget about the existence of glass cliffs, implicit gender bias in recruitment and publication processes, pregnancy discrimination, sexual harassment, office favouritism, lack of role models, and restroom gossip, just to name a few. As compelling as success stories might be, they seem not to be bellwethers for reform. 

In the fight against gender discrimination, we face an elusive enemy. A recent International Labour Organisation survey found discrimination and unconscious gender bias to be among the five main challenges for women holding leadership positions. Unconscious bias stems from social norms, values, and experiences that contribute to decision-making. Such bias often manifests itself in an overall masculine corporate culture, along with preconceptions related to social roles and abilities of men and women, and the masculine nature of management positions.

Limited reflection on the effect of unconscious bias towards women in the workplace risks understating the urgency to push for more equality, allowing for a feeling of diversity fatigue to set in. Cundiff and Vescio (2016) show that individuals with strong gender stereotypes are less prone to attribute workplace gender disparities to discrimination. In 2017, James Damore, a Google engineer, unintentionally sided publicly with Cundiff and Vescio when he sued his employer on the grounds of intolerance against individuals holding unpopular political beliefs. The lawsuit came as a response to Google terminating the contract of Mr. Damore, following his drafting of an internal memo in which he argued that female underrepresentation in the tech industry is due to abilities, rather than flagrant discrimination. 

The Google case describes too well the feeling of exhaustion towards diversity and inclusion issues that motivates us to take action. The recent gender inclusion backlash points to a need to revisit how we discuss gender. We should both question the validity of the design of inclusion programmes and acknowledge that we still have a long way to go until we reach equality of opportunity between genders. 

We need to reinvent the way we discuss gender by taking the focus away from high-level gender policies and fairness approaches. Instead, we propose to address gender stereotypes and to develop a strong performance-oriented approach to discussing inclusion. Only by acknowledging that our profession has a gender issue will we be able to revisit this old problem through a new perspective – one that brings together practitioners across both genders, to work towards a more inclusive workplace. 

About the Women in Economics Initiative

Together with some friends, we have recently launched the Women in Economics Initiative (WiE). The Women in Economics Initiative was established to advance gender equality in the field of economics. Our goal is to encourage equal opportunity and a balanced representation of genders in the economics profession across the academic, business and public sectors. To achieve this, we offer a platform that highlights the work of women economists, a network to connect and exchange ideas, and interactive data about the status of diversity in economics.

We are looking for new members, supporters as well as submissions of articles from women economists on their work.

Eva Schoenwald ’18 is a quantitative researcher at Nesta and deputy chair of WiE. She is an alum of the Barcelona GSE Master’s in Economics.


Iakov Frizis ’18 is a senior economist at PwC Luxembourg and editor-in-chief of WiE. He is an alum of the Barcelona GSE Master’s in Economics.


Measuring horizontal inequity in healthcare utilisation

Publication by Mohammad Habibullah Pulok ’12 (HEP)

The first paper from my PhD is out in the European Journal of Health Economics: “Measuring horizontal inequity in healthcare utilisation: a review of methodological developments and debates”

Paper abstract

Equity in healthcare is an overarching goal of many healthcare systems around the world. Empirical studies of equity in healthcare utilisation primarily rely on the horizontal inequity (HI) approach, which measures unequal utilisation of healthcare services by socioeconomic status (SES) for equal medical need. The HI method examines, quantifies, and explains inequity based on regression analysis, the concentration index, and the decomposition technique. However, this method is not without limitations and criticisms, and it has been subject to several methodological challenges in the past decade.

This review presents a summary of the recent developments and debates on various methodological issues and their implications on the assessment of HI in healthcare utilisation. We discuss the key disputes centred on measurement scale of healthcare variables as well as the evolution of the decomposition technique. We also highlight the issues about the choice of variables as the indicator of SES in measuring inequity. This follows a discussion on the application of the longitudinal method and use of administrative data to quantify inequity.

Future research could exploit the potential for health administrative data linked to social data to generate more comprehensive estimates of inequity across the healthcare continuum. This review would be helpful to guide future applied research to examine inequity in healthcare utilisation.
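The concentration index at the heart of the HI approach can be sketched with the standard “convenient covariance” formula, C = 2·cov(h, r)/mean(h), where r is the fractional SES rank. The numbers below are purely illustrative:

```python
# Concentration index via the convenient covariance formula.
def concentration_index(h):
    """h: healthcare utilisation, ordered from poorest to richest individual."""
    n = len(h)
    r = [(i + 0.5) / n for i in range(n)]          # fractional SES rank in [0, 1]
    mh, mr = sum(h) / n, sum(r) / n
    cov = sum((hi - mh) * (ri - mr) for hi, ri in zip(h, r)) / n
    return 2 * cov / mh

print(concentration_index([1, 2, 3, 4, 5]))   # positive: use concentrated among the rich
print(concentration_index([3, 3, 3, 3, 3]))   # zero: no income-related inequality
```

Measuring horizontal *inequity* then additionally requires standardising h for medical need, which is where the regression and decomposition steps discussed in the paper come in.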

About the author


Mohammad Habibullah Pulok ’12 is a post-doc researcher at Dalhousie University in Canada. He is an alum of the Barcelona GSE Master’s in Health Economics and Policy (now EPP).

How Destructive is Innovation?

Publication in Econometrica by Daniel Garcia-Macia ’11 (Economics)


Daniel’s paper “How Destructive is Innovation?” (with Chang-Tai Hsieh & Peter Klenow) has been published in Econometrica (September 2019).

The paper has received media attention in NBER Digest, Chicago Booth Review, Financial Times, and Bloomberg.

Paper abstract

Entrants and incumbents can create new products and displace the products of competitors. Incumbents can also improve their existing products. How much of aggregate productivity growth occurs through each of these channels? Using data from the U.S. Longitudinal Business Database on all nonfarm private businesses from 1983 to 2013, we arrive at three main conclusions: First, most growth appears to come from incumbents. We infer this from the modest employment share of entering firms (defined as those less than 5 years old). Second, most growth seems to occur through improvements of existing varieties rather than creation of brand new varieties. Third, own‐product improvements by incumbents appear to be more important than creative destruction. We infer this because the distribution of job creation and destruction has thinner tails than implied by a model with a dominant role for creative destruction.

Daniel Garcia-Macia ’11 (PhD, Stanford University) is an Economist at the International Monetary Fund. He is an alum of the Barcelona GSE Master’s in Economics.

Small Numbers, Big Concerns: Practices and Organizational Arrangements in Rare Disease Drug Repurposing

Publication by Burcu Kucukkeles ’12 (Economics)

Burcu Kücükkeles (Economics ’12) has published a paper in the Academy of Management Discoveries. In this paper, “Small Numbers, Big Concerns: Practices and Organizational Arrangements in Rare Disease Drug Repurposing,” Burcu and her colleagues looked into the societal challenge of developing drugs for rare diseases (a rare disease is a condition that affects fewer than 200,000 people in the United States or 1 in 2,000 people in the European Union).

By studying the market and government failures in rare diseases and the practices of two nonprofit organizations, Burcu and her colleagues contribute to the Agenda on the Sustainable Development Goals, beyond the implications of their study for the management literature.

Burcu is currently a PhD candidate at the Chair of Strategic Management and Innovation, Department of Management, Technology, and Economics, ETH Zurich. Voice readers are welcome to email her for access to the full paper or with any questions about this research: burcuk [ at ] ethz [. ]ch

Paper Abstract

Due to their small market size, many rare diseases lack treatments. While government incentives exist for the development of drugs for rare diseases, these interventions have yielded insufficient progress. Drawing on an in-depth case study of rare diseases therapies, we explore how the practices of two nonprofit organizations allowed them to circumvent the endemic market and government failures involving positive externalities by using generic drug repurposing—i.e., seeking new therapeutic applications for existing generic drugs. Beyond elucidating the potential of generic drug repurposing for those suffering from rare diseases, our discoveries provide important insights into the mutual constitution of organizational arrangements for societal challenges and the practices they host. By showing how organizational arrangements can both reinforce and extend practices such that they enable practitioners to achieve a standard of excellence, our study advances practice theory and research on the comparative efficacy of alternative organizational arrangements for tackling societal challenges.


Burcu Kucukkeles ’12 is PhD Candidate at ETH Zurich and an alum of the Barcelona GSE Master’s in Economics.

Evaluating the performance of merger simulation using different demand systems

Competition and Market Regulation master project by Leandro Benítez and Ádám Torda ’19

Photo credit: Diego3336 on Flickr

Evaluating the performance of merger simulation using different demand systems: Evidence from the Argentinian beer market

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects. The project is a required component of all Master’s programs at the Barcelona GSE.


This research arises in a context of strong debate on the effectiveness of merger control and how competition authorities assess the potential anticompetitive effects of mergers. In order to contribute to the discussion, we apply merger simulation (the most sophisticated and most often used tool for assessing unilateral effects) to predict the post-merger prices of the AB InBev / SAB-Miller merger in Argentina.

The basic idea of merger simulation is to simulate the post-merger equilibrium from estimated structural parameters of the demand and supply equations. Assuming that firms compete à la Bertrand, we use different discrete choice demand systems (Logit, Nested Logit and Random Coefficients Logit models) in order to test how sensitive the predictions are to changes in the demand specification. Then, to get a measure of the precision of the method, we compare these predictions with actual post-merger prices.
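In the plain-Logit case the mechanics are compact enough to sketch: with multiproduct Bertrand competition, every product of firm f carries the markup 1/(α(1 − S_f)), where S_f is the firm's total share, so simulating a merger just means changing the ownership map and re-solving the fixed point. All parameters below are hypothetical, not our estimates:

```python
import math

def logit_shares(p, delta, alpha):
    """Market shares under plain Logit demand with an outside good (utility 0)."""
    u = [math.exp(d - alpha * pi) for d, pi in zip(delta, p)]
    denom = 1 + sum(u)
    return [ui / denom for ui in u]

def bertrand_prices(cost, delta, alpha, owner, iters=2000):
    """Damped fixed-point iteration on the multiproduct-Logit first-order
    conditions: p_j = c_j + 1 / (alpha * (1 - S_f))."""
    p = list(cost)
    for _ in range(iters):
        s = logit_shares(p, delta, alpha)
        p = [0.5 * p[j] + 0.5 * (cost[j] + 1 / (alpha * (1 - sum(
                 s[k] for k in range(len(s)) if owner[k] == owner[j]))))
             for j in range(len(cost))]
    return p

# three symmetric products (hypothetical costs, qualities, price sensitivity)
cost, delta, alpha = [1.0, 1.0, 1.0], [1.0, 1.0, 1.0], 2.0
pre  = bertrand_prices(cost, delta, alpha, owner=[0, 1, 2])   # three rivals
post = bertrand_prices(cost, delta, alpha, owner=[0, 0, 2])   # products 0, 1 merge
# the merged products' prices rise relative to the pre-merger equilibrium
```

The Nested and Random Coefficients Logit versions replace the share equation and the first-order conditions with richer expressions, but the simulate-by-changing-ownership logic is the same.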

Finally, we point out the importance of post-merger evaluation of the simulation methods applied in complex cases, as well as the advantages and limitations of using this type of demand model.


Merger simulations yield mixed conclusions on the use of different demand models. The Logit model is considered ex ante inappropriate because of its restrictive substitution pattern; however, it performed better than expected. Its predictions were on average close to those of the Random Coefficients Logit model, which should yield the most realistic and precise estimates. Conversely, the Nested Logit model largely overestimated the post-merger prices. This poor performance is mainly driven by the nest configuration: the swap of brands generates two near-monopoly positions, in the standard and low-end segments, for AB InBev and CCU respectively. This, combined with the high correlation of preferences for products in the same nest, generates enhanced price effects.


Regarding the substitution patterns, the Logit, Nested Logit and Random Coefficients Logit models yielded different results. The own-price elasticities are similar for the Logit and Nested Logit models; for the Random Coefficients Logit model they are almost tripled. This is likely driven by the larger estimated price coefficient as well as the standard deviations of the product characteristics. As expected, by construction the Random Coefficients Logit model yielded the most realistic cross-price elasticities.
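For reference, the plain-Logit elasticity formulas behind these comparisons, evaluated at made-up values of α, prices and shares (the Nested and Random Coefficients models generalize these expressions):

```python
# Plain Logit elasticities: e_jj = -alpha * p_j * (1 - s_j),  e_jk = alpha * p_k * s_k
alpha = 2.0             # price coefficient (hypothetical)
p = [1.6, 1.5]          # prices (hypothetical)
s = [0.15, 0.20]        # market shares (hypothetical)

own_price = [-alpha * p[j] * (1 - s[j]) for j in range(2)]
cross_01  = alpha * p[1] * s[1]    # % change in s_0 from a 1% rise in p_1
print(own_price, cross_01)
```

The formula makes the Logit model's restriction visible: the cross-price elasticity depends only on the characteristics of the product whose price changed, so all substitution is proportional to shares.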


Our question of how different discrete choice demand models affect merger simulation (and, by extension, its policy implications) is hard to answer. For the AB InBev / SAB-Miller merger, the Logit and Random Coefficients Logit models predict almost zero changes in prices. Conversely, according to the Nested Logit model, both scenarios were equally harmful to consumers in terms of unilateral effects. However, as mentioned above, given the particular post-merger nest configuration, evaluating this model solely by the precision of its predictions might be misleading. We cannot rule out that it would deliver better predictions under different conditions.


As a concluding remark, we must acknowledge the virtues and limitations of merger simulation. Merger simulation is a useful tool for competition policy, as it gives us the possibility to analyze different hypothetical scenarios, such as approving the merger, imposing conditions, or directly blocking the operation. However, we must take into account that it is still a static analysis framework. By focusing only on pre-merger market information, merger simulation does not consider dynamic factors such as product repositioning, entry and exit, or other external shocks.

Authors: Leandro Benítez and Ádám Torda

About the Barcelona GSE Master’s Program in Competition and Market Regulation

How do firms adjust to rises in the minimum wage? Survey evidence from Central and Eastern Europe

Publication by Nataša T. Jemec ’09 (Economics) and Ludmila Fadejeva ’11 (Macro)

Nataša Todorović Jemec ’09 (Economics) and Ludmila Fadejeva ’11 (Macroeconomic Policy and Financial Markets) have published a paper in the IZA Journal of Labour Policy, together with a few other colleagues from central banks of new EU member states. The paper, “How do firms adjust to rises in the minimum wage? Survey evidence from Central and Eastern Europe,” studies the transmission channels for rises in the minimum wage using a unique firm-level dataset from eight Central and Eastern European countries.

They wrote the publication within the ECB Wage Dynamics Network (WDN). At the time, Nataša and Ludmila were working at the Central Bank of Slovenia and the Central Bank of Latvia respectively, and they were their banks’ representatives in the WDN. Minimum wage increases were a common topic among the new EU member states, so they decided to write a paper based on the data collected through the WDN survey in their countries.

Researchers can use this form to request access to the data of the WDN network which includes many EU countries.

Paper abstract

We study the transmission channels for rises in the minimum wage using a unique firm-level dataset from eight Central and Eastern European countries. Representative samples of firms in each country were asked to evaluate the relevance of a wide range of adjustment channels following specific instances of rises in the minimum wage during the recent post-crisis period. The paper adds to the existing literature by presenting the reactions of firms as a combination of strategies and evaluating the relative importance of those strategies. Our findings suggest that the most popular adjustment channels are cuts in non-labour costs, rises in product prices, and improvements in productivity. Cuts in employment are less popular and occur mostly through reduced hiring rather than direct layoffs. Our study also provides evidence of potential spillover effects that rises in the minimum wage can have on firms without minimum wage workers.

About the authors

Nataša T. Jemec ’09 is a Senior Economist at IMAD. She is an alum of the Barcelona GSE Master’s in Economics.

Ludmila Fadejeva ’11 is a Senior Econometrician at the Bank of Latvia. She is an alum of the Barcelona GSE Master’s in Macroeconomic Policy and Financial Markets.

Forecasting Currency Crises

Macroeconomic master project by Ivana Ganeva and Rana Mohie ’19

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects. The project is a required component of all Master’s programs at the Barcelona GSE.


The question of whether a currency crisis can be predicted beforehand has been discussed in the literature for decades. Economists and econometricians have been trying to develop prediction models that can work as an Early Warning System (EWS) for a currency crisis. The significance of such systems is that they provide policy makers with a valuable tool for tackling economic issues or speculative pressure, and for taking decisions that would prevent these from turning into a crisis. The topic is especially relevant to Emerging Market Economies, whose exchange rates fluctuate more, translating into greater currency crisis risk.

In this paper, we propose an Early Warning System for predicting currency crises that is based on an Artificial Neural Network (ANN) algorithm. The performance of this EWS is then evaluated both in-sample and out-of-sample using a data set of 17 developed and developing countries over the period 1980-2019. The performance of this Neural-Network-based EWS is then compared to two other models that are widely used in the literature. The first is the Probit model, considered the standard model for predicting currency crises, based on Berg and Patillo (1999). The second is a regime-switching prediction model based on Abiad (2006).

Artificial Neural Networks

Artificial Neural Networks (ANNs) are a Machine Learning technique that draws its inspiration from biological nervous systems and the structure of the (human) brain. With recent advances in computing technology, computer scientists have been able to mimic brain functionality with artificial algorithms. This has motivated researchers to use the same functionality to design algorithms that can solve complex and non-linear problems. As a result, ANNs have become a source of inspiration for a large number of techniques across a vast variety of fields. The main financial areas where ANNs are utilized include credit authorisation and screening, financial and economic forecasting, fixed income investments, and prediction of default, bankruptcy, and credit card fraud (Öztemel, 2003).

Main Contributions

1. Machine Learning Techniques:

(a) Using an Artificial Neural Network predictive model based on the multi-layered feed-forward neural network (MLFN), also known as the “Back-propagation Network”, which is one of the most widely used architectures in the financial series neural network literature (Aydin and Savdar, 2015). To the best of our knowledge, this is the first study to use a purely neural network model to forecast currency crises.

(b) Improving the forecast performance of the Neural Network model by allowing the model to be trained on (learn from) the data of other countries in the same cluster, i.e. countries with similar traits and nominal exchange rate depreciation properties. The idea behind this model extension is adopted mainly from the “transfer learning” technique used in image recognition applications.

2. The Data Set: Comparing models across a large data set of 17 countries in 5 continents, and including both developing and developed economies.

3. Crisis Definition: Adding an extra step to the Early Warning System design by clustering the set of countries into 6 clusters based on their economies’ traits and the behavior of their nominal exchange rate depreciation fluctuations. This allows for a crisis definition that is uniquely based on each set of countries’ properties – we call it the ’middle-ground’ definition. Moreover, it allowed us to test the potential for improving the forecasting performance of the neural network by training the model on data sets of other countries within the same cluster.

4. Reproducible Research: Downloading and cleaning the data has been automated, so that the results can be easily updated or extended.
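A minimal back-propagation network of the kind described in 1(a) can be sketched as follows. Everything here is a toy: one hypothetical pressure indicator as input and made-up labels, nothing like the feature set or architecture used in the paper:

```python
import math, random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# toy data: one scaled depreciation-pressure indicator, label 1 = "crisis ahead"
X = [0.0, 0.1, 0.2, 0.3, 0.7, 0.8, 0.9, 1.0]
Y = [0, 0, 0, 0, 1, 1, 1, 1]

H, lr = 3, 0.5                                    # hidden units, learning rate
w1 = [random.uniform(-1, 1) for _ in range(H)]    # input -> hidden weights
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]    # hidden -> output weights
b2 = 0.0

def forward(x):
    h = [sig(w1[i] * x + b1[i]) for i in range(H)]
    return h, sig(sum(w2[i] * h[i] for i in range(H)) + b2)

for _ in range(5000):                             # back-propagation with log loss
    for x, y in zip(X, Y):
        h, o = forward(x)
        d_o = o - y                               # dLoss/dz at the output unit
        for i in range(H):
            d_h = d_o * w2[i] * h[i] * (1 - h[i]) # chain rule into hidden layer
            w2[i] -= lr * d_o * h[i]
            w1[i] -= lr * d_h * x
            b1[i] -= lr * d_h
        b2 -= lr * d_o

# after training: low pressure -> low crisis probability, high pressure -> high
```

The clustering extension in 1(b) would simply pool the (X, Y) pairs of similar countries into one training set before running the same loop.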


We compare models using two main measures. The Good Signals measure captures the percentage of currency crises predicted out of the total crises that actually occurred in the data set. The second measure is the False Alarms measure: the percentage of false signals that the EWS gives out of the total number of crises it predicts. In other words, this is the percentage of times the EWS predicts a crisis that never happens.
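The two measures can be computed directly from the signal and crisis indicators; the series below are invented purely to illustrate the definitions:

```python
# Good Signals = share of actual crises the EWS flagged;
# False Alarms = share of issued signals not followed by a crisis.
def ews_scores(signals, crises):
    hits   = sum(1 for s, c in zip(signals, crises) if s and c)
    misses = sum(1 for s, c in zip(signals, crises) if not s and c)
    false  = sum(1 for s, c in zip(signals, crises) if s and not c)
    good_signals = hits / (hits + misses)
    false_alarms = false / (hits + false)
    return good_signals, false_alarms

signals = [1, 0, 1, 1, 0, 0, 1]   # EWS output per period (illustrative)
crises  = [1, 0, 0, 1, 1, 0, 0]   # realized crises per period (illustrative)
print(ews_scores(signals, crises))   # two of three crises caught; half the signals false
```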

The tables presented below show our findings and how the models perform against each other on our data set of 17 countries. We also provide the relevant findings from literature as a benchmark for our research.

The results in Table 1 show that Berg & Patillo’s clustering of all countries together generally works worse than our way of clustering the data. We can therefore confirm that the choice of a ’middle-ground’ crisis definition has indeed helped us preserve potentially important country- or cluster-specific traits. In brief, when using conventional methods we obtain results comparable to those found in the literature, as highlighted by the table below.

After introducing the ANN model and its extension, we observe their out-of-sample performance and obtain some of the key results of our research.

Summary of the key results

  • The proposed Artificial Neural Network model’s crisis predictability is comparable to that of the standard currency crisis forecasting model on both measures, Good Signals and False Alarms. Moreover, the modified Neural Network model trained on the clustered data set showed performance superior to the standard forecasting model.
  • The performance of the Artificial Neural Network model improved tangibly when we introduced our method of clustering the data: data from similar countries in the training set of the network can indeed serve as an advantage rather than a distortion. By contrast, using the standard Probit model with the panels of clustered data resulted in lower performance compared to the respective country-by-country measures.

Authors: Ivana Ganeva and Rana Mohie

About the Barcelona GSE Master’s Program in Macroeconomic Policy and Financial Markets

On the evolution of altruistic and cooperative behaviour due to schooling system in Spain

Economics master project by Shaily Bahuguna, Diego Loras Gimeno, Davina Heer, Manuel E. Lago, and Chiara Toietta ’19

Editor’s note: This post is part of a series showcasing Barcelona GSE master projects. The project is a required component of all Master’s programs at the Barcelona GSE.


This paper aims to find a pattern in the evolution of altruistic and cooperative behaviour whilst distinguishing across different types of schools in Spain. Specifically, we design a controlled laboratory experiment, running the standard dictator game and a public goods game in a public and a private (“concertada”) high school. Using a sample of 180 students, we compare 12- and 16-year-old children to identify the evolutionary pattern and test whether there is a significant change by type of schooling system. In addition, we control for covariates such as parental wealth status, religious views and ethical opinions. Interestingly, evidence from our data highlights that altruism levels rise throughout public school education whilst they fall in private schools. On the contrary, cooperation levels are relatively stable in public schools but rise in private schools. The results from this paper can be exploited to understand how education may influence selfish and individualistic behaviour in our society.

Key results

Diff-in-Diff (Altruism (L) & Cooperation (R))

Our results show that at the initial stage, i.e. for first-year students, the level of altruism is higher in public schools, and this prevails throughout the students’ education in a public school. On the other hand, we observe the opposite trend for students attending private school: over the four years of education, the average level of altruism declines. With regard to cooperation, we find some surprising results. Although students attending a public school initially show higher levels of cooperation than those at private schools, over the course of their education this gap is not only closed but surpassed by the private school. Our results are in line with previous research stating that females are more likely to donate and cooperate than males, but contradict the popular view in the literature that income is positively correlated with both dependent variables.
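The comparison behind the figure is a 2x2 difference-in-differences on group means: school type by cohort. With purely illustrative numbers (not the study's data), it reduces to:

```python
# 2x2 difference-in-differences on group means (illustrative values only):
# average share donated in the dictator game, by school type and cohort.
mean_give = {
    ("public",  "year1"): 0.30, ("public",  "year4"): 0.38,
    ("private", "year1"): 0.28, ("private", "year4"): 0.22,
}

trend_public  = mean_give[("public", "year4")]  - mean_give[("public", "year1")]
trend_private = mean_give[("private", "year4")] - mean_give[("private", "year1")]
did = trend_public - trend_private   # gap in altruism trends across school types
print(round(did, 2))
```

A positive estimate here would correspond to the pattern described above: altruism rising over the public school years while declining over the private school years.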

Authors: Shaily Bahuguna, Diego Loras Gimeno, Davina Heer, Manuel E. Lago, and Chiara Toietta

About the Barcelona GSE Master’s Program in Economics