The main goal of our Master Project is to predict intraday stock market movements using two different kinds of input features: financial indicators and sentiment extracted from news and tweets. While the former are part of the common technical analysis in financial econometric models, the sentiment extracted from news articles and tweets on Twitter has also been shown to correlate with stock market movements. Our paper aims to contribute to the existing academic and professional knowledge in two main directions. First, we evaluate three different approaches to extracting sentiment from both social and mass media, based on their forecasting power. Second, we deploy a battery of engineered features based on the sentiment, together with the financial indicators, in a machine learning model for a fine-grained, minute-level forecasting exercise. Finally, two different classes of models are fitted to test the forecasting power of the combined input features: a classical ARIMA model and an XGBoost model as the machine learning algorithm. We collected data on the companies Apple, JPMorgan Chase, Exxon Mobil, and Boeing.
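To make the setup concrete, the following is a minimal sketch (not the authors' code) of the two model classes described above: an ARIMA baseline on minute-level returns and an XGBoost classifier that combines hypothetical engineered sentiment features with technical indicators. All feature names and data are placeholders.

```python
# Sketch only: ARIMA baseline plus XGBoost classifier on combined features.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n = 1_000  # placeholder: roughly one trading day at minute frequency

# Hypothetical feature set: technical indicators plus sentiment scores.
X = pd.DataFrame({
    "return_lag1": rng.normal(0, 1e-3, n),
    "rsi_14": rng.uniform(20, 80, n),
    "news_sentiment": rng.normal(0, 1, n),   # e.g. from a lexicon or classifier
    "tweet_sentiment": rng.normal(0, 1, n),
})
returns = rng.normal(0, 1e-3, n)
y_direction = (returns > 0).astype(int)      # up/down label for the next minute

# Baseline: classical ARIMA(p, d, q) fitted on the return series.
arima_fit = ARIMA(returns, order=(1, 0, 1)).fit()
arima_forecast = arima_fit.forecast(steps=5)

# Machine learning model: gradient-boosted trees on the combined feature set.
model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.05)
model.fit(X[:-200], y_direction[:-200])
accuracy = (model.predict(X[-200:]) == y_direction[-200:]).mean()
print(arima_forecast, accuracy)
```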
Research by Francesco Amodio ’10 (Economics) and co-authors at The World Bank and Barcelona GSE
Francesco Amodio ’10 (Economics) co-authored this article for VoxDev with Barcelona GSE Research Professor Giacomo De Giorgi and World Bank economists Jieun Choi and Aminur Rahman. In the article, the team gives an overview of a field experiment they conducted and a theoretical model they developed that describes the interaction between firms and inspectors.
“In collaboration with the World Bank Group and the State Tax Service of the Kyrgyz Republic, we designed an incentive scheme for tax inspectors that rewards them based on the anonymous evaluation submitted by inspected firms. In theory, this should increase the bargaining power of firms in their relationship with tax officials, and decrease the bribe size. However, if firms pay bribes instead of taxes, bribes can increase on the extensive margin, and tax revenues could decrease.”
They found that anonymous rating of inspectors can decrease bribes and increase tax revenues as long as it takes into account market structure considerations.
There exists a substantial body of literature on the calibration of the Heston model for pricing financial derivatives under stochastic volatility, much of which relies on computationally expensive algorithms. Our paper evaluates a calibration method for the Heston model proposed by Alòs, De Santiago, and Vives (2015), which can be used to price derivatives with little computational effort. The calibration method is innovative in that it considers only the three most critical regions of the implied volatility surface: where the option is at-the-money, where it is close to maturity, and where it is far from maturity. Although their procedure is parsimonious and very easy to implement, they calibrate a model whose empirical applicability is contested.
The main contribution of our paper is the evaluation of their model in an extensive numerical exercise as well as an application to real data. Collecting empirical option data was one of the main challenges of this work, since historical data on financial derivatives is not freely accessible to the public. Faced with this issue, we wrote a script that allowed us to automatically scrape option data at high frequency over just a couple of weeks and thereby build our own extensive database. We have made the data and code available at https://griipen.shinyapps.io/bgse/ and https://github.com/HitKnit/BGSE2018/tree/HitKnit-optionscraping, respectively.
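For illustration only, the snippet below shows one way such an option-chain snapshot could be pulled; the authors' actual scraper is the one in the linked repository, and the use of the yfinance package here is an assumption, not their tool.

```python
# Illustrative sketch: pull an option chain with implied volatilities for one ticker.
# The authors' own scraping code lives in the GitHub repository linked above.
import yfinance as yf
import pandas as pd

ticker = yf.Ticker("META")          # Facebook Inc. traded as FB at the time of the study
frames = []
for expiry in ticker.options[:3]:   # first few listed maturities
    chain = ticker.option_chain(expiry)
    calls = chain.calls[["strike", "lastPrice", "impliedVolatility"]].copy()
    calls["expiry"] = expiry
    frames.append(calls)

surface = pd.concat(frames, ignore_index=True)
surface.to_csv("option_surface_snapshot.csv", index=False)
print(surface.head())
```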
In terms of our results, we find that whilst the calibration method has solid theoretical foundations and produces satisfactory estimation results within the theoretical Heston universe, it fails in practice. Specifically, in the numerical exercise we find that across all simulations the maximum average error over the entire volatility surface is 0.999 percent, while the mean error across simulations is only 0.481 percent. In sharp contrast, absolute percentage errors for our empirical data are on the order of 30-40 percent in many cases. In the following figure, we present our findings for intra-daily data from May 16, 2018. The left column shows empirical implied volatilities for a European call option on Facebook Inc. (FB) stock; from top to bottom, volatilities are shown for the opening, lunch, and closing sessions. The central column shows the fitted volatility surfaces, while the right column shows absolute percentage differences between empirical and estimated values. The finding that errors are particularly high for at-the-money options with short times to maturity is robust across the entire data sample.
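The error metric reported above is straightforward; a small sketch, with placeholder arrays standing in for the empirical and fitted surfaces, makes it explicit.

```python
# Sketch of the reported error metric: absolute percentage differences between
# empirical and fitted implied volatilities on a strike-by-maturity grid.
# `iv_empirical` and `iv_fitted` are placeholder arrays of the same shape.
import numpy as np

iv_empirical = np.array([[0.320, 0.300, 0.290],
                         [0.310, 0.295, 0.285]])   # rows: maturities, cols: strikes
iv_fitted = np.array([[0.400, 0.310, 0.291],
                      [0.360, 0.300, 0.286]])

abs_pct_error = np.abs(iv_fitted - iv_empirical) / iv_empirical * 100
print("mean error (%):", abs_pct_error.mean())
print("max error (%):", abs_pct_error.max())
```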
Conclusions and key results:
In light of these results, we conclude that inherent limitations of the Heston model disqualify the calibration for practical use. Nonetheless, we believe that calibration methods as simple as the one examined here should be used in combination with more sophisticated option pricing models.
Brian Albrecht ’14 (Economics of Public Policy) offers both a normative and a positive view
Brian Albrecht is a PhD candidate at the University of Minnesota and a graduate of the Barcelona GSE Master’s Program in Economics of Public Policy, as well as a past editor of the Barcelona GSE Voice. He is also a contributor to the Sound Money Project, a blog from the American Institute for Economic Research (AIER).
In two recent articles, he talks about money as a social contract, both from a normative and a positive perspective:
“Both monetary theory and social contract theory consider a hypothetical situation (a model) in which people in a society come together and collectively agree on some social institution. I have argued that both social contract theorists and monetary theorists use these hypotheticals to draw normative conclusions about what types of institutions are preferable. However, part of monetary theory is also concerned with the positive (i.e., not normative) question “Where does money come from?” In a similar way, part of social contract theory is concerned with the positive question “Where does the state come from?”
Read both of Brian’s articles over on the AIER website:
Our paper analyzes the impact of a cash transfer program targeting households in extreme poverty in Uruguay, called the Tarjeta Uruguay Social (henceforth referred to as TUS). Over the past decades, cash transfers have become one of the main social assistance policies used to address poverty and inequality in developing countries. Their objective is to reduce vulnerability by increasing and smoothing household income, although additional objectives are usually defined depending on the program and country, such as increasing access to health and education and reducing food insecurity (DFID 2011; Honorati et al. 2015).
The impact of these programs on different life outcomes has been widely studied. Overall, positive impacts on poverty, food insecurity, child school enrollment, labor outcomes, health and social cohesion have been found (DFID 2011; ODI 2016). Nevertheless, more research is still needed to understand the channels and particular aspects that determine their success, since countries differ widely in the details of program design. In our research, by taking advantage of considerable design modifications since the implementation of TUS, we evaluate the impact of the amount of the transfer and the benefit duration on relevant outcomes.
The Tarjeta Uruguay Social (TUS) is a conditional cash transfer program implemented in 2009 which aims to assist those in situations of extreme poverty in Uruguay. It targets the 60,000 worst-off households by providing them with a monthly cash transfer on a prepaid magnetic card. This card can be used to purchase food items, cleaning supplies, and hygiene products, excluding cigarettes and alcohol. Eligibility for the program is based on the Critical Needs Index (CNI), a proxy means test that evaluates household poverty using variables associated with education, dwelling, access to durable goods, and household composition. The program has undergone many modifications since its inception, including an increase in the number of participants, changes to the eligibility criteria, and a doubling of the benefit for half of the recipients. Our analysis begins in 2013, by which time the program had 60,000 participants and the poorest 30,000 according to the CNI received a doubled benefit, creating two benefit categories: Simple TUS and Double TUS. In our research, we exploit the doubling of the benefit based on the CNI by using a Fuzzy Regression Discontinuity Design to evaluate the impact of the amount of the benefit on life outcomes.
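The logic of the fuzzy design can be sketched as follows (this is an illustration with simulated data, not the authors' code): crossing the CNI cutoff shifts the probability of receiving the doubled benefit rather than determining it perfectly, so the local Wald ratio scales the jump in the outcome by the jump in take-up. Variable names and the bandwidth are hypothetical.

```python
# Minimal fuzzy regression discontinuity sketch with simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5_000
df = pd.DataFrame({"cni": rng.normal(0, 1, n)})            # running variable, centred at cutoff
df["above_cutoff"] = (df["cni"] > 0).astype(int)
# Imperfect compliance: crossing the cutoff raises the probability of Double TUS.
df["double_tus"] = rng.binomial(1, 0.2 + 0.6 * df["above_cutoff"])
df["outcome"] = 0.3 * df["double_tus"] + 0.1 * df["cni"] + rng.normal(0, 1, n)

bw = 0.5                                                   # hypothetical bandwidth
local = df[df["cni"].abs() < bw]

# Reduced form and first stage, local linear with separate slopes on each side.
rf = smf.ols("outcome ~ above_cutoff + cni + above_cutoff:cni", data=local).fit()
fs = smf.ols("double_tus ~ above_cutoff + cni + above_cutoff:cni", data=local).fit()

# Fuzzy RD (local Wald) estimate of the effect of the doubled benefit.
late = rf.params["above_cutoff"] / fs.params["above_cutoff"]
print("Fuzzy RD estimate:", round(late, 3))
```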
The availability of an extensive set of administrative data allowed us to evaluate the impact of the doubling on a wide array of outcomes. There are many different channels through which this cash transfer program could have positive effects, since the resources freed up by the relaxation of the household budget constraint could be used differently according to household preferences. We therefore analyzed 65 outcomes covering housing and living conditions, food insecurity, formal labor market work, education enrollment of children and adolescents, prenatal and birth health conditions, and family composition. Additionally, we analyze how the duration of the benefit affects the impact of the program by comparing the effects for beneficiaries who receive the transfer for different periods of time: short-term outcomes for those who receive the transfer for less than a year, medium-term outcomes for those who receive it for two to three years, and long-term outcomes for those who receive it consistently for three years.
Our results show that an increase in the amount of a cash transfer can in fact have important impacts on the life outcomes of recipients. Positive effects were found with regard to living conditions, with increased investment in durable goods and improvements in housing conditions, such as purchasing water heaters or washing machines, adding a bathroom to the home, and upgrading from a roof made of waste materials to a concrete one. Additionally, results show positive impacts on individual outcomes, with improvements observed in prenatal care and months of formal work. Nevertheless, some negative results were found in the short term, which could potentially be explained by attempts by beneficiaries to manipulate their circumstances in order to ensure continued benefit provision under uncertainty. Results also show that the duration of the benefit has a considerable impact on how the transfer is spent. More positive and significant household-level results are found in the medium term, while individual-level results become stronger in the long term. The increasing effects of more persistent benefits could potentially be explained by short-term uncertainty about whether the benefit will continue to be provided, which decreases over time.
This study contributes to the literature on poverty alleviation policies by providing evidence which can be used to improve the design of cash transfer programs. The positive effects found in this paper from comparing different transfer amounts within the same program indicate that the monetary amount of the benefit is a relevant policy parameter with consequences for the effectiveness of the program. Additionally, the results for heterogeneous effects by benefit duration indicate that the persistence of the transfer is another relevant aspect of program design. The evidence provided in this paper indicates that a duration predefined upon entering the program, together with a minimum duration of one year, could constitute good practice. This may mitigate negative effects related to household manipulation attempts and amplify positive effects by reducing income volatility and increasing housing investments. Our results suggest that further research on benefit size and timing is imperative for the policy design of cash transfers, one of the main tools for reaching universal social protection.
The goal of this paper is to quantitatively assess the impact that the emergence of China in international markets during the 1990s had on the U.S. economy (the so-called China Shock). To do so, I build a model with two sectors producing two final goods, each using as its only input of production an intermediate good specific to that sector. Final goods are produced in a perfectly competitive environment, while the intermediate goods are produced in a frictional environment with labor as the only input. First, I calibrate the closed-economy model to match some salient stylized facts from the 1980s in the U.S. Then, to assess the China Shock, I introduce a new country (China) into the international scene. I proceed with two calibration strategies: (i) calibrate China to match the variation in the price of imports relative to the price of exports for the U.S. between the average of the 1980s and the average of 2005-2007; (ii) calibrate China such that the variation in allocations is close to the one observed in the data, over the same window of time. I find that under calibration (i) the China Shock in the model explains 26.38% of the variation in the share of employment in the manufacturing sector, 16.28% of the variation in the share of manufacturing production, and 27.40% of the variation in the share of wages of the manufacturing sector. Finally, under calibration (ii) I find that the change in relative prices needed to match between 80 and 90 percent of the variation in allocations is around 3.47 times the one observed in the data.
Conclusions and key results:
According to the model, the China Shock explains 26.35% of the variation in the share of manufacturing employment, 16.28% of the variation in the share of manufacturing production, and 27.44% of the variation in the share of wages of the manufacturing sector. The first of these results is consistent with findings in Autor et al. (2013). On the other hand, the variation in the unemployment rate of the economy is not matched under either the first or the second calibration of the open economy. I also find that, as a consequence of the China Shock, real wages increase when measured in terms of the price of the import good and decrease when measured in terms of the price of the export good. This result is not in line with findings in Autor et al. (2013). The optimal unemployment insurance in the open economy is 6.13% of average wages higher than in the closed economy, because the unemployment rate of the open economy is higher than in the closed economy (a difference of 0.9%). Finally, the model generates a non-traditional source of comparative advantage, arising from differences in the relative bargaining power of workers.
We capitalise on the 2006 implementation of a minimum wage for the hospitality sector to make well-evidenced inferences about the impact of the upcoming National Minimum Wage (NMW) legislation on low-wage workers. Our paper focuses on the two largest low-wage sectors currently without minimum wage regulation: manufacturing and construction. Two regression specifications and a sensitivity analysis are used to provide insights into the implications for wages, hours worked, employment, formality, and poverty rates. In light of our results and a comprehensive review of the literature, we conclude that the NMW will be largely beneficial for low-wage labourers. Our critical recommendation for policymakers is the need for complementary policies to ensure compliance and facilitate the transition of vulnerable groups (particularly black women) into the formal sector.
Conclusions and key results:
Our first specification suggests that wages and hours worked will increase in the manufacturing and construction sectors as a result of the minimum wage, mostly driven by increases for black and female workers. Although the policy is likely to increase the formality rate among male workers, we predict formality will fall among female workers as employers try to circumvent the legislation. It is therefore crucial that adequate complementary policies are implemented to ensure the benefits are captured by all population groups. Our second specification exploits the variation in the median wage across provinces. In doing so, we find no significant effect on wages, which signals that regional impacts of the minimum wage are fairly homogeneous. Therefore, compared to other countries adopting a similar policy, the implementation of safety nets combating the adverse effects of the minimum wage will be relatively more straightforward. By conducting a sensitivity analysis around compliance rates and poverty lines already stipulated in the literature, we predict that between 100,000 and 300,000 manufacturing and construction workers will be lifted out of wage poverty as a result of the minimum wage. We combine our empirical partial equilibrium analysis with theoretical general equilibrium forces to provide statements on the anticipated lower bound of wage changes.
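As a rough illustration of the kind of specification exploiting the 2006 hospitality minimum wage, the sketch below runs a difference-in-differences regression on simulated worker-level data. The column names, covariates, and functional form are assumptions for illustration; the authors' exact specifications may differ.

```python
# Hedged sketch: difference-in-differences with hospitality as the treated sector
# and 2006 as the policy date, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 8_000
df = pd.DataFrame({
    "sector": rng.choice(["hospitality", "manufacturing", "construction"], n),
    "year": rng.choice([2004, 2005, 2007, 2008], n),
    "female": rng.binomial(1, 0.45, n),
    "province": rng.choice(list("ABCDEFGHI"), n),
})
df["treated"] = (df["sector"] == "hospitality").astype(int)
df["post"] = (df["year"] >= 2006).astype(int)
df["log_wage"] = (1.5 + 0.08 * df["treated"] * df["post"]
                  - 0.1 * df["female"] + rng.normal(0, 0.3, n))

model = smf.ols(
    "log_wage ~ treated * post + female + C(province)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["province"]})

# The coefficient on treated:post is the difference-in-differences estimate.
print(model.params["treated:post"], model.bse["treated:post"])
```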
In honor of Nobel Laureate Richard Thaler’s famous book Nudge turning 10, Aishwarya Deshpande (Economics ’18) writes in Behavioral Scientist magazine about the emerging subfield in development economics, namely behavioral development economics. The subfield aims to incorporate insights informed by behavioral science to address issues of persistent inequality, poverty alleviation and welfare.
Aishwarya had the pleasure of reading various academic papers that addressed these issues with innovative approaches in preparation for the essay. She finds that the ‘last mile’ between intention and action can be bridged by understanding the limitations of the human mind, which has many potential policymaking implications.
Behavioral science has come a long way in the past 50 years. While many of the early, pioneering studies took place in sanitized “lab” environments, with subjects from Western countries, the past decade has seen an explosion of behavioral science research in the messier environment of the developing world. This work has given us greater insight into how and why the world’s poorest populations make the decisions they do. But perhaps more importantly, this work has allowed behavioral scientists to directly improve the well-being of the world’s poorest and most vulnerable populations.
In this paper we study the dynamics and drivers of 10-year sovereign bond yields using a panel of the original 11 Eurozone countries (excluding Luxembourg). The interest of this study lies in the fact that, despite very different macroeconomic policy stances in the variables that we believe determine interest rates among these countries, 10-year Eurozone bond yields converged almost perfectly during the 2000s, before suffering a sudden disconnection in the aftermath of the Great Financial Crisis.
To this end, we apply two different methodologies: a panel data approach (which we end up discarding) and a Time Varying Coefficients model estimated with the Kalman filter, which allows us to capture changes in the pricing mechanism of bond yields over time. Initially, by using the latter methodology without controlling for the volatility of the interest rates (which increased dramatically after 2008), we obtain very noisy results that are difficult to interpret, since the coefficients seem to be capturing these changes in volatility. Once we introduce into the filter a GARCH process for the variance-covariance matrix of the interest rates used in the Time Varying Coefficients approach, we obtain much more meaningful and interpretable results.
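The core of the Time Varying Coefficients idea can be sketched in a few lines (this is a simplified illustration, not the authors' model): the regression coefficients follow random walks and are updated recursively by the Kalman filter. The observation variance is held constant here; in the paper it follows a GARCH process, which would replace the constant `obs_var` below with a conditional variance updated from past prediction errors.

```python
# Minimal time-varying-coefficients regression via the Kalman filter (simulated data).
import numpy as np

rng = np.random.default_rng(3)
T, k = 200, 2                                  # periods, number of regressors
X = np.column_stack([np.ones(T), rng.normal(0, 1, T)])
true_beta = np.cumsum(rng.normal(0, 0.05, (T, k)), axis=0) + np.array([2.0, 1.0])
y = (X * true_beta).sum(axis=1) + rng.normal(0, 0.5, T)

obs_var = 0.25                                 # constant here; a GARCH process in the paper
state_var = 0.05**2 * np.eye(k)                # random-walk innovation variance

beta = np.zeros(k)                             # filtered coefficient vector
P = np.eye(k) * 10.0                           # state covariance (diffuse-ish prior)
filtered = np.zeros((T, k))

for t in range(T):
    # Prediction step: random-walk state, so the mean is unchanged.
    P = P + state_var
    x_t = X[t]
    # Update step with the new observation y[t].
    f_t = x_t @ P @ x_t + obs_var              # prediction-error variance
    kalman_gain = P @ x_t / f_t
    beta = beta + kalman_gain * (y[t] - x_t @ beta)
    P = P - np.outer(kalman_gain, x_t) @ P
    filtered[t] = beta

print(filtered[-1], true_beta[-1])
```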
One of our key contributions is the inclusion of new fiscal and macroeconomic variables as determinants of yields in the different Eurozone countries, which were discarded by other studies in the field. We also contribute by controlling for determinants common to all Eurozone countries, which we obtain by applying a common component approach. Furthermore, our findings confirm that after the period of divergence in interest rates that began in the aftermath of the Great Financial Crisis, caused by a renewed focus on fundamentals, Eurozone interest rates have converged again as bond yield drivers normalized, approaching, though not fully returning to, their pre-crisis levels. We also find that in times of economic uncertainty and financial hysteresis, when default risk becomes an issue, the effects of government policy on interest rates can lead to significantly accentuated crowding-out effects.
Conclusions and key results:
Our work indicates that there was a significant break in the way sovereign debt was priced after the Great Financial Crisis of 2008, pointing to a return to fundamentals as the main drivers of sovereign yields. We find that several factors reflecting fiscal and macroeconomic stances became increasingly important during the crisis, after having been ignored in previous years. Debt to GDP, deficit to GDP, GDP growth, and current account balance to GDP, among others, started to play important roles in the determination of long-term interest rates for Eurozone government bonds. In line with previous research, our findings confirm the existence of three distinct phases in the euro bond market: a period of high integration, a period of disintegration, and a phase of partial reintegration (Adam and Lo Duca, 2017).
Our findings suggest that during periods of economic uncertainty characterized by high volatility in financial markets, investors tend to focus on fundamentals, while in times of economic boom they do not discriminate much among the different stances of these macroeconomic determinants. This finding has important policy implications, since it suggests that during economic crises interest rates react much more to unsustainable fiscal policies and macroeconomic imbalances than during calmer times, causing a strong crowding-out effect on the private sector (Laubach, 2011).
Therefore, our results suggest that governments should pay closer attention to their fiscal stances during times of economic turbulence in order to avoid the detrimental effects of high interest rates on activity in a period when economic agents lack confidence. As argued by De Grauwe and Ji (2013), this effect is exacerbated by the fact that Eurozone governments have no control over monetary policy, making it impossible for them to reduce interest rates by any means other than sound fiscal policies. In line with this result, we find that the ECB’s unconventional monetary policy helped bring down European bond yields after 2014: the impact of short-term interest rates (one of our common determinants, obtained by principal components) on long yields has diminished over time. This contributed to returning the fiscal stances of these countries, and other essential macroeconomic variables, to sustainable levels. Together with the structural reforms carried out, which have also helped restore economic confidence and dynamism, this has had an additional loosening impact on the interest rates these countries face at every debt issuance.
Regarding the methodologies used to address our research question, we were able to obtain robust results and determine which method was the most appropriate for investigating the drivers of 10-year sovereign bond yields. We found that panel data approaches, which are widely used in the literature, lead to unstable and unsatisfactory results, causing us to attach limited credibility to the outcomes of such analyses. The Time Varying Coefficients approach, by contrast, seems more reliable and yields more robust and plausible results once we model the changes in volatility appropriately. We believe that a larger sample would have allowed us to obtain even more reliable results with this approach as well (we use the forecasts released twice a year by the IMF in its World Economic Outlook and by the OECD in its Economic Outlook in order to control for the markets’ forward-looking behaviour in current interest rate levels, as well as for reverse causality).
A suggestion for further research would be to apply Bayesian techniques to estimate our model. Given the limited amount of data available and the complexity of our models, these methods seem better suited to this kind of estimation, where the large number of parameters, as well as the possible presence of non-linearities, can make the optimization process very costly. This methodology would also have allowed us to model the variances of the Time Varying Parameters, and not only those of the interest rates (our observables), with another GARCH or stochastic volatility process, since we expect that these variances could also follow a conditional process, which might have an impact on our estimation results.
In the context of digital platforms that act as intermediaries between consumers and sellers, a Price Parity Clause (PPC) is a contractual restriction preventing the seller from selling at a lower price through any other channel (a so-called wide PPC) or only through its own channel (a narrow PPC). These clauses present a trade-off between efficiencies and anticompetitive effects. On the one hand, PPCs act as a commitment device for the seller that solves the showrooming effect suffered by platforms (a particular form of free-riding), while ensuring the platform’s viability and enhancing its incentives to invest and innovate. On the other hand, PPCs allow platforms to charge higher fees and can lead to foreclosure of the market. Currently, neither the EC nor national competition authorities (NCAs) have set clear guidance on how to assess these clauses. The main contribution of this paper is to set a legal standard for both wide and narrow PPCs using a cost-error analysis. We conclude that wide PPCs should be per se illegal, and that narrow PPCs should be presumed legal unless proven otherwise, except where narrow PPCs eliminate the competitive restraints on the platform, in which case the standard should be a rebuttable presumption of illegality.
The digital economy will increase the use of digital platforms. Network externalities inherent to two-sided markets lead to high market power that makes platforms an indispensable ally.
Digital platforms use PPCs, and this is capturing the interest of competition authorities, but there is no consensus on the appropriate legal standard.
PPCs present a trade-off: on the one hand, efficiencies such as reduced search costs, prevention of showrooming, and stronger incentives to invest and innovate; on the other hand, anti-competitive effects such as higher fees, foreclosure, and collusion.
The results of our cost-error analysis are as follows. For wide PPCs, minimizing Type II error calls for per se illegality. For narrow PPCs, minimizing Type I error calls for a rebuttable presumption of legality, except where (i) one-stop-shop/network effects, (ii) brand positioning, or (iii) switching costs are at play, in which case minimizing Type II error calls for a rebuttable presumption of illegality.