
NCER Working Paper Series

No. 91   (Download full text)
Adam Clements and Yin Liao
The dynamics of co-jumps, volatility and correlation
Understanding the dynamics of volatility and correlation is a crucially important issue. The literature has developed rapidly in recent years, with increasingly sophisticated estimates of volatility and its associated jump and diffusion components. Previous work has found that jumps at an index level are not related to future volatility. Here we examine the links between co-jumps within a group of large stocks, the volatility of their returns, and the correlation between them. It is found that the occurrence of common jumps, or co-jumps, between the stocks is unrelated to the prevailing level of volatility or correlation. On the other hand, both volatility and correlation are lower subsequent to a co-jump. This indicates that co-jumps are transient events which, in contrast to earlier research, have a greater impact than jumps at an index level.
JEL-Codes: C22; G00.
Keywords: Realized volatility, correlation, jumps, co-jumps, point process
No. 90   (Download full text)
A. S. Hurn, K. A. Lindsay and A. J. McClelland
On the Efficacy of Fourier Series Approximations for Pricing European and Digital Options
This paper investigates several competing procedures for computing the price of European and digital options in which the underlying model has a characteristic function that is known in at least semi-closed form. The algorithms for pricing the options investigated here are the half-range Fourier cosine series, the half-range Fourier sine series and the full-range Fourier series. The performance of the algorithms is assessed in simulation experiments which price options in a Black-Scholes world, where an analytical solution is available, and for a simple affine model of stochastic volatility in which there is no closed-form solution. The results suggest that the half-range sine series approximation is the least effective of the three proposed algorithms. It is rather more difficult to distinguish between the performance of the half-range cosine series and the full-range Fourier series. There are, however, two clear differences. First, when the interval over which the density is approximated is relatively large, the full-range Fourier series is at least as good as the half-range Fourier cosine series, and outperforms the latter in pricing out-of-the-money call options, in particular with maturities of three months or less. Second, the computational time required by the half-range Fourier cosine series is uniformly longer than that required by the full-range Fourier series for an interval of fixed length. Taken together, these two conclusions make a strong case for the merit of pricing options using a full-range Fourier series as opposed to a half-range Fourier cosine series.
Keywords: Fourier transform, Fourier series, characteristic function, option price
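As a concrete illustration of the kind of comparison the paper runs, the following is a minimal sketch (not the authors' code) of a half-range Fourier cosine series approximation in the Black-Scholes setting, where the analytical benchmark is available. The truncation interval, number of terms and parameter values are illustrative assumptions.

```python
import numpy as np
from math import erf, exp, log, sqrt

def bs_call(S0, K, r, sigma, T):
    """Analytical Black-Scholes benchmark for a European call."""
    d1 = (log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return S0 * Phi(d1) - K * exp(-r * T) * Phi(d2)

def cos_call(S0, K, r, sigma, T, n_terms=256, L=10.0):
    """European call priced by a half-range Fourier cosine series expansion
    of the density of x = ln(S_T / K), using the Black-Scholes
    characteristic function."""
    x0 = log(S0 / K)
    c1 = x0 + (r - 0.5 * sigma ** 2) * T          # first cumulant of x
    c2 = sigma ** 2 * T                           # second cumulant of x
    a, b = c1 - L * sqrt(c2), c1 + L * sqrt(c2)   # truncation interval

    k = np.arange(n_terms)
    u = k * np.pi / (b - a)

    # Characteristic function of x under Black-Scholes, evaluated at u.
    phi = np.exp(1j * u * c1 - 0.5 * c2 * u ** 2)

    # Cosine coefficients of the call payoff K*(e^x - 1)^+ on [0, b].
    chi = (np.cos(u * (b - a)) * exp(b) - np.cos(-u * a)
           + u * np.sin(u * (b - a)) * exp(b)
           - u * np.sin(-u * a)) / (1.0 + u ** 2)
    psi = np.empty(n_terms)
    psi[0] = b
    psi[1:] = (np.sin(u[1:] * (b - a)) - np.sin(-u[1:] * a)) / u[1:]
    Vk = 2.0 / (b - a) * K * (chi - psi)

    w = np.ones(n_terms)
    w[0] = 0.5                                    # first series term halved
    return exp(-r * T) * np.sum(w * np.real(phi * np.exp(-1j * u * a)) * Vk)
```

With 256 terms the series price agrees with the analytical price to high accuracy; the paper's comparison concerns how that accuracy, and the run time, vary with the expansion interval, the maturity and the choice between half- and full-range series.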
No. 89   (Download full text)
Hiranya K Nath and Jayanta Sarkar
City Relative Price Dynamics in Australia: Are Structural Breaks Important?
This paper examines the dynamic behaviour of relative prices across seven Australian cities by applying panel unit root test procedures with structural breaks to quarterly CPI data for 1972Q1-2011Q4. We find overwhelming evidence of convergence in city relative prices. Three common structural breaks are endogenously determined at 1985, 1995, and 2007. Further, correcting for two potential biases, namely Nickell bias and time aggregation bias, we obtain half-life estimates of 2.3-3.8 quarters that are much shorter than those reported by previous research. Thus, we conclude that both structural breaks and bias corrections are important to obtain shorter half-life estimates.
JEL-Codes: C33; E31; R19
Keywords: Relative price convergence; Structural break; Panel unit root test; Half-life; Time aggregation bias
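The half-life figures quoted above follow from an estimated convergence speed via the standard AR(1) formula, sketched below; the persistence values used here are illustrative, not taken from the paper.

```python
import math

def half_life(rho):
    """Half-life (in observation periods) of a deviation that decays as an
    AR(1) process with coefficient rho: solve rho**h = 0.5 for h."""
    return math.log(0.5) / math.log(rho)

# Illustrative values: quarterly persistence of 0.74 or 0.83 implies
# half-lives of roughly 2.3 and 3.7 quarters respectively.
print(round(half_life(0.74), 1), round(half_life(0.83), 1))
```

The bias corrections discussed in the abstract matter precisely because they change the estimated rho, and the half-life is very sensitive to rho as it approaches one.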
No. 88   (Download full text)
Adam Clements and Joanne Fuller
Forecasting increases in the VIX: A time-varying long volatility hedge for equities
Since the introduction of volatility derivatives, there has been growing interest in option implied volatility (IV). Many studies have examined the informational content and forecast accuracy of IV; however, there is relatively less work on directly modelling and forecasting IV. This paper uses a semi-parametric forecasting approach to implement a time-varying long volatility hedge to combine with a long equity position. It is found that such an equity-volatility combination improves the risk-return characteristics of a simple long equity position, and is particularly successful during periods of market turmoil.
JEL-Codes: C22; G00
Keywords: Implied volatility, VIX, hedging, semi-parametric, forecasting
No. 87   (Download full text)
Stan Hurn, Ken Lindsay and Andrew McClelland
Estimating the Parameters of Stochastic Volatility Models using Option Price Data
This paper describes a maximum likelihood method for estimating the parameters of Heston's model of stochastic volatility using data on an underlying market index and the prices of options written on that index. Parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options) are identified including the equity and volatility risk premia. The estimation is implemented using a particle filter. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using Graphical Processing Units. A byproduct of this focus on easing the computational burden is the development of a simplification of the closed-form approximation used to price European options in Heston's model. The efficacy of the filter is demonstrated under simulation and an empirical investigation of the fit of the model to the S&P 500 Index is undertaken. All the parameters of the model are reliably estimated and, in contrast to previous work, the volatility premium is well estimated and found to be significant.
JEL-Codes: C22;C52
Keywords: stochastic volatility, parameter estimation, maximum likelihood, particle filter
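A minimal sketch of the filtering step described above, for an Euler-discretised Heston model. This shows only a bootstrap particle filter for the index data; the paper's actual implementation, its GPU parallelisation and its option-pricing component are not reproduced, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def heston_pf_loglik(returns, mu, kappa, theta, xi,
                     dt=1.0 / 252, n_part=2000, seed=0):
    """Bootstrap particle filter log-likelihood for returns from an
    Euler-discretised Heston model, with full truncation of the variance
    at zero. The latent state is the spot variance v_t."""
    rng = np.random.default_rng(seed)
    v = np.full(n_part, theta)        # start particles at long-run variance
    loglik = 0.0
    for r in returns:
        # Propagate the variance particles one Euler step.
        vp = np.maximum(v, 0.0)
        v = (vp + kappa * (theta - vp) * dt
             + xi * np.sqrt(vp * dt) * rng.standard_normal(n_part))
        vc = np.maximum(v, 0.0)
        # Weight each particle by the Gaussian density of the observed return.
        var = np.maximum(vc * dt, 1e-12)
        w = (np.exp(-0.5 * (r - (mu - 0.5 * vc) * dt) ** 2 / var)
             / np.sqrt(2.0 * np.pi * var))
        loglik += np.log(np.mean(w) + 1e-300)
        # Multinomial resampling to avoid weight degeneracy.
        v = v[rng.choice(n_part, n_part, p=w / w.sum())]
    return loglik
```

Maximising this log-likelihood over the physical parameters is the index-data part of such an estimation; the paper additionally conditions on option prices, which is what identifies the risk-neutral parameters and the risk premia.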
No. 86   (Download full text)
Stephen Hogg, Stan Hurn, Stuart McDonald and Alicia Rambaldi
A Spatial Econometric Analysis of the Effect of Vertical Restraints and Branding on Retail Gasoline Pricing
This paper builds an econometric model of retail gasoline competition to explain the pricing decisions of retail outlets in terms of vertical management structures, input costs and the characteristics of the local markets they operate within. The model is estimated using price data from retail outlets in the South-East Queensland region of Australia, but the generic nature of the model means that the results will be of general interest. The results indicate that when the cost of crude oil and demographic variations across different localities are accounted for, branding (i.e. whether the retail outlet is affiliated with one of the major brand distributors - Shell, Caltex, Mobil or BP) has a statistically significant positive effect on prices at nearby retail outlets. Conversely, the presence of an independent (non-branded) retailer within a locality has the effect of lowering retail prices. Furthermore, the results of this research show that participation by service stations in discount coupon schemes with the two major retail supermarket chains largely offsets the price increase associated with branding affiliation. While branding effects are not fully cancelled out, the overall effect is that prices are still higher than if branding did not occur.
JEL-Codes: C21; L13
Keywords: Retail Gasoline Pricing, Vertical Restraints, Shop-a-Docket Discount Scheme, Spatial Econometrics, Australia
No. 85   (Download full text)
Adam E Clements, Mark Doolan, Stan Hurn and Ralf Becker
Selecting forecasting models for portfolio allocation
Techniques for evaluating and selecting multivariate volatility forecasts are not yet as well understood as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a competing set of forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that the particular application of forecasts is not necessarily the most effective approach under which to select models.
JEL-Codes: C22; G00
Keywords: Multivariate volatility, portfolio allocation, forecast evaluation, model selection, model confidence set
No. 84   (Download full text)
Jayanta Sarkar and Dipanwita Sarkar
Why does child labour persist with declining poverty?
Uneven success of poverty-based approaches calls for a re-think of the causes behind persistent child labour in many developing societies. We develop a theoretical model to highlight the role of income inequality as a channel of persistence. The interplay between income inequality and investments in human capital gives rise to a non-convergent dynamic path of income distribution characterised by clustering of steady state relative incomes around local poles. The child labour trap thus generated is shown to preserve itself despite rising per capita income. In this context, we demonstrate that redistributive policies, such as public provision of education, can alleviate the trap, while a ceteris paribus ban on child labour is likely to aggravate it.
JEL-Codes: I1; J2; O1; O2
Keywords: Child labour, Health, Human capital, Income inequality, Multiple equilibria
No. 83   (Download full text)
Lionel Page, David Savage and Benno Torgler
Variation in Risk Seeking Behavior in a Natural Experiment on Large Losses Induced by a Natural Disaster
This study explores people's risk attitudes after having suffered large real-world losses following a natural disaster. Using the margins of the 2011 Australian floods (Brisbane) as a natural experimental setting, we find that homeowners who were victims of the floods and face large losses in property values are 50% more likely to opt for a risky gamble - a scratch card giving a small chance of a large gain ($500,000) - than for a sure amount of comparable value ($10). This finding is consistent with prospect theory predictions of the adoption of a risk-seeking attitude after a loss.
JEL-Codes: D03;D81;C93
Keywords: Decision under risk, large losses, natural experiment
No. 82   (Download full text)
Adam E Clements, Joanne Fuller and Stan Hurn
Semi-parametric forecasting of Spikes in Electricity Prices
The occurrence of extreme movements in the spot price of electricity represents a significant source of risk to retailers. Electricity markets are often structured so as to allow retailers to purchase at an unregulated spot price but then sell to consumers at a heavily regulated price. As such, the ability to forecast price spikes is an important aspect of effective risk management. A range of approaches has been considered with respect to modelling electricity prices, including predicting the trajectory of spot prices as well as, more recently, predicting spikes specifically. These models, however, have relied on time-series approaches which typically use restrictive decay schemes placing greater weight on more recent observations. This paper develops an alternative, semi-parametric method for forecasting that does not rely on this convention. In this approach, a forecast is a weighted average of historical price data, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in electricity price spike occurrences across time, together with other relevant factors such as load, by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are more accurate than the previously identified best approach over a short forecast horizon.
JEL-Codes: C14; C53.
Keywords: Electricity Prices, Price Spikes, Semi-parametric, Multivariate Kernel
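The weighted-average idea described above can be sketched as a Nadaraya-Watson style estimator; the conditioning variable and bandwidth below are illustrative stand-ins for the spike-trend and load measures used in the paper, not the paper's specification.

```python
import numpy as np

def kernel_forecast(y, X, x_now, h):
    """Semi-parametric forecast of the next outcome as a weighted average of
    historical outcomes y[t], where the weight on period t reflects how
    similar its conditions X[t] were to current conditions x_now, measured
    by a product Gaussian kernel with bandwidth vector h."""
    z = (X - x_now) / h                        # (T, k) scaled distances
    w = np.exp(-0.5 * np.sum(z ** 2, axis=1))  # product Gaussian kernel
    return np.sum(w * y) / np.sum(w)

# Toy usage: y[t] is a spike indicator and X[:, 0] a recent spike-intensity
# measure. Current conditions resembling past spike periods yield a high
# forecast probability.
y = np.array([0.0, 0.0, 1.0, 1.0])
X = np.array([[0.1], [0.2], [0.8], [0.9]])
p = kernel_forecast(y, X, np.array([0.85]), np.array([0.1]))
```

Unlike a time-decay scheme, the weight on a past period here depends only on how similar its conditions were to today's, not on how long ago it occurred.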
No. 81   (Download full text)
Uwe Dulleck, David Johnston, Rudolf Kerschbamer and Matthias Sutter
The Good, the Bad and the Naive: Do fair prices signal good types or do they induce good behaviour?
Evidence on behavior of experts in credence goods markets raises an important causality issue: Do "fair prices" induce "good behavior", or do "good experts" post "fair prices"? To answer this question we propose and test a model with three seller types: "the good" choose fair prices and behave consumer-friendly; "the bad" mimic the good types' price-setting, but cheat on quality; and "the naive" fall victim to a projection bias that all sellers behave like the bad types. OLS, sample selection and fixed effects regressions support the model's predictions and show that causality goes from good experts to fair prices.
JEL-Codes: C91, L15, D82, D40
Keywords: Credence Goods, Experts, Pricing
No. 80   (Download full text)
Adam E Clements, Ayesha Scott and Annastiina Silvennoinen
Forecasting multivariate volatility in larger dimensions: some practical issues
The importance of covariance modelling has long been recognised in the field of portfolio management, and large dimensional multivariate problems are increasingly becoming the focus of research. This paper provides a straightforward and commonsense approach toward investigating whether simpler moving average based correlation forecasting methods have predictive accuracy equal to that of their more complex multivariate GARCH counterparts for large dimensional problems. We find that simpler forecasting techniques do provide equal (and often superior) predictive accuracy in a minimum variance sense. A portfolio allocation problem is used to compare the forecasting methods: the global minimum variance portfolio and the Model Confidence Set (Hansen, Lunde, and Nason (2003)) are used to evaluate them, whilst portfolio weight stability and computational time are also considered.
JEL-Codes: C22;G11;G17
Keywords: Volatility, multivariate GARCH, portfolio allocation
No. 79   (Download full text)
Uwe Dulleck and Berthold U Wigger
Expert Politicians, Electoral Control, and Fiscal Restraints
Fiscal restraints have been argued to force today's governments to internalize the externalities that extensive borrowing imposes, through fiscal instability, on future electorates and governments as well as on other countries. In this article we provide an alternative argument for fiscal restraints, based on an agency perspective on government. A budget-maximizing politician is better informed than the electorate about the spending necessary to ensure the state's ability to provide services for the economy. In this respect, the politician is an expert in the sense of the credence goods literature. The electorate, being able to observe the budget but not the necessary level of spending, will reelect a government only if its budget does not exceed a critical level. A fiscal restraint limits the maximum spending a government will choose when the reelection level is not sufficient to ensure the state's ability to provide services to the economy. We determine when such a fiscal restraint improves voter welfare and discuss the role of the opposition in situations where very high levels of spending are required.
JEL-Codes: D82;H50;H61
Keywords: Electoral control, Fiscal restraints, Credence goods
No. 78   (Download full text)
Uwe Dulleck and Andreas Loffler
μ-σ Games
Risk aversion in game theory is usually modelled using expected utility, which was criticized early on, leading to an extensive literature on generalized expected utility. In this paper we are the first to apply μ-σ theory to the analysis of (static) games.
μ-σ theory is widely accepted in the finance literature, and using it allows us to study the effect of uncertainty endogenous to the game, i.e. mixed equilibria. In particular, we look at the case of linear μ-σ utility functions and determine the best response strategy. For 2x2- and NxM-games we are able to characterize all mixed equilibria.
No. 77   (Download full text)
Philipp Engler
Monetary Policy and Unemployment in Open Economies
After an expansionary monetary policy shock, employment increases and unemployment falls. In standard New Keynesian models the fall in aggregate unemployment does not affect employed workers at all. However, Luechinger, Meier and Stutzer (2010) found that the risk of unemployment negatively affects the utility of employed workers: an increase in aggregate unemployment decreases workers' subjective well-being, which can be explained by an increased risk of becoming unemployed. I take account of this effect in an otherwise standard New Keynesian open economy model with unemployment as in Gali (2010) and find two important results with respect to expansionary monetary policy shocks. First, the usual wealth effect in New Keynesian models of a declining labor force, which is at odds with the data as highlighted by Christiano, Trabandt and Walentin (2010), is shut down. Second, the welfare effects of such shocks improve considerably, modifying the standard results of the open economy literature that started with Obstfeld and Rogoff's (1995) redux model.
JEL-Codes: E24;E52;F32;F41
Keywords: Open economy macroeconomics, monetary policy, unemployment
No. 76   (Download full text)
Adam E Clements and Annastiina Silvennoinen
Volatility timing and portfolio selection: How best to forecast volatility
Within the context of volatility timing and portfolio selection this paper considers how best to estimate a volatility model. Two issues are dealt with, namely the frequency of data used to construct volatility estimates, and the loss function used to estimate the parameters of a volatility model. We find support for the use of intraday data in estimating volatility, which is consistent with earlier research. We also find that the choice of loss function is important, and show that a simple mean squared error loss overall provides the best forecasts of volatility upon which to form optimal portfolios.
JEL-Codes: C22;G11; G17
Keywords: Volatility, volatility timing, utility, portfolio allocation, realized volatility
No. 75   (Download full text)
Adrian Pagan and Don Harding
Econometric Analysis and Prediction of Recurrent Events
Economic events such as expansions and recessions in economic activity, bull and bear markets in stock prices and financial crises have long attracted substantial interest. In recent times there has been a focus upon predicting these events and constructing Early Warning Systems for them. Econometric analysis of such recurrent events is however in its infancy. One can represent the events as a set of binary indicators. However, they are different from the binary random variables studied in micro-econometrics, being constructed from some (possibly) continuous data. The lecture discusses what difference this makes to their econometric analysis. It sets out a framework which deals with how the binary variables are constructed, what an appropriate estimation procedure would be, and the implications for predicting them. An example based on Turkish business cycles is used throughout the lecture.
JEL-Codes: C22; E32; E37
Keywords: Business and Financial Cycles; Binary Time Series; BBQ Algorithm
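The construction step, turning continuous activity data into binary expansion/recession indicators, can be sketched with the local-extremum rule at the core of the BBQ algorithm named in the keywords. This is a simplified illustration: the full algorithm adds censoring rules on minimum phase and cycle lengths, omitted here.

```python
import numpy as np

def bbq_turning_points(y, k=2):
    """Candidate turning points in the Bry-Boschan quarterly ("BBQ") spirit:
    t is a peak if y[t] is the maximum over a window of +/- k periods, and
    a trough if it is the minimum. Alternating peaks and troughs then
    define the binary expansion/recession states."""
    peaks, troughs = [], []
    for t in range(k, len(y) - k):
        window = y[t - k : t + k + 1]
        if y[t] == window.max():
            peaks.append(t)
        elif y[t] == window.min():
            troughs.append(t)
    return peaks, troughs

# Toy series: activity rises to a peak, contracts to a trough, then recovers.
gdp = np.array([100.0, 101.0, 102.5, 103.0, 102.0,
                101.0, 100.5, 101.5, 102.5])
peaks, troughs = bbq_turning_points(gdp)
```

The lecture's point is precisely that indicators built this way are not primitive binary random variables: their dependence structure is inherited from the underlying continuous series, which affects both estimation and prediction.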
No. 74   (Download full text)
Uwe Dulleck, Jacob Fell and Jonas Fooken
Within-subject Intra- and Inter-method consistency of two experimental risk attitude elicitation methods
We compare the consistency of choices in two methods used to elicit risk preferences, on an aggregate as well as on an individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while on an aggregate (subject pool) level the results are (roughly) consistent, on an individual (within-subject) level behavior is far from consistent. Within each method as well as across methods we observe low correlations. This again questions the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.
JEL-Codes: C91; D81
Keywords: risk preferences, laboratory experiment, elicitation methods, subject heterogeneity
No. 73   (Download full text)
Uwe Dulleck and Jianpei Li
Contracting for Infrastructure Projects as Credence Goods
Large infrastructure projects are a major responsibility of government, which usually lacks the expertise to fully specify the demanded projects. Contractors, typically experts on such projects, advise on the needed design in their bids. Producing the right design is nevertheless costly.
We model the contracting for such infrastructure projects taking this credence goods feature into account and examine the performance of commonly used contracting methods. We show that when building costs are public information, multistage competitive bidding involving the shortlisting of two contractors, with compensation of both contractors contingent on design effort, outperforms sequential search and the traditional Design-and-Build approach. While the latter leads to minimum design effort, sequential search suffers from a commitment problem. If building costs are the private information of the contractors and are revealed to them only after the design cost is sunk, competitive bidding may involve sampling more than two contractors. The commitment problem under sequential search may be overcome by the procurer's incentive to search for low building costs if the design cost is sufficiently low. If this is the case, sequential search may outperform competitive bidding.
JEL-Codes: L14, D82, D44, R50
Keywords: Credence Goods, Design-Build, Competitive Bidding, Sequential Search, Infrastructure Projects
No. 72   (Download full text)
Adam E Clements, Christopher A Coleman-Fenn and Daniel R Smith
Forecasting Equicorrelation
We study the out-of-sample forecasting performance of several time-series models of equicorrelation, which is the average pairwise correlation between a number of assets. Building on the existing Dynamic Conditional Correlation and Linear Dynamic Equicorrelation models, we propose a model that uses proxies for equicorrelation based on high-frequency intraday data, and the level of equicorrelation implied by options prices. Using state-of-the-art statistical evaluation technology, we find that the use of both realized and implied equicorrelations outperforms models that use daily data alone. However, the out-of-sample forecasting benefits of implied equicorrelation disappear when used in conjunction with the realized measures.
JEL-Codes: C32; C53; G17
Keywords: Equicorrelation, Implied Correlation, Multivariate GARCH, DCC
No. 71   (Download full text)
Gunnar Bardsen, Stan Hurn and Zoe McHugh
Asymmetric unemployment rate dynamics in Australia
The unemployment rate in Australia is modelled as an asymmetric and nonlinear function of aggregate demand, productivity, real interest rates, the replacement ratio and the real exchange rate. If changes in unemployment are big, the management of demand, real interest rates and the replacement ratio will be good instruments to start bringing it down. The model is developed by exploiting recent developments in automated model-selection procedures.
JEL-Codes: C12; C52; C87; E24; E32
Keywords: unemployment, non-linearity, dynamic modelling, aggregate demand, real wages
No. 70   (Download full text)
Tim Christensen, Stan Hurn and Ken Lindsay
Forecasting Spikes in Electricity Prices
In many electricity markets, retailers purchase electricity at an unregulated spot price and sell to consumers at a heavily regulated price. Consequently the occurrence of extreme movements in the spot price represents a major source of risk to retailers and the accurate forecasting of these extreme events or price spikes is an important aspect of effective risk management. Traditional approaches to modeling electricity prices are aimed primarily at predicting the trajectory of spot prices. By contrast, this paper focuses exclusively on the prediction of spikes in electricity prices. The time series of price spikes is treated as a realization of a discrete-time point process and a nonlinear variant of the autoregressive conditional hazard (ACH) model is used to model this process. The model is estimated using half-hourly data from the Australian electricity market for the sample period 1 March 2001 to 30 June 2007. The estimated model is then used to provide one-step-ahead forecasts of the probability of an extreme event for every half hour for the forecast period, 1 July 2007 to 30 September 2007, chosen to correspond to the duration of a typical forward contract. The forecasting performance of the model is then evaluated against a benchmark that is consistent with the assumptions of commonly-used electricity pricing models.
JEL-Codes: C14; C52
Keywords: Electricity Prices, Price Spikes, Autoregressive Conditional Duration, Autoregressive Conditional Hazard
No. 69   (Download full text)
Don Harding and Adrian Pagan
Can We Predict Recessions?
The fact that the Global Financial Crisis, and the Great Recession it ushered in, was largely unforeseen has led to the common opinion that macroeconomic models and analysis are deficient in some way. Of course, it has probably always been true that businessmen, journalists and politicians have agreed on the proposition that economists can't forecast recessions. Yet we see an enormous published literature that presents results suggesting it is possible to do so, either with some new model or some new estimation method, e.g. Kaufman (2010), Galvao (2006), Dueker (2005), Wright (2006) and Moneta (2005). Moreover, there seems to be no shortage of papers still emerging that make claims along these lines. So a question that naturally arises is how one is to reconcile the existence of an expanding literature on predicting recessions with the scepticism noted above.
Keywords: Global Financial Crisis, Great Recession
No. 68   (Download full text) (forthcoming)
Amir Rubin and Daniel Smith
Comparing Different Explanations of the Volatility Trend
We analyze the puzzling behavior of the volatility of individual stock returns over the past few decades. The literature has provided many different explanations of the trend in volatility, and this paper tests the viability of the different explanations. Virtually all current theoretical arguments that are provided for the trend in the average level of volatility over time lend themselves to explanations about the difference in volatility levels between firms in the cross-section. We therefore focus separately on the cross-sectional and time-series explanatory power of the different proxies. We fail to find a proxy that is able to explain both dimensions well. In particular, we find that the market-to-book ratio of Cao et al. (2008) tracks average volatility levels well, but has no cross-sectional explanatory power. On the other hand, the low-price proxy suggested by Brandt et al. (2010) has much cross-sectional explanatory power, but has virtually no time-series explanatory power. We also find that the different proxies do not explain the trend in volatility in the period prior to 1995 (R-squared of virtually zero), but explain rather well the trend in volatility at the turn of the Millennium (1995-2005).
JEL-Codes: G32;G35
Keywords: Volatility; Trend; Turnover
No. 67   (Download full text) (forthcoming)
Wagner Piazza Gaglianone, Luiz Renato Lima, Oliver Linton and Daniel Smith
Evaluating Value-at-Risk Models via Quantile Regression
This paper is concerned with evaluating Value-at-Risk estimates. It is well known that using only binary variables, such as whether or not there was an exception, sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition to assess the finite sample performance of a quantile model whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker and Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with daily S&P 500 time series.
JEL-Codes: C12; C14; C52; G11
Keywords: Value-at-Risk, Backtesting, Quantile Regression
No. 66   (Download full text)
Ralf Becker, Adam Clements and Robert O'Neill
A Kernel Technique for Forecasting the Variance-Covariance Matrix
The forecasting of variance-covariance matrices is an important issue. In recent years an increasing body of literature has focused on multivariate models to forecast this quantity. This paper develops a nonparametric technique for generating multivariate volatility forecasts from a weighted average of historical volatility and a broader set of macroeconomic variables. As opposed to traditional techniques where the weights solely decay as a function of time, this approach employs a kernel weighting scheme where historical periods exhibiting the most similar conditions to the time at which the forecast is formed attract the greatest weight. It is found that the proposed method leads to superior forecasts, with macroeconomic information playing an important role.
JEL-Codes: C14, C32, C53, C58
Keywords: Nonparametric, variance-covariance matrix, volatility forecasting, multivariate
No. 65   (Download full text)
Stan Hurn, Andrew McClelland and Kenneth Lindsay
A quasi-maximum likelihood method for estimating the parameters of multivariate diffusions
This paper develops a quasi-maximum likelihood (QML) procedure for estimating the parameters of multi-dimensional stochastic differential equations. The transitional density is taken to be a time-varying multivariate Gaussian where the first two moments of the distribution are approximately the true moments of the unknown transitional density. For affine drift and diffusion functions, the moments are shown to be exactly those of the true transitional density, and for nonlinear drift and diffusion functions the approximation is extremely good. The estimation procedure is easily generalizable to models with latent factors, such as the stochastic volatility class of models. The QML method is as effective as alternative methods when proxy variables are used for unobserved states. A conditioning estimation procedure is also developed that allows parameter estimation in the absence of proxies.
JEL-Codes: C22;C52
Keywords: stochastic differential equations, parameter estimation, quasi-maximum likelihood, moments
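For the simplest affine case, the one-dimensional Ornstein-Uhlenbeck process, the Gaussian transitional moments are exact, so the QML objective coincides with the true likelihood. Below is a minimal sketch of that objective (scalar state, no latent factors; not the paper's multivariate implementation).

```python
import numpy as np

def ou_qml_nll(params, x, dt):
    """Negative Gaussian (quasi-)log-likelihood for the Ornstein-Uhlenbeck
    process dX = kappa*(theta - X) dt + sigma dW, using the transitional
    mean and variance, which are exact here because drift and diffusion
    are affine."""
    kappa, theta, sigma = params
    e = np.exp(-kappa * dt)
    mean = theta + (x[:-1] - theta) * e                 # E[X_{t+dt} | X_t]
    var = sigma ** 2 * (1.0 - e ** 2) / (2.0 * kappa)   # Var[X_{t+dt} | X_t]
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2.0 * np.pi * var) + resid ** 2 / var)
```

Minimising this objective over (kappa, theta, sigma) with any standard optimiser recovers the parameters from a discretely observed path; for nonlinear drift or diffusion the same construction is used with the approximate moments the paper derives.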
No. 64   (Download full text)
Ralf Becker and Adam Clements
Volatility and the role of order book structure
There is much literature that deals with modeling and forecasting asset return volatility. However, much of this research does not attempt to explain variations in the level of volatility. Movements in volatility are often linked to trading volume or frequency, as a reflection of underlying information flow. This paper considers whether the state of an open limit order book influences volatility. It is found that market depth and order imbalance do influence volatility, even in the presence of the traditional volume related variables.
JEL-Codes: G10;G12
Keywords: Realized volatility, bi-power variation, limit order book, market microstructure, order imbalance
No. 63   (Download full text)
Adrian Pagan
Can Turkish Recessions be Predicted?
In response to the widespread criticism that macro-economists failed to predict the global recession coming from the GFC, we look at whether recessions in Turkey can be predicted. Because the growth in Turkish GDP is quite persistent, one might expect this to be possible. But it is the sign of GDP growth that needs to be forecast if we are to predict a recession, and this is made more difficult by the persistence in GDP growth. We build a small SVAR model of the Turkish economy that is motivated by New Keynesian models of the open economy, and find that using the variables entering it increases predictive success, although it is still the case that the predictive record is not good. Non-linear models for Turkish growth are then found to add little to predictive ability. Fundamentally, recession prediction requires one to forecast future shocks to the economy, and thus one needs some indicators of these. The paper explores a range of indicators for the Turkish economy, but none are particularly advantageous. Developing a bigger range of these indicators should be a priority for future Turkish macro-economic research.
Keywords: Business cycles, binary models, predicting recessions
No. 62   (Download full text)
Lionel Page and Katie Page
Evidence of referees' national favouritism in rugby
The present article reports evidence of national favouritism by professional referees in two major sports: Rugby League and Rugby Union. National favouritism can appear when a referee is in charge of a match where one team (and only one) is from his country. Because of this risk, such situations are avoided in most major sports. Here we study two specific competitions that depart from this "national neutrality" rule: the European Super League in Rugby League (and its second-tier competition) and the Super 14 in Rugby Union. In both cases we find strong evidence that referees favour teams of their own nationality, in a way that has a large influence on match results.
For these two major competitions, the Super League and the Super 14, we compare how a team performs when the referee shares its nationality with how it performs when the referee comes from a different country. We also analyse referees' decisions within matches (such as penalty and try decisions) in a Rugby League competition, the Championship (the second tier below the Super League). In both Rugby League and Rugby Union we find strong evidence of national favouritism.
Keywords: Rugby league; Rugby Union; favouritism
No. 61   (Download full text)
Nicholas King, P Dorian Owen and Rick Audas
Playoff Uncertainty, Match Uncertainty and Attendance at Australian National Rugby League Matches
This paper develops a new simulation-based measure of playoff uncertainty and investigates its contribution to modelling match attendance compared to other variants of playoff uncertainty in the existing literature. A model of match attendance that incorporates match uncertainty, playoff uncertainty, past home-team performance and other relevant control variables is fitted to Australian National Rugby League data for seasons 2004-2008 using fixed effects estimation. The results suggest that playoff uncertainty and home-team success are more important determinants of match attendance than match uncertainty. Alternative measures of playoff uncertainty based on points behind the leader, although more ad hoc, also appear able to capture the effects of playoff uncertainty.
JEL-Codes: C23; L83
Keywords: playoff uncertainty, match uncertainty, sports league attendance, Australian National Rugby League, fixed effects estimation
No. 60   (Download full text)
Ralf Becker, Adam Clements and Robert O'Neill
A Cholesky-MIDAS model for predicting stock portfolio volatility
This paper presents a simple forecasting technique for variance covariance matrices. It relies significantly on the contribution of Chiriac and Voev (2010), who propose forecasting the elements of the Cholesky decomposition, which recombine to form a positive definite forecast of the variance covariance matrix. The method proposed here combines this approach with advances made in the MIDAS literature to produce a forecasting methodology that is flexible, scales easily with the size of the portfolio, and produces superior forecasts in simulation experiments and an empirical application.
JEL-Codes: C22; G00
Keywords: Cholesky, Midas, volatility forecasts
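The recombination step described in the abstract above can be sketched in a few lines. This is an illustrative reconstruction rather than the authors' code, and the two-asset Cholesky factor `L` below is a hypothetical forecast:

```python
def recombine(chol):
    """Multiply a lower-triangular Cholesky factor by its transpose,
    L L', giving a symmetric positive definite covariance matrix."""
    n = len(chol)
    return [[sum(chol[i][k] * chol[j][k] for k in range(n))
             for j in range(n)] for i in range(n)]

# hypothetical two-asset forecast of the Cholesky elements
L = [[0.02, 0.0],
     [0.01, 0.015]]
S = recombine(L)
print(S[0][1] == S[1][0])  # symmetric by construction -> True
```

Because any L L' product is positive definite (for a factor with nonzero diagonal), forecasting the Cholesky elements sidesteps the need to constrain the covariance forecast directly.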
No. 59   (Download full text)
P Dorian Owen
Measuring Parity in Sports Leagues with Draws: Further Comments
This paper re-examines the calculation of the relative standard deviation (RSD) measure of competitive balance in leagues in which draws are possible outcomes. Some key conclusions emerging from the exchange between Cain and Haddock (2006) and Fort (2007) are reversed. For any given points assignment scheme, the RSD computed from absolute points is identical to the RSD computed from percentages of points. However, variations in the points assignment that change the ratio of points for a win to points for a draw do result in different RSD values, although the numerical differences are minor.
JEL-Codes: D63;L83
Keywords: sports economics, competitive balance, relative standard deviation, idealized standard deviation, draws/ties
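The RSD measure discussed above has a standard form: the actual standard deviation of win percentages divided by the idealised standard deviation 0.5/sqrt(games) of a perfectly balanced league. A minimal sketch with a hypothetical four-team league (the function name and figures are illustrative, not taken from the paper):

```python
import math

def rsd(win_pcts, games_per_team):
    """Relative standard deviation: actual SD of win percentages
    divided by the idealised SD of a perfectly balanced league."""
    n = len(win_pcts)
    mean = sum(win_pcts) / n
    actual_sd = math.sqrt(sum((w - mean) ** 2 for w in win_pcts) / n)
    idealised_sd = 0.5 / math.sqrt(games_per_team)
    return actual_sd / idealised_sd

# hypothetical 4-team league, 12 games per team
print(round(rsd([0.75, 0.55, 0.45, 0.25], 12), 3))  # -> 1.249
```

A value of 1 indicates the dispersion expected of an ideally balanced league; larger values indicate less competitive balance.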
No. 58   (Download full text)
Don Harding
Applying shape and phase restrictions in generalized dynamic categorical models of the business cycle
To match the NBER business cycle features it is necessary to employ generalized dynamic categorical (GDC) models that impose certain phase restrictions and permit multiple indexes. Theory suggests additional shape restrictions in the form of monotonicity and boundedness of certain transition probabilities. Maximum likelihood and constraint-weighted bootstrap estimators are developed to impose these restrictions. In the application, these estimators generate improved estimates of how the probability of recession varies with the yield spread.
JEL-Codes: C22;C53;E32;E37
Keywords: Generalized dynamic categorical model, Business cycle; binary variable, Markov process, probit model, yield curve
No. 57   (Download full text)
Renee Fry and Adrian Pagan
Sign Restrictions in Structural Vector Autoregressions: A Critical Review
The paper provides a review of the estimation of structural VARs with sign restrictions. It is shown how sign restrictions solve the parametric identification problem present in structural systems but leave the model identification problem unresolved. A market and a macro model are used to illustrate these points. Suggestions have been made on how to find a unique model. These are reviewed, along with some of the difficulties that can arise in how one is to use the impulse responses found with sign restrictions.
JEL-Codes: E32;C51;C32
Keywords: Structural Vector Autoregressions, New Keynesian Model, Sign Restrictions
No. 56   (Download full text)
Mardi Dungey and Lyudmyla Hvozdyk
Cojumping: Evidence from the US Treasury Bond and Futures Markets
The basis between spot and futures prices will be affected by jump behavior in each asset price, challenging intraday hedging strategies. Using a formal cojumping test, this paper considers the cojumping behavior of spot and futures prices in high frequency US Treasury data. Cojumping occurs most frequently at shorter maturities and higher sampling frequencies. We find that the presence of an anticipated macroeconomic news announcement, and particularly non-farm payrolls, increases the probability of observing cojumps. However, a negative surprise in non-farm payrolls also increases the probability of the cojumping tests being unable to determine whether jumps in spot and futures prices occur contemporaneously, or alternatively whether one market follows the other. On these occasions the market does not clearly signal its short term pricing behavior.
JEL-Codes: C1; C32; G14
Keywords: US Treasury markets, high frequency data, cojump test
No. 55   (Download full text)
Martin G. Kocher, Marc V. Lenz and Matthias Sutter
Psychological pressure in competitive environments: Evidence from a randomized natural experiment: Comment
Apesteguia and Palacios-Huerta (forthcoming) report for a sample of 129 shootouts from various seasons in ten different competitions that teams kicking first in soccer penalty shootouts win significantly more often than teams kicking second. Collecting data for the entire history of six major soccer competitions we cannot replicate their result. Teams kicking first win only 53.4% of 262 shootouts in our data, which is not significantly different from random. Our findings have two implications: (1) Apesteguia and Palacios-Huerta's results are not generally robust. (2) Using specific subsamples without a coherent criterion for data selection might lead to non-representative findings.
JEL-Codes: C93
Keywords: Tournament, first-mover advantage, psychological pressure, field experiment, soccer, penalty shootouts
No. 54   (Download full text)
Adam Clements and Annastiina Silvennoinen
Portfolio allocation: Getting the most out of realised volatility
Recent advances in the measurement of volatility have utilized high frequency intraday data to produce what are generally known as realised volatility estimates. It has been shown that forecasts generated from such estimates are of positive economic value in the context of portfolio allocation. This paper considers the link between the value of such forecasts and the loss function under which models of realised volatility are estimated. It is found that employing a utility based estimation criterion is preferred to likelihood estimation; however, a simple mean squared error criterion performs in a similar manner. These findings have obvious implications for the manner in which volatility models based on realised volatility are estimated when one wishes to inform the portfolio allocation decision.
JEL-Codes: C22; G11;G17
Keywords: Volatility, utility, portfolio allocation, realized volatility, MIDAS
No. 53   (Download full text)
Luis Catão and Adrian Pagan
The Credit Channel and Monetary Transmission in Brazil and Chile: A Structured VAR Approach
We use an expectation-augmented SVAR representation of an open economy New Keynesian model to study monetary transmission in Brazil and Chile. The underlying structural model incorporates key structural features of Emerging Market economies, notably the role of a bank-credit channel. We find that interest rate changes have swifter effects on output and inflation in both countries compared to advanced economies and that exchange rate dynamics play an important role in monetary transmission, as currency movements are highly responsive to changes in policy-controlled interest rates. We also find that credit shocks of typical size have large effects on output and inflation in the two economies, with the effects being stronger in Chile, where bank penetration is higher.
JEL-Codes: C51; E31; E52
Keywords: Monetary Policy, Bank Credit, VAR, Brazil, Chile
No. 52   (Download full text)
Vlad Pavlov and Stan Hurn
Testing the Profitability of Technical Analysis as a Portfolio Selection Strategy
One of the main difficulties in evaluating the profits obtained using technical analysis is that trading rules are often specified rather vaguely by practitioners and depend upon the judicious choice of rule parameters. In this paper, popular moving-average (or cross-over) rules are applied to a cross-section of Australian stocks and the signals from the rules are used to form portfolios. The performance of the trading rules across the full range of possible parameter values is evaluated by means of an aggregate test that does not depend on the parameters of the rules. The results indicate that for a wide range of parameters moving-average rules generate contrarian profits (profits from the moving-average rules are negative). In bootstrap simulations the returns statistics are significant, indicating that the moving-average rules pick up some form of systematic variation in returns that does not correlate with the standard risk factors.
JEL-Codes: C22; C53;Q49
Keywords: Stock returns, Technical analysis, Momentum trading rules, Bootstrapping.
No. 51   (Download full text)
Sue Bridgewater, Lawrence M. Kahn and Amanda H. Goodall
Substitution Between Managers and Subordinates: Evidence from British Football
We use data on British football managers and teams over the 1994-2007 period to study substitution and complementarity between leaders and subordinates. We find for the Premier League (the highest level of competition) that, other things being equal, managers who themselves played at a higher level raise the productivity of less-skilled teams by more than that of highly skilled teams. This is consistent with the hypothesis that one function of a top manager is to communicate to subordinates the skills needed to succeed, since less skilled players have more to learn. We also find that managers with more accumulated professional managing experience raise the productivity of talented players by more than that of less-talented players. This is consistent with the hypothesis that a further function of successful managers in high-performance workplaces is to manage the egos of elite workers. Such a function is likely more important the more accomplished the workers are -- as indicated, in our data, by teams with greater payrolls.
JEL-Codes: J24; M51
Keywords: Productivity, leadership
No. 50   (Download full text)
Martin Fukac and Adrian Pagan
Structural Macro-Econometric Modelling in a Policy Environment
The paper looks at the development of macroeconometric models over the past sixty years, in particular those that have been used for analysing policy options. We argue that there have been four generations of these. Each generation has evolved new features that have been partly drawn from the developing academic literature and partly from the perceived weaknesses of the previous generation. Overall the evolution has been governed by a desire to answer a set of basic questions and sometimes by what can be achieved using new computational methods. Our account of each generation considers its design, the way in which parameters were quantified, and how the models were evaluated.
JEL-Codes: E12;E13;C51;C52
Keywords: DSGE models;Phillips Curve;Macroeconometric Models;Bayesian Estimation
No. 49   (Download full text)
Tim M Christensen, Stan Hurn and Adrian Pagan
Detecting Common Dynamics in Transitory Components
This paper considers VAR/VECM models for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration reduces the rank of the long-run multiplier matrix, other types of common features lead to rank reduction in the short-run dynamics. These common transitory components arise when linear combinations of the first-differenced variables in a cointegrated VAR are white noise. This paper offers a reinterpretation of the traditional approach to testing for common feature dynamics, namely checking for a singular covariance matrix for the transitory components. Instead, the matrix of short-run coefficients becomes the focus of the testing procedure, thus allowing a wide range of tests for reduced rank in parameter matrices to be potentially relevant tests of common transitory components. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to reexamine an existing empirical study. Finally, this approach is applied to analyze whether one would observe common dynamics in standard DSGE models.
JEL-Codes: C14; C52.
Keywords: Transitory components, common features, reduced rank, cointegration.
No. 48   (Download full text)
Egon Franck, Erwin Verbeek and Stephan Nuesch
Inter-market Arbitrage in Sports Betting
Unlike the existing literature on sports betting, which concentrates on arbitrage within a single market, this paper examines inter-market arbitrage by searching for arbitrage opportunities through combining bets at the bookmaker and the exchange market. Using the posted odds of eight different bookmakers and the corresponding odds traded at a well-known bet exchange for 5,478 football matches played in the top-five European leagues during three seasons, we find (only) ten intra-market arbitrage opportunities. However, we find 1,450 cases in which a combined bet at the bookmaker as well as at the exchange yields a guaranteed positive return. Further analyses reveal that inter-market arbitrage emerges from different levels of informational efficiency between the two markets.
Keywords: sports betting, inter-market arbitrage
No. 47   (Download full text)
Raul Caruso
Relational Good at Work! Crime and Sport Participation in Italy. Evidence from Panel Data Regional Analysis over the Period 1997-2003.
What is the broad impact of sport participation and sport activities in a society? The first aim of this paper is to tackle this crucial point by studying whether or not there is a relationship between sport participation and crime. A panel dataset has been constructed for the twenty Italian regions over the period 1997-2003, and the impact of sport participation on different types of crime has been studied. Results show that: (i) there is a robust negative association between sport participation and property crime; (ii) there is a robust negative association between sport participation and juvenile crime; (iii) there is a positive association between sport participation and violent crime, but it is only weakly significant.
JEL-Codes: L83; D62
Keywords: Sport participation, relational goods, crime, Kenneth Boulding
No. 46   (Download full text) (Accepted)
Peter Dawson and Stephen Dobson
The Influence of Social Pressure and Nationality on Individual Decisions: Evidence from the Behaviour of Referees
This study considers the influences on agents’ decisions in an international context. Using data from five seasons of European cup football matches it is found that referees favour home teams when awarding yellow and red cards. Previous research on referee decisions in national leagues has identified social pressure as a key reason for favouritism. While social pressure is also found to be an important influence in this study, the international setting shows that nationality is another important influence on the decision-making of referees. In considering principal-agent relationships account needs to be taken not only of how agents (referees) decide under social pressure but also of how national identity shapes agents’ decision making.
JEL-Codes: D81; L83
Keywords: social pressure, nationality, decision-making, referee home bias, football
No. 45   (Download full text)
Ralf Becker, Adam Clements and Christopher Coleman-Fenn
Forecast performance of implied volatility and the impact of the volatility risk premium
Forecasting volatility has received a great deal of research attention, with the relative performance of econometric models based on time-series data and option implied volatility forecasts often being considered. While many studies find that implied volatility is the preferred approach, a number of issues remain unresolved. Implied volatilities are risk-neutral forecasts of spot volatility, whereas time-series models are estimated on risk-adjusted or real world data of the underlying. Recently, an intuitive method has been proposed to adjust these risk-neutral forecasts into their risk-adjusted equivalents, possibly improving on their forecast accuracy. By utilising recent econometric advances, this paper considers whether these risk-adjusted forecasts are statistically superior to the unadjusted forecasts, as well as to a wide range of model based forecasts. It is found that an unadjusted risk-neutral implied volatility is an inferior forecast. However, after adjusting for the risk premium it is of equal predictive accuracy to a number of model based forecasts.
JEL-Codes: C12;C22;G00
Keywords: Implied volatility, volatility forecasts, volatility models, volatility risk premium, model confidence sets
No. 44   (Download full text)
Adam Clements and Annastiina Silvennoinen
On the economic benefit of utility based estimation of a volatility model
Forecasts of asset return volatility are necessary for many financial applications, including portfolio allocation. Traditionally, the parameters of econometric models used to generate volatility forecasts are estimated in a statistical setting and subsequently used in an economic setting such as portfolio allocation. Differences in the criteria under which the model is estimated and applied may reduce the overall economic benefit of a model in the context of portfolio allocation. This paper investigates the economic benefit of direct utility based estimation of the parameters of a volatility model and allows for practical issues such as transactions costs to be incorporated within the estimation scheme. In doing so, we compare the benefits stemming from various estimators of historical volatility in the context of portfolio allocation. It is found that maximal utility based estimation of a simple volatility model, taking into account transactions costs, is preferred on the basis of greater realized utility. Estimation of models using historical daily returns is preferred over estimation using historical realized volatility.
JEL-Codes: C10;C22;G11;G17
Keywords: Volatility, utility, portfolio allocation, realized volatility, MIDAS
No. 43   (Download full text)
Adam Clements and Ralf Becker
A nonparametric approach to forecasting realized volatility
A well developed literature exists in relation to modeling and forecasting asset return volatility. Much of this relates to the development of time series models of volatility. This paper proposes an alternative method for forecasting volatility that does not involve such a model. Under this approach a forecast is a weighted average of historical volatility. The greatest weight is given to periods that exhibit the most similar market conditions to the time at which the forecast is being formed. Weighting occurs by comparing short-term trends in volatility across time (as a measure of market conditions) through the application of a multivariate kernel scheme. It is found that at a one-day forecast horizon, the proposed method produces forecasts that are significantly more accurate than competing approaches.
JEL-Codes: C22; G00
Keywords: Volatility, forecasts, forecast evaluation, model confidence set, nonparametric
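The weighting idea in the abstract above can be illustrated roughly as follows. This is a stylised univariate sketch with a Gaussian kernel; the parameter names (`window`, `bandwidth`) are assumptions for illustration, not the paper's multivariate implementation:

```python
import math

def kernel_forecast(history, window=5, bandwidth=0.1):
    """Forecast next-period volatility as a kernel-weighted average of
    past volatilities, giving most weight to dates whose preceding
    short-term volatility trend most resembles the current one."""
    current = history[-window:]                  # current market conditions
    weights, values = [], []
    for t in range(window, len(history) - 1):
        past = history[t - window:t]             # conditions before date t
        dist = sum((a - b) ** 2 for a, b in zip(past, current))
        weights.append(math.exp(-dist / (2 * bandwidth ** 2)))
        values.append(history[t])                # volatility realised at date t
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# flat history: every past pattern matches, so the forecast is the constant
print(round(kernel_forecast([0.2] * 10), 6))  # -> 0.2
```

No parametric time-series model is estimated; the forecast is driven entirely by similarity between historical and current volatility patterns.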
No. 42   (Download full text)
Uwe Dulleck, Rudolf Kerschbamer and Matthias Sutter
The Economics of Credence Goods: On the Role of Liability, Verifiability, Reputation and Competition
Credence goods markets are characterized by asymmetric information between sellers and consumers that may give rise to inefficiencies, such as under- and overtreatment or market break-down. We study the determinants of efficiency in credence goods markets in a large experiment with 936 participants. While theory predicts that either liability or verifiability yields efficiency, we find that liability has a crucial effect but verifiability only a minor one. Allowing sellers to build up reputation has little influence, as predicted. Seller competition drives down prices and yields maximal trade, but does not lead to higher efficiency as long as liability is violated.
No. 41   (Download full text)
Adam Clements, Mark Doolan, Stan Hurn and Ralf Becker
Evaluating multivariate volatility forecasts
The performance of techniques for evaluating univariate volatility forecasts is well understood. In the multivariate setting, however, the efficacy of the evaluation techniques is less well developed. Multivariate forecasts are often evaluated within an economic application such as a portfolio optimisation context. This paper aims to evaluate the efficacy of such techniques, along with traditional statistical methods. It is found that utility based methods perform poorly in terms of identifying optimal forecasts, whereas statistical methods are more effective.
JEL-Codes: C22; G00
Keywords: Multivariate volatility, forecasts, forecast evaluation, Model confidence set
No. 40   (Download full text) (forthcoming)
Lawrence M. Kahn
The Economics of Discrimination: Evidence from Basketball
This Chapter reviews evidence on discrimination in basketball, primarily examining studies on race but with some discussion of gender as well. I focus on discrimination in pay, hiring, and retention against black NBA players and coaches and pay disparities by gender among college coaches. There was much evidence for each of these forms of discrimination against black NBA players in the 1980s. However, there appears to be less evidence of racial compensation, hiring and retention discrimination against black players in the 1990s and early 2000s than the 1980s. This apparent decline is consistent with research on customer discrimination in the NBA: in the 1980s, there was abundant evidence of fan preference for white players; however, since the 1980s, these preferences seem much weaker. There appears to be little evidence of pay, hiring or retention discrimination against black NBA coaches, and while male college basketball coaches outearn females, this gap is accounted for by differences in revenues and coaches' work histories. There is some dispute over whether these revenue differences are themselves the result of employer discrimination.
JEL-Codes: J71; L83.
Keywords: discrimination, race, gender, basketball
No. 39   (Download full text) (Accepted)
Don Harding and Adrian Pagan
An Econometric Analysis of Some Models for Constructed Binary Time Series
Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ in terms of their statistical properties from the primary random variables used in micro-econometric studies. One important difference between primary and secondary binary variables is that, while the former are, in many instances, independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing the binary states interact with the stochastic processes of the variables they are constructed from, so that the binary states need to be treated as Markov processes. Consequently, one needs to recognize this when performing analyses with the binary variables, and it is not valid to adopt a model like static Probit which fails to recognize such dependence. Moreover, these binary variables are often censored, in that they are constructed in such a way as to result in sequences of them possessing the same sign. Such censoring imposes restrictions upon the DGP of the binary states and creates difficulties if one tries to utilize a dynamic Probit model with them. Given this, we describe methods for modeling with these variables that both respect their Markov process nature and explicitly deal with any censoring constraints. An application is provided that investigates the relation between the business cycle and the yield spread.
JEL-Codes: C22; C53; E32; E37
Keywords: Business cycle; binary variable, Markov process, Probit model, yield curve
No. 38   (Download full text)
Richard Dennis
Timeless Perspective Policymaking: When is Discretion Superior?
In this paper I show that discretionary policymaking can be superior to timeless perspective policymaking and identify model features that make this outcome more likely. Developing a measure of conditional loss that treats the auxiliary state variables that characterize the timeless perspective equilibrium appropriately, I use a New Keynesian DSGE model to show that discretion can dominate timeless perspective policymaking when the Phillips curve is relatively flat, due, perhaps, to firm-specific capital (or labor) and/or Kimball (1995) aggregation in combination with nominal price rigidity. These results suggest that studies applying the timeless perspective might also usefully compare its performance to discretion, paying careful attention to how policy performance is evaluated.
JEL-Codes: C61; E52; E58.
Keywords: Discretion, timeless perspective, policy evaluation.
No. 37   (Download full text)
Paul Frijters, Amy Y.C. Liu and Xin Meng
Are optimistic expectations keeping the Chinese happy?
In this paper we study the effect of optimistic income expectations on life satisfaction amongst the Chinese population. Using a large scale household survey conducted in 2002, we find that the level of optimism about the future is particularly strong in the countryside and amongst rural-to-urban migrants. The importance of these expectations for life satisfaction is particularly pronounced in the urban areas, though also highly significant for the rural areas. If expectations were to reverse from positive to negative, we calculate that this would double the proportion of unhappy people and reduce the proportion of very happy people by 48%. We perform several robustness checks to see if the results are driven by variations in precautionary savings or reverse causality.
JEL-Codes: C35; D63; D91; P2
Keywords: Expectations; Happiness; Consumption and Savings; China; Political Economy
No. 36   (Download full text)
Benno Torgler, Markus Schaffner, Bruno S. Frey, Sascha L. Schmidt and Uwe Dulleck
Inequality Aversion and Performance in and on the Field
The experimental literature and studies using survey data have established that people care a great deal about their relative economic position and not solely, as standard economic theory assumes, about their absolute economic position. Individuals are concerned about social comparisons. However, behavioral evidence in the field is rare. This paper provides an empirical analysis testing the model of inequality aversion using two unique panel data sets for basketball and soccer players. We find support for the notion that inequality aversion helps to explain how the relative income situation affects performance in a real competitive environment with real tasks and real incentives.
JEL-Codes: D000; D600; 8222; 9210; L830
Keywords: Inequality aversion, relative income, positional concerns, envy, social comparison, performance, interdependent preferences
No. 35   (Download full text)
T M Christensen, A. S. Hurn and K A Lindsay
Discrete time-series models when counts are unobservable
Count data in economics have traditionally been modeled by means of integer-valued autoregressive models. Consequently, the estimation of the parameters of these models and their asymptotic properties have been well documented in the literature. The models comprise a description of the survival of counts generally in terms of a binomial thinning process and an independent arrivals process usually specified in terms of a Poisson distribution. This paper extends the existing class of models to encompass situations in which counts are latent and all that is observed is the presence or absence of counts. This is a potentially important modification as many interesting economic phenomena may have a natural interpretation as a series of 'events' that are driven by an underlying count process which is unobserved. Arrivals of the latent counts are modeled either in terms of the Poisson distribution, where multiple counts may arrive in the sampling interval, or in terms of the Bernoulli distribution, where only one new arrival is allowed in the same sampling interval. The models with latent counts are then applied in two practical illustrations, namely, modeling volatility in financial markets as a function of unobservable 'news' and abnormal price spikes in electricity markets being driven by latent 'stress'.
JEL-Codes: C13; C25; C32.
Keywords: Integer-valued autoregression, Poisson distribution, Bernoulli distribution, latent factors, maximum likelihood estimation
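The observable-count benchmark that the paper above extends — an integer-valued autoregression with binomial thinning of survivors plus independent Poisson arrivals — can be simulated in a few lines. This sketch is illustrative only; it uses Knuth's multiplication method for Poisson draws since the Python standard library has no Poisson sampler, and all parameter values are hypothetical:

```python
import math
import random

def poisson(lam, rng):
    """Draw from Poisson(lam) via Knuth's multiplication method."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def inar1_path(alpha, lam, x0, n, rng):
    """Simulate an INAR(1) count series: each period keeps last period's
    counts independently with probability alpha (binomial thinning) and
    adds Poisson(lam) new arrivals."""
    path = [x0]
    for _ in range(n):
        survivors = sum(rng.random() < alpha for _ in range(path[-1]))
        path.append(survivors + poisson(lam, rng))
    return path

path = inar1_path(0.5, 1.0, 3, 200, random.Random(42))
print(min(path) >= 0)  # counts can never go negative -> True
```

In the latent-count setting of the paper, only an indicator of whether the count is positive would be observed, not `path` itself.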
No. 34   (Download full text)
Adam Clements, A S Hurn and K A Lindsay
Developing analytical distributions for temperature indices for the purposes of pricing temperature-based weather derivatives
Temperature-based weather derivatives are written on an index which is normally defined to be a nonlinear function of average daily temperatures. Recent empirical work has demonstrated the usefulness of simple time-series models of temperature for estimating the payoffs to these instruments. This paper develops analytical distributions of temperature indices on which temperature derivatives are written. If deviations of daily temperature from its expected value are modelled as an Ornstein-Uhlenbeck process with time-varying variance, then the distribution of the temperature index on which the derivative is written is the sum of truncated, correlated Gaussian deviates. The key result of this paper is to provide an analytical approximation to the distribution of this sum, thus allowing the accurate computation of payoffs without the need for any simulation. A data set comprising average daily temperature spanning over a hundred years for four Australian cities is used to demonstrate the efficacy of this approach for estimating the payoffs to temperature derivatives. It is demonstrated that computing expected payoffs directly from historical records is a particularly poor approach to the problem when there are trends in underlying average daily temperature. It is shown that the proposed analytical approach is superior to historical pricing.
JEL-Codes: C14, C52.
Keywords: Weather Derivatives, Temperature Models, Cooling Degree Days, Maximum Likelihood Estimation, Distribution for Correlated Variables
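The index construction underlying these derivatives can be illustrated by Monte Carlo. The sketch below is not the paper's analytical approximation; it simply simulates daily temperature as a seasonal mean plus a discretised Ornstein-Uhlenbeck deviation with time-varying variance, accumulates a cooling-degree-day index, and evaluates a call payoff on it. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (not taken from the paper)
kappa, base = 0.3, 18.0          # mean-reversion speed; CDD base temperature
days = np.arange(90)             # a 90-day contract period
mean_temp = 22.0 + 4.0 * np.sin(2 * np.pi * days / 365)   # seasonal mean
sigma = 2.5 + 0.5 * np.cos(2 * np.pi * days / 365)        # time-varying vol

def simulate_cdd(n_paths):
    """Simulate daily temperature as seasonal mean plus a discretised
    Ornstein-Uhlenbeck deviation, then sum cooling degree days
    CDD = sum_t max(T_t - base, 0) over the contract period."""
    cdd = np.zeros(n_paths)
    for i in range(n_paths):
        dev, total = 0.0, 0.0
        for t in range(len(days)):
            dev += -kappa * dev + sigma[t] * rng.standard_normal()
            total += max(mean_temp[t] + dev - base, 0.0)
        cdd[i] = total
    return cdd

cdd = simulate_cdd(2_000)
strike = 400.0
payoff = np.maximum(cdd - strike, 0.0)   # European call on the CDD index
print(cdd.mean(), payoff.mean())
```

Because each day's contribution is a truncated Gaussian correlated with its neighbours, the simulated index distribution is exactly the sum the paper approximates analytically.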
No. 33   (Download full text)
Adam Clements, A S Hurn and K A Lindsay
Estimating the Payoffs of Temperature-based Weather Derivatives
Temperature-based weather derivatives are written on an index which is normally defined to be a nonlinear function of average daily temperatures. Recent empirical work has demonstrated the usefulness of simple time-series models of temperature for estimating the payoffs to these instruments. This paper argues that a more direct and parsimonious approach is to model the time-series behaviour of the index itself, provided a sufficiently rich supply of historical data is available. A data set comprising average daily temperature spanning over a hundred years for four Australian cities is assembled. The data is then used to compare the actual payoffs of temperature-based European call options with the expected payoffs computed from historical temperature records and two time-series approaches. It is concluded that expected payoffs computed directly from historical records perform poorly by comparison with the expected payoffs generated by means of competing time-series models. It is also found that modeling the relevant temperature index directly is superior to modeling average daily temperatures.
JEL-Codes: C14;C52.
Keywords: Temperature, Weather Derivatives, Cooling Degree Days, Time-series Models.
No. 32   (Download full text)
T M Christensen, A S Hurn and K A Lindsay
The Devil is in the Detail: Hints for Practical Optimisation
Finding the minimum of an objective function, such as a least squares or negative log-likelihood function, with respect to the unknown model parameters is a problem often encountered in econometrics. Consequently, students of econometrics and applied econometricians are usually well-grounded in the broad differences between the numerical procedures employed to solve these problems. Often, however, relatively little time is given to understanding the practical subtleties of implementing these schemes when faced with ill-behaved problems. This paper addresses some of the details involved in practical optimisation, such as dealing with constraints on the parameters, specifying starting values, termination criteria and analytical gradients, and illustrates some of the general ideas with several instructive examples.
JEL-Codes: C13, C63
Keywords: gradient algorithms, unconstrained optimisation, generalised method of moments.
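One of the practical details discussed here, imposing constraints on the parameters, is often handled by transforming the constrained parameter so that an unconstrained optimiser can be used. The sketch below is a generic illustration of that idea, not an example from the paper: a Gaussian log-likelihood is maximised over an unconstrained log-standard-deviation, so the positivity constraint never binds the optimiser.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
data = rng.normal(loc=1.5, scale=2.0, size=5_000)

def neg_loglik(theta):
    """Gaussian negative log-likelihood with sigma = exp(theta[1]),
    so the constraint sigma > 0 is imposed by transformation rather
    than by a constrained optimiser."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    return 0.5 * np.sum(np.log(2 * np.pi * sigma**2)
                        + (data - mu) ** 2 / sigma**2)

res = minimize(neg_loglik, x0=np.array([0.0, 0.0]), method="BFGS")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)
```

With 5,000 observations the estimates should land close to the true values of 1.5 and 2.0; note also that the choice of starting values (here zeros) is itself one of the practical subtleties the paper addresses.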
No. 31   (Download full text)
Uwe Dulleck, Franz Hackl, Bernhard Weiss and Rudolf Winter-Ebmer
Buying Online: Sequential Decision Making by Shopbot Visitors
In this article we propose a two stage procedure to model demand decisions by customers who are balancing several dimensions of a product. We then test our procedure by analyzing the behavior of buyers from an Austrian price comparison site. Although in such a market a consumer will typically search for the cheapest price for a given product, reliability and service of the supplier are other important characteristics of a retailer. In our data, consumers follow such a two stage procedure: they select a shortlist of suppliers using the price variable only; they then trade off reliability and price among these shortlisted suppliers.
JEL-Codes: L81, D83.
Keywords: e-commerce, price comparison, decision theory, heuristics, seller reputation
No. 30   (Download full text)
Richard Dennis
Model Uncertainty and Monetary Policy
Model uncertainty has the potential to change importantly how monetary policy should be conducted, making it an issue that central banks cannot ignore. In this paper, I use a standard new Keynesian business cycle model to analyze the behavior of a central bank that conducts policy with discretion while fearing that its model is misspecified. My main results are as follows. First, policy performance can be improved if the discretionary central bank implements a robust policy. This important result is obtained because the central bank's desire for robustness directs it to assertively stabilize inflation, thereby mitigating the stabilization bias associated with discretionary policymaking. In effect, a fear of model uncertainty can act similarly to a commitment mechanism. Second, exploiting the connection between robust control and uncertainty aversion, I show that the central bank's fear of model misspecification leads it to forecast future outcomes under the belief that inflation (in particular) will be persistent and have large unconditional variance, raising the probability of extreme outcomes. Private agents, however, anticipating the policy response, make decisions under the belief that inflation will be more closely stabilized, that is, more tightly distributed, than under rational expectations. Third, as a technical contribution, I show how to solve an important class of linear-quadratic robust Markov-perfect Stackelberg problems.
JEL-Codes: E52; E62; C61.
Keywords: Model uncertainty, robustness, uncertainty aversion, time-consistency.
No. 29   (Download full text)
Richard Dennis
The Frequency of Price Adjustment and New Keynesian Business Cycle Dynamics
The Calvo pricing model that lies at the heart of many New Keynesian business cycle models has been roundly criticized for being inconsistent both with time series data on inflation and with micro-data on the frequency of price changes. In this paper I develop a new pricing model whose structure can be interpreted in terms of menu costs and information gathering/processing costs, that usefully addresses both criticisms. The resulting Phillips curve encompasses the partial-indexation model, the full-indexation model, and the Calvo model, and can speak to micro-data in ways that these models cannot. Taking the Phillips curve to the data, I find that the share of firms that change prices each quarter is about 60 percent and, reflecting the importance of information gathering/processing costs, that most firms that change prices use indexation. Exploiting an isomorphism result, I show that these values are consistent with estimates implied by the partial-indexation model.
JEL-Codes: C11, C52, E31, E52.
Keywords: Price adjustment, inflation indexation, Bayesian estimation.
No. 28   (Download full text)
Paul Frijters and Aydogan Ulker
Robustness in Health Research: Do differences in health measures, techniques, and time frame matter?
Survey-based health research is in a boom phase following an increased amount of health spending in OECD countries and the interest in ageing. A general characteristic of survey-based health research is its diversity. Different studies are based on different health questions in different datasets; they use different statistical techniques; they differ in whether they approach health from an ordinal or cardinal perspective; and they differ in whether they measure short-term or long-term effects. The question in this paper is simple: do these differences matter for the findings? We investigate the effects of life-style choices (drinking, smoking, exercise) and income on six measures of health in the US Health and Retirement Study (HRS) between 1992 and 2002: (1) self-assessed general health status, (2) problems with undertaking daily tasks and chores, (3) mental health indicators, (4) BMI, (5) the presence of serious long-term health conditions, and (6) mortality. We compare ordinal models with cardinal models; we compare models with fixed effects to models without fixed-effects; and we compare short-term effects to long-term effects. We find considerable variation in the impact of different determinants on our chosen health outcome measures; we find that it matters whether ordinality or cardinality is assumed; we find substantial differences between estimates that account for fixed effects versus those that do not; and we find that short-run and long-run effects differ greatly. All this implies that health is an even more complicated notion than hitherto thought, defying generalizations from one measure to the others or one methodology to another.
JEL-Codes: Z1; C23; C25; I31
Keywords: Morbidity, Mortality, Lifestyle, Alcohol, Smoking, Exercise, Income
No. 27   (Download full text)
Paul Frijters, David W. Johnston, Manisha Shah and Michael A. Shields
Early Child Development and Maternal Labor Force Participation: Using Handedness as an Instrument
We estimate the effect of early child development on maternal labor force participation using data from teacher assessments. Mothers might react to having a poorly developing child by dropping out of the formal labor force in order to spend more time with their child, or they could potentially increase their labor supply to be able to provide the funds for better education and health resources. Which action dominates is therefore the empirical question we seek to answer in this paper. Importantly, we control for the potential endogeneity of child development by using an instrumental variables approach, uniquely exploiting exogenous variation in child development associated with child handedness. We find that having a poorly developing young child reduces the probability that a mother will participate in the labor market by about 25 percentage points.
JEL-Codes: J22; J13; C31
Keywords: Child Development, Maternal Labor Force Participation, Handedness
No. 26   (Download full text)
Paul Frijters and Tony Beatton
The mystery of the U-shaped relationship between happiness and age.
In this paper we address the puzzle of the relation between age and happiness. Whilst the majority of psychologists have concluded there is not much of a relationship at all, the economic literature has unearthed a possible U-shape relationship. In this paper we replicate the U-shape for the German Socio-Economic Panel (GSOEP), and we investigate several possible explanations for it.
JEL-Codes: C23; C25; I31.
Keywords: Happiness methodology, unobservables, latent variable
No. 25   (Download full text)
T M Christensen, A S Hurn and K A Lindsay
It never rains but it pours: Modelling the persistence of spikes in electricity prices
During periods of market stress, electricity prices can rise dramatically. This paper treats these abnormal episodes or price spikes as count events and attempts to build a model of the spiking process. In contrast to the existing literature, which either ignores temporal dependence in the spiking process or attempts to model the dependence solely in terms of deterministic variables (like seasonal and day-of-the-week effects), this paper argues that persistence in the spiking process is an important factor in building an effective model. A Poisson autoregressive framework is proposed in which price spikes occur as a result of the latent arrival and survival of system stresses. This formulation captures the salient features of the process adequately, and yields forecasts of price spikes that are superior to those obtained from naïve models which do not account for persistence in the spiking process.
JEL-Codes: C14, C52
Keywords: Electricity Prices, Extreme Events, Poisson Regressions, Poisson Autoregressive Model
No. 24   (Download full text)
Ralf Becker, Adam Clements and Andrew McClelland
The Jump component of S&P 500 volatility and the VIX index
Much research has investigated the differences between option implied volatilities and econometric model-based forecasts in terms of forecast accuracy and relative informational content. Implied volatility is a market determined forecast, in contrast to model-based forecasts that employ some degree of smoothing to generate forecasts. Therefore, implied volatility has the potential to reflect information that a model-based forecast could not. Specifically, this paper considers two issues relating to the informational content of the S&P 500 VIX implied volatility index. First, whether it subsumes information on how historical jump activity contributed to price volatility; and second, whether the VIX reflects any incremental information pertaining to future jumps relative to model-based forecasts. It is found that the VIX index both subsumes information relating to past jump contributions to volatility and reflects incremental information pertaining to future jump activity, relative to model-based forecasts. This is an issue that has not been examined previously in the literature and expands our understanding of how option markets form their volatility forecasts.
JEL-Codes: C12, C22, G00, G14
Keywords: Implied volatility, VIX, volatility forecasts, informational efficiency, jumps
No. 23   (Download full text)
A. S. Hurn and V.Pavlov
Momentum in Australian Stock Returns: An Update
It has been documented that a momentum investment strategy, based on buying past well-performing stocks while selling past losing stocks, is profitable in the Australian context, particularly in the 1990s. The aim of this short paper is to investigate whether or not this feature of Australian stock returns is still evident. The paper confirms the presence of a medium-term momentum effect, but also provides some interesting new evidence on the importance of the size effect on momentum.
JEL-Codes: G11, G12
Keywords: Stock returns, Momentum portfolios, Size effect
No. 22   (Download full text)
Mardi Dungey, George Milunovich and Susan Thorp
Unobservable Shocks as Carriers of Contagion: A Dynamic Analysis Using Identified Structural GARCH
Markets in financial crisis may experience heightened sensitivity to news from abroad and they may also spread turbulence into foreign markets, creating contagion. We use a structural GARCH model to separate and measure these two parts of crisis transmission. Unobservable structural shocks are named and linked to source markets using variance decompositions, allowing clearer interpretation of impulse response functions. Applying this method to data from the Asian crisis, we find significant contagion from Hong Kong to nearby markets but little heightened sensitivity. Impulse response functions for an equally-weighted equity portfolio show the increasing dominance of Korean and Hong Kong shocks during the crisis, whereas Indonesia's influence shrinks.
JEL-Codes: F37, C51
Keywords: Contagion, Structural GARCH
No. 21   (Download full text) (forthcoming)
Mardi Dungey and Adrian Pagan
Extending an SVAR Model of the Australian Economy
Dungey and Pagan (2000) present an SVAR model of the Australian economy which models macro-economic outcomes as transitory deviations from a deterministic trend. In this paper we extend that model in two directions. Firstly, we relate it to an emerging literature on DSGE modelling of small open economies. Secondly, we allow for both transitory and permanent components in the series and show how this modification has an impact upon the design of macroeconomic models.
No. 20   (Download full text)
Benno Torgler, Nemanja Antic and Uwe Dulleck
Mirror, Mirror on the Wall, who is the Happiest of Them All?
This paper turns Snow-White's magic mirror onto recent economics Nobel Prize winners, top economists and happiness researchers, and through the eyes of the "man in the street" seeks to determine who the happiest academic is. The study not only provides a clear answer to this question but also unveils who is the ladies' man and who is the sweetheart of the aged. It also explores the extent to which information matters and whether individuals' self-reported happiness affects their perceptions about the happiness of these superstars in economics.
JEL-Codes: A110, D100, I310
Keywords: happiness, subjective well-being, perceptions, superstars, economists
No. 19   (Download full text)
Justina AV Fischer and Benno Torgler
Social Capital And Relative Income Concerns: Evidence From 26 Countries
Research evidence on the impact of relative income position on individuals' attitudes and behaviour is sorely lacking. Therefore, using the International Social Survey Programme 1998 data from 26 countries this paper investigates the impact of relative income on 14 measurements of social capital. We find support for a considerable deleterious positional concern effect of persons below the reference income. This effect is more sizeable by far than the beneficial impact of a relative income advantage. Most of the results indicate that such an effect is non-linear. Lastly, changing the reference group (regional versus national) produces no significant differences in the results.
JEL-Codes: Z130; I300; D310
Keywords: Relative income, positional concerns, social capital, social norms, happiness.
No. 18   (Download full text)
Ralf Becker and Adam Clements
Forecasting stock market volatility conditional on macroeconomic conditions.
This paper presents a GARCH type volatility model with a time-varying unconditional volatility which is a function of macroeconomic information. It is an extension of the SPLINE GARCH model proposed by Engle and Rangel (2005). The advantage of the model proposed in this paper is that the macroeconomic information available (and/or forecasts) is used in the parameter estimation process. Based on an application of this model to S&P500 share index returns, it is demonstrated that forecasts of macroeconomic variables can be easily incorporated into volatility forecasts for share index returns. It transpires that the model proposed here can lead to significantly improved volatility forecasts compared to traditional GARCH type volatility models.
JEL-Codes: C12; C22; G00
Keywords: Volatility, macroeconomic data, forecast, spline, GARCH.
No. 17   (Download full text)
Ralf Becker and Adam Clements
Are combination forecasts of S&P 500 volatility statistically superior?
Forecasting volatility has received a great deal of research attention. Many articles have considered the relative performance of econometric model based and option implied volatility forecasts. While many studies have found that implied volatility is the preferred approach, a number of issues remain unresolved. One such issue is the relative merit of combination forecasts. By utilising recent econometric advances, this paper considers whether combination forecasts of S&P 500 volatility are statistically superior to a wide range of model based forecasts and implied volatility. It is found that combination forecasts are the dominant approach, indicating that the VIX cannot simply be viewed as a combination of various model based forecasts.
JEL-Codes: C12; C22; G00
Keywords: Implied volatility, volatility forecasts, volatility models, realized volatility, combination forecasts.
No. 16   (Download full text)
Uwe Dulleck and Neil Foster
Imported Equipment, Human Capital and Economic Growth in Developing Countries
De Long and Summers (1991) began a literature examining the impact of equipment investment on growth. In this paper we examine such a relationship for developing countries by considering imports of equipment from advanced countries as our measure of equipment investment for a sample of 55 developing countries. We examine whether the level of human capital in a country affects its ability to benefit from such investment. We find a complex interrelationship between imported equipment and human capital. Generally, the relationship between imported equipment and growth is lowest, and often negative, for countries with low levels of human capital, highest for countries within an intermediate range and somewhat in between for countries with the highest level of human capital.
JEL-Codes: F43; O15; O40
Keywords: Capital Goods Imports, Human Capital, Developing Countries, Technology Diffusion
No. 15   (Download full text)
Ralf Becker, Adam Clements and James Curchin
Does implied volatility reflect a wider information set than econometric forecasts?
Much research has addressed the relative performance of option implied volatilities and econometric model based forecasts in terms of forecasting asset return volatility. The general theme to come from this body of work is that implied volatility is a superior forecast. Some authors attribute this to the fact that option markets use a wider information set when forming their forecasts of volatility. This article considers this issue and determines whether S&P 500 implied volatility reflects a set of economic information beyond its impact on the prevailing level of volatility. It is found that, while implied volatility subsumes this information, as do model based forecasts, this is only due to its impact on the current or prevailing level of volatility. Therefore, it appears as though implied volatility does not reflect a wider information set than model based forecasts, implying that implied volatility forecasts simply reflect volatility persistence in much the same way as do econometric models.
JEL-Codes: C12; C22; G00; G14
Keywords: Implied volatility, VIX, volatility forecasts, informational efficiency
No. 14   (Download full text)
Renee Fry and Adrian Pagan
Some Issues in Using Sign Restrictions for Identifying Structural VARs
The paper looks at estimation of structural VARs with sign restrictions. Since sign restrictions do not generate a unique model it is necessary to find some way of summarizing the information they yield. Existing methods present impulse responses from different models and it is argued that they should come from a common model. If this is not done the implied shocks implicit in the impulse responses will not be orthogonal. A method is described that tries to resolve this difficulty. It works with a common model whose impulse responses are as close as possible to the median values of the impulse responses (taken over the range of models satisfying the sign restrictions). Using a simple demand and supply model it is shown that there is no reason to think that sign restrictions will generate better quantitative estimates of the effects of shocks than existing methods such as assuming a system is recursive.
No. 13   (Download full text)
Adrian Pagan
Weak Instruments: A Guide to the Literature
Weak instruments have become an issue in many contexts in which econometric methods have been used. Some progress has been made into how one diagnoses the problem and how one makes an allowance for it. The present paper gives a partial survey of this literature, focussing upon some of the major contributions and trying to provide a relatively simple exposition of the proposed solutions.
No. 12   (Download full text)
Ronald G. Cummings, Jorge Martinez-Vazquez, Michael McKee and Benno Torgler
Effects of Tax Morale on Tax Compliance: Experimental and Survey Evidence
There is considerable evidence that enforcement efforts can increase tax compliance. However, there must be other forces at work because observed compliance levels cannot be fully explained by the level of enforcement actions typical of most tax authorities. Further, there are observed differences, not related to enforcement effort, in the levels of compliance across countries and cultures. To fully understand differences in compliance behavior across cultures one needs to understand differences in tax administration and citizen attitudes toward governments. The working hypothesis is that cross-cultural differences in behavior have foundations in these institutions. Tax compliance is a complex behavioral issue and its investigation requires the use of a variety of methods and data sources. Results from laboratory experiments conducted in different countries demonstrate that observed differences in tax compliance levels can be explained by differences in the fairness of tax administration, in the perceived fiscal exchange, and in the overall attitude towards the respective governments. These experimental results are shown to be robust by replicating them for the same countries using survey response measures of tax compliance.
JEL-Codes: H20; C90
No. 11   (Download full text)
Benno Torgler, Sascha L. Schmidt and Bruno S. Frey
The Power of Positional Concerns: A Panel Analysis
Many studies have established that people care a great deal about their relative economic position and not solely, as standard economic theory assumes, about their absolute economic position. However, behavioral evidence is rare. This paper provides an empirical analysis on how individuals' relative income position affects their performance. Using a unique data set for 1040 soccer players over a period of eight seasons, our analysis suggests that if a player's salary is below the average and this difference increases, his performance worsens and the productivity decreasing effects of positional concerns are stronger. Moreover, the larger the income differences within a team, the stronger positional concern effects are observable. We also find that the more the players are integrated in a particular social environment (their team), the more evident a relative income effect is. Finally, we find that positional effects are stronger among high performing teams.
Keywords: Relative income, positional concerns, envy, performance, social integration
No. 10   (Download full text)
Ralf Becker, Stan Hurn and Vlad Pavlov
Modelling Spikes in Electricity Prices
During periods of market stress, electricity prices can rise dramatically. Electricity retailers cannot pass these extreme prices on to customers because of retail price regulation. Improved prediction of these price spikes, therefore, is important for risk management. This paper builds a time-varying-probability Markov-switching model of Queensland electricity prices, aimed particularly at forecasting price spikes. Variables capturing demand and weather patterns are used to drive the transition probabilities. Unlike traditional Markov-switching models, which assume normality of the prices in each state, the model presented here uses a generalized beta distribution to allow for the skewness in the distribution of electricity prices during high-price episodes.
JEL-Codes: C22; C53; Q49
Keywords: electricity prices, regime switching, time-varying probabilities, beta
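A stripped-down version of time-varying transition probabilities can be sketched as follows. This is not the paper's model: state persistence is ignored, the spike probability depends only on a demand covariate through a logistic link, and the spike-state prices are drawn from a lognormal rather than the generalized beta distribution the paper uses. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_prices(demand, a=-4.0, b=5.0):
    """Two-state regime switching: state 1 is a price spike. The
    probability of entering the spike state is logistic(a + b*demand),
    so high demand raises spike risk (a time-varying transition).
    Spike-state prices are lognormal here purely for illustration."""
    n = len(demand)
    state = np.zeros(n, dtype=int)
    price = np.empty(n)
    for t in range(n):
        p_spike = logistic(a + b * demand[t])
        state[t] = rng.random() < p_spike
        if state[t]:
            price[t] = rng.lognormal(mean=6.0, sigma=0.5)   # spike regime
        else:
            price[t] = rng.normal(loc=30.0, scale=5.0)      # normal regime
    return price, state

demand = rng.uniform(0.0, 1.0, size=5_000)   # normalised demand covariate
price, state = simulate_prices(demand)
print(state.mean(), demand[state == 1].mean())
```

In the simulated series, spikes cluster in high-demand periods, which is the feature the covariate-driven transition probabilities are designed to capture.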
No. 9   (Download full text)
A. Hurn, J. Jeisman and K. Lindsay
Teaching an Old Dog New Tricks: Improved Estimation of the Parameters of Stochastic Differential Equations by Numerical Solution of the Fokker-Planck Equation
Many stochastic differential equations (SDEs) do not have readily available closed-form expressions for their transitional probability density functions (PDFs). As a result, a large number of competing estimation approaches have been proposed in order to obtain maximum-likelihood estimates of their parameters. Arguably the most straightforward of these is one in which the required estimates of the transitional PDF are obtained by numerical solution of the Fokker-Planck (or forward-Kolmogorov) partial differential equation. Despite the fact that this method produces accurate estimates and is completely generic, it has not proved popular in the applied literature. Perhaps this is attributable to the fact that this approach requires repeated solution of a parabolic partial differential equation to obtain the transitional PDF and is therefore computationally quite expensive. In this paper, three avenues for improving the reliability and speed of this estimation method are introduced and explored in the context of estimating the parameters of the popular Cox-Ingersoll-Ross and Ornstein-Uhlenbeck models. The recommended algorithm that emerges from this investigation is seen to offer substantial gains in reliability and computational time.
JEL-Codes: C22; C52
Keywords: stochastic differential equations, maximum likelihood, finite difference, finite element, cumulative
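The core computation discussed here, solving the Fokker-Planck equation numerically to obtain a transitional PDF, can be sketched for the Ornstein-Uhlenbeck model, where the exact Gaussian transitional density provides a benchmark. The explicit finite-difference scheme below is chosen for brevity only and is not the paper's recommended algorithm; step sizes are deliberately conservative.

```python
import numpy as np

# OU model dX = kappa*(theta - X) dt + sigma dW; its transitional
# density is a known Gaussian, giving an exact benchmark.
kappa, theta, sigma = 1.0, 0.0, 0.5
x0, T = 1.0, 0.5

# Spatial grid and an explicit Euler time-stepper for the
# Fokker-Planck (forward Kolmogorov) equation.
x = np.linspace(-3, 3, 301)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / sigma**2
steps = int(T / dt)

# Initial condition: a narrow Gaussian approximating delta(x - x0)
p = np.exp(-(x - x0) ** 2 / (2 * 0.05**2))
p /= p.sum() * dx

drift = kappa * (theta - x)
for _ in range(steps):
    dflux = np.gradient(drift * p, dx)               # drift (advection) term
    d2p = (np.roll(p, -1) - 2 * p + np.roll(p, 1)) / dx**2   # diffusion term
    p = p + dt * (-dflux + 0.5 * sigma**2 * d2p)
    p = np.clip(p, 0.0, None)
    p /= p.sum() * dx        # renormalise to conserve probability mass

# Exact OU transitional density at horizon T
m = theta + (x0 - theta) * np.exp(-kappa * T)
v = sigma**2 * (1 - np.exp(-2 * kappa * T)) / (2 * kappa)
p_exact = np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)
err = np.max(np.abs(p - p_exact))
print(err)
```

The expense is visible even in this toy: every likelihood evaluation at new parameter values requires re-solving the PDE, which is exactly why the paper focuses on faster and more reliable schemes.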
No. 8   (Download full text)
Stan Hurn and Ralf Becker
Testing for nonlinearity in mean in the presence of heteroskedasticity
This paper considers an important practical problem in testing time-series data for nonlinearity in mean. Most popular tests reject the null hypothesis of linearity too frequently if the data are heteroskedastic. Two approaches to redressing this size distortion are considered, both of which have been proposed previously in the literature although not in relation to this particular problem. These are the heteroskedasticity-robust-auxiliary-regression approach and the wild bootstrap. Simulation results indicate that both approaches are effective in reducing the size distortion and that the wild bootstrap offers better performance in smaller samples. Two practical examples are then used to illustrate the procedures and demonstrate the potential pitfalls encountered when using non-robust tests.
Keywords: nonlinearity in mean, heteroskedasticity, wild bootstrap, empirical size and power
No. 7   (Download full text) (published)
Adrian Pagan and Hashem Pesaran
Econometric Analysis of Structural Systems with Permanent and Transitory Shocks
This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah (1989), and shows that structural equations for which there are known permanent shocks must have no error correction terms present in them, thereby freeing up the latter to be used as instruments in estimating their parameters. The proposed approach is illustrated by a re-examination of the identification scheme used in a monetary model by Wickens and Motta (2001), and in a well known paper by Gali (1992) which deals with the construction of an IS-LM model with supply-side effects. We show that the latter imposes more short-run restrictions than are needed because of a failure to fully utilize the cointegration information.
Keywords: Permanent shocks, structural identification, error correction models, IS-LM models
No. 6   (Download full text) (published)
Martin Fukac and Adrian Pagan
Limited Information Estimation and Evaluation of DSGE Models
We advance the proposal that DSGE models should not just be estimated and evaluated with reference to full information methods. These make strong assumptions and therefore there is uncertainty about their impact upon results. Some limited information analysis which can be used in a complementary way seems important. Because it is sometimes difficult to implement limited information methods when there are unobservable non-stationary variables in the system, we present a simple method of overcoming this that involves normalizing the non-stationary variables with their permanent components and then estimating the resulting Euler equations. We illustrate the interaction between full and limited information methods in the context of a well-known open economy model of Lubik and Schorfheide. The transformation was effective in revealing possible mis-specifications in the equations of LS's system and the limited information analysis highlighted the role of priors in having a major influence upon the estimates.
No. 5   (Download full text)
Andrew E. Clark, Paul Frijters and Michael A. Shields
Income and Happiness: Evidence, Explanations and Economic Implications
No. 4   (Download full text)
Louis J. Maccini and Adrian Pagan
Inventories, Fluctuations and Business Cycles
The paper looks at the role of inventories in U.S. business cycles and fluctuations. It concentrates upon the goods producing sector and constructs a model that features both input and output inventories. A range of shocks is present in the model, including sales, technology and inventory cost shocks. It is found that the presence of inventories does not change the average business cycle characteristics in the U.S. very much. The model is also used to examine whether new techniques for inventory control might have been an important contributing factor to the decline in the volatility of U.S. GDP growth. It is found that these would have had little impact upon the level of volatility.
No. 3   (Download full text)
Adam Clements, Stan Hurn and Scott White
Estimating Stochastic Volatility Models Using a Discrete Non-linear Filter
Many approaches have been proposed for estimating stochastic volatility (SV) models, a number of which are filtering methods. Although non-linear filtering methods are superior to linear approaches, they have not gained wide acceptance in the econometrics literature because of their computational cost. This paper proposes a discretised non-linear filtering (DNF) algorithm for the estimation of latent variable models. It is shown that the DNF approach leads to significant computational gains relative to other procedures in the context of SV estimation, without any associated loss in accuracy. It is also shown how a number of extensions to standard SV models can be accommodated within the DNF algorithm.
Keywords: non-linear filtering, stochastic volatility, state-space models, asymmetries, latent factors, two factor volatility models
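The abstract does not reproduce the filter itself, but the idea behind a discretised non-linear filter can be sketched for a basic SV model. The sketch below is an illustrative grid filter, not the authors' exact DNF algorithm: the model y_t = exp(h_t/2) eps_t with AR(1) log-variance h_t, the grid width, and the function name are all assumptions made for the example.

```python
import numpy as np

def dnf_loglik(y, phi, sigma_eta, n_grid=50):
    """Approximate log-likelihood of a basic SV model
         y_t = exp(h_t/2) * eps_t,   h_t = phi*h_{t-1} + sigma_eta*eta_t,
       via a grid-based (discretised) non-linear filter."""
    # Grid over the latent log-variance, spanning +/- 4 stationary sds
    sd_h = sigma_eta / np.sqrt(1.0 - phi**2)
    h = np.linspace(-4 * sd_h, 4 * sd_h, n_grid)
    # Transition matrix: P[i, j] ~ Pr(h_t near h[j] | h_{t-1} = h[i])
    P = np.exp(-0.5 * ((h[None, :] - phi * h[:, None]) / sigma_eta) ** 2)
    P /= P.sum(axis=1, keepdims=True)
    # Stationary prior over the grid
    p = np.exp(-0.5 * (h / sd_h) ** 2)
    p /= p.sum()
    loglik = 0.0
    for yt in y:
        p = p @ P                                        # prediction step
        obs = np.exp(-0.5 * yt**2 * np.exp(-h) - 0.5 * h)  # N(0, exp(h)) kernel
        joint = p * obs
        c = joint.sum()
        loglik += np.log(c) - 0.5 * np.log(2 * np.pi)
        p = joint / c                                    # update step
    return loglik
```

Because the filter only ever manipulates an n_grid-vector and an n_grid-by-n_grid matrix, its cost is linear in the sample size, which is the kind of computational saving the paper emphasises.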
No. 2   (Download full text)
Stan Hurn, J. Jeisman and K. A. Lindsay
Seeing the Wood for the Trees: A Critical Evaluation of Methods to Estimate the Parameters of Stochastic Differential Equations
Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This paper provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox-Ingersoll-Ross and Ornstein-Uhlenbeck equations.
Keywords: stochastic differential equations, parameter estimation, maximum likelihood, simulation, moments
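For the Ornstein-Uhlenbeck equation dX = kappa*(theta - X)dt + sigma*dW the transitional density is Gaussian in closed form, which is why it serves as a benchmark: conditional on the first observation, exact maximum likelihood reduces to least squares on an AR(1). A minimal sketch, assuming this parameterisation (it is not the paper's code, and the function names are illustrative):

```python
import numpy as np

def simulate_ou(kappa, theta, sigma, x0, n, dt, rng):
    """Simulate an Ornstein-Uhlenbeck process exactly, using its
    Gaussian transitional density (no discretisation error)."""
    b = np.exp(-kappa * dt)                      # AR(1) coefficient
    s = sigma * np.sqrt((1 - b**2) / (2 * kappa))  # conditional sd
    x = np.empty(n + 1)
    x[0] = x0
    for t in range(n):
        x[t + 1] = theta + (x[t] - theta) * b + s * rng.standard_normal()
    return x

def mle_ou(x, dt):
    """Exact conditional ML estimates of (kappa, theta, sigma): the OU
    transition is Gaussian AR(1), so MLE reduces to least squares."""
    y, z = x[1:], x[:-1]
    b, a = np.polyfit(z, y, 1)                   # fit y = a + b*z
    resid = y - (a + b * z)
    s2 = resid.var()                             # ML residual variance
    kappa = -np.log(b) / dt                      # invert b = exp(-kappa*dt)
    theta = a / (1 - b)                          # long-run mean
    sigma = np.sqrt(s2 * 2 * kappa / (1 - b**2))
    return kappa, theta, sigma
```

The mapping back from the AR(1) coefficients to (kappa, theta, sigma) is the reason the OU case is "easy"; for models such as Cox-Ingersoll-Ross no comparably simple closed form exists, which motivates the competing procedures the paper evaluates.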
No. 1   (Download full text)
Adrian Pagan and Don Harding
The Econometric Analysis of Constructed Binary Time Series
Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ in terms of their statistical properties from the primary random variables used in microeconometric studies. One important difference between primary and secondary binary variables is that, while the former are in many instances independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing binary states determine the degree and nature of the dependence in those states. When using constructed binary variables as regressands, a common mistake is to ignore this dependence by using a probit model. We present an alternative non-parametric method that allows for dependence and apply that method to the issue of using the yield spread to predict recessions.
Keywords: Business cycle; binary variable, Markov chain, probit model, yield curve
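The point about construction rules inducing dependence can be illustrated with a stylised dating rule (a hypothetical two-period rule written for this example, not necessarily the one studied in the paper): even when growth is i.i.d., the rule makes Pr(S_t = 1 | S_{t-1} = 1) far exceed Pr(S_t = 1 | S_{t-1} = 0), so the constructed states are serially dependent and a probit that ignores this is mis-specified.

```python
import numpy as np

def two_period_states(growth):
    """Binary recession indicator from a stylised dating rule: enter state 1
    after two consecutive contractions, exit after two consecutive
    expansions, otherwise stay in the current phase. The rule itself
    builds serial dependence into the constructed states."""
    s = np.zeros(len(growth), dtype=int)
    for t in range(2, len(growth)):
        if growth[t - 1] < 0 and growth[t] < 0:
            s[t] = 1                 # two declines: recession begins
        elif growth[t - 1] > 0 and growth[t] > 0:
            s[t] = 0                 # two expansions: recession ends
        else:
            s[t] = s[t - 1]          # mixed signals: phase continues
    return s

def transition_probs(s):
    """Estimated Pr(S_t = 1 | S_{t-1} = j) for j = 0, 1."""
    prev, curr = s[:-1], s[1:]
    return [curr[prev == j].mean() for j in (0, 1)]
```

Running this on i.i.d. simulated growth with a small positive drift produces two very different conditional probabilities, a simple Markov-chain signature of the dependence the paper analyses.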

Contacts

QUT Business School

  • Level 1, B Block
    Gardens Point
    2 George St
    Brisbane
  • Postal address:
    QUT Business School
    GPO Box 2434
    Brisbane QLD 4001