887 results for nonparametric demand model
Abstract:
The paper considers meta-analysis of diagnostic studies that use a continuous score to classify study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
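As a rough illustration of the quantities involved (not the authors' estimator, and with made-up counts), the sketch below computes per-study Youden's indices from 2×2 diagnostic tables and pools the implied misclassification measure with size-based weights standing in for a Mantel–Haenszel-type weighting:

```python
# Illustrative sketch only: per-study Youden's index from 2x2 diagnostic counts
# and a simple weighted summary across studies. Counts are hypothetical.
import numpy as np

studies = np.array([
    # TP, FN, TN, FP
    [40, 10, 80, 20],
    [25,  5, 60, 15],
    [55, 20, 90, 10],
], dtype=float)

tp, fn, tn, fp = studies.T
sens = tp / (tp + fn)           # sensitivity per study
spec = tn / (tn + fp)           # specificity per study
youden = sens + spec - 1.0      # Youden's index J
misclass = 1.0 - youden         # misclassification measure based on J

# Pool across studies with size-proportional weights (a stand-in for the
# Mantel-Haenszel-type weighting described in the abstract).
w = studies.sum(axis=1)
pooled_misclass = np.average(misclass, weights=w)
print(youden, pooled_misclass)
```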
Abstract:
The paper develops a measure of the consumer welfare losses associated with withholding information about a possible link between BSE and vCJD. The Cost of Ignorance (COI) is measured by comparing the utility of the informed choice with the utility of the uninformed choice, under conditions of improved information. Unlike previous work that is largely based on a single-equation demand model, the measure is obtained by retrieving a cost function from a dynamic Almost Ideal Demand System. The estimated perceived loss for Italian consumers due to delayed information ranges from 12 percent to 54 percent of total meat expenditure, depending on the month assumed to embody correct beliefs about the safety level of beef.
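For reference, the static Almost Ideal Demand System on which such dynamic extensions build uses the standard Deaton–Muellbauer cost (expenditure) function shown below; the dynamic specification and the exact COI computation follow the paper and are not reproduced here.

```latex
% Standard (static) AIDS cost function; u is utility and p_k are prices.
\ln c(u, p) = \alpha_0 + \sum_k \alpha_k \ln p_k
  + \tfrac{1}{2} \sum_k \sum_j \gamma_{kj} \ln p_k \ln p_j
  + u\, \beta_0 \prod_k p_k^{\beta_k}
```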
Abstract:
OBJECTIVES: This contribution provides a unifying concept for meta-analysis that integrates the handling of unobserved heterogeneity, study covariates, publication bias and study quality. It is important to consider these issues simultaneously to avoid the occurrence of artifacts, and a method for doing so is suggested here. METHODS: The approach is based upon the meta-likelihood in combination with a general linear nonparametric mixed model, which lays the ground for all inferential conclusions suggested here. RESULTS: The concept is illustrated with a meta-analysis investigating the relationship between hormone replacement therapy and breast cancer. The phenomenon of interest has been investigated in many studies over a considerable time, and different results have been reported. In 1992 a meta-analysis by Sillero-Arenas et al. concluded there was a small but significant overall effect of 1.06 on the relative-risk scale. Using the meta-likelihood approach it is demonstrated here that this meta-analysis is subject to considerable unobserved heterogeneity. Furthermore, it is shown that new methods are available to model this heterogeneity successfully. It is further argued that available study covariates should be included to explain this heterogeneity in the meta-analysis at hand. CONCLUSIONS: The topic of HRT and breast cancer has again very recently become an issue of public debate, when results of a large trial investigating the health effects of hormone replacement therapy were published indicating an increased risk of breast cancer (risk ratio of 1.26). Using an adequate regression model in the previously published meta-analysis, an adjusted effect estimate of 1.14 can be given, which is considerably higher than the one published in the meta-analysis of Sillero-Arenas et al. In summary, it is hoped that the method suggested here contributes further to good meta-analytic practice in public health and clinical disciplines.
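A minimal sketch of the nonparametric-mixture idea (a fixed number of mass points for the study effects, fitted by EM with known within-study variances) is given below. The data are hypothetical, and the paper's meta-likelihood model additionally handles covariates, publication bias and the choice of the number of components.

```python
# Discrete (nonparametric) mixture for meta-analysis, fitted by EM.
# y_i are study effects (e.g. log relative risks), s2_i known variances.
import numpy as np

y = np.array([0.02, 0.05, 0.30, 0.01, 0.28, 0.07])   # hypothetical study effects
s2 = np.array([0.01, 0.02, 0.02, 0.01, 0.03, 0.02])  # within-study variances

K = 2
theta = np.array([y.min(), y.max()])  # initial mass points
pi = np.full(K, 1.0 / K)              # initial mixing weights

for _ in range(200):
    # E-step: responsibilities under normal likelihoods with known variances
    dens = np.exp(-0.5 * (y[:, None] - theta[None, :]) ** 2 / s2[:, None]) \
           / np.sqrt(2 * np.pi * s2[:, None])
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update weights and mass points (inverse-variance weighted means)
    pi = resp.mean(axis=0)
    theta = (resp * (y / s2)[:, None]).sum(axis=0) / (resp / s2[:, None]).sum(axis=0)

print("mass points:", theta, "weights:", pi)
```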
Abstract:
Models of the City of London office market are extended by considering a longer time series of data, covering two cycles, and by explicitly modeling asymmetric rental response to supply and demand shocks. A long-run structural model linking demand for office space, real rental levels and office-based employment is estimated, and rental adjustment processes are then modeled using an error correction model framework. Adjustment processes are seen to be asymmetric, dependent both on the direction of the supply and demand shock and on the state of the rental market at the time of the shock. A complete system of equations is estimated: unit shocks produce oscillations, but there is a return to a steady equilibrium state in the long run.
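A stylized two-step error-correction sketch on simulated data, assuming a simple Engle–Granger-type setup rather than the paper's full system, illustrates the long-run relation and the adjustment equation:

```python
# Stylized error-correction sketch for rental adjustment (simulated data;
# variable names are illustrative, not the paper's specification).
import numpy as np

rng = np.random.default_rng(0)
T = 120
emp = np.cumsum(rng.normal(0.2, 1.0, T))           # office-based employment (index)
stock = np.cumsum(rng.normal(0.1, 0.5, T))         # floorspace supply (index)
rent = 2.0 + 0.8 * emp - 0.5 * stock + rng.normal(0, 1.0, T)  # real rents

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: long-run (cointegrating) relation between rents and fundamentals
X_lr = np.column_stack([np.ones(T), emp, stock])
beta_lr = ols(X_lr, rent)
ect = rent - X_lr @ beta_lr                        # error-correction term

# Step 2: short-run adjustment of rent changes to the lagged disequilibrium
d_rent, d_emp, d_stock = np.diff(rent), np.diff(emp), np.diff(stock)
X_sr = np.column_stack([np.ones(T - 1), ect[:-1], d_emp, d_stock])
beta_sr = ols(X_sr, d_rent)
print("speed of adjustment:", beta_sr[1])          # expected to be negative
```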
Abstract:
In a symmetric differentiated experimental oligopoly with multiproduct firms, we test the predictive power of the corresponding Bertrand-Nash equilibria. Subjects are not informed of the specification of the underlying demand model. In the presence of intense multiproduct activity, and provided that a parallel pricing rule is imposed on multiproduct firms, strategies tend to confirm the non-cooperative multiproduct solution.
Abstract:
The majority of the UK population is either overweight or obese. Health economists, nutritionists and doctors are calling for the UK to follow the example of other European countries and introduce a tax on soft drinks, as a result of the perception that high intakes contribute to diet-related disease. We use a demand model estimated with household-level data on beverage purchases in the UK to investigate the effects of a tax on soft drink consumption. The model is a Quadratic Almost Ideal Demand System, and censoring is handled by applying a double-hurdle model. Separate models are estimated for low, moderate and high consumers to allow for a differential impact on consumption between these groups. Applying different hypothetical tax rates, we conclude that understanding the nature of substitute/complement relationships is crucial in designing an effective policy, as these relationships differ between consumers depending on their consumption level. The overall impact of a soft drink tax on calorie consumption is likely to be small.
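A back-of-the-envelope illustration of how an elasticity translates a tax into a consumption change is sketched below; the elasticity, tax rate and purchase level are hypothetical, the tax is assumed fully passed through, and the cross-price substitution stressed in the abstract is ignored.

```python
# Toy tax simulation with a single own-price elasticity (illustrative numbers,
# not estimates from the paper).
own_price_elasticity = -0.8   # hypothetical own-price elasticity of soft drinks
tax_rate = 0.10               # 10% ad valorem tax, assumed fully passed through
baseline_litres = 20.0        # monthly household purchases, hypothetical

pct_price_change = tax_rate
pct_quantity_change = own_price_elasticity * pct_price_change
new_litres = baseline_litres * (1 + pct_quantity_change)
print(f"consumption falls by {abs(pct_quantity_change):.0%} to {new_litres:.1f} litres")
```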
Abstract:
The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual horizons. The data consist of metal-commodity prices at a monthly frequency from 1957 to 2012, taken from the International Financial Statistics of the IMF, on individual metal series. We also employ the (relatively large) list of covariates used in Welch and Goyal (2008) and in Hong and Yogo (2009), which are available for download. Regarding short- and long-run comovement, we apply the techniques and tests proposed in the common-feature literature to build parsimonious VARs, which possibly entail quasi-structural relationships between different commodity prices and/or between a given commodity price and its potential demand determinants. These parsimonious VARs are later used as forecasting models to be combined to yield optimal forecasts of metal-commodity prices. Regarding out-of-sample forecasts, we use a variety of models (linear and non-linear, single-equation and multivariate) and a variety of covariates to forecast the returns and prices of metal commodities. With forecasts from a large number of models (N large) and a large number of time periods (T large), we apply the techniques put forth by the common-feature literature on forecast combinations. The main contribution of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is optimally chosen taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but also holds, to a lesser degree, for the U.S. economy. Regarding forecasting, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation. Still, in most cases, forecast-combination techniques outperform individual models.
Abstract:
The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual frequencies. The data consist of metal-commodity prices at monthly and quarterly frequencies from 1957 to 2012, extracted from the IFS, and annual data for 1900-2010 provided by the U.S. Geological Survey (USGS). We also employ the (relatively large) list of covariates used in Welch and Goyal (2008) and in Hong and Yogo (2009). We investigate short- and long-run comovement by applying the techniques and tests proposed in the common-feature literature. One of the main contributions of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is optimally chosen taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but also holds, to a lesser degree, for the U.S. economy. Regarding out-of-sample forecasts, our main contribution is to show the benefits of forecast-combination techniques, which outperform individual-model forecasts, including the random-walk model. We use a variety of models (linear and non-linear, single-equation and multivariate) and a variety of covariates and functional forms to forecast the returns and prices of metal commodities. Using a large number of models (N large) and a large number of time periods (T large), we apply the techniques put forth by the common-feature literature on forecast combinations. Empirically, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation.
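The forecast-combination logic can be illustrated with a toy simulation (not the paper's models or data): combining several noisy forecasts with equal weights typically beats the best individual model.

```python
# Equal-weight forecast combination versus individual models (simulated data).
import numpy as np

rng = np.random.default_rng(1)
T, N = 200, 8                         # evaluation periods, candidate models
actual = rng.normal(0.0, 1.0, T)      # realized returns (simulated)
# each model = actual + idiosyncratic error (some models noisier than others)
errors = rng.normal(0.0, np.linspace(0.5, 2.0, N), size=(T, N))
forecasts = actual[:, None] + errors

mse_individual = ((forecasts - actual[:, None]) ** 2).mean(axis=0)
combined = forecasts.mean(axis=1)     # equal-weight combination
mse_combined = ((combined - actual) ** 2).mean()
print("best single model MSE:", mse_individual.min())
print("equal-weight combination MSE:", mse_combined)
```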
Abstract:
Peer-to-peer markets are highly uncertain environments because of the constant presence of shocks, and sellers therefore have to adjust to these shocks continually. Dynamic pricing is hard, especially for non-professional sellers. We study it in an accommodation rental marketplace, Airbnb. With data scraped from its website, we: 1) describe pricing patterns consistent with learning; and 2) estimate a demand model and use it to simulate a dynamic pricing model under three scenarios: a) with learning; b) without learning; and c) with full information. We find that information is an important feature of rental markets and that learning is important for hosts to improve their profits.
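A stylized version of the host's pricing problem, assuming a logistic booking-probability curve with hypothetical parameters, shows why beliefs about demand matter for the chosen price:

```python
# Pick the nightly price maximizing expected revenue under logit demand.
# Parameters are hypothetical, not estimates from the paper.
import numpy as np

def booking_prob(price, a=6.0, b=0.05):
    """Probability the listing is booked at a given price (logit demand)."""
    return 1.0 / (1.0 + np.exp(-(a - b * price)))

prices = np.linspace(40, 200, 161)
expected_revenue = prices * booking_prob(prices)
best = prices[np.argmax(expected_revenue)]
print("revenue-maximizing price:", best)

# "Learning" in the spirit of the abstract: a host who overestimates price
# sensitivity (believes b is larger than it is) prices too low and forgoes revenue.
misperceived = prices[np.argmax(prices * booking_prob(prices, b=0.08))]
print("price under mistaken beliefs:", misperceived)
```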
Abstract:
This report provides an analysis and evaluation of the likely effects of climate change on the tourism sector in Montserrat. Clayton (2009) identifies three reasons why the Caribbean should be concerned about the potential effects of climate change on tourism: (a) the relatively high dependence on tourism as a source of foreign exchange and employment; (b) the intrinsic vulnerability of small islands and their infrastructure (e.g. hotels and resorts) to sea level rise and extreme climatic events (e.g. hurricanes and floods); and (c) the high dependence of the regional tourist industry on carbon-based fuels (both to bring tourists to the region and to provide support services in the region). The effects of climate change are already being felt on the island. Between 1970 and 2009, there was a rise in the number of relatively hot days experienced on the island, and mean precipitation declined over the period. Besides temperature, there is also the threat of rising wind speeds. Since the early 20th century, the number of hurricanes passing through the Caribbean has risen from about 5-6 per year to more than 25 in some years of the twenty-first century. In Montserrat, the estimated damage from four windstorms (including hurricanes) affecting the island was US$260 million, or almost five times 2009 gross domestic product (GDP). Climate change is also likely to significantly affect coral reefs. Hoegh-Guldberg (2007) estimates that should concentrations of carbon dioxide in the Earth's atmosphere rise from 380 ppm to 560 ppm, coral calcification and growth are likely to decrease by 40%. The report attempted to quantify the likely effects of changes in the climatic factors mentioned above. As regards temperature and other climatic variables, a tourism climatic index capturing the elements of climate that shape a destination's experience was constructed. The index was calculated using historical observations as well as projections under two likely climate scenarios, A2 and B2. The results suggest that under both scenarios the island's key tourism climatic features will likely decline and therefore negatively affect the destination experience of visitors. Including this tourism climatic index in a tourism demand model suggests that this would translate into losses of around 145% of GDP. As regards coral reefs, the value of the damage due to the loss of coral reefs was estimated at 7.6 times GDP, while the damage due to land loss for the tourism industry was 45% of GDP. The total cost of climate change for the tourism industry was therefore projected to be 9.6 times 2009 GDP over a 40-year horizon. Given the potential for significant damage to the industry, a large number of potential adaptation measures were considered. Out of these, a short-list of nine potential options was selected using ten evaluation criteria.
These included: (a) increasing recommended design wind speeds for new tourism-related structures; (b) construction of water storage tanks; (c) an irrigation network that allows for the recycling of waste water; (d) enhanced reef monitoring systems to provide early warning alerts of bleaching events; (e) deployment of artificial reefs and fish-aggregating devices; (f) developing national evacuation and rescue plans; (g) introduction of alternative attractions; (h) providing re-training for displaced tourism workers; and (i) revised policies related to financing national tourism offices to accommodate the new climatic realities. Using cost-benefit analysis, three options were put forward as being financially viable and ready for immediate implementation: (a) increase recommended design wind speeds for new tourism-related structures; (b) enhance reef monitoring systems to provide early warning alerts of bleaching events; and (c) deploy artificial reefs or fish-aggregating devices. While these options had positive benefit-cost ratios, other options were also recommended on the basis of their non-tangible benefits: an irrigation network that allows for the recycling of waste water, development of national evacuation and rescue plans, re-training for displaced tourism workers, and the revision of policies related to financing national tourism offices to accommodate the new climatic realities.
Abstract:
This work applies genetic algorithms (GA) and particle swarm optimization (PSO) to cash balance management using the Miller-Orr model, a stochastic model that does not define a single ideal point for the cash balance but rather an oscillation range between a lower bound, an ideal balance and an upper bound. The paper proposes the application of GA and PSO to minimize the total cost of cash maintenance by obtaining the lower-bound parameter of the Miller-Orr model, using the assumptions presented in the literature. Computational experiments were applied in the development and validation of the models. The results indicated that both GA and PSO are applicable to determining the cash level from the lower limit, with the PSO model producing the best results; PSO had not previously been applied to this type of problem.
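The sketch below illustrates the idea with particle swarm optimization over the lower bound; the transfer and holding terms follow the textbook Miller-Orr expressions, while the shortfall penalty and all parameter values are stylized stand-ins rather than the paper's cost function.

```python
# PSO search over the Miller-Orr lower bound L (stylized objective).
import numpy as np

F = 50.0        # fixed cost per transfer
sigma2 = 800.0  # daily variance of cash flows
k = 0.0003      # daily opportunity cost of holding cash
shortfall_weight, shortfall_scale = 5.0, 2000.0  # stylized shortage-risk penalty

def total_cost(L):
    d = (3 * F * sigma2 / (4 * k)) ** (1 / 3)   # Miller-Orr distance z - L
    z, h = L + d, L + 3 * d                     # return point and upper bound
    transfer = F * sigma2 / (d * (h - z))       # expected daily transfer cost
    holding = k * (L + z + h) / 3               # opportunity cost of average balance
    penalty = shortfall_weight * np.exp(-L / shortfall_scale)  # stylized term
    return transfer + holding + penalty

# Plain-vanilla particle swarm over L in [0, 20000]
rng = np.random.default_rng(0)
n, iters = 30, 200
x = rng.uniform(0, 20000, n)
v = np.zeros(n)
pbest, pbest_cost = x.copy(), np.array([total_cost(xi) for xi in x])
gbest = pbest[pbest_cost.argmin()]

for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, 0, 20000)
    cost = np.array([total_cost(xi) for xi in x])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()]

print("lower bound found by PSO:", gbest, "cost:", total_cost(gbest))
```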
Abstract:
The literature argues that although Brazil is still the world's largest exporter of green coffee, it has been losing power in this market, since the competition (rivalry and probability of entry) imposed by countries such as Colombia and Vietnam is strong enough to make this market highly competitive. This article therefore evaluates the recent pattern of competition in the world green coffee market using an econometric methodology more commonly employed in antitrust analysis. To assess consumer behavior, the price elasticities of world demand for green coffee were estimated, by coffee type, using the antitrust Multinomial Logit demand model. To assess equilibrium market behavior, tests for instability of quantity shares were carried out by means of panel cointegration analysis. The results point to increased competition facing the Brazilian coffee variety on the demand side and to the maintenance of quantity shares as the market's equilibrium configuration.
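For illustration, the own- and cross-price elasticities implied by a logit demand model can be computed as follows, with a hypothetical price coefficient, prices and shares rather than the estimates in the article:

```python
# Elasticities implied by a multinomial logit demand model (illustrative values).
import numpy as np

alpha = 1.2                               # price coefficient (hypothetical)
prices = np.array([2.8, 3.1, 2.5])        # green coffee varieties (illustrative)
shares = np.array([0.35, 0.25, 0.20])     # market shares (illustrative)

own = -alpha * prices * (1 - shares)      # own-price elasticities
cross = alpha * prices * shares           # elasticity of rivals' demand w.r.t. each price
print("own-price:", own)
print("cross-price (w.r.t. each variety's price):", cross)
```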
Abstract:
In this paper we first show that the gains achievable by integrating pricing and inventory control are usually small for classical demand functions. We then introduce reference price models and demonstrate that for this class of demand functions the benefits of integration with inventory control are substantially increased due to the price dynamics. We also provide some analytical results for this more complex model. We thus conclude that integrated pricing/inventory models could repeat the success of revenue management in practice if reference price effects are included in the demand model and the properties of this new model are better understood.
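A minimal sketch of a reference-price demand model of the kind discussed, with a linear demand function and an exponential-smoothing reference-price update (all parameters illustrative), is:

```python
# Demand responds to the posted price and to its gap from a reference price
# that adapts to past prices (illustrative parameters).
a, b, c = 100.0, 2.0, 1.5   # base demand, price sensitivity, reference effect
alpha = 0.7                 # memory of the reference-price update

def demand(p, r):
    # gains (p < r) stimulate demand, losses (p > r) depress it
    return a - b * p + c * (r - p)

r = 20.0
for t, p in enumerate([20.0, 18.0, 18.0, 22.0, 22.0]):
    d = demand(p, r)
    print(f"t={t}: price={p:.0f} reference={r:.1f} demand={d:.1f}")
    r = alpha * r + (1 - alpha) * p   # exponential-smoothing reference update
```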
Abstract:
Fossil pollen data from stratigraphic cores are irregularly spaced in time due to non-linear age-depth relations. Moreover, their marginal distributions may vary over time. We address these features in a nonparametric regression model with errors that are monotone transformations of a latent continuous-time Gaussian process Z(T). Although Z(T) is unobserved, due to monotonicity and under suitable regularity conditions it can be recovered, facilitating further computations such as estimation of the long-memory parameter and the Hermite coefficients. The estimation of Z(T) itself involves estimation of the marginal distribution function of the regression errors. These issues are considered in proposing a plug-in algorithm for optimal bandwidth selection and in constructing confidence bands for the trend function. Some high-resolution time series of pollen records from Lago di Origlio in Switzerland, which go back ca. 20,000 years, are used to illustrate the methods.
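A small sketch of a kernel (Nadaraya-Watson) trend estimate on irregularly spaced times, the basic ingredient behind the bandwidth-selection and confidence-band discussion above, is given below; the data are simulated and the bandwidth is a fixed illustrative value rather than the plug-in choice.

```python
# Nadaraya-Watson trend estimate on irregularly spaced observation times.
import numpy as np

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 20000, 300))               # irregular ages (years BP)
y = np.sin(t / 3000.0) + rng.normal(0, 0.3, t.size)   # proxy observations

def nw_trend(grid, t, y, bandwidth):
    w = np.exp(-0.5 * ((grid[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(0, 20000, 200)
trend = nw_trend(grid, t, y, bandwidth=800.0)
print(trend[:5])
```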