915 results for Lifetime warranties, Warranty policies, Cost models
Abstract:
This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and next these states were embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models; the optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided using a power QALY model.
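The linear-versus-power distinction above can be sketched numerically. Only the 0.65 power coefficient comes from the abstract; the utility value and function names below are illustrative assumptions:

```python
# Sketch of linear vs. power QALY weighting for a chronic state held
# for a number of years. Only the power coefficient 0.65 comes from the
# abstract; the utility value and helper names are assumptions.

def qaly_linear(utility, years):
    # linear QALY model: value is proportional to duration
    return utility * years

def qaly_power(utility, years, r=0.65):
    # power QALY model: duration enters with exponent r < 1, so each
    # additional year adds less value than the previous one
    return utility * years ** r

u = 0.8  # hypothetical TTO utility for a chronic EQ-5D state
print(qaly_linear(u, 10))           # 8.0
print(round(qaly_power(u, 10), 2))  # 3.57
```

Under the power model the same health state held for ten years is worth far fewer QALYs than the linear calculation suggests, which is the bias the abstract refers to.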
Abstract:
This paper theoretically and empirically documents a puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, either sticky wages or match-specific productivity shocks can improve the model's performance by making the firm's flow of surplus more procyclical, which makes hiring more procyclical too.
Abstract:
The principal aim of this paper is to estimate a stochastic frontier cost function and an inefficiency effects model in the analysis of the primary health care services purchased by the public authority and supplied by 180 providers in 1996 in Catalonia. The evidence from our sample does not support the premise that contracting out has helped improve purchasing cost efficiency in primary care. Inefficient purchasing cost was observed in the component of this purchasing cost explicitly included in the contract between purchaser and provider. There are no observable incentives for the contracted-out primary health care teams to minimise prescription costs, which are not explicitly included in the present contracting system.
Abstract:
Recent studies of American politics show that the political polarization of both the electorate and the political elite has moved 'almost in tandem for the past half century' (McCarty et al., 2003, p. 2), and that party polarization has steadily increased since the 1970s. On the other hand, the empirical literature on party platforms and implemented policies has consistently found an imperfect but non-negligible correlation between electoral platforms and governmental policies: while platforms tend to be polarized, policies are moderate or centrist. However, existing theoretical models of political competition are not manifestly compatible with these observations. In this paper, we distinguish between electoral platforms and implemented policies by incorporating a non-trivial policy-setting process. It follows that voters may care not only about the implemented policy but also about the platform they support with their vote. We find that while parties tend to polarize their positions, the risk of alienating their constituency prevents them from radicalizing. The analysis shows that the distribution of the electorate, and not only the (expected) location of a pivotal voter, matters in determining policies. Our results are consistent with the observation of polarized platforms and moderate policies, and with the alienation and indifference components of abstention.
Abstract:
International industry data permit testing whether the industry-specific impact of cross-country differences in institutions or policies is consistent with economic theory. Empirical implementation requires specifying the industry characteristics that determine impact strength. Most of the literature has been using US proxies of the relevant industry characteristics. We show that using industry characteristics in a benchmark country as a proxy of the relevant industry characteristics can result in an attenuation bias or an amplification bias. We also describe circumstances allowing for an alternative approach that yields consistent estimates. As an application, we reexamine the influential conjecture that financial development facilitates the reallocation of capital from declining to expanding industries.
Abstract:
Aim: The imperfect detection of species may lead to erroneous conclusions about species-environment relationships. Accuracy in species detection usually requires temporal replication at sampling sites, a time-consuming and costly monitoring scheme. Here, we applied a lower-cost alternative based on a double-sampling approach to incorporate the reliability of species detection into regression-based species distribution modelling.
Location: Doñana National Park (south-western Spain).
Methods: Using species-specific monthly detection probabilities, we estimated the detection reliability as the probability of having detected the species given the species-specific survey time. Such reliability estimates were used to account explicitly for data uncertainty by weighting each absence. We illustrated how this novel framework can be used to evaluate four competing hypotheses as to what constitutes the primary environmental control of amphibian distribution: breeding habitat, aestivating habitat, spatial distribution of surrounding habitats and/or major ecosystem zonation. The study was conducted on six pond-breeding amphibian species during a 4-year period.
Results: Non-detections should not be considered equivalent to real absences, as their reliability varied considerably. The occurrence of Hyla meridionalis and Triturus pygmaeus was related to a particular major ecosystem of the study area, where suitable habitat for these species seemed to be widely available. Characteristics of the breeding habitat (area and hydroperiod) were of high importance for the occurrence of Pelobates cultripes and Pleurodeles waltl. Terrestrial characteristics were the most important predictors of the occurrence of Discoglossus galganoi and Lissotriton boscai, along with the spatial distribution of breeding habitats for the last species.
Main conclusions: We did not find a single best-supported hypothesis valid for all species, which stresses the importance of multiscale and multifactor approaches. More importantly, this study shows that estimating the reliability of non-detection records, an exercise previously seen as a naïve goal in species distribution modelling, is feasible and could be promoted in future studies, at least in comparable systems.
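The reliability estimate described under Methods can be sketched as follows. The per-month detection probability, survey length and function name are hypothetical, and the exact weighting used in the study may differ:

```python
# With a species-specific monthly detection probability p, the chance of
# having detected a present species at least once in t survey months is
# 1 - (1 - p)**t. Low values make a non-detection unreliable as evidence
# of absence; the abstract uses such reliabilities to weight absences.
# The numbers below are illustrative assumptions.

def detection_reliability(p_monthly, months):
    return 1.0 - (1.0 - p_monthly) ** months

# A non-detection after 4 months of surveys with p = 0.3 per month:
print(round(detection_reliability(0.3, 4), 2))  # 0.76
```

A non-detection with reliability 0.76 carries less weight in the regression than a certain absence would, which is how data uncertainty enters the model.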
Abstract:
BACKGROUND: We assessed the prevalence of risk factors for cardiovascular disease (CVD) in a middle-income country in rapid epidemiological transition and estimated the direct costs of treating all individuals at increased cardiovascular risk, i.e. following the so-called "high-risk strategy". METHODS: Survey of risk factors using an age- and sex-stratified random sample of the population of Seychelles aged 25-64 in 2004. Assessment of CVD risk and treatment modalities were in line with international guidelines. Costs are expressed as USD per capita per year. RESULTS: 1255 persons took part in the survey (participation rate of 80.2%). Prevalence of the main risk factors was: 39.6% for high blood pressure (≥140/90 mmHg or treatment), of which 59% were under treatment; 24.2% for high cholesterol (≥6.2 mmol/l); 20.8% for low HDL-cholesterol (<1.0 mmol/l); 9.3% for diabetes (fasting glucose ≥7.0 mmol/l); 17.5% for smoking; 25.1% for obesity (body mass index ≥30 kg/m2) and 22.1% for the metabolic syndrome. Overall, 43% had high blood pressure, high cholesterol or diabetes and substantially increased CVD risk. The cost of the medications needed to treat all high-risk individuals amounted to USD 45.6, i.e. USD 11.2 for high blood pressure, USD 3.8 for diabetes, and USD 30.6 for dyslipidemia (using generic drugs except for hypercholesterolemia). The cost of minimal follow-up medical care and laboratory tests amounted to USD 22.6. CONCLUSION: A high prevalence of major risk factors was found in a rapidly developing country, and the cost of the treatment needed to reduce risk factors in all high-risk individuals exceeded the resources generally available in low- or middle-income countries. Our findings emphasize the need for affordable cost-effective treatment strategies and the critical importance of population strategies aimed at reducing risk factors in the entire population.
Abstract:
Gas sensing systems based on low-cost chemical sensor arrays are gaining interest for the analysis of multicomponent gas mixtures. These sensors show different problems, e.g., nonlinearities and slow time-response, which can be partially solved by digital signal processing. Our approach is based on building a nonlinear inverse dynamic system. Results for different identification techniques, including artificial neural networks and Wiener series, are compared in terms of measurement accuracy.
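As a toy illustration of the inverse-system idea (static rather than dynamic, for simplicity; the paper itself identifies dynamic inverse models with neural networks and Wiener series, and the sensor law below is an assumption):

```python
import math

# Toy static inverse model: the sensor maps concentration -> response
# through an assumed nonlinear law; we recover the concentration from a
# measured response by inverting the monotone map with bisection.
# The sensor law, search range and names are illustrative assumptions.

def sensor(c):
    # hypothetical nonlinear (compressive) sensor response
    return math.sqrt(c) + 0.1 * c

def inverse(r, lo=0.0, hi=100.0, iters=60):
    # bisection: sensor() is monotone, so the preimage of r is unique
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sensor(mid) < r:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(inverse(sensor(4.0)), 6))  # 4.0
```

The identification techniques compared in the paper effectively learn such an inverse map (and its dynamics) from calibration data rather than from a known sensor law.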
Abstract:
This paper estimates a model of airline competition for the Spanish air transport market. I test the explanatory power of alternative oligopoly models with capacity constraints. In addition, I analyse the degree of density economies. Results show that Spanish airlines' conduct follows a price-leadership scheme, so that it is less competitive than the Cournot solution. I also find evidence that thin routes can be considered natural monopolies.
Abstract:
This paper analyses the behaviour of pharmaceutical companies that face the threat of having their drugs excluded from reimbursement, in markets also characterised by price caps. We conclude that price elasticity of demand and cost differentials explain the price discounts which drug firms offer to health care organisations. Additionally, we conclude that price cap regulations affect the time path of prices, resulting in higher prices for new products and lower prices for old products.
Abstract:
1. Identifying areas suitable for recolonization by threatened species is essential to support efficient conservation policies. Habitat suitability models (HSM) predict species' potential distributions, but the quality of their predictions should be carefully assessed when the species-environment equilibrium assumption is violated.
2. We studied the Eurasian otter Lutra lutra, whose numbers are recovering in southern Italy. To produce widely applicable results, we chose standard HSM procedures and examined the models' capacity to predict the suitability of a recolonization area. We used two fieldwork datasets: presence-only data, used in the Ecological Niche Factor Analysis (ENFA), and presence-absence data, used in a Generalized Linear Model (GLM). In addition to cross-validation, we independently evaluated the models with data from a recolonization event, providing presences on a previously unoccupied river.
3. Three of the models successfully predicted the suitability of the recolonization area, but the GLM built with data before the recolonization disagreed with these predictions, missing the recolonized river's suitability and badly describing the otter's niche. Our results highlight three points of relevance to modelling practice: (1) absences may prevent the models from correctly identifying areas suitable for a species' spread; (2) the selection of variables may lead to randomness in the predictions; and (3) the Area Under the Curve (AUC), a commonly used validation index, was not well suited to the evaluation of model quality, whereas the Boyce Index (CBI), based on presence data only, better highlighted the models' fit to the recolonization observations.
4. For species with unstable spatial distributions, presence-only models may work better than presence-absence methods in making reliable predictions of suitable areas for expansion. An iterative modelling process, using new occurrences from each step of the species' spread, may also help in progressively reducing errors.
5. Synthesis and applications. Conservation plans depend on reliable models of the species' suitable habitats. In non-equilibrium situations, as is the case for threatened or invasive species, models could be affected negatively by the inclusion of absence data when predicting areas of potential expansion. Presence-only methods will here provide a better basis for productive conservation management practices.
Abstract:
Modeling the concentration-response function became extremely popular in ecotoxicology during the last decade, since modeling allows determining the total response pattern of a given substance. However, reliable modeling is demanding in terms of data, which is in contradiction with the current trend in ecotoxicology, which aims to reduce, for cost and ethical reasons, the number of data points produced during an experiment. It is therefore crucial to determine experimental designs in a cost-effective manner. In this paper, we propose to use the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs to estimate the toxicity of the herbicide dinoseb on daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters, and that these concentrations are often related to the parameters' meaning, i.e. they are located close to the parameter values. Furthermore, the results show that the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. In order to reduce the experimental cost and the use of test organisms, especially in long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of on preliminary experiments.
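The claim that the number of support points often equals the number of parameters can be illustrated with a small grid search. The exponential response model, nominal values and grid below are assumptions for illustration, not the dinoseb models from the paper:

```python
import math
from itertools import combinations

# Locally D-optimal design sketch for an assumed two-parameter response
# E[y] = a * exp(-b * c) at nominal values (a0, b0). We maximise the
# determinant of the Fisher information built from the parameter
# gradient f(c) = (exp(-b0*c), -a0*c*exp(-b0*c)) over all pairs of
# candidate concentrations. Model, nominals and grid are illustrative.

def grad(c, a0, b0):
    e = math.exp(-b0 * c)
    return e, -a0 * c * e

def det_info(design, a0, b0):
    # determinant of the 2x2 information matrix sum_i f(c_i) f(c_i)^T
    m11 = m12 = m22 = 0.0
    for c in design:
        g1, g2 = grad(c, a0, b0)
        m11 += g1 * g1
        m12 += g1 * g2
        m22 += g2 * g2
    return m11 * m22 - m12 * m12

a0, b0 = 1.0, 2.0
grid = [i * 0.01 for i in range(201)]  # candidate concentrations 0..2
best = max(combinations(grid, 2), key=lambda d: det_info(d, a0, b0))
print(best)  # two support points, close to (0, 1/b0) = (0, 0.5)
```

With two parameters the search settles on exactly two support points whose locations depend on the nominal value of b, mirroring the abstract's observations that the design has the minimal number of support points and sits near parameter-determined concentrations.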
Abstract:
The objective of this paper is to identify the political conditions most likely to be conducive to the development of social investment policies. It starts from the view put forward by theorists of welfare retrenchment that, in the current context of permanent austerity, policy is likely to be dominated by retrenchment and implemented in a way that allows governments to minimise the risk of electoral punishment (blame avoidance). It is argued that this view is inconsistent with developments observed in several European countries, where some welfare state expansion has taken place, mostly in the fields of childcare and active labour market policy. An alternative model is put forward that emphasises the notion of "affordable credit claiming". It is argued that even under strong budgetary pressures, governments maintain a preference for policies that allow them to claim credit for their actions. Since traditional redistributive policies tend to be off the menu for cost reasons, governments have tended to favour investments in childcare and active labour market policy as credit-claiming tools. Policies developed in this way, while they have a social investment flavour, tend to be rather limited in the extent to which they genuinely improve the prospects of disadvantaged people by investing in their human capital. A more ambitious strategy of social investment seems unlikely to develop on the basis of affordable credit claiming. The paper starts by presenting the theoretical argument, which is then illustrated with examples taken from European countries in both the pre-crisis and the post-crisis years.
Abstract:
AIM: To perform a systematic review on the costs and cost-effectiveness of concomitant and adjuvant temozolomide with radiotherapy for the treatment of newly diagnosed glioblastoma compared with initial radiotherapy alone. METHODS: Electronic databases were searched for relevant publications on costs and cost-effectiveness until October 2008. RESULTS: We found four relevant clinical trials, one cost study and two economic models. The mean survival benefit in the radiotherapy plus temozolomide group varied between 0.21 and 0.25 life-years. Treatment costs were between 27,365 euros and 39,092 euros. The costs of temozolomide amounted to approximately 40% of the total treatment costs. The incremental cost-effectiveness ratios found in the literature were 37,361 euros per life-year gained and 42,912 euros per quality-adjusted life-year gained. However, the models are not comparable because different outcomes are used (i.e., life-years and quality-adjusted life-years). CONCLUSION: Although the models are not comparable according to outcome, the incremental cost-effectiveness ratios found are within acceptable ranges. We concluded that despite the high temozolomide acquisition costs, the costs per life-year gained and the costs per quality-adjusted life-year gained are comparable with other accepted first-line treatments with chemotherapy in patients with cancer.