915 results for Lifetime warranties, Warranty policies, Cost models
Abstract:
Remotely sensed data have been used extensively for environmental monitoring and modeling at a number of spatial scales; however, a limited range of satellite imaging systems often constrained the scales of these analyses. A wider variety of data sets is now available, allowing image data to be selected to match the scale of environmental structure(s) or process(es) being examined. A framework is presented for use by environmental scientists and managers, enabling their spatial data collection needs to be linked to a suitable form of remotely sensed data. A six-step approach is used, combining image spatial analysis and scaling tools within the context of hierarchy theory. The main steps involved are: (1) identification of information requirements for the monitoring or management problem; (2) development of ideal image dimensions (scene model); (3) exploratory analysis of existing remotely sensed data using scaling techniques; (4) selection and evaluation of suitable remotely sensed data based on the scene model; (5) selection of suitable spatial analytic techniques to meet information requirements; and (6) cost-benefit analysis. Results from a case study show that the framework provided an objective mechanism to identify relevant aspects of the monitoring problem and environmental characteristics for selecting remotely sensed data and analysis techniques.
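As a rough illustration of steps (2) to (6) of this framework, the sketch below expresses a scene model as a target feature size, screens candidate sensors by pixel size, and ranks the survivors by cost. The per-scene costs are invented, the pixel sizes are approximate, and the "three pixels per feature" resolution rule is an assumption of this sketch, not the paper's criterion.

```python
# Hedged sketch: scene model -> sensor screening -> cost ranking.
candidate_sensors = [
    # (name, pixel size in metres, illustrative cost per scene in $)
    ("Landsat TM", 30.0, 600),
    ("SPOT HRG", 10.0, 2000),
    ("IKONOS MS", 4.0, 3500),
]

def select_sensor(feature_size_m, sensors, pixels_per_feature=3):
    """Return sensors able to resolve the target feature, cheapest first."""
    required_pixel = feature_size_m / pixels_per_feature
    suitable = [s for s in sensors if s[1] <= required_pixel]
    return sorted(suitable, key=lambda s: s[2])

# Scene model: environmental patches roughly 40 m across.
for name, pixel, cost in select_sensor(40.0, candidate_sensors):
    print(f"{name}: {pixel} m pixels, ${cost} per scene")
```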
Abstract:
BACKGROUND: Sustained virological response (SVR) is the primary objective in the treatment of chronic hepatitis C (CHC). Results from a recent clinical trial of patients with previously untreated CHC demonstrate that the combination of peginterferon alpha-2a and ribavirin produces a greater SVR than interferon alpha-2b and ribavirin combination therapy. However, the cost-effectiveness of peginterferon alpha-2a plus ribavirin in the U.S. setting has not been investigated. METHODS: A Markov model was developed to investigate cost-effectiveness in patients with CHC using genotype to guide treatment duration. SVR and disease progression parameters were derived from the clinical trials and epidemiologic studies. The impact of treatment on life expectancy and costs was projected over a lifetime. Patients who had an SVR were assumed to remain virus-free for the rest of their lives. In genotype 1 patients, the SVRs were 46% for peginterferon alpha-2a plus ribavirin and 36% for interferon alpha-2b plus ribavirin. In genotype 2/3 patients, the SVRs were 76% for peginterferon alpha-2a plus ribavirin and 61% for interferon alpha-2b plus ribavirin. Quality of life and costs were based on estimates from the literature. All costs were based on published U.S. medical care costs and were adjusted to 2003 U.S. dollars. Costs and benefits beyond the first year were discounted at 3%. RESULTS: In genotype 1 patients, peginterferon alpha-2a plus ribavirin increases quality-adjusted life expectancy by 0.70 quality-adjusted life years (QALYs) compared to interferon alpha-2b plus ribavirin, producing a cost-effectiveness ratio of $2,600 per QALY gained. In genotype 2/3 patients, peginterferon alpha-2a plus ribavirin increases quality-adjusted life expectancy by 1.05 QALYs in comparison to interferon alpha-2b plus ribavirin. Peginterferon alpha-2a combination therapy in patients with HCV genotype 2 or 3 is dominant (more effective and cost saving) compared to interferon alpha-2b plus ribavirin. Results weighted by genotype prevalence (75% genotype 1; 25% genotype 2 or 3) also show that peginterferon alpha-2a plus ribavirin is dominant. Peginterferon alpha-2a and ribavirin remained cost-effective (below $16,500 per QALY gained) under sensitivity analyses on key clinical and cost parameters. CONCLUSION: Peginterferon alpha-2a in combination with ribavirin, with duration of therapy based on genotype, is cost-effective compared with conventional interferon alpha-2b in combination with ribavirin when given to treatment-naive adults with CHC.
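The core calculation behind these results is the incremental cost-effectiveness ratio (ICER), i.e. incremental lifetime cost divided by QALYs gained, with dominance declared when the new regimen is both cheaper and more effective. The sketch below reproduces that arithmetic; the QALY gains come from the abstract, the genotype 1 incremental cost is back-calculated from the reported $2,600/QALY ratio, and the genotype 2/3 cost saving is a purely hypothetical placeholder.

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio, or 'dominant' when the new
    therapy is both more effective and cost saving."""
    if delta_qaly > 0 and delta_cost <= 0:
        return "dominant"
    return delta_cost / delta_qaly

# Genotype 1: peginterferon alpha-2a + ribavirin vs interferon alpha-2b + ribavirin.
# Incremental cost of $1,820 is back-calculated from $2,600/QALY x 0.70 QALYs.
print(icer(delta_cost=1820.0, delta_qaly=0.70))   # -> 2600.0 $/QALY

# Genotype 2/3: assumed (hypothetical) net cost saving -> dominant.
print(icer(delta_cost=-500.0, delta_qaly=1.05))   # -> 'dominant'
```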
Abstract:
This paper considers the economics of conserving a species with mainly non-use value, the endangered mahogany glider. Three serial surveys of Brisbane residents provide data on the knowledge of respondents about the mahogany glider. The results supply information about the attitudes of respondents to the mahogany glider, to its conservation and relevant public policies, and about variations in these factors as the knowledge of participants of the mahogany glider alters. Similarly, data are provided and analysed about the willingness to pay of respondents to conserve the mahogany glider and how it changes. Population viability analysis is applied to estimate the required habitat area for a minimum viable population of the mahogany glider to ensure at least a 95% probability of its survival for 100 years. Places are identified in Queensland where the requisite minimum area of critical habitat can be conserved. Using the survey results as a basis, the likely willingness of groups of Australians to pay for the conservation of the mahogany glider is estimated and consequently their willingness to pay for the minimum required area of its habitat. Methods for estimating the cost of protecting this habitat are outlined. Australia-wide benefits are estimated to exceed the costs. Establishing a national park containing the minimum viable population of the mahogany glider is an appealing management option. This would also be beneficial in conserving other endangered wildlife species and ecosystems. Therefore, additional economic benefits to those estimated on account of the mahogany glider itself can be obtained. (C) 2004 Elsevier Ltd. All rights reserved.
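A back-of-the-envelope version of the benefit-cost comparison described above can be written as follows; every figure (number of households, mean willingness to pay, time horizon, discount rate, habitat cost) is a placeholder assumption, not a value from the surveys or the paper.

```python
def present_value(annual_amount, years, discount_rate):
    """Present value of a constant annual amount over `years`."""
    return sum(annual_amount / (1 + discount_rate) ** t for t in range(1, years + 1))

households = 7_000_000        # assumed number of contributing households
mean_annual_wtp = 1.50        # assumed mean WTP per household per year ($)
aggregate_benefit = present_value(households * mean_annual_wtp,
                                  years=10, discount_rate=0.05)

habitat_cost = 30_000_000     # assumed cost of protecting the minimum habitat ($)
print(f"PV of benefits: ${aggregate_benefit:,.0f}")
print(f"Habitat cost  : ${habitat_cost:,.0f}")
print("benefits exceed costs" if aggregate_benefit > habitat_cost
      else "costs exceed benefits")
```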
Abstract:
We develop foreign bank technical, cost and profit efficiency models for particular application with data envelopment analysis (DEA). Key motivations for the paper are (a) the often-observed practice of choosing inputs and outputs where the selection process is poorly explained and linkages to theory are unclear, and (b) foreign bank productivity analysis, which has been neglected in DEA banking literature. The main aim is to demonstrate a process grounded in finance and banking theories for developing bank efficiency models, which can bring comparability and direction to empirical productivity studies. We expect this paper to foster empirical bank productivity studies.
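For readers unfamiliar with DEA, the sketch below shows the standard input-oriented, constant-returns-to-scale envelopment model (CCR) that bank efficiency models of this kind build on, solved as a linear program. The input and output data are invented and do not reflect the paper's theory-driven variable selection.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data only: 2 inputs and 1 output for 3 hypothetical banks.
X = np.array([[20.0, 30.0, 40.0],      # input 1 (e.g. staff)
              [300.0, 200.0, 500.0]])  # input 2 (e.g. deposits)
Y = np.array([[100.0, 80.0, 150.0]])   # output  (e.g. loans)

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of decision-making unit `o`.
    Decision vector z = [theta, lambda_1, ..., lambda_n]."""
    m, n = X.shape                      # inputs, DMUs
    s = Y.shape[0]                      # outputs
    c = np.r_[1.0, np.zeros(n)]         # minimise theta
    # inputs:  sum_j lambda_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # outputs: -sum_j lambda_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(X.shape[1]):
    print(f"bank {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```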
Abstract:
In this paper we propose a range of dynamic data envelopment analysis (DEA) models which allow information on costs of adjustment to be incorporated into the DEA framework. We first specify a basic dynamic DEA model predicated on a number of simplifying assumptions. We then outline a number of extensions to this model to accommodate asymmetric adjustment costs, non-static output quantities, non-static input prices, non-static costs of adjustment, technological change, quasi-fixed inputs and investment budget constraints. The new dynamic DEA models provide valuable extra information relative to the standard static DEA models: they identify an optimal path of adjustment for the input quantities, and provide a measure of the potential cost savings that result from recognising the costs of adjusting input quantities towards the optimal point. The new models are illustrated using data relating to a chain of 35 retail department stores in Chile. The empirical results illustrate the wealth of information that can be derived from these models, and clearly show that static models overstate potential cost savings when adjustment costs are non-zero.
Abstract:
The measurement of lifetime prevalence of depression in cross-sectional surveys is biased by recall problems. We estimated it indirectly for two countries using modelling, and quantified the underestimation in the empirical estimate for one. A microsimulation model was used to generate population-based epidemiological measures of depression. We fitted the model to 1- and 12-month prevalence data from the Netherlands Mental Health Survey and Incidence Study (NEMESIS) and the Australian Adult Mental Health and Wellbeing Survey. The lowest proportion of cases ever having an episode in their life is 30% of men and 40% of women, for both countries. This corresponds to a lifetime prevalence of 20 and 30%, respectively, in a cross-sectional setting (aged 15-65). The NEMESIS data were 38% lower than these estimates. We conclude that modelling enabled us to estimate lifetime prevalence of depression indirectly. This method is useful in the absence of direct measurement, but also showed that direct estimates are underestimated by recall bias and by the cross-sectional setting.
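A toy version of the microsimulation idea is sketched below: individual histories are generated from an assumed annual incidence of first episodes, and an assumed per-year recall-failure probability makes the directly "measured" lifetime prevalence fall below the true one. The rates are invented for illustration and are not the NEMESIS or Australian survey parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
ages = rng.integers(15, 66, size=N)      # cross-sectional sample aged 15-65
annual_incidence = 0.01                  # assumed annual first-episode risk
p_recall_per_year = 0.97                 # assumed chance an episode is still recalled per elapsed year

years_at_risk = ages - 15
first_onset = rng.geometric(annual_incidence, size=N)   # years after age 15 until first episode
ever = first_onset <= years_at_risk                     # true lifetime prevalence indicator

years_since_onset = np.where(ever, years_at_risk - first_onset, 0)
recalled = ever & (rng.random(N) < p_recall_per_year ** years_since_onset)

print("true lifetime prevalence    :", round(ever.mean(), 3))
print("recalled (survey) prevalence:", round(recalled.mean(), 3))
```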
Abstract:
Objective: To assess from a health sector perspective the incremental cost-effectiveness of eight drug treatment scenarios for established schizophrenia. Method: Using a standardized methodology, costs and outcomes are modelled over the lifetime of prevalent cases of schizophrenia in Australia in 2000. A two-stage approach to assessment of health benefit is used. The first stage involves a quantitative analysis based on disability-adjusted life years (DALYs) averted, using best available evidence. The robustness of results is tested using probabilistic uncertainty analysis. The second stage involves application of 'second filter' criteria (equity, strength of evidence, feasibility and acceptability) to allow broader concepts of benefit to be considered. Results: Replacing oral typicals with risperidone or olanzapine has an incremental cost-effectiveness ratio (ICER) of A$48 000 and A$92 000/DALY respectively. Switching from low-dose typicals to risperidone has an ICER of A$80 000. Giving risperidone to people experiencing side-effects on typicals is more cost-effective at A$20 000. Giving clozapine to people taking typicals, with the worst course of the disorder and either little or clear deterioration, is cost-effective at A$42 000 or A$23 000/DALY respectively. The least cost-effective intervention is to replace risperidone with olanzapine at A$160 000/DALY. Conclusions: Based on an A$50 000/DALY threshold, low-dose typical neuroleptics are indicated as the treatment of choice for established schizophrenia, with risperidone being reserved for those experiencing moderate to severe side-effects on typicals. The more expensive olanzapine should only be prescribed when risperidone is not clinically indicated. The high cost of risperidone and olanzapine relative to modest health gains underlies this conclusion. Earlier introduction of clozapine, however, would be cost-effective. This work is limited by weaknesses in trials (lack of long-term efficacy data, quality of life and consumer satisfaction evidence) and the translation of effect size into a DALY change. Some stakeholders, including SANE Australia, argue that the modest health gains reported in the literature do not adequately reflect perceptions by patients, clinicians and carers of improved quality of life with these atypicals.
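The decision rule in the conclusions amounts to comparing each intervention's ICER with the A$50,000/DALY threshold. The short sketch below performs that comparison using the ICER values quoted in the abstract.

```python
THRESHOLD = 50_000  # A$ per DALY averted

icers = {  # A$/DALY, as reported in the abstract
    "oral typicals -> risperidone": 48_000,
    "oral typicals -> olanzapine": 92_000,
    "low-dose typicals -> risperidone": 80_000,
    "risperidone for side-effects on typicals": 20_000,
    "clozapine, worst course, little deterioration": 42_000,
    "clozapine, worst course, clear deterioration": 23_000,
    "risperidone -> olanzapine": 160_000,
}

for intervention, icer in sorted(icers.items(), key=lambda kv: kv[1]):
    verdict = "cost-effective" if icer <= THRESHOLD else "not cost-effective"
    print(f"{intervention:47s} A${icer:>7,}/DALY  {verdict}")
```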
Abstract:
For leased equipment, the lessor carries out the maintenance of the equipment. Usually, the lease contract specifies the penalty for equipment failures and for repairs not being carried out within specified time limits. This implies that optimal preventive maintenance policies must take these penalty costs into account and trade them off properly against the cost of preventive maintenance actions. The costs associated with failures are high, as unplanned corrective maintenance actions are costly, as are the penalties incurred when lease contract terms are violated. The paper develops a model to determine the optimal parameters of a preventive maintenance policy that takes all these costs into account to minimize the total expected cost to the lessor for a new item lease. The parameters of the policy are (i) the number of preventive maintenance actions to be carried out over the lease period, (ii) the time instants for such actions, and (iii) the level of each action. (c) 2005 Elsevier B.V. All rights reserved.
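A simplified numerical version of this trade-off is sketched below: the number of preventive maintenance (PM) actions over the lease period is chosen to balance PM cost against the expected cost of failures and contract penalties. The assumptions (power-law failure intensity, equally spaced PM actions that restore the item to as-good-as-new, minimal repair between PM actions, and all cost values) are mine for illustration and simpler than the paper's model, which also optimises the timing and level of each action.

```python
def expected_total_cost(n_pm, lease=5.0, a=0.4, b=2.0,
                        c_pm=200.0, c_failure=1500.0):
    """Expected lessor cost for `n_pm` equally spaced PM actions over the lease.
    `c_failure` bundles the corrective repair cost and the expected penalty;
    failures follow an assumed power-law intensity a*b*t**(b-1)."""
    interval = lease / (n_pm + 1)
    expected_failures = (n_pm + 1) * a * interval ** b
    return c_pm * n_pm + c_failure * expected_failures

costs = {n: expected_total_cost(n) for n in range(0, 11)}
best = min(costs, key=costs.get)
print(f"optimal number of PM actions: {best} "
      f"(expected total cost {costs[best]:.0f})")
```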
Abstract:
Traditional sensitivity and elasticity analyses of matrix population models have been used to inform management decisions, but they ignore the economic costs of manipulating vital rates. For example, the growth rate of a population is often most sensitive to changes in adult survival rate, but this does not mean that increasing that rate is the best option for managing the population, because it may be much more expensive than other options. To explore how managers should optimize their manipulation of vital rates, we incorporated the cost of changing those rates into matrix population models. We derived analytic expressions for locations in parameter space where managers should shift between management of fecundity and survival, for the balance between fecundity and survival management at those boundaries, and for the allocation of management resources to sustain that optimal balance. For simple matrices, the optimal budget allocation can often be expressed as simple functions of vital rates and the relative costs of changing them. We applied our method to management of the Helmeted Honeyeater (Lichenostomus melanops cassidix; an endangered Australian bird) and the koala (Phascolarctos cinereus) as examples. Our method showed that cost-efficient management of the Helmeted Honeyeater should focus on increasing fecundity via nest protection, whereas optimal koala management should focus on manipulating both fecundity and survival simultaneously. These findings are contrary to the cost-negligent recommendations of elasticity analysis, which would suggest focusing on managing survival in both cases. A further investigation of Helmeted Honeyeater management options, based on an individual-based model incorporating density dependence, spatial structure, and environmental stochasticity, confirmed that fecundity management was the most cost-effective strategy. Our results demonstrate that decisions that ignore economic factors will reduce management efficiency.
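The idea of folding costs into a standard eigenvalue sensitivity analysis can be sketched as follows: classic sensitivities of the population growth rate are computed from the left and right dominant eigenvectors of a projection matrix and then divided by an assumed marginal cost of changing each vital rate, so that effort is directed at the cheapest unit of growth. The matrix and costs below are illustrative, not the Helmeted Honeyeater or koala parameters.

```python
import numpy as np

# Illustrative 3-stage projection matrix (fecundities in the first row,
# survival/transition rates below); not the paper's parameters.
A = np.array([[0.0, 1.2, 2.0],
              [0.4, 0.0, 0.0],
              [0.0, 0.6, 0.8]])

# Assumed marginal cost ($) of a unit increase in each vital rate;
# np.inf marks entries that cannot be managed.
cost = np.array([[np.inf, 5000.0, 8000.0],
                 [2000.0, np.inf, np.inf],
                 [np.inf, 3000.0, 10000.0]])

eigvals, W = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
lam = eigvals[k].real
w = W[:, k].real                                  # stable stage distribution

eigvals_t, V = np.linalg.eig(A.T)
v = V[:, int(np.argmax(eigvals_t.real))].real     # reproductive values

S = np.outer(v, w) / (v @ w)   # classic sensitivities d(lambda)/d(a_ij)
per_dollar = S / cost          # growth-rate gain per dollar spent

print(f"lambda = {lam:.3f}")
print("classic sensitivities:\n", np.round(S, 3))
print("sensitivity per dollar spent:\n", np.round(per_dollar, 7))
```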
Abstract:
Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns with transitional gradients from one vegetation community to another. Arbitrary, though often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessments, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, independent data set model validation, and scale assessment of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r2 = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly-studied areas. (c) 2006 Elsevier B.V. All rights reserved.
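A condensed sketch of the per-community modelling step is given below: one statistical (here, logistic) model is fitted per vegetation community on environmental predictors, and the predictions are combined into a composite map by assigning each cell the community with the highest predicted probability. The data are synthetic; the paper integrates 28 such models within a GIS and adds ecological rules.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_cells, n_predictors, n_communities = 500, 4, 3
X = rng.normal(size=(n_cells, n_predictors))          # e.g. elevation, rainfall, soil, slope
W_true = rng.normal(size=(n_predictors, n_communities))
true_community = (X @ W_true
                  + rng.normal(scale=0.5, size=(n_cells, n_communities))).argmax(axis=1)

# One presence/absence model per community, mirroring the discrete community models.
models = [LogisticRegression(max_iter=1000).fit(X, (true_community == c).astype(int))
          for c in range(n_communities)]

# Composite map: each cell gets the community with the highest predicted probability.
probs = np.column_stack([m.predict_proba(X)[:, 1] for m in models])
composite = probs.argmax(axis=1)
print("agreement with the synthetic reference map:",
      round((composite == true_community).mean(), 3))
```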
Abstract:
We present the design rationale and basic workings of a low-cost, easy-to-use power system simulator developed to support investigations into human interface design for a hydropower plant. The power system simulator is based on three important components: models of power system components, a data repository, and human interface elements. Dynamic Data Exchange (DDE) allows simulator components to communicate with each other within the simulator. To construct the modules of the simulator we have combined the advantages of commercial software such as Matlab/Simulink, ActiveX Control, Visual Basic and Excel and integrated them in the simulator. An important advantage of our approach is that further components of the simulator can now be developed independently. An initial assessment of the simulator indicates it is fit for its intended purpose.
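The modular structure described above can be caricatured as independent components exchanging values over a shared channel. In the sketch below, a simple in-process publish/subscribe bus stands in for the DDE links between Simulink, Visual Basic and Excel; this is an illustrative substitution, not the simulator's actual mechanism.

```python
from collections import defaultdict

class Bus:
    """Minimal in-process publish/subscribe channel connecting simulator components."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, value):
        for handler in self._subscribers[topic]:
            handler(value)

bus = Bus()
repository = {}                                       # stands in for the data repository
bus.subscribe("turbine.power_mw", lambda v: repository.update(power_mw=v))
bus.subscribe("turbine.power_mw", lambda v: print(f"HMI gauge reads {v:.1f} MW"))

# The plant-model component publishes a new operating point at each simulation step.
for power in (52.0, 55.5, 60.2):
    bus.publish("turbine.power_mw", power)
print("repository contents:", repository)
```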
Abstract:
This thesis presents the formal definition of a novel Mobile Cloud Computing (MCC) extension of the Networked Autonomic Machine (NAM) framework, a general-purpose conceptual tool which describes large-scale distributed autonomic systems. The introduction of autonomic policies in the MCC paradigm has proved to be an effective technique to increase the robustness and flexibility of MCC systems. In particular, autonomic policies based on continuous resource and connectivity monitoring help automate context-aware decisions for computation offloading. We have also provided NAM with a formalization in terms of a transformational operational semantics in order to fill the gap between its existing Java implementation NAM4J and its conceptual definition. Moreover, we have extended NAM4J by adding several components with the purpose of managing large-scale autonomic distributed environments. In particular, the middleware allows for the implementation of peer-to-peer (P2P) networks of NAM nodes. Moreover, NAM mobility actions have been implemented to enable the migration of code, execution state and data. Within NAM4J, we have designed and developed a component, denoted as context bus, which is particularly useful in collaborative applications in that, if replicated on each peer, it instantiates a virtual shared channel allowing nodes to notify and get notified about context events. Regarding autonomic policy management, we have provided NAM4J with a rule engine, whose purpose is to allow a system to autonomously determine when offloading is convenient. We have also provided NAM4J with trust and reputation management mechanisms to make the middleware suitable for applications in which such aspects are of great interest. To this purpose, we have designed and implemented a distributed framework, denoted as DARTSense, where no central server is required, as reputation values are stored and updated by participants in a subjective fashion. We have also investigated the literature regarding MCC systems. The analysis pointed out that all MCC models focus on mobile devices, and consider the Cloud as a system with unlimited resources. To contribute to filling this gap, we defined a modeling and simulation framework for the design and analysis of MCC systems, encompassing both the mobile and the Cloud sides. We have also implemented a modular and reusable simulator of the model. We have applied the NAM principles to two different application scenarios. First, we have defined a hybrid P2P/cloud approach where components and protocols are autonomically configured according to specific target goals, such as cost-effectiveness, reliability and availability. Merging P2P and cloud paradigms brings together the advantages of both: high availability, provided by the Cloud presence, and low cost, by exploiting inexpensive peer resources. As an example, we have shown how the proposed approach can be used to design NAM-based collaborative storage systems based on an autonomic policy to decide how to distribute data chunks among peers and Cloud, according to cost minimization and data availability goals. As a second application, we have defined an autonomic architecture for decentralized urban participatory sensing (UPS) which bridges sensor networks and mobile systems to improve effectiveness and efficiency. The developed application allows users to retrieve and publish different types of sensed information by using the features provided by NAM4J's context bus.
Trust and reputation are managed through the application of DARTSense mechanisms. The application also includes an autonomic policy that detects areas characterized by few contributors and tries to recruit new providers by migrating the code necessary for sensing, through NAM mobility actions.
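The offloading rule engine described above can be illustrated with a toy policy: monitored connectivity, battery and task estimates feed a function that decides whether local execution or offloading is expected to be cheaper. The cost model and thresholds below are invented for illustration and are not NAM4J's actual rules.

```python
from dataclasses import dataclass

@dataclass
class Context:
    bandwidth_mbps: float     # monitored uplink bandwidth
    rtt_ms: float             # monitored round-trip time to the cloud
    battery_pct: float        # remaining battery
    input_mb: float           # data to ship if the task is offloaded
    local_exec_s: float       # estimated local execution time
    cloud_exec_s: float       # estimated cloud execution time

def should_offload(ctx: Context) -> bool:
    """Offload when the estimated remote completion time beats local execution,
    or when battery is too low to run the task locally (illustrative rule)."""
    transfer_s = ctx.input_mb * 8 / max(ctx.bandwidth_mbps, 0.1) + ctx.rtt_ms / 1000
    if ctx.battery_pct < 15:
        return True
    return transfer_s + ctx.cloud_exec_s < ctx.local_exec_s

print(should_offload(Context(bandwidth_mbps=20.0, rtt_ms=60.0, battery_pct=80.0,
                             input_mb=5.0, local_exec_s=12.0, cloud_exec_s=2.0)))
```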
Abstract:
The various theories about capital structure have attracted interest and motivated many studies on the subject, yet no consensus has been reached. Another apparently little-explored topic concerns the corporate life cycle and how it may influence capital structure. This study aimed to determine which determinants are most relevant to firms' indebtedness and whether these determinants change depending on the firm's life-cycle stage, drawing on the Trade-Off, Pecking Order and Agency theories. To achieve this objective, a fixed-effects panel analysis was used; the sample comprised Brazilian publicly traded companies, with secondary data available in Economática® for the period 2005 to 2013, using the BM&FBOVESPA sectors. The main result is that the overall sample and the high- and low-growth groups behave in the same way with respect to book leverage for the Profitability determinant, which shows a negative relationship, and for the Growth Opportunity and Size determinants, which show a positive relationship. For the high- and low-growth groups some determinants produced different results: uniqueness was significant in both groups, being positive for low growth and negative for high growth, while the collateral value of assets and the non-debt tax shield were significant only in the low-growth group. For market-value leverage, significance was observed for the non-debt tax shield and uniqueness. This result reinforces the argument that the life cycle influences capital structure.
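The estimation design described above is a fixed-effects panel regression of leverage on its candidate determinants. The sketch below shows a minimal version with firm fixed effects on synthetic data; the variable names, coefficients and sample are placeholders, not results from the Economática® data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
firms, years = 50, 9                                    # toy panel, 2005-2013
n = firms * years
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), years),
    "year": np.tile(np.arange(2005, 2014), firms),
    "profitability": rng.normal(0.10, 0.05, n),
    "growth_opportunity": rng.normal(1.0, 0.3, n),
    "size": rng.normal(14.0, 1.5, n),
})
# Synthetic leverage with an assumed negative profitability effect.
df["leverage"] = (0.5 - 0.8 * df.profitability + 0.05 * df.growth_opportunity
                  + 0.02 * df.size + rng.normal(0, 0.05, n))

# Firm fixed effects via dummy variables (equivalent to the within estimator).
model = smf.ols("leverage ~ profitability + growth_opportunity + size + C(firm)",
                data=df).fit()
print(model.params[["profitability", "growth_opportunity", "size"]])
```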
Abstract:
This article investigates the relationship between simultaneity in decisions regarding business strategies and human resource management (HRM) policies and their impact on organizational performance. The research is based on a sample of 178 organizations operating in the Greek manufacturing sector. The results of this study support the hypothesis that when business strategies and HRM policies are developed simultaneously, they positively affect organizational performance. This is more valid for decisions taken simultaneously with respect to quality and employee development, innovation and employee rewards and relations, and cost and employee resourcing. © 2008 Wiley Periodicals, Inc.