906 results for log-based cost analysis


Relevance:

100.00%

Publisher:

Abstract:

Finland has large forest fuel resources. However, the use of forest fuels for energy production has been low, except for small-scale use in heating. According to national action plans and programs for wood energy promotion, the utilization of these resources will be multiplied over the next few years. The most significant part of this growth will be based on forest fuels produced from the logging residues of regeneration fellings and used in industrial and municipal power and heating plants. The availability of logging residues was analyzed by means of resource and demand approaches in order to identify the regions most suitable for increasing forest fuel usage. The analysis included availability and supply cost comparisons between power plant sites, with resources allocated in a least-cost manner, and between plants in a predefined power plant structure under demand and supply constraints. Spatial analysis of worksite factors and regional geographies was carried out in a GIS-model environment using geoprocessing and cartographic modeling tools. According to the results of the analyses, the cost competitiveness of forest fuel supply must be improved if the stated objectives are to be achieved in the near future. The availability and supply costs of forest fuels varied spatially and were very sensitive to worksite factors and transport distances. According to the site-specific analysis, the supply potential can differ manyfold between locations. However, for technical and economic reasons of fuel supply, and because of the dense power plant infrastructure, the supply potential is limited at plant level. The potential and supply cost calculations therefore depend on site-specific conditions, and regional characteristics of resources and infrastructure should be taken into consideration, for example by using the GIS-modeling approach constructed in this study.
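
For illustration, a minimal Python sketch of the least-cost allocation idea, with invented supply, demand, and transport-cost figures; the study's actual GIS optimization is not reproduced here, and a greedy heuristic stands in for the spatial model.

```python
# Illustrative least-cost allocation of logging-residue supply points to
# power plants; all figures are hypothetical, not from the study.
supply = {"site_A": 500.0, "site_B": 300.0, "site_C": 800.0}   # tonnes available
demand = {"plant_1": 900.0, "plant_2": 600.0}                  # tonnes required
transport_cost = {                                             # EUR per tonne
    ("site_A", "plant_1"): 4.0, ("site_A", "plant_2"): 7.5,
    ("site_B", "plant_1"): 6.0, ("site_B", "plant_2"): 3.5,
    ("site_C", "plant_1"): 5.5, ("site_C", "plant_2"): 9.0,
}

def allocate(supply, demand, cost):
    """Greedy heuristic: serve the cheapest site-plant pair first."""
    left_s, left_d = dict(supply), dict(demand)
    allocation = {}
    for (site, plant), c in sorted(cost.items(), key=lambda kv: kv[1]):
        qty = min(left_s[site], left_d[plant])
        if qty > 0:
            allocation[(site, plant)] = qty
            left_s[site] -= qty
            left_d[plant] -= qty
    return allocation

for (site, plant), qty in allocate(supply, demand, transport_cost).items():
    print(f"{site} -> {plant}: {qty:.0f} t")
```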

Relevance:

100.00%

Publisher:

Abstract:

A simulation model adopting a health system perspective showed population-based screening with DXA, followed by alendronate treatment of persons with osteoporosis, or with anamnestic fracture and osteopenia, to be cost-effective in Swiss postmenopausal women from age 70, but not in men. INTRODUCTION: We assessed the cost-effectiveness of a population-based screen-and-treat strategy for osteoporosis (DXA followed by alendronate treatment if osteoporotic, or osteopenic in the presence of fracture), compared to no intervention, from the perspective of the Swiss health care system. METHODS: A published Markov model assessed by first-order Monte Carlo simulation was refined to reflect the diagnostic process and treatment effects. Women and men entered the model at age 50. Main screening ages were 65, 75, and 85 years. Age at bone densitometry was flexible for persons fracturing before the main screening age. Realistic assumptions were made with respect to persistence with intended 5 years of alendronate treatment. The main outcome was cost per quality-adjusted life year (QALY) gained. RESULTS: In women, costs per QALY were Swiss francs (CHF) 71,000, CHF 35,000, and CHF 28,000 for the main screening ages of 65, 75, and 85 years. The threshold of CHF 50,000 per QALY was reached between main screening ages 65 and 75 years. Population-based screening was not cost-effective in men. CONCLUSION: Population-based DXA screening, followed by alendronate treatment in the presence of osteoporosis, or of fracture and osteopenia, is a cost-effective option in Swiss postmenopausal women after age 70.
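
As a toy illustration of the first-order Monte Carlo approach, the sketch below runs individual patients through a two-state yearly-cycle model and computes an incremental cost per QALY; every probability, cost, and utility is invented, not taken from the study's Markov model.

```python
import random

# Hypothetical two-state toy version of a first-order Monte Carlo Markov
# model: each simulated woman moves between "well" and "fracture" states in
# yearly cycles, accumulating costs and QALYs. All numbers are placeholders.
P_FRACTURE = {"screen_and_treat": 0.02, "no_intervention": 0.03}
COST = {"screen_and_treat": 900.0, "no_intervention": 0.0}   # CHF per year
UTILITY = {"well": 0.85, "fracture": 0.60}
FRACTURE_COST = 15000.0                                      # CHF per event

def simulate(strategy, years=20, n=10000, seed=1):
    rng = random.Random(seed)
    total_cost = total_qaly = 0.0
    for _ in range(n):
        state = "well"
        for _ in range(years):
            if state == "well" and rng.random() < P_FRACTURE[strategy]:
                state = "fracture"              # absorbing in this toy model
                total_cost += FRACTURE_COST
            total_cost += COST[strategy]
            total_qaly += UTILITY[state]
    return total_cost / n, total_qaly / n

c1, q1 = simulate("screen_and_treat")
c0, q0 = simulate("no_intervention")
print(f"ICER: CHF {(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```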

Relevance:

100.00%

Publisher:

Abstract:

With the aim of comparing the cost of rheumatoid arthritis treatment with disease-modifying antirheumatic drugs (DMARDs) over a 48-month period, five treatment stages based on the clinical protocols recommended by the Brazilian Society of Rheumatology were studied across five therapy cycles. The analytical model, based on Markov analysis, considered the chances that a patient remains in a given stage or moves between stages according to a positive effect on outcomes. Only direct costs, such as drugs, materials, and the tests used for monitoring these patients, were included in the analyzed data. The results of the model show that the stage in which methotrexate is used as monotherapy was the most cost-effective (R$ 113,900.00 per patient over 48 months), followed by the refractory patients (R$ 1,554,483.43), those using triple therapy followed by infliximab (R$ 1,701,286.76), the methotrexate-intolerant patients (R$ 2,629,919.14), and finally those starting on methotrexate plus infliximab (R$ 9,292,879.31). The sensitivity analysis, alternating the efficacy of methotrexate and infliximab, confirmed these results.
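
The Markov costing logic can be sketched as a deterministic cohort trace over monthly cycles; the transition chances and per-cycle costs below are placeholders, not the protocol values used in the study.

```python
# Deterministic Markov cohort trace for one treatment stage; transition
# chances and costs are illustrative, not the study's inputs.
states = ["responding", "refractory"]
transition = {  # chance of moving from row state to column state per cycle
    "responding": {"responding": 0.90, "refractory": 0.10},
    "refractory": {"responding": 0.00, "refractory": 1.00},
}
cycle_cost = {"responding": 350.0, "refractory": 1200.0}  # R$ per cycle

cohort = {"responding": 1.0, "refractory": 0.0}  # everyone starts responding
total_cost = 0.0
for cycle in range(48):  # 48 monthly cycles
    total_cost += sum(cohort[s] * cycle_cost[s] for s in states)
    cohort = {
        t: sum(cohort[s] * transition[s][t] for s in states) for t in states
    }
print(f"Expected 48-month cost per patient: R$ {total_cost:,.2f}")
```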

Relevance:

100.00%

Publisher:

Abstract:

Chronic graft-versus-host disease (cGvHD) is the leading cause of late nonrelapse mortality (transplant-related mortality) after hematopoietic stem cell transplant. Given that there is a wide range of treatment options for cGvHD, assessment of the associated costs and efficacy can help clinicians and health care providers allocate health care resources more efficiently. OBJECTIVE: The purpose of this study was to assess the cost-effectiveness of extracorporeal photopheresis (ECP) compared with rituximab (Rmb) and with imatinib (Imt) in patients with cGvHD at 5 years from the perspective of the Spanish National Health System. METHODS: The model assessed the incremental cost-effectiveness/utility ratio of ECP versus Rmb or Imt for 1000 hypothetical patients by using microsimulation cost-effectiveness techniques. Model probabilities were obtained from the literature. Treatment pathways and adverse events were evaluated taking clinical opinion and published reports into consideration. Local data on costs (2010 euros) and health care resource utilization were validated by the clinical authors. Probabilistic sensitivity analyses were used to assess the robustness of the model. RESULTS: The greater efficacy of ECP resulted in a gain of 0.011 to 0.024 quality-adjusted life-years in the first year and 0.062 to 0.094 at year 5 compared with Rmb or Imt. The results showed that the higher acquisition cost of ECP versus Imt was compensated for at 9 months by greater efficacy; this higher cost was partially compensated for (€517) by year 5 versus Rmb. After 9 months, ECP was dominant (cheaper and more effective) compared with Imt. The incremental cost-effectiveness ratio of ECP versus Rmb was €29,646 per life-year gained and €24,442 per quality-adjusted life-year gained at year 2.5. Probabilistic sensitivity analysis confirmed the results. The main study limitation was that only small studies were available for the indirect comparison of relative treatment effects. CONCLUSION: ECP as a third-line therapy for cGvHD is a more cost-effective strategy than Rmb or Imt.
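
The incremental cost-effectiveness arithmetic behind results like these reduces to the short sketch below; the cost and QALY figures are placeholders, not the study's outputs.

```python
# Sketch of incremental cost-effectiveness ratio (ICER) arithmetic with a
# dominance check; all inputs are hypothetical.
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost per QALY gained; 'dominant' means cheaper and better."""
    d_cost, d_qaly = cost_new - cost_ref, qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    return d_cost / d_qaly

# Placeholder figures for a new therapy versus a comparator at 5 years.
print(icer(cost_new=41000.0, qaly_new=3.20, cost_ref=39500.0, qaly_ref=3.13))
```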

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
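
A toy producer-consumer sketch of the stream-processing idea, overlapping compute with transfer; the chunk source and per-batch work are stand-ins and do not represent the elastream API.

```python
import queue
import threading

# Toy sketch of processing NGS reads while they are still being transferred:
# a downloader thread feeds batches into a bounded queue and the main thread
# processes them concurrently, so computation overlaps data transfer.
chunks = queue.Queue(maxsize=8)

def download():
    for i in range(100):                 # stand-in for the network transfer
        chunks.put(f"read_batch_{i}")
    chunks.put(None)                     # end-of-stream sentinel

def process():
    count = 0
    while (batch := chunks.get()) is not None:
        count += 1                       # stand-in for per-batch analysis
    print(f"processed {count} batches")

t = threading.Thread(target=download)
t.start()
process()
t.join()
```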

Relevance:

100.00%

Publisher:

Abstract:

Chemical process accidents still occur, costing billions of dollars and, worse, many human lives. This means that traditional hazard analysis techniques are no longer sufficient, mainly owing to the increasing complexity and size of chemical plants. In recent years, a new hazard analysis technique has been developed that shifts the focus from reliability to systems theory, showing promising results in other industries such as aeronautics and nuclear power. In this paper, we present an approach for applying STAMP and STPA analysis, developed by Leveson in 2011, to the process industry.

Relevance:

100.00%

Publisher:

Abstract:

Are the perceptions of professional economists on transaction costs consistent with make-or-buy decisions made within firms? The answer may have important implications for transaction cost research. Data on firms' outsourcing during the new product development process are taken from a large-scale survey of UK, German, and Irish manufacturing plants, and we test the consistency of these outsourcing decisions with the predictions derived from the transaction cost perceptions of a panel of economists. Little consistency is evident between actual outsourcing patterns and the predictions of the (Williamsonian) transaction cost model derived from the panel of economists. There is, however, evidence of a systematic pattern to the differences, suggesting that a competence or resource-based approach may be relevant to understanding firm outsourcing, and that firms are adopting a strategic approach to managing their external relationships. © Cambridge Political Economy Society 2005; all rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Energy prices are related to more than half of the total life cycle cost of asphalt pavements. Furthermore, fluctuations in energy prices have been much larger than general inflation and interest rates. This makes energy price inflation an important variable that should be addressed when performing life cycle cost (LCC) studies of asphalt pavements. The present value of future costs is highly sensitive to the selected discount rate; the choice of discount rate is therefore the most critical element in LCC analysis over the lifetime of a project. The objective of this paper is to present a discount rate for asphalt pavement projects as a function of the interest rate, general inflation, and energy price inflation. The discount rate is defined based on the share of energy-related costs over the lifetime of the pavement. Consequently, it can reflect the financial risks related to energy prices in asphalt pavement projects. It is suggested that a discount rate sensitivity analysis for asphalt pavements in Sweden should range between –20% and 30%.
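
One plausible formalization of such a discount rate, stated as an assumption rather than the paper's exact formula: blend the Fisher real rates under general inflation f_g and under energy price inflation f_e, weighted by the energy share w of lifecycle cost, given nominal interest rate i.

```latex
% Assumed energy-weighted real discount rate (a sketch, not the paper's
% published expression): i = nominal interest rate, f_g = general inflation,
% f_e = energy price inflation, w = energy share of lifecycle cost.
r \;=\; w\,\frac{1+i}{1+f_e} \;+\; (1-w)\,\frac{1+i}{1+f_g} \;-\; 1
```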

Relevance:

100.00%

Publisher:

Abstract:

This thesis attempts to find the least-cost strategy for reducing CO2 emissions by replacing coal with other energy sources for electricity generation, in the context of the EPA's proposed regulation on CO2 emissions from existing coal-fired power plants. An ARIMA model is built to forecast coal consumption for electricity generation, and the associated CO2 emissions, in Michigan from 2016 to 2020. CO2 emission reduction costs are calculated under three emission reduction scenarios: reduction to 17%, 30%, and 50% below the 2005 emission level. The impacts of the Production Tax Credit (PTC) and the intermittency of renewable energy are also discussed. The results indicate that in most cases natural gas will be the best alternative to coal for electricity generation to meet CO2 reduction goals; if the PTC for wind power continues after 2015, a combination of natural gas and wind could be the best strategy under the least-cost criterion.
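
A minimal ARIMA forecasting sketch in this spirit, using statsmodels on synthetic data; the (1, 1, 1) order and the emission factor are assumptions, not the specification identified in the thesis.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Fit an ARIMA model to a synthetic coal-consumption series and forecast
# five periods ahead; all data and parameters are illustrative.
rng = np.random.default_rng(0)
coal = 60.0 + np.cumsum(rng.normal(-0.5, 1.0, size=40))  # fake annual data

fit = ARIMA(coal, order=(1, 1, 1)).fit()
forecast = fit.forecast(steps=5)          # e.g., a 2016-2020 horizon
co2 = forecast * 2.1                      # hypothetical tCO2 per unit of coal
print(forecast.round(1), co2.round(1))
```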

Relevance:

100.00%

Publisher:

Abstract:

The damage Hurricane Sandy caused had far-reaching repercussions up and down the East Coast of the United States. Vast coastal flooding accompanied the storm, inundating homes, businesses, and utility and emergency facilities. Since the storm, projects to mitigate similar future floods have been scrutinized. Such projects not only need to keep out floodwaters but also be designed to withstand the effect that climate change might have on rising sea levels and increased flood risk. In this study, we develop an economic model to assess the costs and benefits of a berm (sea wall) to mitigate the effects of flooding from a large storm. We account for the lifecycle costs of the project, which include those for the upfront construction of the berm, ongoing maintenance, land acquisition, and wetland and recreation zone construction. Benefits of the project include avoided fatalities, avoided residential and commercial damages, avoided utility and municipal damages, recreational and health benefits, avoided debris removal expenses, and avoided loss of function of key transportation and commercial infrastructure located in the area. Our estimate of the beneficial effects of the berm includes ecosystem services from wetlands and health benefits to the surrounding community from a park and nature system constructed along the berm. To account for the effects of climate change and verify that the project will maintain its effectiveness over the long term, we allow the risk of flooding to increase over time. Over our 50-year time horizon, we double the risk of 100- and 500-year flood events to account for the effects of sea level rise on coastal flooding. Based on the economic analysis, the project is highly cost-beneficial over its 50-year timeframe. This analysis demonstrates that climate change adaptation investments can be cost-beneficial even though they mitigate the impacts of low-probability, high-consequence events.
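
A back-of-the-envelope version of the net-present-value calculation, with the flood risk doubling over the horizon as described above; every figure is a placeholder, not a number from the study.

```python
# Toy NPV of a berm: upfront construction, annual maintenance, and expected
# avoided flood damages under a linearly doubling 100-year-flood risk.
YEARS, RATE = 50, 0.03
CONSTRUCTION = 300e6          # upfront cost, USD (placeholder)
MAINTENANCE = 2e6             # annual cost, USD (placeholder)
AVOIDED_DAMAGE_100YR = 1.5e9  # avoided losses if the flood occurs (placeholder)
P0 = 0.01                     # today's annual chance of a 100-year flood

npv = -CONSTRUCTION
for t in range(1, YEARS + 1):
    p_t = P0 * (1 + t / YEARS)          # risk doubles linearly over 50 years
    expected_benefit = p_t * AVOIDED_DAMAGE_100YR
    npv += (expected_benefit - MAINTENANCE) / (1 + RATE) ** t
print(f"NPV over {YEARS} years: ${npv / 1e9:.2f} billion")
```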

Relevance:

100.00%

Publisher:

Abstract:

The anisotropic norm of a linear discrete-time-invariant system measures system output sensitivity to stationary Gaussian input disturbances of bounded mean anisotropy. Mean anisotropy characterizes the degree of predictability (or colouredness) and spatial non-roundness of the noise. The anisotropic norm falls between the H-2 and H-infinity norms and accommodates their loss of performance when the probability structure of input disturbances is not exactly known. This paper develops a method for numerical computation of the anisotropic norm that involves linked Riccati and Lyapunov equations and an associated equation of a special type.
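
For context, the interpolation property that places the anisotropic norm between a scaled H-2 norm and the H-infinity norm can be written as below; this is stated from the general anisotropy-based control literature, not taken from the paper itself.

```latex
% Let N_a(G) denote the a-anisotropic norm of an m-input system G. At zero
% anisotropy it reduces to a scaled H-2 norm, and it increases monotonically
% to the H-infinity norm as the anisotropy bound a grows (standard property,
% cited here from memory of the literature).
N_0(G) \;=\; \tfrac{1}{\sqrt{m}}\,\|G\|_{2},
\qquad
N_a(G) \nearrow \|G\|_{\infty} \ \text{ as } a \to \infty .
```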

Relevance:

100.00%

Publisher:

Abstract:

We describe a novel approach to exploring DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes, and species. The article starts by analyzing chromosomal data through histograms of fixed-length DNA words. After creating the DNA-related histograms, the correlation between pairs of histograms is computed, producing a global correlation matrix. These data are then used as input to several data-processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed, and the extensive results reveal that the proposed method generates significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics.
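
A compact sketch of the histogram-and-correlation pipeline just described, using toy sequences and an assumed word length of 3; the real study's inputs and parameters differ.

```python
from collections import Counter
from itertools import product

import numpy as np

# Count fixed-length DNA words per sequence, normalize to frequency
# histograms, then correlate histogram pairs into a global matrix.
WORD = 3
WORDS = ["".join(p) for p in product("ACGT", repeat=WORD)]

def histogram(seq):
    counts = Counter(seq[i:i + WORD] for i in range(len(seq) - WORD + 1))
    total = sum(counts.values())
    return np.array([counts[w] / total for w in WORDS])

# Toy stand-ins for chromosomal sequences.
seqs = {"chr_x": "ACGTACGTGGCCAATT" * 50, "chr_y": "TTGGCCAACGTACGTA" * 50}
hists = np.vstack([histogram(s) for s in seqs.values()])
corr = np.corrcoef(hists)                # global correlation matrix
print(corr.round(3))
```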

Relevance:

100.00%

Publisher:

Abstract:

This paper studies the relationships between the chromosomal DNA sequences of twenty species. We propose a methodology combining DNA-based word frequency histograms, correlation methods, and an MDS technique to visualize structural information underlying chromosomes (CRs) and species. Four statistical measures are tested (Minkowski, Cosine, Pearson product-moment, and Kendall τ rank correlations) to analyze the information content of 421 nuclear CRs from twenty species. The proposed methodology is built on mathematical tools and allows the analysis and visualization of very large amounts of stream data, like DNA sequences, with almost no assumptions other than the predefined DNA “word length.” It is able to produce comprehensible three-dimensional visualizations of CR clustering and related spatial and structural patterns. The results of the four test correlation scenarios show that the high-level information clusterings produced by the MDS tool are qualitatively similar, with small variations due to the characteristics of each correlation method, and that the clusterings are a consequence of the input data and not artifacts of the methods.
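
The MDS step can be sketched as follows, with a random symmetric toy matrix standing in for the histogram dissimilarities (for example, one minus a correlation); the real inputs come from the word-frequency pipeline above.

```python
import numpy as np
from sklearn.manifold import MDS

# Embed chromosomes in 3-D from a precomputed dissimilarity matrix; the
# matrix here is random toy data, not derived from real DNA histograms.
rng = np.random.default_rng(42)
n = 12                                   # toy number of chromosomes
d = rng.random((n, n))
dissim = (d + d.T) / 2                   # symmetrize
np.fill_diagonal(dissim, 0.0)            # zero self-dissimilarity

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)       # 3-D points for visualization
print(coords.shape)                      # (12, 3)
```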