967 results for Historical cost data


Relevance: 90.00%

Abstract:

ABSTRACT: Long-term observational studies of landscape dynamics in Sahelian countries generally face a deficient supply of quantitative spatial information. The lack of local- to regional-scale data found in Mali led to a methodological study comprising the development of procedures for the multi-temporal acquisition and analysis of landscape-change data. For the West African region, historical remote sensing material exists with large areal coverage, in the form of high-resolution aerial photographs from the 1950s onwards and the first Earth-observing satellite data from Landsat MSS from the 1970s onwards. For digital reproducibility, data comparability, and object detectability, multi-temporal long-term analyses require an a priori examination of data condition and quality. Two methodological approaches, developed without available or reconstructable ground-control data, show not only the possibilities but also the limits of unambiguous radiometric and morphometric image-information extraction. Within the flood-favoured area of the inner Niger Delta in central Mali, two sub-studies on the extraction of quantitative Sahel vegetation data address the radiometric and atmospheric problems:
1. Pre-processing homogenisation of multi-temporal MSS archive data, with simulations of the impact of atmospheric and sensor-related effects
2. Development of a method for the semi-automatic detection and quantification of the dynamics of woody-vegetation cover density on panchromatic archive aerial photographs
The first sub-study finds historical Landsat MSS satellite image data to be unusable for multi-temporal analyses of landscape dynamics. The second sub-study presents the methodological approach developed specifically for the automatic pattern recognition and quantification of Sahelian woody-vegetation objects by means of morpho-mathematical filter operations. Finally, the demand for cost- and time-efficient methodological standards is discussed with regard to their representativeness for the long-term monitoring of the resource inventory of semi-arid regions and their operational transferability to data from modern remote sensing sensors.

Relevance: 90.00%

Abstract:

BACKGROUND: There is little evidence on differences across health care systems in the choice and outcome of treating chronic low back pain (CLBP), for which spinal surgery and conservative treatment are the main options. At least six randomised controlled trials comparing these two options have been performed; they show conflicting results without clear-cut evidence for the superior effectiveness of any of the evaluated interventions, and they could not address whether the treatment effect varied across patient subgroups. Cost-utility analyses display inconsistent results when comparing surgical and conservative treatment of CLBP. Due to its higher feasibility, we chose to conduct a prospective observational cohort study. METHODS: This study aims to examine whether:
1. Differences across health care systems result in different treatment outcomes of surgical and conservative treatment of CLBP
2. Patient characteristics (work-related, psychological factors, etc.) and co-interventions (physiotherapy, cognitive behavioural therapy, return-to-work programs, etc.) modify the outcome of treatment for CLBP
3. Cost-utility in terms of quality-adjusted life years differs between surgical and conservative treatment of CLBP
This study will recruit 1000 patients from orthopaedic spine units, rehabilitation centres, and pain clinics in Switzerland and New Zealand. Effectiveness will be measured by the Oswestry Disability Index (ODI) at baseline and after six months. The change in ODI will be the primary endpoint of this study. Multiple linear regression models will be used, with the change in ODI from baseline to six months as the dependent variable and the type of health care system, type of treatment, patient characteristics, and co-interventions as independent variables. Interactions between type of treatment and the different co-interventions and patient characteristics will be incorporated. Cost-utility will be measured with an index based on EQ-5D in combination with cost data. CONCLUSION: This study will provide evidence on whether differences across health care systems affect the outcome of treatment of CLBP. It will classify patients with CLBP into different clinical subgroups and help to identify specific target groups that might benefit from specific surgical or conservative interventions. Furthermore, cost-utility differences will be identified for different groups of patients with CLBP. The main results of this study should be replicated in future studies on CLBP.
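
As a rough sketch of the stated analysis plan, the following fits a linear model of the six-month ODI change with treatment-by-covariate interactions. The column names and the synthetic data are illustrative assumptions, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the cohort data; every column name is hypothetical.
rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "odi_change": rng.normal(-10, 8, n),                       # six-month change in ODI
    "treatment": rng.choice(["surgical", "conservative"], n),  # main exposure
    "physio": rng.choice([0, 1], n),                           # a co-intervention
    "work_status": rng.choice(["working", "not_working"], n),  # a patient characteristic
    "health_system": rng.choice(["CH", "NZ"], n),              # type of health care system
})

# Change in ODI as the dependent variable; treatment interacts with a
# co-intervention and a patient characteristic, as the protocol describes.
model = smf.ols(
    "odi_change ~ treatment * physio + treatment * work_status + health_system",
    data=df,
).fit()
print(model.summary())
```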

Relevance: 90.00%

Abstract:

Type 2 diabetes has grown to epidemic proportions in the U.S., and its prevalence has been steadily increasing in Texas. Physical activity levels in the population have remained low despite physical activity being one of the primary preventive strategies for type 2 diabetes. The objectives of this study were to estimate the direct medical costs of type 2 diabetes attributable to not meeting physical activity Guidelines and to physical inactivity in the U.S. and Texas in 2007. This was a cross-sectional study that used physical activity prevalence data from the 2007 Behavioral Risk Factor Surveillance System (BRFSS) to estimate the population attributable risk percentage (PAR%) for type 2 diabetes. These data were combined with the prevalence and cost data of type 2 diabetes to estimate the cost of type 2 diabetes attributable to not meeting Guidelines and to inactivity in the U.S. and Texas in 2007. The cost of type 2 diabetes in the U.S. in 2007 attributable to not meeting physical activity Guidelines was estimated to be $13.29 billion, and that attributable to physical inactivity (no leisure-time physical activity) was estimated to be $3.32 billion. Depending on various assumptions, these estimates ranged from $7.61 billion to $41.48 billion for not meeting Guidelines, and from $1.90 billion to $13.20 billion for physical inactivity, in the U.S. in 2007. The cost of type 2 diabetes in Texas in 2007 attributable to not meeting physical activity Guidelines was estimated to be $1.15 billion, and that attributable to physical inactivity (no leisure-time physical activity) was estimated to be $325 million. Depending on various assumptions, these estimates ranged from $800 million to $3.47 billion for not meeting Guidelines, and from $186 million to $1.28 billion for physical inactivity, in Texas in 2007. These results illustrate how much money could be saved annually in terms of type 2 diabetes costs alone in the U.S. and Texas if the entire adult population were active enough to meet physical activity Guidelines. Physical activity promotion, particularly at the environmental and policy level, should be a priority in the population.
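
The attributable-cost calculation described above rests on the standard population attributable risk formula; the sketch below reproduces the arithmetic with invented prevalence and relative-risk inputs, since the study's actual BRFSS-derived values are not given in the abstract.

```python
# Levin's population attributable risk: PAR = p(RR - 1) / (1 + p(RR - 1)),
# where p is the exposure prevalence and RR its relative risk.
def attributable_cost(prevalence, relative_risk, total_cost):
    par = prevalence * (relative_risk - 1) / (1 + prevalence * (relative_risk - 1))
    return par * total_cost

# Illustrative inputs only -- not the study's BRFSS estimates.
p_not_meeting_guidelines = 0.51  # assumed exposure prevalence
rr_type2_diabetes = 1.3          # assumed relative risk
us_t2dm_cost = 116e9             # assumed total annual direct cost, USD

cost = attributable_cost(p_not_meeting_guidelines, rr_type2_diabetes, us_t2dm_cost)
print(f"${cost / 1e9:.2f} billion attributable")
```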

Relevance: 90.00%

Abstract:

Background and Purpose. There is a growing consensus among health care researchers that quality of life (QoL) is an important outcome and, within the field of family caregiving, that cost-effectiveness research is needed to determine which programs have the greatest benefit for family members. This study uses a multidimensional approach to measure the cost-effectiveness of a multicomponent intervention designed to improve the quality of life of spousal caregivers of stroke survivors. Methods. The CAReS study (Committed to Assisting with Recovery after Stroke) was a 5-year prospective, longitudinal intervention study of 159 stroke survivors and their spousal caregivers, enrolled upon discharge of the stroke survivor from inpatient rehabilitation to their home. CAReS cost data were analyzed to determine the incremental cost of the intervention per caregiver. The mean values of the quality-of-life predictor variables of the intervention group of caregivers were compared to the mean values of usual-care groups found in the literature. Significant differences were then divided into the cost of the intervention per caregiver to calculate the incremental cost-effectiveness ratio for each predictor variable. Results. The cost of the intervention per caregiver was approximately $2,500. Statistically significant differences were found between the mean scores for the Perceived Stress and Satisfaction with Life scales, but not for the Self-Reported Health Status, Mutuality, and Preparedness scales. Conclusions. This study provides a prototype cost-effectiveness analysis on which researchers can build. Using a multidimensional approach to measure QoL, as in this analysis, incorporates both the subjective and objective components of QoL. Some of the QoL predictor variable scores were significantly different between the intervention and comparison groups, indicating a significant impact of the intervention, and the estimated cost of that impact was examined. In future studies, a scale that takes into account both the dimensions of QoL and the weighting each person places on them should be used to provide a single QoL score per participant. With participant-level cost and outcome data, the uncertainty around each cost-effectiveness ratio can be quantified using the bias-corrected percentile bootstrapping method and plotted to derive cost-effectiveness acceptability curves.
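
The incremental cost-effectiveness ratio used here is the per-caregiver intervention cost divided by the between-group difference in each QoL predictor score. A minimal sketch, with invented score differences standing in for the CAReS results:

```python
# ICER per predictor variable: intervention cost per caregiver divided by the
# difference in mean scale scores between intervention and comparison groups.
intervention_cost_per_caregiver = 2500.0  # approximate figure from the study

# Illustrative mean-score differences; the actual CAReS values are not shown here.
score_differences = {"Perceived Stress": 2.1, "Satisfaction with Life": 3.4}

for scale, diff in score_differences.items():
    print(f"{scale}: ${intervention_cost_per_caregiver / diff:.0f} per unit improvement")
```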

Relevance: 90.00%

Abstract:

The development of the ecosystem approach and of models for the management of ocean marine resources requires easy access to standard, validated datasets of historical catch data for the main exploited species, together with the model estimates obtained from these data, allowing model inter-comparison and evaluation of model skill. North Atlantic albacore tuna is exploited all year round by longline fisheries and, in summer and autumn, by surface fisheries; the fishery statistics are compiled by the International Commission for the Conservation of Atlantic Tunas (ICCAT). Catch and effort with geographical coordinates, at monthly resolution on 1° or 5° squares, were extracted for this species with a careful definition of fisheries and data screening. Length frequencies of the catch were also extracted according to the definition of fisheries for the period 1956-2010. Using these data, an application of the spatial ecosystem and population dynamics model (SEAPODYM) was developed for the North Atlantic albacore population and fisheries, providing the first spatially explicit estimate of albacore density in the North Atlantic by life stage. These densities by life stage (larval recruits, young immature fish, adult mature fish, and total biomass) are provided in a gridded file (NetCDF) at a resolution of 2° x 2° x month.
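
Reading the published gridded densities might look like the sketch below; the file and variable names are hypothetical placeholders, since the dataset's actual naming is not specified in the abstract.

```python
import xarray as xr

# Hypothetical file and variable names standing in for the published NetCDF.
ds = xr.open_dataset("albacore_density_2x2.nc")

# Monthly 2° x 2° density of adult mature fish, averaged over time.
adult_mean = ds["adult_density"].mean(dim="time")
print(adult_mean.sel(lat=40, lon=-30, method="nearest").values)
```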

Relevance: 90.00%

Abstract:

The commercial data acquisition systems used for seismic exploration are usually expensive equipment. In this work, a low-cost data acquisition system (Geophonino) has been developed for recording seismic signals from a vertical geophone. The signal first goes through an instrumentation amplifier, the INA155, which is suitable for low-amplitude signals such as seismic noise, and then through an anti-aliasing filter based on the MAX7404 switched-capacitor filter. After that, the amplified and filtered signal is digitized and processed by an Arduino Due and recorded on an SD memory card. Geophonino is configured for continuous recording, with user-defined sampling frequency, amplitude gain, and recording time. The complete prototype is an open-source and open-hardware system. It has been tested by comparing the recorded signals with those obtained through different commercial data recording systems and different kinds of geophones. The results show good correlation between the tested measurements, establishing Geophonino as a low-cost alternative for seismic data recording.
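
As a rough sketch of the post-processing such a recorder implies: the Arduino Due digitises against a 3.3 V reference with a 12-bit ADC, so recovering ground velocity from raw counts needs only the amplifier gain and the geophone sensitivity. Both of those values below are assumptions, not Geophonino's actual configuration.

```python
# Convert raw 12-bit ADC counts (Arduino Due, 3.3 V reference) back to ground
# velocity, ignoring any DC offset applied before the ADC. The gain and the
# sensitivity are illustrative assumptions.
V_REF = 3.3           # ADC reference voltage, volts
ADC_LEVELS = 4096     # 12-bit converter
AMP_GAIN = 100.0      # assumed instrumentation-amplifier gain
GEOPHONE_SENS = 28.8  # assumed geophone sensitivity, V/(m/s)

def counts_to_velocity(counts: int) -> float:
    volts = counts * V_REF / ADC_LEVELS
    return volts / (AMP_GAIN * GEOPHONE_SENS)  # m/s

print(counts_to_velocity(2048))  # mid-scale sample
```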

Relevance: 90.00%

Abstract:

The Baltic Sea is a seasonally ice-covered, marginal sea in central northern Europe. It is an essential waterway connecting highly industrialised countries. Because ship traffic is intermittently hindered by sea ice, the local weather services have been monitoring sea ice conditions for decades. In the present study we revisit a historical monitoring data set, covering the winters 1960/1961 to 1978/1979. This data set, dubbed Data Bank for Baltic Sea Ice and Sea Surface Temperatures (BASIS) ice, is based on hand-drawn maps that were collected and then digitised in 1981 in a joint project of the Finnish Institute of Marine Research (today the Finnish Meteorological Institute (FMI)) and the Swedish Meteorological and Hydrological Institute (SMHI). BASIS ice was designed for storage on punch cards and all ice information is encoded by five digits. This makes the data hard to access. Here we present a post-processed product based on the original five-digit code. Specifically, we convert to standard ice quantities (including information on ice types), which we distribute in the current and free Network Common Data Format (NetCDF). Our post-processed data set will help to assess numerical ice models and provide easy-to-access unique historical reference material for sea ice in the Baltic Sea. In addition we provide statistics showcasing the data quality. The website http://www.baltic-ocean.org hosts the post-processed data and the conversion code.
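
To make the decoding step concrete, here is a schematic decoder for a five-digit record; the field layout and lookup table are invented for illustration and are not the real BASIS encoding, which is documented with the published dataset.

```python
# Schematic decoder for a five-digit ice code. The field layout and the lookup
# table are invented for illustration; they are NOT the actual BASIS code.
ICE_TYPES = {0: "no ice", 1: "new ice", 2: "level ice", 3: "ridged ice"}

def decode(code: str) -> dict:
    assert len(code) == 5 and code.isdigit()
    return {
        "concentration_tenths": int(code[0]),                    # assumed: first digit
        "ice_type": ICE_TYPES.get(int(code[1]), "unknown"),      # assumed: second digit
        "thickness_cm": int(code[2:5]),                          # assumed: last three digits
    }

print(decode("82045"))
```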

Relevance: 90.00%

Abstract:

Background: The Lescol Intervention Prevention Study (LIPS) was a multinational randomized controlled trial that showed a 47% reduction in the relative risk of cardiac death and a 22% reduction in major adverse cardiac events (MACEs) from the routine use of fluvastatin, compared with controls, in patients undergoing percutaneous coronary intervention (PCI, defined as angioplasty with or without stents). In this study, MACEs included cardiac death, nonfatal myocardial infarction, and subsequent PCI and coronary artery bypass graft. Diabetes was the greatest risk factor for MACEs. Objective: This study estimated the cost-effectiveness of fluvastatin when used for secondary prevention of MACEs after PCI in people with diabetes. Methods: A post hoc subgroup analysis of patients with diabetes from the LIPS was used to estimate the effectiveness of fluvastatin in reducing myocardial infarction, revascularization, and cardiac death. A probabilistic Markov model was developed using United Kingdom resource and cost data to estimate the additional costs and quality-adjusted life-years (QALYs) gained over 10 years from the perspective of the British National Health Service. The model contained 6 health states, and the transition probabilities were derived from the LIPS data. Crossover from fluvastatin to other lipid-lowering drugs, withdrawal from fluvastatin, and the use of lipid-lowering drugs in the control group were included. Results: In the subgroup of 202 patients with diabetes in the LIPS trial, 18 (15.0%) of 120 fluvastatin patients and 21 (25.6%) of 82 control participants were insulin dependent (P = NS). Compared with the control group, patients treated with fluvastatin can expect to gain an additional mean (SD) of 0.196 (0.139) QALY per patient over 10 years (P < 0.001) and will cost the health service an additional mean (SD) of £10 (£448) (P = NS) (mean [SD] US $16 [$689]). The additional cost per QALY gained was £51 (US $78). The key determinants of cost-effectiveness included the probabilities of repeat interventions, cardiac death, the cost of fluvastatin, and the time horizon used for the evaluation. Conclusion: Fluvastatin was an economically efficient treatment to prevent MACEs in these patients with diabetes undergoing PCI.
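
The headline ratio follows directly from the reported increments; a quick sanity check using the abstract's own figures:

```python
# Cost per QALY from the reported incremental values.
incremental_cost_gbp = 10.0  # additional mean cost per patient over 10 years
incremental_qalys = 0.196    # additional mean QALYs per patient over 10 years

icer = incremental_cost_gbp / incremental_qalys
print(f"approximately GBP {icer:.0f} per QALY gained")  # ~51, as reported
```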

Relevance: 90.00%

Abstract:

This paper presents a new method for optimising the mirror element spacing arrangement and operating temperature of linear Fresnel reflectors (LFR). The specific objective is to maximise available power output (i.e. exergy) and operational hours whilst minimising cost. The method is described in detail and compared to an existing design method prominent in the literature. Results are given in terms of the exergy per total mirror area (W/m2) and the cost per exergy (US $/W). The new method is applied principally to the optimisation of an LFR in Gujarat, India, for which cost data have been gathered. It is recommended to use a spacing arrangement such that the onset of shadowing among mirror elements occurs at a transversal angle of 45°. This results in a cost per exergy of 2.3 $/W. Compared to the existing design approach, the exergy averaged over the year is increased by 9% to 50 W/m2, and an additional 122 h of operation per year are predicted. The ideal operating temperature at the surface of the absorber tubes is found to be 300 °C. It is concluded that the new method is an improvement over existing techniques and a significant tool for future design work on LFR systems.
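
Both headline metrics are simple ratios; the sketch below recomputes them from assumed plant totals chosen to be consistent with the reported 50 W/m2 and 2.3 $/W (the paper's actual cost breakdown is not reproduced here).

```python
# Exergy per mirror area and cost per exergy for an LFR field.
# The totals are assumed values chosen to match the reported ratios.
mirror_area_m2 = 1000.0   # assumed total mirror area
avg_exergy_w = 50e3       # assumed year-averaged exergy output
capital_cost_usd = 115e3  # assumed cost

print(avg_exergy_w / mirror_area_m2, "W/m2")   # 50.0
print(capital_cost_usd / avg_exergy_w, "$/W")  # 2.3
```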

Relevance: 90.00%

Abstract:

In an industrial environment, knowing the process one is working with is crucial to ensuring its good functioning. In the present work, carried out at the Prio Biocombustíveis S.A. facilities, the methanol recovery process was characterized using process data collected during this work together with historical process data, starting with the characterization of key process streams. Based on the information retrieved from the stream characterization, the Aspen Plus® process simulation software was used to replicate the process and perform a sensitivity analysis, with the objective of assessing the relative importance of certain key process variables (reflux/feed ratio, reflux temperature, reboiler outlet temperature, and methanol, glycerol, and water feed compositions). The work proceeded with the application of a set of statistical tools, starting with Principal Components Analysis (PCA), from which the interactions between process variables and their contributions to the process variability were studied. Next, Design of Experiments (DoE) was to be used to acquire experimental data and, with it, create a model for the water content of the distillate; however, the necessary conditions to perform this method were not met, and so it was abandoned. The Multiple Linear Regression (MLR) method was then used with the available data, creating several empirical models for the water content of the distillate, the best of which has an R2 of 92.93% and an AARD of 19.44%. Although the AARD is still relatively high, the model is adequate for fast estimates of the distillate's quality. As for fouling, its presence was noticed many times during this work. Since fouling could not be measured directly, the reboiler inlet steam pressure was used as an indicator of fouling growth and of how that growth varies with the amount of Used Cooking Oil incorporated in the overall process. Comparing the steam cost associated with the reboiler's operation when fouling is low (steam pressure of 1.5 bar) with that when fouling is high (steam pressure of 3 bar), the cost increases by about 58% as fouling builds up.
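
A minimal sketch of the empirical-modelling step: fit a multiple linear regression for the distillate water content and report R2 and AARD. The data are synthetic and the variables are placeholders, not the plant's actual measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic process data: three predictors (placeholders for, e.g., reflux/feed
# ratio, reflux temperature, reboiler outlet temperature) and the water content
# of the distillate as the response.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 5.0 + 0.8 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.1, size=200)

model = LinearRegression().fit(X, y)
pred = model.predict(X)

r2 = model.score(X, y)                        # coefficient of determination
aard = np.mean(np.abs((y - pred) / y)) * 100  # average absolute relative deviation, %
print(f"R2 = {r2:.4f}, AARD = {aard:.2f}%")
```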

Relevance: 90.00%

Abstract:

This thesis investigates how web search evaluation can be improved using historical interaction data. Modern search engines combine offline and online evaluation approaches in a sequence of steps that a tested change needs to pass through to be accepted as an improvement and subsequently deployed. We refer to such a sequence of steps as an evaluation pipeline. In this thesis, we consider the evaluation pipeline to contain three sequential steps: an offline evaluation step, an online evaluation scheduling step, and an online evaluation step. We show that historical user interaction data can aid in improving the accuracy or efficiency of each of these steps and, as a result, the overall efficiency of the entire evaluation pipeline.

Firstly, we investigate how user interaction data can be used to build accurate offline evaluation methods for query auto-completion mechanisms. We propose a family of offline evaluation metrics for query auto-completion that represent the effort the user has to spend in order to submit their query. The parameters of our proposed metrics are trained against a set of user interactions recorded in the search engine's query logs. From our experimental study, we observe that our proposed metrics are significantly more correlated with an online user satisfaction indicator than the metrics proposed in the existing literature. Hence, fewer changes that pass the offline evaluation step will subsequently be rejected at the online evaluation step, which increases the efficiency of the entire evaluation pipeline.

Secondly, we state the problem of the optimised scheduling of online experiments. We tackle this problem with a greedy scheduler that prioritises the evaluation queue according to the predicted likelihood of success of each experiment. This predictor is trained on a set of online experiments and uses a diverse set of features to represent an online experiment. Our study demonstrates that a higher number of successful experiments per unit of time can be achieved by deploying such a scheduler at the second step of the evaluation pipeline; consequently, the efficiency of the evaluation pipeline can be increased.

Next, to improve the efficiency of the online evaluation step, we propose the Generalised Team Draft interleaving framework. Generalised Team Draft treats both the interleaving policy (how often a particular combination of results is shown) and the click scoring (how important each click is) as parameters in a data-driven optimisation of the interleaving sensitivity. Further, Generalised Team Draft is applicable beyond domains with a list-based representation of results, i.e. in domains with a grid-based representation, such as image search. Our study, using datasets of interleaving experiments performed in both the document and image search domains, demonstrates that Generalised Team Draft achieves the highest sensitivity. A higher sensitivity indicates that the interleaving experiments can be deployed for a shorter period of time or use a smaller sample of users. Importantly, Generalised Team Draft optimises the interleaving parameters w.r.t. historical interaction data recorded in the interleaving experiments.

Finally, we propose to apply sequential testing methods to reduce the mean deployment time of the interleaving experiments, and we adapt two sequential tests for interleaving experimentation.
We demonstrate that one can achieve a significant decrease in experiment duration by using such sequential testing methods. The highest efficiency is achieved by the sequential tests that adjust their stopping thresholds using historical interaction data recorded in diagnostic experiments. Our further experimental study demonstrates that cumulative gains in the online experimentation efficiency can be achieved by combining the interleaving sensitivity optimisation approaches, including Generalised Team Draft, and the sequential testing approaches. Overall, the central contributions of this thesis are the proposed approaches to improve the accuracy or efficiency of the steps of the evaluation pipeline: the offline evaluation frameworks for the query auto-completion, an approach for the optimised scheduling of online experiments, a general framework for the efficient online interleaving evaluation, and a sequential testing approach for the online search evaluation. The experiments in this thesis are based on massive real-life datasets obtained from Yandex, a leading commercial search engine. These experiments demonstrate the potential of the proposed approaches to improve the efficiency of the evaluation pipeline.
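
For context, the framework above generalises the classic Team Draft scheme. A minimal sketch of plain Team Draft (not the thesis's generalised, data-optimised variant) is shown below.

```python
import random

def team_draft_interleave(ranking_a, ranking_b, length):
    """Plain Team Draft: teams alternate picks in a random order per round."""
    interleaved, team_a, team_b = [], set(), set()
    while len(interleaved) < length:
        picks = [(ranking_a, team_a), (ranking_b, team_b)]
        random.shuffle(picks)  # coin flip for who picks first this round
        progressed = False
        for ranking, team in picks:
            if len(interleaved) >= length:
                break
            doc = next((d for d in ranking if d not in interleaved), None)
            if doc is not None:
                interleaved.append(doc)
                team.add(doc)
                progressed = True
        if not progressed:  # both rankings exhausted
            break
    return interleaved, team_a, team_b

# Clicks on team_a documents credit ranker A, and vice versa.
print(team_draft_interleave(["d1", "d2", "d3"], ["d3", "d1", "d4"], 4))
```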

Relevance: 90.00%

Abstract:

OBJECTIVES: To systematically review cost-of-illness studies for schizophrenia (SC), epilepsy (EP), and type 2 diabetes mellitus (T2DM), and to explore the transferability of direct medical costs across countries.

METHODS: A comprehensive literature search was performed to yield studies that estimated direct medical costs. A generalized linear model (GLM) with gamma distribution and log link was used to explore how much of the variation in costs was accounted for by the included factors. Both parametric (random-effects model) and non-parametric (bootstrapping) meta-analyses were performed to pool the converted raw cost data (expressed as a percentage of the GDP per capita of the country where each study was conducted).
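
A sketch of the variance exploration described above, using a gamma GLM with log link; the data frame and its columns are synthetic stand-ins for the review's extracted study-level factors.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic study-level data; all columns are illustrative, not the review's dataset.
rng = np.random.default_rng(7)
n = 93
df = pd.DataFrame({
    "cost_pct_gdp": rng.gamma(shape=2.0, scale=8.0, size=n),  # cost as % of GDP/capita
    "disease": rng.choice(["T2DM", "EP", "SC"], n),
    "gdp_per_capita": rng.uniform(5_000, 60_000, n),
    "year": rng.integers(1995, 2015, n),
})

# Gamma GLM with log link, as in the review's variance exploration.
model = smf.glm(
    "cost_pct_gdp ~ disease + gdp_per_capita + year",
    data=df,
    family=sm.families.Gamma(link=sm.families.links.Log()),
).fit()
print(model.summary())
```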

RESULTS: In total, 93 articles were included (40 studies for T2DM, 34 for EP, and 19 for SC). Significant variance was detected both within and across disease classes for the direct medical costs. Multivariate analysis identified GDP per capita (p<0.05) as a significant factor contributing to the large variance in the cost results. The bootstrapping meta-analysis generated more conservative estimates, with slightly wider 95% confidence intervals (CI) than the parametric meta-analysis, yielding means (95% CI) of 16.43% (11.32, 21.54) for T2DM, 36.17% (22.34, 50.00) for SC, and 10.49% (7.86, 13.41) for EP.

CONCLUSIONS: Converting the raw cost data into a percentage of the GDP per capita of each country was demonstrated to be a feasible approach for transferring direct medical costs across countries. Our approach of combining an estimated direct cost value with the size of the specific disease population in each jurisdiction could be used as a quick check on the economic burden of a particular disease in countries without such data.

Relevance: 90.00%

Abstract:

Most performance engineering approaches focus on understanding the use of runtime resources. However, such approaches do not quantify the value being provided in return for the consumption of these resources. Without such a measure it is not possible to compare the efficiency of these components (that is, whether the runtime cost is reasonable given the benefit being provided). We have created an empirical approach that measures the value being provided by a code path in terms of the visible data it generates for the rest of the application. Combining this with traditional performance cost data creates an efficiency measure for every code path in the application. We have evaluated our approach using the DaCapo benchmark suite, demonstrating that our analysis allows us to quantify the efficiency of the code in each benchmark and find real optimisation opportunities, providing improvements of up to 36% in our case studies.
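
The efficiency measure reduces to a ratio of visible value produced to runtime cost consumed; in the sketch below, the units (bytes of visible data, nanoseconds of cost) and the numbers are illustrative assumptions, not the paper's instrumentation.

```python
# Efficiency of a code path: visible value produced per unit of runtime cost.
# Both inputs would come from profiling/instrumentation; here they are stubs.
def efficiency(visible_bytes_produced: int, runtime_cost_ns: int) -> float:
    return visible_bytes_produced / runtime_cost_ns

# Illustrative per-path measurements: (visible bytes, runtime cost in ns).
paths = {"parse": (4096, 1_200_000), "render": (128, 900_000)}
for name, (value, cost) in paths.items():
    print(f"{name}: {efficiency(value, cost):.6f} bytes/ns")
```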

Relevance: 80.00%

Abstract:

Local climate is a critical element in the design of energy-efficient buildings. In this paper, ten years of historical weather data for Australia's eight capital cities were profiled and analysed to characterize the variations of climatic variables in Australia. The method of descriptive statistics was employed, and either the pattern of the cumulative distribution and/or the profile of the percentage distribution is presented for each variable. It was found that although weather variables vary with location, there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the middle part of the cumulative curves. By comparing the slopes of these distribution profiles, it may be possible to determine the relative range of variation of a particular weather variable for a given city. The implications of these distribution profiles of key weather variables for energy-efficient building design are also discussed.
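
A sketch of the profiling idea: build the empirical cumulative distribution of a weather variable and measure the slope of its near-linear middle section. The data here are synthetic, not the historical records used in the paper.

```python
import numpy as np

# Synthetic ten-year daily temperature series standing in for one city.
temps = np.random.default_rng(1).normal(loc=18, scale=6, size=3650)

sorted_t = np.sort(temps)
cum_pct = np.arange(1, len(sorted_t) + 1) / len(sorted_t) * 100  # cumulative %

# Slope of the middle (10th-90th percentile) section, in % per degree C.
lo, hi = np.searchsorted(cum_pct, [10.0, 90.0])
slope = (cum_pct[hi] - cum_pct[lo]) / (sorted_t[hi] - sorted_t[lo])
print(f"middle-section slope: {slope:.1f} % per deg C")
```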

Relevance: 80.00%

Abstract:

There has been a worldwide trend to increase axle loads and train speeds. This means that railway track degradation will be accelerated, and track maintenance costs will increase significantly. There is a need to investigate the consequences of increasing traffic load. The aim of the research is to develop a model for the analysis of physical degradation of railway tracks in response to changes in traffic parameters, especially increased axle loads and train speeds. This research has developed an integrated track degradation model (ITDM) by integrating several models into a comprehensive framework. Mechanistic relationships for track degradation have been used wherever possible in each of the models contained in ITDM. This overcomes the deficiency of traditional statistical track models, which rely heavily on historical degradation data that is generally not available in many railway systems; in addition, statistical models lack the flexibility to incorporate future changes in traffic patterns or maintenance practices. The research starts by reviewing railway track studies both in Australia and overseas to develop a comprehensive understanding of track performance under various traffic conditions. Existing railway models are then examined for their suitability for track degradation analysis in Australian conditions. The ITDM model is subsequently developed by modifying suitable existing models and developing new models where necessary. The ITDM model contains four interrelated submodels for rails, sleepers, ballast and subgrade, and track modulus. The rail submodel is for rail wear analysis and is developed from a theoretical concept. The sleeper submodel is for timber sleeper damage prediction; it is developed by modifying and extending an existing model developed elsewhere, and it also incorporates an analysis of the likelihood of concrete sleeper cracking. The ballast and subgrade submodel evolved from a concept developed in the USA, with substantial modifications and improvements. The track modulus submodel is developed from a conceptual method, with corrections for more global track conditions. The integration of these submodels into one comprehensive package has enabled the interaction between individual track components to be taken into account. This is done by calculating the wheel load distribution over time and updating track conditions periodically in the process of track degradation simulation. A Windows-based computer program associated with ITDM has also been developed. The program enables the user to carry out analysis of the degradation of individual track components and to investigate the interrelationships between these track components and their deterioration. The successful implementation of this research has provided essential information for the prediction of increased maintenance as a consequence of railway track degradation. The model, having been presented at various conferences and seminars, has attracted wide interest. It is anticipated that the model will be put into practical use among Australian railways, enabling track maintenance planning to be optimized and potentially saving Australian railway systems millions of dollars in operating costs.