830 results for Cost Of Illness
Abstract:
Predation risk can strongly constrain how individuals use time and space. Grouping is known to reduce an individual's time investment in costly antipredator behaviours. Whether grouping might similarly provide a spatial release from antipredator behaviour and allow individuals to use risky habitat more and, thus, improve their access to resources is poorly known. We used mosquito larvae, Aedes aegypti, to test the hypothesis that grouping facilitates the use of high-risk habitat. We provided two habitats, one darker, low-risk and one lighter, high-risk, and measured the relative time spent in the latter by solitary larvae versus larvae in small groups. We tested larvae reared under different resource levels, and thus presumed to vary in body condition, because condition is known to influence risk taking. We also varied the degree of contrast in habitat structure. We predicted that individuals in groups should use high-risk habitat more than solitary individuals allowing for influences of body condition and contrast in habitat structure. Grouping strongly influenced the time spent in the high-risk habitat, but, contrary to our expectation, individuals in groups spent less time in the high-risk habitat than solitary individuals. Furthermore, solitary individuals considerably increased the proportion of time spent in the high-risk habitat over time, whereas individuals in groups did not. Both solitary individuals and those in groups showed a small increase over time in their use of riskier locations within each habitat. The differences between solitary individuals and those in groups held across all resource and contrast conditions. Grouping may, thus, carry a poorly understood cost of constraining habitat use. This cost may arise because movement traits important for maintaining group cohesion (a result of strong selection on grouping) can act to exaggerate an individual preference for low-risk habitat. 
Further research is needed to examine the interplay between grouping, individual movement and habitat use traits in environments heterogeneous in risk and resources.
Abstract:
Seventy percent of the world's catch of fish and fishery products is consumed as food. Fish and shellfish products represent 15.6 percent of animal protein supply and 5.6 percent of total protein supply on a worldwide basis. Developing countries account for almost 50 percent of global fish exports. Seafood-borne disease or illness outbreaks affect consumers both physically and financially, and create regulatory problems for both importing and exporting countries. Seafood safety as a commodity cannot be purchased in the marketplace, so government intervenes to regulate the safety and quality of seafood. Theoretical issues and data limitations create problems in estimating what consumers will pay for seafood safety and quality. The costs and benefits of seafood safety must be considered at all levels, including the fishers, fish farmers, input suppliers to fishing, processing and trade, seafood processors, seafood distributors, consumers and government. Hazard Analysis Critical Control Point (HACCP) programmes are being implemented on a worldwide basis for seafood. Studies have been completed to estimate the cost of HACCP in various shrimp, fish and shellfish plants in the United States, and are underway for some seafood plants in the United Kingdom, Canada and Africa. Major developments within the last two decades have created a set of complex trading situations for seafood. Current events indicate that seafood safety and quality can be used as non-tariff barriers to free trade. Research priorities necessary to estimate the economic value and impacts of achieving safer seafood are outlined at the consumer, seafood production and processing, trade and government levels. An extensive list of references on the economics of seafood safety and quality is presented.
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to an estimated 650,000 to 900,000 Americans now infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where treatment of HIV/AIDS is headed.
Chapter Two describes the datasets that were used for the analyses. The primary database, which I collected from an outpatient HIV clinic, includes data from 1984 until the present. The second database was the public dataset from the Multicenter AIDS Cohort Study (MACS). The data from the MACS cover the period between 1984 and October 1992. Comparisons are made between both datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
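The non-Gaussian shape referred to above is typically a pronounced right skew. A quick way to illustrate such a check is sample skewness; the CD4 counts below are synthetic and purely illustrative, not data from this study.

```python
# Sample skewness g1 = m3 / m2^1.5 as a simple indicator of the
# right-skew typical of CD4 T cell count distributions.
# The counts below are invented for illustration only.

def sample_skewness(xs):
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5

synthetic_cd4 = [180, 220, 250, 260, 280, 300, 310, 350, 420, 600, 900]
print(sample_skewness(synthetic_cd4))  # positive value => right-skewed
```

A positive result simply flags asymmetry; a formal analysis would use a goodness-of-fit test against the normal distribution.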
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era, characterized by a new class of drugs and new technology, changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attacked a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
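The survival analysis described above can be sketched with a minimal Kaplan-Meier estimator; the follow-up times below are invented, and the abstract does not state which estimator the study used.

```python
def kaplan_meier(observations):
    """Minimal Kaplan-Meier estimator.
    observations: (time, event) pairs; event=True means the endpoint
    (here, an AIDS-defining illness) occurred, False means censored.
    Returns (time, estimated survival probability) at each event time."""
    # Sort by time; at tied times, process events before censorings.
    ordered = sorted(observations, key=lambda o: (o[0], not o[1]))
    at_risk = len(ordered)
    survival = 1.0
    curve = []
    for time, event in ordered:
        if event:
            survival *= 1 - 1 / at_risk  # multiply by (n_i - d_i) / n_i
            curve.append((time, survival))
        at_risk -= 1
    return curve

# Invented follow-up data: months to endpoint (True) or censoring (False).
print(kaplan_meier([(1, True), (2, False), (3, True), (4, True)]))
# -> [(1, 0.75), (3, 0.375), (4, 0.0)]
```

Censored patients leave the risk set without reducing the survival estimate, which is what distinguishes this from a naive failure fraction.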
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. This section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. The direct lifetime cost of treating each HIV-infected patient with HAART is estimated to be between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable to the incremental cost per year of life saved from coronary artery bypass surgery.
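The incremental cost-per-life-year figure is a simple ratio; the sketch below shows its form with hypothetical inputs, since the abstract does not report the comparator cost or the years-of-life-gained assumption behind the $101,000 figure.

```python
def incremental_cost_per_life_year(treatment_cost, comparator_cost, years_gained):
    # Incremental cost-effectiveness ratio (ICER):
    # extra cost divided by extra years of life gained.
    return (treatment_cost - comparator_cost) / years_gained

# Hypothetical illustration only (not the study's actual inputs):
print(incremental_cost_per_life_year(500_000, 0, 5.0))  # 100000.0
```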
Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
While numerous studies find that deep-saline sandstone aquifers in the United States could store many decades' worth of the nation's current annual CO2 emissions, the likely cost of this storage (i.e. the cost of storage only, not capture and transport costs) has been harder to constrain. We use publicly available data on key reservoir properties to produce geo-referenced rasters of estimated storage capacity and cost for regions within 15 deep-saline sandstone aquifers in the United States. The rasters reveal the reservoir quality of these aquifers to be so variable that the cost estimates for storage span three orders of magnitude and average >$100/tonne CO2. However, when the cost and corresponding capacity estimates in the rasters are assembled into a marginal abatement cost curve (MACC), we find that ~75% of the estimated storage capacity could be available for <$2/tonne. Furthermore, ~80% of the total estimated storage capacity in the rasters is concentrated within just two of the aquifers: the Frio Formation along the Texas Gulf Coast, and the Mt. Simon Formation in the Michigan Basin, which together make up only ~20% of the areas analyzed. While our assessment is not comprehensive, the results suggest there should be an abundance of low-cost storage for CO2 in deep-saline aquifers, but a majority of this storage is likely to be concentrated within specific regions of a smaller number of these aquifers.
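Assembling per-region cost and capacity estimates into a marginal abatement cost curve is essentially a sort-and-accumulate operation; the sketch below uses invented regional numbers, not the paper's rasters.

```python
def macc(estimates):
    """estimates: list of (capacity_tonnes, cost_per_tonne) pairs.
    Returns (cumulative_capacity, cost) points sorted by rising cost."""
    curve, cumulative = [], 0.0
    for capacity, cost in sorted(estimates, key=lambda e: e[1]):
        cumulative += capacity
        curve.append((cumulative, cost))
    return curve

def fraction_below(estimates, cost_threshold):
    """Share of total capacity available below a given cost."""
    total = sum(cap for cap, _ in estimates)
    cheap = sum(cap for cap, cost in estimates if cost < cost_threshold)
    return cheap / total

# Invented regional estimates: (tonnes of capacity, $/tonne)
regions = [(60e6, 0.5), (15e6, 1.8), (25e6, 150.0)]
print(fraction_below(regions, 2.0))  # 0.75
```

The curve makes the paper's point visible: a few low-cost regions can dominate the usable capacity even when the average cost per tonne is high.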
Abstract:
The cost of electricity, a major operating cost of municipal wastewater treatment plants, is related to influent flow rate, power price, and power load. With knowledge of inflow and price patterns, plant operators can manage processes to reduce electricity costs. Records of influent flow, power price, and load are evaluated for the Blue Plains Advanced Wastewater Treatment Plant. Diurnal and seasonal trends are analyzed. Power usage is broken down among treatment processes. A simulation model of influent pumping, a large power user, is developed. It predicts pump discharge and power usage based on wet-well level. Individual pump characteristics are tested in the plant. The model accurately simulates plant inflow and power use for two pumping stations (R² = 0.68 and 0.93 for inflow; R² = 0.94 and 0.91 for power). The wet-well stage-storage relationship is estimated from data. Time-varying wet-well level is added to the model. A synthetic example demonstrates application in managing pumps to reduce electricity cost.
Abstract:
A generic, hierarchical, and multifidelity methodology for estimating the unit cost of acquisition of outside-production machined parts is presented. The originality of the work lies in the method's inherent capability to generate multilevel and multifidelity cost relations for large volumes of parts utilizing process data, supply chain costing data, and varying degrees of part design definition information. Estimates can be generated throughout the life cycle of a part using different grades of the combined information available. Considering design development for a given part, additional design definition may be used within the developed method as it becomes available to improve the quality of the resulting estimate. Via a process of analogous classification, parts are classified into groups of increasing similarity using design-based descriptors. A parametric estimating method is then applied to each subgroup of the machined part commodity, moving in the direction of improved classification, from which a relationship linking design variables to manufacturing cycle time may be generated. A rate cost reflective of the supply chain is then applied to the cycle time estimate for a given part to arrive at an estimate of make cost, which is then totalled with the material and treatments cost components to give an overall estimate of unit acquisition cost. Both the rate charge applied and the treatments cost calculated for a given procured part are derived via ratio analysis.
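The cost roll-up described (supply-chain rate applied to cycle time gives make cost, then material and treatments are added) can be sketched as below; all numbers, and the choice of make cost as the base for the treatments ratio, are hypothetical assumptions for illustration.

```python
def unit_acquisition_cost(cycle_time_hr, rate_per_hr, material_cost, treatments_ratio):
    # Make cost: supply-chain rate cost applied to the estimated
    # manufacturing cycle time for the part.
    make = cycle_time_hr * rate_per_hr
    # Treatments cost derived by ratio analysis (assumed here to be
    # a fixed ratio of make cost; the paper does not specify the base).
    treatments = treatments_ratio * make
    return make + material_cost + treatments

# Hypothetical part: 2 h cycle time, 60/h rate, 35 material, 0.5 ratio.
print(unit_acquisition_cost(2.0, 60.0, 35.0, 0.5))  # 215.0
```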
Abstract:
OBJECTIVES: To evaluate the cost-effectiveness of an adapted U.S. model of pharmaceutical care to improve psychoactive prescribing for nursing home residents in Northern Ireland (Fleetwood NI Study).
DESIGN: Economic evaluation alongside a cluster randomized controlled trial.
SETTING: Nursing homes in NI randomized to intervention (receipt of the adapted model of care; n=11) or control (usual care continued; n=11).
PARTICIPANTS: Residents aged 65 and older who provided informed consent (N=253; 128 intervention, 125 control) and who had full resource use data at 12 months.
INTERVENTION: Trained pharmacists reviewed intervention home residents’ clinical and prescribing information for 12 months, applied an algorithm that guided them in assessing the appropriateness of psychoactive medication, and worked with prescribers (general practitioners) to make changes. The control homes received usual care in which there was no pharmacist intervention.
MEASUREMENTS: The proportion of residents prescribed one or more inappropriate psychoactive medications (according to standardized protocols), costs, and a cost-effectiveness acceptability curve. The latter two outcomes are the focus for this article.
RESULTS: The proportions of residents receiving inappropriate psychoactive medication at 12 months in the intervention and control groups were 19.5% and 50.4%, respectively. The mean cost of healthcare resources used per resident per year was $4,923 (95% confidence interval …).
Abstract:
The potential variance in feedstock costs can have significant implications for the cost of a biofuel and the financial viability of a biofuel facility. This paper employs the Grange Feed Costing Model to assess the cost of on-farm biomethane production using grass silages produced under a range of management scenarios. These costs were compared with the cost of wheat grain and sugarbeet roots for ethanol production at an industrial scale. Of the three feedstocks examined, grass silage represents the cheapest feedstock per GJ of biofuel produced. At a production cost of €27/tonne (t) feedstock (or €150/t volatile solids (VS)), the feedstock production cost of grass silage per gigajoule (GJ) of biofuel (€12.27) is lower than that of sugarbeet (€16.82) and wheat grain (€18.61). Grass biomethane is also the cheapest biofuel when grass silage is costed at the bottom-quartile purchase price of silage of €19/t (€93/t VS). However, when considering the production costs (full costing) of the three feedstocks, the total cost of grass biomethane (€32.37/GJ of biofuel; intensive 2-cut system) from a small on-farm facility ranks between that of sugarbeet ethanol (€29.62) and wheat grain ethanol (€34.31) produced in large industrial facilities. The feedstock costs for the above three biofuels represent 0.38, 0.57, and 0.54 of the total biofuel cost. The importance of feedstock cost on biofuel cost is further highlighted by the 0.43 increase in the cost of biomethane when grass silage is priced at the top quartile (€46/t or €232/t VS) compared to the bottom-quartile purchase price.
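The €/GJ figures follow from cost per tonne of feedstock divided by biofuel energy yield per tonne. The yield used below (≈2.2 GJ per tonne of silage) is back-calculated from the quoted €27/t and €12.27/GJ figures and is an assumption for illustration, not a number stated in the abstract.

```python
def feedstock_cost_per_gj(cost_per_tonne, gj_biofuel_per_tonne):
    # Feedstock cost per unit of biofuel energy produced.
    return cost_per_tonne / gj_biofuel_per_tonne

# Yield back-calculated from the quoted figures (assumption):
print(round(feedstock_cost_per_gj(27.0, 2.2), 2))  # 12.27
# Same yield at the bottom-quartile silage price:
print(round(feedstock_cost_per_gj(19.0, 2.2), 2))
```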
Abstract:
Although child maltreatment due to abuse or neglect is pervasive within our society, less is known about fabricated or induced illness by carers (FII), which is considered to be a rare form of child abuse. FII occurs when a caregiver (in 93% of cases, the mother) misrepresents the child as ill either by fabricating, or much more rarely, producing symptoms and then presenting the child for medical care, disclaiming knowledge of the cause of the problem. The growing body of literature on FII reflects the lack of clarity amongst professionals as to what constitutes FII, the difficulties involved in diagnosis, and the lack of research into psychotherapeutic intervention with perpetrators. This lack of clarity further complicates the identification, management and treatment of children suffering from FII and may result in many cases going undetected, with potentially life-threatening consequences for children. It has been suggested that there is a national under-reporting of fabricated or induced illness. In practice these cases are encountered more frequently due to the chronic nature of the presentations, the large number of professionals who may be involved and the broad spectrum including milder cases that may not all require a formal child protection response. Diagnosis of fabricated disease can be especially difficult, because the reported signs and symptoms cannot be confirmed (when they are being exaggerated or imagined) or may be inconsistent (when they are induced or fabricated). This paper highlights and discusses the controversies and complexities of this condition, the risks to the child and how it affects children; the paucity of systematic research regarding what motivates mothers to harm their children by means of illness falsification; how the condition should be managed and treated for both mother and child; and implications for policy and practice.
Abstract:
OBJECTIVE: To conduct a cost-effectiveness analysis comparing two different tooth replacement strategies for partially dentate older patients, namely partial removable dental prostheses (RDP) and functionally orientated treatment based on the shortened dental arch concept (SDA).
METHODS: Ninety-two partially dentate older patients completed a randomized controlled clinical trial. Patients were randomly allocated to two treatment groups: the RDP group and the SDA group. Treatment effect was measured using impact on oral health-related quality of life (OHrQOL), and the costs involved in providing and maintaining care were recorded for all patients. Patients were followed for 12 months after treatment intervention. All treatment was provided by a single operator.
RESULTS: The total cost of achieving the minimally important clinical difference (MID) in OHrQOL for an average patient in the RDP group was €464.64. For the SDA group, the cost of achieving the MID for an average patient was €252.00. The cost-effectiveness ratio was therefore 1:1.84 in favour of SDA treatment.
CONCLUSION: With an increasingly ageing population, many patients will continue to benefit from removable prostheses to replace their missing natural teeth. From a purely economic standpoint, the results from this analysis suggest that the treatment of partially dentate older adults should be focused on functionally orientated treatment because it is simply more cost-effective.
Abstract:
BACKGROUND: As the world population ages, the requirement for cost-effective methods of treating chronic disease conditions increases. In terms of oral health, there is a rapidly increasing number of dentate elderly with a high burden of maintenance. Population surveys indicate that older individuals are keeping their teeth for longer and are a higher caries risk group. Atraumatic Restorative Treatment (ART) could be suitable for patients in nursing homes or house-bound elderly, but very little research has been done on its use in adults.
OBJECTIVES: To compare the cost-effectiveness of ART and a conventional technique (CT) for restoring carious lesions as part of a preventive and restorative programme for older adults.
METHODS: In this randomized clinical trial, 82 patients with carious lesions were randomly allocated to receive either ART or conventional restorations. Treatment costs were measured based on treatment time, materials and labour. For the ART group, the cost of care provided by a dentist was also compared to the cost of having a hygienist to provide treatment. Effectiveness was measured using percentage of restorations that survived after a year.
RESULTS: Eighty-two patients received 260 restorations, that is, 128 ART and 132 conventional restorations. 91.1% of the restorations were on one surface only. After a year, 252 restorations were assessed in 80 patients. The average cost for ART and conventional restorations was €16.86 and €28.71 respectively; the restoration survival percentages were 91.1% and 97.7%, respectively. This resulted in a cost-effectiveness ratio of 0.18 (ART) and 0.29 (CT). When the cost of a hygienist to provide ART was inserted in the analysis, the resulting ratio was 0.14.
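The reported cost-effectiveness ratios are consistent with mean cost divided by the percentage of restorations surviving at one year; the sketch below shows that form, though the study's exact rounding conventions are not stated in the abstract.

```python
def cost_effectiveness_ratio(mean_cost_eur, survival_pct):
    # Cost per percentage point of one-year restoration survival;
    # lower values indicate better cost-effectiveness.
    return mean_cost_eur / survival_pct

# Conventional technique, using the abstract's reported figures:
print(round(cost_effectiveness_ratio(28.71, 97.7), 2))  # 0.29
```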
CONCLUSIONS: Atraumatic restorative treatment was found to be a more cost-effective alternative for treating older adults after 1 year, compared to conventional restorations, especially in out-of-surgery settings and when using an alternative workforce such as hygienists. Atraumatic restorative treatment can be a useful tool to provide dental care for frail and fearful individuals who might not access dental treatment routinely.
Abstract:
OBJECTIVE: To compare the cost-effectiveness of conventional treatment using partial dentures with functionally orientated treatment to replace missing teeth for partially dentate elders using a randomised controlled clinical trial.
BACKGROUND: In many countries, including the Republic of Ireland, the only publically funded treatment option offered to partially dentate older patients is a removable partial denture. However, evidence suggests that these removable prostheses are unpopular with patients and can potentially increase the risk of further dental disease and subsequent tooth loss.
MATERIALS AND METHODS: Forty-four partially dentate patients aged 65 years and older were recruited. Patients were randomly assigned to the two treatment arms of the study. The conventional treatment group received removable partial dentures to replace all missing natural teeth. The functionally orientated group was restored to a Shortened Dental Arch (SDA) of 10 occluding contacts using resin-bonded bridgework (RBB). The costs associated with each treatment were recorded. Effectiveness was measured in terms of the impact on oral health-related quality of life (OHRQoL) using OHIP-14.
RESULTS: Both groups reported improvements in OHRQoL 1 month after completion of treatment. The conventional treatment group required 8.3 clinic visits as compared to 4.4 visits for the functionally orientated group. The mean total treatment time was 183 min 19 s for the conventional group vs. 124 min 8 s for the functionally orientated group. The average cost of treatment for the conventional group was 487.74 Euros compared to 356.20 Euros for the functional group.
CONCLUSIONS: Functionally orientated treatment was more cost-effective than conventional treatment in terms of treatment effect and opportunity costs to the patients' time.
Abstract:
Objectives: This study aimed to compare the cost effectiveness of conventional treatment using partial dentures with functionally-orientated treatment based on the shortened dental arch concept to replace missing teeth for partially dentate elders.
Methods: 44 partially dentate patients aged 65 years and older were recruited following routine dental assessment at a university dental hospital. Patients consented to and were randomly assigned to the two treatment arms. The conventional treatment group received a removable partial denture to replace all missing natural teeth. The functionally-orientated group were restored to a shortened dental arch of 10 occluding contacts using resin bonded bridgework. The costs associated with each treatment were recorded including laboratory charges, treatment time and opportunity costs. The impact on quality of life (OHRQoL) was measured using the 14-item Oral Health Impact Profile.
Results: Both groups reported improvements in OHRQoL after completion of treatment. For the conventional group, the mean OHIP-14 score decreased from 12.4 pre-operatively to 3.3 post-operatively (p<0.001). In the functionally-orientated group the OHIP-14 score decreased from 11.4 to 1.8 following treatment (p<0.001). On average the conventional treatment group required 8.3 clinic visits as compared to 4.4 visits for the functionally-orientated group. The mean total treatment time was 183 minutes 19 seconds for the conventional group versus 124 minutes 8 seconds for the functionally-orientated group. The conventional treatment group had an average of 6.33 teeth replaced at a laboratory cost of 337.31 Euros. The functionally-orientated group had an average of 2.64 teeth replaced at a laboratory cost of 244.05 Euros.
Conclusions: Restoration to a shortened dental arch using functionally-orientated treatment resulted in a similar improvement in OHRQoL with fewer clinic visits, less operative time and at a lower laboratory cost compared with conventional treatment.
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups.
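A "Markov-type" policy model of this kind propagates a cohort through health states cycle by cycle, accumulating costs and quality-adjusted life years. The three-state sketch below uses invented transition probabilities and QALY weights, not the CHD Policy Model's actual parameters.

```python
# Minimal Markov cohort sketch: three states, annual cycles.
# All probabilities and QALY weights are invented for illustration.
TRANSITIONS = {
    "well": {"well": 0.95, "chd": 0.04, "dead": 0.01},
    "chd":  {"well": 0.00, "chd": 0.85, "dead": 0.15},
    "dead": {"well": 0.00, "chd": 0.00, "dead": 1.00},
}
QALY_WEIGHT = {"well": 1.0, "chd": 0.8, "dead": 0.0}

def run_cohort(cycles):
    """Propagate a cohort starting entirely in 'well'; return the final
    state distribution and total undiscounted QALYs accumulated."""
    dist = {"well": 1.0, "chd": 0.0, "dead": 0.0}
    qalys = 0.0
    for _ in range(cycles):
        dist = {
            s: sum(dist[frm] * TRANSITIONS[frm][s] for frm in dist)
            for s in dist
        }
        qalys += sum(dist[s] * QALY_WEIGHT[s] for s in dist)
    return dist, qalys

dist, qalys = run_cohort(10)
print(sum(dist.values()))  # state probabilities still sum to ~1.0
```

A full policy model would additionally attach per-state costs, discount both streams, and compare intervention and comparator arms to form an incremental cost-effectiveness ratio.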
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.