820 results for Common Assessment Framework (CAF)
Abstract:
Vaccines with limited ability to prevent HIV infection may positively impact the HIV/AIDS pandemic by preventing secondary transmission and disease in vaccine recipients who become infected. To evaluate the impact of vaccination on secondary transmission and disease, efficacy trials assess vaccine effects on HIV viral load and other surrogate endpoints measured after infection. A standard test that compares the distribution of viral load between the infected subgroups of vaccine and placebo recipients does not assess a causal effect of vaccine, because the comparison groups are selected after randomization. To address this problem, we formulate clinically relevant causal estimands using the principal stratification framework developed by Frangakis and Rubin (2002), and propose a class of logistic selection bias models whose members identify the estimands. Given a selection model in the class, procedures are developed for testing and estimation of the causal effect of vaccination on viral load in the principal stratum of subjects who would be infected regardless of randomization assignment. We show how the procedures can be used for a sensitivity analysis that quantifies how the causal effect of vaccination varies with the presumed magnitude of selection bias.
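The sensitivity-analysis logic can be illustrated with a toy calculation: for a grid of assumed selection-bias parameters, infected placebo recipients are reweighted by a logistic-style weight in their viral load, and the difference in mean log viral load between infected vaccine and placebo recipients is recomputed at each value. This is only a minimal sketch of the idea, not the authors' estimator; the data, the weight form, and the parameter grid are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log10 viral loads among infected vaccine and placebo recipients.
vl_vaccine = rng.normal(4.2, 0.7, size=120)
vl_placebo = rng.normal(4.6, 0.7, size=150)

def weighted_mean_difference(beta):
    """Difference in mean log viral load after reweighting infected placebo
    recipients with a toy selection weight exp(beta * y); beta = 0 reproduces
    the naive (non-causal) infected-subgroup comparison."""
    w = np.exp(beta * vl_placebo)
    w /= w.sum()
    return vl_vaccine.mean() - np.sum(w * vl_placebo)

# Sensitivity analysis: sweep the assumed magnitude of selection bias.
for beta in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(f"beta = {beta:+.1f}  estimated effect = {weighted_mean_difference(beta):+.3f}")
```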
Abstract:
Introduction: The Health Technology Assessment report on the effectiveness, cost-effectiveness and appropriateness of homeopathy was compiled on behalf of the Swiss Federal Office for Public Health (BAG) within the framework of the 'Program of Evaluation of Complementary Medicine (PEK)'. Materials and Methods: Databases accessible via the Internet were searched systematically, complemented by manual searches and contacts with experts, and the retrieved studies were evaluated according to internal and external validity criteria. Results: Many high-quality investigations in pre-clinical basic research demonstrated that homeopathic high potencies induce regulative and specific changes in cells or living organisms. Twenty of 22 systematic reviews detected at least a trend in favor of homeopathy; in our estimation, 5 studies yielded results indicating clear evidence for homeopathic therapy. The evaluation of 29 studies in the domain 'Upper Respiratory Tract Infections/Allergic Reactions' showed a positive overall result in favor of homeopathy: 6 out of 7 controlled studies were at least equivalent to conventional medical interventions, and 8 out of 16 placebo-controlled studies were significant in favor of homeopathy. Swiss regulations grant a high degree of safety owing to product and training requirements for homeopathic physicians. Applied properly, classical homeopathy has few side effects, and the use of high potencies is free of toxic effects. A general health-economic statement about homeopathy cannot be made from the available data. Conclusion: Taking internal and external validity criteria into account, the effectiveness of homeopathy can be supported by clinical evidence, and professional and adequate application can be regarded as safe. Reliable statements on cost-effectiveness are not available at present. External and model validity will have to be taken more strongly into consideration in future studies.
Abstract:
OBJECTIVE: To describe the methodology and to present the baseline findings of the Attention-Deficit/Hyperactivity Disorder Observational Research in Europe (ADORE) study, the primary objective of which is to describe the relationship between the treatment regimen prescribed and the quality of life of children with ADHD in actual practice. METHODS: In this 2-year prospective observational study, data on the diagnosis, prescribed treatment and outcomes of ADHD were collected at seven time points by paediatricians and child psychiatrists on 1,573 children recruited in 10 European countries. The data presented here from the 1,478 patients included in the analyses describe the baseline condition, the initial treatment regimen prescribed and the quality of life of families with children with ADHD. RESULTS: Patients had a mean age of 9.0 years (SD 2.5) and 84% were male. Physicians' diagnoses were made using DSM-IV (43%), ICD-10 (32%) or both DSM-IV and ICD-10 (12%). The mean age at which a problem was first recognised was 5.1 years, suggesting an average delay of approximately 4 years between awareness and diagnosis of ADHD. Baseline ADHD rating scale scores (physician-rated) indicated moderate to severe ADHD. Parent-rated SDQ scores were in agreement and suggested significant levels of co-existing problems. CGI-S, CGAS and CHIP-CE scores also indicated significant impairment. Patients were offered the following treatments after the initial assessment: pharmacotherapy (25%), psychotherapy (19%), combination of pharmacotherapy and psychotherapy (25%), other therapy (10%) and no treatment (21%). CONCLUSION: The ADORE study shows that ADHD is similarly recognised across 10 European countries and that the children are significantly impaired across a wide range of domains. In this respect, they resemble children described in previous ADHD samples.
Abstract:
BACKGROUND: The forced oscillation technique (FOT) requires minimal patient cooperation and is feasible in preschool children. Few data exist on respiratory function changes measured using FOT following inhaled bronchodilators (BD) in healthy young children, limiting the clinical applications of BD testing in this age group. A study was undertaken to determine the most appropriate method of quantifying BD responses using FOT in healthy young children and those with common respiratory conditions including cystic fibrosis, neonatal chronic lung disease and asthma and/or current wheeze. METHODS: A pseudorandom FOT signal (4-48 Hz) was used to examine respiratory resistance and reactance at 6, 8 and 10 Hz; 3-5 acceptable measurements were made before and 15 min after the administration of salbutamol. The post-BD response was expressed in absolute and relative (percentage of baseline) terms. RESULTS: Significant BD responses were seen in all groups. Absolute changes in BD responses were related to baseline lung function within each group. Relative changes in BD responses were less dependent on baseline lung function and were independent of height in healthy children. Those with neonatal chronic lung disease showed a strong baseline dependence in their responses. The BD response in children with cystic fibrosis, asthma or wheeze (based on both group mean data and number of responders) was not greater than in healthy children. CONCLUSIONS: The BD response assessed by the FOT in preschool children should be expressed as a relative change to account for the effect of baseline lung function. The limits for a positive BD response of -40% and 65% for respiratory resistance and reactance, respectively, are recommended.
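As a rough illustration of the recommended reporting convention, the relative change can be computed as a percentage of baseline and compared against the cited cut-offs (a fall of more than 40% in resistance or a rise of more than 65% in reactance). The helper below is a hypothetical sketch, not a validated clinical tool; dividing by the baseline magnitude so that negative reactance values keep an interpretable sign is an assumption of this sketch.

```python
def relative_change(baseline, post):
    """Change after salbutamol as a percentage of baseline; the baseline
    magnitude is used in the denominator (an assumption of this sketch)."""
    return 100.0 * (post - baseline) / abs(baseline)

def positive_bd_response(rrs_base, rrs_post, xrs_base, xrs_post):
    """Apply the cited limits: a fall of more than 40% in resistance or a
    rise of more than 65% in reactance (illustrative only)."""
    return (relative_change(rrs_base, rrs_post) <= -40.0
            or relative_change(xrs_base, xrs_post) >= 65.0)

# Hypothetical example: Rrs falls from 10.0 to 5.5 hPa.s/L (-45%),
# Xrs improves from -4.0 to -1.2 hPa.s/L (+70%).
print(positive_bd_response(10.0, 5.5, -4.0, -1.2))  # True
```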
Abstract:
There is poor agreement on definitions of different phenotypes of preschool wheezing disorders. The present Task Force proposes to use the terms episodic (viral) wheeze to describe children who wheeze intermittently and are well between episodes, and multiple-trigger wheeze for children who wheeze both during and outside discrete episodes. Investigations are only needed when in doubt about the diagnosis. Based on the limited evidence available, inhaled short-acting beta(2)-agonists by metered-dose inhaler/spacer combination are recommended for symptomatic relief. Educating parents regarding causative factors and treatment is useful. Exposure to tobacco smoke should be avoided; allergen avoidance may be considered when sensitisation has been established. Maintenance treatment with inhaled corticosteroids is recommended for multiple-trigger wheeze; benefits are often small. Montelukast is recommended for the treatment of episodic (viral) wheeze and can be started when symptoms of a viral cold develop. Given the large overlap in phenotypes, and the fact that patients can move from one phenotype to another, inhaled corticosteroids and montelukast may be considered on a trial basis in almost any preschool child with recurrent wheeze, but should be discontinued if there is no clear clinical benefit. Large well-designed randomised controlled trials with clear descriptions of patients are needed to improve the present recommendations on the treatment of these common syndromes.
Abstract:
BACKGROUND: Pain associated with routine procedures in NICUs is often inadequately managed. Barriers to more appropriate pain management include nurses' and physicians' knowledge and the challenges of collaborative decision-making. Few studies describe the differing perceptions of procedural pain intensity among nurses and physicians in NICUs, which could complicate joint decision-making. This study set out to explore the factors influencing pain intensity assessment and to gain insight into a possible pain intensity classification of routine procedures in the NICU. METHOD: A survey was conducted among 431 neonatal health care professionals from 4 tertiary-level NICUs. Each routine procedure was assessed on a 10-point visual analogue scale (VAS), assuming the absence of analgesia. RESULTS: Multiple ANCOVA models showed that nurses rated 19 of the 27 procedures as significantly more painful than did physicians (p<0.05). We found no differences in pain assessment based on professional experience, gender or age. Of the 27 procedures listed, 70% were rated as painful and 44% were judged very painful. A ranking and classification of the pain intensity of routine procedures were drawn up. The general ranking of the medians across all procedures shows that "insertion of a thoracic drain" is assessed as the most painful procedure. CONCLUSIONS: The majority of routine procedures in an NICU are considered to be painful. Nurses generally rate procedures as more painful than do physicians. This difference in assessment deserves exploration with regard to its impact on collaborative decision-making in neonatal pain management.
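The kind of comparison described (ANCOVA of VAS ratings by profession, adjusting for experience, gender and age) could be sketched as follows with statsmodels. The column names and the simulated data are hypothetical and do not reproduce the study's models.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200

# Hypothetical survey data: VAS pain rating of one procedure per respondent.
df = pd.DataFrame({
    "vas": np.clip(rng.normal(6.5, 1.5, n), 0, 10),
    "profession": rng.choice(["nurse", "physician"], n),
    "years_experience": rng.integers(1, 30, n),
    "age": rng.integers(25, 60, n),
    "gender": rng.choice(["f", "m"], n),
})

# ANCOVA-style model: profession effect adjusted for covariates.
model = smf.ols("vas ~ C(profession) + years_experience + age + C(gender)",
                data=df).fit()
print(model.summary().tables[1])
```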
Abstract:
In 1998-2001 Finland suffered the most severe insect outbreak ever recorded there, covering over 500,000 hectares. The outbreak was caused by the common pine sawfly (Diprion pini L.) and has continued in the study area, Palokangas, ever since. To find a good method for monitoring this type of outbreak, this study examined the efficacy of multi-temporal ERS-2 and ENVISAT SAR imagery for estimating Scots pine (Pinus sylvestris L.) defoliation. Three methods were tested: unsupervised k-means clustering, supervised linear discriminant analysis (LDA) and logistic regression. In addition, I assessed whether harvested areas could be differentiated from defoliated forest using the same methods. Two different speckle filters were used to determine the effect of filtering on the SAR imagery and the subsequent results. Logistic regression performed best, producing a classification accuracy of 81.6% (kappa 0.62) with two classes (no defoliation, >20% defoliation). With two classes, LDA reached at best 77.7% accuracy (kappa 0.54) and k-means 72.8% (kappa 0.46). In general, the largest speckle filter, a 5 x 5 image window, performed best. As additional classes were added, accuracy was usually degraded step by step. The results were good, but because of the limitations of the study they should be confirmed with independent data before firm conclusions about their reliability can be drawn. The limitations include the small amount of field data and, thus, problems with accuracy assessment (no separate testing data), as well as the lack of meteorological data from the imaging dates.
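The three classifiers compared in the study (unsupervised k-means, LDA and logistic regression) and the reported accuracy/kappa metrics can be mimicked on synthetic two-class data with scikit-learn; the features below are stand-ins for multi-temporal SAR backscatter, not the actual imagery or field data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic stand-in for multi-temporal backscatter of two classes
# (no defoliation vs. >20% defoliation).
X = np.vstack([rng.normal(-8.0, 1.0, (200, 4)), rng.normal(-6.5, 1.0, (200, 4))])
y = np.repeat([0, 1], 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: accuracy {accuracy_score(y_te, pred):.3f}, "
          f"kappa {cohen_kappa_score(y_te, pred):.3f}")

# Unsupervised k-means: cluster labels are arbitrary, so try both assignments
# before scoring against the reference classes.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_tr)
pred = km.predict(X_te)
acc = max(accuracy_score(y_te, pred), accuracy_score(y_te, 1 - pred))
print(f"k-means: accuracy {acc:.3f}")
```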
Abstract:
The purpose of this study is to provide a procedure for including emissions to the atmosphere resulting from the combustion of diesel fuel during dredging operations in the decision-making process of dredging equipment selection. The proposed procedure is demonstrated for typical dredging methods and data from the Illinois Waterway as performed by the U.S. Army Corps of Engineers, Rock Island District. The equipment included in this study is a 16-inch cutterhead pipeline dredge and a mechanical bucket dredge used during the 2005 dredging season on the Illinois Waterway. Considerable effort has been put forth to identify and reduce environmental impacts from dredging operations. Although the environmental impacts of dredging have been studied, no effort has been applied to the evaluation of air emissions from comparable types of dredging equipment, as in this study. By identifying the type of dredging equipment with the lowest air emissions, when cost, site conditions, and equipment availability are comparable, adverse environmental impacts can be minimized without compromising the dredging project. A total of 48 scenarios were developed by varying the dredged material quantity, transport distance, and production rates. This produced an “envelope” of results applicable to a broad range of site conditions. Total diesel fuel consumed was calculated using standard cost estimating practices as defined in the U.S. Army Corps of Engineers Construction Equipment Ownership and Operating Expense Schedule (USACE, 2005). The diesel fuel usage was estimated for all equipment used to mobilize and/or operate each dredging crew in every scenario. A limited Life Cycle Assessment (LCA) was used to estimate the air emissions from the two comparable dredging operations using SimaPro LCA software. An Environmental Impact Single Score (EISS) was the SimaPro output selected for comparison with the cost per cubic yard (CY) of dredging, potential production rates, and transport distances to identify possible decision points. The total dredging time was estimated for each dredging crew and scenario. An average hourly cost for both dredging crews was calculated based on Rock Island District 2005 dredging season records (Graham 2007/08). The results from this study confirm commonly used rules of thumb in the dredging industry by indicating that mechanical bucket dredges are better suited to long transport distances and have lower air emissions and cost per CY for smaller quantities of dredged material. In addition, the results show that a cutterhead pipeline dredge would be preferable for moderate and large volumes of dredged material when no additional booster pumps are required. Finally, the results indicate that production rates can be a significant factor when evaluating the air emissions of comparable dredging equipment.
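The core bookkeeping in such an analysis is simple: estimate hours of operation per scenario, multiply by fuel burn rates to get diesel consumed, and multiply by an emission factor to get CO2. The figures below are made-up placeholders, not values from the USACE schedule or the SimaPro models.

```python
# Hypothetical fuel-burn and emission bookkeeping for two dredge crews.
CO2_PER_GALLON_DIESEL_KG = 10.2  # placeholder emission factor

def scenario_emissions(quantity_cy, production_cy_per_hr, fuel_gal_per_hr,
                       hourly_cost_usd):
    """Return (hours, diesel gallons, CO2 kg, cost per CY) for one scenario."""
    hours = quantity_cy / production_cy_per_hr
    diesel = hours * fuel_gal_per_hr
    co2 = diesel * CO2_PER_GALLON_DIESEL_KG
    return hours, diesel, co2, hourly_cost_usd * hours / quantity_cy

for crew, prod, fuel, cost in [("cutterhead pipeline", 600.0, 95.0, 2500.0),
                               ("mechanical bucket", 350.0, 60.0, 1800.0)]:
    h, gal, co2, cpcy = scenario_emissions(100_000, prod, fuel, cost)
    print(f"{crew}: {h:.0f} h, {gal:,.0f} gal diesel, "
          f"{co2/1000:,.1f} t CO2, ${cpcy:.2f}/CY")
```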
Abstract:
Studies suggest that hurricane hazard patterns (e.g. intensity and frequency) may change as a consequence of the changing global climate. As hurricane patterns change, hurricane damage risks and costs can be expected to change as a result. This indicates the need to develop hurricane risk assessment models that are capable of accounting for changing hurricane hazard patterns, and to develop hurricane mitigation and climatic adaptation strategies. This thesis proposes a comprehensive hurricane risk assessment and mitigation framework that accounts for a changing global climate and that can be adapted to various types of infrastructure, including residential buildings and power distribution poles. The framework includes hurricane wind field models, hurricane surge height models and hurricane vulnerability models to estimate damage risks due to hurricane wind speed, hurricane frequency, and hurricane-induced storm surge, and accounts for the time-dependent properties of these parameters as a result of climate change. The research then implements median insured house values, discount rates, housing inventory, etc. to estimate hurricane damage costs to residential construction. The framework was also adapted to timber distribution poles to assess the impacts climate change may have on timber distribution pole failure. This research finds that climate change may have a significant impact on the hurricane damage risks and damage costs of residential construction and timber distribution poles. In an effort to reduce damage costs, this research develops mitigation/adaptation strategies for residential construction and timber distribution poles. The cost-effectiveness of these adaptation/mitigation strategies is evaluated through the use of a Life-Cycle Cost (LCC) analysis. In addition, a scenario-based analysis of mitigation strategies for timber distribution poles is included. For both residential construction and timber distribution poles, adaptation/mitigation measures were found to reduce damage costs. Finally, the research develops the Coastal Community Social Vulnerability Index (CCSVI) to include the social vulnerability of a region to hurricane hazards within this hurricane risk assessment. This index quantifies the social vulnerability of a region by combining various social characteristics of the region with time-dependent parameters of hurricanes (i.e. hurricane wind and hurricane-induced storm surge). Climate change was found to have an impact on the CCSVI (i.e. climate change may have an impact on the social vulnerability of hurricane-prone regions).
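At its simplest, the life-cycle cost comparison described here discounts expected annual hurricane damage over a planning horizon, optionally letting the annual damage risk drift upward to represent changing hazard patterns. The sketch below uses invented numbers and a plain exponential trend purely to show the arithmetic; it is not the thesis's model.

```python
def discounted_expected_damage(p_damage_year0, mean_loss, years, discount_rate,
                               annual_risk_growth=0.0):
    """Present value of expected annual damage over a planning horizon.
    annual_risk_growth > 0 lets the damage probability grow each year
    (a crude stand-in for changing hurricane hazard patterns)."""
    total = 0.0
    for t in range(1, years + 1):
        p_t = p_damage_year0 * (1.0 + annual_risk_growth) ** t
        total += p_t * mean_loss / (1.0 + discount_rate) ** t
    return total

base = discounted_expected_damage(0.02, 150_000, 50, 0.03)
shifted = discounted_expected_damage(0.02, 150_000, 50, 0.03,
                                     annual_risk_growth=0.01)
print(f"stationary hazard: ${base:,.0f}; growing hazard: ${shifted:,.0f}")
```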
Abstract:
In the Dominican Republic, economic growth in the past twenty years has not yielded sufficient improvement in access to drinking water services, especially in rural areas where 1.5 million people do not have access to an improved water source (WHO, 2006). Worldwide, strategic development planning in the rural water sector has focused on participatory processes and the use of demand filters to ensure that service levels match community commitment to post-project operation and maintenance. However, studies have concluded that an alarmingly high percentage of drinking water systems (20-50%) do not provide service at the design levels and/or fail altogether (up to 90%): BNWP (2009), Annis (2006), and Reents (2003). The World Bank, USAID, NGOs, and private consultants have invested significant resources in an effort to determine what components make up an “enabling environment” for sustainable community management of rural water systems (RWS). Research has identified an array of critical factors, internal and external to the community, that affect the long-term sustainability of water services. Different frameworks have been proposed in order to better understand the linkages between individual factors and sustainability of service. This research proposes a Sustainability Analysis Tool to evaluate the sustainability of RWS, adapted from previous relevant work in the field to reflect the realities of the Dominican Republic. It can be used as a diagnostic tool by government entities and development organizations to characterize the needs of specific communities and identify weaknesses in existing training regimes or support mechanisms. The framework utilizes eight indicators in three categories (Organization/Management, Financial Administration, and Technical Service). Nineteen independent variables are measured, resulting in a score of sustainability likely (SL), possible (SP), or unlikely (SU) for each of the eight indicators. Thresholds are based upon benchmarks from the DR and around the world, primary data collected during the research, and the author’s 32 months of field experience. A final sustainability score is calculated using weighting factors for each indicator, derived from Lockwood (2003). The framework was tested using a statistically representative, geographically stratified random sample of 61 water systems built in the DR by initiatives of the National Institute of Potable Water (INAPA) and the Peace Corps. The results indicate that 23% of the sample systems are likely to be sustainable in the long term, 59% are possibly sustainable, and for 18% it is unlikely that the community will be able to overcome any significant challenge. Communities scored as unlikely to be sustainable perform poorly in participation, financial durability, and governance, while the highest scores were for system function and repair service. The Sustainability Analysis Tool results are verified by INAPA and PC reports, evaluations, and database information, as well as field observations and primary data collected during the surveys. Future research will analyze the nature and magnitude of relationships between key factors and the sustainability score defined by the tool. Factors include: gender participation, legal status of water committees, plumber/operator remuneration, demand responsiveness, post-construction support methodologies, and project design criteria.
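The scoring mechanics can be illustrated with a toy version of the tool: each indicator receives a categorical rating (SL/SP/SU), the ratings are mapped to numbers, and a weighted sum gives the overall score. The indicator names, weights, rating-to-number mapping and cut-offs below are placeholders, not the thresholds or Lockwood-derived weights used in the research.

```python
# Toy weighted sustainability score; ratings, weights and cut-offs are invented.
RATING_VALUE = {"SL": 1.0, "SP": 0.5, "SU": 0.0}

indicator_weights = {
    "participation": 0.15, "governance": 0.10, "financial_durability": 0.15,
    "tariff_collection": 0.10, "accounting": 0.10, "system_function": 0.15,
    "repair_service": 0.15, "water_quality": 0.10,
}

def sustainability_score(ratings):
    """Weighted sum of indicator ratings, scaled to 0-100."""
    return 100.0 * sum(indicator_weights[k] * RATING_VALUE[r]
                       for k, r in ratings.items())

example = {
    "participation": "SU", "governance": "SP", "financial_durability": "SU",
    "tariff_collection": "SP", "accounting": "SP", "system_function": "SL",
    "repair_service": "SL", "water_quality": "SP",
}
score = sustainability_score(example)
label = "likely" if score >= 75 else "possible" if score >= 50 else "unlikely"
print(f"score {score:.0f}/100 -> sustainability {label}")
```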
Abstract:
Highway infrastructure plays a significant role in society. The building and upkeep of America’s highways provide society with the means of transportation for goods and services needed to develop as a nation. However, as a result of economic and social development, vast amounts of greenhouse gas (GHG) emissions are emitted into the atmosphere, contributing to global climate change. Recognizing this, future policies may mandate the monitoring of GHG emissions from public agencies and private industries in order to reduce the effects of global climate change. To effectively reduce these emissions, there must be methods that agencies can use to quantify the GHG emissions associated with constructing and maintaining the nation’s highway infrastructure. Current methods for assessing the impacts of highway infrastructure include methodologies that look at the economic impacts (costs) of constructing and maintaining highway infrastructure over its life cycle, known as Life Cycle Cost Analysis (LCCA). With the recognition of global climate change, transportation agencies and contractors are also investigating the environmental impacts associated with highway infrastructure construction and rehabilitation. A common tool for doing so is Life Cycle Assessment (LCA). Traditionally, LCA is used to assess the environmental impacts of products or processes; it is an emerging concept in highway infrastructure assessment and is now being implemented and applied to transportation systems. This research focuses on the life cycle GHG emissions associated with the construction and rehabilitation of highway infrastructure using an LCA approach. Life cycle phases of the highway section include the material acquisition and extraction, construction and rehabilitation, and service phases. Departing from traditional approaches that tend to use LCA to compare alternative pavement materials or designs based on estimated inventories, this research proposes a shift to a context-sensitive, process-based approach that uses actual observed construction and performance data to calculate the greenhouse gas emissions associated with highway construction and rehabilitation. The goal is to support strategies that reduce long-term environmental impacts. Ultimately, this thesis outlines techniques that can be used to assess the GHG emissions associated with construction and rehabilitation operations to support the overall pavement LCA.
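Process-based LCA accounting of this kind reduces, at its core, to multiplying observed activity quantities by emission factors and summing over life-cycle phases. The quantities and factors below are purely illustrative placeholders, not values from the research.

```python
# Illustrative process-based GHG tally for one highway section (placeholder data).
# Each entry: (item, quantity, unit, kg CO2e per unit).
phases = {
    "material acquisition": [("hot-mix asphalt", 12_000, "t", 60.0),
                             ("aggregate base", 20_000, "t", 5.0)],
    "construction": [("paver + rollers diesel", 9_000, "gal", 10.2),
                     ("haul trucks diesel", 15_000, "gal", 10.2)],
    "rehabilitation": [("mill-and-fill asphalt", 4_000, "t", 60.0)],
}

total = 0.0
for phase, items in phases.items():
    phase_total = sum(qty * ef for _, qty, _, ef in items)
    total += phase_total
    print(f"{phase}: {phase_total/1000:,.1f} t CO2e")
print(f"total: {total/1000:,.1f} t CO2e")
```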
Abstract:
Complex human diseases are a major challenge for biological research. The goal of my research is to develop effective biostatistical methods in order to create more opportunities for the prevention and cure of human diseases. This dissertation proposes statistical methods that can be adapted to sequencing data in family-based designs and that account for joint effects as well as gene-gene and gene-environment interactions in GWA studies. The framework includes statistical methods for both rare and common variant association studies. Although next-generation DNA sequencing technologies have made rare variant association studies feasible, the development of powerful statistical methods for rare variant association studies is still under way. Chapter 2 demonstrates two adaptive weighting methods for rare variant association studies based on family data for quantitative traits. The results show that both proposed methods are robust to population stratification, robust to the direction and magnitude of the effects of causal variants, and more powerful than methods using the weights suggested by Madsen and Browning [2009]. In Chapter 3, I extended the previously proposed test for Testing the effect of an Optimally Weighted combination of variants (TOW) [Sha et al., 2012] for unrelated individuals to TOW-F, a TOW for family-based designs. Simulation results show that TOW-F can control for population stratification in a wide range of population structures, including spatially structured populations, is robust to the directions of the effects of causal variants, and is relatively robust to the percentage of neutral variants. For GWA studies, this dissertation presents a two-locus joint-effect analysis and a two-stage approach accounting for gene-gene and gene-environment interactions. Chapter 4 proposes a novel two-stage approach, which is promising for identifying joint effects, especially under monotonic models. The proposed approach outperforms a single-marker method and a regular two-stage analysis based on the two-locus genotypic test. In Chapter 5, I proposed a gene-based two-stage approach to identify gene-gene and gene-environment interactions in GWA studies that can include rare variants. The two-stage approach is applied to the GAW 17 dataset to identify the interaction between the KDR gene and smoking status.
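For orientation, a generic weighted burden statistic of the kind these methods build on can be written in a few lines: rare-variant genotypes are collapsed into a weighted sum per individual (here with Madsen-Browning-style frequency weights) and the sum is regressed on the quantitative trait. This is a simplified unrelated-individuals illustration with simulated data, not the family-based adaptive-weighting or TOW-F procedures of the dissertation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, m = 500, 20  # individuals, rare variants

# Hypothetical genotypes (0/1/2 copies) at low-frequency variants and a trait.
freqs = rng.uniform(0.005, 0.02, m)
G = rng.binomial(2, freqs, size=(n, m))
y = G[:, :5].sum(axis=1) * 0.4 + rng.normal(0, 1, n)  # first 5 variants causal

# Madsen-Browning-style weights: rarer variants get larger weights.
q = (G.sum(axis=0) + 1) / (2 * n + 2)
w = 1.0 / np.sqrt(n * q * (1 - q))

burden = G @ w  # weighted collapse per individual
slope, intercept, r, p, se = stats.linregress(burden, y)
print(f"burden-trait association: slope {slope:.3f}, p = {p:.2e}")
```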
Abstract:
During a project, managers encounter numerous contingencies and are faced with the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and the processes are irreversible. Therefore, it is critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) looking at the effects of local decisions on the global project outcomes, (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made, (3) establishing a method to assess and improve the management decision strategies, and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects that have different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies to manage a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules, accompanied by suitable decision strategies, to manage a project at the planning stage. The developed methodology also lays the foundation for an algorithm toward continuously and automatically generating satisfactory schedules and strategies through the construction life of a project. Different from studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, which is based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA, developed previously, has an emulator that duplicates the construction process and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
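To make the notion of a decision strategy concrete, the toy loop below runs a few crude resource-allocation policies through many randomly disrupted realizations of a sequential schedule and compares the resulting project durations. It is only a hedged illustration of the assess-and-improve loop; ICDMA's actual emulator, events, and policies are far richer, and the activities, disruption model, and strategies here are invented.

```python
import random

ACTIVITY_DURATIONS = [5, 8, 3, 6, 4]  # planned durations of sequential activities

def simulate(strategy, n_runs=2000, seed=0):
    """Average project duration under random disruptions for one strategy.
    strategy(activity_index) -> extra crew fraction committed to recovery."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        duration = 0.0
        for i, d in enumerate(ACTIVITY_DURATIONS):
            delay = rng.expovariate(1.0) if rng.random() < 0.3 else 0.0
            duration += d + delay / (1.0 + strategy(i))  # recovery effort shortens delay
        total += duration
    return total / n_runs

strategies = {
    "do nothing": lambda i: 0.0,
    "always add 50% crew": lambda i: 0.5,
    "protect late activities": lambda i: 1.0 if i >= 3 else 0.0,
}
for name, s in strategies.items():
    print(f"{name}: mean duration {simulate(s):.2f} days")
```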
Abstract:
Water distribution systems are important life-saving facilities, especially in the recovery after earthquakes. This paper discusses a framework for the seismic serviceability of water systems that includes fragility evaluation of the water sources of water distribution networks. A case study is also presented on the performance of a water system under different levels of seismic hazard. The seismic serviceability of a water supply system provided by EPANET is evaluated under various levels of seismic hazard. The assessment process is based on hydraulic analysis and Monte Carlo simulations, implemented with empirical fragility data provided by the American Lifeline Alliance (ALA, 2001) for both pipelines and water facilities. Represented by the Seismic Serviceability Index (Cornell University, 2008), the serviceability of the water distribution system is evaluated under earthquake levels with return periods of 72 years, 475 years, and 2475 years. The system serviceability under each level of earthquake hazard is compared with and without considering the seismic fragility of the water source. The results show that the seismic serviceability of the water system decreases as the return period of the seismic hazard grows, and that it decreases further when the seismic fragility of the water source is considered. The results reveal the importance of considering the seismic fragility of water sources, and the growing dependence of the performance of the water system on the seismic resilience of the water source under severe earthquakes.
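The serviceability calculation can be caricatured as follows: in each Monte Carlo run, pipes and the source fail with probabilities taken from fragility data for the chosen hazard level, and the Seismic Serviceability Index is the average fraction of demand nodes still supplied. The network topology, fragility probabilities and supply rule below are invented; a real analysis would drive EPANET hydraulic runs instead of this connectivity shortcut.

```python
import random

# Toy network: each demand node is served through a chain of pipe segments.
NODE_PIPE_COUNTS = [2, 3, 4, 5, 6]  # pipes between the source and each demand node

def serviceability_index(p_pipe_fail, p_source_fail, n_sims=20000, seed=0):
    """Mean fraction of demand nodes still connected to a functioning source."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        if rng.random() < p_source_fail:
            continue  # source lost: no node is served in this run
        served = sum(all(rng.random() > p_pipe_fail for _ in range(k))
                     for k in NODE_PIPE_COUNTS)
        total += served / len(NODE_PIPE_COUNTS)
    return total / n_sims

# Hypothetical fragility probabilities for three hazard levels (return periods).
for label, p_pipe, p_src in [("72-yr", 0.01, 0.02),
                             ("475-yr", 0.05, 0.10),
                             ("2475-yr", 0.12, 0.25)]:
    with_src = serviceability_index(p_pipe, p_src)
    without_src = serviceability_index(p_pipe, 0.0)
    print(f"{label}: SSI {with_src:.3f} (with source fragility), "
          f"{without_src:.3f} (source assumed intact)")
```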
Abstract:
This paper describes the open source framework MARVIN for rapid application development in the field of biomedical and clinical research. MARVIN applications consist of modules that can be plugged together in order to provide the functionality required for a specific experimental scenario. Application modules work on a common patient database that is used to store and organize medical data as well as derived data. MARVIN provides a flexible input/output system with support for many file formats including DICOM, various 2D image formats and surface mesh data. Furthermore, it implements an advanced visualization system and interfaces to a wide range of 3D tracking hardware. Since it uses only highly portable libraries, MARVIN applications run on Unix/Linux, Mac OS X and Microsoft Windows.
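The plug-together module idea can be sketched as a registry of modules sharing one patient data store. This is a hypothetical illustration of the architectural pattern only; the class and method names do not correspond to MARVIN's real API.

```python
# Hypothetical sketch of a plug-together module pattern around a shared
# patient database; names do not correspond to MARVIN's actual interfaces.
class PatientDatabase:
    def __init__(self):
        self._records = {}          # patient id -> dict of named datasets

    def store(self, patient_id, key, data):
        self._records.setdefault(patient_id, {})[key] = data

    def fetch(self, patient_id, key):
        return self._records[patient_id][key]

class Module:
    def run(self, db, patient_id):  # each module reads/writes the shared database
        raise NotImplementedError

class DicomImportModule(Module):
    def run(self, db, patient_id):
        db.store(patient_id, "ct_volume", "<voxel data loaded from DICOM>")

class SegmentationModule(Module):
    def run(self, db, patient_id):
        volume = db.fetch(patient_id, "ct_volume")
        db.store(patient_id, "bone_mesh", f"<mesh derived from {volume!r}>")

# An "application" is just an ordered set of modules plugged together.
db = PatientDatabase()
for module in (DicomImportModule(), SegmentationModule()):
    module.run(db, patient_id="P001")
print(db.fetch("P001", "bone_mesh"))
```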