913 results for Project analysis
Abstract:
OBJECTIVE: To determine the impact of a community based Helicobacter pylori screening and eradication programme on the incidence of dyspepsia, resource use, and quality of life, including a cost consequences analysis. DESIGN: H pylori screening programme followed by randomised placebo controlled trial of eradication. SETTING: Seven general practices in southwest England. PARTICIPANTS: 10,537 unselected people aged 20-59 years were screened for H pylori infection (13C urea breath test); 1558 of the 1636 participants who tested positive were randomised to H pylori eradication treatment or placebo, and 1539 (99%) were followed up for two years. INTERVENTION: Ranitidine bismuth citrate 400 mg and clarithromycin 500 mg twice daily for two weeks or placebo. MAIN OUTCOME MEASURES: Primary care consultation rates for dyspepsia (defined as epigastric pain) two years after randomisation, with secondary outcomes of dyspepsia symptoms, resource use, NHS costs, and quality of life. RESULTS: In the eradication group, 35% fewer participants consulted for dyspepsia over two years compared with the placebo group (55/787 v 78/771; odds ratio 0.65, 95% confidence interval 0.46 to 0.94; P = 0.021; number needed to treat 30) and 29% fewer participants had regular symptoms (odds ratio 0.71, 0.56 to 0.90; P = 0.05). NHS costs were 84.70 pounds sterling (74.90 pounds sterling to 93.91 pounds sterling) greater per participant in the eradication group over two years, of which 83.40 pounds sterling (146 dollars; 121 euro) was the cost of eradication treatment. No difference in quality of life existed between the two groups. CONCLUSIONS: Community screening and eradication of H pylori is feasible in the general population and led to significant reductions in the number of people who consulted for dyspepsia and had symptoms two years after treatment. 
These benefits have to be balanced against the costs of eradication treatment, so a targeted eradication strategy in dyspeptic patients may be preferable.
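The headline effect sizes can be sanity-checked from the reported counts. A minimal sketch (not from the paper; the published OR 0.65 and NNT 30 are adjusted estimates, so these crude values differ slightly):

```python
# Reported counts: 55/787 consulted for dyspepsia (eradication)
# versus 78/771 (placebo). The published OR 0.65 and NNT 30 are
# adjusted estimates, so the crude values below differ slightly.
events_tx, n_tx = 55, 787
events_pl, n_pl = 78, 771

odds_ratio = (events_tx * (n_pl - events_pl)) / (events_pl * (n_tx - events_tx))
risk_difference = events_pl / n_pl - events_tx / n_tx
nnt = 1.0 / risk_difference  # number needed to treat

print(round(odds_ratio, 2), round(nnt))  # 0.67 32 (crude)
```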
Abstract:
Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen {\it et~al}, 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2),$ where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the ``DNAcopy'' package of the Bioconductor project (Gentleman {\it et~al}, 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
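The permutation scheme the abstract refers to can be illustrated with a simplified single change-point version (a sketch only: CBS proper tests circular segments with two change-points, and the hybrid and early-stopping refinements are not shown):

```python
import numpy as np

def max_t_stat(x):
    """Maximal mean-shift statistic over all binary splits of x.

    A simplified, non-circular analogue of the CBS test statistic:
    for each split point i, compare the mean of x[:i] with the mean
    of x[i:], standardized by a pooled (global) standard deviation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    cs = np.cumsum(x)
    sd = x.std(ddof=1)
    best = 0.0
    for i in range(1, n):
        m1 = cs[i - 1] / i
        m2 = (cs[-1] - cs[i - 1]) / (n - i)
        t = abs(m1 - m2) / (sd * np.sqrt(1.0 / i + 1.0 / (n - i)))
        best = max(best, t)
    return best

def permutation_pvalue(x, n_perm=200, seed=0):
    """p-value of the observed maximum under a permutation reference
    distribution (the full-permutation approach the hybrid method speeds up)."""
    rng = np.random.default_rng(seed)
    observed = max_t_stat(x)
    exceed = sum(max_t_stat(rng.permutation(x)) >= observed
                 for _ in range(n_perm))
    return (exceed + 1) / (n_perm + 1)
```

Each permutation costs $O(N)$ per split over $N-1$ splits, which is what makes the full approach $O(N^2)$ per permutation and motivates the hybrid method.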
Abstract:
This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of the quality of microarray data and the incorporation of quality information in subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the substantial amount of noise and thus random features in the data. For all of these components, access to a flexible and efficient statistical computing environment is essential.
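Of the low-level steps listed, normalization of intensities is the easiest to sketch. A minimal quantile normalization across arrays (an illustrative choice, not a method prescribed by the article):

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns (arrays) of X to a common
    distribution: each value is replaced by the mean of the values
    sharing its within-column rank (ties handled arbitrarily)."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank within each column
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)    # averaged distribution
    return mean_quantiles[ranks]
```

After normalization every array shares the same empirical distribution, removing array-wide intensity differences before downstream analysis.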
Abstract:
A basic, yet challenging task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. We discuss different approaches to this task and illustrate how they can be applied using software from the Bioconductor Project. A central problem is the high dimensionality of gene expression space, which prohibits a comprehensive statistical analysis without focusing on particular aspects of the joint distribution of the genes' expression levels. Possible strategies are to perform univariate gene-by-gene analysis, and to perform data-driven nonspecific filtering of genes before the actual statistical analysis. However, more focused strategies that make use of biologically relevant knowledge are more likely to increase our understanding of the data.
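Data-driven nonspecific filtering, as mentioned above, typically ranks genes by an across-sample statistic that ignores the class labels. A minimal sketch using variance (the `nonspecific_filter` name and the variance criterion are illustrative assumptions, not the Bioconductor API):

```python
import numpy as np

def nonspecific_filter(X, frac=0.5):
    """Keep the top `frac` of rows (genes) of X by variance across
    samples, computed without reference to any class labels."""
    v = X.var(axis=1)
    k = int(len(v) * frac)
    keep = np.argsort(v)[::-1][:k]
    return np.sort(keep)  # row indices of the retained genes
```

Because the filter never looks at the condition labels, it reduces dimensionality without biasing the subsequent differential-expression tests.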
Abstract:
BACKGROUND AND OBJECTIVE: Most economic evaluations of chlamydia screening do not include costs incurred by patients. The objective of this study was to estimate both the health service and private costs of patients who participated in proactive chlamydia screening, using mailed home-collected specimens as part of the Chlamydia Screening Studies project. METHODS: Data were collected on the administrative costs of the screening study, and laboratory time-and-motion studies and patient-cost questionnaire surveys were conducted. The cost for each screening invitation and for each accepted offer was estimated. One-way sensitivity analysis was conducted to explore the effects of variations in patient costs and the number of patients accepting the screening offer. RESULTS: The time and costs of processing urine specimens and vulvo-vaginal swabs from women using two nucleic acid amplification tests were similar. The total cost per screening invitation was 20.37 pounds (95% CI 18.94 pounds to 24.83 pounds). This included the National Health Service cost per individual screening invitation of 13.55 pounds (95% CI 13.15 pounds to 14.33 pounds) and average patient costs of 6.82 pounds (95% CI 5.48 pounds to 10.22 pounds). Administrative costs accounted for 50% of the overall cost. CONCLUSIONS: The cost of proactive chlamydia screening is comparable to that of opportunistic screening. Results from this study, which is the first to collect private patient costs associated with a chlamydia screening programme, could be used to inform future policy recommendations and provide unique primary cost data for economic evaluations.
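The reported per-invitation cost decomposes into the NHS and patient components, which a quick check reproduces (figures from the abstract; the one-way sensitivity bounds here vary only the patient cost, so they are narrower than the reported overall CI):

```python
# Figures from the abstract (pounds sterling per screening invitation)
nhs_cost = 13.55                 # NHS cost per individual invitation
patient_cost = 6.82              # average private cost to the patient
patient_cost_ci = (5.48, 10.22)  # reported 95% CI for the patient cost

total = nhs_cost + patient_cost
low = nhs_cost + patient_cost_ci[0]    # one-way sensitivity: patient cost only
high = nhs_cost + patient_cost_ci[1]

print(round(total, 2), round(low, 2), round(high, 2))  # 20.37 19.03 23.77
```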
Abstract:
The challenges posed by global climate change are motivating the investigation of strategies that can reduce the life cycle greenhouse gas (GHG) emissions of products and processes. While new construction materials and technologies have received significant attention, there has been limited emphasis on understanding how construction processes can be best managed to reduce GHG emissions. Unexpected disruptive events tend to adversely impact construction costs and delay project completion. They also tend to increase project GHG emissions. The objective of this paper is to investigate ways in which project GHG emissions can be reduced by appropriate management of disruptive events. First, an empirical analysis of construction data from a specific highway construction project is used to illustrate the impact of unexpected schedule delays in increasing project GHG emissions. Next, a simulation-based methodology is described to assess the effectiveness of alternative project management strategies in reducing GHG emissions. The contribution of this paper is that it explicitly considers project emissions, in addition to cost and project duration, in developing project management strategies. Practical application of the method discussed in this paper will help construction firms reduce their project emissions through strategic project management, and without significant investment in new technology. In effect, this paper lays the foundation for best practices in construction management that will optimize project cost and duration while minimizing GHG emissions.
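A simulation-based assessment of this kind can be caricatured as a Monte Carlo over disruptive events. This toy model (function name and all parameters are hypothetical, not the paper's methodology) simply scales emissions with realized project duration:

```python
import random

def expected_emissions(base_days, daily_co2_t, delay_prob, delay_days,
                       n_runs=10000, seed=1):
    """Monte Carlo estimate of expected project GHG emissions (tonnes CO2)
    when a disruptive event may extend the schedule. Toy model: emissions
    scale linearly with realized duration; all parameters are hypothetical."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_runs):
        days = base_days + (delay_days if random.random() < delay_prob else 0)
        total += days * daily_co2_t
    return total / n_runs
```

Comparing this expectation across candidate strategies (e.g. a lower `delay_prob` bought at some cost) mirrors the cost/duration/emissions trade-off the paper examines.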
Abstract:
In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault’s concept of governmentality, briefly defined as a technology of power that influences or shapes behavior from a distance. This form of governance operates through apparatuses of security, which include higher education. Foucault identified three essential characteristics of an apparatus—the market, the milieu, and the processes of normalization—through which administrative mechanisms and practices operate and govern populations. In this project, my primary focus is on the governance of faculty and administrators, as a population, at residential colleges and universities. I argue that the existing milieu of accountability is one dominated by the neoliberal assumption that all activity—including higher education—works best when governed by market forces alone, reducing higher education to a market-mediated private good. Under these conditions, what many in the academy believe is an essential purpose of higher education—to educate students broadly, to contribute knowledge for the public good, and to serve as society’s critic and social conscience (Washburn 227)—is being eroded. Although NSSE emerged as a form of resistance to commercial college rankings, it did not challenge the forces that empowered the rankings in the first place. Indeed, NSSE data are now being used to make institutions even more responsive to market forces. Furthermore, NSSE’s use has a normalizing effect that tends to homogenize classroom practices and erode the autonomy of faculty in the educational process. It also positions students as part of the system of surveillance. 
In the end, if aspects of higher education that are essential to maintaining a civil society are left to be defined solely in market terms, the result may be a less vibrant and, ultimately, a less just society.
Abstract:
When a single brushless DC motor is fed by an inverter with a sensorless algorithm embedded in the switching controller, the system exhibits a linear and stable output in terms of speed and torque. However, with two motors modulated by the same inverter, the system is unstable and rendered useless for steady operation unless some resistive damping is provided on the supply lines. The project discusses and analyses the stability of such a system through simulations and hardware demonstrations, and also discusses a method to derive the values of this damping resistance.
Abstract:
The objective of this research is to investigate the consequences of sharing or using information generated in one phase of the project in subsequent life cycle phases. Sometimes the assumptions supporting the information change, and at other times the context within which the information was created changes in a way that causes the information to become invalid. Often these inconsistencies are not discovered until the damage has occurred. This study builds on previous research that proposed a framework based on the metaphor of ‘ecosystems’ to model such inconsistencies in the ‘supply chain’ of life cycle information (Brokaw and Mukherjee, 2012). The outcome of such inconsistencies is often litigation. Therefore, this paper studies a set of legal cases that resulted from inconsistencies in life cycle information, within the ecosystems framework. For each project, the errant information type, the creator and user of the information and their relationship, and the time of creation and usage of the information in the life cycle of the project are investigated to assess the causes of failure of precise and accurate information flow as well as the impact of such failures in later stages of the project. The analysis shows that misleading information is mostly due to a lack of collaboration. Moreover, in all the studied cases, lack of compliance checking, imprecise data and insufficient clarifications hinder accurate and smooth flow of information. The paper presents findings regarding bottlenecks in the information flow process during the design, construction and post-construction phases. It also highlights the role of collaboration as well as information integration and management during the project life cycle and presents a baseline for improvement in the information supply chain through the life cycle of the project.
Abstract:
In-cylinder pressure transducers have been used for decades to record combustion pressure inside a running engine. However, due to the extreme operating environment, transducer design and installation must be considered in order to minimize measurement error. One such error is caused by thermal shock, where the pressure transducer experiences a high heat flux that can distort the pressure transducer diaphragm and also change the crystal sensitivity. This research focused on investigating the effects of thermal shock on in-cylinder pressure transducer data quality using a 2.0L, four-cylinder, spark-ignited, direct-injected, turbo-charged GM engine. Cylinder four was modified with five ports to accommodate pressure transducers of different manufacturers. They included an AVL GH14D, an AVL GH15D, a Kistler 6125C, and a Kistler 6054AR. The GH14D, GH15D, and 6054AR were M5 size transducers. The 6125C was a larger, 6.2mm transducer. Note that both of the AVL pressure transducers utilized a PH03 flame arrestor. Sweeps of ignition timing (spark sweep), engine speed, and engine load were performed to study the effects of thermal shock on each pressure transducer. The project consisted of two distinct phases which included experimental engine testing as well as simulation using a commercially available software package. A comparison was performed to characterize the quality of the data between the actual cylinder pressure and the simulated results. This comparison was valuable because the simulation results did not include thermal shock effects. All three sets of tests showed the peak cylinder pressure was basically unaffected by thermal shock. Comparison of the experimental data with the simulated results showed very good correlation. 
The spark sweep was performed at 1300 RPM and 3.3 bar NMEP and showed that the differences between the simulated results (no thermal shock) and the experimental data for the indicated mean effective pressure (IMEP) and the pumping mean effective pressure (PMEP) were significantly less than the published accuracies. All transducers had an IMEP percent difference of less than 0.038% and less than 0.32% for PMEP. Kistler and AVL publish that the accuracy of their pressure transducers is within plus or minus 1% for the IMEP (AVL 2011; Kistler 2011). In addition, the difference in average exhaust absolute pressure between the simulated results and experimental data was the greatest for the two Kistler pressure transducers. The location and lack of a flame arrestor are believed to be the cause of the increased error. For the engine speed sweep, the torque output was held constant at 203 Nm (150 ft-lbf) from 1500 to 4000 RPM. The difference in IMEP was less than 0.01% and the PMEP was less than 1%, except for the AVL GH14D, which was 5%, and the AVL GH15DK, which was 2.25%. A noticeable error in PMEP appeared as the load increased during the engine speed sweeps, as expected. The load sweep was conducted at 2000 RPM over a range of NMEP from 1.1 to 14 bar. The differences in IMEP values were less than 0.08% while the PMEP values were below 1%, except for the AVL GH14D, which was 1.8%, and the AVL GH15DK, which was at 1.25%. In-cylinder pressure transducer data quality was effectively analyzed using a combination of experimental data and simulation results. Several criteria can be used to investigate the impact of thermal shock on data quality as well as determine the best location and thermal protection for various transducers.
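The IMEP figures being compared are, in essence, cycle work per unit displacement obtained by integrating the measured p-V loop. A minimal sketch of that calculation (illustrative only; real analysis uses crank-angle-resolved volume models and pressure pegging corrections):

```python
def imep(pressures, volumes):
    """Indicated mean effective pressure from a closed p-V loop:
    IMEP = (cyclic integral of p dV) / displacement, using trapezoidal
    integration around the loop."""
    n = len(pressures)
    work = 0.0
    for i in range(n):
        j = (i + 1) % n  # wrap around to close the cycle
        work += 0.5 * (pressures[i] + pressures[j]) * (volumes[j] - volumes[i])
    return work / (max(volumes) - min(volumes))
```

Because IMEP integrates the whole loop, localized diaphragm distortion from thermal shock tends to average out, which is consistent with the small IMEP differences reported above.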
Abstract:
In Panama, one of the Environmental Health (EH) Sector’s primary goals is to improve the health of rural Panamanians by helping them to adopt behaviors and practices that improve access to and use of sanitation systems. In line with this goal, the EH sector has used participatory development models to improve hygiene and increase access to latrines through volunteer-managed latrine construction projects. Unfortunately, there is little understanding of the long-term sustainability of these interventions after the volunteers have completed their service. With the Peace Corps adapting its Monitoring, Reporting, and Evaluation procedures, it is appropriate to evaluate the sustainability of sanitation interventions and to offer recommendations for the adaptation of the EH training program, project management, and evaluation procedures. Recognizing the need for evaluation of past latrine projects, the author performed a post-project assessment of 19 pit latrine projects using participatory analysis methodologies. First, the author reviewed volunteers’ perspectives on pit latrine projects in a survey. Then, for comparison, the author performed a survey of latrine projects using a benchmarking scoring system to rate solid waste management, drainage, latrine siting, latrine condition, and hygiene. It was observed that the Sanitation WASH matrix created by the author was an effective tool for evaluating the efficacy of sanitation interventions. Overall, more than 75% of latrines constructed were in use. However, there were some areas where improvements could be made for both latrine construction and health and hygiene. The latrines scored poorly on the indicators related to the privacy structure and seat covers. Interestingly, those are the two items least likely to be included in project subsidies.
Furthermore, scores for hygiene-related indicators were low, particularly those related to hand washing and cleanliness of the kitchen, indicating potential for improvement in hygiene education. Based on these outcomes, the EH sector should consider including subsidies and standardized designs for privacy structures and seat covers for latrines. In addition, the universal adoption of contracts and/or deposits for project beneficiaries is expected to improve the completion of latrines. In order to address the low scores in the health and hygiene indicators, the EH sector should adapt volunteer training, in addition to standardizing health and hygiene intervention procedures. In doing so, the sector should mimic the Community Health Club model that has shown success in improving health and hygiene indicators, as well as use a training session plan format similar to those in the Water Committee Seminar manual. Finally, the sector should have an experienced volunteer dedicated to program oversight and post-project monitoring and evaluation.
Abstract:
This research evaluated an Intelligent Compaction (IC) unit on the M-189 highway reconstruction project at Iron River, Michigan. The results from the IC unit were compared to several traditional compaction measurement devices, including the Nuclear Density Gauge (NDG), Geogauge, Light Weight Deflectometer (LWD), Dynamic Cone Penetrometer (DCP), and Modified Clegg Hammer (MCH). The research collected point-measurement data at 30 test locations on a test section, on both the final Class II sand base layer and the 22A gravel layer. These point measurements were compared with the IC measurements (ICMVs) on a point-to-point basis through linear regression analysis. Poor correlations were obtained between the ICMVs and the compaction measurement points using simple regression analysis. Factors contributing to the weak correlation include soil heterogeneity, variation in IC roller operating parameters, in-place moisture content, the narrow measurement ranges of the compaction devices, and the support conditions of the underlying layers. After incorporating some of these factors into a multiple regression analysis, the strength of correlation improved significantly, especially on the stiffer gravel layer. Measurements were also studied from an overall distribution perspective in terms of average, measurement range, standard deviation, and coefficient of variation. Based on data analysis, on-site project observation and a literature review, conclusions were drawn on how IC performed with regard to compaction control on the M-189 reconstruction project.
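The point-to-point comparison described above reduces to fitting the ICMV against each device's measurements and examining the coefficient of determination. A minimal sketch (illustrative, not the study's exact procedure):

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x,
    as used to judge point-to-point agreement between two devices."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    ss_res = float((residuals ** 2).sum())
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot
```

A low r-squared across the 30 paired points is exactly the "poor correlation" outcome the study reports; adding covariates such as moisture content turns this into the multiple regression that improved the fit.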
Abstract:
This project consists of a proposed curriculum for a semester-long, community-based workshop for LGBTQIA+ (lesbian, gay, bisexual, trans*, queer or questioning, intersex, asexual or ally, "+" indicating other identifications that deviate from heterosexual) youth ages 16-18. The workshop focuses on an exploration of LGBTQIA+ identity and community through discussion and collaborative rhetorical analysis of visual and social media. Informed by queer theory and history, studies on youth work, and visual media studies and incorporating rhetorical criticism as well as liberatory pedagogy and community literacy practices, the participation-based design of the workshop seeks to involve participants in selection of media texts, active analytical viewership, and multimodal response. The workshop is designed to engage participants in reflection on questions of individual and collective responsibility and agency as members and allies of various communities. The goal of the workshop is to strengthen participants' abilities to analyze the complex ways in which television, film, and social media influence their own and others’ perceptions of issues surrounding queer identities. As part of the reflective process, participants are challenged to consider how they can in turn actively and collaboratively respond to and potentially help to shape these perceptions. My project report details the theoretical framework, pedagogical rationale, methods of text selection and critical analysis, and guidelines for conduct that inform and structure the workshop.
Abstract:
Social learning approaches have become a prominent focus in studies related to sustainable agriculture. In order to better understand the potential of social learning for more sustainable development, the present study assessed the processes, effects and facilitating elements of interaction related to social learning in the context of Swiss soil protection and the innovative ‘From Farmer - To Farmer’ project. The study reveals that social learning contributes to fundamental transformations of patterns of interaction. However, the study also demonstrates that a learning-oriented understanding of sustainable development implies analysing the institutional environments in which the organisations of the individual representatives of face-to-face-based social learning processes operate. This proved to be a decisive element when face-to-face-based learning processes of the organisations’ representatives are translated into organisational learning. Moreover, the study revealed that this was achieved not directly through formalisation of new lines of institutionalised cooperation but by establishing links in a ‘boundary space’, trying out new forms of collaboration aimed at social learning and co-production of knowledge. It is argued that further research on social learning processes should give greater emphasis to this intermediary level of ‘boundary spaces’.
Abstract:
Software metrics offer us the promise of distilling useful information from vast amounts of software in order to track development progress, to gain insights into the nature of the software, and to identify potential problems. Unfortunately, however, many software metrics exhibit highly skewed, non-Gaussian distributions. As a consequence, usual ways of interpreting these metrics --- for example, in terms of "average" values --- can be highly misleading. Many metrics, it turns out, are distributed like wealth --- with high concentrations of values in selected locations. We propose to analyze software metrics using the Gini coefficient, a higher-order statistic widely used in economics to study the distribution of wealth. Our approach allows us not only to observe changes in software systems efficiently, but also to assess project risks and monitor the development process itself. We apply the Gini coefficient to numerous metrics over a range of software projects, and we show that many metrics not only display remarkably high Gini values, but that these values are remarkably consistent as a project evolves over time.
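The Gini coefficient the authors propose is straightforward to compute from a sample of metric values (say, lines of code per module); a minimal sketch using the standard order-statistics formula:

```python
import numpy as np

def gini(values):
    """Gini coefficient of a non-negative sample: 0 for perfectly even
    values, approaching 1 when a few items hold most of the total."""
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)  # 1-based ranks of the sorted values
    return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1.0) / n
```

A value near 0 means the metric is evenly spread across the codebase; a value near 1 reflects the wealth-like concentration in a few locations that the abstract describes.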