933 results for "Project analysis"
Abstract:
Urban agriculture is a phenomenon that can be observed world-wide, particularly in cities of developing countries. It contributes significantly to food security and food safety and has sustained the livelihoods of urban and peri-urban low-income dwellers in developing countries for many years. Population increase due to rural-urban migration and natural urbanisation - formal as well as informal - is competing with urban farming for available space and scarce water resources. A multitemporal and multisensoral urban change analysis over a period of 25 years (1982-2007) was performed in order to measure and visualise the urban expansion along the Kizinga and Mzinga valleys in the south of Dar es Salaam. Air photos and VHR satellite data were analysed using a combination of anisotropic textural measures and spectral information. The study revealed that unplanned built-up area is expanding continuously, while vegetation cover and agricultural land are declining at a fast rate. The validation showed that the overall classification accuracy varied depending on the database. The extracted built-up areas were used for visual interpretation and mapping purposes and served as an information source for another research project. The maps visualise urban congestion and an expansion of nearly 18% of the total analysed area in the Kizinga valley between 1982 and 2007. The same development can be observed in the less developed and more remote Mzinga valley between 1981 and 2002. Both areas underwent fast changes: land prices continue to rise, and an influx of people from both rural and urban areas continuously increases density, with the consequence of competing multiple land-use interests.
Abstract:
Advances in food transformation have dramatically increased the diversity of products on the market and, consequently, exposed consumers to a complex spectrum of bioactive nutrients whose potential risks and benefits have mostly not been confidently demonstrated. Therefore, tools are needed to efficiently screen products for selected physiological properties before they enter the market. NutriChip is an interdisciplinary modular project funded by the Swiss programme Nano-Tera, which groups scientists from several areas of research with the aim of developing analytical strategies that will enable functional screening of foods. The project focuses on postprandial inflammatory stress, which potentially contributes to the development of chronic inflammatory diseases. The first module of the NutriChip project is composed of three in vitro biochemical steps that mimic the digestion process, intestinal absorption, and subsequent modulation of immune cells by the bioavailable nutrients. The second module is a miniaturised form of the first module (gut-on-a-chip) that integrates a microfluidic-based cell co-culture system and super-resolution imaging technologies to provide a physiologically relevant fluid flow environment and allows sensitive real-time analysis of the products screened in vitro. The third module aims at validating the in vitro screening model by assessing the nutritional properties of selected food products in humans. Because of the immunomodulatory properties of milk as well as its amenability to technological transformation, dairy products have been selected as model foods. The NutriChip project reflects the opening of food and nutrition sciences to state-of-the-art technologies, a key step in the translation of transdisciplinary knowledge into nutritional advice.
Abstract:
OBJECTIVE: Schizotypal features indicate proneness to psychosis in the general population. It is also possible that they increase transition to psychosis (TTP) among clinical high-risk (CHR) patients. Our aim was to investigate whether schizotypal features predict TTP in CHR patients. METHODS: In the EPOS (European Prediction of Psychosis Study) project, 245 young help-seeking CHR patients were prospectively followed for 18 months and their TTP was identified. At baseline, subjects were assessed with the Schizotypal Personality Questionnaire (SPQ). Associations of SPQ items and subscales with TTP were analysed using Cox regression. RESULTS: The SPQ subscales and items describing ideas of reference and lack of close interpersonal relationships were found to correlate significantly with TTP. The co-occurrence of these features doubled the risk of TTP. CONCLUSIONS: Presence of ideas of reference and lack of close interpersonal relations increases the risk of full-blown psychosis among CHR patients. Their co-occurrence makes the risk of psychosis very high.
Abstract:
This examination of U.S. economic policy directed toward Chile centered on the political and economic changes that occurred within Chile between 1960 and 1988. During this time, U.S. economic policy toward Chile was crafted by members of the American government preoccupied with Cold War concerns, the most important of which was the spread of Communism throughout the globe. By viewing U.S. policy toward Chile through this Cold War lens, this thesis explores the different ways in which economic policy was used to advance political and economic goals not only within Chile, but within Latin America as a whole. The Cold Warriors who crafted and enacted these economic policies were motivated by a variety of factors and influenced by events outside their control. From President John F. Kennedy to Ronald Reagan, American policymakers utilized economic policy as a means to achieve regional goals. This project sheds light on an understudied area of U.S. foreign policy history by exploring the way that economic policy helped achieve Cold War objectives in the Southern Cone.
Abstract:
The purpose of this research project is to study an innovative method for the stability assessment of structural steel systems, namely the Modified Direct Analysis Method (MDM). This method is intended to simplify an existing design method, the Direct Analysis Method (DM), by assuming a sophisticated second-order elastic structural analysis will be employed that can account for member and system instability, and thereby allow the design process to be reduced to confirming the capacity of member cross-sections. This last check can be easily completed by substituting an effective length of KL = 0 into existing member design equations. This simplification will be particularly useful for structural systems in which it is not clear how to define the member slenderness L/r when the laterally unbraced length L is not apparent, such as arches and the compression chord of an unbraced truss. To study the feasibility and accuracy of this new method, a set of 12 benchmark steel structural systems previously designed and analyzed by former Bucknell graduate student Jose Martinez-Garcia and a single column were modeled and analyzed using the nonlinear structural analysis software MASTAN2. A series of Matlab-based programs were prepared by the author to provide the code checking requirements for investigating the MDM. By comparing MDM and DM results against the more advanced distributed plasticity analysis results, it is concluded that the stability of structural systems can be adequately assessed in most cases using MDM, and that MDM often appears to be a more accurate but less conservative method in assessing stability.
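The cross-section check that MDM reduces to can be illustrated with the AISC 360 E3 flexural-buckling equations: substituting an effective length of KL = 0 removes the length effect, so the critical stress reverts to the yield stress. The sketch below is a minimal illustration in Python, not the author's Matlab code-checking programs; the function name and example units (ksi) are assumptions.

```python
import math

def critical_stress(fy, e, kl_over_r):
    """AISC 360 E3 flexural buckling stress (same units as fy).

    fy: yield stress, e: elastic modulus, kl_over_r: member slenderness.
    With KL = 0 the elastic buckling stress is unbounded, so the
    cross-section capacity governs and the critical stress equals fy.
    """
    if kl_over_r == 0:
        return fy                                # KL = 0: cross-section check only
    fe = math.pi ** 2 * e / kl_over_r ** 2       # elastic buckling stress
    if fy / fe <= 2.25:                          # inelastic buckling range
        return 0.658 ** (fy / fe) * fy
    return 0.877 * fe                            # elastic buckling range
```

For example, a Grade 50 column (Fy = 50 ksi, E = 29000 ksi) recovers its full yield stress at KL/r = 0, while increasing slenderness steadily reduces the critical stress.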
Abstract:
Female candidates have become more successful in the political arena, specifically in the United States Senate. Today, females make up twenty percent of the total Senate seats. Despite this increase, females are still underrepresented in Washington. As such, understanding the roadblocks to equality will help us achieve parity. In an attempt to understand various challenges that female senatorial candidates face, this project looks at a specific element of their campaigns, TV advertisements. Assessing candidate advertisements will help us understand whether gender affects strategic campaign decisions. Specifically, this project investigates the relationship between candidate gender and the casting and setting of TV advertisements. Does gender influence the makeup of political ad spots? In order to understand this relationship more completely, I employ both quantitative data and case study analysis for same-gender and mixed-gender primary and general election contests in 2004 and 2008. Ultimately, candidate gender has little to no effect on the casting of senatorial advertisements across both election cycles. In contrast to casting, we observe consistent gender differences across three settings: the political setting, the home setting, and the neighborhood setting. In both 2004 and 2008, female candidates used smaller proportions of ad frames with the political setting in comparison to their male counterparts. Female candidates in both election cycles also employed greater proportions of ad frames with the home and neighborhood settings compared to male candidates. These discrepancies point to a distinction in advertisement strategy depending on the gender of the candidate.
Abstract:
The project compared several indicators of educational attainment across various groups of transition economies, and between transition economies and others. The indicators of education reflected both the quantity of schooling (e.g. average years of schooling, percentage not attending school at all, and adult literacy rates) and schooling quality (e.g. public expenditure on education, pupil-teacher ratios, repetition rates, dropout rates, and international test scores). The basic test used in the project was a t-test on differences in sample means. Among the transition economies, the indicators examined were most favourable for central European, high-income and advanced transition countries, although the differences between these countries and the remaining transition countries were not usually statistically significant. When compared with other world economies, the transition countries typically showed significantly better indicators than developing countries, but differences between transition and developed countries were not statistically significant. The project also examined the behaviour of the correlation coefficients between indicators of education and income, which, as expected, were usually positive.
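The basic test described above, a t-test on differences in sample means, can be sketched as follows. This is a minimal Welch-type implementation in Python (which does not assume equal variances); the sample data are illustrative, not the project's.

```python
import math

def welch_t(a, b):
    """Welch two-sample t statistic for a difference in sample means."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)  # sample variance of b
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Two small illustrative samples with clearly different means:
t = welch_t([10, 12, 14], [20, 22, 24])   # ≈ -6.12
```

The statistic would then be compared against a t reference distribution to decide whether the group means differ significantly.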
Abstract:
BACKGROUND: Despite recent algorithmic and conceptual progress, the stoichiometric network analysis of large metabolic models remains a computationally challenging problem. RESULTS: SNA is an interactive, high-performance toolbox for analysing the possible steady-state behaviour of metabolic networks by computing the generating and elementary vectors of their flux and conversion cones. It also supports analysing the steady states by linear programming. The toolbox is implemented mainly in Mathematica and returns numerically exact results. It is available under an open source license from: http://bioinformatics.org/project/?group_id=546. CONCLUSION: Thanks to its performance and modular design, SNA is demonstrably useful in analysing genome-scale metabolic networks. Further, the integration into Mathematica provides a very flexible environment for the subsequent analysis and interpretation of the results.
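The steady-state condition underlying such analyses is S·v = 0, where S is the stoichiometric matrix and v the flux vector. SNA itself works in Mathematica; as a language-neutral illustration of the exact linear algebra involved, the sketch below computes a rational nullspace basis for a toy three-reaction chain (the network and function names are assumptions, not part of the toolbox).

```python
from fractions import Fraction

def nullspace(S):
    """Basis for {v : S v = 0}, via exact rational Gauss-Jordan elimination."""
    m, n = len(S), len(S[0])
    A = [[Fraction(x) for x in row] for row in S]
    pivots, r = [], 0
    for c in range(n):
        piv = next((i for i in range(r, m) if A[i][c] != 0), None)
        if piv is None:
            continue                      # no pivot in this column: free variable
        A[r], A[piv] = A[piv], A[r]
        pv = A[r][c]
        A[r] = [x / pv for x in A[r]]     # normalise pivot row
        for i in range(m):
            if i != r and A[i][c] != 0:   # eliminate column c elsewhere
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
        if r == m:
            break
    free = [c for c in range(n) if c not in pivots]
    basis = []
    for fc in free:                       # one basis vector per free column
        v = [Fraction(0)] * n
        v[fc] = Fraction(1)
        for i, pc in enumerate(pivots):
            v[pc] = -A[i][fc]
        basis.append(v)
    return basis

# Toy chain: R1 imports A, R2 converts A -> B, R3 exports B.
# Rows of S are the metabolites A and B; columns are R1, R2, R3.
S = [[1, -1, 0],
     [0, 1, -1]]
# The only steady-state flux mode is equal flux through all three reactions.
```

At steady state the single basis vector [1, 1, 1] says the import, conversion, and export fluxes must balance, which is the intuition that flux-cone analysis generalises to genome-scale networks.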
Abstract:
OBJECTIVE: To determine the impact of a community based Helicobacter pylori screening and eradication programme on the incidence of dyspepsia, resource use, and quality of life, including a cost consequences analysis. DESIGN: H pylori screening programme followed by randomised placebo controlled trial of eradication. SETTING: Seven general practices in southwest England. PARTICIPANTS: 10,537 unselected people aged 20-59 years were screened for H pylori infection (13C urea breath test); 1558 of the 1636 participants who tested positive were randomised to H pylori eradication treatment or placebo, and 1539 (99%) were followed up for two years. INTERVENTION: Ranitidine bismuth citrate 400 mg and clarithromycin 500 mg twice daily for two weeks or placebo. MAIN OUTCOME MEASURES: Primary care consultation rates for dyspepsia (defined as epigastric pain) two years after randomisation, with secondary outcomes of dyspepsia symptoms, resource use, NHS costs, and quality of life. RESULTS: In the eradication group, 35% fewer participants consulted for dyspepsia over two years compared with the placebo group (55/787 v 78/771; odds ratio 0.65, 95% confidence interval 0.46 to 0.94; P = 0.021; number needed to treat 30) and 29% fewer participants had regular symptoms (odds ratio 0.71, 0.56 to 0.90; P = 0.05). NHS costs were 84.70 pounds sterling (74.90 pounds sterling to 93.91 pounds sterling) greater per participant in the eradication group over two years, of which 83.40 pounds sterling (146 dollars; 121 euro) was the cost of eradication treatment. No difference in quality of life existed between the two groups. CONCLUSIONS: Community screening and eradication of H pylori is feasible in the general population and led to significant reductions in the number of people who consulted for dyspepsia and had symptoms two years after treatment. 
These benefits have to be balanced against the costs of eradication treatment, so a targeted eradication strategy in dyspeptic patients may be preferable.
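The headline effect sizes can be reproduced approximately from the raw counts reported above. The sketch below computes the crude (unadjusted) odds ratio and number needed to treat; these come out at 0.67 and 32, slightly different from the published figures (0.65, NNT 30), which are from the trial's full analysis.

```python
import math

# Counts from the trial: participants consulting for dyspepsia over two years
events_tx, n_tx = 55, 787   # H pylori eradication group
events_pl, n_pl = 78, 771   # placebo group

risk_tx, risk_pl = events_tx / n_tx, events_pl / n_pl
odds_ratio = (events_tx / (n_tx - events_tx)) / (events_pl / (n_pl - events_pl))
arr = risk_pl - risk_tx          # absolute risk reduction
nnt = math.ceil(1 / arr)         # number needed to treat to prevent one consultation

print(round(odds_ratio, 2), nnt)  # crude values: 0.67 and 32
```

The gap between crude and published values is expected: trial reports typically adjust for covariates and handle follow-up losses, which simple two-by-two arithmetic does not.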
Abstract:
Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen {\it et~al}, 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2)$, where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the ``DNAcopy'' package of the Bioconductor project (Gentleman {\it et~al}, 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
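The core computation, a maximal t-statistic over candidate change-points with a permutation reference distribution, can be sketched as follows. This is an illustrative single change-point version in Python, not the DNAcopy implementation; the full CBS statistic considers circular (two change-point) segments, and the function names are assumptions.

```python
import math
import random

def max_t(x):
    """Maximal two-sample t statistic over all single change-points in x."""
    n = len(x)
    best, best_i = 0.0, None
    for i in range(2, n - 1):            # require at least 2 points per segment
        a, b = x[:i], x[i:]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        va = sum((v - ma) ** 2 for v in a) / (len(a) - 1)
        vb = sum((v - mb) ** 2 for v in b) / (len(b) - 1)
        se = math.sqrt(va / len(a) + vb / len(b)) or 1e-12  # guard zero variance
        t = abs(ma - mb) / se
        if t > best:
            best, best_i = t, i
    return best, best_i

def perm_pvalue(x, n_perm=200, seed=0):
    """Permutation p-value for the observed maximal t statistic."""
    rng = random.Random(seed)
    obs, _ = max_t(x)
    hits = 0
    for _ in range(n_perm):
        y = x[:]
        rng.shuffle(y)                   # permutation reference distribution
        if max_t(y)[0] >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

Each call to `max_t` scans all change-points, so the full permutation approach repeats an expensive maximisation many times; the paper's hybrid method avoids exactly this cost when obtaining the p-value.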
Abstract:
This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of the quality of microarray data and the incorporation of quality information in subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given that the data contain substantial noise and thus random features. For all of these components, access to a flexible and efficient statistical computing environment is essential.
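One of the steps listed above, normalization of fluorescence intensities, can be illustrated with a deliberately simplified median-centering sketch in Python. Real pipelines use richer methods (e.g. intensity-dependent loess or quantile normalization); this stand-in only shows the goal of removing array-to-array shifts before comparison.

```python
import statistics

def median_normalize(arrays):
    """Center each array's log-intensities at zero by subtracting its median,
    so that a constant brightness offset on one array does not masquerade
    as differential expression when arrays are compared."""
    return [[x - statistics.median(arr) for x in arr] for arr in arrays]

# Two illustrative arrays with very different overall intensity levels:
normalized = median_normalize([[1, 2, 3], [10, 20, 30]])
```

After normalization, both arrays are on a comparable scale and gene-level differences can be assessed across them.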
Abstract:
A basic, yet challenging task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. We discuss different approaches to this task and illustrate how they can be applied using software from the Bioconductor Project. A central problem is the high dimensionality of gene expression space, which prohibits a comprehensive statistical analysis without focusing on particular aspects of the joint distribution of the genes' expression levels. Possible strategies are to perform univariate gene-by-gene analysis, or to apply data-driven nonspecific filtering of genes before the actual statistical analysis. However, more focused strategies that make use of biologically relevant knowledge are more likely to increase our understanding of the data.
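The two strategies mentioned, nonspecific filtering followed by gene-by-gene testing, can be combined in a short sketch. This is a minimal Python illustration (the data layout, helper names, and variance filter are assumptions, not Bioconductor code); the filter is label-blind, which is what makes it "nonspecific".

```python
import math

def variance(xs):
    """Sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def welch_t(a, b):
    """Welch two-sample t statistic."""
    va, vb = variance(a), variance(b)
    return (sum(a) / len(a) - sum(b) / len(b)) / math.sqrt(va / len(a) + vb / len(b))

def filter_then_test(expr, groups, k=2):
    """Nonspecific variance filter, then univariate gene-by-gene tests.

    expr: {gene: [log-expression per sample]}; groups: 0/1 label per sample.
    Keeps the k genes with highest overall variance (computed without looking
    at the labels), then tests each survivor for a group difference."""
    keep = sorted(expr, key=lambda g: variance(expr[g]), reverse=True)[:k]
    stats = {}
    for g in keep:
        a = [v for v, lab in zip(expr[g], groups) if lab == 0]
        b = [v for v, lab in zip(expr[g], groups) if lab == 1]
        stats[g] = welch_t(a, b)
    return stats
```

Filtering first shrinks the multiple-testing burden, but because it ignores the labels it does not bias the subsequent per-gene tests.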
Abstract:
BACKGROUND AND OBJECTIVE: Most economic evaluations of chlamydia screening do not include costs incurred by patients. The objective of this study was to estimate both the health service and private costs of patients who participated in proactive chlamydia screening, using mailed home-collected specimens, as part of the Chlamydia Screening Studies project. METHODS: Data were collected on the administrative costs of the screening study, and laboratory time-and-motion studies and patient-cost questionnaire surveys were conducted. The cost for each screening invitation and for each accepted offer was estimated. One-way sensitivity analysis was conducted to explore the effects of variations in patient costs and in the number of patients accepting the screening offer. RESULTS: The time and costs of processing urine specimens and vulvo-vaginal swabs from women using two nucleic acid amplification tests were similar. The total cost per screening invitation was 20.37 pounds (95% CI 18.94 pounds to 24.83 pounds). This comprised the National Health Service cost per individual screening invitation of 13.55 pounds (95% CI 13.15 pounds to 14.33 pounds) and average patient costs of 6.82 pounds (95% CI 5.48 pounds to 10.22 pounds). Administrative costs accounted for 50% of the overall cost. CONCLUSIONS: The costs of proactive chlamydia screening are comparable to those of opportunistic screening. Results from this study, which is the first to collect private patient costs associated with a chlamydia screening programme, could be used to inform future policy recommendations and provide unique primary cost data for economic evaluations.
Abstract:
The challenges posed by global climate change are motivating the investigation of strategies that can reduce the life-cycle greenhouse gas (GHG) emissions of products and processes. While new construction materials and technologies have received significant attention, there has been limited emphasis on understanding how construction processes can best be managed to reduce GHG emissions. Unexpected disruptive events tend to adversely impact construction costs and delay project completion. They also tend to increase project GHG emissions. The objective of this paper is to investigate ways in which project GHG emissions can be reduced by appropriate management of disruptive events. First, an empirical analysis of construction data from a specific highway construction project is used to illustrate the impact of unexpected schedule delays in increasing project GHG emissions. Next, a simulation-based methodology is described to assess the effectiveness of alternative project management strategies in reducing GHG emissions. The contribution of this paper is that it explicitly considers project emissions, in addition to cost and project duration, in developing project management strategies. Practical application of the method discussed in this paper will help construction firms reduce their project emissions through strategic project management, and without significant investment in new technology. In effect, this paper lays the foundation for best practices in construction management that will optimize project cost and duration while minimizing GHG emissions.
Abstract:
In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault’s concept of governmentality, briefly defined as a technology of power that influences or shapes behavior from a distance. This form of governance operates through apparatuses of security, which include higher education. Foucault identified three essential characteristics of an apparatus—the market, the milieu, and the processes of normalization—through which administrative mechanisms and practices operate and govern populations. In this project, my primary focus is on the governance of faculty and administrators, as a population, at residential colleges and universities. I argue that the existing milieu of accountability is one dominated by the neoliberal assumption that all activity—including higher education—works best when governed by market forces alone, reducing higher education to a market-mediated private good. Under these conditions, what many in the academy believe is an essential purpose of higher education—to educate students broadly, to contribute knowledge for the public good, and to serve as society’s critic and social conscience (Washburn 227)—is being eroded. Although NSSE emerged as a form of resistance to commercial college rankings, it did not challenge the forces that empowered the rankings in the first place. Indeed, NSSE data are now being used to make institutions even more responsive to market forces. Furthermore, NSSE’s use has a normalizing effect that tends to homogenize classroom practices and erode the autonomy of faculty in the educational process. It also positions students as part of the system of surveillance. 
In the end, if aspects of higher education that are essential to maintaining a civil society are left to be defined solely in market terms, the result may be a less vibrant and, ultimately, a less just society.