876 results for Large scale evaluation
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Background: Spotted cDNA microarrays generally employ co-hybridization of fluorescently labeled RNA targets to produce gene expression ratios for subsequent analysis. Direct comparison of two RNA samples on the same microarray provides the highest level of accuracy; however, due to the number of combinatorial pair-wise comparisons, the direct method is impractical for studies including a large number of individual samples (e.g., tumor classification studies). For such studies, indirect comparisons using a common reference standard have been the preferred method. Here we evaluated the precision and accuracy of reconstructed ratios from three indirect methods relative to ratios obtained from direct hybridizations, herein considered the gold standard. Results: We performed hybridizations using a fixed amount of Cy3-labeled reference oligonucleotide (RefOligo) against distinct Cy5-labeled targets from prostate, breast, and kidney tumor samples. Reconstructed ratios between all tissue pairs were derived from ratios between each tissue sample and RefOligo. Reconstructed ratios were compared to (i) ratios obtained in parallel from direct pair-wise hybridizations of tissue samples and (ii) reconstructed ratios derived from hybridization of each tissue against a reference RNA pool (RefPool). To evaluate the effect of the external references, reconstructed ratios were also calculated directly from intensity values of single-channel (One-Color) measurements derived from tissue sample data collected in the RefOligo experiments. We show that the average coefficients of variation of ratios between intra- and inter-slide replicates derived from RefOligo, RefPool, and One-Color were similar to each other and 2- to 4-fold higher than those of ratios obtained in direct hybridizations. Correlation coefficients calculated for all three tissue comparisons were also similar. In addition, the performance of all indirect methods, in terms of robustness in identifying genes deemed differentially expressed based on direct hybridizations as well as false-positive and false-negative rates, was found to be comparable. Conclusion: RefOligo produces ratios as precise and accurate as ratios reconstructed from an RNA pool, thus representing a reliable alternative in reference-based hybridization experiments. In addition, One-Color measurements alone can reconstruct expression ratios without loss of precision or accuracy. We conclude that both methods are adequate options in large-scale projects, where the amount of a common reference RNA pool is usually restrictive.
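The arithmetic behind ratio reconstruction is simple enough to sketch: the reference channel cancels when two reference-based measurements are divided. Below is a minimal Python illustration (the gene and replicate values are hypothetical, not data from the study), including the coefficient-of-variation computation used to compare precision.

```python
import numpy as np

def reconstruct_log_ratio(log2_a_vs_ref, log2_b_vs_ref):
    """Reconstruct log2(A/B) from two reference-based hybridizations:
    log2(A/Ref) - log2(B/Ref) = log2(A/B)."""
    return log2_a_vs_ref - log2_b_vs_ref

def cv_across_replicates(log2_ratios):
    """Coefficient of variation of linear-scale ratios across replicates."""
    linear = 2.0 ** np.asarray(log2_ratios)
    return linear.std(ddof=1) / linear.mean()

# Illustrative replicate measurements for one gene (log2 scale)
prostate_vs_ref = np.array([1.10, 1.05, 0.98])  # log2(prostate / RefOligo)
kidney_vs_ref = np.array([0.20, 0.25, 0.15])    # log2(kidney / RefOligo)

log2_ab = reconstruct_log_ratio(prostate_vs_ref, kidney_vs_ref)
print(log2_ab, cv_across_replicates(log2_ab))
```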
Abstract:
Global observations of the chemical composition of the atmosphere are essential for understanding and studying the present and future state of the Earth's atmosphere. When analyzing field experiments, however, consideration of atmospheric motion is indispensable, because transport enables different chemical species, with different local natural and anthropogenic sources, to interact chemically, and consequently influences the chemical composition of the atmosphere. The distance over which that transport occurs is highly dependent upon meteorological conditions (e.g., wind speed, precipitation) and the properties of the chemical species themselves (e.g., solubility, reactivity). This interaction between chemistry and dynamics makes the study of atmospheric chemistry both difficult and challenging, and also demonstrates the relevance of including atmospheric motions in that context. In this doctoral thesis the large-scale transport of air over the eastern Mediterranean region during summer 2001, with a focus on August during the Mediterranean Intensive Oxidant Study (MINOS) measurement campaign, was investigated from a Lagrangian perspective. Analysis of back trajectories demonstrated transport of polluted air masses toward the eastern Mediterranean from western and eastern Europe in the boundary layer, from the North Atlantic/North American area in the middle and upper troposphere, and additionally from South Asia in the upper troposphere. Investigation of air mass transport near the tropopause indicated enhanced cross-tropopause transport over the eastern Mediterranean region in summer relative to the surrounding area. A large band of air mass transport across the dynamical tropopause develops in June and shifts toward higher latitudes in July and August. This shift is associated with the development and intensification of the Arabian and South Asian upper-level anticyclones, and consequently with areas of maximum clear-air turbulence, suggesting quasi-permanent areas of turbulent mixing of tropospheric and stratospheric air over the eastern Mediterranean during summer as a result of the large-scale synoptic circulation. In the context of the latest knowledge about the transport of polluted air masses toward the Mediterranean, and with increasing emissions, especially in developing countries such as India, this issue is likely to gain in importance.
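As context for the Lagrangian approach, here is a minimal sketch of a kinematic back-trajectory integration, assuming a steady, analytically specified 2-D wind field; real analyses of this kind use reanalysis winds, vertical motion, and far more careful numerics.

```python
import numpy as np

def wind(lon, lat):
    """Illustrative steady horizontal wind field (u, v) in degrees per hour."""
    u = 0.3 + 0.1 * np.sin(np.radians(lat))  # broadly westerly component
    v = 0.05 * np.cos(np.radians(lon))       # weak meridional component
    return u, v

def back_trajectory(lon0, lat0, hours=240, dt=1.0):
    """Integrate a parcel backward in time with simple Euler steps."""
    lons, lats = [lon0], [lat0]
    for _ in range(int(hours / dt)):
        u, v = wind(lons[-1], lats[-1])
        lons.append(lons[-1] - u * dt)  # backward in time: subtract displacement
        lats.append(lats[-1] - v * dt)
    return np.array(lons), np.array(lats)

# 10-day back trajectory arriving over the eastern Mediterranean (~33E, 34N)
lons, lats = back_trajectory(33.0, 34.0)
print(lons[-1], lats[-1])  # approximate origin of the arriving air mass
```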
Abstract:
The thesis analyzes the relationships between agricultural development processes and the use of natural resources, particularly energy resources, at the international (developing and developed countries), national (Italy), regional (Emilia-Romagna), and farm levels, with the aim of evaluating the eco-efficiency of agricultural development processes, its evolution over time, and the main dynamics, also in relation to the problems of dependence on fossil resources, food security, and the substitution between agricultural areas dedicated to human food and to animal feed. For the two macroeconomic case studies, the methodology called SUMMA, SUstainability Multi-method, multi-scale Assessment (Ulgiati et al., 2006), was adopted, which integrates a set of impact categories from life cycle assessment (LCA), cost-benefit evaluations, and the global analytical perspective of emergy accounting. The large-scale analysis was further enriched by a local-scale case study of a farm producing milk and renewable electricity (photovoltaic and biogas). That study, conducted by means of LCA and contingent valuation, assessed the environmental, economic, and social effects of scenarios for reducing dependence on fossil sources. The macroeconomic case studies show that, despite policies supporting greater efficiency and "green" forms of production, agriculture at the global level continues to evolve with increasing dependence on fossil energy sources. The first effects of the EU's common agricultural policies toward greater sustainability nevertheless seem to be emerging for European countries. Overall, the energy footprint remains high, since the continuing mechanization of agricultural processes must necessarily draw on energy sources that substitute for human labor. Agricultural land is decreasing in the European countries analyzed and in Italy, increasing the risks of food insecurity, since the national population is instead growing.
Abstract:
Standard procedures for forecasting flood risk (Bulletin 17B) assume annual maximum flood (AMF) series are stationary, meaning the distribution of flood flows is not significantly affected by climatic trends/cycles or anthropogenic activities within the watershed. Historical flood events are therefore considered representative of future flood occurrences, and the risk associated with a given flood magnitude is modeled as constant over time. However, in light of increasing evidence to the contrary, this assumption should be reconsidered, especially as the existence of nonstationarity in AMF series can have significant impacts on planning and management of water resources and relevant infrastructure. Research presented in this thesis quantifies the degree of nonstationarity evident in AMF series for unimpaired watersheds throughout the contiguous U.S., identifies meteorological, climatic, and anthropogenic causes of this nonstationarity, and proposes an extension of the Bulletin 17B methodology which yields forecasts of flood risk that reflect climatic influences on flood magnitude. To appropriately forecast flood risk, it is necessary to consider the driving causes of nonstationarity in AMF series. Herein, large-scale climate patterns, including the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO), are identified as influencing factors on flood magnitude at numerous stations across the U.S. Strong relationships between flood magnitude and associated precipitation series were also observed for the majority of sites analyzed in the Upper Midwest and Northeastern regions of the U.S. Although relationships between flood magnitude and associated temperature series are not apparent, results do indicate that temperature is highly correlated with the timing of flood peaks. Despite consideration only of watersheds classified as unimpaired, analyses also suggest that identified change-points in AMF series are due to dam construction and other types of regulation and diversion. Although not explored herein, trends in AMF series are also likely to be partially explained by changes in land use and land cover over time. Results obtained herein suggest that improved forecasts of flood risk may be obtained using a simple modification of the Bulletin 17B framework, wherein the mean and standard deviation of the log-transformed flows are modeled as functions of climate indices associated with oceanic-atmospheric patterns (e.g., AMO, ENSO, NAO, and PDO) at lead times between 3 and 9 months. Herein, one-year-ahead forecasts of the mean and standard deviation, and subsequently of flood risk, are obtained by applying site-specific multivariate regression models, which reflect the phase and intensity of a given climate pattern as well as possible impacts of coupling of the climate cycles. These forecasts of flood risk are compared with forecasts derived using the existing Bulletin 17B model; large differences in the one-year-ahead forecasts are observed in some locations. The increased knowledge of the inherent structure of AMF series and an improved understanding of the physical and/or climatic causes of nonstationarity gained from this research should serve as insight for the formulation of a physical-causal statistical model, incorporating both climatic variations and human impacts, for flood risk over the longer planning horizons (e.g., 10-, 50-, and 100-year) necessary for water resources design, planning, and management.
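The proposed modification lends itself to a compact sketch: regress the moments of the log-transformed flows on lagged climate indices, then invert the fitted distribution for a one-year-ahead quantile. The sketch below is a simplification with simulated data, assuming a lognormal with a constant conditional standard deviation (Bulletin 17B itself uses a log-Pearson Type III with a skew term, and the thesis models both moments).

```python
import numpy as np
from scipy import stats

# Simulated data: 60 years of annual maximum floods and lagged climate indices
rng = np.random.default_rng(0)
n = 60
amo, enso = rng.normal(size=n), rng.normal(size=n)  # indices at 3-9 month lead
log_q = 5.0 + 0.30 * amo - 0.15 * enso + rng.normal(0, 0.4, n)

# Site-specific regression for the conditional mean of the log-flows
X = np.column_stack([np.ones(n), amo, enso])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
resid = log_q - X @ beta
sigma = resid.std(ddof=X.shape[1])  # conditional std, held constant here

# One-year-ahead 100-year flood given forecast index values (hypothetical)
amo_f, enso_f = 0.8, -0.5
mu_f = beta @ np.array([1.0, amo_f, enso_f])
z = stats.norm.ppf(1 - 1 / 100.0)   # standard normal deviate, exceedance 0.01
q100 = np.exp(mu_f + z * sigma)
print(q100)
```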
Abstract:
OBJECTIVES This study sought to assess the efficacy and safety of newer-generation drug-eluting stents (DES) compared with bare-metal stents (BMS) in an appropriately powered population of patients with ST-segment elevation myocardial infarction (STEMI). BACKGROUND Among patients with STEMI, early-generation DES improved efficacy but not safety compared with BMS. Newer-generation DES (everolimus-eluting and biolimus A9-eluting stents) have been shown to improve clinical outcomes compared with early-generation DES. METHODS Individual patient data for 2,665 STEMI patients enrolled in 2 large-scale randomized clinical trials comparing newer-generation DES with BMS were pooled: 1,326 patients received a newer-generation DES (everolimus-eluting stent or biolimus A9-eluting stent), whereas the remaining 1,329 patients received a BMS. Random-effects models were used to assess differences between the 2 groups for the device-oriented composite endpoint of cardiac death, target-vessel reinfarction, and target-lesion revascularization and the patient-oriented composite endpoint of all-cause death, any infarction, and any revascularization at 1 year. RESULTS Newer-generation DES substantially reduced the risk of the device-oriented composite endpoint compared with BMS at 1 year (relative risk [RR]: 0.58; 95% confidence interval [CI]: 0.43 to 0.79; p = 0.0004). Similarly, the risk of the patient-oriented composite endpoint was lower with newer-generation DES than BMS (RR: 0.78; 95% CI: 0.63 to 0.96; p = 0.02). Differences in favor of newer-generation DES were driven by both a lower risk of repeat revascularization of the target lesion (RR: 0.33; 95% CI: 0.20 to 0.52; p < 0.0001) and a lower risk of target-vessel reinfarction (RR: 0.36; 95% CI: 0.14 to 0.92; p = 0.03). Newer-generation DES also reduced the risk of definite stent thrombosis (RR: 0.35; 95% CI: 0.16 to 0.75; p = 0.006) compared with BMS. CONCLUSIONS Among patients with STEMI, newer-generation DES improve safety and efficacy compared with BMS throughout 1 year. It remains to be determined whether the differences in favor of newer-generation DES are sustained during long-term follow-up.
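Random-effects pooling of two trials can be sketched with the standard DerSimonian-Laird estimator; the per-trial event counts below are hypothetical, chosen only to illustrate the computation, and are not the trial data.

```python
import numpy as np

def pooled_rr_random_effects(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooling of log relative risks."""
    y = np.log((events_t / n_t) / (events_c / n_c))  # per-trial log RR
    v = 1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c  # large-sample variance
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)               # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-trial variance
    w_star = 1 / (v + tau2)
    y_pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return np.exp(y_pooled), np.exp(y_pooled + np.array([-1.96, 1.96]) * se)

# Hypothetical event counts for a device-oriented endpoint in two trials
rr, ci = pooled_rr_random_effects(
    events_t=np.array([40.0, 35.0]), n_t=np.array([660.0, 666.0]),
    events_c=np.array([70.0, 60.0]), n_c=np.array([665.0, 664.0]))
print(rr, ci)
```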
Abstract:
There is a shortage of empirical applications of the capability approach that employ closed survey instruments to assess self-reported capabilities. Moreover, for the few instruments that have been designed and administered through surveys to date, no psychometric properties (reliability, validity, and factor structure) have been reported. The purpose of this study is to assess the psychometric properties of three new language versions (German, French, and Italian) of an established (English) set of eight self-reported capability items. The set of items is taken from a previously published British study by Anand and van Hees (J Soc Econ 35(2):268–284, 2006). Our sample consists of 17,152 young male adults aged 18–25 years from the three major language regions of Switzerland. The results indicate good reliability for the three language versions. The results from the exploratory factor analyses suggest a one-dimensional factor structure for seven domain-specific items. Furthermore, the results from multiple regression analyses suggest that a global summary item on overall capabilities represents a measurement alternative to the set of seven domain-specific capability items. Finally, the results confirm the applicability of the closed capability instrument in a large-scale survey questionnaire and represent the first attempt to measure self-reported capabilities in Switzerland.
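Two of the reported checks, internal-consistency reliability and a one-dimensional factor structure, are easy to sketch; the simulated responses below stand in for the survey data, and a first-eigenvalue check stands in for a full exploratory factor analysis.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated responses: 7 capability items on a 5-point scale, one latent trait
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))
items = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (500, 7))), 1, 5)

print(cronbach_alpha(items))

# One-dimensionality check: share of variance on the first eigenvalue
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]
print(eigvals[0] / eigvals.sum())  # a dominant first eigenvalue suggests one factor
```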
Abstract:
Self-administered online surveys provide a higher level of privacy protection to respondents than surveys administered by an interviewer. Yet studies indicate that asking sensitive questions is problematic even in self-administered surveys. Because respondents might not be willing to reveal the truth and may provide answers that are subject to social desirability bias, the validity of prevalence estimates of sensitive behaviors from online surveys can be challenged. A well-known method to overcome these problems is the Randomized Response Technique (RRT). However, convincing evidence that the RRT provides more valid estimates than direct questioning in online surveys is still lacking. A new variant of the RRT, called the Crosswise Model, has recently been proposed to overcome some of the deficiencies of existing RRT designs. We therefore conducted an experimental study in which different implementations of the RRT, including two implementations of the Crosswise Model, were tested and compared to direct questioning. Our study is a large-scale online survey (N = 6,037) on sensitive behaviors of students, such as cheating in exams and plagiarism. Results indicate that the crosswise-model RRT, unlike the other variants of RRT we evaluated, yields higher prevalence estimates of sensitive behaviors than direct questioning. Whether higher estimates are a sufficient condition for more valid results, however, remains questionable.
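The crosswise model admits a closed-form prevalence estimator, sketched below under the standard model assumptions; the counts and the nonsensitive-item prevalence are hypothetical, not figures from the study.

```python
import numpy as np

def crosswise_prevalence(n_same, n_total, p):
    """Estimate sensitive-trait prevalence pi from crosswise-model answers.

    Respondents report only whether their answers to the sensitive item and
    to a nonsensitive item with known prevalence p agree. Under the model,
    P(agree) = pi*p + (1 - pi)*(1 - p), so pi = (lambda + p - 1) / (2p - 1).
    """
    lam = n_same / n_total
    pi = (lam + p - 1) / (2 * p - 1)
    # Large-sample standard error via the delta method
    se = np.sqrt(lam * (1 - lam) / n_total) / abs(2 * p - 1)
    return pi, se

# Hypothetical: 2,000 students, nonsensitive-item prevalence p = 0.25,
# 1,200 report "my two answers are the same"
pi, se = crosswise_prevalence(1200, 2000, 0.25)
print(round(pi, 3), round(se, 3))
```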
Abstract:
Simulating surface wind over complex terrain is a challenge in regional climate modelling. This study therefore aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme, and the way WRF is nested to the driving data set. Hence, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups, all sharing the same driving data set. The results show that the lack of representation of unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests include four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, in which the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser. The results demonstrate the important role of horizontal resolution: the step from 6 to 2 km significantly improves model performance. In summary, the combination of a grid size of 2 km, a non-local PBL scheme modified to explicitly account for non-resolved orography, and analysis or spectral nudging performs best when dynamical downscaling is aimed at reproducing real wind fields.
Abstract:
The evaluation for European Union market approval of coronary stents falls under the Medical Device Directive, adopted in 1993. Specific requirements for the assessment of coronary stents are laid out in supplementary advisory documents. In response to a call by the European Commission to make recommendations for a revision of the advisory document on the evaluation of coronary stents (Appendix 1 of MEDDEV 2.7.1), the European Society of Cardiology (ESC) and the European Association of Percutaneous Cardiovascular Interventions (EAPCI) established a Task Force to develop an expert advisory report. As the basis for its report, the ESC-EAPCI Task Force reviewed existing processes, established a comprehensive list of all coronary drug-eluting stents that have received a CE mark to date, and undertook a systematic review of the literature of all published randomized clinical trials evaluating clinical and angiographic outcomes of coronary artery stents between 2002 and 2013. Based on these data, the Task Force provided recommendations to inform a new regulatory process for coronary stents. The main recommendations include implementation of a standardized non-clinical assessment of stents and a novel clinical evaluation pathway for market approval. The two-stage clinical evaluation plan comprises an initial pre-market trial with objective performance criteria (OPC) benchmarking using invasive imaging follow-up, leading to conditional CE-mark approval, and a subsequent mandatory, large-scale randomized trial with clinical endpoint evaluation, leading to an unconditional CE mark. The data analysis from the Task Force's systematic review may provide a basis for the determination of OPC for use in future studies. This paper represents an executive summary of the Task Force's report.
Abstract:
Paper submitted to the IFIP International Conference on Very Large Scale Integration (VLSI-SOC), Darmstadt, Germany, 2003.
Abstract:
Manual curation has long been held to be the gold standard for functional annotation of DNA sequence. Our experience with the annotation of more than 20,000 full-length cDNA sequences revealed problems with this approach, including inaccurate and inconsistent assignment of gene names, as well as many good assignments that were difficult to reproduce using only computational methods. For the FANTOM2 annotation of more than 60,000 cDNA clones, we developed a number of methods and tools to circumvent some of these problems, including an automated annotation pipeline that provides high-quality preliminary annotation for each sequence by introducing an uninformative filter that eliminates uninformative annotations, controlled vocabularies to accurately reflect both the functional assignments and the evidence supporting them, and a highly refined, Web-based manual annotation tool that allows users to view a wide array of sequence analyses and to assign gene names and putative functions using a consistent nomenclature. The ultimate utility of our approach is reflected in the low rate of reassignment of automated assignments by manual curation. Based on these results, we propose a new standard for large-scale annotation, in which the initial automated annotations are manually investigated and then computational methods are iteratively modified and improved based on the results of manual curation.
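The idea of an uninformative filter can be sketched as follows; the regular-expression patterns and the hit format are illustrative assumptions, not the actual FANTOM2 rules.

```python
import re

# Illustrative patterns; the actual FANTOM2 filter rules are not reproduced here
UNINFORMATIVE = [
    r"^hypothetical protein",
    r"^unknown( protein)?$",
    r"^unnamed protein product",
    r"^cDNA clone",
    r"^expressed sequence",
]
PATTERNS = [re.compile(p, re.IGNORECASE) for p in UNINFORMATIVE]

def best_informative_hit(hits):
    """Return the top-scoring similarity hit with an informative description.

    `hits` is a list of (description, score) pairs, e.g. parsed from a
    BLAST-style search; descriptions matching an uninformative pattern are
    skipped so that a weaker but meaningful annotation can surface.
    """
    informative = [(d, s) for d, s in hits
                   if not any(p.search(d) for p in PATTERNS)]
    return max(informative, key=lambda h: h[1], default=None)

hits = [("unnamed protein product", 980.0),
        ("hypothetical protein XP_12345", 950.0),
        ("ATP-dependent RNA helicase DDX3X", 910.0)]
print(best_informative_hit(hits))  # -> the helicase, despite its lower score
```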
Abstract:
In empirical studies of evolutionary algorithms, it is usually desirable to evaluate and compare algorithms using as many different parameter settings and test problems as possible, in order to have a clear and detailed picture of their performance. Unfortunately, the total number of experiments required may be very large, which often makes such research work computationally prohibitive. In this paper, the application of a statistical method called racing is proposed as a general-purpose tool to reduce the computational requirements of large-scale experimental studies of evolutionary algorithms. Experimental results are presented showing that racing typically requires only a small fraction of the cost of an exhaustive experimental study.
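The racing idea is straightforward to sketch: evaluate all surviving configurations on one test instance at a time, and prune once a statistical test detects differences. The sketch below is in the spirit of F-Race, but a crude mean-based pruning rule stands in for the method's pairwise post-tests, and all costs are simulated.

```python
import numpy as np
from scipy.stats import friedmanchisquare

def race(candidates, instances, evaluate, alpha=0.05, min_batch=5):
    """Race candidate configurations on a stream of test instances.

    `evaluate(candidate, instance)` returns a cost. After each instance,
    once at least `min_batch` results per candidate exist, a Friedman test
    checks for differences; if significant, clearly worse candidates are
    pruned (crude rule: mean cost above best mean + 1 std of the means).
    """
    alive = list(candidates)
    results = {c: [] for c in alive}
    for i, inst in enumerate(instances):
        for c in alive:
            results[c].append(evaluate(c, inst))
        if len(alive) > 2 and i + 1 >= min_batch:
            _, p = friedmanchisquare(*(results[c] for c in alive))
            if p < alpha:
                means = {c: np.mean(results[c]) for c in alive}
                cutoff = min(means.values()) + np.std(list(means.values()))
                alive = [c for c in alive if means[c] <= cutoff]
    return alive, results

# Hypothetical race over 4 configurations with noisy simulated costs
rng = np.random.default_rng(2)
true_cost = {"cfg-A": 1.0, "cfg-B": 1.1, "cfg-C": 1.6, "cfg-D": 2.0}
evaluate = lambda c, i: true_cost[c] + rng.normal(0, 0.2)
survivors, _ = race(list(true_cost), range(30), evaluate)
print(survivors)  # poor configurations are eliminated early, saving evaluations
```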
Abstract:
The purpose of this paper is to examine consumers' experience of a performing arts service to identify the predictors of audience behaviour, especially as related to positive repurchase intention. Experiential service settings such as the performing arts have been cited in recent research as service contexts that may challenge current theory that repurchase intention is driven by service quality and customer satisfaction. It is posited that consumer emotions and the hedonic nature of the consumption experience may complicate the evaluation process leading to repurchase intention in a setting such as the performing arts. Qualitative semi-structured in-depth interviews were undertaken with twenty-six performing arts consumers using a pool of questions and prompts developed from a review of the extant literature. Transcribed field notes were examined for key words and phrases, and the data were divided into the main emergent themes related to each of the questions and also coded for confirmation and disconfirmation of the constructs and relationships in the extant literature. The dimensions of service experience, price, service quality, target goal-directed emotions, and non-target appraisal emotions were identified as driving repurchase intention in a performing arts setting. Customer satisfaction in this setting appears to result from emotional factors rather than expectancy disconfirmation. This research supports the notion that an experiential consumption experience such as the performing arts will challenge the current theory of the drivers of repurchase intention, and suggests that a more thorough, large-scale examination of these dimensions in this service setting is warranted.
Abstract:
This work is concerned with the development of techniques for the evaluation of large-scale highway schemes, with particular reference to the assessment of their costs and benefits in the context of the current transport planning (T.P.P.) process. It has been carried out in close cooperation with West Midlands County Council, although its application and results are applicable elsewhere. The background to highway evaluation and its development in recent years is described, and the emergence of a number of deficiencies in current planning practice noted. One deficiency in particular stood out: that stemming from inadequate methods of scheme generation. The research has therefore concentrated upon improving this stage of appraisal, to ensure that the subsequent stages of design, assessment, and implementation are based upon a consistent and responsive foundation. Deficiencies of scheme evaluation were found to stem from insufficiently developed appraisal methodologies, which suffer from difficulties of valuation, measurement, and aggregation of the disparate variables that characterise highway evaluation. A failure to respond to local policy priorities was also noted. A 'problem-based' rather than 'goals-based' approach to scheme generation was taken, as it represented the current and foreseeable resource-allocation context more realistically. Techniques with potential for problem-based highway scheme generation that would work within a series of practical and theoretical constraints were reviewed, and multivariate analysis, classical factor analysis in particular, was selected, because it was well suited to addressing the existing difficulties of valuation, measurement, and aggregation. Computer programs were written to adapt classical factor analysis to the requirements of T.P.P. highway evaluation, using it to derive a limited number of factors that describe the extensive body of highway problem data. From this, a series of composite problem scores for 1979 was derived for a case study area of south Birmingham, based upon the factorial solutions, and used to assess highway sites in terms of local policy issues. The methodology was assessed in the light of its ability to describe highway problems in both aggregate and disaggregate terms, to guide scheme design, to coordinate with current scheme evaluation methods, and in general to improve upon current appraisal. Analysis of the results was carried out both in subjective, 'common-sense' terms and using statistical methods to assess the changes in problem definition, distribution, and priorities that emerged. Overall, the technique was found to improve upon current scheme generation methods in all respects, in particular in overcoming the problems of valuation, measurement, and aggregation without recourse to unsubstantiated and questionable assumptions. A number of remaining deficiencies are outlined, and a series of research priorities described, which need to be reviewed in the light of current and future evaluation needs.
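The scoring step can be sketched as a standard factor decomposition of standardized problem indicators; the indicator names and data below are hypothetical, and a basic principal-factor decomposition stands in for the thesis's classical factor analysis programs.

```python
import numpy as np

def composite_problem_scores(data, n_factors=2):
    """Derive composite scores from a sites-by-indicators matrix via a
    principal-factor decomposition (an illustrative stand-in for classical
    factor analysis)."""
    X = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)  # standardize
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1][:n_factors]
    loadings = eigvecs[:, order] * np.sqrt(eigvals[order])  # factor loadings
    scores = X @ loadings                                   # site factor scores
    weights = eigvals[order] / eigvals[order].sum()         # variance-explained weights
    return scores @ weights                                 # weighted composite

# Hypothetical indicators per site: accidents, delay, noise, severance
rng = np.random.default_rng(3)
data = rng.normal(size=(40, 4)) + rng.normal(size=(40, 1))  # shared 'problem' factor
print(composite_problem_scores(data)[:5])  # scores for ranking candidate sites
```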