876 results for Large scale evaluation
Abstract:
Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. Exploiting this distance property of the domain, we use clustering techniques to partition the data space and thereby convert an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under general conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. We then solve the LA problem by applying a simulated annealing algorithm to both the clustered and the non-clustered data, in order to determine how profitable the clustering is and which of the presented methods is the most suitable.
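As a rough illustration of the decomposition described (not the authors' implementation), the sketch below partitions synthetic demand points with k-means and then runs a simple simulated-annealing search for facility sites within each cluster; the cost function, parameters, and data are illustrative assumptions.

```python
# Illustrative only: k-means partition + per-cluster simulated annealing.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def total_distance(points, facilities):
    # Sum over demand points of the distance to their nearest facility.
    d = np.linalg.norm(points[:, None, :] - points[facilities][None, :, :], axis=2)
    return d.min(axis=1).sum()

def anneal(points, p, iters=2000, t0=1.0, cooling=0.999):
    current = rng.choice(len(points), size=p, replace=False)
    cost, t = total_distance(points, current), t0
    for _ in range(iters):
        cand = current.copy()
        # move one facility to a candidate site not already used
        cand[rng.integers(p)] = rng.choice(np.setdiff1d(np.arange(len(points)), cand))
        c = total_distance(points, cand)
        if c < cost or rng.random() < np.exp((cost - c) / t):  # Metropolis rule
            current, cost = cand, c
        t *= cooling
    return current, cost

points = rng.random((1000, 2))  # synthetic demand locations
labels = KMeans(n_clusters=5, n_init=10).fit_predict(points)
total = sum(anneal(points[labels == k], p=3)[1] for k in range(5))
print(f"clustered total assignment cost: {total:.2f}")
```

Solving each cluster independently is what makes the decomposition pay off: the O(N·p) cost evaluation runs on much smaller point sets.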
Abstract:
A first assessment of debris flow susceptibility at a large scale was performed along National Road N7, Argentina. Numerous catchments are prone to debris flows and likely to endanger road users. A 1:50,000 susceptibility map was created. The use of a DEM (30 m grid) combined with three complementary criteria (slope, contributing area, curvature) allowed the identification of potential source areas. The debris flow spreading was estimated using a process- and GIS-based model (Flow-R) based on basic probabilistic and energy calculations. The best-fit values for the coefficient of friction and the mass-to-drag ratio of the PCM model were found to be μ = 0.02 and M/D = 180, and the resulting propagation on one of the calibration sites was validated using the Coulomb friction model. The results are realistic and will be useful to determine which areas need to be prioritized for detailed studies.
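For context, the two-parameter PCM (Perla-Cheng-McClung) model referenced above updates the squared flow velocity segment by segment along the path. The sketch below implements that standard update with the reported best-fit values μ = 0.02 and M/D = 180; the slope profile is invented for illustration, and this is not the Flow-R code.

```python
# PCM runout sketch: v_b^2 = a*(M/D)*(1 - e^(-2L/(M/D))) + v_a^2 * e^(-2L/(M/D)),
# with a = g*(sin(theta) - mu*cos(theta)); flow stops where v^2 would reach zero.
import math

G = 9.81      # gravity (m/s^2)
MU = 0.02     # best-fit friction coefficient reported above
MD = 180.0    # best-fit mass-to-drag ratio M/D (m)

def pcm_runout(segments, mu=MU, md=MD):
    """segments: list of (length_m, slope_deg); returns runout distance (m)."""
    v2, travelled = 0.0, 0.0
    for length, slope_deg in segments:
        theta = math.radians(slope_deg)
        a = G * (math.sin(theta) - mu * math.cos(theta))  # driving acceleration
        decay = math.exp(-2.0 * length / md)
        v2_next = a * md * (1.0 - decay) + v2 * decay     # PCM velocity update
        if v2_next <= 0.0:
            # flow stops inside this segment: solve the update for v^2 = 0
            x = -0.5 * md * math.log(a * md / (a * md - v2))
            return travelled + x
        v2, travelled = v2_next, travelled + length
    return travelled

# toy profile: steep source area, channel, then a nearly flat deposition fan
profile = [(200, 35), (300, 20), (500, 0.5)]
print(f"runout distance: {pcm_runout(profile):.0f} m")
```

With μ this low, the flow only decelerates on near-flat reaches, which is why the stopping criterion engages on the fan segment in the toy profile.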
Abstract:
Nanoparticles (<100 nanometres) are being introduced into industrial processes, but they are suspected of causing negative health effects similar to those of ambient particles. Poor knowledge about the scale of their introduction has so far prevented a global risk analysis. In 2006, a targeted telephone survey among Swiss companies (1) showed the use of nanoparticles in a few selected companies but did not provide data to extrapolate to the full Swiss workforce. The purpose of the study presented here was to provide a quantitative estimate of the potential occupational exposure to nanoparticles in Swiss industry. Method: A stratified representative questionnaire survey among 1626 Swiss companies of the production sector was conducted in 2007. The survey was a written questionnaire, collecting data about the nanoparticles used, the number of potentially exposed persons in the companies, and their protection strategies. Results: The response rate of the study was 58.3%. The number of companies estimated to be using nanoparticles in Switzerland was 586 (95% confidence interval 145 to 1027). It is estimated that 1309 workers (95% CI 1073 to 1545) work in the same room as a nanoparticle application. Personal protection was shown to be the predominant means of protection. Such information is valuable for risk evaluation. The low number of companies dealing with nanoparticles in Switzerland suggests that policy makers, as well as health, safety and environmental officers within companies, can focus their efforts on a relatively small number of companies or workers. The collected data about types of particles and applications may be used for research on prevention strategies and adapted means of protection. However, to reflect the most recent trends, the information presented here has to be continuously updated, and a large-scale inventory of usage should be considered.
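As a hedged illustration of how such extrapolated totals and confidence intervals are typically computed from a stratified survey, the sketch below applies a standard stratified expansion estimator with finite-population correction; the strata sizes and counts are invented and do not reproduce the study's data.

```python
# Stratified expansion estimate of a population total with a 95% CI.
# Strata below are made-up stand-ins, not the Swiss survey's actual strata.
import math

# (N_h companies in stratum, n_h sampled, x_h reporting nanoparticle use)
strata = [(5000, 400, 2), (2000, 300, 5), (500, 248, 10)]

total_est, var = 0.0, 0.0
for N, n, x in strata:
    p = x / n
    total_est += N * p
    # finite-population-corrected variance of the estimated stratum total
    var += N**2 * (1 - n / N) * p * (1 - p) / (n - 1)

half = 1.96 * math.sqrt(var)
print(f"estimated companies: {total_est:.0f} "
      f"(95% CI {total_est - half:.0f} to {total_est + half:.0f})")
```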
Abstract:
BACKGROUND: An objective measurement of surgical procedure outcomes is inherent to the quality control of professional practice; in orthopaedics this especially applies to joint replacement outcomes. A self-administered questionnaire offers an attractive alternative to the surgeon's judgement but is infrequently used in France for these purposes. The British questionnaire, the 12-item Oxford Hip Score (OHS), was selected for this study because of its ease of use. HYPOTHESIS: The objective of this study was to validate the French translation of the self-assessment 12-item Oxford Hip Score and compare its results with those of the reference functional scores: the Harris Hip Score (HHS) and the Postel-Merle d'Aubigné (PMA) score. MATERIALS AND METHODS: Based on a clinical series of 242 patients who were candidates for total hip arthroplasty, the French translation of this questionnaire was validated. Its coherence was also validated by comparing the preoperative data with the data obtained from the two other reference clinical scores. RESULTS: The translation was validated using the forward-backward translation procedure from French to English, with correction of all differences or mistranslations after systematized comparison with the original questionnaire in English. The mean overall OHS was 43.8 points (range, 22-60 points), with a similarly good distribution of the overall values of the three scores compared. The correlation was excellent between the OHS and the HHS, but an identical correlation between the OHS and the PMA was only obtained for the combination of the pain and function parameters, after excluding the mobility criterion, which is relatively over-represented in the PMA score. DISCUSSION AND CONCLUSION: Subjective questionnaires that contribute the patient's personal appreciation of the results of arthroplasty can easily be applied on a large scale. This study made a translated and validated version of an internationally recognized, reliable self-assessment score available to French orthopaedic surgeons. The results obtained encourage us to use this questionnaire as a complement to the classical evaluation scores and methods.
Abstract:
Many terrestrial and marine systems are experiencing accelerating decline due to the effects of global change. This situation has raised concern about the consequences of biodiversity losses for ecosystem function, ecosystem service provision, and human well-being. Coastal marine habitats are a main focus of attention because they harbour a high biological diversity, are among the most productive systems in the world, and experience high levels of anthropogenic interaction. The accelerating degradation of many terrestrial and marine systems highlights the urgent need to evaluate the consequences of biodiversity loss. Because marine biodiversity is a dynamic entity and this study was interested in global change impacts, it focused on benthic biodiversity trends over large spatial and long temporal scales. The main aim of this project was to investigate the current extent of biodiversity of the highly diverse benthic coralligenous communities in the Mediterranean Sea, to detect their changes, and to predict their future changes over broad spatial and long temporal scales. These marine communities are characterized by structural species with low growth rates and long life spans; they are therefore considered particularly sensitive to disturbances. For this purpose, this project analyzed permanent photographic plots over time at four locations in the NW Mediterranean Sea. The spatial scale of this study provided information on the level of species similarity between these locations, thus offering a solid background on the amount of large-scale variability in coralligenous communities, whereas the temporal scale was fundamental to determining natural variability, in order to discriminate between changes due to natural factors and those related to the impact of disturbances (e.g. mass mortality events related to positive temperature anomalies, or extreme catastrophic events). This study directly addressed the challenging task of analyzing quantitative biodiversity data from these highly diverse marine benthic communities. Overall, the scientific knowledge gained from this research project will improve our understanding of the functioning of marine ecosystems and their trajectories under global change.
Abstract:
In the last few years, some of the visionary concepts behind the virtual physiological human have begun to be demonstrated in various clinical domains, showing great promise for improving healthcare management. In the current work, we provide an overview of image- and biomechanics-based techniques that, when put together, provide a patient-specific pipeline for the management of intracranial aneurysms. The derivation and subsequent integration of morphological, morphodynamic, haemodynamic and structural analyses allow us to extract patient-specific models and information from which diagnostic and prognostic descriptors can be obtained. Linking such new indices with relevant clinical events should bring new insights into the processes behind aneurysm genesis, growth and rupture. The development of techniques for modelling endovascular devices such as stents and coils allows the evaluation of alternative treatment scenarios before the intervention takes place and could also contribute to the understanding and improved design of more effective devices. A key element in facilitating the clinical uptake of all these developments is their comprehensive validation. Although a number of previously published results have shown the accuracy and robustness of individual components, further efforts should be directed at demonstrating the diagnostic and prognostic efficacy of these advanced tools through large-scale clinical trials.
Abstract:
The educational system in Spain is undergoing a reorganization. At present, high-school graduates who want to enroll at a public university must take a set of examinations, the Pruebas de Aptitud para el Acceso a la Universidad (PAAU). A "new formula" (components, weights, type of exam, ...) for university admission is being discussed. The present paper summarizes part of the research done by the author in her PhD. The context for this thesis is the evaluation of large-scale, complex assessment systems. The main objectives were: to achieve a deep knowledge of the entire university admissions process in Spain, to discover the main sources of uncertainty, and to promote empirical research for the continual improvement of the entire process. Focusing on statistical models and strategies suitable for highlighting the imperfections of the system and reducing them, the paper develops, among other approaches, some applications of multilevel modeling.
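As an illustrative aside (not taken from the thesis), the simplest multilevel setup of the kind mentioned is a random-intercept model of students nested within schools or examination boards. The sketch below fits one with statsmodels on synthetic data; all variable names are made up.

```python
# Random-intercept multilevel model: exam score ~ gpa, grouped by school.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, n_students = 30, 40
school = np.repeat(np.arange(n_schools), n_students)
school_effect = rng.normal(0, 0.5, n_schools)[school]   # level-2 variation
gpa = rng.normal(0, 1, n_schools * n_students)          # level-1 predictor
score = 5.0 + 0.8 * gpa + school_effect + rng.normal(0, 1, len(gpa))

df = pd.DataFrame({"score": score, "gpa": gpa, "school": school})
model = smf.mixedlm("score ~ gpa", df, groups=df["school"]).fit()
print(model.summary())  # fixed effect of gpa plus between-school variance
```

The between-school variance component is exactly the kind of "source of uncertainty" that such models let an assessment system quantify and then try to reduce.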
Abstract:
Polistine wasps are important in Neotropical ecosystems due to their ubiquity and diversity. Inventories have not adequately considered the spatial attributes of collected specimens. Spatial data on biodiversity are important for the study and mitigation of anthropogenic impacts on natural ecosystems and for protecting species. We described and analyzed local-scale spatial patterns in the collecting records of wasp species, as well as spatial variation in diversity descriptors, in a 2500-hectare area of Amazon forest in Brazil. Rare species comprised the largest fraction of the fauna. Close-range spatial effects were detected for most of the more common species, with clustering of presence data at short distances. Larger spatial-lag effects could also be identified in some species, probably constituting cases of exogenous autocorrelation and candidates for explanations based on environmental factors. In a few cases, significant or near-significant correlations were found between five species (of Agelaia, Angiopolybia, and Mischocyttarus) and three studied environmental variables: distance to the nearest stream, terrain altitude, and the type of forest canopy. However, associations between these factors and biodiversity variables were generally low. When used as predictors of polistine richness in a multiple linear regression, only the coefficient for the forest canopy variable was significant. Some level of prediction of wasp diversity variables can thus be attained from environmental variables, especially vegetation structure. Large-scale landscape and regional studies should be undertaken to address this issue.
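As a minimal illustration of the kind of short-range autocorrelation screening described (not the authors' actual analysis), the sketch below computes Moran's I for synthetic presence data using a binary distance-band weight matrix built directly in numpy.

```python
# Moran's I for presence/absence data with neighbours defined by a distance cutoff.
import numpy as np

rng = np.random.default_rng(2)
coords = rng.random((120, 2)) * 5000             # collection points (m)
presence = (coords[:, 0] < 2000).astype(float)   # artificially clustered species

def morans_i(x, coords, cutoff=1000.0):
    n = len(x)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    w = ((d > 0) & (d < cutoff)).astype(float)   # binary neighbour weights
    z = x - x.mean()
    num = n * (w * np.outer(z, z)).sum()
    den = w.sum() * (z ** 2).sum()
    return num / den

print(f"Moran's I at 1 km lag: {morans_i(presence, coords):.3f}")
# values well above the null expectation -1/(n-1) indicate positive autocorrelation
```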
Abstract:
Payments for Environmental Services (PES) are praised as innovative policy instruments, and they influence the governance of forest restoration efforts in two major ways. The first is the establishment of multi-stakeholder agencies as intermediary bodies between funders and planters to manage the funds and distribute incentives to planters. The second is that specific contracts assign objectives to land users in the form of conditions for payment, which are believed to increase the chances of sustained impacts on the ground. These implications are important in assessing the potential of PES to operate as new and effective funding schemes for forest restoration. They are analyzed by looking at two prominent payments-for-watershed-services programs in Indonesia, Cidanau (Banten province, Java) and West Lombok (eastern Indonesia), using combined economic and political science approaches. We derive lessons for the governance of funding efforts (e.g., multi-stakeholder agencies are not a guarantee of success; mixed results are obtained from a reliance on mandatory funding with ad hoc regulations, as opposed to voluntary contributions by the service beneficiary) and for the governance of financial expenditure (e.g., the absolute need for evaluation procedures for the internal governance of farmer groups). Furthermore, we observe that these governance features provide no guarantee that restoration plots with the highest relevance for ecosystem services are targeted by the PES.
Abstract:
A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data, and the identification of quantitative trait loci (QTL) from whole-genome SNP profiles. Input to such an analysis is a set of genome coordinates associated with counts or intensities. The output consists of a discrete number of peaks with respective volumes, extensions and center positions. We have developed for this purpose a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small- to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
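As a rough illustration of the underlying task, rather than of MADAP's actual algorithm, the sketch below finds peaks in a noisy one-dimensional count profile by Gaussian smoothing followed by local-maximum detection with scipy; the synthetic "tag clusters" and all thresholds are invented.

```python
# Peak calling in a noisy 1-D profile of counts indexed by genome coordinate.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import find_peaks

rng = np.random.default_rng(3)
positions = np.arange(10_000)                     # genome coordinates (bp)
counts = rng.poisson(1.0, positions.size).astype(float)
for center in (2_000, 5_500, 8_200):              # three synthetic tag clusters
    counts += 30 * np.exp(-0.5 * ((positions - center) / 50.0) ** 2)

smoothed = gaussian_filter1d(counts, sigma=25)    # suppress Poisson noise
peaks, _ = find_peaks(smoothed, height=5.0, distance=500)
for p in peaks:
    print(f"peak centre ~{positions[p]} bp, height {smoothed[p]:.1f}")
```

The smoothing bandwidth and height threshold play the role of the user-tunable parameters the abstract mentions: changing them trades sensitivity for specificity.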
Abstract:
The induction of fungal metabolites by fungal co-cultures grown on solid media was explored using multi-well co-cultures in 2 cm diameter Petri dishes. Fungi were grown in 12-well plates to easily and rapidly obtain the large number of replicates necessary for metabolomic approaches. Fungal culture in this format accelerated the production of metabolites by several weeks compared with the large-format 9 cm Petri dishes. This strategy was applied to a co-culture of a Fusarium and an Aspergillus strain. The metabolite composition of the cultures was assessed using ultra-high pressure liquid chromatography coupled to electrospray ionisation and time-of-flight mass spectrometry, followed by automated data mining. The de novo production of metabolites was dramatically increased by nutrient reduction. A time-series study of the induction of the fungal metabolites of interest over nine days revealed various induction patterns. The concentrations of most of the de novo induced metabolites increased over time. However, interesting patterns were observed, such as the presence of some compounds only at certain time points, indicating the complexity and dynamic nature of fungal metabolism. The large-scale production of the compounds of interest was verified by co-culture in 15 cm Petri dishes; most of the induced metabolites of interest (16/18) were produced as effectively as at the small scale, although not in the same time frame. Large-scale production is a practical solution for the future production, identification and biological evaluation of these metabolites.
Abstract:
We present MBIS (Multivariate Bayesian Image Segmentation tool), a clustering tool based on a mixture of multivariate normal distributions. MBIS supports multichannel bias field correction based on a B-spline model. A second methodological novelty is the inclusion of graph-cuts optimization for the stationary anisotropic hidden Markov random field model. Along with MBIS, we release an evaluation framework that contains three different experiments on multi-site data. We first validate the accuracy of segmentation and of the estimated bias field for each channel. MBIS outperforms a widely used segmentation tool in a cross-comparison evaluation. The second experiment demonstrates the robustness of results on atlas-free segmentation of two image sets from scan-rescan protocols on 21 healthy subjects. Multivariate segmentation is more replicable than its monospectral counterpart on T1-weighted images. Finally, a third experiment illustrates how MBIS can be used in a large-scale study of tissue volume change with increasing age in 584 healthy subjects. This last result is meaningful as multivariate segmentation performs robustly without the need for prior knowledge.
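As a toy illustration of the core model (a mixture of multivariate normals), not of MBIS itself, the sketch below fits a two-channel Gaussian mixture with scikit-learn and labels each synthetic "voxel" by its maximum-posterior class; MBIS's bias-field correction and graph-cuts MRF regularization are omitted, and the intensity values are invented.

```python
# Multivariate (two-channel) Gaussian-mixture segmentation of synthetic voxels.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# synthetic T1/T2-like intensity pairs for three tissue classes
means = np.array([[60, 110], [100, 70], [140, 40]])
voxels = np.vstack([rng.normal(m, 8.0, (3000, 2)) for m in means])

gmm = GaussianMixture(n_components=3, covariance_type="full").fit(voxels)
labels = gmm.predict(voxels)            # hard segmentation per voxel
posteriors = gmm.predict_proba(voxels)  # soft per-class memberships
print("class means:\n", gmm.means_.round(1))
print("class sizes:", np.bincount(labels))
```

Using both channels jointly is what makes the multivariate model more separable, and hence more replicable, than fitting each channel alone.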
Abstract:
BACKGROUND: Pseudogenes have long been considered as nonfunctional genomic sequences. However, recent evidence suggests that many of them might have some form of biological activity, and the possibility of functionality has increased interest in their accurate annotation and integration with functional genomics data. RESULTS: As part of the GENCODE annotation of the human genome, we present the first genome-wide pseudogene assignment for protein-coding genes, based on both large-scale manual annotation and in silico pipelines. A key aspect of this coupled approach is that it allows us to identify pseudogenes in an unbiased fashion as well as untangle complex events through manual evaluation. We integrate the pseudogene annotations with the extensive ENCODE functional genomics information. In particular, we determine the expression level, transcription-factor and RNA polymerase II binding, and chromatin marks associated with each pseudogene. Based on their distribution, we develop simple statistical models for each type of activity, which we validate with large-scale RT-PCR-Seq experiments. Finally, we compare our pseudogenes with conservation and variation data from primate alignments and the 1000 Genomes project, producing lists of pseudogenes potentially under selection. CONCLUSIONS: At one extreme, some pseudogenes possess conventional characteristics of functionality; these may represent genes that have recently died. On the other hand, we find interesting patterns of partial activity, which may suggest that dead genes are being resurrected as functioning non-coding RNAs. The activity data of each pseudogene are stored in an associated resource, psiDR, which will be useful for the initial identification of potentially functional pseudogenes.
Abstract:
PURPOSE: Pharmacovigilance methods have advanced greatly during the last decades, making post-market drug assessment an essential component of drug evaluation. These methods mainly rely on spontaneous reporting systems and health information databases to collect evidence from huge amounts of real-world reports. The EU-ADR Web Platform was built to further facilitate accessing, monitoring and exploring these data, enabling an in-depth analysis of adverse drug reaction risks. METHODS: The EU-ADR Web Platform exploits the wealth of data collected within a large-scale European initiative, the EU-ADR project. Millions of electronic health records, provided by national health agencies, are mined for specific drug events, which are correlated with literature, protein and pathway data, resulting in a rich drug-event dataset. Next, advanced distributed computing methods are tailored to coordinate the execution of data-mining and statistical analysis tasks. This permits obtaining a ranked drug-event list, removing spurious entries and highlighting relationships with high risk potential. RESULTS: The EU-ADR Web Platform is an open workspace for the integrated analysis of pharmacovigilance datasets. Using this software, researchers can access a variety of tools provided by distinct partners in a single centralized environment. Besides performing standalone drug-event assessments, they can also control the pipeline for an improved batch analysis of custom datasets. Drug-event pairs can be substantiated and statistically analysed within the platform's innovative working environment. CONCLUSIONS: A pioneering workspace that helps explain the biological path of adverse drug reactions was developed within the EU-ADR project consortium. This tool, targeted at the pharmacovigilance community, is available online at https://bioinformatics.ua.pt/euadr/.
Abstract:
Blowing and drifting snow is a major concern for transportation efficiency and road safety in regions where it commonly develops. One common way to mitigate snow drift on roadways is to install plastic snow fences. Correct design of snow fences is critical for road safety and for keeping roads open during winter in the US Midwest and other states affected by large snow events, and for keeping the costs of snow accumulation and road repair to a minimum. Of critical importance for road safety is protection against snow drifting in regions with narrow rights of way, where standard fences cannot be deployed at the recommended distance from the road. Designing snow fences requires sound engineering judgment and a thorough evaluation of the potential for snow blowing and drifting at the construction site. The evaluation includes site-specific design parameters typically obtained with semi-empirical relations characterizing the local transport conditions. Among the critical parameters involved in fence design, and in the assessment of post-construction efficiency, is the quantification of snow accumulation at fence sites. The present study proposes a joint experimental and numerical approach to monitor snow deposits around snow fences, quantitatively estimate snow deposits in the field, assess the efficiency of snow fences, and improve their design. Snow deposit profiles were mapped using GPS-based real-time kinematic (RTK) surveys conducted at the monitored field site during and after snow storms. The monitored site allowed testing of different snow fence designs under close to identical conditions over four winter seasons. The study also discusses the detailed monitoring system and the analysis of weather forecasts and meteorological conditions at the monitored sites. A main goal of the present study was to assess the performance of lightweight plastic snow fences with a porosity lower than the typical 50% used in standard designs. The field data collected during the first winter were used to identify the best design for snow fences with a porosity of 50%. Flow fields obtained from numerical simulations showed that the fence design that worked best during the first winter induced the formation of an elongated area of small velocity magnitude close to the ground. This information was used to identify other candidates for the optimum design of fences with a lower porosity. Two of the designs with a fence porosity of 30% that were found to perform well in the numerical simulations were tested in the field during the second winter, along with the best-performing design for fences with a porosity of 50%. Field data showed that the length of the snow deposit away from the fence was reduced by about 30% for the two proposed lower-porosity (30%) fence designs compared with the best design identified for fences with a porosity of 50%. Moreover, one of the lower-porosity designs tested in the field showed no significant snow deposition within the bottom gap region beneath the fence. Thus, a major outcome of this study is the recommendation to use plastic snow fences with a porosity of 30%. This lower-porosity design is expected to continue to work well for even more severe snow events or for successive snow events occurring during the same winter. The approach advocated in the present study allowed general recommendations to be made for optimizing the design of lower-porosity plastic snow fences.
This approach can be extended to improve the design of other types of snow fences, and some preliminary work on living snow fences is also discussed. Another major contribution of this study is to propose, develop protocols for, and test a novel technique based on close-range photogrammetry (CRP) to quantify the snow deposits trapped by snow fences. As image data can be acquired continuously, the time evolution of the volume of snow retained by a snow fence during a storm, or over a whole winter season, can in principle be obtained. Moreover, CRP is a non-intrusive method that eliminates the need to perform manual measurements during storms, which are difficult and sometimes dangerous. At present, there is much empiricism in the design of snow fences, due to a lack of data on fence storage capacity, on how snow deposits change with fence design and snowstorm characteristics, and in the estimation of the main parameters used by state DOTs to design snow fences at a given site. The availability of such information from CRP measurements should provide critical data for evaluating the performance of a given snow fence design tested by the IDOT. As part of the present study, the novel CRP method was tested at several sites. The present study also discusses some preliminary attempts to determine the snow relocation coefficient, one of the main variables that has to be estimated by IDOT engineers when using the standard snow fence design software (Snow Drift Profiler, Tabler, 2006). Our analysis showed that standard empirical formulas did not produce reasonable values when applied at the Iowa test sites monitored as part of the present study, and that simple methods to estimate this variable are not reliable. The present study makes recommendations for the development of a new methodology based on Large Scale Particle Image Velocimetry that can directly measure snow drift fluxes and the amount of snow relocated by the fence.
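As a rough sketch of the quantification step described above, the snippet below integrates a surveyed snow-depth cross-section into a deposit volume by trapezoidal integration; the profile shape and fence length are synthetic stand-ins for the RTK- or CRP-derived elevations, and the uniform-drift assumption is ours.

```python
# Deposit volume from a surveyed snow-depth cross-section (synthetic data).
import numpy as np

x = np.linspace(0.0, 40.0, 81)                    # distance from fence (m)
snow = 1.8 * np.exp(-(((x - 8.0) / 6.0) ** 2))    # idealized lee-drift depth (m)

# trapezoidal integration of the cross-section, then extrusion along the fence
area = np.sum((snow[1:] + snow[:-1]) / 2.0 * np.diff(x))   # m^2
fence_length = 100.0                              # assumed uniform drift (m)
print(f"cross-section {area:.1f} m^2, volume ~ {area * fence_length:.0f} m^3")
```

With continuously acquired CRP imagery, repeating this integration per survey epoch would yield the time evolution of retained snow volume that the study describes.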