886 results for Query Refinement


Relevance: 10.00%

Abstract:

Often, road construction requires the creation of a work zone. In these scenarios, portable concrete barriers (PCBs) are typically installed to shield workers and equipment from errant vehicles as well as to prevent motorists from striking other roadside hazards. For an existing W-beam guardrail system installed adjacent to the roadway and near the work zone, guardrail sections are removed in order to place the portable concrete barrier system. The focus of this research study was to develop a proper stiffness transition between W-beam guardrail and portable concrete barrier systems. This research effort was accomplished through the development and refinement of design concepts using computer simulation with LS-DYNA. Several design concepts were simulated, and design metrics were used to evaluate and refine each concept. These concepts were then analyzed and ranked based on feasibility, likelihood of success, and ease of installation. The rankings were presented to the Technical Advisory Committee (TAC) for selection of a preferred design alternative. Next, a Critical Impact Point (CIP) study was conducted, while additional analyses were performed to determine the critical attachment location and a reduced installation length for the portable concrete barriers. Finally, an additional simulation effort was conducted to evaluate the safety performance of the transition system under reverse-direction impact scenarios and to select the CIP. Recommendations were also provided for conducting a Phase II study and evaluating the nested Midwest Guardrail System (MGS) configuration using three Test Level 3 (TL-3) full-scale crash tests according to the criteria provided in the Manual for Assessing Safety Hardware, as published by the American Association of State Highway and Transportation Officials (AASHTO).
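As an illustration of the ranking step described above, the short sketch below tabulates a weighted multi-criteria score over the three criteria named in the abstract (feasibility, likelihood of success, ease of installation). The concept names, weights, and 1-5 ratings are hypothetical placeholders, not values from the study.

    # Minimal sketch of a weighted multi-criteria ranking of design concepts.
    # The weights and ratings below are hypothetical, not taken from the study.
    CRITERIA_WEIGHTS = {"feasibility": 0.4, "likelihood_of_success": 0.4, "ease_of_installation": 0.2}

    concepts = {
        "Concept A": {"feasibility": 4, "likelihood_of_success": 3, "ease_of_installation": 5},
        "Concept B": {"feasibility": 5, "likelihood_of_success": 4, "ease_of_installation": 2},
    }

    def score(ratings):
        """Weighted sum of 1-5 ratings for one design concept."""
        return sum(CRITERIA_WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

    for rank, name in enumerate(sorted(concepts, key=lambda n: score(concepts[n]), reverse=True), start=1):
        print(rank, name, round(score(concepts[name]), 2))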

Relevance: 10.00%

Abstract:

Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects and also strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
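As a concrete illustration of one of the strategies reviewed (multi-atlas methods), the sketch below fuses several propagated label maps by per-voxel majority voting. It assumes the atlas labels have already been registered onto the target grid; the registration step, which is the subject of much of the reviewed work, is not shown, and the toy label maps are hypothetical.

    import numpy as np

    def majority_vote(label_maps):
        """Fuse integer label volumes of identical shape by per-voxel voting."""
        stacked = np.stack(label_maps, axis=0)          # shape: (n_atlases, *volume_shape)
        n_labels = int(stacked.max()) + 1
        # Count the votes each label receives at every voxel, then keep the winner.
        votes = np.stack([(stacked == lab).sum(axis=0) for lab in range(n_labels)], axis=0)
        return votes.argmax(axis=0)

    # Toy example: three 2x2 "volumes" with labels 0 (background) and 1 (structure).
    atlases = [np.array([[0, 1], [1, 1]]),
               np.array([[0, 1], [0, 1]]),
               np.array([[1, 1], [1, 0]])]
    print(majority_vote(atlases))                       # [[0 1]
                                                        #  [1 1]]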

Relevance: 10.00%

Abstract:

The initial question of this research is whether local heritage is a resource that facilitates children's learning of the social sciences. To answer this question, the relationship between local heritage and the interest it generates was studied. To do so, a set of items was defined and analyzed through a case study. The participants in this research were a group of primary-school pupils, who carried out a teaching unit that used nearby heritage as its backbone. Teachers from schools in Sant Celoni also took part in the research, describing the experiences they had had with their pupils in activities involving local heritage. Finally, the information provided by all the participants was analyzed and conclusions were drawn.

Relevance: 10.00%

Abstract:

The main goal of CleanEx is to provide access to public gene expression data via unique gene names. A second objective is to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-data set comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of human genes and genes from model organisms. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resource, such as cDNA clones or Affymetrix probe sets. The web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria. CleanEx is accessible at: http://www.cleanex.isb-sib.ch/.
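The following toy sketch illustrates the cross-referencing idea behind such a gene index: each gene symbol points to the expression targets (for example, cDNA clones or probe sets) that measure it, together with quality-control information. The gene symbols, target names, QC labels, and lookup function are hypothetical and do not reflect the actual CleanEx data model or interface.

    # Schematic toy of a gene index with cross-references to expression targets.
    gene_index = {
        "TP53": ["cDNA_clone_X", "probe_set_Y"],
        "BRCA1": ["probe_set_Z"],
    }

    target_quality = {
        "cDNA_clone_X": "verified mapping",
        "probe_set_Y": "ambiguous mapping",
        "probe_set_Z": "verified mapping",
    }

    def lookup(gene_symbol):
        """Return each expression target for a gene together with its QC status."""
        return [(target, target_quality.get(target, "unknown"))
                for target in gene_index.get(gene_symbol, [])]

    print(lookup("TP53"))   # [('cDNA_clone_X', 'verified mapping'), ('probe_set_Y', 'ambiguous mapping')]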

Relevance: 10.00%

Abstract:

This paper highlights the role of non-functional information when reusing from a component library. We describe a method for selecting appropriate implementations of Ada packages taking non-functional constraints into account; these constraints model the context of reuse. Constraints take the form of queries using an interface description language called NoFun, which is also used to state non-functional information in Ada packages; query results are trees of implementations, following the import relationships between components. We define two different situations when reusing components, depending on whether we take the library being searched as closed or extendible. The resulting tree of implementations can be manipulated by the user to solve ambiguities, to state default behaviours, and the like. As part of the proposal, we address the problem of computing, from the code itself, the non-functional information that determines the selection process.
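The sketch below illustrates the selection idea in a deliberately simplified form: candidate implementations of a package are annotated with non-functional attributes and filtered by a query expressing the reuse context. The attribute names and candidate implementations are hypothetical; in the actual proposal both the annotations and the queries are written in NoFun, and the result is a tree of implementations following import relationships rather than a flat list.

    # Toy constraint-driven selection of package implementations; attribute
    # names and candidates are hypothetical, not NoFun syntax.
    implementations = {
        "Set_Pkg": [
            {"name": "HashedSet",  "time": "O(1)",     "space": "high"},
            {"name": "OrderedSet", "time": "O(log n)", "space": "low"},
        ],
    }

    def select(package, constraints):
        """Return the implementations of a package satisfying every constraint."""
        return [impl["name"]
                for impl in implementations.get(package, [])
                if all(impl.get(attr) == value for attr, value in constraints.items())]

    # Query: an implementation of Set_Pkg whose memory footprint is low.
    print(select("Set_Pkg", {"space": "low"}))   # ['OrderedSet']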

Relevance: 10.00%

Abstract:

The current study investigated cognitive resource allocation in discourse processing by means of pupil dilation and behavioral measures. Short question-answer dialogs were presented to listeners. Either the context question queried a new information focus in the subsequent answer, or the context question was corrected in the answer sentence (correction focus). The information foci contained in the answer sentences were either adequately highlighted by prosodic means or not. Participants had to judge the adequacy of the focus prosody with respect to the preceding context question. Prosodic judgment accuracy was higher in the conditions bearing adequate focus prosody than in the conditions with inadequate focus prosody. Latency to peak pupil dilation was longer when new information foci were perceived than when correction foci were perceived. Moreover, for the peak dilation, an interaction of focus type and prosody was found. Post hoc statistical tests revealed that prosodically adequate correction foci were processed with a smaller peak dilation than all other dialog conditions. Thus, pupil dilation and the results of a principal component analysis suggest an interaction of focus type and focus prosody in discourse processing.
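For readers unfamiliar with the two pupillometric measures named above, the sketch below computes peak pupil dilation and latency to peak from a single baseline-corrected trial trace. The sampling rate, baseline window, and toy trace are hypothetical and are not the parameters used in the study.

    import numpy as np

    def peak_and_latency(pupil_trace, sampling_rate_hz):
        """Return (peak dilation, latency to peak in seconds) for one trial."""
        baseline = pupil_trace[:int(0.2 * sampling_rate_hz)].mean()   # 200 ms baseline window (assumed)
        corrected = pupil_trace - baseline
        peak_index = int(np.argmax(corrected))
        return corrected[peak_index], peak_index / sampling_rate_hz

    trace = np.array([3.0, 3.0, 3.1, 3.3, 3.6, 3.5, 3.4])   # pupil diameter samples (mm), hypothetical
    print(peak_and_latency(trace, sampling_rate_hz=10))      # roughly (0.6, 0.4)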

Relevance: 10.00%

Abstract:

The goal of the project is to develop a desktop application that lets students browse, offline, the latest updates available in the UOC campus classrooms. The idea is to automate the process of querying and downloading the data locally so that it can later be consulted offline.
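A minimal sketch of the download-and-cache idea is shown below: fetch the latest updates while online, store them locally, and read from the local copy when offline. The feed URL and JSON payload are hypothetical placeholders; the actual UOC campus interfaces are not reproduced here.

    import json
    import pathlib
    import urllib.request

    CACHE = pathlib.Path("campus_updates.json")
    FEED_URL = "https://example.org/campus/latest.json"   # hypothetical placeholder

    def refresh_cache():
        """Download the latest updates and store them for later offline use."""
        with urllib.request.urlopen(FEED_URL) as response:
            CACHE.write_bytes(response.read())

    def read_offline():
        """Return cached updates without touching the network."""
        return json.loads(CACHE.read_text()) if CACHE.exists() else []

    try:
        refresh_cache()        # online: refresh the local copy
    except OSError:
        pass                   # offline: fall back to whatever was cached
    print(read_offline())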

Relevance: 10.00%

Abstract:

Searching for matches between large collections of short (14-30 nucleotides) words and sequence databases comprising full genomes or transcriptomes is a common task in biological sequence analysis. We investigated the performance of simple indexing strategies for handling such tasks and developed two programs, fetchGWI and tagger, that index either the database or the query set. Either strategy outperforms megablast for searches with more than 10,000 probes. FetchGWI is shown to be a versatile tool for rapidly searching multiple genomes, whose performance is limited in most cases by the speed of access to the filesystem. We have made publicly available a Web interface for searching the human, mouse, and several other genomes and transcriptomes with oligonucleotide queries.
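The toy sketch below illustrates the "index the query set" strategy in its simplest form: index every short query word once, then slide a window over the database sequence and report exact hits. It is only a schematic analogue; fetchGWI and tagger use considerably more elaborate on-disk indexes, which are not reproduced here.

    # Schematic "index the queries" exact-match search; sequences are toy data.
    def find_matches(database_seq, query_words):
        """Yield (position, word) for every exact occurrence of a query word."""
        by_length = {}
        for word in query_words:                       # index the queries by length
            by_length.setdefault(len(word), set()).add(word)
        for length, words in by_length.items():
            for pos in range(len(database_seq) - length + 1):
                window = database_seq[pos:pos + length]
                if window in words:
                    yield pos, window

    genome = "ACGTACGGTACGTT"
    probes = ["ACGT", "GGTAC"]
    print(list(find_matches(genome, probes)))   # [(0, 'ACGT'), (9, 'ACGT'), (6, 'GGTAC')]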

Relevance: 10.00%

Abstract:

The recognition that nutrients have the ability to interact with and modulate molecular mechanisms underlying an organism's physiological functions has prompted a revolution in the field of nutrition. Performing population-scaled epidemiological studies in the absence of genetic knowledge may result in erroneous scientific conclusions and misinformed nutritional recommendations. To circumvent such issues and more comprehensively probe the relationship between genes and diet, the field of nutrition has begun to capitalize on both the technologies and the supporting analytical software brought forth in the post-genomic era. Nutrigenomics and nutrigenetics, two fields with distinct approaches to elucidating the interaction between diet and genes but with the common ultimate goal of optimizing health through the personalization of diet, provide powerful approaches to unravel the complex relationship between nutritional molecules, genetic polymorphisms, and the biological system as a whole. Reluctance to embrace these new fields exists primarily because of the fear that producing overwhelming quantities of biological data within the confines of a single study will submerge the original query. The current review, however, aims to position nutrigenomics and nutrigenetics as the emerging faces of nutrition that, when considered alongside more classical approaches, will provide the necessary stepping stones to achieve the ambitious goal of optimizing an individual's health via nutritional intervention.

Relevance: 10.00%

Abstract:

Screening people without symptoms of disease is an attractive idea. Screening allows early detection of disease or elevated risk of disease, and has the potential for improved treatment and reduction of mortality. The list of future screening opportunities is set to grow because of the refinement of screening techniques, the increasing frequency of degenerative and chronic diseases, and the steadily growing body of evidence on genetic predispositions for various diseases. But how should we decide on the diseases for which screening should be done and on recommendations for how it should be implemented? We use the examples of prostate cancer and genetic screening to show the importance of considering screening as an ongoing population-based intervention with beneficial and harmful effects, and not simply the use of a test. Assessing whether screening should be recommended and implemented for any named disease is therefore a multi-dimensional task in health technology assessment. There are several countries that already use established processes and criteria to assess the appropriateness of screening. We argue that the Swiss healthcare system needs a nationwide screening commission mandated to conduct appropriate evidence-based evaluation of the impact of proposed screening interventions, to issue evidence-based recommendations, and to monitor the performance of screening programmes introduced. Without explicit processes there is a danger that beneficial screening programmes could be neglected and that ineffective, and potentially harmful, screening procedures could be introduced.

Relevance: 10.00%

Abstract:

Three pavement design software packages were compared with regard to how they differ in determining design input parameters and how those parameters influence pavement thickness. StreetPave designs the concrete pavement thickness based on the PCA method and also provides an equivalent asphalt pavement thickness. The WinPAS software designs both concrete and asphalt pavements following the AASHTO 1993 design method. The APAI software designs asphalt pavements based on pre-mechanistic/empirical AASHTO methodology. First, the following four critical design input parameters were identified: traffic, subgrade strength, reliability, and design life. A sensitivity analysis of these four design input parameters was performed using the three pavement design software packages to identify which input parameters require the most attention during pavement design. Based on the current pavement design procedures and the sensitivity analysis results, a prototype pavement design and sensitivity analysis (PD&SA) software package was developed to retrieve the pavement thickness design value for a given condition and to allow a user to perform a pavement design sensitivity analysis. The prototype PD&SA software is a computer program that stores pavement design results in a database, designed so that the user can input design data from a variety of design programs and query design results for given conditions. The prototype PD&SA software package was developed to demonstrate the concept of retrieving pavement design results from the database for a design sensitivity analysis. This final report does not include the prototype software, which will be validated and tested during the next phase.
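The sketch below illustrates the database concept described above: design results are stored keyed by the four critical inputs (traffic, subgrade strength, reliability, design life) and queried back for a given condition. The schema and the single example row are hypothetical and are not output of the prototype PD&SA software.

    # Toy store-and-query database for pavement thickness design results.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE design_results (
                        software TEXT, traffic_esals INTEGER, subgrade_cbr REAL,
                        reliability REAL, design_life_years INTEGER, thickness_in REAL)""")
    conn.execute("INSERT INTO design_results VALUES ('StreetPave', 1000000, 5.0, 0.90, 20, 8.5)")

    row = conn.execute("""SELECT software, thickness_in FROM design_results
                          WHERE traffic_esals = ? AND subgrade_cbr = ?
                            AND reliability = ? AND design_life_years = ?""",
                       (1000000, 5.0, 0.90, 20)).fetchone()
    print(row)   # ('StreetPave', 8.5)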

Relevance: 10.00%

Abstract:

This report proposes that, for certain types of highway construction projects undertaken by the Iowa Department of Transportation, a scheduling technique commonly referred to as linear scheduling may be more effective than the Critical Path Method (CPM) scheduling technique currently being used. The types of projects that appear to be good candidates for the technique are those with a strong linear orientation. Like a bar chart, this technique shows when an activity is scheduled to occur; like a CPM schedule, it shows the sequence in which activities are expected to occur. During the 1992 construction season, the authors worked with an inlay project on Interstate 29 to demonstrate the linear scheduling technique to the Construction Office. The as-planned schedule was developed from the CPM schedule that the contractor had developed for the project. Therefore, this schedule represents what a linear representation of a CPM schedule would look like, and not necessarily what a true linear schedule would look like if it had been the only scheduling technique applied to the project. There is a need to expand the current repertoire of scheduling techniques to address those projects for which the bar chart and CPM may not be appropriate, either because they provide too little control information or because they are overly complex for the actual project characteristics. The scheduling approaches used today on transportation projects have many shortcomings in properly modeling the real-world constraints and conditions that are encountered. Linear projects tend to involve activities with variable production rates, a situation that is very difficult to handle with the CPM but is easily handled and visualized with the linear technique. It is recommended that work proceed on the refinement of the linear scheduling method described above and on the development of a microcomputer-based system for its implementation by the Iowa Department of Transportation and contractors. The system will be designed to provide the information needed to adjust schedules in a rational, understandable manner, to monitor progress on projects, and to alert Iowa Department of Transportation personnel when the contractor is deviating from the plan.
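The sketch below illustrates the core of a linear (time-distance) schedule: each activity advances along the project stationing at its own production rate, and the schedule answers when a given crew reaches a given station. The activities, stations, and production rates are hypothetical examples, not data from the Interstate 29 demonstration project.

    # Toy time-distance (linear) schedule with per-activity production rates.
    activities = [
        # (name, start day, start station, end station, production rate in stations/day)
        ("Grading", 0, 0, 100, 10.0),
        ("Paving",  5, 0, 100,  8.0),
    ]

    def day_at_station(activity, station):
        """Day on which the activity's crew reaches the given station."""
        name, start_day, start_sta, end_sta, rate = activity
        if not start_sta <= station <= end_sta:
            raise ValueError(f"{name} does not cover station {station}")
        return start_day + (station - start_sta) / rate

    for act in activities:
        print(act[0], "reaches station 40 on day", day_at_station(act, 40))
    # Grading reaches station 40 on day 4.0
    # Paving reaches station 40 on day 10.0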

Relevance: 10.00%

Abstract:

The ground-penetrating radar (GPR) geophysical method has the potential to provide valuable information on the hydraulic properties of the vadose zone because of its strong sensitivity to soil water content. In particular, recent evidence has suggested that the stochastic inversion of crosshole GPR traveltime data can allow for a significant reduction in uncertainty regarding subsurface van Genuchten-Mualem (VGM) parameters. Much of the previous work on the stochastic estimation of VGM parameters from crosshole GPR data has considered the case of steady-state infiltration conditions, which represent only a small fraction of practically relevant scenarios. We explored in detail the dynamic infiltration case, specifically examining to what extent time-lapse crosshole GPR traveltimes, measured during a forced infiltration experiment at the Arreneas field site in Denmark, could help to quantify VGM parameters and their uncertainties in a layered medium, as well as the corresponding soil hydraulic properties. We used a Bayesian Markov-chain-Monte-Carlo inversion approach. We first explored the advantages and limitations of this approach with regard to a realistic synthetic example before applying it to field measurements. In our analysis, we also considered different degrees of prior information. Our findings indicate that the stochastic inversion of the time-lapse GPR data does indeed allow for a substantial refinement in the inferred posterior VGM parameter distributions compared with the corresponding priors, which in turn significantly improves knowledge of soil hydraulic properties. Overall, the results obtained clearly demonstrate the value of the information contained in time-lapse GPR data for characterizing vadose zone dynamics.
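The sketch below shows the bare bones of the Bayesian Markov-chain-Monte-Carlo machinery referred to above, using a Metropolis sampler with a uniform prior and a Gaussian likelihood. The forward model is a trivial linear stand-in rather than the GPR traveltime and petrophysical model of the study, and the prior bounds, noise level, and synthetic data are hypothetical.

    # Toy Metropolis sampler illustrating stochastic (Bayesian MCMC) inversion.
    import math
    import random

    def forward(theta):
        return [theta * x for x in (1.0, 2.0, 3.0)]     # stand-in forward model

    observed = [1.1, 1.9, 3.2]                          # synthetic observations
    sigma = 0.1                                         # assumed noise standard deviation

    def log_likelihood(theta):
        return -sum((d - m) ** 2 for d, m in zip(observed, forward(theta))) / (2 * sigma ** 2)

    def metropolis(n_steps=5000, step=0.05, lo=0.0, hi=2.0):
        theta = random.uniform(lo, hi)                  # start from the uniform prior
        samples = []
        for _ in range(n_steps):
            proposal = theta + random.gauss(0.0, step)
            if lo <= proposal <= hi:                    # uniform prior: reject out-of-bounds proposals
                delta = log_likelihood(proposal) - log_likelihood(theta)
                if delta >= 0 or random.random() < math.exp(delta):
                    theta = proposal
            samples.append(theta)
        return samples

    posterior = metropolis()[1000:]                     # discard burn-in
    print(sum(posterior) / len(posterior))              # posterior mean, roughly 1.04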

Relevance: 10.00%

Abstract:

STUDY DESIGN: Retrospective database query to identify all anterior spinal approaches. OBJECTIVES: To assess all patients with pharyngo-cutaneous fistulas (PCF) after anterior cervical spine surgery (ACSS). SUMMARY OF BACKGROUND DATA: Patients treated at the University of Heidelberg Spine Medical Center, Spinal Cord Injury Unit, and Department of Otolaryngology (Germany) between 2005 and 2011 with the diagnosis of pharyngo-cutaneous fistula. METHODS: We conducted a retrospective study of 5 patients treated between 2005 and 2011 for PCF after ACSS, reviewing their therapy management and outcome according to radiologic data and patient charts. RESULTS: Upon presentation, 4 patients were paraplegic. Two had a PCF arising from one piriform sinus, two from the posterior pharyngeal wall and piriform sinus combined, and one from the posterior pharyngeal wall only. Two had undergone previous unsuccessful surgical repair elsewhere, and one had prior radiation therapy. In 3 patients, speech and swallowing could be completely restored; 2 patients, both paraplegic, died. The patients needed an average of 2-3 procedures for complete functional recovery, consisting of primary closure with various vascularised regional flaps and refining laser procedures, supplemented with negative pressure wound therapy where needed. CONCLUSION: Based on our experience, we provide a treatment algorithm indicating that chronic, as opposed to acute, fistulas require primary surgical closure combined with a vascularised flap, accompanied by the immediate application of negative pressure wound therapy. We also conclude that, particularly in paraplegic patients suffering this complication, the risk of a fatal outcome is substantial.

Relevance: 10.00%

Abstract:

The major objective of this project is to evaluate image analysis for characterizing air voids in Portland cement concrete (PCC) and asphalt concrete (AC) and aggregate gradation in asphalt concrete. Phase 1 of this project has concentrated on evaluation and refinement of sample preparation techniques, evaluation of methods and instruments for conducting image analysis, and, finally, analysis and comparison of a select portion of samples. Preliminary results suggest a strong correlation between the results obtained from the linear traverse method and image analysis methods for determining percent air voids in concrete. Preliminary work with asphalt samples has shown that damage caused by the high vacuum of a conventional scanning electron microscope (SEM) may be too disruptive. Alternative solutions have been explored, including confocal microscopy and low-vacuum electron microscopy. Additionally, a conventional high-vacuum SEM operating at a marginal operating vacuum may suffice.
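As a minimal illustration of the image-analysis measurement discussed above, the sketch below estimates percent air voids as the area fraction of pixels on the "void" side of a grey-level threshold in a cross-section image. The threshold value, the dark-void convention, and the toy image are hypothetical; real samples also require the preparation steps evaluated in Phase 1.

    import numpy as np

    def percent_air_voids(gray_image, void_threshold=60):
        """Fraction (in %) of pixels darker than the threshold, treated as voids."""
        void_pixels = gray_image < void_threshold
        return 100.0 * void_pixels.sum() / gray_image.size

    # Toy 4x4 "cross-section": low values are voids, high values are paste/aggregate.
    section = np.array([[200, 30, 210, 220],
                        [190, 40, 200, 215],
                        [205, 210, 35, 225],
                        [210, 220, 230, 50]])
    print(percent_air_voids(section))   # 25.0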