983 results for Accuracy Assessment
Abstract:
US Geological Survey (USGS) elevation data are the most commonly used data source for highway hydraulic analysis; however, given the vertical accuracy of USGS-based elevation data, they may be too “coarse” to adequately describe the surface profiles of watershed areas or drainage patterns. Additionally, hydraulic design requires the delineation of much smaller drainage areas (watersheds) than other hydrologic applications, such as environmental, ecological, and water resource management. This research study investigated whether higher-resolution LIDAR-based surface models would provide better delineation of watersheds and drainage patterns than surface models created from standard USGS-based elevation data. Differences in runoff values were the metric used to compare the data sets. The two data sets were compared for a pilot study area along the Iowa 1 corridor between Iowa City and Mount Vernon. Given the limited breadth of the analysis corridor, areas of particular emphasis were the location of drainage area boundaries and flow patterns parallel to and intersecting the road cross section. Traditional highway hydrology does not appear to be significantly affected, or benefited, by the increased terrain detail that LIDAR provided for the study area. In fact, hydrologic outputs, such as streams and watersheds, may be too sensitive to the increased horizontal resolution and/or errors in the data set. However, a true comparison of LIDAR and USGS-based data sets of equal size and encompassing entire drainage areas could not be performed in this study. Differences may also arise in areas with much steeper slopes or significant changes in terrain. LIDAR may provide valuable detail in areas of modified terrain, such as roads. Better representations of channel and terrain detail in the vicinity of the roadway may be useful in modeling problem drainage areas and evaluating structural integrity during and after significant storm events. Furthermore, LIDAR may be used to verify the intended or expected drainage patterns at newly constructed highways. LIDAR will likely provide the greatest benefit for highway projects in flood plains and areas with relatively flat terrain, where slight changes in terrain may have a significant impact on drainage patterns.
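Both the USGS and LIDAR surface models feed the same delineation machinery, so the resolution question comes down to how flow directions are derived from the elevation grid. Below is a minimal sketch of the D8 flow-direction step that watershed delineation tools typically build on; this is a standard textbook algorithm, not the study's actual toolchain, and the elevation grid is hypothetical.

```python
import numpy as np

# Minimal D8 flow-direction sketch: each cell drains to the steepest
# downslope neighbour among its eight neighbours. Real hydrology tools
# add pit filling and flat resolution on top of this step.
def d8_flow_direction(dem, cellsize=1.0):
    rows, cols = dem.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    direction = np.full(dem.shape, -1, dtype=int)  # -1 = pit / no drain
    for r in range(rows):
        for c in range(cols):
            best_slope, best_k = 0.0, -1
            for k, (dr, dc) in enumerate(offsets):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
                    slope = (dem[r, c] - dem[rr, cc]) / dist
                    if slope > best_slope:
                        best_slope, best_k = slope, k
            direction[r, c] = best_k
    return direction

# Hypothetical 4x4 elevation grid (metres); finer grids resolve drainage
# divides more precisely, which is the LIDAR-vs-USGS question above.
dem = np.array([[10.0, 9.5, 9.0, 8.5],
                [ 9.8, 9.2, 8.6, 8.0],
                [ 9.5, 8.9, 8.2, 7.5],
                [ 9.3, 8.6, 7.9, 7.0]])
print(d8_flow_direction(dem))
```

Flow accumulation and watershed labelling then follow these directions downslope; on a finer LIDAR grid the same steps simply respond to much smaller terrain features, which is the sensitivity the abstract notes.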
Abstract:
OBJECTIVE: To compare the predictive accuracy of the original and recalibrated Framingham risk functions against current coronary heart disease (CHD) morbidity and mortality data from the Swiss population. METHODS: Data from the CoLaus study, a cross-sectional, population-based study conducted between 2003 and 2006 on 5,773 participants aged 35-74 without CHD, were used to recalibrate the Framingham risk function. The numbers of events predicted by each risk function were compared with those derived from local MONICA incidence rates and official Swiss mortality data. RESULTS: With the original risk function, 57.3%, 21.2%, 16.4% and 5.1% of men and 94.9%, 3.8%, 1.2% and 0.1% of women were at very low (<6%), low (6-10%), intermediate (10-20%) and high (>20%) risk, respectively. With the recalibrated risk function, the corresponding values were 84.7%, 10.3%, 4.3% and 0.6% in men and 99.5%, 0.4%, 0.0% and 0.1% in women. The number of CHD events over 10 years predicted by the original Framingham risk function was 2-3 fold higher than predicted by mortality+case fatality or by MONICA incidence rates (men: 191 vs. 92 and 51 events, respectively). The recalibrated risk function provided more reasonable, albeit slightly overestimated, figures (92 events, 5th-95th percentile: 26-223 events); sensitivity analyses showed that the magnitude of the overestimation ranged from 0.4 to 2.2 in men and from 0.7 to 3.3 in women. CONCLUSION: The recalibrated Framingham risk function provides a reasonable alternative for assessing CHD risk in men, but not in women.
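As a rough illustration of what recalibration means here: the published Framingham coefficients are retained, while the population-specific inputs (mean risk-factor levels and baseline survival) are swapped for local values. The sketch below follows that standard recipe with entirely hypothetical numbers; it is not the CoLaus recalibration itself.

```python
import math

# Hedged sketch of D'Agostino-style recalibration: keep the published beta
# coefficients, substitute the local cohort's mean risk-factor levels and
# local 10-year baseline survival. All numbers are hypothetical placeholders.
def framingham_10y_risk(betas, x, x_mean, s0_local):
    # Linear predictor for the individual and at local cohort means.
    lp = sum(b * xi for b, xi in zip(betas, x))
    lp_mean = sum(b * xm for b, xm in zip(betas, x_mean))
    return 1.0 - s0_local ** math.exp(lp - lp_mean)

betas = [0.05, 0.6, 0.4]     # hypothetical coefficients
x = [55, 1, 6.2]             # age, smoker flag, cholesterol (mmol/L)
x_mean = [52, 0.3, 5.7]      # local (recalibration) cohort means
print(f"10-year CHD risk: {framingham_10y_risk(betas, x, x_mean, 0.95):.1%}")
```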
Abstract:
Fatigue life assessment of welded structures is commonly based on the nominal stress method, but more flexible and accurate methods have been introduced. In general, assessment accuracy improves as more localized information about the weld is incorporated. The structural hot spot stress method includes the influence of macro-geometric effects and structural discontinuities on the design stress but excludes the local features of the weld. In this thesis, the limitations of the structural hot spot stress method are discussed, and a modified structural stress method with improved accuracy is developed and verified for selected welded details. The fatigue life of structures in the as-welded state consists mainly of crack growth from pre-existing cracks or defects. The crack growth rate depends on the crack geometry and the stress state on the crack face plane. This means that the stress level and the shape of the stress distribution along the assumed crack path govern the total fatigue life. In many structural details the stress distribution is similar, and adequate fatigue life estimates can be obtained simply by adjusting the stress level based on a single stress value, i.e., the structural hot spot stress. There are, however, cases for which the structural stress approach is less appropriate because the stress distribution differs significantly from the more common cases. Plate edge attachments and plates on elastic foundations are examples of structures with this type of stress distribution. The influence of fillet weld size and weld load variation on the stress distribution is another central topic of this thesis. Structural hot spot stress determination is generally based on a procedure that involves extrapolation of plate surface stresses. Other possibilities for determining the structural hot spot stress are to extrapolate stresses through the thickness at the weld toe or to use Dong's method, which includes through-thickness extrapolation at some distance from the weld toe. Both of these latter methods are less sensitive to the FE mesh used. Structural stress based on surface extrapolation is sensitive to the extrapolation points selected and to the FE mesh near these points. Rules for proper meshing, however, are well defined and not difficult to apply. To improve the accuracy of the traditional structural hot spot stress, a multi-linear stress distribution is introduced. The magnitude of the weld toe stress after linearization depends on the weld size, weld load and plate thickness. Simple equations have been derived by comparing assessment results based on the local linear stress distribution with LEFM-based calculations. The proposed method is called the modified structural hot spot stress method (MSHS), since the structural hot spot stress (SHS) value is corrected using information on weld size and weld load. The correction procedure is verified using fatigue test results found in the literature. In addition, a test case was conducted comparing the proposed method with other local fatigue assessment methods.
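For context, the surface-extrapolation procedure the thesis starts from typically looks like the following sketch, which uses the common IIW-style linear extrapolation from reference points at 0.4t and 1.0t ahead of the weld toe. The stress values are hypothetical FE outputs, and the MSHS correction terms themselves are specific to the thesis and not reproduced here.

```python
# Hedged sketch of the conventional structural hot spot stress (SHS) via
# linear surface-stress extrapolation to the weld toe, following common
# IIW practice for fine meshes (reference points at 0.4t and 1.0t).
def hot_spot_stress(sigma_04t, sigma_10t):
    # Linear extrapolation of the two surface stresses to the weld toe.
    return 1.67 * sigma_04t - 0.67 * sigma_10t

# Hypothetical FE surface stresses (MPa) read at 0.4t and 1.0t ahead
# of the weld toe of a plate of thickness t.
sigma_04t, sigma_10t = 182.0, 160.0
print(f"SHS = {hot_spot_stress(sigma_04t, sigma_10t):.1f} MPa")
```

The MSHS correction described in the abstract would then adjust this value using the weld-size and weld-load terms derived in the thesis.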
Abstract:
It is axiomatic that our planet is extensively inhabited by diverse micro-organisms such as bacteria, yet the absolute diversity of bacterial species is widely held to be unknown. Different bacteria can be found from the depths of the oceans to the tops of mountains; even the air is more or less colonized by bacteria. Most bacteria are either harmless or even advantageous to human beings, but there are also bacteria that can cause severe infectious diseases or spoil supplies intended for human consumption. It is therefore vitally important not only to detect and enumerate bacteria but also to assess their viability and possible harmfulness. While bacterial growth is remarkably fast under optimum conditions and easy to detect by culture-based methods, most bacteria are believed to lie in the stationary phase of growth, in which growth has ceased and bacteria may thus be undetectable by culture-based techniques. Additionally, several injurious factors, such as low or high temperature or nutrient deficiency, can turn bacteria into a viable but non-culturable (VBNC) state that cannot be detected by culture-based methods. Consequently, various non-culture-based techniques developed for the assessment of bacterial viability and killing have been widely exploited in modern microbiology. However, only a few methods are suitable for kinetic measurements, which enable the real-time detection of bacterial growth and viability. The present study describes alternative methods for measuring bacterial viability and killing, as well as for detecting the effects of various antimicrobial agents on bacteria, in real time. The suitability of bacterial (lux) and beetle (luc) luciferases, as well as green fluorescent protein (GFP), as markers of bacterial viability and cell growth was tested. In particular, a multiparameter microplate assay based on a GFP-luciferase combination and a flow cytometric measurement based on a GFP-PI combination were developed to perform divergent viability analyses. The results suggest that the antimicrobial activities of various drugs against bacteria can be successfully measured using both of these methods. Specifically, the reliability of the flow cytometric viability analysis was notably improved when GFP was utilized in the assay. A fluoro-luminometric microplate assay enabled kinetic measurements, which significantly improved and accelerated the assessment of bacterial viability compared with more conventional viability assays such as plate counting. Moreover, the multiparameter assay made the simultaneous detection of GFP fluorescence and luciferase bioluminescence possible and provided extensive information about multiple cellular parameters in a single assay, thereby increasing the accuracy of the assessment of the kinetics of antimicrobial activities on target bacteria.
Abstract:
A diagrammatic scale to assess the severity of soybean (Glycine max) rust, caused by the fungus Phakopsora pachyrhizi, was developed in this study. Leaflets showing different severity levels were collected to determine the minimum and maximum severity limits; intermediate levels were determined according to the Weber-Fechner stimulus-response law. The proposed scale comprises the levels 0.6, 2, 7, 18, 42, and 78.5%. Scale validation was performed by eight raters (four inexperienced and four experienced), who estimated the severity of 44 soybean leaflets showing rust symptoms, with and without the use of the scale. Except for rater number eight, all showed a tendency to overestimate severity without the aid of the diagrammatic scale. With the scale, the raters achieved better accuracy and precision, although the tendency to overestimate remained. Experienced raters were more accurate and precise than inexperienced raters, and the improvements in assessment afforded by the scale were more pronounced for inexperienced raters.
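The Weber-Fechner step amounts to spacing intermediate levels geometrically, so that each step up the scale is a roughly constant ratio rather than a constant increment. A minimal sketch, assuming plain geometric spacing between the observed minimum and maximum (published scales also adjust levels for the visual perception of lesion area, so the exact values differ):

```python
import numpy as np

# Weber-Fechner's law implies perceived severity grows with the logarithm
# of actual severity, so intermediate levels are spaced geometrically
# between the observed minimum and maximum severity.
def weber_fechner_levels(s_min, s_max, n_levels):
    return np.geomspace(s_min, s_max, n_levels)

print(np.round(weber_fechner_levels(0.6, 78.5, 6), 1))
# -> roughly [ 0.6  1.6  4.2 11.2 29.6 78.5]
```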
Abstract:
The correct quantification of blast, caused by the fungus Magnaporthe oryzae, on wheat (Triticum aestivum) spikes is an important component of understanding the development of this disease with a view to its control. Visual quantification based on a diagrammatic scale can be a practical and efficient strategy that has already proven useful for several plant pathosystems, including diseases affecting wheat spikes such as glume blotch and Fusarium head blight. Spikes showing different disease severity values were collected from a wheat field in order to develop a diagrammatic scale for quantifying blast severity on wheat spikes. The spikes were photographed and blast severity was determined using the software ImageJ. A diagrammatic scale was developed with the following disease severity values: 3.7, 7.5, 21.4, 30.5, 43.8, 57.3, 68.1, 86.0, and 100.0%. An asymptomatic spike was added to the scale. Scale validation was performed by eight people who estimated blast severity using digitalized images of 40 wheat spikes. The precision and the accuracy of the evaluations varied according to the rater (0.82
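The ImageJ-based severity determination reduces, in essence, to a pixel fraction: symptomatic area over total spike area. A minimal sketch of that computation, with hypothetical masks standing in for the colour-thresholded photographs:

```python
import numpy as np

# Severity = diseased pixels / total spike pixels, expressed in percent.
# In practice the boolean masks come from thresholding the digitised
# spike photographs (e.g. in ImageJ); these arrays are hypothetical.
def blast_severity(spike_mask, diseased_mask):
    total = spike_mask.sum()
    return 100.0 * np.logical_and(spike_mask, diseased_mask).sum() / total

# Hypothetical 4x4 example: 12 spike pixels, 5 of them symptomatic.
spike = np.array([[1, 1, 1, 0],
                  [1, 1, 1, 0],
                  [1, 1, 1, 1],
                  [1, 1, 0, 0]], dtype=bool)
disease = np.array([[0, 1, 0, 0],
                    [1, 1, 0, 0],
                    [0, 1, 0, 0],
                    [1, 0, 0, 0]], dtype=bool)
print(f"severity = {blast_severity(spike, disease):.1f}%")  # 41.7%
```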
Abstract:
Surface area (SA) of poultry is an important parameter for heat and mass transfer calculations. Optical approaches, such as the moiré technique (MT), are non-destructive, offer gains in accuracy and speed, and preserve the integrity of the object. The objective of this research was to develop and validate a new protocol for estimating the SA of broiler chickens based on the MT. Sixty-six Ross broiler chickens (twenty-seven male and thirty-nine female, with ages spanning all growth phases) were used in this study. The dimensions (length, width and height) and body mass of randomly selected broiler chickens were measured in the laboratory. The chickens were illuminated by a light source, and grids were projected onto them so that their shape could be determined and recorded. Next, the skin and feathers were removed to allow SA to be determined by conventional means. These measurements were then used for calibration and validation. The MT for image analysis proved a reliable means of evaluating the three-dimensional shape and SA of broiler chickens. This technique, which is neither invasive nor destructive, is a good alternative to conventional destructive methods.
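Once the projected grids yield a 3D reconstruction of the bird, SA can be obtained by summing the areas of the mesh triangles. The sketch below shows that summation step only, under the assumption of a triangulated reconstruction; the vertices are hypothetical, and the moiré fringe analysis itself is not shown.

```python
import numpy as np

# Surface area of a triangulated mesh: sum of triangle areas, each
# computed as half the norm of the edge cross product. A real moiré
# reconstruction would supply thousands of triangles.
def mesh_surface_area(vertices, triangles):
    v = np.asarray(vertices, dtype=float)
    area = 0.0
    for i, j, k in triangles:
        area += 0.5 * np.linalg.norm(np.cross(v[j] - v[i], v[k] - v[i]))
    return area

# Hypothetical two-triangle patch.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0.2)]
tris = [(0, 1, 2), (1, 3, 2)]
print(f"SA = {mesh_surface_area(verts, tris):.3f} (input units squared)")
```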
Abstract:
The topic of this Master’s thesis is risk assessment in the supply chain, and the work was done for a company operating in the pharmaceutical industry. The unique features of the industry bring additional challenges to risk management due to high regulatory, documentation and traceability requirements. The objective of the thesis was to generate a template for assessing the risks in the supply chain of current and potential suppliers of the case company. Risks pertaining to the case setting were identified mainly from in-house expertise on this specific product and supply chain, as well as from academic research papers and risk management theory. A questionnaire was set up to assess the identified risks in terms of impact, occurrence and possibility of detection. Through this classification of risk severity, the supplier assessment template was formed. A questionnaire template comprising the top 10 risks affecting the flow of information and materials in this setting was formulated to serve as a generic tool for assessing risks in the supply chain of a pharmaceutical company. The template was tested on another supplier for usability and the accuracy of the identified risks, and it proved functional in a different supply chain and product setting.
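Rating risks on impact, occurrence and possibility of detection is the classic FMEA-style scheme, in which the three scores multiply into a risk priority number (RPN) used to rank risks. A minimal sketch under that assumption; the risk names and ratings below are hypothetical, not the thesis's top 10.

```python
# FMEA-style scoring sketch: each risk is rated on 1-10 scales for impact,
# occurrence and detectability (higher detection score = harder to detect),
# and risks are ranked by the product of the three scores.
def rpn(impact, occurrence, detection):
    return impact * occurrence * detection

risks = {
    "single-source API supplier": (9, 4, 3),
    "temperature excursion in transit": (8, 3, 5),
    "batch documentation error": (6, 5, 4),
}
for name, scores in sorted(risks.items(), key=lambda kv: -rpn(*kv[1])):
    print(f"{rpn(*scores):4d}  {name}")
```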
Abstract:
Thermal cutting methods are commonly used in the manufacture of metal parts. Thermal cutting processes separate materials by using heat, with or without a stream of cutting oxygen. Common processes are oxygen, plasma and laser cutting; which method is used depends on the application and the material. Numerically controlled thermal cutting is a cost-effective way of prefabricating components. One design aim is to minimize the number of work steps in order to increase competitiveness. As a result, the holes and openings in plate parts manufactured today are made using thermal cutting methods. This is a problem from the fatigue life perspective, because there is a local detail in the as-welded state that raises the stress in a local area of the plate. In cases where the static capacity of the net section is fully utilized, the calculated linear local stresses and stress ranges are often more than twice the material yield strength, and the shakedown criteria are exceeded. Fatigue life assessment of flame-cut details is commonly based on the nominal stress method. For welded details, design standards and instructions provide more accurate and flexible methods, e.g. the hot-spot method, but these methods are not universally applied to flame-cut edges. Some of the laboratory fatigue tests of flame-cut edges indicated that fatigue life estimates based on the standard nominal stress method can be quite conservative in cases where a high notch factor is present. This is an undesirable phenomenon, and it limits the potential for minimizing structure size and total costs. A new calculation method is introduced to improve the accuracy of theoretical fatigue life prediction for a flame-cut edge with a high stress concentration factor. Simple equations were derived using the laboratory fatigue test results published in this work. The proposed method is called the modified FAT method (FATmod). The method takes into account the residual stress state, surface quality, material strength class and true stress ratio at the critical location.
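For reference, the nominal stress method the abstract starts from evaluates fatigue life against an S-N curve defined by a FAT class, the characteristic stress range at two million cycles, with slope m = 3 as in IIW practice. A minimal sketch with illustrative numbers (the FATmod equations themselves are published in the thesis and not reproduced here):

```python
# Nominal stress method sketch: fatigue life from an S-N curve defined by
# a FAT class (characteristic stress range in MPa at 2 million cycles)
# with slope m = 3, as in IIW practice. FAT values are illustrative.
def cycles_to_failure(delta_sigma, fat_class, m=3.0):
    return 2.0e6 * (fat_class / delta_sigma) ** m

# A high notch factor forces the detail into a low FAT class, which is
# where the standard method turns conservative and the proposed FATmod
# correction (residual stress, surface quality, strength class, stress
# ratio) is meant to apply.
for fat in (160, 125, 80):
    print(f"FAT {fat}: N = {cycles_to_failure(150.0, fat):.2e} cycles")
```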
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?

- Accuracy, i.e. the probability of detecting deception successfully
- Ease of use, i.e. how easy the method is to apply correctly
- Time required to apply the method reliably
- No need for special equipment
- Unobtrusiveness of the method

To answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof, and what kinds of uncertainty and other limitations do these methods involve? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria (a brief sketch of this step follows the abstract). In addition, a field study was arranged to gain first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that the most applicable methods are not entirely trouble-free either. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies are built around a scenario in which roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that future studies test lie detection and veracity assessment methods against partially truthful human sources. Such a test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for modern ones still under development.
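The Analytic Hierarchy Process step mentioned above derives criteria weights as the principal eigenvector of a pairwise comparison matrix. A minimal sketch with a hypothetical comparison of the five assessment criteria; the actual judgments and resulting weights are the study's own.

```python
import numpy as np

# AHP sketch: criteria weights are the normalized principal eigenvector
# of a reciprocal pairwise comparison matrix (Saaty 1-9 scale). The
# comparison values below are hypothetical.
criteria = ["Accuracy", "Ease of use", "Time required",
            "No special equipment", "Unobtrusiveness"]
A = np.array([[1,   3,   3,   5,   5],
              [1/3, 1,   1,   3,   3],
              [1/3, 1,   1,   3,   3],
              [1/5, 1/3, 1/3, 1,   1],
              [1/5, 1/3, 1/3, 1,   1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()
for name, w in sorted(zip(criteria, weights), key=lambda t: -t[1]):
    print(f"{name:22s} {w:.3f}")
```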
Abstract:
The main objective of this thesis was to examine, and intervene on, working memory (WM) deficits in two clinical populations: Alzheimer's disease (AD) and mild cognitive impairment (MCI). The thesis comprises three empirical articles. The goal of the first experiment was to examine WM deficits in normal aging, MCI and AD using two versions of the complex span task: sentence span and arithmetic span. In addition, the effect of forgetting was measured by manipulating the length of the retention interval. Results on the complex span tasks indicate that WM is impaired in individuals with MCI and even more so in those with AD. The data also support the role of forgetting within WM: lengthening the retention interval exacerbated the deficit in AD and predicted a negative prognosis in MCI. The objective of the second study was to examine the feasibility of a computerized cognitive training program targeting the attentional control component of WM. This study was conducted with healthy older adults and older adults with MCI. The data revealed positive training effects for both groups; however, the absence of a control group limited the interpretation of the results. Building on these data, the third experiment implemented a randomized, double-blind, controlled study of attentional control training in MCI individuals with executive deficits. The protocol involved a dual-task paradigm combining a visual detection task and an alpha-arithmetic judgment task. While the control group simply practiced the dual task over six one-hour sessions, the experimental group received variable-priority training, in which participants had to manage their attentional control by varying the proportion of attentional resources allocated to each task. The results show a significant effect of the intervention on one of the two tasks involved (accuracy on the visual detection task) and a trend toward transfer to another divided-attention task, but few generalization effects to other attention tasks. In summary, the original data reported in this thesis demonstrate a WM deficit in age-related neurodegenerative diseases, with a gradient between MCI and AD. They also suggest preserved plasticity of attentional capacities in people at risk of developing dementia.
Abstract:
Triple quadrupole mass spectrometers coupled with high-performance liquid chromatography are workhorses in quantitative bioanalysis. They provide substantial benefits, including reproducibility, sensitivity and selectivity, for trace analysis. Selected Reaction Monitoring allows targeted assay development, but the data sets generated contain very limited information. Data mining and analysis of non-targeted high-resolution mass spectrometry profiles of biological samples offer the opportunity to perform more exhaustive assessments, including quantitative and qualitative analysis. The objectives of this study were to test method precision and accuracy, to statistically compare bupivacaine concentrations in real study samples, and to verify whether high-resolution, accurate-mass data collected in scan mode permit retrospective data analysis, more specifically the extraction of metabolite-related information. The precision and accuracy data obtained with the two instruments were equivalent. Overall, accuracy ranged from 106.2 to 113.2% and precision from 1.0 to 3.7%. Statistical comparison of the two methods using linear regression revealed a coefficient of determination (R2) of 0.9996 and a slope of 1.02, demonstrating a very strong correlation. Individual sample comparisons showed differences from -4.5% to 1.6%, well within the accepted analytical error. Moreover, post-acquisition extracted ion chromatograms at m/z 233.1648 ± 5 ppm (M-56) and m/z 305.2224 ± 5 ppm (M+16) revealed the presence of desbutyl-bupivacaine and three distinct hydroxylated bupivacaine metabolites. Post-acquisition analysis thus allowed semi-quantitative evaluation of the concentration-time profiles of bupivacaine metabolites.
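The post-acquisition step described here, extracting ion chromatograms within ±5 ppm of a target m/z, can be sketched as a simple mass-window filter over the full-scan spectra. The code below illustrates the principle with hypothetical spectra; it is not the vendor software actually used.

```python
# Extracted ion chromatogram (XIC) sketch: for each full-scan spectrum,
# sum the intensity of peaks falling within +/- 5 ppm of the target m/z.
def ppm_window(target_mz, ppm=5.0):
    half = target_mz * ppm / 1e6
    return target_mz - half, target_mz + half

def extract_xic(spectra, target_mz, ppm=5.0):
    lo, hi = ppm_window(target_mz, ppm)
    return [(rt, sum(i for mz, i in peaks if lo <= mz <= hi))
            for rt, peaks in spectra]

# Hypothetical (retention_time, [(mz, intensity), ...]) scan data.
spectra = [
    (1.10, [(233.16470, 1.2e5), (305.22250, 3.0e4)]),
    (1.15, [(233.16495, 2.9e5), (250.10000, 8.0e3)]),
]
# M-56 (desbutyl-bupivacaine) trace at m/z 233.1648:
print(extract_xic(spectra, 233.1648))
```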
Abstract:
Scoliosis is a 3D deformity of the spine and rib cage. Extensive validation of 3D reconstruction methods of the spine from biplanar radiography has already been published. In this article, we propose a novel method to reconstruct the rib cage, using the same biplanar views as for the 3D reconstruction of the spine, to allow clinical assessment of whole-trunk deformities. This technique uses a semi-automatic segmentation of the ribs in the postero-anterior X-ray view and an interactive segmentation of partial rib edges in the lateral view. The rib midlines are automatically extracted in 2D and reconstructed in 3D using epipolar geometry. For the ribs not visible in the lateral view, the method predicts their 3D shape. The accuracy of the proposed method has been assessed using data obtained from a synthetic bone model as a gold standard and has also been evaluated using data from real patients with scoliotic deformities. Results show that the reconstructed ribs enable a reliable evaluation of rib axial rotation, which will allow a 3D clinical assessment of spine and rib cage deformities.
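The epipolar reconstruction step can be illustrated by standard two-view linear (DLT) triangulation: a rib midline point identified in both calibrated views is recovered as the null vector of a small homogeneous system. A minimal sketch with hypothetical projection matrices, not the calibration actually used in the article:

```python
import numpy as np

# DLT triangulation sketch: build A X = 0 from x ~ P X for both views
# and take the SVD null vector as the homogeneous 3D point.
def triangulate(P1, P2, x1, x2):
    A = np.stack([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Two hypothetical near-orthogonal views (PA and lateral).
P1 = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0], [0, 0, 0, 1.0]])
P2 = np.array([[0, 0, 1.0, 0], [0, 1.0, 0, 0], [1.0, 0, 0, 1.0]])
X_true = np.array([0.3, -0.2, 0.5])
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))  # ~ [0.3, -0.2, 0.5]
```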
Abstract:
An analysis of historical Corona images, Landsat images, recent radar images and Google Earth® images was conducted to determine land use and land cover changes of oasis settlements and surrounding rangelands at the fringe of the Altay Mountains from 1964 to 2008. For the Landsat datasets, supervised classification was used to test the suitability of the Maximum Likelihood Classifier with subsequent smoothing and of the Sequential Maximum A Posteriori Classifier (SMAPC). The results show a trend typical of the steppe and desert regions of northern China. From 1964 to 2008 farmland increased strongly (+61%), while the area of grassland and forest in the floodplains decreased (-43%). Urban areas increased threefold, and 400 ha of former agricultural land were abandoned. Farmland apparently affected by soil salinity decreased in extent from 1990 (1,180 ha) to 2008 (630 ha). The vegetated areas of the surrounding rangelands decreased, mainly as a result of overgrazing and drought events. The SMAPC with subsequent post-processing achieved the highest classification accuracy. However, the specific landscape characteristics of mountain oasis systems required labour-intensive post-processing. Further research is needed to test the use of ancillary information for an automated classification of the examined landscape features.
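For context, the Maximum Likelihood Classifier assigns each pixel to the land-cover class whose fitted multivariate Gaussian gives it the highest log-likelihood. A minimal sketch with hypothetical two-band class statistics; in the study, the class statistics came from training pixels in the Landsat scenes themselves.

```python
import numpy as np

# Gaussian maximum likelihood classification sketch: each class is a
# multivariate Gaussian (mean vector, covariance matrix) fitted to
# training pixels; a pixel takes the class with the highest log-likelihood.
def log_likelihood(x, mu, cov):
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.inv(cov) @ d)

# Hypothetical two-band class statistics.
classes = {
    "farmland":  (np.array([60.0, 90.0]),  np.array([[40.0, 5.0], [5.0, 30.0]])),
    "grassland": (np.array([80.0, 70.0]),  np.array([[35.0, 3.0], [3.0, 25.0]])),
    "urban":     (np.array([120.0, 110.0]), np.array([[60.0, 10.0], [10.0, 50.0]])),
}
pixel = np.array([78.0, 72.0])  # hypothetical band values for one pixel
print(max(classes, key=lambda c: log_likelihood(pixel, *classes[c])))  # grassland
```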
Abstract:
The research presented in this thesis focuses on the application and improvement of existing analytical methodologies and the development of new procedures that can be used to study the environmental effects of metal dispersion around abandoned mining areas. First, different single and sequential extraction procedures were applied to study the mobility, hazardousness and bioavailability of the metals contained in mine wastes of differing characteristics. In addition, to study the potential sources of Pb in the vegetation of the mining areas under study, a methodology based on the use of Pb isotope ratios determined by ICP-MS was evaluated. Finally, given the large number of samples analysed to assess the impact of mining activities, the development of high-throughput analytical methods was considered appropriate. In this regard, the implementation of quantitative strategies, as well as the application of instrumental improvements in XRF equipment, was evaluated in order to obtain reliable analytical results in plant analysis. Moreover, quality parameters such as precision, accuracy and detection limits were carefully determined for the various XRF spectrometer configurations used in the course of this work (EDXRF, WDXRF and EDPXRF) in order to establish the capability of XRF as an alternative to the classical techniques commonly applied for the determination of elements in plant samples.
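As background on the quality parameters mentioned, a common convention for the detection limit, assumed here rather than taken from the thesis, derives it from the blank noise and the calibration sensitivity:

```latex
% Common 3-sigma convention for the limit of detection (an assumption
% here, not necessarily the criterion used in the thesis):
\[
  \mathrm{LOD} = \frac{3\, s_{\mathrm{blank}}}{m}
\]
% s_blank: standard deviation of replicate blank measurements;
% m: slope of the calibration curve (sensitivity).
```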