920 results for failure modes and effects analysis (FMEA)


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: In clinical practice a diagnosis is based on a combination of clinical history, physical examination and additional diagnostic tests. At present, studies in diagnostic research often report the accuracy of tests without taking into account the information already available from history and examination. Because of this missing information, together with variations in the design and quality of studies, conventional meta-analyses based on these studies will not reflect the accuracy of the tests in real practice. Performing meta-analyses on individual patient data (IPD) allows the accuracy of tests to be assessed in relation to other patient characteristics, and supports the development or evaluation of diagnostic algorithms for individual patients. In this study we will examine these potential benefits in four clinical diagnostic problems in the field of gynaecology, obstetrics and reproductive medicine. METHODS/DESIGN: Based on earlier systematic reviews for each of the four clinical problems, studies are considered for inclusion. The first authors of the included studies will be invited to participate and share their original data. After assessment of validity and completeness, the acquired datasets will be merged. Based on these data, a series of analyses will be performed, including a systematic comparison of the results of the IPD meta-analysis with those of a conventional meta-analysis, development of multivariable models for clinical history alone and for the combination of history, physical examination and relevant diagnostic tests, and development of clinical prediction rules for individual patients. These will be made accessible to clinicians. DISCUSSION: IPD meta-analysis will allow the accuracy of diagnostic tests to be evaluated in relation to other relevant information. Ultimately, this could increase the efficiency of the diagnostic work-up, e.g. by reducing the need for invasive tests and/or improving its accuracy.
This study will assess whether these benefits of IPD meta-analysis over conventional meta-analysis can be exploited and will provide a framework for future IPD meta-analyses in diagnostic and prognostic research.
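The limitation of the conventional approach can be made concrete: without IPD, pooling must work from study-level summaries alone. A minimal fixed-effect inverse-variance pooling sketch over invented study estimates (an illustration of the conventional side only, not the analysis planned in this protocol):

```python
def pooled_estimate(studies):
    """Fixed-effect inverse-variance pooling of study-level
    (log odds ratio, variance) pairs - the most a conventional
    meta-analysis can do without patient-level covariates."""
    weights = [1.0 / var for _, var in studies]
    total = sum(weights)
    return sum(w * est for w, (est, _) in zip(weights, studies)) / total

# hypothetical diagnostic-accuracy studies: (log OR, variance)
pooled = pooled_estimate([(1.2, 0.10), (0.8, 0.05), (1.0, 0.20)])
```

An IPD analysis would instead fit one model to the merged patient-level data, so test accuracy can be conditioned on history and examination findings rather than averaged over them.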


OBJECTIVE: To determine stiffness and load-displacement curves as a biomechanical response to applied torsion and shear forces in cadaveric canine lumbar and lumbosacral specimens. STUDY DESIGN: Biomechanical study. ANIMALS: Caudal lumbar and lumbosacral functional spine units (FSU) of nonchondrodystrophic large-breed dogs (n=31) with radiographically normal spines. METHODS: FSU from dogs without musculoskeletal disease were tested in torsion in a custom-built spine loading simulator with 6 degrees of freedom, which uses orthogonally mounted electric motors to apply pure axial rotation. For shear tests, specimens were mounted to a custom-made shear-testing device, driven by a servo hydraulic testing machine. Load-displacement curves were recorded for torsion and shear. RESULTS: Left and right torsion stiffness was not different within each FSU level; however, torsional stiffness of L7-S1 was significantly smaller compared with lumbar FSU (L4-5-L6-7). Ventral/dorsal stiffness was significantly different from lateral stiffness within an individual FSU level for L5-6, L6-7, and L7-S1 but not for L4-5. When the data from the 4 tested shear directions from the same specimen were pooled, level L5-6 was significantly stiffer than L7-S1. CONCLUSIONS: Increased range of motion of the lumbosacral joint is reflected by an overall decreased shear and rotational stiffness at the lumbosacral FSU. CLINICAL RELEVANCE: Data from dogs with disc degeneration have to be collected, analyzed, and compared with these results from nonchondrodystrophic large-breed dogs with radiographically normal spines.
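Stiffness in such protocols is typically taken as the slope of the (approximately linear region of the) recorded load-displacement curve. A minimal least-squares sketch of that estimate, over invented data points:

```python
def stiffness(displacement, load):
    """Least-squares slope of a load-displacement curve, e.g. in N/mm
    for shear or Nm/deg for torsion, over a chosen linear region."""
    n = len(displacement)
    mx = sum(displacement) / n
    my = sum(load) / n
    num = sum((x - mx) * (y - my) for x, y in zip(displacement, load))
    den = sum((x - mx) ** 2 for x in displacement)
    return num / den

# invented shear data: displacement in mm, load in N
k = stiffness([0.0, 1.0, 2.0, 3.0], [0.0, 2.0, 4.0, 6.0])  # → 2.0 N/mm
```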


Erosion of dentine causes mineral dissolution, while the organic compounds remain at the surface. Therefore, a determination of tissue loss is complicated. Established quantitative methods for the evaluation of enamel have also been used for dentine, but the suitability of these techniques in this field has not been systematically determined. Therefore, this study aimed to compare longitudinal microradiography (LMR), contacting (cPM) and non-contacting profilometry (ncPM), and analysis of dissolved calcium (Ca analysis) in the erosion solution. Results are discussed in the light of the histology of dentine erosion. Erosion was performed with 0.05 M citric acid (pH 2.5) for 30, 60, 90 or 120 min, and erosive loss was determined by each method. LMR, cPM and ncPM were performed before and after collagenase digestion of the demineralised organic surface layer, with an emphasis on moisture control. Scanning electron microscopy was performed on randomly selected specimens. All measurements were converted into micrometres. Profilometry was not able to adequately quantify mineral loss prior to collagenase digestion. After 120 min of erosion, values of 5.4 +/- 1.9 microm (ncPM) and 27.8 +/- 4.6 microm (cPM) were determined. Ca analysis revealed a mineral loss of 55.4 +/- 11.5 microm. The values for profilometry after matrix digestion were 43.0 +/- 5.5 microm (ncPM) and 46.9 +/- 6.2 microm (cPM). Relative and proportional biases were detected for all method comparisons. The mineral loss values were below the detection limit for LMR. The study revealed gross differences between methods, particularly when demineralised organic surface tissue was present. These results indicate that the choice of method is critical and depends on the parameter under study.
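Converting dissolved calcium into an equivalent depth of mineral loss follows from mass, mineral density and exposed area. A sketch of that arithmetic; the calcium weight fraction and mineral density here are placeholder values for illustration, not the study's calibration:

```python
def ca_to_depth_um(ca_ug, area_mm2, ca_fraction=0.27, mineral_mg_mm3=1.35):
    """Equivalent depth of mineral loss (micrometres) from dissolved Ca.

    depth = mass_Ca / (Ca weight fraction * mineral density * area)
    ca_fraction and mineral_mg_mm3 are illustrative assumptions."""
    mineral_ug = ca_ug / ca_fraction                   # total mineral dissolved
    volume_mm3 = mineral_ug / (mineral_mg_mm3 * 1000.0)  # µg → mg, then volume
    return volume_mm3 / area_mm2 * 1000.0              # mm → µm
```

With these placeholder constants, 364.5 µg of calcium dissolved from a 10 mm² window corresponds to 100 µm of mineral loss.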


Advances in spinal cord injury (SCI) research are dependent on quality animal models, which in turn rely on sensitive outcome measures able to detect functional differences in animals following injury. To date, most measurements of dysfunction following SCI rely either on the subjective rating of observers or the slow throughput of manual gait assessment. The present study compares the gait of normal and contusion-injured mice using the TreadScan system. TreadScan utilizes a transparent treadmill belt and a high-speed camera to capture the footprints of animals and automatically analyze gait characteristics. Adult female C57Bl/6 mice were introduced to the treadmill prior to receiving either a standardized mild, moderate, or sham contusion spinal cord injury. TreadScan gait analyses were performed weekly for 10 weeks and compared with scores on the Basso Mouse Scale (BMS). Results indicate that this software successfully differentiates sham animals from injured animals on a number of gait characteristics, including hindlimb swing time, stride length, toe spread, and track width. Differences were found between mild and moderate contusion injuries, indicating a high degree of sensitivity within the system. Rear track width, a measure of the animal's hindlimb base of support, correlated strongly both with spared white matter percentage and with terminal BMS. TreadScan allows for an objective and rapid behavioral assessment of locomotor function following mild-moderate contusive SCI, where the majority of mice still exhibit hindlimb weight support and plantar paw placement during stepping.
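Gait parameters like those TreadScan reports are straightforward to derive once footfall events are known. A minimal sketch for one limb, with invented contact events (touch-down time, lift-off time, paw position along the belt); this is not TreadScan's implementation:

```python
def gait_metrics(footfalls):
    """Per-limb stride lengths and swing times from successive footfall
    events, each given as (time_down s, time_up s, x_position cm)."""
    strides, swings = [], []
    for prev, cur in zip(footfalls, footfalls[1:]):
        strides.append(cur[2] - prev[2])  # distance between successive placements
        swings.append(cur[0] - prev[1])   # lift-off to the next touch-down
    return strides, swings

# invented footfall events for one hindlimb
strides, swings = gait_metrics([(0.00, 0.12, 0.0),
                                (0.30, 0.42, 6.5),
                                (0.61, 0.72, 13.1)])
```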


The objective of this retrospective study was to assess image quality with pulmonary CT angiography (CTA) using 80 kVp and to find anthropomorphic parameters other than body weight (BW) to serve as selection criteria for low-dose CTA. Attenuation in the pulmonary arteries, anteroposterior and lateral diameters, cross-sectional area and soft-tissue thickness of the chest were measured in 100 consecutive patients weighing less than 100 kg with 80 kVp pulmonary CTA. Body surface area (BSA) and contrast-to-noise ratios (CNR) were calculated. Three radiologists analyzed arterial enhancement, noise, and image quality. Image parameters were compared between patients grouped by BW (group 1: up to 50 kg; groups 2-6: 51-100 kg in 10 kg increments). CNR was higher in patients weighing less than 60 kg than in the BW groups 71-99 kg (P between 0.025 and <0.001). Subjective ranking of enhancement (P = 0.165-0.605), noise (P = 0.063), and image quality (P = 0.079) did not differ significantly across the patient groups. CNR correlated moderately with weight (R = -0.585), BSA (R = -0.582), cross-sectional area (R = -0.544), and anteroposterior diameter of the chest (R = -0.457; P < 0.001 for all parameters). We conclude that 80 kVp pulmonary CTA permits diagnostic image quality in patients weighing up to 100 kg. Body weight is a suitable criterion to select patients for low-dose pulmonary CTA.
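The contrast-to-noise ratio used here is a simple quantity. A minimal sketch under the common definition (mean vessel attenuation minus mean background attenuation, divided by image noise); the Hounsfield-unit values below are invented, not from the study:

```python
def cnr(vessel_hu, background_hu, noise_sd):
    """Contrast-to-noise ratio: attenuation difference between the
    pulmonary-artery ROI and background tissue, divided by image noise
    (SD of a homogeneous background ROI)."""
    return (vessel_hu - background_hu) / noise_sd

# hypothetical 80 kVp measurements in Hounsfield units
ratio = cnr(350.0, 50.0, 20.0)  # → 15.0
```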


Coordinated eye and head movements occur simultaneously to scan the visual world for relevant targets. However, measuring both eye and head movements in experiments that allow natural head movements can be challenging. This paper provides an approach to studying eye-head coordination: First, we demonstrate the capabilities and limits of the eye-head tracking system used, and compare it to other technologies. Second, a behavioral task is introduced to invoke eye-head coordination. Third, a method is introduced to reconstruct signal loss in video-based oculography caused by cornea reflection artifacts, in order to extend the tracking range. Finally, parameters of eye-head coordination are identified using EHCA (eye-head coordination analyzer), a MATLAB software tool developed to analyze eye-head shifts. To demonstrate the capabilities of the approach, a study with 11 healthy subjects was performed to investigate motion behavior. The approach presented here is discussed as an instrument to explore eye-head coordination, which may lead to further insights into attentional and motor symptoms of certain neurological or psychiatric diseases, e.g., schizophrenia.
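The signal-reconstruction step can be illustrated with a much simpler stand-in than the paper's method: linear interpolation across short dropouts in a gaze trace. The function and the gap threshold below are illustrative assumptions, not the authors' algorithm:

```python
def fill_gaps(signal, max_gap=10):
    """Linearly interpolate short runs of None (tracking loss) in a 1-D
    gaze trace; longer gaps, and gaps touching either end, are left as-is."""
    out = list(signal)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1  # find the end of this dropout
            if 0 < i and j < len(out) and j - i <= max_gap:
                a, b = out[i - 1], out[j]
                for k in range(i, j):
                    out[k] = a + (b - a) * (k - i + 1) / (j - i + 1)
            i = j
        else:
            i += 1
    return out

repaired = fill_gaps([0.0, None, None, 3.0])  # → [0.0, 1.0, 2.0, 3.0]
```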


We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences including all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor with multiple levels each. Significance is computed as a function of time and can be controlled for type II errors with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a-priori choices of models and parameters (like inverse models or sensors of interest) that interact with and bias statistics.
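The randomization idea can be shown in miniature. The sketch below is not Ragu's implementation: it runs a TANOVA-style permutation test on the global field power of the mean condition-difference map at one time point, for a paired two-condition design, with all data structures invented:

```python
import random

def gfp(topomap):
    """Global field power: SD across sensors of the average-referenced map."""
    m = sum(topomap) / len(topomap)
    return (sum((v - m) ** 2 for v in topomap) / len(topomap)) ** 0.5

def tanova_p(maps_a, maps_b, n_perm=2000, seed=0):
    """Randomization p-value for the GFP of the mean difference map
    (paired design: condition labels are flipped within subjects)."""
    rng = random.Random(seed)
    n, k = len(maps_a), len(maps_a[0])

    def effect(a, b):
        diff = [sum(a[s][c] - b[s][c] for s in range(n)) / n for c in range(k)]
        return gfp(diff)

    observed = effect(maps_a, maps_b)
    hits = 0
    for _ in range(n_perm):
        pa, pb = [], []
        for s in range(n):
            if rng.random() < 0.5:          # keep this subject's labels
                pa.append(maps_a[s]); pb.append(maps_b[s])
            else:                           # flip this subject's labels
                pa.append(maps_b[s]); pb.append(maps_a[s])
        if effect(pa, pb) >= observed:
            hits += 1
    return hits / n_perm
```

Because only condition labels are permuted within subjects, no distributional assumptions about the sensor values are needed, which is the appeal of the approach.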


The platform-independent software package consisting of the oligonucleotide mass assembler (OMA) and the oligonucleotide peak analyzer (OPA) was created to support the analysis of oligonucleotide mass spectra. It calculates all theoretically possible fragments of a given input sequence and annotates them to an experimental spectrum, thus saving a large amount of manual processing time. The software performs analysis of precursor and product ion spectra of oligonucleotides and their analogues comprising user-defined modifications of the backbone, the nucleobases, or the sugar moiety, as well as adducts with metal ions or drugs. The ability to expand the library of building blocks and to implement individual structural variations makes it extremely useful for supporting the analysis of therapeutically active compounds. The functionality of the software tool is demonstrated on the examples of a platinated double-stranded oligonucleotide and a modified RNA sequence. Experiments also reveal the unique dissociation behavior of platinated higher-order DNA structures.
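The idea of exhaustively enumerating candidate fragments can be sketched as follows. This is not OMA's algorithm: the real backbone-cleavage series (a, b, c, d, w, x, y, z) differ in their end groups, and modifications and adducts shift the masses. Here fragments are modelled naively as truncated 5'-phosphorylated DNA oligos, using approximate monoisotopic residue masses:

```python
# approximate monoisotopic DNA residue masses (nucleotide minus water), Da
RESIDUE = {"A": 313.0576, "C": 289.0464, "G": 329.0525, "T": 304.0461}
H2O = 18.0106

def oligo_mass(seq):
    """Neutral monoisotopic mass of a 5'-phosphorylated DNA oligo (free acid)."""
    return sum(RESIDUE[base] for base in seq) + H2O

def prefix_suffix_fragments(seq):
    """All 5'- and 3'-terminal fragment sequences with naive masses
    (no base loss, no end-group corrections per cleavage series)."""
    frags = []
    for i in range(1, len(seq)):
        frags.append(("5'-frag", seq[:i], oligo_mass(seq[:i])))
        frags.append(("3'-frag", seq[i:], oligo_mass(seq[i:])))
    return frags

for kind, sub, mass in prefix_suffix_fragments("ACGT"):
    print(f"{kind:8s} {sub:4s} {mass:9.4f}")
```

Annotation then amounts to matching these candidate masses (and their charge states) against experimental peaks within a tolerance window.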


Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and access to the relevant methodological developments in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars either to apply a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) or to make no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for an empirical statistical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analysing one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. 
texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced in varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding ‘trivial’ variation such as orthographic and spelling changes from stemmatic analysis.
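One simple, analytically tractable check of the kind such a model supports can be sketched directly: a variant site is genealogically consistent with a stemma hypothesis if each reading occupies a connected region of the tree, so that each reading need only have arisen once. The sketch below assumes every witness, including internal nodes, carries a reading, which real traditions rarely satisfy; it is an illustration, not the article's method:

```python
from collections import defaultdict, deque

def reading_is_genealogical(edges, readings):
    """True if every reading at one variant site forms a connected
    subgraph of the (undirected) stemma, i.e. the site shows no
    coincident change or contamination under this stemma hypothesis."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    by_reading = defaultdict(set)
    for witness, r in readings.items():
        by_reading[r].add(witness)
    for group in by_reading.values():
        start = next(iter(group))
        seen, queue = {start}, deque([start])
        while queue:  # BFS restricted to witnesses sharing this reading
            node = queue.popleft()
            for nb in graph[node]:
                if nb in group and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        if seen != group:
            return False
    return True

# toy stemma: archetype A with children B, C; B with children D, E
stemma = [("A", "B"), ("A", "C"), ("B", "D"), ("B", "E")]
ok = reading_is_genealogical(stemma, {"A": "x", "B": "x", "C": "x", "D": "y", "E": "x"})
```

Counting how often each category of variant (spelling, word order, substitution, and so on) passes such a check across witnessed traditions is one way to build the empirical profile of significance called for above.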