969 results for "Dimensional measurement accuracy"


Relevance: 30.00%

Publisher:

Abstract:

For crime scene investigation in cases of homicide, the pattern of bloodstains at the incident site is of critical importance. The morphology of the bloodstain pattern serves to determine the approximate blood source locations, the minimum number of blows, and the positioning of the victim. In the present work, the benefits of three-dimensional bloodstain pattern analysis, including ballistic approximation of the trajectories of the blood drops, are demonstrated using two illustrative cases. The crime scenes were documented in 3D using the non-contact methods of digital photogrammetry, tachymetry, and laser scanning. Accurate, true-to-scale 3D models of the crime scenes, including the bloodstain pattern and the traces, were created. To determine the areas of origin of the bloodstain pattern, the trajectories of up to 200 well-defined bloodstains were analysed in CAD and photogrammetry software. The ballistic determination of the trajectories was performed using ballistics software. The advantages of this method are the short preparation time on site, the non-contact measurement of the bloodstains, and the high accuracy of the bloodstain analysis. This method can be expected to deliver accurate results regarding the number and position of the areas of origin of bloodstains; in particular, the vertical component is determined more precisely than with conventional methods. In both cases, the ballistic bloodstain pattern analysis enabled relevant forensic conclusions regarding the course of events.
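The conventional straight-line approach that the abstract contrasts with ballistic fitting can be sketched in a few lines. The stain coordinates, convergence point, and impact angles below are hypothetical, and the code is an illustrative sketch, not the software used in the study:

```python
import math

def origin_height(stain_xy, convergence_xy, impact_angle_deg):
    """Estimate the blood-source height above the stain plane with the
    conventional straight-line (tangent) method: height = distance * tan(angle)."""
    dx = stain_xy[0] - convergence_xy[0]
    dy = stain_xy[1] - convergence_xy[1]
    distance = math.hypot(dx, dy)
    return distance * math.tan(math.radians(impact_angle_deg))

# Hypothetical stains converging on (0, 0), with measured impact angles
stains = [((0.8, 0.6), 35.0), ((-0.5, 1.2), 40.0)]
heights = [origin_height(xy, (0.0, 0.0), angle) for xy, angle in stains]
estimate = sum(heights) / len(heights)  # averaged source-height estimate (m)
```

Because real drops travel on ballistic arcs rather than straight lines, this tangent estimate tends to overstate the vertical component, which is exactly the error the ballistic trajectory analysis above is reported to reduce.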

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE To determine the practicability and accuracy of central corneal thickness (CCT) measurements in living chicks utilizing a noncontact, high-speed optical low-coherence reflectometer (OLCR) mounted on a slit lamp. ANIMALS STUDIED Twelve male chicks (Gallus gallus domesticus). PROCEDURES Measurements of CCT were obtained in triplicate in 24 eyes of twelve 1-day-old anaesthetized chicks using OLCR. Each OLCR measurement consisted of the average of 20 scans obtained within seconds. Additionally, corneal thickness was determined histologically after immersion fixation in Karnovsky's solution alone (20 eyes) or with a prior injection of the fixative into the anterior chamber before enucleation (4 eyes). RESULTS Central corneal thickness measurement using OLCR in 1-day-old living chicks is a rapid and feasible examination technique. Mean CCT measured with OLCR (189.7 ± 3.34 μm) was significantly lower than the histological measurement (242.1 ± 47.27 μm) in eyes fixed in Karnovsky's solution (P = 0.0005). In eyes with additional injection of Karnovsky's fixative into the anterior chamber, mean histologically determined CCT was 195.2 ± 8.25 μm vs. 191.9 ± 8.90 μm with OLCR, with a trend toward lower variance compared with eyes that had only been immersion fixed. CONCLUSION Optical low-coherence reflectometry is an accurate technique for measuring in vivo CCT in the eye of newborn chicks. Knowledge of chick corneal thickness and the ability to obtain noninvasive, noncontact CCT measurements in the living animal may be of interest for research on eye diseases in chick models.

Relevance: 30.00%

Publisher:

Abstract:

Surgical navigation might increase the safety of osteochondroplasty procedures in patients with femoroacetabular impingement. The feasibility and accuracy of navigating a surgical reaming device were assessed. Three-dimensional models of 18 identical sawbone femora and 5 cadaver hips were created. Custom software was used to plan and perform repeated computer-assisted osteochondroplasty procedures using a navigated burr. Postoperative 3-dimensional models were created and compared with the preoperative models. A Bland-Altman analysis of α angle and offset ratio accuracy showed even distribution along the zero line with narrow confidence intervals. No differences in α angle and offset ratio accuracy (P = 0.486 and P = 0.2) were detected between the two observers. Planning and execution of navigated osteochondroplasty using a surgical reaming device is feasible and accurate.
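A Bland-Altman agreement analysis of the kind reported here reduces to a few lines of arithmetic. The paired angle values below are hypothetical, for illustration only:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements:
    mean difference (bias) and 95% limits of agreement (bias ± 1.96 SD)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical planned vs. achieved alpha angles (degrees)
planned  = [55.0, 60.0, 52.0, 58.0, 63.0]
achieved = [54.5, 61.0, 51.0, 58.5, 62.0]
bias, (lo, hi) = bland_altman(planned, achieved)
```

Differences distributed evenly around zero with narrow limits of agreement, as reported above, indicate good agreement between planned and achieved values.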

Relevance: 30.00%

Publisher:

Abstract:

An automated algorithm for detection of the acetabular rim was developed. Accuracy of the algorithm was validated in a sawbone study and compared against manually conducted digitization attempts, which were established as the ground truth. The latter proved to be reliable and reproducible, demonstrated by almost perfect intra- and interobserver reliability. Validation of the automated algorithm showed no significant difference compared to the manually acquired data in terms of detected version and inclination. Automated detection of the acetabular rim contour and the spatial orientation of the acetabular opening plane can be accurately achieved with this algorithm.

Relevance: 30.00%

Publisher:

Abstract:

The measurement of fluid volumes in cases of pericardial effusion is a necessary procedure during autopsy. With the increased use of virtual autopsy methods in forensics, the need for a quick volume measurement method on computed tomography (CT) data arises, especially since methods such as CT angiography can potentially alter the fluid content in the pericardium. We retrospectively selected 15 cases with hemopericardium, which underwent post-mortem imaging and autopsy. Based on CT data, the pericardial blood volume was estimated using segmentation techniques and downsampling of CT datasets. Additionally, a variety of measures (distances, areas and 3D approximations of the effusion) were examined to find a quick and easy way of estimating the effusion volume. Segmentation of CT images as shown in the present study is a feasible method to measure the pericardial fluid amount accurately. Downsampling of a dataset significantly increases the speed of segmentation without losing too much accuracy. Some of the other methods examined might be used to quickly estimate the severity of the effusion volumes.
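The voxel-counting step behind such a segmentation-based volume estimate, and the speed/accuracy trade-off of downsampling, can be sketched as follows. The toy mask and voxel size are hypothetical, and this is not the study's pipeline:

```python
def effusion_volume(mask, voxel_mm3):
    """Volume from a binary segmentation mask: count labelled voxels
    and multiply by the volume of one voxel."""
    return sum(v for plane in mask for row in plane for v in row) * voxel_mm3

def downsample(mask, step=2):
    """Keep every `step`-th voxel along each axis; the effective voxel
    volume grows by step**3, trading resolution for segmentation speed."""
    return [[row[::step] for row in plane[::step]] for plane in mask[::step]]

# Hypothetical 4x4x4 fully labelled mask with 1 mm^3 voxels
mask = [[[1] * 4 for _ in range(4)] for _ in range(4)]
full = effusion_volume(mask, 1.0)                      # 64 mm^3
coarse = effusion_volume(downsample(mask), 2.0 ** 3)   # 8 voxels * 8 mm^3
```

With a smooth fluid collection, the coarse estimate stays close to the full-resolution one, which is why downsampling can speed up segmentation "without losing too much accuracy."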

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVES: To evaluate high-definition and conventional oscillometry in comparison with direct blood pressure measurements in anaesthetised dogs. METHODS: Eight simultaneous readings for systolic, diastolic and mean pressure were obtained directly and with each of two devices in nine anaesthetised dogs. Measurement procedure and validation were based on the 2007 ACVIM guidelines. RESULTS: Sixty-three simultaneous readings were evaluated for each device and the direct measurements. The mean differences (bias) from direct values were within 10 mmHg for both devices, although the bias for systolic and diastolic blood pressures was higher for the Memodiagnostic. The standard deviations of the differences (precision) were within 15 mmHg for the Dinamap but exceeded this limit for the Memodiagnostic. Correlation coefficients were higher for the Dinamap than the Memodiagnostic, but both failed to reach a correlation of 0.9. Over 50% of values lay within 10 mmHg of direct measures for both devices, though this percentage was greater for the Dinamap. Over 80% of values lay within 20 mmHg of direct measures for the Dinamap but not for the Memodiagnostic. CLINICAL SIGNIFICANCE: Both devices failed to meet the ACVIM validation criteria. However, the Dinamap failed only with regard to correlation. The Memodiagnostic failed on several requirements and, based on its poor correlation, accuracy and precision, cannot currently be recommended for dogs under anaesthesia.
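The validation statistics quoted above (bias, precision as SD of the differences, percentage of readings within 10 and 20 mmHg, and correlation) can be computed with a short sketch. The paired readings below are hypothetical:

```python
from statistics import mean, stdev

def pearson(x, y):
    """Pearson correlation coefficient, computed directly."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def validation_summary(device, direct):
    """Summarize an oscillometric device against direct readings using the
    measures reported in the abstract."""
    diffs = [d - r for d, r in zip(device, direct)]
    n = len(diffs)
    return {
        "bias": mean(diffs),
        "precision": stdev(diffs),
        "pct_within_10": 100.0 * sum(abs(d) <= 10 for d in diffs) / n,
        "pct_within_20": 100.0 * sum(abs(d) <= 20 for d in diffs) / n,
        "correlation": pearson(device, direct),
    }

# Hypothetical simultaneous systolic readings (mmHg)
direct = [110.0, 120.0, 130.0, 140.0, 150.0, 95.0]
device = [112.0, 118.0, 141.0, 135.0, 149.0, 99.0]
summary = validation_summary(device, direct)
```

A device passes the kind of criteria described above only when every one of these summary measures clears its threshold, which is why a single weak measure (here, correlation for the Dinamap) is enough to fail validation.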

Relevance: 30.00%

Publisher:

Abstract:

Brain functions, such as learning, orchestrating locomotion, memory recall, and processing information, all require glucose as a source of energy. During these functions, the glucose concentration decreases as the glucose is consumed by brain cells. By measuring this drop in concentration, it is possible to determine which parts of the brain are used during specific functions and, consequently, how much energy the brain requires to complete the function. One way to measure in vivo brain glucose levels is with a microdialysis probe. The drawback of this analytical procedure, as with many steady-state fluid flow systems, is that the probe fluid will not reach equilibrium with the brain fluid. Therefore, brain concentration is inferred by taking samples at multiple inlet glucose concentrations and finding a point of convergence. The goal of this thesis is to create a three-dimensional, time-dependent, finite element representation of the brain-probe system in COMSOL 4.2 that describes the diffusion and convection of glucose. Once validated with experimental results, this model can then be used to test parameters that experiments cannot access. When simulations were run using published values for physical constants (i.e. diffusivities, density and viscosity), the resulting model glucose concentrations were within the error of the experimental data. This verifies that the model is an accurate representation of the physical system. In addition to accurately describing the experimental brain-probe system, the model I created is able to show the validity of zero-net-flux for a given experiment. A useful discovery is that the slope of the zero-net-flux line is dependent on perfusate flow rate and diffusion coefficients, but it is independent of brain glucose concentrations. The model was simplified with the realization that the perfusate is at thermal equilibrium with the brain throughout the active region of the probe.
This allowed for the assumption that all model parameters are temperature independent. The time to steady-state for the probe is approximately one minute. However, the signal degrades in the exit tubing due to Taylor dispersion, on the order of two minutes for two meters of tubing. Given an analytical instrument requiring a five μL aliquot, the smallest brain process measurable for this system is 13 minutes.
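The point-of-convergence (zero-net-flux) inference described above reduces to a linear regression of the net concentration gain against the inlet concentration. The concentrations below are synthetic, generated with a hypothetical brain level of 2.0 mM and 40% probe recovery:

```python
def zero_net_flux(c_in, c_out):
    """Zero-net-flux estimate: regress the net gain (c_out - c_in) on c_in.
    The x-intercept estimates the brain ECF concentration; the slope reflects
    probe recovery (flow rate and diffusivity), not the brain concentration."""
    gains = [o - i for i, o in zip(c_in, c_out)]
    n = len(c_in)
    mx = sum(c_in) / n
    my = sum(gains) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(c_in, gains)) / \
            sum((x - mx) ** 2 for x in c_in)
    intercept = my - slope * mx
    return -intercept / slope  # inlet concentration at which net flux is zero

# Synthetic perfusate glucose (mM): c_out = c_in + 0.4 * (2.0 - c_in)
c_in  = [0.0, 1.0, 2.0, 3.0, 4.0]
c_out = [0.8, 1.4, 2.0, 2.6, 3.2]
estimate = zero_net_flux(c_in, c_out)  # ≈ 2.0 mM
```

This mirrors the finding quoted above: changing the recovery factor changes the slope of the regression line but not its x-intercept, which depends only on the brain concentration.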

Relevance: 30.00%

Publisher:

Abstract:

Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a "Simple Committee" technique that used averaged predictions from a set of 10 pre-selected input spaces chosen by the training data and a "Minimum Variance Committee" technique where the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (Best Combination Technique), Simple Committee Technique and Minimum Variance Committee Technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or the calibration of the underlying GT-Power model.
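The "Simple Committee" idea of averaging predictions across pre-selected input spaces can be illustrated with toy stand-ins for the models and transforms. Nothing here reproduces the GT-Power transformations or the actual empirical models; it is only a sketch of the averaging structure:

```python
def simple_committee(models, input_spaces, x):
    """Committee-style prediction: map the raw input into each pre-selected
    input space, predict with every model in every space, and average."""
    preds = [model(transform(x))
             for transform in input_spaces
             for model in models]
    return sum(preds) / len(preds)

# Toy stand-ins: two "input spaces" and two "models" (all hypothetical)
input_spaces = [lambda x: x, lambda x: x * 0.5]
models = [lambda z: 2.0 * z, lambda z: 2.0 * z + 1.0]
prediction = simple_committee(models, input_spaces, 4.0)  # mean of 8, 9, 4, 5
```

The "Minimum Variance Committee" variant described above would instead rank candidate input spaces by the disagreement (variance) among the three models' predictions and keep only the most consistent ones before averaging.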

Relevance: 30.00%

Publisher:

Abstract:

With recent advances in mass spectrometry techniques, it is now possible to investigate proteins over a wide range of molecular weights in small biological specimens. This advance has generated data-analytic challenges in proteomics, similar to those created by microarray technologies in genetics, namely, discovery of "signature" protein profiles specific to each pathologic state (e.g., normal vs. cancer) or differential profiles between experimental conditions (e.g., treated by a drug of interest vs. untreated) from high-dimensional data. We propose a data-analytic strategy for discovering protein biomarkers based on such high-dimensional mass-spectrometry data. A real biomarker-discovery project on prostate cancer is taken as a concrete example throughout the paper: the project aims to identify proteins in serum that distinguish cancer, benign hyperplasia, and normal states of prostate using the Surface Enhanced Laser Desorption/Ionization (SELDI) technology, a recently developed mass spectrometry technique. Our data-analytic strategy takes properties of the SELDI mass spectrometer into account: the SELDI output of a specimen contains about 48,000 (x, y) points where x is the protein mass divided by the number of charges introduced by ionization and y is the protein intensity of the corresponding mass per charge value, x, in that specimen. Given high coefficients of variation and other characteristics of protein intensity measures (y values), we reduce the measures of protein intensities to a set of binary variables that indicate peaks in the y-axis direction in the nearest neighborhoods of each mass per charge point in the x-axis direction. We then account for a shifting (measurement error) problem of the x-axis in SELDI output. After this pre-analysis processing of the data, we combine the binary predictors to generate classification rules for cancer, benign hyperplasia, and normal states of prostate.
Our approach is to apply the boosting algorithm to select binary predictors and construct a summary classifier. We empirically evaluate sensitivity and specificity of the resulting summary classifiers with a test dataset that is independent from the training dataset used to construct the summary classifiers. The proposed method performed nearly perfectly in distinguishing cancer and benign hyperplasia from normal. In the classification of cancer vs. benign hyperplasia, however, an appreciable proportion of the benign specimens were classified incorrectly as cancer. We discuss practical issues associated with our proposed approach to the analysis of SELDI output and its application in cancer biomarker discovery.
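The reduction of noisy intensities to binary peak indicators within a nearest neighborhood along the x-axis, as described above, might look like this minimal sketch. The window size and tie-handling rule are illustrative assumptions, not the authors' exact procedure:

```python
def binary_peaks(intensities, half_window=2):
    """Reduce intensity measures (y values) to binary indicators:
    1 if the point is the unique maximum within its neighborhood along
    the mass-per-charge (x) axis, else 0."""
    n = len(intensities)
    flags = []
    for i, y in enumerate(intensities):
        lo = max(0, i - half_window)
        hi = min(n, i + half_window + 1)
        neighborhood = intensities[lo:hi]
        is_peak = (y == max(neighborhood)) and neighborhood.count(y) == 1
        flags.append(1 if is_peak else 0)
    return flags

# Toy spectrum fragment: local maxima at indices 1 and 4
spectrum = [1.0, 3.0, 2.0, 2.0, 5.0, 1.0, 0.5]
flags = binary_peaks(spectrum)
```

The resulting binary vectors are the predictors that the boosting step then combines into classification rules; correcting the x-axis shift before binarization keeps the same peak aligned to the same predictor across specimens.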

Relevance: 30.00%

Publisher:

Abstract:

Use of microarray technology often leads to high-dimensional, low-sample-size data settings. Over the past several years, a variety of novel approaches have been proposed for variable selection in this context. However, only a small number of these have been adapted for time-to-event data where censoring is present. Among the standard variable selection methods shown both to have good predictive accuracy and to be computationally efficient is the elastic net penalization approach. In this paper, an adaptation of the elastic net approach is presented for variable selection both under the Cox proportional hazards model and under an accelerated failure time (AFT) model. Assessment of the two methods is conducted through simulation studies and through analysis of microarray data obtained from a set of patients with diffuse large B-cell lymphoma where time to survival is of interest. The approaches are shown to match or exceed the predictive performance of a Cox-based and an AFT-based variable selection method. The methods are moreover shown to be much more computationally efficient than their respective Cox- and AFT-based counterparts.
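For intuition about the penalty being adapted: in the single standardized-predictor case the elastic net has a closed-form solution built from the soft-thresholding operator. This sketch ignores censoring entirely, so it illustrates only the penalty itself, not the Cox or AFT adaptations presented in the paper:

```python
def soft_threshold(z, gamma):
    """Soft-thresholding operator used by elastic net coordinate descent."""
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def elastic_net_univariate(x, y, lam, alpha):
    """Closed-form elastic net coefficient for one standardized predictor:
    beta = S(<x, y>/n, lam * alpha) / (1 + lam * (1 - alpha)).
    alpha = 1 recovers the lasso; alpha = 0 recovers ridge regression."""
    n = len(x)
    z = sum(xi * yi for xi, yi in zip(x, y)) / n
    return soft_threshold(z, lam * alpha) / (1.0 + lam * (1.0 - alpha))

# Toy standardized predictor and (uncensored) response
x = [1.0, -1.0, 1.0, -1.0]
y = [2.0, -2.0, 2.0, -2.0]
beta = elastic_net_univariate(x, y, lam=1.0, alpha=0.5)
```

The thresholding term sets small coefficients exactly to zero (variable selection), while the ridge-style denominator shrinks and stabilizes the survivors; the paper's contribution is carrying this penalty into the censored Cox and AFT likelihoods.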

Relevance: 30.00%

Publisher:

Abstract:

The 1s-2s interval has been measured in the muonium (μ⁺e⁻) atom by Doppler-free two-photon pulsed laser spectroscopy. The frequency separation of the states was determined to be 2 455 528 941.0(9.8) MHz, in good agreement with quantum electrodynamics. The result may be interpreted as a measurement of the muon-electron charge ratio as −1 − 1.1(2.1) × 10⁻⁹. We expect significantly higher accuracy at future high-flux muon sources and from cw laser technology.

Relevance: 30.00%

Publisher:

Abstract:

The synchronization of dynamic multileaf collimator (DMLC) response with respiratory motion is critical to ensure the accuracy of DMLC-based four-dimensional (4D) radiation delivery. In practice, however, a finite time delay (response time) between the acquisition of tumor position and multileaf collimator response necessitates predictive models of respiratory tumor motion to synchronize radiation delivery. Predicting a complex process such as respiratory motion introduces geometric errors, which have been reported in several publications. However, the dosimetric effect of such errors on 4D radiation delivery has not yet been investigated. Thus, our aim in this work was to quantify the dosimetric effects of geometric error due to prediction under several different conditions. Conformal and intensity modulated radiation therapy (IMRT) plans for a lung patient were generated for anterior-posterior/posterior-anterior (AP/PA) beam arrangements at 6 and 18 MV energies to provide planned dose distributions. Respiratory motion data were obtained from 60 diaphragm-motion fluoroscopy recordings from five patients. A linear adaptive filter was employed to predict the tumor position. The geometric error of prediction was defined as the absolute difference between predicted and actual positions at each diaphragm position. Distributions of geometric error of prediction were obtained for all of the respiratory motion data. Planned dose distributions were then convolved with distributions for the geometric error of prediction to obtain convolved dose distributions. The dosimetric effect of such geometric errors was determined as a function of several variables: response time (0-0.6 s), beam energy (6/18 MV), treatment delivery (3D/4D), treatment type (conformal/IMRT), beam direction (AP/PA), and breathing training type (free breathing/audio instruction/visual feedback). Dose difference and distance-to-agreement analysis was employed to quantify results.
Based on our data, the dosimetric impact of prediction (a) increased with response time, (b) was larger for 3D radiation therapy as compared with 4D radiation therapy, (c) was relatively insensitive to change in beam energy and beam direction, (d) was greater for IMRT distributions as compared with conformal distributions, (e) was smaller than the dosimetric impact of latency, and (f) was greatest for respiration motion with audio instructions, followed by visual feedback and free breathing. Geometric errors of prediction that occur during 4D radiation delivery introduce dosimetric errors that are dependent on several factors, such as response time, treatment-delivery type, and beam energy. Even for relatively small response times of 0.6 s into the future, dosimetric errors due to prediction could approach delivery errors when respiratory motion is not accounted for at all. To reduce the dosimetric impact, better predictive models and/or shorter response times are required.
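A linear adaptive filter of the kind used here for tumor-position prediction can be sketched as a least-mean-squares (LMS) update. The constant trace below is a toy stand-in for a diaphragm-motion recording, and the tap count, step size, and horizon are illustrative choices, not the study's settings:

```python
def lms_predict(signal, taps=4, mu=0.05, horizon=1):
    """Linear adaptive (LMS) filter sketch: predict the sample `horizon`
    steps ahead from the last `taps` observed samples, updating the filter
    weights online from each prediction error."""
    w = [0.0] * taps
    predictions = []
    for t in range(taps, len(signal) - horizon + 1):
        x = signal[t - taps:t]                      # most recent observations
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        predictions.append(y_hat)
        err = signal[t + horizon - 1] - y_hat       # prediction error
        w = [wi + mu * err * xi for wi, xi in zip(w, x)]
    return predictions

# Toy trace (e.g., a breath-hold plateau); the filter converges to it
trace = [1.0] * 200
preds = lms_predict(trace)
```

The residual `err` at each step is exactly the geometric error of prediction defined above; longer response times force larger horizons, which is why the dosimetric impact grows with response time.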

Relevance: 30.00%

Publisher:

Abstract:

A CT-based method ("HipMotion") for the noninvasive three-dimensional assessment of femoroacetabular impingement (FAI) was developed, validated, and applied in a clinical pilot study. The method allows for the anatomically based calculation of hip range of motion (ROM), the exact location of the impingement zone, and the simulation of quantified surgical maneuvers for FAI. The accuracy of HipMotion was 0.7 +/- 3.1 degrees in a plastic bone setup and -5.0 +/- 5.6 degrees in a cadaver setup. Reliability and reproducibility were excellent [intraclass correlation coefficient (ICC) > 0.87] for all measures except external rotation (ICC = 0.48). The normal ROM was determined from a cohort of 150 patients and was compared to 31 consecutive hips with FAI. Patients with FAI had a significantly decreased flexion, internal rotation, and abduction in comparison to normal hips (p < 0.001). Normal hip flexion and internal rotation are generally overestimated in a number of orthopedic textbooks. HipMotion is a useful tool for further assessment of impinging hips and for appropriate planning of the necessary amount of surgical intervention, which represents the basis for future computer-assisted treatment of FAI with less invasive surgical approaches, such as hip arthroscopy.

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVES: To assess magnetic resonance colonography (MRC) for the detection of colorectal lesions using two different T1w three-dimensional (3D) gradient-recalled echo (GRE) sequences and integrated parallel data acquisition (iPAT) at a 3.0 Tesla MR unit. MATERIALS AND METHODS: In this prospective study, 34 symptomatic patients underwent dark-lumen MRC at a 3.0 Tesla unit before conventional colonoscopy (CC). After colon distension with tap water, two high-resolution T1w 3D-GRE sequences [3-dimensional fast low angle shot (3D-FLASH), iPAT factor 2, and 3D volumetric interpolated breathhold examination (VIBE), iPAT 3] were acquired without and after bolus injection of gadolinium. Prospective evaluation of MRC was performed. Image quality of the different sequences was assessed qualitatively and quantitatively. The findings of the same-day CC served as the standard of reference. RESULTS: MRC correctly identified all polyps >5 mm (16 of 16) in size and all carcinomas (4 of 4). Fifty percent of the small polyps 0.6). CONCLUSIONS: MRC using 3D-GRE sequences and iPAT is feasible on 3.0 T systems. The high-resolution 3D-FLASH was slightly preferred over the 3D-VIBE because of better image quality, although the two sequences showed no statistically significant difference.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Calcaneonavicular coalitions (CNC) have been reported to be associated with anatomical aberrations of the calcaneus and/or navicular bones. These morphological abnormalities may complicate accurate surgical resection. Three-dimensional analysis of spatial orientation and morphological characteristics may help in preoperative planning of resection. MATERIALS AND METHODS: Sixteen feet with a diagnosis of CNC were evaluated by means of 3-D CT modeling. Three angles were defined and expressed in relation to one reproducible landmark (the lateral border of the calcaneus): the dorsoplantar inclination, the anteroposterior inclination, and the socket angle. The depth and width of the coalitions were measured and used to calculate the estimated contact surface. Three-dimensional reconstructions of the calcanei served to evaluate the presence, distortion, or absence of the anterior calcaneal facet and the presence of a navicular beak. Interrater correlations were assessed to evaluate the accuracy of the measurement methods. Sixteen normal feet served as controls for comparison of the socket angle, the anatomy of the anterior calcaneal facet, and the navicular beak. RESULTS: The dorsoplantar inclination angle averaged 50 degrees (+/-17), the anteroposterior inclination angle 64 degrees (+/-15), and the pathologic socket angle 98 degrees (+/-11). The average contact area was 156 mm(2). Ninety-four percent of patients in the CNC group had a plantar navicular beak. In 50% of those patients the anterior calcaneal facet was replaced by the navicular portion, and in 44% the facet was missing entirely. In contrast, the socket angle in the control group averaged 77 degrees (+/-18), which was statistically different from the CNC group (p = 0.0004). Only 25% of the patients in the control group had a plantar navicular beak. High, statistically significant interrater correlations were found for all measured angles.
CONCLUSION: Computer-aided CT analysis and reconstruction help to determine the spatial orientation of CNC and provide useful information for anticipating morphological abnormalities of the calcaneus and navicular bones.