29 results for statistical analysis


Relevance: 100.00%

Abstract:

The aim of this study was to examine clinical forensic findings in strangulation victims with regard to their ability to differentiate between life-threatening and non-life-threatening strangulation, to compare clinical and MRI findings of the neck, and to discuss a simple score for life-threatening strangulation (SLS).

Relevance: 100.00%

Abstract:

A method for quantifying nociceptive withdrawal reflex receptive fields in human volunteers and patients is described. The reflex receptive field (RRF) for a specific muscle denotes the cutaneous area from which a muscle contraction can be evoked by a nociceptive stimulus. The method is based on random stimulations presented in a blinded sequence at 10 stimulation sites. The sensitivity map is derived by interpolating the reflex responses evoked from the 10 sites. A set of features describing the size and location of the RRF is presented, based on statistical analysis of the sensitivity map within each subject. The features include RRF area, volume, peak location and center of gravity. The method was applied to 30 healthy volunteers. Electrical stimuli were applied to the sole of the foot, evoking reflexes in the ankle flexor tibialis anterior. The RRF area covered a fraction of 0.57 ± 0.06 (S.E.M.) of the foot and was located on the medial, distal part of the sole. An intramuscular injection of capsaicin into the flexor digitorum brevis was performed in one spinal cord injured subject in an attempt to modulate the reflex receptive field. The RRF area, RRF volume and location of the peak reflex response appear to be the most sensitive measures for detecting modulation of spinal nociceptive processing. This new method has important potential applications for exploring aspects of central plasticity in volunteers and patients. It may be utilized as a new diagnostic tool for central hypersensitivity and for quantification of therapeutic interventions.
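
A minimal sketch of how such a sensitivity map and its summary features could be computed, assuming normalized foot-sole coordinates, cubic interpolation, and an arbitrary activity threshold (none of which are specified in the abstract):

```python
import numpy as np
from scipy.interpolate import griddata

def rrf_features(sites_xy, responses, grid_n=100):
    """Interpolate a sensitivity map from 10 stimulation sites and extract
    RRF area, volume, peak location and center of gravity."""
    xs = np.linspace(0, 1, grid_n)          # normalized foot-sole coordinates
    gx, gy = np.meshgrid(xs, xs)
    z = griddata(sites_xy, responses, (gx, gy), method="cubic")
    z = np.nan_to_num(np.clip(z, 0, None))  # keep the map non-negative

    cell = (xs[1] - xs[0]) ** 2             # area of one grid cell
    active = z > 0.1 * z.max()              # hypothetical activity threshold
    area = active.sum() * cell              # RRF area (fraction of the sole)
    volume = z.sum() * cell                 # RRF volume under the map
    iy, ix = np.unravel_index(z.argmax(), z.shape)
    peak = (gx[iy, ix], gy[iy, ix])         # location of the peak response
    cog = (np.sum(gx * z) / z.sum(),        # center of gravity of the map
           np.sum(gy * z) / z.sum())
    return area, volume, peak, cog

# Example: 10 random sites with synthetic reflex sizes
rng = np.random.default_rng(0)
sites = rng.uniform(0.1, 0.9, size=(10, 2))
print(rrf_features(sites, rng.uniform(0, 1, 10)))
```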

Relevance: 100.00%

Abstract:

Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
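
To make the contrast concrete, here is a hedged sketch of both estimators for a single detection limit, assuming a lognormal exposure model (full ROS implementations, e.g. Helsel's, also handle multiple detection limits):

```python
import numpy as np
from scipy import stats

def naive_mean(x, dl):
    """Naive approach: replace non-detects by the detection limit itself."""
    return np.where(x < dl, dl, x).mean()

def ros_mean(x, dl):
    """Regression on order statistics (single detection limit): fit a
    lognormal to the detected values via their normal scores and impute
    the censored observations from the fitted line."""
    x = np.sort(x)
    n = len(x)
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)  # Blom plotting positions
    q = stats.norm.ppf(pp)                           # normal scores
    det = x >= dl                                    # sorted, so non-detects come first
    fit = stats.linregress(q[det], np.log(x[det]))
    imputed = np.exp(fit.intercept + fit.slope * q[~det])
    return np.concatenate([imputed, x[det]]).mean()

rng = np.random.default_rng(1)
sample = rng.lognormal(mean=-1.0, sigma=1.0, size=200)
dl = 0.2                                             # detection limit
print(f"naive mean {naive_mean(sample, dl):.3f}  ROS mean {ros_mean(sample, dl):.3f}")
```

With many observations below the limit, the naive substitution pulls the low tail up to the detection limit, which is exactly the overestimation of minor sources the abstract describes.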

Relevance: 100.00%

Abstract:

High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but mainly increases the accuracy of the representation of these processes. This is notably the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space, because electrodes are correlated due to volume conduction; in time, because neighboring time points are correlated), while the degrees of freedom of the data change only little. This has to be taken into account when statistical inferences are to be made from the data. However, in many ERP studies, the intrinsic correlation structure of the data has been disregarded. Often, some electrodes or groups of electrodes are a priori selected as the analysis entity and considered as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented by the resulting statistics. In addition, the assumptions made (e.g. in terms of what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural "atomic" analysis entity of EEG and ERP data is the scalp electric field.
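
The redundancy argument can be illustrated numerically: mixing a handful of source processes into ever more electrodes barely changes the effective dimensionality of the data. The participation-ratio estimate below is a device chosen for this sketch, not a method from the chapter:

```python
import numpy as np

def effective_dims(data):
    """data: channels x time. Participation ratio of the eigenvalues of the
    channel covariance matrix, a rough count of independent processes."""
    cov = np.cov(data)
    ev = np.linalg.eigvalsh(cov)
    return ev.sum() ** 2 / (ev ** 2).sum()

rng = np.random.default_rng(2)
sources = rng.standard_normal((5, 2000))          # 5 underlying neural processes
for n_elec in (16, 64, 256):
    mixing = rng.standard_normal((n_elec, 5))     # volume conduction: broad fields
    eeg = mixing @ sources + 0.1 * rng.standard_normal((n_elec, 2000))
    print(n_elec, round(effective_dims(eeg), 1))  # stays near 5, not near n_elec
```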

Relevance: 100.00%

Abstract:

OBJECTIVES In dental research, multiple observations per patient, whether from several sites or from repeated time intervals, are commonplace. These clustered observations are not independent, and the statistical analysis should be adjusted accordingly. This study aimed to assess whether adjustment for clustering effects during statistical analysis was undertaken in five specialty dental journals. METHODS Thirty recent consecutive issues of each of the Orthodontics (OJ), Periodontology (PJ), Endodontology (EJ), Maxillofacial (MJ) and Paediatric Dentistry (PDJ) journals were hand searched. Articles requiring adjustment for clustering effects were identified and the statistical techniques used were scrutinized. RESULTS Of 559 studies considered to have inherent clustering effects, 223 (39.1%) adjusted for this in the statistical analysis. Studies published in the Periodontology specialty accounted for clustering effects more often than articles published in the other journals (OJ vs. PJ: OR=0.21, 95% CI: 0.12, 0.37, p<0.001; MJ vs. PJ: OR=0.02, 95% CI: 0.00, 0.07, p<0.001; PDJ vs. PJ: OR=0.14, 95% CI: 0.07, 0.28, p<0.001; EJ vs. PJ: OR=0.11, 95% CI: 0.06, 0.22, p<0.001). A positive correlation was found between the prevalence of clustering effects in an individual specialty journal and the correct statistical handling of clustering (r=0.89). CONCLUSIONS The majority (60.9%) of the studies examined in the five dental specialty journals failed to account for clustering effects in the statistical analysis where indicated, raising the possibility of inappropriately small p-values and the risk of unwarranted inferences.
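
For readers unfamiliar with what such an adjustment looks like in practice, below is a sketch of one common approach, a random-intercept mixed model per patient; the variable names and data are hypothetical and not drawn from the reviewed articles:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_patients, sites_per_patient = 40, 4
patient = np.repeat(np.arange(n_patients), sites_per_patient)
# shared within-patient variation makes the sites non-independent
patient_effect = np.repeat(rng.normal(0, 1.0, n_patients), sites_per_patient)
treated = rng.integers(0, 2, n_patients)[patient]
pocket_depth = 4 + 0.5 * treated + patient_effect + rng.normal(0, 0.5, len(patient))
df = pd.DataFrame({"pocket_depth": pocket_depth, "treated": treated,
                   "patient": patient})

# Naive OLS treats the 160 sites as independent and understates p-values;
# the mixed model attributes the shared variation to the patient level.
ols = smf.ols("pocket_depth ~ treated", df).fit()
mixed = smf.mixedlm("pocket_depth ~ treated", df, groups=df["patient"]).fit()
print(ols.pvalues["treated"], mixed.pvalues["treated"])
```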

Relevance: 100.00%

Abstract:

We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years, from 2009 to 2011. The interstellar neutral (ISN) O&Ne gas flow was found in the first-year heavy neutral map at 601 eV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps in order to obtain a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to separate the areas where the heavy neutral signal is statistically significant from the background, and to detect heavy neutral atom structures consistently. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O&Ne gas flow. We call this emission the extended tail. It may be an imprint of secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
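
As an illustration of the first of the three screening steps, a per-pixel signal-to-noise filter under Poisson counting statistics might look like the following sketch (the 3-sigma threshold and flat background model are assumptions made for illustration):

```python
import numpy as np

def snr_mask(counts, background, threshold=3.0):
    """Keep sky-map pixels whose counts exceed the expected background by
    `threshold` sigma, with sigma estimated from Poisson statistics."""
    sigma = np.sqrt(counts + background)           # Poisson error estimate
    snr = (counts - background) / np.maximum(sigma, 1e-9)
    return snr > threshold

rng = np.random.default_rng(4)
bg = np.full((30, 60), 2.0)                        # lat x lon background map
counts = rng.poisson(bg).astype(float)
counts[12:16, 40:46] += rng.poisson(8.0, (4, 6))   # injected "extended tail"
print(snr_mask(counts, bg).sum(), "significant pixels")
```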

Relevance: 70.00%

Abstract:

The purpose of this clinical trial was to determine the active tactile sensibility of natural teeth and to establish a statistical method for fitting a psychometric function to the observed data points. In 68 fully dentate subjects (34 male, 34 female; mean age 45.9 ± 16.1 years), one pair of healthy natural teeth per subject was tested: n = 24 anterior teeth and n = 44 posterior teeth. The computer-assisted, randomized measurements were made by having the subjects bite on thin copper foils of different thicknesses (5-200 µm) inserted between the teeth. The threshold of active tactile sensibility was defined as the 50% value of correct answers. Additionally, the gradient of the sensibility curve and the support area (90-10% value), describing the shape of the sensibility curve, were calculated. Both symmetric and asymmetric functions were used to model the sensibility curve. The mean sensibility threshold was 14.2 ± 12.1 µm. The older the subject, the higher the tactile threshold (r = 0.42, p = 0.0006). The support area was 41.8 ± 43.3 µm. The higher the 50% threshold, the smaller the gradient of the curve and the larger the support area. The curves of active tactile sensibility of natural teeth show a tendency towards asymmetry, so that the active tactile sensibility of natural teeth is best described mathematically by the asymmetric Weibull function.
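
A sketch of the final modeling step, fitting an asymmetric Weibull psychometric function and reading off the 50% threshold and the 90-10% support area; the exact parameterization and the response rates below are assumptions, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_psych(x, lam, k):
    """P(correct) as a function of foil thickness x in micrometres."""
    return 1.0 - np.exp(-(x / lam) ** k)

def thickness_at(p, lam, k):
    """Invert the Weibull: foil thickness at which P(correct) = p."""
    return lam * (-np.log(1.0 - p)) ** (1.0 / k)

thickness = np.array([5, 10, 20, 40, 80, 140, 200], dtype=float)  # foil set (µm)
p_correct = np.array([0.1, 0.3, 0.55, 0.8, 0.92, 0.97, 0.99])     # synthetic rates
(lam, k), _ = curve_fit(weibull_psych, thickness, p_correct, p0=[20, 1.5])

t50 = thickness_at(0.50, lam, k)                      # tactile threshold
support = thickness_at(0.90, lam, k) - thickness_at(0.10, lam, k)
print(f"threshold {t50:.1f} µm, support area {support:.1f} µm")
```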

Relevance: 70.00%

Abstract:

Seventeen bones (sixteen cadaveric bones and one plastic bone) were used to validate a method for reconstructing a surface model of the proximal femur from 2D X-ray radiographs and a statistical shape model constructed from thirty training surface models. Unlike previous validation studies, in which surface-based distance errors were used to evaluate reconstruction accuracy, here we propose to use errors measured on clinically relevant morphometric parameters. For this purpose, a program was developed to robustly extract these morphometric parameters from the thirty training surface models (the training population), from the seventeen surface models reconstructed from X-ray radiographs, and from the seventeen ground-truth surface models obtained either by a CT-scan reconstruction method or by a laser-scan reconstruction method. A statistical analysis was then performed to classify the seventeen test bones into two categories, normal cases and outliers, depending on the measured parameters of each test bone: if all parameters of a test bone were covered by the training population's parameter ranges, the bone was classified as normal; otherwise it was classified as an outlier. Our experimental results showed that there was no statistically significant difference between the morphometric parameters extracted from the reconstructed surface models of the normal cases and those extracted from the reconstructed surface models of the outliers. Therefore, our statistical shape model based reconstruction technique can be used to reconstruct not only the surface model of a normal bone but also that of an outlier bone.
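
The classification step can be stated compactly in code; the sketch below assumes simple per-parameter min-max ranges and invents example parameter names (neck-shaft angle, head radius, neck length) for illustration:

```python
import numpy as np

def classify(test_params, training_params):
    """test_params: (n_params,); training_params: (n_train, n_params).
    A bone is 'normal' if every parameter lies inside the range observed
    in the training population, otherwise an 'outlier'."""
    lo = training_params.min(axis=0)
    hi = training_params.max(axis=0)
    inside = (test_params >= lo) & (test_params <= hi)
    return "normal" if inside.all() else "outlier"

rng = np.random.default_rng(5)
# columns: neck-shaft angle (deg), head radius (mm), neck length (mm)
training = rng.normal([128, 24, 50], [4, 2, 4], size=(30, 3))
print(classify(np.array([130, 25, 52]), training))   # likely "normal"
print(classify(np.array([150, 25, 52]), training))   # likely "outlier"
```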

Relevance: 70.00%

Abstract:

With the advent of high-throughput sequencing (HTS), the emerging science of metagenomics is transforming our understanding of the relationships of microbial communities with their environments. While metagenomics aims to catalogue the genes present in a sample, metatranscriptomics, by assessing which genes are actively expressed, can provide a mechanistic understanding of community inter-relationships. To achieve these goals, several challenges need to be addressed, from sample preparation to sequence processing, statistical analysis and functional annotation. Here we use an inbred non-obese diabetic (NOD) mouse model, in which germ-free animals were colonized with a defined mixture of eight commensal bacteria, to explore methods of RNA extraction and to develop a pipeline for the generation and analysis of metatranscriptomic data. Applying the Illumina HTS platform, we sequenced 12 NOD cecal samples prepared using multiple RNA-extraction protocols. The absence of a complete set of reference genomes necessitated a peptide-based search strategy. Up to 16% of sequence reads could be matched to a known bacterial gene. Phylogenetic analysis of the mapped ORFs revealed a distribution consistent with ribosomal RNA, the majority from Bacteroides or Clostridium species. To place these HTS data within a systems context, we mapped the relative abundance of corresponding Escherichia coli homologs onto metabolic and protein-protein interaction networks. These maps identified bacterial processes with components that were well represented in the datasets. In summary, this study highlights the potential of exploiting the economy of HTS platforms for metatranscriptomics.
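
A peptide-based search strategy of this kind typically begins by translating each read in all six reading frames before matching against known bacterial proteins; the six-frame detail is an assumption here, not stated in the abstract. A minimal sketch of that translation step using Biopython:

```python
from Bio.Seq import Seq

def six_frame_peptides(read):
    """Translate a nucleotide read in all six reading frames
    (three offsets on each strand)."""
    fwd = Seq(read)
    rev = fwd.reverse_complement()
    frames = []
    for strand in (fwd, rev):
        for offset in range(3):
            sub = strand[offset:]
            sub = sub[: len(sub) - len(sub) % 3]   # trim to whole codons
            frames.append(str(sub.translate()))
    return frames

# Each peptide would then be searched against a bacterial protein database.
for pep in six_frame_peptides("ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"):
    print(pep)
```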