945 results for Statistical analysis methods


Relevance:

100.00%

Publisher:

Abstract:

Background: Measurements of hormone concentrations by immunoassays using a fluorescent tracer (Eu3+) are susceptible to the action of chemical agents that may alter the tracer's original structure. Our goal was to verify the effect of two types of anticoagulant on hormone assays performed by fluorometric (FIA) or immunofluorometric (IFMA) methods. Methods: Blood samples were obtained from 30 outpatients and drawn into EDTA, sodium citrate, and serum-separation Vacutainer® blood collection tubes. Samples were analyzed on automated AutoDelfia™ equipment (Perkin Elmer Brazil, Wallac, Finland) for the following hormones: luteinizing hormone (LH), follicle-stimulating hormone (FSH), prolactin (PRL), growth hormone (GH), sex hormone-binding globulin (SHBG), thyroid-stimulating hormone (TSH), insulin, C-peptide, total T3, total T4, free T4, estradiol, progesterone, testosterone, and cortisol. Statistical analysis was carried out with the Kruskal-Wallis test and Dunn's test. Results: No significant differences were seen between samples for LH, FSH, PRL, and free T4. Results for GH, TSH, insulin, C-peptide, SHBG, total T3, total T4, estradiol, testosterone, cortisol, and progesterone differed significantly between serum and EDTA-treated samples. Differences were also identified between serum and sodium citrate-treated samples for TSH, insulin, total T3, estradiol, testosterone, and progesterone. Conclusions: Hormone analyses performed by FIA or IFMA are susceptible to anticoagulants in the collected biological material, with effects that vary depending on the type of assay.
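
As a rough illustration of the statistical comparison described above, the sketch below runs a Kruskal-Wallis test followed by Dunn's post hoc test on three anticoagulant groups; the hormone values are invented, and the Dunn step assumes the third-party scikit-posthocs package rather than anything named in the study:

```python
import numpy as np
import pandas as pd
from scipy.stats import kruskal
import scikit_posthocs as sp  # third-party: pip install scikit-posthocs

# Illustrative TSH values (mIU/L) per tube type; not the study's data.
serum   = np.array([2.1, 1.8, 2.5, 3.0, 2.2])
edta    = np.array([1.7, 1.5, 2.0, 2.6, 1.9])
citrate = np.array([1.9, 1.6, 2.2, 2.8, 2.0])

# Omnibus Kruskal-Wallis test across the three anticoagulant groups.
h_stat, p_value = kruskal(serum, edta, citrate)
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_value:.4f}")

# Dunn's post hoc test with a correction for multiple pairwise comparisons.
data = pd.DataFrame({
    "value": np.concatenate([serum, edta, citrate]),
    "group": ["serum"] * 5 + ["EDTA"] * 5 + ["citrate"] * 5,
})
print(sp.posthoc_dunn(data, val_col="value", group_col="group",
                      p_adjust="bonferroni"))
```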

Relevance:

100.00%

Publisher:

Abstract:

While the use of statistical physics methods to analyze large corpora has been useful for unveiling many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., one written in an unknown alphabet) is compatible with a natural language and, if so, to which language it could belong. The approach is based on three types of statistical measurements: those obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts in which the text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependency of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree, and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript, which could be helpful in the effort to decipher it. Because we were able to identify statistical measurements that depend more on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications.
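
A minimal sketch of one of the network measurements mentioned above (degree assortativity of a word-adjacency graph), comparing a short text against a shuffled version of itself; the graph construction and the sample sentence are illustrative assumptions, not the paper's exact pipeline:

```python
import random
import networkx as nx

def word_adjacency_graph(tokens):
    """Build a graph linking each word to its immediate successor."""
    g = nx.Graph()
    g.add_edges_from(zip(tokens, tokens[1:]))
    return g

text = ("in the beginning was the word and the word was with god "
        "and the word was god").split()

g_real = word_adjacency_graph(text)

shuffled = text[:]
random.shuffle(shuffled)  # destroys syntax but keeps word frequencies
g_shuf = word_adjacency_graph(shuffled)

# Degree assortativity: do high-degree words attach to other hubs?
print("real:    ", nx.degree_assortativity_coefficient(g_real))
print("shuffled:", nx.degree_assortativity_coefficient(g_shuf))
```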

Relevance:

100.00%

Publisher:

Abstract:

This work provides a forward step in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we guide the reader through the fundamental notions of probability and stochastic processes, as well as stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focus on fractional Brownian motion (fBm) and its discrete-time increment process, fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modeling and estimation of long memory, or long-range dependence (LRD). Time series exhibiting long-range dependence are often observed in nature, especially in physics, meteorology, and climatology, but also in hydrology, geophysics, economics, and many other fields. We study LRD in depth, giving many real-data examples, providing statistical analysis, and introducing parametric estimation methods. We then introduce the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modeling systems with long-memory properties. After introducing the basic concepts, we provide many examples and applications. For instance, we investigate the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems that deviates from the classical exponential Debye pattern. We then focus on generalizations of the standard diffusion equation, passing through a preliminary study of the fractional forward drift equation. Such generalizations are obtained using fractional integrals and derivatives of distributed orders. To find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduce and study the generalized grey Brownian motion (ggBm), a parametric class of H-sssi processes whose marginal probability density functions evolve in time according to a partial integro-differential equation of fractional type. The ggBm is, of course, non-Markovian. Throughout the work, we remark many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All of these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focus on the subclass of processes with stationary increments. The ggBm is defined canonically in the so-called grey noise space; however, we are able to provide a characterization that is independent of the underlying probability space. We also point out that the generalized grey Brownian motion is a direct generalization of a Gaussian process; in particular, it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduce and analyze a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We start from the forward drift equation, which is made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation is interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then consider the subordinated process Y(t) = X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation involving the same memory kernel K(t). We develop several applications and derive exact solutions. Moreover, we consider different stochastic models for the given equations, providing path simulations.
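
As a minimal illustration of the long-range dependence estimation discussed above, the sketch below estimates the Hurst exponent of a series with the aggregated-variance method; this is a generic textbook estimator assumed for illustration, not the thesis's own procedure:

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes=(4, 8, 16, 32, 64)):
    """Estimate the Hurst exponent H from the scaling of block-mean variance.

    For stationary increments of an H-self-similar process,
    Var(block mean) ~ m**(2H - 2), so H is recovered from the slope
    of a log-log regression over block sizes m.
    """
    x = np.asarray(x, dtype=float)
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(means.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0

# White Gaussian noise has no memory, so the estimate should be near H = 0.5.
rng = np.random.default_rng(0)
print(hurst_aggregated_variance(rng.standard_normal(10_000)))
```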

Relevance:

100.00%

Publisher:

Abstract:

Hybrid vehicles represent the future for automakers, since they improve fuel economy and reduce pollutant emissions. A key component of the hybrid powertrain is the Energy Storage System, which determines the vehicle's ability to store and reuse energy. Though electrified Energy Storage Systems (ESS), based on batteries and ultracapacitors, are a proven technology, Alternative Energy Storage Systems (AESS), based on mechanical, hydraulic, and pneumatic devices, are gaining interest because they open the possibility of realizing low-cost mild-hybrid vehicles. Currently, most of the design-methodology literature focuses on electric ESS, and these methods are not suitable for AESS design. In this context, The Ohio State University has developed an Alternative Energy Storage System design methodology. This work focuses on the development of a driving cycle analysis methodology that is a key component of the Alternative Energy Storage System design procedure. The proposed methodology is based on a statistical approach to analyzing driving schedules that represent the vehicle's typical use. Driving data are broken up into a sequence of power events, namely traction and braking events, and for each of them energy-related and dynamic metrics are calculated. By means of a clustering process and statistical synthesis methods, statistically relevant metrics are determined. These metrics define cycle-representative braking events. By using these events as inputs for the Alternative Energy Storage System design methodology, different system designs are obtained, each characterized by attributes, namely system volume and weight. In the last part of the work, the designs are evaluated in simulation by introducing and calculating a metric related to energy conversion efficiency. Finally, the designs are compared accounting for attributes and efficiency values. In order to automate the driving data extraction and synthesis process, a dedicated MATLAB script has been developed. Results show that the driving cycle analysis methodology, based on the statistical approach, makes it possible to extract and synthesize cycle-representative data. The designs based on statistically relevant cycle metrics are properly sized and have satisfying efficiency values with respect to expectations. An exception is the design based on the cycle worst-case scenario, corresponding to the approach adopted by conventional electric ESS design methodologies: in this case, a heavy system with poor efficiency is produced. The proposed new methodology appears to be a valid and consistent support for Alternative Energy Storage System design.
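
A minimal sketch of the kind of event segmentation and clustering the methodology describes, assuming a wheel-power trace in kW sampled at 1 Hz; the braking threshold, the chosen features, and the cluster count are illustrative assumptions, not the thesis's parameters:

```python
import numpy as np
from sklearn.cluster import KMeans

def braking_events(power, dt=1.0):
    """Split a power trace into braking events (contiguous power < 0) and
    return (energy in kJ, peak power in kW, duration in s) per event."""
    events, start = [], None
    for i, p in enumerate(power):
        if p < 0 and start is None:
            start = i
        elif p >= 0 and start is not None:
            seg = power[start:i]
            events.append((-seg.sum() * dt, -seg.min(), (i - start) * dt))
            start = None
    return np.array(events)

rng = np.random.default_rng(1)
power = rng.normal(5, 20, size=3600)  # synthetic one-hour trace, not real data

feats = braking_events(power)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats)
for k in range(3):
    # Cluster centroids act as statistically representative braking events.
    print(f"cluster {k}: mean (energy, peak, duration) =",
          feats[labels == k].mean(axis=0).round(1))
```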

Relevance:

100.00%

Publisher:

Abstract:

Perfusion CT imaging of the liver has the potential to improve the evaluation of tumour angiogenesis. Quantitative parameters can be obtained by applying mathematical models to the Time Attenuation Curve (TAC). However, accurate quantification of perfusion parameters remains difficult, owing, for example, to the algorithms employed, the mathematical model, the patient's weight and cardiac output, and the acquisition system. In this thesis, new parameters and alternative methodologies for liver perfusion CT are presented in order to investigate the causes of variability in this technique. First, analyses were performed to assess the variability related to the mathematical model used to compute arterial Blood Flow (BFa) values. Results were obtained by implementing algorithms based on the "maximum slope method" and the "dual-input one-compartment model". Statistical analysis on simulated data demonstrated that the two methods are not interchangeable, although the slope method is always applicable in a clinical context. The variability related to TAC processing in the application of the slope method was then analyzed; comparison with manual selection made it possible to identify the best automatic algorithm for computing BFa. The consistency of a Standardized Perfusion Value (SPV) was evaluated and a simplified calibration procedure was proposed. Finally, the quantitative value of the perfusion maps was analyzed. The ROI approach and the map approach provide related values of BFa, which means that the pixel-by-pixel algorithm gives reliable quantitative results; also in the pixel-by-pixel approach, the slope method gives better results. In conclusion, the development of new automatic algorithms for a consistent computation of BFa, together with the analysis and definition of a simplified technique to compute the SPV parameter, represents an improvement in the field of liver perfusion CT analysis.
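
A minimal sketch of the maximum slope computation of BFa from time-attenuation curves; taking the peak tissue upslope divided by the peak arterial enhancement is the textbook form of the method, and the curves below are synthetic stand-ins for measured TACs:

```python
import numpy as np

def bfa_max_slope(t, tac_tissue, tac_artery):
    """Arterial blood flow by the maximum slope method:
    peak upslope of the tissue TAC divided by the arterial peak."""
    slope = np.gradient(tac_tissue, t)        # HU/s
    return slope.max() / tac_artery.max()     # 1/s (rescale to mL/min/100 g as needed)

t = np.arange(0, 40, 0.5)                     # seconds
tac_artery = 200 * np.exp(-((t - 12) / 4) ** 2)               # synthetic arterial input, HU
tac_tissue = 30 * (1 - np.exp(-np.clip(t - 8, 0, None) / 10)) # synthetic liver TAC, HU

print(f"BFa = {bfa_max_slope(t, tac_tissue, tac_artery):.4f} 1/s")
```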

Relevance:

100.00%

Publisher:

Abstract:

The present work is aimed at the study and analysis of defects detected in civil structures that are the object of civil litigation, in order to create instruments capable of helping the different actors involved in the building process. It is divided into three main sections. The first part focuses on the collection of data related to the civil proceedings of 2012 and on in-depth analyses of the main aspects regarding defects in existing buildings. The research center "Osservatorio Claudio Ceccoli" developed a system for collecting information coming from the civil proceedings of the Court of Bologna. Statistical analyses were performed, and the results are shown and discussed in the first chapters. The second part analyzes the main issues that emerged during the study of real cases, related to the activities of the technical consultant. The idea is to create documents, called "focuses", that clarify and codify specific problems, in order to develop guidelines that help the technical consultant draft the technical advice. The third part is centered on the evaluation of the methods used for data collection. The first results show that these are not efficient. Critical analysis of the database, of the results, and of the experience gained throughout allowed the data collection system to be improved.

Relevance:

100.00%

Publisher:

Abstract:

Revision hip arthroplasty is a surgical procedure consisting in the reconstruction of the hip joint through the replacement of a damaged hip prosthesis. Several factors may give rise to failure of the artificial device: aseptic loosening, infection, and dislocation represent the principal causes of failure worldwide. The main effect is the formation of bone defects in the region closest to the prosthesis, which weaken the bone structure needed for biological fixation of the new artificial hip. For this reason, bone reconstruction is necessary before the surgical revision operation. This work was born from the necessity to test the effects of bone reconstruction for particular bone defects in the acetabulum after hip prosthesis revision. In order to perform biomechanical in vitro tests on hip prostheses implanted in a human pelvis or hemipelvis, a practical definition of a reference frame for this kind of bone specimen is required. The aim of the current study is to create a repeatable protocol to align hemipelvic samples in the testing machine that relies on a reference system based on anatomical landmarks of the human pelvis. In chapter 1, a general overview of the human pelvic bone is presented: anatomy, bone structure, loads, and the principal devices for hip joint replacement. The purpose of chapter 2 is to identify the most common causes of revision hip arthroplasty, analysing data from the most reliable orthopaedic registries in the world. Chapter 3 presents an overview of the most used classifications for acetabular bone defects and fractures and the most common techniques for acetabular and bone reconstruction. After a critical review of the scientific literature about reference frames for the human pelvis, in chapter 4 the definition of a new reference frame is proposed. Based on this reference frame, the alignment protocol for the human hemipelvis is presented, together with the statistical analysis that confirms the good repeatability of the method.
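
A minimal sketch of how an orthonormal anatomical reference frame can be constructed from three digitized landmarks; the landmark choice (the two anterior superior iliac spines and the pubic symphysis), the axis conventions, and the coordinates are generic illustrative assumptions, not the protocol proposed in the thesis:

```python
import numpy as np

def anatomical_frame(asis_r, asis_l, pubis):
    """Return (origin, 3x3 rotation) of a landmark-based pelvic frame.

    x: from right ASIS toward left ASIS; y: from the inter-ASIS midpoint
    toward the pubic symphysis, orthogonalized; z: completes the triad.
    """
    origin = (asis_r + asis_l) / 2.0
    x = asis_l - asis_r
    x /= np.linalg.norm(x)
    y = pubis - origin
    y -= x * np.dot(y, x)           # Gram-Schmidt: remove the x component
    y /= np.linalg.norm(y)
    z = np.cross(x, y)              # right-handed completion
    return origin, np.column_stack([x, y, z])

# Illustrative digitized coordinates in mm (CT/machine space); not real data.
origin, R = anatomical_frame(np.array([ 120.0, 40.0, 10.0]),
                             np.array([-115.0, 42.0, 12.0]),
                             np.array([   2.0, -60.0, 35.0]))
print("origin:", origin)
print("rotation matrix:\n", R.round(3))
```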

Relevance:

100.00%

Publisher:

Abstract:

With the advent of high-throughput sequencing (HTS), the emerging science of metagenomics is transforming our understanding of the relationships of microbial communities with their environments. While metagenomics aims to catalogue the genes present in a sample, metatranscriptomics, by assessing which genes are actively expressed, can provide a mechanistic understanding of community inter-relationships. To achieve these goals, several challenges need to be addressed, from sample preparation to sequence processing, statistical analysis, and functional annotation. Here we use an inbred non-obese diabetic (NOD) mouse model, in which germ-free animals were colonized with a defined mixture of eight commensal bacteria, to explore methods of RNA extraction and to develop a pipeline for the generation and analysis of metatranscriptomic data. Applying the Illumina HTS platform, we sequenced 12 NOD cecal samples prepared using multiple RNA-extraction protocols. The absence of a complete set of reference genomes necessitated a peptide-based search strategy. Up to 16% of sequence reads could be matched to a known bacterial gene. Phylogenetic analysis of the mapped ORFs revealed a distribution consistent with ribosomal RNA, the majority from Bacteroides or Clostridium species. To place these HTS data within a systems context, we mapped the relative abundance of corresponding Escherichia coli homologs onto metabolic and protein-protein interaction networks. These maps identified bacterial processes whose components were well represented in the datasets. In summary, this study highlights the potential of exploiting the economy of HTS platforms for metatranscriptomics.
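
A minimal sketch of the tallying step such a pipeline needs once reads have been matched to genes: computing the fraction of mapped reads and per-genus relative abundance. The record format (read ID, matched gene, genus) is an illustrative assumption, not the study's data model:

```python
from collections import Counter

# Illustrative (read_id, matched_gene, genus) records; None = no match found.
hits = [
    ("r1", "rpoB", "Bacteroides"),
    ("r2", None, None),
    ("r3", "16S_rRNA", "Clostridium"),
    ("r4", "gyrA", "Bacteroides"),
    ("r5", None, None),
    ("r6", "16S_rRNA", "Bacteroides"),
]

matched = [h for h in hits if h[1] is not None]
print(f"mapped: {len(matched)}/{len(hits)} reads "
      f"({100 * len(matched) / len(hits):.0f}%)")

# Relative abundance per genus among mapped reads.
counts = Counter(genus for _, _, genus in matched)
for genus, n in counts.most_common():
    print(f"{genus}: {n / len(matched):.2%}")
```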

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Resonance frequency analysis (RFA) is a method of measuring implant stability. However, little is known about RFA of implants with long loading periods. The objective of the present study was to determine standard implant stability quotients (ISQs) for clinically successful, osseointegrated 1-stage implants in the edentulous mandible. MATERIALS AND METHODS: Stability measurements by means of RFA were performed in regularly followed patients who had received 1-stage implants for overdenture support. The time interval between implant placement and measurement ranged from 1 year up to 10 years. The short-term group comprised patients who were followed for up to 5 years, while the long-term group included patients with an observation time of > 5 years up to 10 years. For further comparison, RFA measurements were performed in a matching group with unloaded implants at the end of the surgical procedure. For statistical analysis, various parameters that might influence the ISQs of loaded implants were included, and a mixed-effects model was applied (regression analysis, P < .0125). RESULTS: Ninety-four patients were available with a total of 205 loaded implants, plus 16 patients with 36 implants measured immediately after the surgical procedure. The mean ISQ of all measured implants was 64.5 ± 7.9 (range, 58 to 72). Statistical analysis did not reveal significant differences in the mean ISQ related to observation time. The parameters with overall statistical significance were implant diameter and changes in the attachment level. In the short-term group, gender and the clinically measured attachment level had a significant effect; implant diameter had a significant effect in the long-term group. CONCLUSIONS: A mean ISQ of 64.5 ± 7.9 was found to be representative of stable asymptomatic interforaminal implants measured by the RFA instrument at any given time point. No significant differences in ISQ values were found between implants with different postsurgical time intervals. Implant diameter appears to influence the ISQ of interforaminal implants.
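
A minimal sketch of fitting the kind of mixed-effects regression described above (ISQ against implant-level covariates with a per-patient random intercept), using statsmodels; the column names, effect sizes, and random data are illustrative assumptions, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 205
df = pd.DataFrame({
    "patient": rng.integers(0, 94, n),          # grouping factor
    "diameter": rng.choice([3.3, 4.1, 4.8], n), # implant diameter, mm
    "attachment": rng.normal(2.5, 0.8, n),      # attachment level, mm
})
# Synthetic ISQ with a diameter effect plus noise; not study data.
df["isq"] = (50 + 3 * df["diameter"] - 1.5 * df["attachment"]
             + rng.normal(0, 4, n))

# A random intercept per patient accounts for multiple implants per patient.
model = smf.mixedlm("isq ~ diameter + attachment", df, groups=df["patient"])
print(model.fit().summary())
```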

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: This study analyzed the impact of the weight reduction method and of preoperative and intraoperative variables on the outcome of reconstructive body contouring surgery following massive weight reduction. METHODS: All patients with a maximal BMI ≥ 35 kg/m² before weight reduction who underwent body contouring surgery of the trunk following massive weight loss (excess body mass index loss (EBMIL) ≥ 30%) between January 2002 and June 2007 were retrospectively analyzed. Incomplete records or follow-up led to exclusion. Statistical analysis focused on the weight reduction method and on pre-, intra-, and postoperative risk factors. The outcome was compared to current literature results. RESULTS: A total of 104 patients were included (87 female and 17 male; mean age 47.9 years). Massive weight reduction was achieved through bariatric surgery in 62 patients (59.6%) and dietetically in 42 patients (40.4%). Dietetically achieved EBMIL (94.20%) was higher in this cohort than surgically induced EBMIL (80.80%; p < 0.01). Bariatric surgery did not present increased risks of complications for the secondary body contouring procedures. The observed complications (26.9%) were analyzed for risk factors. Total tissue resection weight was a significant risk factor (p < 0.05), and preoperative BMI had an impact on infections (p < 0.05). No impact on the postoperative outcome was detected for EBMIL, maximal BMI, smoking, hemoglobin, blood loss, body contouring technique, or operation time. Corrective procedures were performed in 11 patients (10.6%). The results were compared to recent data. CONCLUSION: Bariatric surgery does not increase the risk of complications in subsequent body contouring procedures when compared to massive dietetic weight reduction.
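
A minimal sketch of the excess-BMI-loss computation the study screens on; the usual convention takes 25 kg/m² as the upper bound of normal BMI, which is assumed here since the abstract does not state its reference value:

```python
def ebmil(bmi_max: float, bmi_current: float, bmi_ideal: float = 25.0) -> float:
    """Excess BMI loss in percent: the share of BMI above the ideal
    value (assumed 25 kg/m^2) that has been lost."""
    return 100.0 * (bmi_max - bmi_current) / (bmi_max - bmi_ideal)

# A patient going from BMI 42 to 28 has lost most of the excess BMI.
print(f"EBMIL = {ebmil(42.0, 28.0):.1f}%")   # 82.4%
# The study's inclusion threshold was EBMIL >= 30%.
print("eligible:", ebmil(42.0, 28.0) >= 30.0)
```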

Relevance:

100.00%

Publisher:

Abstract:

While voxel-based 3-D MRI analysis methods, as well as assessment of subtracted ictal versus interictal perfusion studies (SISCOM), have proven their potential in the detection of lesions in focal epilepsy, a combined approach has not yet been reported. The present study investigates whether individual automated voxel-based 3-D MRI analyses combined with SISCOM studies contribute to an enhanced detection of mesiotemporal epileptogenic foci. Seven consecutive patients with refractory complex partial epilepsy were prospectively evaluated by SISCOM and voxel-based 3-D MRI analysis. The functional perfusion maps and voxel-based statistical maps were coregistered in 3-D space. In five patients with temporal lobe epilepsy (TLE), the area of ictal hyperperfusion and the corresponding structural abnormalities detected by 3-D MRI analysis were identified within the same temporal lobe. In two patients, additional structural and functional abnormalities were detected beyond the mesial temporal lobe. Five patients with TLE underwent epilepsy surgery with favourable postoperative outcome (Engel class Ia and Ib) after 3-5 years of follow-up, while two patients remained on conservative treatment. In summary, multimodal assessment of structural abnormalities by voxel-based analysis and SISCOM may contribute to advanced observer-independent preoperative assessment of seizure origin.
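
A minimal sketch of the core SISCOM arithmetic: intensity-normalize the coregistered ictal and interictal volumes, subtract, and keep voxels above a z-threshold. The arrays stand in for already-coregistered SPECT volumes, and the 2-SD threshold is a common convention assumed here rather than this study's setting:

```python
import numpy as np

def siscom(ictal, interictal, z_thresh=2.0):
    """Return a masked difference map of ictal minus interictal perfusion.

    Both volumes are assumed to be already coregistered to the same grid.
    """
    # Normalize each volume to its mean intensity.
    ictal = ictal / ictal.mean()
    interictal = interictal / interictal.mean()
    diff = ictal - interictal
    # Keep only voxels more than z_thresh SDs above the mean difference.
    z = (diff - diff.mean()) / diff.std()
    return np.where(z > z_thresh, diff, 0.0)

rng = np.random.default_rng(7)
interictal = rng.normal(100, 10, size=(32, 32, 32))
ictal = interictal + rng.normal(0, 2, size=(32, 32, 32))
ictal[10:14, 10:14, 10:14] += 40          # synthetic ictal hyperperfusion focus

mask = siscom(ictal, interictal)
print("suprathreshold voxels:", int((mask > 0).sum()))
```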

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: Resonance frequency analysis (RFA) offers the opportunity to monitor the osseointegration of an implant in a simple, noninvasive way. A better comprehension of the relationship between RFA and parameters related to bone quality would therefore help clinicians improve diagnoses. In this study, a bone analog made from polyurethane foam was used to isolate the influences of bone density and cortical thickness in RFA. MATERIALS AND METHODS: Straumann standard implants were inserted in polyurethane foam blocks, and primary implant stability was measured with RFA. The blocks were composed of two superimposed layers with different densities. The top layer was dense to mimic cortical bone, whereas the bottom layer had a lower density to represent trabecular bone. Different densities for both layers and different thicknesses for the simulated cortical layer were tested, resulting in eight different block combinations. RFA was compared with two other mechanical evaluations of primary stability: removal torque and axial loading response. RESULTS: The primary stability measured with RFA did not correlate with the two other methods, but there was a significant correlation between removal torque and the axial loading response (P < .005). Statistical analysis revealed that each method was sensitive to different aspects of bone quality. RFA was the only method able to detect changes in both bone density and cortical thickness. However, changes in trabecular bone density were easier to distinguish with removal torque and axial loading than with RFA. CONCLUSIONS: This study shows that RFA, removal torque, and axial loading are sensitive to different aspects of the bone-implant interface. This explains the absence of correlation among the methods and proves that no standard procedure exists for the evaluation of primary stability.
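
A minimal sketch of the correlation comparison among the three stability measures, using Spearman's rank correlation from SciPy; the paired measurements are synthetic and constructed so that only torque and axial loading correlate, mirroring the reported pattern rather than reproducing the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 24  # e.g., three implants in each of eight block combinations

# Synthetic primary-stability readings per specimen; not the study's data.
removal_torque = rng.normal(35, 8, n)                        # Ncm
axial_loading = 0.8 * removal_torque + rng.normal(0, 3, n)   # correlated by design
isq = rng.normal(65, 6, n)                                   # RFA, uncorrelated here

pairs = [("torque vs axial", removal_torque, axial_loading),
         ("torque vs RFA", removal_torque, isq),
         ("axial vs RFA", axial_loading, isq)]
for name, a, b in pairs:
    rho, p = spearmanr(a, b)
    print(f"{name}: rho = {rho:+.2f}, p = {p:.3f}")
```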

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Little is known about the pathologic changes in the epidural space after intervertebral disk (IVD) extrusion in the dog. OBJECTIVES: To analyze the pathology of the epidural inflammatory response and to search for correlations between this process and clinical findings. METHODS: Clinical data from 105 chondrodystrophic (CD) and nonchondrodystrophic (NCD) dogs with IVD extrusion were recorded. Epidural material from these dogs was examined histopathologically and immunohistochemically. Using statistical analysis, we searched for correlations between the severity of epidural inflammation and various clinical and pathologic variables. RESULTS: Most dogs exhibited an epidural inflammatory response, ranging from acute invasion of neutrophils to formation of chronic granulation tissue. The mononuclear inflammatory infiltrates consisted mostly of monocytes and macrophages and only a few T and B cells. Surprisingly, chronic inflammatory patterns were also found in animals with an acute clinical history. Severity of the epidural inflammation correlated with the degree of epidural hemorrhage and nucleus pulposus calcification (P = .003 and P = .040) but not with age, chondrodystrophic phenotype, neurologic grade, back pain, pretreatment, or duration. The degree of inflammation was inversely correlated (P = .021) with the ability to regain ambulation. CONCLUSION AND CLINICAL IMPORTANCE: Epidural inflammation occurs in the majority of dogs with IVD extrusion and may develop long before the onset of clinical signs. The presence of calcified IVD material and hemorrhage in the epidural space may be the trigger of this lesion, rather than an adaptive immune response to the nucleus pulposus as suggested in previous studies. Because epidural inflammation may affect outcome, further research is warranted.
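
A minimal sketch of correlating an ordinal severity grade with other graded variables, in the spirit of the analysis above; Kendall's tau is one reasonable choice for small ordinal scales (the abstract does not name the test used), and the grades below are invented for illustration:

```python
import numpy as np
from scipy.stats import kendalltau

# Illustrative ordinal grades (0-3) and a binary outcome per dog; not study data.
inflammation = np.array([0, 1, 1, 2, 2, 2, 3, 3, 1, 0, 2, 3])
hemorrhage   = np.array([0, 0, 1, 2, 1, 2, 3, 2, 1, 1, 2, 3])
regained_ambulation = np.array([1, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0])

tau, p = kendalltau(inflammation, hemorrhage)
print(f"inflammation vs hemorrhage: tau = {tau:+.2f}, p = {p:.3f}")

# An inverse association with outcome shows up as a negative tau.
tau, p = kendalltau(inflammation, regained_ambulation)
print(f"inflammation vs ambulation: tau = {tau:+.2f}, p = {p:.3f}")
```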

Relevance:

100.00%

Publisher:

Abstract:

The vestibular system contributes to the control of posture and eye movements and is also involved in various cognitive functions including spatial navigation and memory. These functions are subtended by projections to a vestibular cortex, whose exact location in the human brain is still a matter of debate (Lopez and Blanke, 2011). The vestibular cortex can be defined as the network of all cortical areas receiving inputs from the vestibular system, including areas where vestibular signals influence the processing of other sensory (e.g. somatosensory and visual) and motor signals. Previous neuroimaging studies used caloric vestibular stimulation (CVS), galvanic vestibular stimulation (GVS), and auditory stimulation (clicks and short-tone bursts) to activate the vestibular receptors and localize the vestibular cortex. However, these three methods differ regarding the receptors stimulated (otoliths, semicircular canals) and the concurrent activation of the tactile, thermal, nociceptive and auditory systems. To evaluate the convergence between these methods and provide a statistical analysis of the localization of the human vestibular cortex, we performed an activation likelihood estimation (ALE) meta-analysis of neuroimaging studies using CVS, GVS, and auditory stimuli. We analyzed a total of 352 activation foci reported in 16 studies carried out in a total of 192 healthy participants. The results reveal that the main regions activated by CVS, GVS, or auditory stimuli were located in the Sylvian fissure, insula, retroinsular cortex, fronto-parietal operculum, superior temporal gyrus, and cingulate cortex. Conjunction analysis indicated that regions showing convergence between two stimulation methods were located in the median (short gyrus III) and posterior (long gyrus IV) insula, parietal operculum and retroinsular cortex (Ri). The only area of convergence between all three methods of stimulation was located in Ri. The data indicate that Ri, parietal operculum and posterior insula are vestibular regions where afferents converge from otoliths and semicircular canals, and may thus be involved in the processing of signals informing about body rotations, translations and tilts. Results from the meta-analysis are in agreement with electrophysiological recordings in monkeys showing main vestibular projections in the transitional zone between Ri, the insular granular field (Ig), and SII.
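
A minimal sketch of the ALE combination rule on a toy grid: each experiment's foci are smoothed with a Gaussian into a modeled-activation (MA) map, and the ALE value is the voxelwise union 1 - prod(1 - MA). The grid size, kernel width, and focus coordinates are illustrative assumptions, not the meta-analysis's actual parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

GRID = (40, 40, 40)

def ma_map(foci, sigma=2.0):
    """Modeled-activation map: Gaussian-smoothed foci, scaled to [0, 1]."""
    vol = np.zeros(GRID)
    for x, y, z in foci:
        vol[x, y, z] = 1.0
    vol = gaussian_filter(vol, sigma)
    return vol / vol.max()

# Illustrative foci (voxel coordinates) from three hypothetical experiments.
experiments = [
    [(20, 20, 20), (22, 19, 21)],   # e.g., a CVS study
    [(21, 21, 20)],                 # e.g., a GVS study
    [(20, 22, 19), (35, 10, 10)],   # e.g., an auditory study
]

# ALE: probability that at least one experiment activates the voxel.
ale = 1.0 - np.prod([1.0 - ma_map(f) for f in experiments], axis=0)
peak = np.unravel_index(ale.argmax(), ale.shape)
print("peak ALE voxel:", peak, "value:", round(float(ale.max()), 3))
```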

Relevance:

100.00%

Publisher:

Abstract:

In patients diagnosed with pharmaco-resistant epilepsy, the cerebral areas responsible for seizure generation can be defined by implantation of intracranial electrodes. The identification of the epileptogenic zone (EZ) is based on visual inspection of the intracranial electroencephalogram (IEEG) performed by highly qualified neurophysiologists. New computer-based quantitative EEG analyses have been developed in collaboration with the signal analysis community to expedite EZ detection. The aim of the present report is to compare different signal analysis approaches developed in four European laboratories working in close collaboration with four European Epilepsy Centers. Computer-based signal analysis methods were retrospectively applied to IEEG recordings performed in four patients undergoing pre-surgical exploration for pharmaco-resistant epilepsy. The four methods developed by the different teams to identify the EZ are based either on frequency analysis, on nonlinear signal analysis, on connectivity measures, or on statistical parametric mapping of epileptogenicity indices. All methods converge on the identification of the EZ in patients who present with fast activity at seizure onset. When traditional visual inspection was not successful in detecting the EZ on IEEG, the different signal analysis methods produced highly discordant results. Quantitative analysis of IEEG recordings complements clinical evaluation by contributing to the study of epileptogenic networks during seizures. We demonstrate that the sensitivity of different computer-based methods in detecting the EZ with respect to visual EEG inspection depends on the specific seizure pattern.
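
A minimal sketch in the spirit of the frequency-based approaches mentioned above: a ratio of fast-band to slow-band power per channel computed from a Welch spectrum, mirroring the emphasis on fast activity at seizure onset. The band limits, sampling rate, and synthetic signals are illustrative assumptions, not any team's published index:

```python
import numpy as np
from scipy.signal import welch

FS = 512  # Hz, assumed IEEG sampling rate

def fast_slow_ratio(sig, fs=FS):
    """Band power in 12.4-97 Hz divided by band power in 3.5-12.4 Hz,
    so that channels with added fast activity score higher."""
    f, pxx = welch(sig, fs=fs, nperseg=fs)
    fast = pxx[(f >= 12.4) & (f <= 97.0)].sum()
    slow = pxx[(f >= 3.5) & (f < 12.4)].sum()
    return fast / slow

rng = np.random.default_rng(5)
t = np.arange(0, 10, 1 / FS)
background = np.sin(2 * np.pi * 8 * t) + 0.3 * rng.standard_normal(t.size)
onset = background + 0.8 * np.sin(2 * np.pi * 40 * t)  # added fast activity

print(f"background channel:    {fast_slow_ratio(background):.2f}")
print(f"seizure-onset channel: {fast_slow_ratio(onset):.2f}")
```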