933 results for Non Parametric Methodology
Abstract:
This study evaluated the operator variability of different finishing and polishing techniques. After placing 120 composite restorations (Tetric EvoCeram) in plexiglass molds, the surface of the specimens was roughened in a standardized manner. Twelve operators with different experience levels polished the specimens using the following finishing/polishing procedures: method 1 (40 μm diamond [40D], 15 μm diamond [15D], 42 μm silicon carbide polisher [42S], 6 μm silicon carbide polisher [6S] and Occlubrush [O]); method 2 (40D, 42S, 6S and O); method 3 (40D, 42S, 6S and PoGo); method 4 (40D, 42S and PoGo) and method 5 (40D, 42S and O). The mean surface roughness (Ra) was measured with a profilometer. Differences between the methods were analyzed with non-parametric ANOVA and pairwise Wilcoxon signed rank tests (α = 0.05). All the restorations were qualitatively assessed using SEM. Methods 3 and 4 showed the best polishing results and method 5 demonstrated the poorest. Method 5 was also the most dependent on the skills of the operator. Except for method 5, all of the tested procedures reached a clinically acceptable surface polish of Ra ≤ 0.2 μm. Polishing procedures can be simplified without increasing variability between operators and without jeopardizing polishing results.
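A minimal sketch of how such a comparison could be run in Python. The abstract does not name the specific non-parametric ANOVA, so a Friedman test (paired by operator) is shown as one common choice, followed by pairwise Wilcoxon signed-rank tests; the Ra values below are synthetic stand-ins, not the study's data.

```python
# Hedged sketch: hypothetical Ra data, one value per operator per method.
from itertools import combinations
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
methods = {f"method_{i}": rng.lognormal(mean=-1.8 + 0.1 * i, sigma=0.2, size=12)
           for i in range(1, 6)}  # 12 operators x 5 methods (illustrative values only)

# Non-parametric "ANOVA" across the five methods (Friedman test, paired by operator).
stat, p = friedmanchisquare(*methods.values())
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Pairwise Wilcoxon signed-rank tests at alpha = 0.05.
for (name_a, a), (name_b, b) in combinations(methods.items(), 2):
    w, p_pair = wilcoxon(a, b)
    print(f"{name_a} vs {name_b}: W = {w:.1f}, p = {p_pair:.4f}")
```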
Abstract:
We present an automatic method to segment brain tissues from volumetric MRI brain tumor images. The method is based on non-rigid registration of an average atlas in combination with a biomechanically justified tumor growth model to simulate soft-tissue deformations caused by the tumor mass-effect. The tumor growth model, which is formulated as a mesh-free Markov Random Field energy minimization problem, ensures correspondence between the atlas and the patient image, prior to the registration step. The method is non-parametric, simple and fast compared to other approaches while maintaining similar accuracy. It has been evaluated qualitatively and quantitatively with promising results on eight datasets comprising simulated images and real patient data.
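To illustrate the Markov Random Field energy-minimization idea mentioned above in the simplest possible setting, the sketch below runs iterated conditional modes (ICM) on a small binary label grid with a data term and a smoothness term. This is only a generic illustration; it is not the paper's mesh-free, displacement-based tumor growth formulation, and the grid, weights and labels are all assumed for the example.

```python
# Hedged sketch: generic MRF energy minimization by iterated conditional modes (ICM).
import numpy as np

rng = np.random.default_rng(1)
observed = (rng.random((32, 32)) < 0.3).astype(int)   # noisy binary "tissue" map (synthetic)
labels = observed.copy()
beta = 1.5  # smoothness weight (assumed value)

def local_energy(lab, obs, i, j, value):
    data_term = int(value != obs[i, j])                # disagreement with the observation
    smooth = 0
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-neighbourhood
        ni, nj = i + di, j + dj
        if 0 <= ni < lab.shape[0] and 0 <= nj < lab.shape[1]:
            smooth += int(value != lab[ni, nj])
    return data_term + beta * smooth

for _ in range(5):                                     # a few ICM sweeps
    for i in range(labels.shape[0]):
        for j in range(labels.shape[1]):
            labels[i, j] = min((0, 1), key=lambda v: local_energy(labels, observed, i, j, v))

print("ICM sweeps complete; mean label after smoothing =", labels.mean())
```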
Abstract:
This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate hundreds of calibratable parameters in a methodical and optimum manner by using model-based optimization in conjunction with the manual process so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the data required for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multicylinder diesel engine have been examined from a model training perspective. A single-cylinder engine with external air-handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a large difference between exhaust and intake manifold pressures (ΔP) during transients, it has been recommended that transient emission models be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations are made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed. The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh air flowrates, while the second mode is driven by high engine ΔP and high EGR flowrates. The EGR fraction is inaccurately estimated in both modes, while EGR distribution has been shown to be present but unaccounted for by the ECM. The two modes and associated phenomena are essential to understanding why transient emission models are calibration dependent and, furthermore, how to choose training data that will result in good model generalization.
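One simple way to "process transient data to account for transport delays", as described above, is to estimate the delay between a command channel and a lagged measurement by cross-correlation and shift the measurement before model training. The sketch below uses synthetic signals and an assumed sampling interval; it is an illustration of the general technique, not the authors' specific method.

```python
# Hedged sketch: transport-delay estimation by cross-correlation (synthetic signals).
import numpy as np

rng = np.random.default_rng(2)
dt = 0.01                                               # assumed sampling interval [s]
t = np.arange(0, 20.0, dt)
command = (np.sin(0.5 * t) > 0).astype(float)           # synthetic fuelling steps
true_delay_s = 0.35
delay_samples = int(true_delay_s / dt)
# np.roll wraps around at the ends; ignored here for illustration.
measured = np.roll(command, delay_samples) + 0.05 * rng.standard_normal(t.size)

# Estimate the delay as the lag that maximizes the cross-correlation.
lags = np.arange(0, 200)
corr = [np.dot(command[:-lag or None], measured[lag:]) for lag in lags]
est = int(lags[int(np.argmax(corr))])
print(f"estimated transport delay ~ {est * dt:.2f} s (true {true_delay_s} s)")

# Shift the measured signal back by the estimated delay before model training.
aligned = np.roll(measured, -est)
```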
Abstract:
Smoke spikes occurring during transient engine operation have detrimental health effects and increase fuel consumption by requiring more frequent regeneration of the diesel particulate filter. This paper proposes a decision tree approach to real-time detection of smoke spikes for control and on-board diagnostics purposes. A contemporary, electronically controlled heavy-duty diesel engine was used to investigate the deficiencies of smoke control based on the fuel-to-oxygen-ratio limit. With the aid of transient and steady-state data analysis and empirical as well as dimensional modeling, it was shown that the fuel-to-oxygen ratio was not estimated correctly during the turbocharger lag period. This inaccuracy was attributed to the large manifold pressure ratios and low exhaust gas recirculation flows recorded during the turbocharger lag period, which meant that engine control module correlations for the exhaust gas recirculation flow and the volumetric efficiency had to be extrapolated. The engine control module correlations were based on steady-state data, and it was shown that, unless the turbocharger efficiency is artificially reduced, the large manifold pressure ratios observed during the turbocharger lag period cannot be achieved at steady state. Additionally, the cylinder-to-cylinder variation during this period was shown to be sufficiently significant to make the average fuel-to-oxygen ratio a poor predictor of the transient smoke emissions. The steady-state data also showed higher smoke emissions with higher exhaust gas recirculation fractions at constant fuel-to-oxygen-ratio levels. This suggests that, even if the fuel-to-oxygen ratios were to be estimated accurately for each cylinder, they would still be ineffective as smoke limiters. A decision tree trained on snap throttle data and pruned with engineering knowledge was able to use the inaccurate engine control module estimates of the fuel-to-oxygen ratio, together with information on the engine control module estimate of the exhaust gas recirculation fraction, the engine speed, and the manifold pressure ratio, to predict 94% of all spikes occurring over the Federal Test Procedure cycle. The advantages of this non-parametric approach over other commonly used parametric empirical methods, such as regression, were described. An application of accurate smoke spike detection, in which the injection pressure is increased at points with high opacity to reduce the cumulative particulate matter emissions substantially with a minimum increase in the cumulative nitrogen oxide emissions, was illustrated with dimensional and empirical modeling.
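A minimal sketch of a decision-tree smoke-spike classifier in the spirit of the abstract, using scikit-learn. The features mirror the quantities named above (ECM fuel-to-oxygen ratio estimate, EGR fraction estimate, engine speed, manifold pressure ratio), but the data and the labelling rule are synthetic assumptions, and a shallow tree depth stands in for "pruning with engineering knowledge".

```python
# Hedged sketch: decision-tree smoke-spike detection on synthetic stand-in data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
n = 5000
X = np.column_stack([
    rng.uniform(0.2, 1.0, n),      # ECM fuel-to-oxygen ratio estimate
    rng.uniform(0.0, 0.4, n),      # ECM EGR fraction estimate
    rng.uniform(800, 2200, n),     # engine speed [rpm]
    rng.uniform(0.8, 2.5, n),      # exhaust/intake manifold pressure ratio
])
# Synthetic rule standing in for real spike labels: rich mixture at a high
# pressure ratio (turbocharger lag) tends to produce a smoke spike.
y = ((X[:, 0] > 0.7) & (X[:, 3] > 1.8)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0)  # shallow tree as a stand-in for pruning
tree.fit(X_tr, y_tr)
print(f"held-out spike detection accuracy: {tree.score(X_te, y_te):.2%}")
```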
Abstract:
OBJECTIVES: Donation after circulatory declaration of death (DCDD) could significantly increase the number of cardiac grafts for transplantation. Graft evaluation is particularly important in the setting of DCDD given that conditions of cardio-circulatory arrest and warm ischaemia differ, leading to variable tissue injury. The aim of this study was to identify, at the time of heart procurement, means to predict contractile recovery following cardioplegic storage and reperfusion using an isolated rat heart model. Identification of reliable approaches to evaluate cardiac grafts is key in the development of protocols for heart transplantation with DCDD. METHODS: Hearts isolated from anaesthetized male Wistar rats (n = 34) were exposed to various perfusion protocols. To simulate DCDD conditions, rats were exsanguinated and maintained at 37°C for 15-25 min (warm ischaemia). Isolated hearts were perfused with modified Krebs-Henseleit buffer for 10 min (unloaded), arrested with cardioplegia, stored for 3 h at 4°C and then reperfused for 120 min (unloaded for 60 min, then loaded for 60 min). Left ventricular (LV) function was assessed using an intraventricular micro-tip pressure catheter. Statistical significance was determined using non-parametric Spearman rho correlation analysis. RESULTS: After 120 min of reperfusion, recovery of LV work, measured as the developed pressure (DP)-heart rate (HR) product, ranged from 0 to 15 ± 6.1 mmHg beats min⁻¹ 10⁻³ following warm ischaemia of 15-25 min. Several haemodynamic parameters measured during early, unloaded perfusion at the time of heart procurement, including HR and the peak systolic pressure-HR product, correlated significantly with contractile recovery after cardioplegic storage and 120 min of reperfusion (P < 0.001). Coronary flow, oxygen consumption and lactate dehydrogenase release also correlated significantly with contractile recovery following cardioplegic storage and 120 min of reperfusion (P < 0.05). CONCLUSIONS: Haemodynamic and biochemical parameters measured at the time of organ procurement could serve as predictive indicators of contractile recovery. We believe that evaluation of graft suitability is feasible prior to transplantation with DCDD and may, consequently, increase donor heart availability.
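A minimal sketch of the kind of Spearman rank correlation used above, relating an early unloaded-perfusion parameter to contractile recovery. The heart rate and recovery values below are synthetic placeholders, not the study's measurements.

```python
# Hedged sketch: Spearman rho between a procurement-time parameter and recovery.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
heart_rate_early = rng.uniform(120, 300, size=34)                # beats/min at procurement (synthetic)
recovery = 0.05 * heart_rate_early + rng.normal(0, 2, size=34)   # DP x HR product recovery (arbitrary units)

rho, p = spearmanr(heart_rate_early, recovery)
print(f"Spearman rho = {rho:.2f}, p = {p:.4g}")
```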
Abstract:
Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a "Simple Committee" technique that averaged predictions from a set of 10 input spaces pre-selected using the training data, and a "Minimum Variance Committee" technique in which the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (Best Combination Technique), the Simple Committee Technique and the Minimum Variance Committee Technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or the calibration of the underlying GT-Power model.
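A minimal sketch of the two committee ideas described above, using three stand-in regressors (linear regression, a small neural network, and k-nearest neighbours) trained on a few hypothetical input-space transformations. The data, the transformations and the response are all assumed for illustration; they stand in for the GT-Power-derived input spaces and the smoke response of the study.

```python
# Hedged sketch: "Simple Committee" vs "Minimum Variance Committee" on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(300, 3))
y = np.exp(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(300)   # nonlinear "smoke" response

def transform(X_raw, kind):
    """Hypothetical input-space transformations standing in for GT-Power outputs."""
    if kind == "raw":
        return X_raw
    if kind == "quad":
        return np.hstack([X_raw, X_raw ** 2])
    return np.hstack([X_raw, np.exp(X_raw)])

kinds = ["raw", "quad", "exp"]
models = [LinearRegression(),
          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
          KNeighborsRegressor(n_neighbors=5)]

x_new = rng.uniform(-1, 1, size=(1, 3))
# predictions[i, j] = model j trained in input space i, evaluated at x_new
predictions = np.empty((len(kinds), len(models)))
for i, kind in enumerate(kinds):
    Xi, xi = transform(X, kind), transform(x_new, kind)
    for j, model in enumerate(models):
        predictions[i, j] = model.fit(Xi, y).predict(xi)[0]

# "Simple Committee": average predictions over a fixed set of input spaces.
simple_committee = predictions.mean()
# "Minimum Variance Committee": use the space where the three models disagree least.
best_space = int(np.argmin(predictions.var(axis=1)))
min_var_committee = predictions[best_space].mean()
print(simple_committee, min_var_committee)
```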
Abstract:
This study examines the influence of recovery-oriented peer events on participants' recovery attitudes and explores who benefits most from such events. Changes in participants' recovery attitudes were evaluated (pre, post, follow-up) and compared with changes in control groups. Distributions of recovery-related values in subgroups were analyzed descriptively. The results of non-parametric tests (Friedman) showed that participants had significantly higher values on the dimension "Recovery is possible" directly after the interventions (P = 0.006), but not 6 months later, and not in comparison with members of control groups. On a descriptive level, women, participants with schizophrenia and those with two or more episodes of the disorder showed higher recovery-related values than men, participants with an affective disorder and those with only one episode. In their feedback, organizations and peers expressed a positive view of peer support, but evidence for a positive impact of the evaluated peer events on recovery attitudes is limited.
Abstract:
OBJECTIVE: To characterize the impact of hepatitis C (HCV) serostatus on adherence to antiretroviral treatment (ART) among HIV-infected adults initiating ART. METHODS: The British Columbia HIV/AIDS Drug Treatment Program distributes, at no cost, all ART in this Canadian province. Eligible individuals used triple combination ART as their first HIV therapy and had documented HCV serology. Statistical analyses used parametric and non-parametric methods, including multivariate logistic regression. The primary outcome was ≥95% adherence, defined as receiving ≥95% of prescription refills during the first year of antiretroviral therapy. RESULTS: There were 1186 patients eligible for analysis, including 606 (51%) positive for HCV antibody and 580 (49%) who were negative. In adjusted analyses, adherence was independently associated with HCV seropositivity [adjusted odds ratio (AOR), 0.48; 95% confidence interval (CI), 0.23-0.97; P = 0.003], higher plasma albumin levels (AOR, 1.07; 95% CI, 1.01-1.12; P = 0.002) and male gender (AOR, 2.53; 95% CI, 1.04-6.15; P = 0.017), but not with injection drug use (IDU), age or other markers of liver injury. There was no evidence of an interaction between HCV and liver injury in adjusted analyses; comparing different strata of HCV and IDU confirmed that HCV was associated with poor adherence independent of IDU. CONCLUSIONS: HCV-coinfected individuals and those with lower albumin are less likely to be adherent to their ART.
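A minimal sketch of a multivariate logistic regression for ≥95% adherence with adjusted odds ratios, fit with statsmodels. The covariates mirror those named above (HCV serostatus, plasma albumin, male gender), but the data are synthetic and the coefficients used to generate them are assumptions, not the study's estimates.

```python
# Hedged sketch: logistic regression and adjusted odds ratios on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 1186
hcv = rng.integers(0, 2, n)
albumin = rng.normal(40, 5, n)               # g/L
male = rng.integers(0, 2, n)
# Synthetic outcome loosely following the reported directions of effect.
lin = -3.0 - 0.7 * hcv + 0.07 * albumin + 0.9 * male
adherent = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

X = sm.add_constant(np.column_stack([hcv, albumin, male]))
fit = sm.Logit(adherent, X).fit(disp=False)
odds_ratios = np.exp(fit.params[1:])          # skip the intercept
for name, aor in zip(["HCV seropositive", "albumin (per g/L)", "male"], odds_ratios):
    print(f"AOR {name}: {aor:.2f}")
```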
Abstract:
The aim of many genetic studies is to locate the genomic regions (called quantitative trait loci, QTLs) that contribute to variation in a quantitative trait (such as body weight). Confidence intervals for the locations of QTLs are particularly important for the design of further experiments to identify the gene or genes responsible for the effect. Likelihood support intervals are the most widely used method to obtain confidence intervals for QTL location, but the non-parametric bootstrap has also been recommended. Through extensive computer simulation, we show that bootstrap confidence intervals are poorly behaved and so should not be used in this context. The profile likelihood (or LOD curve) for QTL location has a tendency to peak at genetic markers, and so the distribution of the maximum likelihood estimate (MLE) of QTL location has the unusual feature of point masses at genetic markers; this contributes to the poor behavior of the bootstrap. Likelihood support intervals and approximate Bayes credible intervals, on the other hand, are shown to behave appropriately.
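For reference, a minimal sketch of the generic non-parametric (percentile) bootstrap confidence interval, the procedure whose behaviour for QTL location is criticised above. The toy data and the median as the statistic are assumptions for illustration; a real QTL analysis would resample whole progeny records and re-fit the interval-mapping model in every replicate.

```python
# Hedged sketch: percentile bootstrap CI for a statistic (toy data).
import numpy as np

rng = np.random.default_rng(7)
data = rng.exponential(scale=2.0, size=200)        # toy trait-like sample
estimate = np.median(data)

boot = np.array([np.median(rng.choice(data, size=data.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"median = {estimate:.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```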
Abstract:
OBJECTIVES: This experiment was performed to evaluate clinically and histologically the effect of mechanical therapy with or without antiseptic therapy on peri-implant mucositis lesions in nine cynomolgus monkeys. MATERIAL AND METHODS: Two ITI titanium implants were inserted into each side of the mandibles. After 90 days of plaque control and soft tissue healing, a baseline clinical examination was completed. Peri-implant lesions were induced by placing silk ligatures and allowing plaque to accumulate for 6 weeks. The clinical examination was then repeated, and the monkeys were randomly assigned to three treatment groups: group A, mechanical cleansing only; group B, mechanical cleansing and local irrigation with 0.12% chlorhexidine (CHX) and application of 0.2% CHX gel; and group C, control, no treatment. The implants in treatment groups A and B were treated and maintained according to the assigned treatment for two additional months. At the end of the maintenance period, a final clinical examination was performed and the animals were sacrificed for biopsies. RESULTS: The mean probing depth (PD) values at mucositis were 3.5, 3.7, and 3.4 mm, and the clinical attachment level (CAL) values were 3.8, 4.1, and 3.9 mm for treatment groups A, B and C, respectively. The corresponding values after treatment were PD = 1.7, 2.1, and 2.5 mm, and CAL = 2.6, 2.6, and 3.1 mm. ANOVA of the mean changes (Δ) in PD and CAL after treatment showed no statistical difference between the treatment groups. Comparison of the mean changes in PD and CAL after treatment yielded statistical differences between the control and treatment groups (P < 0.01). According to the t-test, no statistical difference was found between treatment groups A and B for the PD reduction, but there was a significant difference for the CAL change (P < 0.03). Group A had significantly more recession and less CAL gain than group B. Non-parametric tests yielded no significant differences in the modified plaque index (mPlI) and gingival index (GI) after treatment between the two treatment groups. Frequencies and percent distributions of the mPlI and GI scores changed considerably for both treatment groups when compared with the changes in the control group after treatment. With regard to the histological evaluation, no statistical differences existed between the treatments for any linear measurement. The proportion of inflammation found in the mucosal tissues of the control implants was greater than that found for both treatment groups (P < 0.01). More importantly, both treatment groups showed a similarly low proportion of inflammation after 2 months of treatment. CONCLUSIONS: Within the limitations of this experiment, and considering the supportive plaque control rendered, it can be concluded that for pockets of 3-4 mm: (1) mechanical therapy alone or combined with CHX results in the clinical resolution of peri-implant mucositis lesions; (2) histologically, both treatments result in minimal inflammation compatible with health; and (3) the mechanical effect alone is sufficient to achieve clinical and histological resolution of mucositis lesions.
Abstract:
This paper proposes Poisson log-linear multilevel models to investigate population variability in sleep state transition rates. We specifically propose a Bayesian Poisson regression model that is more flexible, more scalable to larger studies, and more easily fitted than other approaches in the literature. We further use hierarchical random effects to account for pairings of individuals and for repeated measures within those individuals, as comparing diseased with non-diseased subjects while minimizing bias is of epidemiologic importance. We estimate essentially non-parametric piecewise constant hazards and smooth them, and allow for time-varying covariates and segment-of-the-night comparisons. The Bayesian Poisson regression is justified through a re-derivation of a classical algebraic likelihood equivalence between Poisson regression with a log(time) offset and survival regression assuming piecewise constant hazards. This relationship allows us to synthesize two methods currently used to analyze sleep transition phenomena: stratified multi-state proportional hazards models and log-linear models with GEE for transition counts. An example data set from the Sleep Heart Health Study is analyzed.
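A minimal sketch of the Poisson-with-log(time)-offset device referred to above: transition counts within segments of constant hazard are modelled with a Poisson GLM whose offset is the log of the exposure time, so the fitted coefficients are log rate ratios. The data, the single covariate and the rates below are synthetic assumptions, not the Sleep Heart Health Study, and the hierarchical/Bayesian extensions of the paper are omitted.

```python
# Hedged sketch: Poisson regression with a log(exposure) offset on synthetic segments.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 400
exposure = rng.uniform(5, 60, n)                  # minutes at risk in each segment
diseased = rng.integers(0, 2, n)                  # covariate of interest
rate = 0.05 * np.exp(0.5 * diseased)              # transitions per minute (assumed)
counts = rng.poisson(rate * exposure)

X = sm.add_constant(diseased)
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(exposure)).fit()
print(f"estimated log rate ratio (diseased vs not): {fit.params[1]:.3f}")
```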
Abstract:
The aim of this study was to identify quantitative trait loci (QTL) for osteochondrosis (OC) and palmar/plantar osseous fragments (POF) in fetlock joints in a whole-genome scan of 219 South German Coldblood horses. Symptoms of OC and POF were checked by radiography in 117 South German Coldblood horses at a mean age of 17 months. The radiographic examination comprised the fetlock and hock joints of all limbs. The genome scan included 157 polymorphic microsatellite markers. All microsatellite markers were equally spaced over the 31 autosomes and the X chromosome, with an average distance of 17.7 cM and a mean polymorphism information content (PIC) of 63%. Sixteen chromosomes harbouring putative QTL regions were further investigated by genotyping the animals with 93 additional markers. QTL that had chromosome-wide significance by non-parametric Z-means and LOD scores were found on 10 chromosomes. This included seven QTL for fetlock OC and one QTL on ECA18 associated with hock OC and fetlock OC. Significant QTL for POF in fetlock joints were located on equine chromosomes 1, 4, 8, 12 and 18. This genome scan is an important step towards the identification of genes responsible for OC in horses.
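For reference, the LOD score used above for declaring chromosome-wide significance is, in its standard form, the base-10 log-likelihood ratio comparing a QTL at the tested map position against no QTL; this is the generic definition, not a formula specific to this study.

```latex
\mathrm{LOD}(\theta) \;=\; \log_{10}
\frac{L(\text{data} \mid \text{QTL at position } \theta)}
     {L(\text{data} \mid \text{no QTL})}
```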
Abstract:
We performed a Rey visual design learning test (RVDLT) in 17 subjects and measured intervoxel coherence (IC) by DTI as an indication of connectivity, to investigate whether visual memory performance depends on white matter structure in healthy persons. IC considers the orientation of the adjacent voxels and has a better signal-to-noise ratio than the commonly used fractional anisotropy index. Voxel-based t-test analysis of the IC values was used to identify neighboring voxel clusters with significant differences between 7 low and 10 high test performers. We detected 9 circumscribed significant clusters (p < .01) with lower IC values in low performers than in high performers, with centers of gravity located in the left and right superior temporal regions, the corpus callosum, the left superior longitudinal fascicle, and the left optic radiation. Using non-parametric correlation analysis, IC and memory performance were significantly correlated in each of the 9 clusters (r = .61 to .81; df = 15, p < .01 to p < .0001). The findings provide in vivo evidence for the contribution of white matter structure to visual memory in healthy people.
Abstract:
BACKGROUND: Surfactant protein D (SP-D) deficient mice develop emphysema-like pathology associated with focal accumulations of foamy alveolar macrophages, an excess of surfactant phospholipids in the alveolar space and both hypertrophy and hyperplasia of alveolar type II cells. These findings are associated with a chronic inflammatory state. Treatment of SP-D deficient mice with a truncated recombinant fragment of human SP-D (rfhSP-D) has been shown to decrease the lipidosis and alveolar macrophage accumulation as well as the production of proinflammatory chemokines. The aim of this study was to investigate whether rfhSP-D treatment reduces the structural abnormalities in parenchymal architecture and type II cells characteristic of SP-D deficiency. METHODS: SP-D knock-out mice aged 3 weeks, 6 weeks and 9 weeks were treated with rfhSP-D for 9, 6 and 3 weeks, respectively. All mice were sacrificed at age 12 weeks and compared with both PBS-treated SP-D deficient and wild-type groups. Lung structure was quantified by design-based stereology at the light and electron microscopic level. Emphasis was put on quantification of emphysema, type II cell changes and intracellular surfactant. Data were analysed with the two-sided non-parametric Mann-Whitney U-test. MAIN RESULTS: After 3 weeks of treatment, alveolar number was higher and mean alveolar size was smaller compared with saline-treated SP-D knock-out controls. There was no significant difference in these indices of pulmonary emphysema within the rfhSP-D treated groups. Type II cell number and size were smaller as a consequence of treatment. The total volume of lamellar bodies per type II cell and per lung was smaller after 6 weeks of treatment. CONCLUSION: Treatment of SP-D deficient mice with rfhSP-D leads to a reduction in the degree of emphysema and a correction of type II cell hyperplasia and hypertrophy. This supports the concept that rfhSP-D might become a therapeutic option in diseases that are characterized by decreased SP-D levels in the lung.
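A minimal sketch of the two-sided Mann-Whitney U test used above, comparing a stereological measurement (for example mean alveolar size) between a treated and a control group. Group sizes and values are synthetic placeholders, not the study's data.

```python
# Hedged sketch: two-sided Mann-Whitney U test on synthetic group values.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(9)
treated = rng.lognormal(mean=3.0, sigma=0.2, size=8)    # rfhSP-D-treated group (synthetic)
control = rng.lognormal(mean=3.3, sigma=0.2, size=8)    # saline-treated knockouts (synthetic)

u, p = mannwhitneyu(treated, control, alternative="two-sided")
print(f"U = {u:.1f}, p = {p:.4f}")
```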
Abstract:
The Zagros oak forests in Western Iran are critically important to the sustainability of the region. These forests have undergone dramatic declines in recent decades. We evaluated the utility of the non-parametric Random Forest classification algorithm for land cover classification of Zagros landscapes, and selected the best spatial and spectral predictive variables. The algorithm resulted in high overall classification accuracies (>85%) and equivalent classification accuracies for the datasets from the three different sensors. We evaluated the associations between trends in forest area and structure and trends in socioeconomic and climatic conditions, to identify the most likely driving forces of deforestation and landscape structure change. We used available socioeconomic (urban and rural population, and rural income) and climatic (mean annual rainfall and mean annual temperature) data for two provinces in northern Zagros. The driving force most strongly correlated with forest area loss was urban population, with climatic variables correlated to a lesser extent. Landscape structure changes were more closely associated with rural population. We examined the effects of scale changes on the results of spatial pattern analysis. We assessed the impacts of eight years of protection in a protected area in northern Zagros at two different scales (both grain and extent). The effect of protection on the amount and structure of forests was scale dependent. We evaluated the nature and magnitude of changes in forest area and structure over the entire Zagros region from 1972 to 2009. We divided the Zagros region into 167 landscape units and developed two measures, Deforestation Sensitivity (DS) and Connectivity Sensitivity (CS), for each landscape unit as the percentage of time steps in which forest area and ECA, respectively, decreased by more than 10%. A considerable loss in forest area and connectivity was detected, but no sudden (nonlinear) changes were detected at the spatial and temporal scale of the study. Connectivity loss occurred more rapidly than forest loss due to the loss of connecting patches. More connectivity was lost in southern Zagros due to climatic differences and different forms of traditional land use.
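A minimal sketch of a Random Forest land-cover classifier of the kind evaluated above, using scikit-learn. The six predictor "bands", the class labels and the labelling rule are synthetic assumptions standing in for the multispectral imagery and training samples of the study.

```python
# Hedged sketch: Random Forest land-cover classification on synthetic bands.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(10)
n = 3000
bands = rng.random((n, 6))                               # six stand-in spectral/spatial predictors
# Synthetic rule standing in for forest / sparse woodland / non-forest classes.
labels = np.where(bands[:, 3] > 0.6, 2,
                  np.where(bands[:, 0] + bands[:, 1] > 1.0, 1, 0))

X_tr, X_te, y_tr, y_te = train_test_split(bands, labels, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)
print(f"overall accuracy: {rf.score(X_te, y_te):.2%}")
print("variable importances:", np.round(rf.feature_importances_, 3))
```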