948 results for statistical methods
Abstract:
Albumin concentrations in female Bronze turkeys were compared using agarose gel electrophoresis (AGE) and the bromocresol green (BCG) dye-binding method. The correlation coefficient between the BCG and AGE albumin values was low, and a paired t-test showed a significant difference between the methods (p < 0.0001). Compared with electrophoresis, the BCG dye-binding method yielded significantly lower albumin values for female turkeys during the laying season. © 2011 Springer-Verlag London Limited.
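As a sketch of the comparison this abstract describes, the correlation coefficient and paired t-test can be computed with SciPy; the albumin values below are invented placeholders, not the turkey data.

```python
# A minimal sketch of the method comparison above, assuming paired
# albumin measurements per bird; all values are invented placeholders.
import numpy as np
from scipy import stats

age = np.array([1.52, 1.48, 1.61, 1.55, 1.47, 1.58, 1.50, 1.63])  # g/dL, electrophoresis
bcg = np.array([1.10, 1.21, 1.05, 1.18, 1.02, 1.15, 1.25, 1.08])  # g/dL, dye binding

r, _ = stats.pearsonr(age, bcg)   # agreement between the two methods
t, p = stats.ttest_rel(age, bcg)  # paired t-test on the same birds
print(f"r = {r:.2f}, t = {t:.2f}, p = {p:.4f}")
```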
Abstract:
Objectives: This study investigated the effect of extreme cooling methods on the flexural strength, reliability and shear bond strength of veneer porcelain for zirconia. Methods: Vita VM9 porcelain was sintered onto zirconia bar specimens and cooled by one of the following methods: inside a switched-off furnace (slow), at room temperature (normal) or immediately under compressed air (fast). Three-point flexural strength (FS) tests were performed on specimens with porcelain under tension (PT, n = 30) and zirconia under tension (ZT, n = 30). Shear bond strength (SBS, n = 15) tests were performed on cylindrical blocks of porcelain applied to zirconia plates. Data were submitted to one-way ANOVA and Tukey's post hoc tests (p < 0.05). Weibull analysis was performed on the PT and ZT configurations. Results: One-way ANOVA for the PT configuration was significant, and Tukey's test revealed that fast cooling led to significantly higher values (p < 0.01) than the other cooling methods. One-way ANOVA for the ZT configuration was not significant (p = 0.06). Weibull analysis showed that normal cooling had slightly higher reliability for both the PT and ZT configurations. Statistical tests showed that slow cooling decreased the SBS value (p < 0.01) and produced fewer adhesive fracture modes than the other cooling methods. Clinical Significance: Slow cooling seems to impair the veneer's strength and its adhesion to the zirconia core; however, the reliability of fast cooling was slightly lower than that of the other methods. © 2013 Elsevier Ltd.
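The ANOVA/Tukey/Weibull pipeline named above can be illustrated with a short, hedged sketch; the strength values and group sizes are invented, and `pairwise_tukeyhsd` comes from statsmodels.

```python
# A minimal sketch of the analysis pipeline named above (one-way ANOVA,
# Tukey's HSD, Weibull fit); the strength values are invented placeholders.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

slow = np.array([78.1, 80.4, 75.9, 79.3, 77.2])    # MPa
normal = np.array([82.5, 84.1, 81.0, 83.7, 80.9])
fast = np.array([91.2, 93.8, 90.5, 92.1, 94.0])

# One-way ANOVA across the three cooling protocols.
f, p = stats.f_oneway(slow, normal, fast)
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

# Tukey's post hoc test locates which groups differ.
values = np.concatenate([slow, normal, fast])
groups = ["slow"] * 5 + ["normal"] * 5 + ["fast"] * 5
print(pairwise_tukeyhsd(values, groups, alpha=0.05))

# Two-parameter Weibull fit; the modulus (shape) gauges reliability.
shape, loc, scale = stats.weibull_min.fit(fast, floc=0)
print(f"Weibull modulus m = {shape:.1f}, characteristic strength = {scale:.1f} MPa")
```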
Abstract:
Questions: We assess gap size and shape distributions, two important descriptors of the forest disturbance regime, by asking: which statistical model best describes the gap size distribution; can simple geometric forms adequately describe gap shape; does gap size or shape vary with forest type, gap age or the method used for gap delimitation; and how similar are the studied forests to other tropical and temperate forests? Location: Southeastern Atlantic Forest, Brazil. Methods: Analysing over 150 gaps in two distinct forest types (seasonal and rain forests), we used a model selection framework to select appropriate probability distributions and functions to describe gap size and gap shape. The former was described using univariate probability distributions, whereas the latter was assessed based on the gap area-perimeter relationship. Comparisons of gap size and shape between sites, as well as between size and age classes, were then made based on the likelihood of models having different assumptions for the values of their parameters. Results: The log-normal distribution was the best descriptor of gap size distribution, independently of forest type or gap delimitation method. Because gaps became more irregular as they increased in size, all geometric forms (triangle, rectangle and ellipse) were poor descriptors of gap shape. Only when small and large gaps (>100 or 400 m², depending on the delimitation method) were treated separately did the rectangle and isosceles triangle become accurate predictors of gap shape; ellipsoidal shapes remained poor descriptors. At both sites, gaps were at least 50% longer than they were wide, a finding with important implications for gap microclimate (e.g. light entrance regime) and, consequently, for gap regeneration. Conclusions: In addition to more appropriate descriptions of gap size and shape, the model selection framework used here efficiently provided a means by which to compare the patterns of two different forest types. With this framework we were able to recommend the log-normal parameters μ and σ for future comparisons of gap size distribution, and to propose possible mechanisms related to random rates of gap expansion and closure. We also showed that gap shape was highly variable and that no single geometric form was able to predict the shape of all gaps; the ellipse in particular should no longer be used as a standard gap shape. © 2012 International Association for Vegetation Science.
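A hedged sketch of the model selection step: candidate distributions are fitted by maximum likelihood and compared by AIC, under which a log-normal sample is correctly preferred. The data are simulated and the candidate set is an assumption; the abstract does not state the paper's exact candidates or criterion.

```python
# A minimal sketch of distribution selection by information criterion;
# the gap areas are simulated, not the Atlantic Forest data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gap_areas = rng.lognormal(mean=4.0, sigma=0.8, size=150)  # m², synthetic

candidates = {
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
    "gamma": stats.gamma,
}

for name, dist in candidates.items():
    params = dist.fit(gap_areas)                       # maximum likelihood fit
    loglik = np.sum(dist.logpdf(gap_areas, *params))
    aic = 2 * len(params) - 2 * loglik                 # lower AIC = better descriptor
    print(f"{name:12s} AIC = {aic:.1f}")
```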
Abstract:
The purpose of this study was to compare, using cephalometric analysis (McNamara, and Legan and Burstone), prediction tracings performed by three different methods (manual, and the Dentofacial Planner Plus and Dolphin Image computer programs) with postoperative outcomes. Pre- and postoperative (6 months after surgery) lateral cephalometric radiographs were selected from 25 long-faced patients treated with combined surgery. Prediction tracings were made with each method and compared cephalometrically with the postoperative results. This protocol was repeated once more for method-error evaluation. Statistical analysis was performed with ANOVA and the Tukey test. The results showed superior predictability for the manual method (50% similarity to postoperative results), followed by Dentofacial Planner Plus (31.2%) and Dolphin Image (18.8%). Under the experimental conditions, the manual method provided greater accuracy, although the predictability of the digital methods proved quite satisfactory. © 2013 World Federation of Orthodontists.
Abstract:
Background: The purpose of this study is to analyze the stress distribution in the bone tissue around implants with different angulations (0, 17, and 30 degrees) and connections (external hexagon and tapered) through three-dimensional finite element and statistical analyses. Methods: Twelve configurations of three-dimensional finite element models were simulated, combining three implant inclinations (0, 17, and 30 degrees), two connections (external hexagon and tapered), and two load applications (axial and oblique). The maximum principal stress values for cortical bone were measured at the mesial, distal, buccal, and lingual regions around the implant for each analyzed situation, totaling 48 groups. Loads of 200 and 100 N were applied at the occlusal surface in the axial and oblique directions, respectively. Maximum principal stress values were measured at the bone crest and statistically analyzed using analysis of variance. Stress patterns in the bone tissue around the implant were analyzed qualitatively. Results: Under oblique loading, the external hexagon connection showed significantly higher stress concentrations in the bone tissue (P < 0.05) than the tapered connection. Moreover, the buccal and mesial regions of the cortical bone concentrated significantly higher stress (P < 0.005) for the external hexagon implant type. Under oblique loading, increased external hexagon implant angulation induced a significantly higher stress concentration (P = 0.045). Conclusions: The study results show that: 1) the oblique load was more damaging to bone tissue, mainly when associated with external hexagon implants; and 2) there was a higher stress concentration in the buccal region than in all other regions under oblique load.
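For readers unfamiliar with the quantity being reported, here is a minimal sketch of how a maximum principal stress is obtained from a stress tensor (it is the largest eigenvalue of the symmetric tensor); the tensor values are invented, not taken from the finite element models.

```python
# A minimal sketch of extracting the maximum principal stress at one
# cortical-bone node; the stress tensor below is an invented example.
import numpy as np

# Symmetric 3x3 stress tensor (MPa).
sigma = np.array([
    [12.0,  3.5,  1.2],
    [ 3.5,  8.0, -2.1],
    [ 1.2, -2.1,  5.5],
])

# Principal stresses are the eigenvalues of the stress tensor;
# the maximum principal stress is the largest (most tensile) one.
principal = np.linalg.eigvalsh(sigma)
print(f"max principal stress = {principal.max():.2f} MPa")
```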
Abstract:
The algorithm creates a buffer area around the cartographic features of interest in one of the images and compares it with the other image. During the comparison, the algorithm counts the matching and differing points and uses these counts to compute the statistical measures of the analysis. One such measure is correctness, which tells the user the percentage of extracted points that were correctly extracted. Another is completeness, which gives the percentage of points truly belonging to the feature of interest that were recovered. The third measure expresses the overall quality achieved by the extraction method, since the algorithm combines the previously computed correctness and completeness to calculate it. In all tests performed with this algorithm, the computed statistical values quantitatively represented the quality achieved by the extraction method under evaluation. It is therefore possible to say that the developed algorithm can be used to analyze methods for extracting cartographic features of interest, since the results obtained were promising.
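As a hedged illustration, the three measures can be computed from counts of true-positive (TP), false-positive (FP) and false-negative (FN) points; the formulas below follow the standard definitions used in the feature-extraction literature, which the abstract does not spell out, and the counts are invented.

```python
# A minimal sketch of the three quality measures, assuming the standard
# TP/FP/FN definitions; the example counts are invented.
def correctness(tp: int, fp: int) -> float:
    """Share of extracted points that match the reference feature."""
    return tp / (tp + fp)

def completeness(tp: int, fn: int) -> float:
    """Share of reference-feature points that were actually extracted."""
    return tp / (tp + fn)

def quality(tp: int, fp: int, fn: int) -> float:
    """Combined measure built from correctness and completeness."""
    return tp / (tp + fp + fn)

# Example: 820 matching points, 90 extra, 110 missed.
print(correctness(820, 90), completeness(820, 110), quality(820, 90, 110))
```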
Abstract:
The purpose of this study was to compare the quantity and quality of platelets in platelet-rich plasma (PRP) samples prepared using either a single- or a double-centrifugation protocol. Ten adult white New Zealand rabbits were used. Ten ml of blood were drawn from each animal via cardiac puncture. Each blood sample was divided into two equal parts for PRP preparation: 5 ml were centrifuged according to a single-centrifugation protocol (Group I), and 5 ml according to a double-centrifugation protocol (Group II). Manual platelet counts were performed on the whole blood and PRP samples of each group. Smears were also made of all samples to assess platelet morphology. The manual platelet count data were submitted to statistical analysis (repeated-measures ANOVA, Tukey, P < .05). The average whole-blood platelet count was 446,389/μl. The PRP samples in Group II presented a significantly higher mean platelet count than those in Group I (1,986,875 ± 685,020/μl and 781,875 ± 217,693/μl, respectively). The PRP smears from Group II were the only ones to present platelets with altered morphology (75% of the smears). A few lymphocytes with increased cytoplasm were observed in the PRP smears of both Group I (25% of the smears) and Group II (62.5% of the smears). Within the limits of this study, it can be concluded that the double-centrifugation protocol resulted in higher platelet concentrations than the single-centrifugation protocol. However, the double-centrifugation protocol caused alterations in platelet morphology and was more sensitive to small processing errors.
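A hedged sketch of the repeated-measures ANOVA mentioned above, using statsmodels' `AnovaRM`; the long-format layout, column names and counts are illustrative assumptions, not the study's data.

```python
# A minimal sketch of a repeated-measures ANOVA on platelet counts,
# assuming hypothetical long-format data; values are invented.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# One row per rabbit per preparation (whole blood, single-spin, double-spin PRP).
data = pd.DataFrame({
    "rabbit": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "preparation": ["whole", "single", "double"] * 3,
    "platelets_per_ul": [446000, 781000, 1986000,
                         450000, 800000, 2100000,
                         440000, 760000, 1900000],
})

# Within-subject factor: preparation; subject identifier: rabbit.
result = AnovaRM(data, depvar="platelets_per_ul",
                 subject="rabbit", within=["preparation"]).fit()
print(result)
```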
Abstract:
We develop spatial statistical models for stream networks that can estimate relationships between a response variable and other covariates, make predictions at unsampled locations, and predict an average or total for a stream or a stream segment. There have been very few attempts to develop valid spatial covariance models that incorporate flow, stream distance, or both. The application of typical spatial autocovariance functions based on Euclidean distance, such as the spherical covariance model, is not valid when using stream distance. In this paper we develop a large class of valid models that incorporate flow and stream distance by using spatial moving averages. These methods integrate a moving average function, or kernel, against a white noise process. By running the moving average function upstream from a location, we develop models that use flow, and by construction they are valid models based on stream distance. We show that with proper weighting, many of the usual spatial models based on Euclidean distance have a counterpart for stream networks. Using sulfate concentrations from an example data set, the Maryland Biological Stream Survey (MBSS), we show that models using flow may be more appropriate than models that only use stream distance. For the MBSS data set, we use restricted maximum likelihood to fit a valid covariance matrix that uses flow and stream distance, and then we use this covariance matrix to estimate fixed effects and make kriging and block kriging predictions.
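A hedged sketch of the "tail-up" flavor of such models: an exponential kernel run upstream yields nonzero covariance only between flow-connected sites, scaled by flow-based weights. The function below is a simplification for illustration (the paper develops the full construction); the distances, weights and parameter values are invented.

```python
# A minimal sketch of a tail-up, exponential-kernel stream covariance;
# all inputs are illustrative, not the MBSS data.
import numpy as np

def tail_up_exponential(stream_dist, flow_connected, weights,
                        sigma2=1.0, alpha=10.0):
    """Covariance between pairs of stream sites.

    stream_dist    : matrix of hydrologic (stream) distances
    flow_connected : boolean matrix, True where water flows from one
                     site to the other
    weights        : flow-based weight matrix (e.g. from segment
                     proportional influence), needed for validity
    """
    cov = sigma2 * np.exp(-stream_dist / alpha) * np.sqrt(weights)
    # Tail-up models assign zero covariance to flow-unconnected pairs.
    return np.where(flow_connected, cov, 0.0)

# Three sites: 1 and 2 flow-connected, site 3 on a different branch.
d = np.array([[0., 4., 7.], [4., 0., 3.], [7., 3., 0.]])
fc = np.array([[True, True, False], [True, True, False], [False, False, True]])
w = np.ones((3, 3))
print(tail_up_exponential(d, fc, w))
```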
Abstract:
Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
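A minimal simulation sketch of the hierarchy the article describes, with a parameter model, a latent process model with process error, and a data model with measurement error; the linear growth form and all numeric values are illustrative assumptions.

```python
# A minimal sketch of a data/process/parameter hierarchy, simulated
# with NumPy; the model form and values are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Parameter model: prior uncertainty in the rate governing the process.
growth_rate = rng.normal(loc=0.5, scale=0.1)

# Process model: a latent population trajectory with process error.
n_years = 20
true_abundance = np.empty(n_years)
true_abundance[0] = 100.0
for t in range(1, n_years):
    process_error = rng.normal(0.0, 5.0)
    true_abundance[t] = true_abundance[t - 1] + growth_rate * 10 + process_error

# Data model: observed counts = latent state + measurement error.
observed = true_abundance + rng.normal(0.0, 8.0, size=n_years)
print(observed.round(1))
```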
Abstract:
This study evaluated color change, stability, and tooth sensitivity in patients submitted to different bleaching techniques. Material and methods: Forty-eight patients were divided into five groups. A half-mouth design was used to compare two in-office bleaching techniques (with and without light activation): G1: 35% hydrogen peroxide (HP) (Lase Peroxide - DMC Equipments, São Carlos, SP, Brazil) + hybrid light (HL) (LED/Diode Laser, Whitening Lase II, DMC Equipments, São Carlos, SP, Brazil); G2: 35% HP; G3: 38% HP (X-traBoost - Ultradent, South Jordan, UT, USA) + HL; G4: 38% HP; and G5: 15% carbamide peroxide (CP) (Opalescence PF - Ultradent, South Jordan, UT, USA). For G1 and G3, HP was applied to the enamel surface in 3 consecutive applications activated by HL; each application included 3x3' HL activations with a 1' interval between them. For G2 and G4, HP was applied 3x15' with 15' between intervals; and for G5, 15% CP was applied for 120'/10 days at home. A spectrophotometer was used to measure color change before treatment and after 24 h, 1 week, and 1, 6, 12, 18 and 24 months. A VAS questionnaire was used to evaluate tooth sensitivity before treatment, immediately following treatment, after 24 h and finally after 1 week. Results: Statistical analysis did not reveal any significant differences in effectiveness between in-office bleaching with or without HL activation; nevertheless, less time was required with HL. Statistical differences were observed between the results after 24 h, 1 week and 1, 6, 12, 18 and 24 months (intergroup). In-office bleaching increased tooth sensitivity immediately after treatment. The groups activated with HL required less gel application time. Conclusion: All techniques and bleaching agents used were effective and demonstrated similar behavior.
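Spectrophotometric color change in studies like this one is commonly summarized as a CIELAB color difference; the classic CIE76 formula is shown below as a hedged sketch (the abstract does not state which metric was used), with invented L*a*b* values.

```python
# A minimal sketch of the CIE76 color difference between two
# L*a*b* measurements; the tooth-color values are invented.
import math

def delta_e(lab_before, lab_after):
    """Classic CIE76 color difference between two L*a*b* readings."""
    dL = lab_after[0] - lab_before[0]
    da = lab_after[1] - lab_before[1]
    db = lab_after[2] - lab_before[2]
    return math.sqrt(dL**2 + da**2 + db**2)

# Example: tooth color before bleaching vs. 24 h after (~8.7).
print(delta_e((68.0, 2.5, 18.0), (74.5, 1.2, 12.3)))
```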
Abstract:
Brazil is the largest sugarcane producer in the world and is well positioned to supply national and international markets. To maintain high sugarcane production, it is fundamental to improve crop season forecasting models through the use of alternative technologies, such as remote sensing. Thus, the main purpose of this article is to assess the results of two different statistical forecasting methods applied to an agroclimatic index (the water requirement satisfaction index; WRSI) and to the sugarcane spectral response (normalized difference vegetation index; NDVI) registered on National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (NOAA-AVHRR) satellite images. We also evaluated the cross-correlation between these two indexes. According to the results obtained, there are meaningful correlations between NDVI and WRSI at certain time lags. Additionally, the adjusted model for NDVI produced more accurate results than the forecasting models for WRSI. Finally, the analyses indicate that NDVI is more predictable owing to its seasonality, whereas the WRSI values are more variable, making them difficult to forecast.
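A hedged sketch of the lagged cross-correlation analysis, using synthetic monthly series in place of the NOAA-AVHRR data; the lag structure and noise levels are invented.

```python
# A minimal sketch of lagged cross-correlation between two indexes;
# the series are synthetic stand-ins, not NOAA-AVHRR data.
import numpy as np
from statsmodels.tsa.stattools import ccf

rng = np.random.default_rng(7)
n = 120  # e.g. ten years of monthly composites

# Synthetic WRSI, and an NDVI series that echoes it two steps later.
wrsi = np.sin(np.linspace(0, 20 * np.pi, n)) + rng.normal(0, 0.3, n)
ndvi = np.roll(wrsi, 2) + rng.normal(0, 0.3, n)

# ccf(x, y)[k] estimates corr(x[t + k], y[t]) for k = 0, 1, 2, ...
lags = ccf(ndvi, wrsi)[:6]
print(np.round(lags, 2))  # the peak near lag 2 reveals the time delay
```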
Abstract:
Objectives: To compare, in vivo, the accuracy of conventional and digital radiographic methods in determining root canal working length. Material and Methods: Twenty-five maxillary incisor or canine teeth from 22 patients were used in this study. Considering the preoperative radiographs as the baseline, a #25 K-file was inserted into the root canal to the point where the Root ZX electronic apex locator indicated the APEX measurement on the screen. From this measurement, 1 mm was subtracted for positioning the file. The radiographic measurements were made using a digital sensor (Digora 1.51) or conventional type-E films, size 2, following the paralleling technique, to determine the distance between the file tip and the radiographic apex. Results: Student's t-test indicated mean distances of 1.11 mm for the conventional method and 1.20 mm for the digital method, a statistically significant difference (p < 0.05). Conclusions: The conventional radiographic method was found to be superior to the digital one in determining the working length of the root canal.