911 results for Null Hypothesis
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Objectives. The null hypothesis was that mechanical testing systems used to determine polymerization stress (sigma(pol)) would rank a series of composites similarly. Methods. Two series of composites were tested in the following systems: universal testing machine (UTM) using glass rods as bonding substrate, UTM/acrylic rods, "low compliance device", and single cantilever device ("Bioman"). One series had five experimental composites containing BisGMA:TEGDMA in equimolar concentrations and 60, 65, 70, 75 or 80 wt% of filler. The other series had five commercial composites: Filtek Z250 (3M ESPE), Filtek A110 (3M ESPE), Tetric Ceram (Ivoclar), Heliomolar (Ivoclar) and Point 4 (Kerr). Specimen geometry, dimensions and curing conditions were similar in all systems. sigma(pol) was monitored for 10 min. Volumetric shrinkage (VS) was measured in a mercury dilatometer and elastic modulus (E) was determined by three-point bending. Shrinkage rate was used as a measure of reaction kinetics. ANOVA/Tukey tests were performed for each variable, separately for each series. Results. For the experimental composites, sigma(pol) decreased with filler content in all systems, following the variation in VS. For the commercial materials, sigma(pol) did not vary in the UTM/acrylic system, and rankings showed very few similarities across the other test systems. Also, no clear relationships were observed between sigma(pol) and VS or E. Significance. The testing systems showed good agreement for the experimental composites, but very few similarities for the commercial composites. Therefore, comparisons of polymerization stress results from different devices must be made carefully. (c) 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
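The question of whether different devices rank the same composites consistently can be quantified with a rank-correlation coefficient. A minimal pure-Python sketch (the stress values below are invented for illustration, not the study's measurements):

```python
def spearman_rho(x, y):
    """Spearman rank correlation for two equal-length sequences (no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical polymerization-stress values (MPa) for five composites
# measured in two systems; identical ordering gives rho = 1.0.
utm_glass = [8.1, 7.4, 6.9, 6.2, 5.5]
bioman = [4.3, 4.0, 3.6, 3.1, 2.8]
print(spearman_rho(utm_glass, bioman))  # -> 1.0
```

A rho near 1 between two systems would indicate they rank the composites similarly; values near 0 would mirror the disagreement reported for the commercial series.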
Abstract:
Objective: This study evaluated the 56-month clinical performance of Class I and II resin composite restorations. Filtek P60 was compared with Filtek Z250, which are both indicated for posterior restorations but differ in terms of handling characteristics. The null hypothesis tested was that there is no difference in the clinical performance of the two resin composites in posterior teeth. Material and Methods: Thirty-three patients were treated by the same operator, who prepared 48 Class I and 42 Class II cavities, which were restored with Single Bond/Filtek Z250 or Single Bond/Filtek P60 restorative systems. Restorations were evaluated by two independent examiners at baseline and after 56 months, using the modified USPHS criteria. Data were analyzed statistically using Chi-square and Fisher's Exact tests (alpha=0.05). Results: After 56 months, 25 patients (31 Class I and 36 Class II restorations) were analyzed. A 3% failure rate occurred for P60, due to secondary caries and excessive loss of anatomic form. For both restorative systems, there were no significant differences in secondary caries and postoperative sensitivity. However, significant changes were observed with respect to anatomic form, marginal discoloration, and marginal adaptation. Significant deterioration in surface texture was observed exclusively for the Z250 restorations. Conclusions: Both restorative systems can be used for posterior restorations and can be expected to perform well in the oral environment.
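With small failure counts like these, Fisher's exact test is computed from the hypergeometric distribution of one cell of the 2x2 table. A self-contained sketch (the counts below are hypothetical, loosely modelled on the reported 3% failure rate, not the study's data):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for a 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def prob(k):  # hypergeometric probability of the top-left cell being k
        return comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # sum probabilities of all tables as or less likely than the observed one
    return sum(prob(k) for k in range(lo, hi + 1) if prob(k) <= p_obs * (1 + 1e-9))

# Hypothetical: 1/36 failures for one material vs 0/31 for the other.
print(round(fisher_exact_two_sided(1, 35, 0, 31), 3))  # -> 1.0
```

A p-value this large is consistent with the abstract's finding of no significant difference in failure between the two systems.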
Abstract:
Background: Arboviral diseases are major global public health threats. Yet, our understanding of infection risk factors is, with a few exceptions, considerably limited. A crucial shortcoming is the widespread use of analytical methods generally not suited for observational data - particularly null hypothesis-testing (NHT) and step-wise regression (SWR). Using Mayaro virus (MAYV) as a case study, here we compare information theory-based multimodel inference (MMI) with conventional analyses for arboviral infection risk factor assessment. Methodology/Principal Findings: A cross-sectional survey of anti-MAYV antibodies revealed 44% prevalence (n = 270 subjects) in a central Amazon rural settlement. NHT suggested that residents of village-like household clusters and those using closed toilet/latrines were at higher risk, while living in non-village-like areas, using bednets, and owning fowl, pigs or dogs were protective. The "minimum adequate" SWR model retained only residence area and bednet use. Using MMI, we identified relevant covariates, quantified their relative importance, and estimated effect-sizes (beta +/- SE) on which to base inference. Residence area (beta(Village) = 2.93 +/- 0.41; beta(Upland) = -0.56 +/- 0.33, beta(Riverbanks) = -2.37 +/- 0.55) and bednet use (beta = -0.95 +/- 0.28) were the most important factors, followed by crop-plot ownership (beta = 0.39 +/- 0.22) and regular use of a closed toilet/latrine (beta = 0.19 +/- 0.13); domestic animals had insignificant protective effects and were relatively unimportant. The SWR model ranked fifth among the 128 models in the final MMI set. Conclusions/Significance: Our analyses illustrate how MMI can enhance inference on infection risk factors when compared with NHT or SWR. 
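The core of AIC-based multimodel inference is converting AIC differences across the candidate set into Akaike weights, the relative support for each model. A minimal sketch (the AIC values are invented for illustration):

```python
from math import exp

def akaike_weights(aic_values):
    """Akaike weights: relative support for each model in a candidate set."""
    best = min(aic_values)
    deltas = [a - best for a in aic_values]       # AIC differences
    rel = [exp(-d / 2) for d in deltas]           # relative likelihoods
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC values for four candidate risk-factor models.
aics = [210.2, 212.4, 214.0, 221.7]
w = akaike_weights(aics)
print([round(x, 3) for x in w])  # -> [0.673, 0.224, 0.101, 0.002]
```

Covariate relative importance is then obtained by summing the weights of all models containing that covariate, which is how MMI ranks factors such as residence area and bednet use.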
MMI indicates that forest crop-plot workers are likely exposed to typical MAYV cycles maintained by diurnal, forest dwelling vectors; however, MAYV might also be circulating in nocturnal, domestic-peridomestic cycles in village-like areas. This suggests either a vector shift (synanthropic mosquitoes vectoring MAYV) or a habitat/habits shift (classical MAYV vectors adapting to densely populated landscapes and nocturnal biting); any such ecological/adaptive novelty could increase the likelihood of MAYV emergence in Amazonia.
Abstract:
Objectives: Nanofilled composite resins are claimed to provide superior mechanical properties compared with microhybrid resins. Thus, the aim of this study was to compare nanofilled with microhybrid composite resins. The null hypothesis was that the size and distribution of fillers do not influence surface roughness and wear after a simulated toothbrushing test. Material and methods: Ten rectangular specimens (15 mm x 5 mm x 4 mm) of Filtek Z250 (FZ2), Admira (A), TPH3 (T), Esthet-X (EX), Estelite Sigma (ES), Concept Advanced (C), Grandio (G) and Filtek Z350 (F) were prepared according to the manufacturers' instructions. Half of each top surface was protected with nail polish as a control surface (not brushed), while the other half was assessed with five random readings using a roughness tester (Ra). Next, the specimens were abraded by simulated toothbrushing with soft toothbrushes and a 2:1 (w/w) water-dentifrice slurry. 100,000 strokes were performed and the brushed surfaces were re-analyzed. Nail polish layers were removed from the specimens so that the roughness (Ra) and the wear could be assessed with three random readings (μm). Data were analyzed by ANOVA and Tukey's multiple-comparison test (alpha = 0.05). Results: Overall, the composite resins showed a significant increase in roughness after simulated toothbrushing, except for Grandio, which presented a smoother surface. In general, wear of the nanofilled resins was significantly lower than that of the microhybrid resins. Conclusions: As restorative materials undergo alteration under mechanical challenges such as toothbrushing, nanofilled materials seem to be more resistant than microhybrid composite resins, being less prone to roughening and wear.
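The omnibus step of such an analysis, a one-way ANOVA F statistic across material groups, can be sketched in pure Python (the roughness readings below are hypothetical; Tukey's pairwise comparisons would follow a significant F):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of groups of readings."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical post-brushing roughness readings (Ra, in micrometres)
# for three resins; well-separated group means give a large F.
F = one_way_anova_F([[0.21, 0.24, 0.22], [0.35, 0.33, 0.37], [0.15, 0.14, 0.16]])
print(round(F, 1))
```

The F value is referred to an F distribution with (k-1, n-k) degrees of freedom; only if it is significant does a multiple-comparison procedure such as Tukey's identify which materials differ.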
Abstract:
The issue of assessing variance components is essential in deciding on the inclusion of random effects in the context of mixed models. In this work we discuss this problem by supposing nonlinear elliptical models for correlated data and using the score-type test proposed in Silvapulle and Silvapulle (1995). Being asymptotically equivalent to the likelihood ratio test and requiring estimation only under the null hypothesis, this test provides an easily computable alternative for assessing one-sided hypotheses in the context of the marginal model. To account for possibly non-normal distributions, we assume that the joint distribution of the response variable and the random effects lies in the elliptical class, which includes light-tailed and heavy-tailed distributions such as the Student-t, power exponential, logistic, generalized Student-t, generalized logistic, contaminated normal, and the normal itself, among others. We compare the sensitivity of the score-type test under normal, Student-t and power exponential models for the kinetics data set discussed in Vonesh and Carter (1992), fitted using the model presented in Russo et al. (2009). A simulation study is also performed to analyze the consequences of kurtosis misspecification.
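Because the variance component lies on the boundary of the parameter space under H0 (a variance cannot be negative), one-sided tests of this kind are commonly referred to a 50:50 mixture of chi-square(0) and chi-square(1) rather than a plain chi-square(1). A sketch of the resulting p-value, under that standard mixture assumption:

```python
from math import erfc, sqrt

def chibar_pvalue(t_stat):
    """p-value for a one-sided variance-component test on the boundary:
    0.5 * P(chi2_1 > t). Uses the identity chi2_1 survival = erfc(sqrt(t/2));
    the point mass at zero contributes nothing for t > 0."""
    if t_stat <= 0:
        return 1.0
    return 0.5 * erfc(sqrt(t_stat / 2))

print(round(chibar_pvalue(2.706), 3))  # -> 0.05: half the usual chi2_1 tail
```

Using the naive chi-square(1) reference would double these p-values, making the test needlessly conservative.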
Abstract:
Terrestrial amphibians may dehydrate when exposed to low humidity, an important factor affecting spatial distribution and community composition. In this study we investigated whether rates of dehydration and rehydration can explain the spatial distribution of an anuran community in a Restinga environment on the northern coast of the State of Bahia, Brazil, represented by 11 species distributed across 27 sample units. The environmental data set containing 20 variables was reduced to a few synthetic axes by principal component analysis (PCA). The physiological variables measured were rates of dehydration, rehydration from water, and rehydration from a neutral substrate. Multiple regression analyses were used to test the null hypothesis of no association between the environmental data set (synthetic axes of the PCA) and each axis representing a physiological variable; this hypothesis was rejected (P < 0.001). Of 15 possible partial regressions, only rehydration rate from a neutral substrate vs. PC1 and PC2, rehydration rate from water vs. PC1, and dehydration rate vs. PC2 were significant. Our analysis was influenced by a gradient between two different groups of sample units: a beach area with a high density of bromeliads and an environment without bodies of water with a low density of bromeliads. Species with very specific natural histories and morphological characters occur in these environments: Phyllodytes melanomystax and Scinax auratus, species frequently occurring in terrestrial bromeliads, and Ischnocnema paulodutrai, common along the northern coast of Bahia and usually found in forest remnants within environments with few bodies of water. In dry environments, species with lower rates of dehydration were dominant, whereas species with greater rates of dehydration were found predominantly in microhabitats with greater moisture or an abundance of bodies of water.
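The leading PCA axis used as a synthetic environmental predictor is simply the first eigenvector of the covariance matrix, obtainable by power iteration. A small sketch with invented habitat variables (two perfectly correlated variables on the same scale must load equally on PC1):

```python
def first_pc(data, iters=200):
    """Leading principal component (unit eigenvector of the covariance
    matrix) via power iteration; `data` is a list of variable columns."""
    n = len(data[0])
    means = [sum(col) / n for col in data]
    centered = [[x - m for x in col] for col, m in zip(data, means)]
    p = len(data)
    cov = [[sum(centered[i][k] * centered[j][k] for k in range(n)) / (n - 1)
            for j in range(p)] for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical, perfectly correlated habitat variables on the same scale:
# PC1 loads equally on both, i.e. (+-0.7071, +-0.7071).
humidity = [1.0, 2.0, 3.0, 4.0]
shade = [2.0, 3.0, 4.0, 5.0]
pc1 = first_pc([humidity, shade])
print([round(abs(x), 4) for x in pc1])  # -> [0.7071, 0.7071]
```

Scores on this axis would then serve as the regressor (PC1) in the partial regressions against each physiological variable.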
Abstract:
Objective: To test the null hypothesis: Subjects with isolated complete unilateral cleft lip and palate (UCLP) show no differences in overall frequency of tooth agenesis (hypodontia), comparing a subsample with cleft-side maxillary lateral incisor (MxI2) agenesis to a subsample without cleft-side MxI2 agenesis. Findings could clarify the origins of cleft-side MxI2 agenesis. Materials and Methods: Tooth agenesis was identified from dental radiographs of 141 subjects with UCLP. The UCLP cohort was segregated into four categories according to the status and location of MxI2 in the region of the unilateral cleft: group M: subjects with one tooth, located on the mesial side of the alveolar cleft; group D: subjects with one tooth, located on the distal side of the alveolar cleft; group MD: subjects with two teeth present, one mesial and one distal to the cleft; and group ABS: subjects with lateral incisor absent (agenesis) in the cleft area. Results: The null hypothesis was rejected. Among UCLP subjects, there was a twofold increase (P < .0008) in overall frequency of tooth agenesis outside the cleft region in the subsample with cleft-side MxI2 agenesis (ABS), compared to the subsample presenting with no agenesis of the cleft-side MxI2 (M+D+MD). Conclusions: Cleft-side MxI2 agenesis in CLP subjects appears to be largely a genetically controlled anomaly associated with cleft development, rather than a collateral environmental consequence of the adjacent cleft defect, since increased hypodontia involving multiple missing teeth observed remote from a cleft clearly has a significant genetic basis. (Angle Orthod. 2012;82:959-963.)
Abstract:
The asymptotic expansion of the distribution of the gradient test statistic is derived for a composite hypothesis under a sequence of Pitman alternative hypotheses converging to the null hypothesis at rate n(-1/2), n being the sample size. Comparisons of the local powers of the gradient, likelihood ratio, Wald and score tests reveal no uniform superiority property. The power performance of all four criteria in the one-parameter exponential family is examined.
Abstract:
Abstract Background Cardiovascular disease is the leading cause of death in Brazil, and hypertension is its major risk factor. The benefit of drug treatment to prevent major cardiovascular events has been consistently demonstrated. Angiotensin-receptor blockers (ARB) have been the preferential drugs in the management of hypertension worldwide, despite the absence of any consistent evidence of advantage over older agents, and despite concerns that they may be associated with lower renal protection and an increased risk of cancer. Diuretics are as efficacious as other agents, are well tolerated, and have longer duration of action and low cost, but have scarcely been compared with ARBs. A study comparing a diuretic and an ARB is therefore warranted. Methods/design This is a randomized, double-blind clinical trial comparing the association of chlorthalidone and amiloride with losartan as the first drug option in patients aged 30 to 70 years with stage I hypertension. The primary outcomes will be variation of blood pressure over time, adverse events, and development or worsening of microalbuminuria and of left ventricular hypertrophy on the EKG. The secondary outcomes will be fatal or non-fatal cardiovascular events: myocardial infarction, stroke, heart failure, evidence of new subclinical atherosclerosis, and sudden death. The study will last 18 months. The sample size will be 1200 participants per group, in order to confer enough power to test all primary outcomes. The project was approved by the ethics committee of each participating institution. Discussion The putative pleiotropic effects of ARB agents, particularly renal protection, have been disputed, and ARBs have scarcely been compared with diuretics in large clinical trials, even though diuretics have been at least as efficacious as newer agents in managing hypertension. Even if the null hypothesis is not rejected, the information will be useful for health care policy on treating hypertension in Brazil.
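Sample sizes of this order can be checked with the standard normal-approximation formula for comparing two proportions; a sketch (the 10% vs 5% event rates, alpha and power below are illustrative assumptions, not the trial's design values):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm to detect a difference
    between two event proportions with a two-sided test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value
    z_b = NormalDist().inv_cdf(power)           # power term
    pbar = (p1 + p2) / 2                        # pooled proportion under H0
    num = (z_a * (2 * pbar * (1 - pbar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return ceil(num / (p1 - p2) ** 2)

# e.g. detecting 10% vs 5% event rates at 80% power, alpha = 0.05
print(n_per_group(0.10, 0.05))  # -> 435
```

Real trial planning would additionally inflate this for multiple primary outcomes and anticipated loss to follow-up.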
Clinical trials registration number ClinicalTrials.gov: NCT00971165
Abstract:
In this work we propose a new approach for preliminary epidemiological studies of Standardized Mortality Ratios (SMR) collected over many spatial regions. A preliminary study of SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from collected disease counts and expected disease counts calculated from reference population disease rates, in each area an SMR is derived as the MLE under a Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low, either because of the small population underlying the area or the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classic and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple-testing control, without abandoning the preliminary-study perspective that an analysis of SMR indicators is meant to serve. We implement control of the FDR, a quantity widely used to address multiple-comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value provide weak power in small areas, where the expected number of disease cases is small. Moreover, tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous.
The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical, fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease-mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood typical of a hierarchical Bayesian model has the advantage of evaluating a single test (i.e. a test in a single area) by means of all observations in the map under study, rather than just the single observation. This improves test power in small areas and addresses more appropriately the spatial-correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (FDR-hat) can be calculated for any set of b_i relative to areas declared high-risk (where the null hypothesis is rejected) by averaging the b_i themselves. The FDR-hat can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the FDR-hat does not exceed a prespecified value; we call these FDR-hat-based decision (or selection) rules. The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, and under-estimation a loss of specificity. Moreover, our model has the interesting feature of still being able to provide estimates of relative risk values, as in the Besag, York and Mollié model (1991).
A simulation study was set up to evaluate the model's performance in FDR-estimation accuracy, sensitivity and specificity of the decision rule, and goodness of the relative-risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true, and the risk level in the remaining areas. In summarizing the simulation results we always consider FDR estimation in sets constituted by all b_i below a threshold t. We show graphs of the FDR-hat and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. Varying the threshold, we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (by the closeness between FDR-hat and true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against the FDR-hat, we can check the sensitivity and specificity of the corresponding FDR-hat-based decision rules. To investigate the over-smoothing of relative-risk estimates, we compare box-plots of such estimates in high-risk areas (known by simulation) obtained by both our model and the classic Besag, York and Mollié model. All summary tools are worked out for all simulated scenarios (54 in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aims. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat-based decision rules is generally low, but specificity is high. In these scenarios the use of a selection rule based on FDR-hat = 0.05 or FDR-hat = 0.10 can be suggested.
In cases where the number of true alternative hypotheses (the number of true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on FDR-hat = 0.15 gain power while maintaining high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except at very small values (well below 0.05), resulting in a loss of specificity for a decision rule based on FDR-hat = 0.05. In such scenarios decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 cannot be suggested, because the true FDR is actually much higher. As regards relative-risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative-risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
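The decision rule at the heart of the method, averaging the posterior null probabilities b_i over the selected areas to estimate the FDR of that selection, can be sketched in a few lines (the b_i values below are invented):

```python
def fdr_hat(post_null_probs, threshold):
    """Estimated FDR of the rejection set {i : b_i < threshold}, where b_i
    is the posterior probability that area i is null (no excess risk):
    the mean of the b_i over the selected areas."""
    selected = [b for b in post_null_probs if b < threshold]
    if not selected:
        return 0.0, []
    return sum(selected) / len(selected), selected

# Hypothetical posterior null probabilities for 8 areas.
b = [0.01, 0.03, 0.40, 0.02, 0.75, 0.08, 0.90, 0.04]
est, sel = fdr_hat(b, threshold=0.10)
print(round(est, 3), len(sel))  # mean of the five b_i below 0.10
```

In practice one would raise the threshold until the estimated FDR approaches the prespecified level (e.g. 0.05), selecting as many areas as that level permits.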
Abstract:
The thesis studies the economic and financial conditions of Italian households, using microeconomic data from the Survey on Household Income and Wealth (SHIW) over the period 1998-2006. It develops along two lines of enquiry. First, it studies the determinants of households' holdings of assets and liabilities and estimates their degree of correlation. After a review of the literature, it estimates two non-linear multivariate models of the interactions between assets and liabilities with repeated cross-sections. Second, it analyses households' financial difficulties. It defines a quantitative measure of financial distress and tests, by means of non-linear dynamic probit models, whether the probability of experiencing financial difficulties is persistent over time. Chapter 1 provides a critical review of the theoretical and empirical literature on the estimation of asset and liability holdings, on their interactions, and on households' net wealth. The review stresses that a large part of the literature explains households' debt holdings as a function, among other things, of net wealth, an assumption that runs into possible endogeneity problems. Chapter 2 defines two non-linear multivariate models to study the interactions between assets and liabilities held by Italian households. Estimation refers to a pooling of SHIW cross-sections. The first model is a bivariate tobit that estimates the factors affecting assets and liabilities and their degree of correlation, with results coherent with theoretical expectations. To tackle non-normality and heteroskedasticity in the error term, which yield inconsistent tobit estimators, semi-parametric estimates are provided that confirm the results of the tobit model. The second model is a quadrivariate probit on three different assets (safe, risky and real) and total liabilities; the results show the patterns of interdependence suggested by theoretical considerations.
Chapter 3 reviews the methodologies for estimating non-linear dynamic panel data models, drawing attention to the problems that must be dealt with to obtain consistent estimators. Specific attention is given to the initial-conditions problem raised by the inclusion of the lagged dependent variable in the set of explanatory variables. The advantage of dynamic panel data models lies in the fact that they allow one to account simultaneously for true state dependence, via the lagged variable, and for unobserved heterogeneity, via the specification of individual effects. Chapter 4 applies the models reviewed in Chapter 3 to analyse the financial difficulties of Italian households, using information on net wealth as provided in the panel component of the SHIW. The aim is to test whether households persistently experience financial difficulties over time. A thorough discussion is provided of the alternative approaches proposed in the literature (subjective/qualitative indicators versus quantitative indexes) to identify households in financial distress. Households in financial difficulty are identified as those holding net wealth below the first quartile of the net wealth distribution. Estimation is conducted via four different methods: the pooled probit model, the random-effects probit model with exogenous initial conditions, the Heckman model, and the recently developed Wooldridge model. All estimators support the null hypothesis of true state dependence and show that, in line with the literature, the less sophisticated models, namely the pooled and exogenous models, over-estimate such persistence.
Abstract:
Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
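The surrogate logic, comparing an observed interrelation against a null distribution generated from time-shifted copies of one series, can be sketched with circular-shift surrogates (a simpler surrogate family than the phase-randomized ones typically used in this literature, kept here for brevity; the signals are synthetic):

```python
import math
import random

def abs_corr(x, y):
    """Zero-lag absolute Pearson correlation between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return abs(sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy))

def surrogate_pvalue(x, y, n_surr=99, seed=1):
    """Rank p-value of the observed correlation against circular-shift
    surrogates, which break the temporal alignment between the series
    while preserving each one's autocorrelation structure."""
    rng = random.Random(seed)
    obs = abs_corr(x, y)
    exceed = 0
    for _ in range(n_surr):
        k = rng.randrange(1, len(y))
        if abs_corr(x, y[k:] + y[:k]) >= obs:
            exceed += 1
    return (exceed + 1) / (n_surr + 1)  # rank-based p-value

# Two strongly coupled signals (identical sinusoids): the observed
# correlation is 1, and no shifted surrogate matches it.
x = [math.sin(0.1 * i) for i in range(200)]
p = surrogate_pvalue(x, x[:])
print(p)  # -> 0.01
```

Rejecting this null indicates interrelation beyond chance; the paper's second surrogate level (linear surrogates) then asks whether the interrelation also exceeds what linear cross-correlation can explain.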
Enamel loss and adhesive remnants following bracket removal and various clean-up procedures in vitro
Abstract:
This study evaluated enamel loss and composite remnants after debonding and clean-up. The tested null hypothesis was that there are no differences between polishing systems in removing composite remnants without damaging the tooth surface. Brackets were bonded to 75 extracted human molars and removed after a storage period of 100 hours. The adhesive remnant index (ARI) was evaluated. Clean-up was carried out with five different procedures: 1. carbide bur; 2. carbide bur and Brownie and Greenie silicone polishers; 3. carbide bur and Astropol polishers; 4. carbide bur and Renew polishers; and 5. carbide bur, Brownie, Greenie and PoGo polishers. Silicone impressions were made at baseline (T0), after debonding (T1) and after polishing (T2) to produce plaster replicas. The replicas were analysed with a three-dimensional laser scanner and measured with analytical software. Statistical analysis was performed with the Kruskal-Wallis test and pairwise Wilcoxon tests with Bonferroni-Holm adjustment (α = 0.05). Enamel breakouts after debonding were detectable in 27 per cent of all cases, with a mean volume loss of 0.02 mm(3) (±0.03 mm(3)) and depth of 44.9 μm (±48.3 μm). The ARI score was mostly 3, with a few scores of 1 and 2. The composite remnants after debonding had a mean volume of 2.48 mm(3) (±0.92 mm(3)). Mean volume loss due to polishing was 0.05 mm(3) (±0.26 mm(3)), and the composite remnants after polishing had a mean volume of 0.22 mm(3) (±0.32 mm(3)). There were no statistically significant differences in volumetric changes after polishing (P = 0.054) between the clean-up methods. However, sufficient clean-up without enamel loss was difficult to achieve.
Abstract:
Pseudogenes (Ψs), including processed and non-processed Ψs, are ubiquitous genetic elements derived from originally functional genes and are found in all studied genomes across the three kingdoms of life. However, systematic surveys of non-processed Ψs utilizing genomic information from multiple samples within a species are still rare. Here a systematic comparative analysis of Ψs was conducted within 80 fully re-sequenced Arabidopsis thaliana accessions; 7546 genes, representing ~28% of the genome's annotated open reading frames (ORFs), were found with disruptive mutations in at least one accession. The distribution of these Ψs along the chromosomes showed a significantly negative correlation between Ψs/ORFs and local gene density, indicating a higher proportion of Ψs in gene-desert regions, e.g. near centromeres. On the other hand, compared with non-Ψ loci, even the intact coding sequences (CDSs) at the Ψ loci were found to have shorter CDS length, fewer exons and lower GC content. In addition, a significant functional bias against the null hypothesis was detected: the Ψs were mainly involved in responses to environmental stimuli and biotic stress, as previously reported, suggesting that pseudogenization, by allowing successive mutations to accumulate, is likely important for adaptive evolution in rapidly changing environments.