804 results for Null hypothesis


Relevance: 60.00%

Abstract:

Background: Cardiovascular disease is the leading cause of death in Brazil, and hypertension is its major risk factor. The benefit of drug treatment of hypertension in preventing major cardiovascular events has been consistently demonstrated. Angiotensin-receptor blockers (ARBs) have become the preferred drugs for the management of hypertension worldwide, despite the absence of consistent evidence of any advantage over older agents and concerns that they may offer weaker renal protection and carry a risk of cancer. Diuretics are as efficacious as other agents, are well tolerated, and have a longer duration of action and low cost, but they have rarely been compared with ARBs. A study comparing a diuretic with an ARB is therefore warranted. Methods/design: This is a randomized, double-blind clinical trial comparing the combination of chlorthalidone and amiloride with losartan as the first drug option in patients aged 30 to 70 years with stage I hypertension. The primary outcomes will be the variation of blood pressure over time, adverse events, and the development or worsening of microalbuminuria and of left ventricular hypertrophy on the EKG. The secondary outcomes will be fatal or non-fatal cardiovascular events: myocardial infarction, stroke, heart failure, evidence of new subclinical atherosclerosis, and sudden death. The study will last 18 months. The sample size will be 1,200 participants per group in order to confer enough power to test all primary outcomes. The project was approved by the ethics committee of each participating institution. Discussion: The putative pleiotropic effects of ARB agents, particularly renal protection, have been disputed, and ARBs have rarely been compared with diuretics in large clinical trials, even though diuretics are at least as efficacious as newer agents in managing hypertension. Even if the null hypothesis is not rejected, the information will be useful for health care policy on the treatment of hypertension in Brazil. Trial registration: ClinicalTrials.gov NCT00971165.

Relevance: 60.00%

Abstract:

In this work we propose a new approach for preliminary epidemiological studies of Standardized Mortality Ratios (SMRs) collected over many spatial regions. A preliminary study of SMRs aims to formulate hypotheses to be investigated via individual-level epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from the collected disease counts and the expected disease counts computed from reference population disease rates, an SMR is derived in each area as the MLE under a Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because the population underlying the area is small or because the disease under study is rare. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method that focuses on multiple testing control, without leaving the preliminary-study perspective that an analysis of SMR indicators is required to take. We implement control of the false discovery rate (FDR), a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is that it proposes a hierarchical, fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all the observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (the estimated FDR) can be calculated for any set of b_i's relative to areas declared at high risk (where the null hypothesis is rejected) by averaging the b_i's themselves. The estimated FDR can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the estimated FDR is not larger than a prefixed value; we call these estimated-FDR-based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model retains the interesting feature of providing an estimate of the relative risk values, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model's performance in terms of accuracy of FDR estimation, sensitivity and specificity of the decision rule, and goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true, and the risk level in the remaining areas. In summarizing the simulation results we always consider FDR estimation in sets constituted by all areas whose b_i falls below a threshold t. We show graphs of the estimated FDR and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between the estimated and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against the estimated FDR we can check the sensitivity and specificity of the corresponding estimated-FDR-based decision rules. To investigate the over-smoothing of the relative risk estimates we compare box-plots of such estimates in high-risk areas (known by simulation), obtained both by our model and by the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we obtain an over-estimation, hence conservative FDR control) in the scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aim. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of the estimated-FDR-based decision rules is generally low but their specificity is high, and selection rules based on an estimated FDR of 0.05 or 0.10 can be recommended. In cases where the number of true alternative hypotheses (the number of truly high-risk areas) is small, FDR values up to 0.15 are also well estimated, and a decision rule based on an estimated FDR of 0.15 gains power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), which results in a loss of specificity of a decision rule based on an estimated FDR of 0.05. In such scenarios decision rules based on an estimated FDR of 0.05 or, even worse, 0.10 cannot be recommended, because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
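As a rough illustration of the estimated-FDR-based selection rule described above (not the authors' implementation), the following sketch assumes the per-area posterior null probabilities b_i have already been obtained from an MCMC fit; here they are placeholder values. Areas are ranked by b_i and the largest set whose average b_i stays within a prefixed FDR target is selected.

```python
import numpy as np

def fdr_selection(b, target=0.05):
    """Select high-risk areas by the estimated-FDR rule.

    b      : posterior probabilities of the null (absence of risk), one per area,
             e.g. averages of MCMC indicator draws (placeholder values below).
    target : prefixed estimated-FDR level.

    Areas are added in order of increasing b; the estimated FDR of a selected
    set is the mean of its b values, so we keep the largest prefix whose
    running mean does not exceed the target.
    """
    b = np.asarray(b, dtype=float)
    order = np.argsort(b)                                   # most likely high-risk areas first
    running_fdr = np.cumsum(b[order]) / np.arange(1, b.size + 1)
    n_selected = int(np.sum(running_fdr <= target))
    selected = order[:n_selected]
    est_fdr = running_fdr[n_selected - 1] if n_selected else 0.0
    return selected, est_fdr

# Hypothetical posterior null probabilities for 8 areas
b_i = [0.01, 0.02, 0.04, 0.10, 0.35, 0.60, 0.80, 0.95]
areas, est_fdr = fdr_selection(b_i, target=0.05)
print(areas, est_fdr)   # selects areas [0 1 2 3]; estimated FDR ~ 0.043
```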

Relevance: 60.00%

Abstract:

The thesis studies the economic and financial conditions of Italian households, using microeconomic data from the Survey on Household Income and Wealth (SHIW) over the period 1998-2006. It develops along two lines of enquiry. First, it studies the determinants of households' holdings of assets and liabilities and estimates their degree of correlation. After a review of the literature, it estimates two non-linear multivariate models of the interactions between assets and liabilities using repeated cross-sections. Second, it analyses households' financial difficulties. It defines a quantitative measure of financial distress and tests, by means of non-linear dynamic probit models, whether the probability of experiencing financial difficulties is persistent over time. Chapter 1 provides a critical review of the theoretical and empirical literature on the estimation of asset and liability holdings, on their interactions, and on households' net wealth. The review stresses that a large part of the literature explains households' debt holdings as a function of, among other things, net wealth, an assumption that runs into possible endogeneity problems. Chapter 2 defines two non-linear multivariate models to study the interactions between assets and liabilities held by Italian households. Estimation refers to a pooling of SHIW cross-sections. The first model is a bivariate tobit that estimates the factors affecting assets and liabilities and their degree of correlation, with results coherent with theoretical expectations. To tackle the presence of non-normality and heteroskedasticity in the error term, which make tobit estimators inconsistent, semi-parametric estimates are provided that confirm the results of the tobit model. The second model is a quadrivariate probit on three different asset classes (safe, risky and real) and total liabilities; the results show the patterns of interdependence suggested by theoretical considerations. Chapter 3 reviews the methodologies for estimating non-linear dynamic panel data models, drawing attention to the problems that must be dealt with to obtain consistent estimators. Specific attention is given to the initial conditions problem raised by the inclusion of the lagged dependent variable in the set of explanatory variables. The advantage of dynamic panel data models lies in the fact that they allow one to account simultaneously for true state dependence, via the lagged variable, and for unobserved heterogeneity, via the specification of individual effects. Chapter 4 applies the models reviewed in Chapter 3 to analyse the financial difficulties of Italian households, using the information on net wealth provided in the panel component of the SHIW. The aim is to test whether households persistently experience financial difficulties over time. A thorough discussion is provided of the alternative approaches proposed in the literature (subjective/qualitative indicators versus quantitative indexes) to identify households in financial distress. Households in financial difficulties are identified as those holding amounts of net wealth lower than the first quartile of the net wealth distribution. Estimation is conducted via four different methods: the pooled probit model, the random effects probit model with exogenous initial conditions, the Heckman model, and the recently developed Wooldridge model.
Results obtained from all estimators accept the null hypothesis of true state dependence and show that, in accordance with the literature, the less sophisticated models, namely the pooled and exogenous-initial-conditions models, over-estimate such persistence.
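A minimal sketch of the simplest of the four estimators mentioned above, a pooled dynamic probit, assuming a long-format pandas DataFrame with hypothetical columns household, year, net_wealth, income and age; the distress indicator and its lag are built per household and the model is fit with statsmodels. This ignores unobserved heterogeneity and the initial conditions problem, which the random-effects and Wooldridge estimators are meant to address.

```python
import pandas as pd
import statsmodels.api as sm

def pooled_dynamic_probit(df):
    """Pooled dynamic probit for persistence of financial distress.

    df : long panel with hypothetical columns
         'household', 'year', 'net_wealth', 'income', 'age'.
    """
    df = df.sort_values(["household", "year"]).copy()

    # Financial distress: net wealth below the first quartile in each wave
    q1 = df.groupby("year")["net_wealth"].transform(lambda w: w.quantile(0.25))
    df["distress"] = (df["net_wealth"] < q1).astype(int)

    # Lagged dependent variable, household by household
    df["distress_lag"] = df.groupby("household")["distress"].shift(1)
    df = df.dropna(subset=["distress_lag"])

    X = sm.add_constant(df[["distress_lag", "income", "age"]])
    return sm.Probit(df["distress"], X).fit(disp=False)

# res = pooled_dynamic_probit(shiw_panel)   # shiw_panel: hypothetical DataFrame
# print(res.summary())                      # coefficient on distress_lag = state dependence
```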

Relevance: 60.00%

Abstract:

Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns in multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects from interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model that allows the coupling strength and the nonlinearity of the interrelation to be tuned separately. To demonstrate the applicability of the approach to multivariate real-world time series, we investigate resting-state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal-onset seizures. The main findings are that, for our rsfMRI data, interrelations can be described by linear cross-correlation, whereas rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
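A minimal sketch of one common way such a surrogate-based scheme can be realized for a pair of signals (an assumption, not the authors' exact implementation): a mutual-information interrelation measure is compared against time-shifted surrogates (null: no interrelation beyond chance) and against phase-randomized surrogates that preserve the linear cross-correlation (null: purely linear interrelation).

```python
import numpy as np

def mutual_info(x, y, bins=16):
    """Histogram-based mutual information: a simple interrelation measure
    sensitive to both linear and nonlinear dependence."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def shifted_pair(x, y, rng):
    """Surrogate for the 'no interrelation beyond chance' null: circularly
    shifting one channel destroys the coupling but keeps each channel's dynamics."""
    return x, np.roll(y, rng.integers(1, len(y) - 1))

def phase_randomized_pair(x, y, rng):
    """Surrogate for the 'purely linear interrelation' null: adding the same
    random phases to both channels preserves the spectra and the cross-spectrum
    (hence the linear correlations) while destroying nonlinear structure."""
    n = len(x)
    phases = rng.uniform(0, 2 * np.pi, n // 2 + 1)
    phases[0] = 0.0
    if n % 2 == 0:
        phases[-1] = 0.0
    rot = np.exp(1j * phases)
    return (np.fft.irfft(np.fft.rfft(x) * rot, n),
            np.fft.irfft(np.fft.rfft(y) * rot, n))

def surrogate_pvalue(x, y, make_surrogate, n_surr=99, seed=0):
    """One-sided rank p-value of the original measure within the surrogate distribution."""
    rng = np.random.default_rng(seed)
    m0 = mutual_info(x, y)
    exceed = sum(mutual_info(*make_surrogate(x, y, rng)) >= m0 for _ in range(n_surr))
    return (exceed + 1) / (n_surr + 1)

# p_random = surrogate_pvalue(x, y, shifted_pair)           # beyond random effects?
# p_linear = surrogate_pvalue(x, y, phase_randomized_pair)  # beyond linear correlation?
```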

Relevance: 60.00%

Abstract:

This study evaluated the enamel loss and composite remnants after debonding and clean-up. The tested null hypothesis is that there are no differences between different polishing systems in removing composite remnants without damaging the tooth surface. Brackets were bonded to 75 extracted human molars and removed after a storage period of 100 hours. The adhesive remnant index (ARI) was evaluated. The clean-up was carried out with five different procedures: 1. carbide bur; 2. carbide bur and Brownie and Greenie silicone polishers; 3. carbide bur and Astropol polishers; 4. carbide bur and Renew polishers; and 5. carbide bur, Brownie, Greenie and PoGo polishers. Silicone impressions were made at baseline (T0), after debonding (T1) and after polishing (T2) to produce plaster replicas. The replicas were analysed with a three-dimensional laser scanner and measured with analytical software. Statistical analysis was performed with the Kruskal-Wallis test and pairwise Wilcoxon tests with Bonferroni-Holm adjustment (α = 0.05). Enamel breakouts after debonding were detectable in 27 per cent of all cases, with a mean volume loss of 0.02 mm³ (±0.03 mm³) and a mean depth of 44.9 μm (±48.3 μm). The overall ARI score was 3, with a few scores of 1 and 2. The composite remnants after debonding had a mean volume of 2.48 mm³ (±0.92 mm³). The mean volume loss due to polishing was 0.05 mm³ (±0.26 mm³), and the composite remnants after polishing had a mean volume of 0.22 mm³ (±0.32 mm³). There were no statistically significant differences in volumetric changes after polishing (P = 0.054) between the different clean-up methods. However, sufficient clean-up without enamel loss was difficult to achieve.
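A minimal sketch of the statistical comparison named above, assuming the volumetric changes for the five clean-up procedures are available as numeric arrays (hypothetical names): a global Kruskal-Wallis test followed by pairwise rank-sum tests with a Bonferroni-Holm adjustment.

```python
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu
from statsmodels.stats.multitest import multipletests

def compare_cleanup_methods(groups, alpha=0.05):
    """groups : dict mapping clean-up method -> array of volume changes (mm^3).
    Runs a global Kruskal-Wallis test, then pairwise Wilcoxon rank-sum tests
    with Bonferroni-Holm adjustment of the p-values."""
    _, p_global = kruskal(*groups.values())
    pairs = list(combinations(groups, 2))
    p_raw = [mannwhitneyu(groups[a], groups[b]).pvalue for a, b in pairs]
    reject, p_adj, _, _ = multipletests(p_raw, alpha=alpha, method="holm")
    return p_global, dict(zip(pairs, zip(p_adj, reject)))

# p_global, pairwise = compare_cleanup_methods({
#     "carbide": vol1, "brownie_greenie": vol2, "astropol": vol3,
#     "renew": vol4, "pogo": vol5})          # vol1..vol5: hypothetical arrays
```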

Relevance: 60.00%

Abstract:

Pseudogenes (Ψs), including processed and non-processed Ψs, are ubiquitous genetic elements derived from originally functional genes in all studied genomes within the three kingdoms of life. However, systematic surveys of non-processed Ψs utilizing genomic information from multiple samples within a species are still rare. Here, a systematic comparative analysis of Ψs was conducted within 80 fully re-sequenced Arabidopsis thaliana accessions, and 7546 genes, representing ~28% of the annotated open reading frames (ORFs) in the genome, were found to carry disruptive mutations in at least one accession. The distribution of these Ψs on chromosomes showed a significantly negative correlation between the Ψ/ORF ratio and local gene density, indicating a higher proportion of Ψs in gene-desert regions, e.g. near centromeres. On the other hand, compared with non-Ψ loci, even the intact coding sequences (CDSs) at Ψ loci were found to have shorter CDS length, fewer exons and lower GC content. In addition, a significant functional bias against the null hypothesis was detected: the Ψs were mainly involved in responses to environmental stimuli and biotic stress, as previously reported, suggesting that pseudogenization, which allows successive mutations to accumulate, is likely important for adaptive evolution in rapidly changing environments.
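As a rough sketch of how a functional bias of this kind could be tested against a null hypothesis of no association (the original analysis may use a different enrichment procedure), a 2x2 Fisher's exact test per functional category, with purely hypothetical counts:

```python
from scipy.stats import fisher_exact

def category_bias(n_pseudo_in_cat, n_pseudo, n_genes_in_cat, n_genes):
    """Test whether pseudogenes are over-represented in a functional category.
    Columns of the 2x2 table: pseudogene loci / other loci;
    rows: in category / not in category."""
    table = [
        [n_pseudo_in_cat, n_genes_in_cat - n_pseudo_in_cat],
        [n_pseudo - n_pseudo_in_cat,
         (n_genes - n_genes_in_cat) - (n_pseudo - n_pseudo_in_cat)],
    ]
    odds, p = fisher_exact(table, alternative="greater")
    return odds, p

# Hypothetical counts: 7546 pseudogene loci out of ~27000 annotated ORFs,
# 900 of them in a "response to biotic stress" category of 2000 genes.
print(category_bias(900, 7546, 2000, 27000))
```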

Relevance: 60.00%

Abstract:

It is of interest in some applications to determine whether there is a relationship between a hazard rate function (or a cumulative incidence function) and a mark variable which is only observed at uncensored failure times. We develop nonparametric tests for this problem when the mark variable is continuous. Tests are developed for the null hypothesis that the mark-specific hazard rate is independent of the mark versus ordered and two-sided alternatives expressed in terms of mark-specific hazard functions and mark-specific cumulative incidence functions. The test statistics are based on functionals of a bivariate test process equal to a weighted average of differences between a Nelson–Aalen-type estimator of the mark-specific cumulative hazard function and a nonparametric estimator of this function under the null hypothesis. The weight function in the test process can be chosen so that the test statistics are asymptotically distribution-free. Asymptotically correct critical values are obtained through a simple simulation procedure. The testing procedures are shown to perform well in numerical studies, and are illustrated with an AIDS clinical trial example. Specifically, the tests are used to assess if the instantaneous or absolute risk of treatment failure depends on the amount of accumulation of drug resistance mutations in a subject's HIV. This assessment helps guide development of anti-HIV therapies that surmount the problem of drug resistance.
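The building block of the test process described above is a Nelson–Aalen-type estimator of the cumulative hazard. A minimal sketch for right-censored data (without the mark-specific weighting, which is the paper's contribution) might look like this:

```python
import numpy as np

def nelson_aalen(times, events):
    """Nelson-Aalen estimator of the cumulative hazard.

    times  : follow-up times
    events : 1 = failure observed, 0 = right-censored
    Returns the distinct event times and the estimated cumulative hazard
    H(t) = sum over event times s <= t of d(s) / n(s),
    with d(s) failures at s and n(s) subjects still at risk just before s."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    event_times = np.unique(times[events == 1])
    H, cum = [], 0.0
    for s in event_times:
        at_risk = np.sum(times >= s)
        d = np.sum((times == s) & (events == 1))
        cum += d / at_risk
        H.append(cum)
    return event_times, np.array(H)

# t, H = nelson_aalen([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1])  # toy data
```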

Relevance: 60.00%

Abstract:

We introduce a diagnostic test for the mixing distribution in a generalised linear mixed model. The test is based on the difference between the marginal maximum likelihood and conditional maximum likelihood estimates of a subset of the fixed effects in the model. We derive the asymptotic variance of this difference, and propose a test statistic that has a limiting chi-square distribution under the null hypothesis that the mixing distribution is correctly specified. For the important special case of the logistic regression model with random intercepts, we evaluate via simulation the power of the test in finite samples under several alternative distributional forms for the mixing distribution. We illustrate the method by applying it to data from a clinical trial investigating the effects of hormonal contraceptives in women.
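A minimal sketch of how such a difference-based diagnostic could be computed once the two sets of estimates and the asymptotic variance of their difference are available (hypothetical inputs; the paper derives that variance for the generalised linear mixed model case):

```python
import numpy as np
from scipy.stats import chi2

def mixing_diagnostic(beta_marginal, beta_conditional, var_diff):
    """Chi-square diagnostic based on the difference between marginal ML and
    conditional ML estimates of a subset of fixed effects.

    beta_marginal, beta_conditional : estimates of the same fixed effects
    var_diff : asymptotic covariance matrix of their difference
    Under a correctly specified mixing distribution the statistic is
    asymptotically chi-square with len(difference) degrees of freedom."""
    d = np.asarray(beta_marginal, float) - np.asarray(beta_conditional, float)
    stat = float(d @ np.linalg.solve(np.atleast_2d(var_diff), d))
    p = chi2.sf(stat, df=d.size)
    return stat, p

# stat, p = mixing_diagnostic([0.52, -0.31], [0.47, -0.28],
#                             [[0.004, 0.001], [0.001, 0.003]])  # hypothetical values
```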

Relevance: 60.00%

Abstract:

Equivalence testing is growing in use in scientific research outside of its traditional role in the drug approval process. Largely owing to its ease of use and its recommendation in United States Food and Drug Administration guidance, the most common statistical method for testing (bio)equivalence is the two one-sided tests procedure (TOST). Like classical point-null hypothesis testing, TOST is subject to multiplicity concerns as more comparisons are made. In this manuscript, a condition that bounds the family-wise error rate (FWER) when using TOST is given. This condition leads to a simple solution for controlling the FWER. Specifically, we demonstrate that if all pairwise comparisons of k independent groups are being evaluated for equivalence, then simply scaling the nominal Type I error rate down by (k - 1) is sufficient to maintain the family-wise error rate at or below the desired value. The resulting rule is much less conservative than the equally simple Bonferroni correction. An example of equivalence testing in a non-drug-development setting is given.
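A minimal sketch of the rule described above, assuming an equal-variance two-sample TOST on the mean difference (the equivalence margin delta and the data are hypothetical): each of the k(k-1)/2 pairwise TOSTs is simply run at a nominal level of alpha/(k-1).

```python
import numpy as np
from itertools import combinations
from scipy.stats import t

def tost(x, y, delta, alpha):
    """Two one-sided tests for equivalence of means, |mu_x - mu_y| < delta,
    using a pooled-variance t statistic. Equivalence is declared when both
    one-sided p-values (i.e. their maximum) fall below alpha."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = x.size, y.size
    df = nx + ny - 2
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / df
    se = np.sqrt(sp2 * (1 / nx + 1 / ny))
    diff = x.mean() - y.mean()
    p_lower = t.sf((diff + delta) / se, df)   # H0: diff <= -delta
    p_upper = t.cdf((diff - delta) / se, df)  # H0: diff >= +delta
    return max(p_lower, p_upper)

def all_pairwise_equivalence(groups, delta, alpha=0.05):
    """All pairwise equivalence tests among k groups; the nominal level of each
    TOST is scaled down by (k - 1) to bound the family-wise error rate."""
    k = len(groups)
    level = alpha / (k - 1)
    return {(a, b): tost(groups[a], groups[b], delta, level) < level
            for a, b in combinations(groups, 2)}

# groups = {"A": data_a, "B": data_b, "C": data_c}   # hypothetical samples
# print(all_pairwise_equivalence(groups, delta=0.5))
```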

Relevance: 60.00%

Abstract:

We previously showed that lifetime cumulative lead dose, measured as lead concentration in the tibia bone by X-ray fluorescence, was associated with persistent and progressive declines in cognitive function and with decreases in MRI-based brain volumes in former lead workers. Moreover, larger region-specific brain volumes were associated with better cognitive function. These findings motivated us to explore a novel application of path analysis to evaluate effect mediation. Voxel-wise path analysis, at face value, represents the natural evolution of voxel-based morphometry methods for answering questions of mediation. Application of these methods to the former lead worker data demonstrated potential limitations of the approach: there was a tendency for results to be strongly biased towards the null hypothesis (lack of mediation). Moreover, a complementary analysis using the volumes of anatomically derived regions of interest (ROIs) yielded opposing results, suggesting evidence of mediation. Specifically, in the ROI-based approach there was evidence that the association of tibia lead with function in three cognitive domains was mediated through the volumes of total brain, frontal gray matter, and/or possibly the cingulate. A simulation study was conducted to investigate whether the voxel-wise results arose from an absence of localized mediation or from more subtle defects in the methodology. The simulation results showed the same null bias as seen in the lead worker data. Both the lead worker results and the simulation study suggest that a null bias in voxel-wise path analysis limits its inferential utility for producing confirmatory results.
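As a rough sketch of the ROI-based mediation idea (exposure -> regional brain volume -> cognitive function), and not the authors' exact path model, a single-mediator analysis can be built from two OLS regressions with a Sobel test for the indirect effect; variable names here are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def sobel_mediation(exposure, mediator, outcome):
    """Simple single-mediator path analysis.

    Path a: mediator ~ exposure; path b: outcome ~ exposure + mediator.
    The indirect (mediated) effect is a*b; its standard error uses the
    Sobel approximation sqrt(a^2 * se_b^2 + b^2 * se_a^2)."""
    Xa = sm.add_constant(np.asarray(exposure, float))
    fit_a = sm.OLS(np.asarray(mediator, float), Xa).fit()
    a, se_a = fit_a.params[1], fit_a.bse[1]

    Xb = sm.add_constant(np.column_stack([exposure, mediator]))
    fit_b = sm.OLS(np.asarray(outcome, float), Xb).fit()
    b, se_b = fit_b.params[2], fit_b.bse[2]

    indirect = a * b
    se = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
    return indirect, 2 * norm.sf(abs(indirect / se))

# ab, p = sobel_mediation(tibia_lead, frontal_gm_volume, cognitive_score)  # hypothetical arrays
```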

Relevance: 60.00%

Abstract:

The penetration, translocation, and distribution of ultrafine and nanoparticles in tissues and cells are challenging issues in aerosol research. This article describes a set of novel quantitative microscopic methods for evaluating particle distributions within sectional images of tissues and cells by addressing the following questions: (1) is the observed distribution of particles between spatial compartments random? (2) Which compartments are preferentially targeted by particles? and (3) Does the observed particle distribution shift between different experimental groups? Each of these questions can be addressed by testing an appropriate null hypothesis. The methods all require observed particle distributions to be estimated by counting the number of particles associated with each defined compartment. For studying preferential labeling of compartments, the size of each of the compartments must also be estimated by counting the number of points of a randomly superimposed test grid that hit the different compartments. The latter provides information about the particle distribution that would be expected if the particles were randomly distributed, that is, the expected number of particles. From these data, we can calculate a relative deposition index (RDI) by dividing the observed number of particles by the expected number of particles. The RDI indicates whether the observed number of particles corresponds to that predicted solely by compartment size (for which RDI = 1). Within one group, the observed and expected particle distributions are compared by chi-squared analysis. The total chi-squared value indicates whether an observed distribution is random. If not, the partial chi-squared values help to identify those compartments that are preferential targets of the particles (RDI > 1). Particle distributions between different groups can be compared in a similar way by contingency table analysis. We first describe the preconditions and the way to implement these methods, then provide three worked examples, and finally discuss the advantages, pitfalls, and limitations of this method.
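A minimal sketch of the RDI and chi-squared computation described above, with hypothetical particle and test-point counts for three compartments:

```python
import numpy as np
from scipy.stats import chi2

def relative_deposition_index(particles, grid_points):
    """particles   : observed particle counts per compartment
    grid_points : test-grid points hitting each compartment (compartment size)
    Expected counts assume particles distribute in proportion to compartment
    size; RDI = observed / expected, and the total chi-square tests whether the
    observed distribution is compatible with a random one."""
    obs = np.asarray(particles, float)
    pts = np.asarray(grid_points, float)
    exp = obs.sum() * pts / pts.sum()
    rdi = obs / exp
    partial = (obs - exp) ** 2 / exp          # per-compartment contributions
    total = partial.sum()
    p = chi2.sf(total, df=obs.size - 1)
    return rdi, partial, total, p

# Hypothetical counts for three compartments: cytosol, mitochondria, nucleus
rdi, partial, total, p = relative_deposition_index([120, 45, 10], [600, 90, 110])
print(rdi)       # RDI > 1 marks preferentially targeted compartments
print(total, p)  # total chi-square and p-value for the randomness null
```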

Relevance: 60.00%

Abstract:

Whether the subgingival microbiota differ between individuals with chronic and those with aggressive periodontitis, and whether smoking influences the bacterial composition, is controversial. We hypothesized that the subgingival microbiota do not differ between sites in individuals with chronic or aggressive periodontitis, or by smoking status. Bacterial counts and proportional distributions were assessed in 84 individuals with chronic periodontitis and 22 with aggressive periodontitis. No differences in probing pocket depth by periodontal status were found (mean, 0.11 mm; 95% CI, 0.6 to 0.8; p = 0.74). Seven of 40 species, including Staphylococcus aureus, Parvimonas micra, and Prevotella intermedia, were found at higher levels in those with aggressive periodontitis (p < 0.001). Smokers had higher counts of Tannerella forsythia (p < 0.01). The prevalence of S. aureus in non-smokers with aggressive periodontitis was 60.5%. The null hypothesis was rejected, in that P. intermedia, S. aureus, and S. mutans were robust in diagnosing sites in individuals with aggressive periodontitis, while S. aureus, S. sanguinis, and T. forsythia differentiated smoking status.

Relevance: 60.00%

Abstract:

OBJECTIVE: To test the null hypothesis that there is no difference between premolar position as visualized on panoramic radiographs (PRs) and on lateral headfilms (LHs). MATERIALS AND METHODS: The prevalence of differences in the direction of crown angulation between PR and LH was assessed. Furthermore, brass wire markers with different sagittal and transverse angulations were placed in a dry skull, and LHs and PRs were taken with the markers in place. RESULTS: A difference in the direction of crown angulation of unerupted second premolars between PR and LH occurred in 19.5% of patients. The reason for the angulation differences is a buccolingual orientation of the tooth, which appears as a mesiodistal angulation on the PR. CONCLUSION: The null hypothesis was rejected, since in one-fifth of the patients the premolar projection differed between the panoramic radiograph and the lateral headfilm.

Relevance: 60.00%

Abstract:

The longboard skateboard has a longer, and usually wider, deck than the standard skateboard to provide greater support for the rider at the higher speeds attained on this version of the skateboard. Fourteen volunteer subjects participated in downhill and uphill longboarding trials. Heart rates were monitored during both trials, and the average downhill and uphill heart rates were compared with resting heart rates and then with accepted intensity recommendations for health and fitness benefits. The study questions were: Does longboarding have an acute effect on heart rate? If so, does longboarding uphill and/or downhill raise heart rate to the levels recommended for improving cardiorespiratory health and fitness? With these questions as guidance we developed four hypotheses. With an average downhill heart rate of 131.4 beats/minute and an average uphill heart rate of 167.8 beats/minute, statistical analysis showed statistically significant p-values < .0001, and each null hypothesis was rejected in favor of its respective research hypothesis. Based on average age and average resting heart rate, the average age-predicted maximum heart rate was 193.2 beats/minute and the heart rate reserve was 133.2 beats/minute. The average percentages of heart rate reserve for the downhill section (131.4 beats/minute) and the uphill section (167.8 beats/minute) were 54% and 81%, respectively. Downhill heart rates fall within moderate intensity levels, 40% to 60% of heart rate reserve, and uphill heart rates fall within vigorous intensity levels, greater than 60% of heart rate reserve. These results indicate that longboarding can increase heart rate to the levels suggested by the American College of Sports Medicine for improving cardiovascular health and fitness.
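A small sketch of the heart-rate-reserve (Karvonen) arithmetic behind the reported percentages, using the averages given in the abstract (resting heart rate backed out as 193.2 - 133.2 = 60 beats/minute):

```python
def percent_hrr(hr, hr_rest, hr_max):
    """Percentage of heart rate reserve: (HR - resting) / (max - resting)."""
    return 100 * (hr - hr_rest) / (hr_max - hr_rest)

hr_max, hrr = 193.2, 133.2
hr_rest = hr_max - hrr                     # 60.0 beats/minute
for label, hr in [("downhill", 131.4), ("uphill", 167.8)]:
    pct = percent_hrr(hr, hr_rest, hr_max)
    zone = "vigorous (>60% HRR)" if pct > 60 else "moderate (40-60% HRR)"
    print(f"{label}: {pct:.0f}% of HRR -> {zone}")
# downhill: ~54% of HRR (moderate), uphill: ~81% of HRR (vigorous)
```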

Relevance: 60.00%

Abstract:

OBJECTIVE: To compare the precision of fit of full-arch, implant-supported, screw-retained, computer-aided design/computer-aided manufacturing (CAD/CAM) titanium fixed dental prostheses (FDPs) before and after veneering. The null hypothesis was that there is no difference in vertical microgap values between the pure titanium frameworks and the FDPs after porcelain firing. MATERIALS AND METHODS: Five CAD/CAM titanium grade IV frameworks for a screw-retained 10-unit implant-supported reconstruction on six implants (FDI tooth positions 15, 13, 11, 21, 23, 25) were fabricated after digitizing the implant platforms and the cuspid-supporting framework resin pattern with a laser scanner (CARES® Scan CS2; Institut Straumann AG, Basel, Switzerland). A bonder, an opaquer, three layers of porcelain, and one layer of glaze were applied (Vita Titankeramik) and fired according to the manufacturer's preheating and firing cycle instructions at 400-800°C. The one-screw test (screw retained on implant 25) was applied before and after veneering of the FDPs to assess the vertical microgap between implant and framework platform with a scanning electron microscope. The mean microgap was calculated from interproximal and buccal values. Statistical comparison was performed with non-parametric tests. RESULTS: All vertical microgaps were clinically acceptable, with values <90 μm. No statistically significant pairwise difference (P = 0.98) was observed between the relative effects of the vertical microgap of unveneered (median 19 μm; 95% CI 13-35 μm) and veneered FDPs (20 μm; 13-31 μm), providing support for the null hypothesis. Analysis within the groups showed significantly different values between the five implants of the FDPs before (P = 0.044) and after veneering (P = 0.020), while a monotonic trend of increasing values from implant 23 (the position closest to the screw-retained implant 25) to implant 15 (the most distant implant) could not be observed (P = 0.169, P = 0.270). CONCLUSIONS: Full-arch CAD/CAM titanium screw-retained frameworks have a high accuracy, and the porcelain firing procedure had no impact on the precision of fit of the final FDPs. All microgap measurements of each FDP showed clinically acceptable vertical misfit values before and after veneering. Thus, the results demonstrate not only accurate milling and firing but also a reproducible scanning and design process.