886 results for Multivariate measurement model
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Aim To analyse the local regulatory mechanisms of osteoclastogenesis and angiogenesis during the progression of periapical lesions in female rats with oestrogen deficiency and treatment with raloxifene (RLX). Methodology Female Wistar rats were distributed into groups: SHAM-veh, subjected to sham surgery and treated with a vehicle; OVX-veh, subjected to ovary removal and treated with a vehicle; and OVX-RLX, subjected to ovary removal and treated with RLX. Vehicle or RLX was administered orally for 90 days. During treatment, the dental pulp of the mandibular first molars was exposed to the oral environment for induction of periapical lesions, which were analysed after 7 and 30 days. After the experimental periods, blood samples were collected for measurement of oestradiol, calcium, phosphorus and alkaline phosphatase. The rats were euthanized and the mandibles removed and processed for immunohistochemical detection of receptor activator of nuclear factor kappa-B ligand (RANKL), osteoprotegerin (OPG), hypoxia-inducible factor-1 alpha (HIF-1α) and bone-specific alkaline phosphatase (BALP). Data were compared using the Kruskal–Wallis test followed by Dunn's test (nonparametric values) and ANOVA followed by Tukey's test (parametric values). Results The plasma concentration of oestradiol confirmed hypo-oestrogenism in the rats subjected to ovary removal. On day 7, alkaline phosphatase activity, calcium and phosphorus were higher in the OVX-RLX group than in the OVX-veh group (P < 0.001), but immunolabelling for RANKL and HIF-1α was lower in the OVX-RLX group (P < 0.001). On day 30, the OVX-veh group had higher immunolabelling for RANKL than the OVX-RLX group (P < 0.05). There were no significant differences in the immunoreactivity of OPG and BALP between the groups at either time-point (P > 0.05). Conclusion RLX therapy reversed the increased levels of the local regulators of both osteoclastogenesis and angiogenesis induced by oestrogen deficiency.
Abstract:
Pós-graduação em Pesquisa e Desenvolvimento (Biotecnologia Médica) - FMB
Abstract:
Preservation of rivers and water resources is crucial in most environmental policies, and many efforts are made to assess water quality. Environmental monitoring of large river networks is based on measurement stations. Compared with the total length of river networks, their number is often limited, so environmental variables that are measured locally need to be extended to the whole river network. The objective of this paper is to propose several relevant geostatistical models for river modeling. These models use river distance and are based on two contrasting assumptions about dependency along a river network. Inference using maximum likelihood, a model selection criterion and prediction by kriging are then developed. We illustrate our approach on two variables that differ in their distributional and spatial characteristics: summer water temperature and nitrate concentration. The data come from 141 to 187 monitoring stations in a network, more than 5000 km long, on a large river system located in the northeast of France that includes the Meuse and Moselle basins. We first evaluated different spatial models and then produced prediction maps and error variance maps for the whole stream network.
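As a minimal illustration of the kriging prediction step mentioned above (not the authors' river-network models, which rely on stream distance and specific covariance families), the following Python sketch solves an ordinary-kriging system under an exponential covariance; all distances, range/sill values and temperatures are hypothetical:

```python
import numpy as np

def exp_cov(d, sill=1.0, rng=50.0):
    # Exponential covariance; here distances are assumed to be along a
    # single stream reach (illustrative stand-in for river distance).
    return sill * np.exp(-d / rng)

def ordinary_kriging(d_obs, d_pred, z):
    """Predict the variable at one site from n observed stations.

    d_obs : (n, n) pairwise distances between stations
    d_pred: (n,)   distances from each station to the prediction site
    z     : (n,)   observed values at the stations
    """
    n = len(z)
    # Ordinary-kriging system with a Lagrange multiplier in the last row/col
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d_obs)
    A[n, n] = 0.0
    b = np.append(exp_cov(d_pred), 1.0)
    w = np.linalg.solve(A, b)
    pred = w[:n] @ z
    var = exp_cov(0.0) - w @ b  # kriging error variance
    return pred, var

# Toy example: three stations on a line, prediction site between the first two
d_obs = np.array([[0., 10., 20.],
                  [10., 0., 10.],
                  [20., 10., 0.]])
d_pred = np.array([5., 5., 15.])
z = np.array([14.0, 15.0, 16.5])  # e.g. summer water temperature (degrees C)
pred, var = ordinary_kriging(d_obs, d_pred, z)
```

The same solve, applied at every node of a discretized network, is what produces prediction maps and error variance maps.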
Abstract:
Several experimental studies of pulmonary emphysema using animal models have been described in the literature. However, only a few of these studies have focused on the assessment of ergometric function as a non-invasive technique to validate the methodology used for induction of experimental emphysema. Additionally, functional assessments of emphysema are rarely correlated with the morphological pulmonary abnormalities caused by induced emphysema. The present study aimed to evaluate the effects of elastase administered by tracheal puncture on the pulmonary parenchyma and the corresponding functional impairment, evaluated by measuring exercise capacity in C57Bl/6 mice, in order to establish a reproducible and safe methodology for inducing experimental emphysema. Thirty-six mice underwent ergometric tests before and 28 days after elastase administration. Porcine pancreatic elastase solution administered by tracheal puncture resulted in a significantly decreased exercise capacity, shown by a shorter distance run (-30.5%) and a lower mean velocity (-15%), as well as by a failure to increase the elimination of carbon dioxide. The mean linear intercept increased significantly by 50% after tracheal elastase administration. In conclusion, application of elastase by tracheal puncture in C57Bl/6 mice induces emphysema, as validated by morphometric analyses, and results in a significantly lower exercise capacity, while maintaining a low mortality rate. (C) 2011 Sociedade Portuguesa de Pneumologia. Published by Elsevier Espana, S.L. All rights reserved.
Abstract:
Purpose: The aim of this study was to evaluate the accuracy of cone-beam computed tomography (CBCT) for measuring the buccal bone volume around dental implants. Materials and methods: Three to six implants were inserted into the anterior maxilla of eight skulls, depending on the availability of bone, after which CBCT was performed. From the CBCT images, measurements of the bone wall at three points of the implant were obtained, analyzed and compared with those obtained on the plaster casts of the skulls. Results: For the three points of the implants, there was no statistically significant difference between the measurements obtained from the plaster model and from the CBCT images. Conclusions: CBCT can be a useful tool for assessing buccal bone volume along the implant. To cite this article: Shiratori LN, Marotti J, Yamanouchi J, Chilvarquer I, Contin I, Tortamano-Neto P. Measurement of buccal bone volume of dental implants by means of cone-beam computed tomography. Clin. Oral Impl. Res. 23, 2012; 797-804. doi: 10.1111/j.1600-0501.2011.02207.x
Abstract:
Our objective was to assess extrinsic influences upon childbirth. In a cohort of 1,826 days containing 17,417 childbirths, including 13,252 spontaneous labor admissions, we studied the influence of the environment upon a high incidence of labor (defined as the 75th percentile or higher), analyzed by logistic regression. The predictors of high labor admission included increases in outdoor temperature (odds ratio: 1.742, P = 0.045, 95%CI: 1.011 to 3.001) and decreases in atmospheric pressure (odds ratio: 1.269, P = 0.029, 95%CI: 1.055 to 1.483). In contrast, increases in tidal range were associated with a lower probability of high admission (odds ratio: 0.762, P = 0.030, 95%CI: 0.515 to 0.999). Lunar phase was not a predictor of high labor admission (P = 0.339). In multivariate analysis, increases in temperature and decreases in atmospheric pressure predicted high labor admission, and increases in tidal range, as a measure of the lunar gravitational force, predicted a lower probability of high admission.
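A hedged sketch of the kind of logistic-regression fit that produces such odds ratios. The data below are synthetic, with an assumed true effect of standardized temperature on the log-odds of a high-admission day; they are not the cohort data, and the fit uses plain gradient ascent rather than the study's software:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: standardized daily temperature and a binary indicator of
# a "high labor admission" day, generated with a known slope of 0.5
n = 5000
temp = rng.normal(size=n)
logit = -1.0 + 0.5 * temp
p = 1 / (1 + np.exp(-logit))
high = (rng.random(n) < p).astype(float)

# Fit logistic regression by gradient ascent on the log-likelihood
X = np.column_stack([np.ones(n), temp])
beta = np.zeros(2)
for _ in range(10000):
    mu = 1 / (1 + np.exp(-X @ beta))
    beta += 0.0005 * X.T @ (high - mu)

# The reported effect size: odds ratio per one-SD increase in temperature
odds_ratio = np.exp(beta[1])
```

The exponentiated slope is the odds ratio; values above 1 correspond to predictors of high admission, values below 1 (like tidal range in the abstract) to protective predictors.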
Abstract:
We address the problem of selecting the best linear unbiased predictor (BLUP) of the latent value (e.g., serum glucose fasting level) of sample subjects with heteroskedastic measurement errors. Using a simple example, we compare the usual mixed model BLUP to a similar predictor based on a finite population mixed model (FPMM) setup with two sources of variability, the first of which corresponds to simple random sampling and the second, to heteroskedastic measurement errors. Under this last approach, we show that when measurement errors are subject-specific, the BLUP shrinkage constants are based on a pooled measurement error variance, as opposed to the individual ones generally considered for the usual mixed model BLUP. In contrast, when the heteroskedastic measurement errors are measurement condition-specific, the FPMM BLUP involves different shrinkage constants. We also show that in this setup, when measurement errors are subject-specific, the usual mixed model predictor is biased but has a smaller mean squared error than the FPMM BLUP, which points to some difficulties in the interpretation of such predictors. (C) 2011 Elsevier B.V. All rights reserved.
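The contrast between individual and pooled shrinkage constants can be sketched numerically. The one-observation-per-subject setup, the variance components and the observed values below are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Hypothetical setup: each subject's latent value is observed once, with a
# subject-specific measurement-error variance (heteroskedastic errors)
sigma_b2 = 4.0                         # between-subject variance
sigma_e2 = np.array([1.0, 2.0, 8.0])   # subject-specific error variances
y = np.array([92.0, 100.0, 110.0])     # observed values (e.g. glucose)
mu = y.mean()                          # grand mean, taken as known here

# Usual mixed-model BLUP: one shrinkage constant per subject, based on
# that subject's own error variance
k_individual = sigma_b2 / (sigma_b2 + sigma_e2)
blup_usual = mu + k_individual * (y - mu)

# FPMM-style BLUP with subject-specific errors: per the abstract, the
# shrinkage is based on a single pooled error variance instead
k_pooled = sigma_b2 / (sigma_b2 + sigma_e2.mean())
blup_fpmm = mu + k_pooled * (y - mu)
```

Subjects with noisy measurements are shrunk more under the individual constants, while the pooled constant treats all subjects alike; that difference is what drives the bias/MSE trade-off discussed in the abstract.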
Abstract:
The main goal of this article is to consider influence assessment in models with error-prone observations and measurement-error variances that change across observations. The techniques make it possible to identify potentially influential elements and to quantify the effects of perturbations in these elements on results of interest. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease.
Abstract:
Concentrations of 39 organic compounds were determined in three fractions (head, heart and tail) obtained from the pot still distillation of fermented sugarcane juice. The results were evaluated using analysis of variance (ANOVA), Tukey's test, principal component analysis (PCA), hierarchical cluster analysis (HCA) and linear discriminant analysis (LDA). According to PCA and HCA, the experimental data led to the formation of three clusters. The head fractions gave rise to the most clearly defined group, while the heart and tail fractions showed some overlap, consistent with their acid composition. The predictive abilities of the LDA model in calibration and validation for classifying the three fractions were 90.5% and 100%, respectively. This model classified as heart twelve of the thirteen commercial cachaças (92.3%) with good sensory characteristics, thus showing potential for guiding the cutting process.
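A minimal sketch of the PCA step reported above (HCA and LDA omitted); the small concentration matrix is invented for illustration, not the study's 39-compound data:

```python
import numpy as np

# Hypothetical compound concentrations (3 variables) for six samples,
# loosely mimicking head, heart and tail fractions
X = np.array([
    [9.0, 1.0, 0.5],   # head-like: high volatiles
    [8.5, 1.2, 0.6],
    [3.0, 5.0, 2.0],   # heart-like
    [3.2, 4.8, 2.2],
    [1.0, 3.0, 6.0],   # tail-like: high acids
    [1.2, 2.8, 5.8],
])

# Centre the data, then obtain principal components from the SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T               # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)  # proportion of variance per PC
```

Plotting the first two columns of `scores` is what reveals the clusters; `explained[:2].sum()` is the "PC1 + PC2" variance figure quoted in studies like this one.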
Abstract:
IDENTIFICATION OF ETHANOLIC WOOD EXTRACTS USING ELECTRONIC ABSORPTION SPECTRA AND MULTIVARIATE ANALYSIS. The application of multivariate analysis to spectrophotometric (UV) data was explored for distinguishing extracts of woods commonly used in the manufacture of casks for aging cachaças (oak, cabreúva-parda, jatobá, amendoim and canela-sassafrás). Absorbances close to 280 nm were more strongly correlated with the oak and jatobá woods, whereas absorbances near 230 nm were more correlated with canela-sassafrás and cabreúva-parda. A comparison between the spectrophotometric model and a model based on chromatographic (HPLC-DAD) data was carried out. The spectrophotometric model better explained the data variance (PC1 + PC2 = 91%), exhibiting potential as a routine method for checking aged spirits.
Abstract:
Using the high-resolution performance of the fragment separator FRS at GSI, we have discovered 60 new neutron-rich isotopes in the atomic number range 60 <= Z <= 78. The new isotopes were unambiguously identified in reactions with a U-238 beam impinging on a Be target at 1 GeV/nucleon. The production cross-sections for the new isotopes have been measured down to the picobarn level and compared with the predictions of different model calculations. For elements above hafnium, fragmentation is the dominant reaction mechanism creating the new isotopes, whereas fission plays the dominant role in the production of the new isotopes up to thulium. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, as opposed to the two such steps in the algorithm developed by Azevedo et al. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of these priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly under the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is presented jointly with the development of model-fit assessment tools, and the results are compared with those obtained by Azevedo et al. The results indicate that the hierarchical approach allows MCMC algorithms to be implemented more easily, facilitates convergence diagnostics and can be very useful for fitting more complex skew IRT models.
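The Metropolis-Hastings-within-Gibbs idea can be illustrated on a toy target (a bivariate normal with correlation, not the skew-normal IRT posterior): one coordinate is drawn exactly from its full conditional (the Gibbs step), the other by a single random-walk MH step, mirroring the one-MH-step structure the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: bivariate standard normal with correlation rho
rho = 0.8

def log_cond2(t2, t1):
    # Log conditional density of theta2 | theta1, up to a constant
    return -0.5 * (t2 - rho * t1) ** 2 / (1 - rho ** 2)

t1, t2 = 0.0, 0.0
draws = []
for _ in range(20000):
    # Gibbs step: theta1 | theta2 is N(rho * theta2, 1 - rho^2), sampled exactly
    t1 = rng.normal(rho * t2, np.sqrt(1 - rho ** 2))
    # Single MH step: random-walk proposal for theta2
    prop = t2 + rng.normal(0.0, 1.0)
    if np.log(rng.random()) < log_cond2(prop, t1) - log_cond2(t2, t1):
        t2 = prop
    draws.append((t1, t2))

draws = np.array(draws)[2000:]  # drop burn-in
```

With only one MH step per sweep there is a single proposal density to tune, which is the implementation advantage the abstract emphasises.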
Abstract:
This study performed an exploratory analysis of the anthropometric and muscle morphological variables related to one-repetition maximum (1RM) performance. In addition, the capacity of these variables to predict force production was analyzed. 50 active males underwent the experimental procedures: vastus lateralis muscle biopsy, quadriceps magnetic resonance imaging, body mass assessment and a 1RM test in the leg-press exercise. K-means cluster analysis was performed on body mass, the sum of the left and right quadriceps muscle cross-sectional areas (Sigma CSA), the percentage of type II fibers and 1RM performance. The number of clusters was defined a priori, and the clusters were labeled the high strength performance (HSP1RM) group and the low strength performance (LSP1RM) group. Stepwise multiple regressions were performed with body mass, Sigma CSA, percentage of type II fibers and cluster as predictor variables and 1RM performance as the response variable. The cluster means +/- SD were 292.8 +/- 52.1 kg, 84.7 +/- 17.9 kg, 19249.7 +/- 1645.5 mm(2) and 50.8 +/- 7.2% for the HSP1RM, and 254.0 +/- 51.1 kg, 69.2 +/- 8.1 kg, 15483.1 +/- 1104.8 mm(2) and 51.7 +/- 6.2% for the LSP1RM, in 1RM, body mass, Sigma CSA and muscle fiber type II percentage, respectively. The most important variable in the cluster division was the Sigma CSA. In addition, the Sigma CSA and muscle fiber type II percentage explained the variance in 1RM performance (Adj R-2 = 0.35, p = 0.0001) for all participants and for the LSP1RM (Adj R-2 = 0.25, p = 0.002). For the HSP1RM, only the Sigma CSA entered the model and showed the highest capacity to explain the variance in 1RM performance (Adj R-2 = 0.38, p = 0.01). In conclusion, muscle CSA was the most relevant variable for predicting force production in individuals with no strength-training background.
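The clustering step can be sketched with a plain Lloyd's-algorithm k-means on synthetic standardized profiles; the two groups and all values below are invented stand-ins for the study's (1RM, body mass, Sigma CSA, % type II) variables:

```python
import numpy as np

rng = np.random.default_rng(2)

def kmeans(X, init_idx, iters=50):
    # Lloyd's algorithm: assign each point to its nearest centroid,
    # then move each centroid to the mean of its assigned points
    centroids = X[init_idx].astype(float).copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Two synthetic groups of standardized 4-variable profiles, loosely
# mimicking low- and high-strength clusters (equal % type II, as reported)
low = rng.normal([-1.0, -0.5, -1.0, 0.0], 0.3, size=(25, 4))
high = rng.normal([1.0, 0.5, 1.0, 0.0], 0.3, size=(25, 4))
X = np.vstack([low, high])

# Seed the two centroids with one point from each end of the data (k = 2)
labels, centroids = kmeans(X, init_idx=[0, 49])
```

The variable with the largest between-centroid separation relative to its spread plays the role Sigma CSA played in the study's cluster division.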
Abstract:
Background Statistical methods for estimating usual intake require at least two short-term dietary measurements in a subsample of the target population. However, the percentage of individuals with a second dietary measurement (the replication rate) may influence the precision of estimates such as percentiles and the proportions of individuals below cut-offs of intake. Objective To investigate the precision of usual food intake estimates using different replication rates and different sample sizes. Participants/setting Adolescents participating in the continuous National Health and Nutrition Examination Survey 2007-2008 (n=1,304) who completed two 24-hour recalls. Statistical analyses performed The National Cancer Institute method was used to estimate the usual intake of dark green vegetables in the original sample comprising 1,304 adolescents with a replication rate of 100%. A bootstrap with 100 replications was performed to estimate CIs for percentiles and for the proportions of individuals below cut-offs of intake. Using the same bootstrap replications, four sets of data were sampled with different replication rates (80%, 60%, 40%, and 20%). For each data set created, the National Cancer Institute method was applied and percentiles, CIs, and proportions of individuals below cut-offs were calculated. Precision was checked by comparing each CI obtained from the data sets with different replication rates with the CI obtained from the original data set. Further, we sampled 1,000, 750, 500, and 250 individuals from the original data set and performed the same analytical procedures. Results Percentiles of intake and percentages of individuals below the cut-off points were similar across replication rates and sample sizes, but the CIs widened as the replication rate decreased. Wider CIs were observed at replication rates of 40% and 20%. Conclusions The precision of the usual intake estimates decreased when low replication rates were used.
However, even with different sample sizes, replication rates >40% may not lead to an important loss of precision. J Acad Nutr Diet. 2012;112:1015-1020.
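The bootstrap-CI idea, and the way precision degrades as the effective sample shrinks, can be sketched with a simple percentile bootstrap (not the National Cancer Institute method); the skewed "intake" data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic right-skewed intake data, as dietary recalls often are
intake = rng.gamma(shape=1.5, scale=20.0, size=500)

# Bootstrap CI for the 75th percentile, using 100 replications as in the study
boot_p75 = np.array([
    np.percentile(rng.choice(intake, size=len(intake), replace=True), 75)
    for _ in range(100)
])
ci_low, ci_high = np.percentile(boot_p75, [2.5, 97.5])

# Repeat on a quarter-sized subsample: the CI widens, mirroring the loss
# of precision at low replication rates / small samples
sub = rng.choice(intake, size=125, replace=False)
boot_sub = np.array([
    np.percentile(rng.choice(sub, size=len(sub), replace=True), 75)
    for _ in range(100)
])
sub_low, sub_high = np.percentile(boot_sub, [2.5, 97.5])
```

Comparing the two interval widths is the same kind of precision check the study applied across replication rates and sample sizes.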