994 results for Multifractal Products, Log-Normal Scenario


Relevance:

30.00%

Abstract:

PURPOSE: Consumption of sugar-reformulated products (commercially available foods and beverages that have been reduced in sugar content through reformulation) is a potential strategy for lowering sugar intake at a population level. The impact of sugar-reformulated products on body weight, energy balance (EB) dynamics and cardiovascular disease risk indicators has yet to be established. The REFORMulated foods (REFORM) study examined the impact of an 8-week sugar-reformulated product exchange on body weight, EB dynamics, blood pressure, arterial stiffness, glycemia and lipemia. METHODS: A randomized, controlled, double-blind, crossover dietary intervention study was performed with fifty healthy normal-weight to overweight men and women (age 32.0 ± 9.8 years, BMI 23.5 ± 3.0 kg/m²) who were randomly assigned to consume either regular-sugar or sugar-reduced foods and beverages for 8 weeks, separated by a 4-week washout period. Body weight, energy intake (EI), energy expenditure and vascular markers were assessed at baseline and after both interventions. RESULTS: We found that carbohydrate (P < 0.001), total sugars (P < 0.001) and non-milk extrinsic sugars (P < 0.001) intakes (% EI) were lower, whereas fat (P = 0.001) and protein (P = 0.038) intakes (% EI) were higher, on the sugar-reduced diet than on the regular diet. No effects on body weight, blood pressure, arterial stiffness, fasting glycemia or lipemia were observed. CONCLUSIONS: Consumption of sugar-reduced products, as part of a blinded dietary exchange for an 8-week period, resulted in a significant reduction in sugar intake. Body weight did not change significantly, which we propose was due to energy compensation.

Relevance:

30.00%

Abstract:

The purpose of this paper is to develop a Bayesian approach for log-Birnbaum-Saunders Student-t regression models under right-censored survival data. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the considered model. In order to attenuate the influence of outlying observations on the parameter estimates, we present Birnbaum-Saunders models in which a Student-t distribution is assumed to describe the cumulative damage. We also discuss model selection criteria for comparing the fitted models and develop case-deletion influence diagnostics for the joint posterior distribution based on the Kullback-Leibler divergence. The developed procedures are illustrated with a real data set.
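
For reference, a minimal sketch of the model class described above, written in the standard log-Birnbaum-Saunders (sinh-normal) parameterization from the literature rather than the paper's own notation: if $T_i \sim \mathrm{BS}(\alpha, \eta_i)$, then $Y_i = \log T_i$ follows the regression model

\[
Y_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i, \qquad
\xi_i = \frac{2}{\alpha}\,\sinh\!\left(\frac{Y_i - \mathbf{x}_i^{\top}\boldsymbol{\beta}}{2}\right),
\]

where $\xi_i \sim \mathrm{N}(0,1)$ gives the usual log-Birnbaum-Saunders model and $\xi_i \sim t_{\nu}$ gives the Student-t variant that attenuates the influence of outliers; right-censored observations enter the likelihood through the corresponding survival probability $P(Y_i > \log c_i)$.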

Relevance:

30.00%

Abstract:

In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data arise frequently in practice. In this paper, we are concerned only with parametric forms, so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated Weibull regression model for interval-censored data represents a parametric family that includes other regression models broadly used in lifetime data analysis. For the parameters of the proposed model under interval censoring, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess global influence. Furthermore, various simulations are performed for different parameter settings, sample sizes and censoring percentages; in addition, the empirical distributions of some modified residuals are displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated Weibull regression models for interval-censored data.
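
As a brief illustration of how interval censoring enters such a model (standard notation, assumed here rather than taken from the paper): if $F_{\theta}$ denotes the fitted distribution function of $\log T$ and subject $i$ is only known to fail in $(L_i, R_i]$, the log-likelihood is

\[
\ell(\theta) = \sum_{i=1}^{n} \log\bigl\{ F_{\theta}(\log R_i) - F_{\theta}(\log L_i) \bigr\},
\]

with $F_{\theta}(\log L_i)$ replaced by $0$ for left-censored and $F_{\theta}(\log R_i)$ by $1$ for right-censored cases. The exponentiated Weibull baseline has distribution function $G(t) = \bigl[1 - \exp\{-(t/\lambda)^{\kappa}\}\bigr]^{\gamma}$ for $t > 0$, which reduces to the Weibull for $\gamma = 1$.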

Relevance:

30.00%

Abstract:

In this article, we compare three residuals based on the deviance component in generalised log-gamma regression models with censored observations. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed, and the empirical distribution of each residual is displayed and compared with the standard normal distribution. For all cases studied, the empirical distributions of the proposed residuals are in general symmetric around zero, but only the martingale-type residual presented negligible kurtosis for the majority of cases. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the martingale-type residual in generalised log-gamma regression models with censored data. A lifetime data set is analysed under log-gamma regression models, and model checking based on the martingale-type residual is performed.
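
For orientation, the residuals referred to above are commonly defined as follows (a sketch of the standard survival-analysis forms; the paper's exact definitions may differ slightly): with censoring indicator $\delta_i$ and fitted survival function $\hat S$,

\[
r_{M_i} = \delta_i + \log \hat S(t_i \mid \mathbf{x}_i), \qquad
r_{D_i} = \operatorname{sign}(r_{M_i})\,\bigl\{-2\bigl[r_{M_i} + \delta_i \log(\delta_i - r_{M_i})\bigr]\bigr\}^{1/2},
\]

where $r_{M_i}$ is the martingale-type residual and $r_{D_i}$ the modified deviance residual; it is the empirical distributions of such quantities that the simulations compare with the standard normal.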

Relevance:

30.00%

Abstract:

In this article, we consider local influence analysis for the skew-normal linear mixed model (SN-LMM). As the observed-data log-likelihood associated with the SN-LMM is intractable, Cook's well-known approach cannot be applied to obtain measures of local influence. Instead, we develop local influence measures following the approach of Zhu and Lee (2001). This approach is based on an EM-type algorithm, and the resulting measures are invariant under reparametrization. Four specific perturbation schemes are discussed. Results obtained for a simulated data set and a real data set are reported, illustrating the usefulness of the proposed methodology.
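
A compact sketch of the Zhu-Lee approach mentioned above (standard form of their proposal; see the cited paper for details): with $Q(\theta \mid \hat\theta) = E\{\ell_c(\theta) \mid y, \hat\theta\}$ the conditional expectation of the complete-data log-likelihood and $\omega$ a perturbation vector with null point $\omega_0$, influence is assessed through the Q-displacement function

\[
f_Q(\omega) = 2\bigl[\,Q(\hat\theta \mid \hat\theta) - Q(\hat\theta(\omega) \mid \hat\theta)\,\bigr],
\]

whose normal curvature in direction $\mathbf{h}$ takes the form $C_{f_Q,\mathbf{h}} = 2\,\mathbf{h}^{\top}\boldsymbol{\Delta}_{\omega_0}^{\top}\{-\ddot Q(\hat\theta \mid \hat\theta)\}^{-1}\boldsymbol{\Delta}_{\omega_0}\,\mathbf{h}$, with $\boldsymbol{\Delta}_{\omega} = \partial^2 Q(\theta, \omega \mid \hat\theta)/\partial\theta\,\partial\omega^{\top}$ evaluated at $(\hat\theta, \omega_0)$; directions and cases with large curvature are flagged as locally influential.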

Relevance:

30.00%

Abstract:

Hamburgers and breaded fillets were prepared from pale and normal chicken breasts, and the following quality analyses were carried out: color, cooking weight loss (PPC), shear force, frying shrinkage (EF), TBA, and microbiological and sensory evaluation for the hamburgers; and TBA, microbiological analysis and sensory analysis for the breaded fillets. The hamburger samples did not differ significantly (p > 0.05) in the color, EF, PPC, microbiological or sensory parameters. For shear force, there was a significant difference (p < 0.05) between the hamburgers at 7, 60 and 120 days of storage, with the hamburgers made from pale meat (1.92, 1.31 and 1.46, respectively) showing lower means than those made from normal meat (2.34, 1.85 and 1.73, respectively). In the TBA analysis, the samples made from pale meat also had the highest values at 90 to 180 days of storage (5.28, 7.78, 8.89, 5.02) compared with those made from normal meat (2.62, 7.05, 8.08, 3.89). For the breaded fillets, no significant differences (p > 0.05) were found between products made from normal-colored and pale meat for any of the parameters evaluated. These results show that pale meat can be used to manufacture processed products without impairing their quality.

Relevance:

30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Abstract:

Graduate Program in Psychology - FCLAS

Relevance:

30.00%

Abstract:

This paper introduces a skewed log-Birnbaum-Saunders regression model based on the skewed sinh-normal distribution proposed by Leiva et al. [A skewed sinh-normal distribution and its properties and application to air pollution, Comm. Statist. Theory Methods 39 (2010), pp. 426-443]. Some influence methods, such as local influence and generalized leverage, are presented. Additionally, we derive the normal curvatures of local influence under some perturbation schemes. An empirical application to a real data set is presented in order to illustrate the usefulness of the proposed model.
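
For reference, a sketch of the density family underlying this model, following the usual skew-symmetric construction (the paper's parameterization may differ in detail): a skewed sinh-normal variable $Y$ with shape $\alpha$, location $\mu$, scale $\sigma$ and skewness parameter $\lambda$ has density

\[
f(y) = \frac{4}{\alpha\sigma}\,\cosh\!\left(\frac{y-\mu}{\sigma}\right)\,
\phi\bigl(\xi(y)\bigr)\,\Phi\bigl(\lambda\,\xi(y)\bigr), \qquad
\xi(y) = \frac{2}{\alpha}\,\sinh\!\left(\frac{y-\mu}{\sigma}\right),
\]

where $\phi$ and $\Phi$ are the standard normal density and distribution function; $\lambda = 0$ recovers the symmetric sinh-normal distribution, and taking $\sigma = 2$ with $\mu = \mathbf{x}_i^{\top}\boldsymbol{\beta}$ gives the skewed log-Birnbaum-Saunders regression setting.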

Relevance:

30.00%

Abstract:

The aim of the present study is to evaluate the differences in FTIR spectra of normal lung cells (the noncancerous mouse lung epithelial cell line e10) due to different fixation protocols for histological processing. The results show that formalin and methacarn (commonly used fixatives) caused many changes in the FTIR spectra of the mouse lung cell line e10, mainly in the organic-compound region (800-1800 cm⁻¹) associated with lipids, DNA, and proteins, whereas the 70% alcohol fixation protocol caused almost no changes in the FTIR spectra compared with the spectra of unfixed cells (in PBS). It can be concluded that histological processing with the 70% alcohol fixation protocol can be used in FTIR studies of the mouse lung cell line e10.

Relevance:

30.00%

Abstract:

In this thesis we give a definition of the term logarithmically symplectic variety; to be precise, we distinguish two types of such varieties. The general type is a triple $(f,\nabla,\omega)$ comprising a log smooth morphism $f\colon X\to\mathrm{Spec}\,\kappa$ of log schemes together with a flat log connection $\nabla\colon L\to\Omega^1_f\otimes L$ and a ($\nabla$-closed) log symplectic form $\omega\in\Gamma(X,\Omega^2_f\otimes L)$. We define the functor on log Artin rings of log smooth deformations of such varieties $(f,\nabla,\omega)$ and calculate its obstruction theory, which turns out to be given by the vector spaces $H^i(X,B^\bullet_{(f,\nabla)}(\omega))$, $i=0,1,2$. Here $B^\bullet_{(f,\nabla)}(\omega)$ is the class of a certain complex of $\mathcal{O}_X$-modules in the derived category $\mathrm{D}(X/\kappa)$ associated to the log symplectic form $\omega$. The main results state that under certain conditions a log symplectic variety can, by a flat deformation, be smoothed to a symplectic variety in the usual sense. This may provide a new approach to the construction of new examples of irreducible symplectic manifolds.
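
For orientation, and stated here as the standard deformation-theoretic reading rather than as a claim of the thesis: in such a tangent-obstruction theory, $H^0(X,B^\bullet_{(f,\nabla)}(\omega))$ plays the role of infinitesimal automorphisms, $H^1$ classifies first-order log smooth symplectic deformations, and obstructions to extending deformations over small extensions of log Artin rings live in $H^2$, so vanishing of $H^2$ would make the deformation functor unobstructed.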

Relevance:

30.00%

Abstract:

The separation of small molecules by capillary electrophoresis is governed by a complex interplay among several physical effects. Until recently, a systematic understanding of how the combined influence of these effects is observed experimentally has been lacking. The work presented in this thesis involves the use of transient isotachophoretic stacking (tITP) and computer simulation to improve and better understand an in-capillary chemical assay for creatinine. This assay uses electrophoretically mediated micro-analysis (EMMA) to carry out the Jaffé reaction inside a capillary tube. The primary contribution of this work is the elucidation of the role of the length and concentration of the hydroxide plug used to achieve tITP stacking of the product formed by the in-capillary EMMA/Jaffé method. Computer simulation using SIMUL 5.0 predicts that a 3-4 fold gain in sensitivity can be realized by timing the tITP stacking event such that the Jaffé product peak is at its maximum height as that peak electrophoreses past the detection window. Overall, the length of the hydroxide plug alters the timing of the stacking event, and lower-concentration hydroxide plugs lead to tITP stacking events that occur more rapidly. Also, the inclusion of intentional tITP stacking in the EMMA/Jaffé method improves the sensitivity of the assay, including at creatinine concentrations within the normal biological range. Ultimately, improvements in assay sensitivity can be rationally designed by using the length and concentration of the hydroxide plug to engineer the timing of the tITP stacking event such that stacking occurs as the Jaffé product passes the detection window.

Relevance:

30.00%

Abstract:

Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit nonnormal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed.

Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells.

Data transformations such as log, square root and 1/x did not yield normality as measured by the Shapiro-Wilk test for normality. A modulus transformation, used for distributions having abnormal kurtosis values, also did not produce normality.

Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and 3 variables (DNA concentration, shape and elongation) showed the strongest evidence of bimodality and were studied further.

Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components (1 or 2). These summary measures were used as predictors of disease severity in several proportional odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+) and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture model results that there were changes in mean levels and proportions of the components at the lower severity levels.
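
As an illustration of the screening steps described above, the following is a minimal Python sketch (hypothetical variable names, not the study's code) that computes a SAS-style bimodality coefficient for one morphometric variable and runs a parametric-bootstrap likelihood-ratio test of one versus two Gaussian components with a shared variance:

import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.mixture import GaussianMixture

def bimodality_coefficient(x):
    # SAS-style bimodality coefficient; values well above ~0.555
    # (the value for a uniform distribution) suggest bimodality.
    n = len(x)
    g1 = skew(x, bias=False)                    # sample skewness
    g2 = kurtosis(x, fisher=True, bias=False)   # excess kurtosis
    return (g1 ** 2 + 1.0) / (g2 + 3.0 * (n - 1) ** 2 / ((n - 2) * (n - 3)))

def mixture_loglik(x, k):
    # Total log-likelihood of a k-component Gaussian mixture with a tied (shared) variance.
    gm = GaussianMixture(n_components=k, covariance_type="tied",
                         n_init=5, random_state=0).fit(x.reshape(-1, 1))
    return gm.score(x.reshape(-1, 1)) * len(x)

def bootstrap_lrt(x, n_boot=99, seed=0):
    # Parametric bootstrap of the likelihood-ratio statistic for 1 vs 2
    # components: simulate from the single-Gaussian fit under H0.
    rng = np.random.default_rng(seed)
    lrt_obs = 2.0 * (mixture_loglik(x, 2) - mixture_loglik(x, 1))
    mu, sd = x.mean(), x.std()
    exceed = 0
    for _ in range(n_boot):
        xb = rng.normal(mu, sd, size=len(x))
        if 2.0 * (mixture_loglik(xb, 2) - mixture_loglik(xb, 1)) >= lrt_obs:
            exceed += 1
    return lrt_obs, (exceed + 1) / (n_boot + 1)   # bootstrap p-value

# Example on simulated data resembling one patient's cell-level measurements:
rng = np.random.default_rng(42)
x = np.concatenate([rng.normal(2.0, 0.3, 150), rng.normal(4.0, 0.3, 100)])
print("bimodality coefficient:", round(bimodality_coefficient(x), 3))
lrt, p = bootstrap_lrt(x)
print("bootstrap LRT:", round(lrt, 1), "p ~", round(p, 3))

A rejected one-component fit together with a large bimodality coefficient corresponds to the kind of evidence for bimodality the study reports for DNA concentration, shape and elongation.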