903 results for Tests for Continuous Lifetime Data


Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose a bivariate distribution for bivariate survival times based on the Farlie-Gumbel-Morgenstern copula to model the dependence in bivariate survival data. The proposed model allows for the presence of censored data and covariates. For inferential purposes, a Bayesian approach via Markov chain Monte Carlo (MCMC) is considered. Further, some discussion of model selection criteria is given. In order to examine outlying and influential observations, we present Bayesian case-deletion influence diagnostics based on the Kullback-Leibler divergence. The newly developed procedures are illustrated via a simulation study and a real dataset.
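
For reference, a minimal sketch of the dependence structure referred to above: the Farlie-Gumbel-Morgenstern copula couples the two marginal survival functions through a single dependence parameter (written φ here; the symbol is an assumption, not the paper's notation).

C_\phi(u, v) = u\,v\,[\,1 + \phi\,(1-u)(1-v)\,], \qquad -1 \le \phi \le 1,

S(t_1, t_2) = C_\phi\bigl(S_1(t_1), S_2(t_2)\bigr) = S_1(t_1)\,S_2(t_2)\,[\,1 + \phi\,\{1-S_1(t_1)\}\{1-S_2(t_2)\}\,],

so that φ = 0 recovers independence of the two survival times, while censoring and covariates enter through the marginal survival functions S_1 and S_2.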

Relevance:

30.00%

Publisher:

Abstract:

Objective. The present study aimed to examine the reproducibility of heart rate (HR) and rating of perceived exertion (RPE) values obtained during different incremental treadmill tests. Equipment and methods. Twenty male, recreational, endurance-trained runners (10-km running pace: 10–15 km·h−1) performed, in a counterbalanced order, three continuous incremental exercise tests with different speed increments (0.5 km·h−1, 1 km·h−1 and 2 km·h−1). Thereafter, each participant performed the three tests again, maintaining the same order as before. The reproducibility of the HR and RPE values was analyzed for all protocols at submaximal intensities (8, 10, 12, and 14 km·h−1). In addition, the reproducibility of maximal HR (HRmax) and peak RPE (RPEpeak) was examined. Results. The variability of both the HR and RPE values tended to decrease over the stages of the incremental test and was not, or only slightly, influenced by the incremental test design. The HR at 14 km·h−1 and HRmax presented the highest reproducibility (CV < 2%). In contrast, the submaximal RPE values showed higher variability indices (i.e., CV > 5.0%). In conclusion, the HR values were highly reproducible during the stages of the incremental test, in contrast to the RPE values, which presented limited reproducibility.
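
As an illustration of the kind of reproducibility index reported above (CV < 2% for HRmax, CV > 5% for submaximal RPE), a minimal sketch of a between-trial coefficient of variation; the paired values below are hypothetical stand-ins, not data from the study.

import numpy as np

def between_trial_cv(trial1, trial2):
    """Mean coefficient of variation (%) between two repeated measurements per subject."""
    trial1, trial2 = np.asarray(trial1, float), np.asarray(trial2, float)
    pair_mean = (trial1 + trial2) / 2.0
    pair_sd = np.abs(trial1 - trial2) / np.sqrt(2.0)   # SD of a pair of values
    return float(np.mean(pair_sd / pair_mean) * 100.0)

# hypothetical HR values (bpm) at 14 km/h for five runners, test and retest
hr_test = [172, 168, 181, 175, 169]
hr_retest = [174, 167, 183, 176, 171]
print(f"between-trial CV: {between_trial_cv(hr_test, hr_retest):.1f}%")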

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

A CURRENT EXAMINATION OF DIETARY INTAKES OF FIBER, CALCIUM, IRON, AND ZINC AND THEIR RELATIONSHIP TO BLOOD LEAD LEVELS IN U.S. CHILDREN AGED 1-5 YEARS. Stephanie Ann Melchert, M.S., University of Nebraska, 2010. Adviser: Kaye Stanek Krogstrand. The effect of lead on the health and well-being of those exposed has been well documented, and many efforts have been made to reduce lead exposure in the United States population. Despite these efforts, many studies have documented cognitive impairments and behavioral problems in children with even low levels of lead in their blood. Previous studies have suggested that a proper diet may play a role in preventing elevated blood lead levels in children. The objective of this study was to determine whether children's blood lead levels (BLL) were inversely correlated with their dietary intakes of fiber, calcium, iron, and zinc at low levels of lead exposure. This study examined 1019 children in the National Health and Nutrition Examination Survey (NHANES) conducted in 2005-2006. Data were analyzed using Spearman's rank correlations to relate continuous variables to BLL, and independent-samples t-tests were used to compare mean BLL across categorical variables. Results indicate that BLL in children is significantly correlated with weight, recumbent length/standing height, dietary fiber intake, and cotinine, a marker of cigarette smoke exposure. BLL was not significantly correlated with calcium, iron, zinc, or vitamin C. A significant difference was found in the mean BLL of children who took supplements, who lived in smoking homes, and who lived in homes built before 1978. Overall, this study shows that children living in homes built before 1978 remain at greater risk for lead exposure, and adequate dietary fiber intake may provide benefits to children who are exposed to lead.
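
As a sketch of the two analyses named above (Spearman rank correlations for continuous predictors and independent-samples t-tests for categorical ones), assuming hypothetical arrays in place of the NHANES variables; the variable names and simulated values are illustrative only.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# hypothetical stand-ins for the NHANES 2005-2006 variables
blood_lead = rng.lognormal(mean=0.0, sigma=0.5, size=200)      # ug/dL
fiber_intake = rng.gamma(shape=5.0, scale=2.5, size=200)       # g/day
home_pre_1978 = rng.integers(0, 2, size=200).astype(bool)

# Spearman rank correlation between a continuous intake and blood lead
rho, p_rho = stats.spearmanr(fiber_intake, blood_lead)
print(f"Spearman rho = {rho:.3f}, p = {p_rho:.3f}")

# independent-samples t-test comparing mean blood lead across a categorical variable
t, p_t = stats.ttest_ind(blood_lead[home_pre_1978], blood_lead[~home_pre_1978])
print(f"t = {t:.2f}, p = {p_t:.3f}")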

Relevance:

30.00%

Publisher:

Abstract:

Evaluations of measurement invariance provide essential construct validity evidence. However, the quality of such evidence is partly dependent upon the validity of the resulting statistical conclusions. The presence of Type I or Type II errors can render measurement invariance conclusions meaningless. The purpose of this study was to determine the effects of categorization and censoring on the behavior of the chi-square/likelihood ratio test statistic and two alternative fit indices (CFI and RMSEA) in the context of evaluating measurement invariance. Monte Carlo simulation was used to examine Type I error and power rates for (a) the overall test statistic/fit indices and (b) the change in test statistic/fit indices. Data were generated according to a multiple-group single-factor CFA model across 40 conditions that varied by sample size, strength of item factor loadings, and categorization thresholds. Seven different combinations of model estimators (ML, Yuan-Bentler scaled ML, and WLSMV) and specified measurement scales (continuous, censored, and categorical) were used to analyze each of the simulation conditions. As hypothesized, non-normality increased Type I error rates for the continuous scale of measurement and did not affect error rates for the categorical scale of measurement. Maximum likelihood estimation combined with a categorical scale of measurement resulted in more correct statistical conclusions than the other analysis combinations. For the continuous and censored scales of measurement, the Yuan-Bentler scaled ML resulted in more correct conclusions than normal-theory ML. The censored measurement scale did not offer any advantages over the continuous measurement scale. Comparing across fit statistics and indices, the chi-square-based test statistics were preferred over the alternative fit indices, and ΔRMSEA was preferred over ΔCFI. Results from this study should be used to inform the modeling decisions of applied researchers. However, no single analysis combination can be recommended for all situations. Therefore, it is essential that researchers consider the context and purpose of their analyses.
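
A minimal sketch of the data-generating step described above (a multiple-group single-factor model whose continuous responses are then categorized at fixed thresholds); the loadings, thresholds, and sample sizes are illustrative assumptions, not the simulation conditions used in the study.

import numpy as np

rng = np.random.default_rng(42)

def simulate_group(n, loadings, thresholds):
    """Single-factor continuous responses, then categorized at the given thresholds."""
    loadings = np.asarray(loadings, float)
    eta = rng.standard_normal(n)                           # common factor scores
    uniqueness = np.sqrt(1.0 - loadings**2)                # standardized residual SDs
    eps = rng.standard_normal((n, loadings.size)) * uniqueness
    y_continuous = eta[:, None] * loadings + eps
    y_categorical = np.digitize(y_continuous, thresholds)  # ordinal categories 0..len(thresholds)
    return y_continuous, y_categorical

# two groups, six items, illustrative loadings and asymmetric thresholds
y1_cont, y1_cat = simulate_group(300, [0.7] * 6, thresholds=[-0.5, 0.5, 1.5])
y2_cont, y2_cat = simulate_group(300, [0.7] * 6, thresholds=[-0.5, 0.5, 1.5])
print(y1_cat[:3])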

Relevance:

30.00%

Publisher:

Abstract:

We report new archeointensity data obtained from the analyses of baked clay elements (architectural and kiln brick fragments) sampled in Southeast Brazil and historically and/or archeologically dated between the end of the XVIth century and the beginning of the XXth century AD. The results were determined using the classical Thellier and Thellier protocol as modified by Coe, including partial thermoremanent magnetization (pTRM) and pTRM-tail checks, and the Triaxe protocol, which involves continuous high-temperature magnetization measurements. In both protocols, TRM anisotropy and cooling-rate TRM dependence effects were taken into account for the intensity determinations, which were successfully performed for 150 specimens from 43 fragments, with good agreement between the intensity results obtained from the two procedures. Nine site-mean intensity values were derived from three to eight fragments each and defined with standard deviations of less than 8%. The site-mean values vary from ~25 μT to ~42 μT and describe, in Southeast Brazil, a continuous decreasing trend of ~5 μT per century between ~1600 AD and ~1900 AD. Their comparison with recent archeointensity results obtained from Northeast Brazil and reduced to the same latitude shows that: (1) the geocentric axial dipole approximation is not valid between these southeastern and northeastern regions of Brazil, whose latitudes differ by ~10 degrees, and (2) the available global geomagnetic field models (the gufm1 models, their recalibrated versions and the CALS3k models) are not sufficiently precise to reliably reproduce the non-dipole field effects which prevailed in Brazil for at least the 1600-1750 period. The large non-dipole contribution thus highlighted is most probably linked to the evolution of the South Atlantic Magnetic Anomaly (SAMA) during that period. Furthermore, although our dataset is limited, the Brazilian archeointensity data appear to support the view of a rather oscillatory behavior of the axial dipole moment during the past three centuries, which would have been marked in particular by a moderate increase between the end of the XVIIIth century and the middle of the XIXth century, followed by the well-known decrease from 1840 AD attested by direct measurements. (C) 2011 Elsevier B.V. All rights reserved.
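
For context, both protocols recover the ancient field intensity from the proportionality between natural remanence lost and laboratory thermoremanence gained; a minimal sketch of the classical Thellier-type relation, with notation assumed here rather than taken from the paper:

B_{\mathrm{anc}} \;=\; \left| \frac{\Delta\,\mathrm{NRM}}{\Delta\,\mathrm{pTRM}} \right| \times B_{\mathrm{lab}},

where B_lab is the known laboratory field used to impart the pTRM and the slope is estimated from the NRM-lost versus pTRM-gained (Arai) diagram over the accepted temperature interval.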

Relevance:

30.00%

Publisher:

Abstract:

In the analysis of instrumented indentation data, it is common practice to incorporate the combined moduli of the indenter (E_i) and the specimen (E) in the so-called reduced modulus (E_r) to account for indenter deformation. Although indenter systems with rigid or elastic tips are considered equivalent if E_r is the same, the validity of this practice has been questioned over the years. The present work uses systematic finite element simulations to examine the role of the elastic deformation of the indenter tip in instrumented indentation measurements and the validity of the concept of the reduced modulus in conical and pyramidal (Berkovich) indentations. It is found that the apical angle increases as a result of the indenter deformation, which influences the analysis of the results. Based upon the inaccuracies introduced by the reduced modulus approximation in the analysis of the unloading segment of instrumented indentation applied load (P)-penetration depth (δ) curves, a detailed examination is then conducted of the role of indenter deformation in the dimensionless functions describing the loading stages of such curves. Consequences of the present results for the extraction of the uniaxial stress-strain characteristics of the indented material through such dimensional analyses are finally illustrated. It is found that large overestimations in the assessment of the strain-hardening behavior result from neglecting tip compliance. Guidelines are given in the paper to reduce such overestimations.
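
For reference, the reduced modulus discussed above is conventionally built from the elastic constants of the specimen and the indenter tip; a standard sketch, where ν and ν_i denote the respective Poisson's ratios (symbols assumed):

\frac{1}{E_r} \;=\; \frac{1-\nu^{2}}{E} \;+\; \frac{1-\nu_i^{2}}{E_i}.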

Relevance:

30.00%

Publisher:

Abstract:

Objectives. The null hypothesis was that mechanical testing systems used to determine polymerization stress (σ_pol) would rank a series of composites similarly. Methods. Two series of composites were tested in the following systems: a universal testing machine (UTM) using glass rods as the bonding substrate, a UTM with acrylic rods, a "low compliance device", and a single cantilever device ("Bioman"). One series had five experimental composites containing BisGMA:TEGDMA in equimolar concentrations and 60, 65, 70, 75 or 80 wt% of filler. The other series had five commercial composites: Filtek Z250 (3M ESPE), Filtek A110 (3M ESPE), Tetric Ceram (Ivoclar), Heliomolar (Ivoclar) and Point 4 (Kerr). Specimen geometry, dimensions and curing conditions were similar in all systems. σ_pol was monitored for 10 min. Volumetric shrinkage (VS) was measured in a mercury dilatometer, and elastic modulus (E) was determined by three-point bending. Shrinkage rate was used as a measure of reaction kinetics. ANOVA/Tukey tests were performed for each variable, separately for each series. Results. For the experimental composites, σ_pol decreased with filler content in all systems, following the variation in VS. For the commercial materials, σ_pol did not vary in the UTM/acrylic system and showed very few similarities in rankings across the other test systems. Also, no clear relationships were observed between σ_pol and VS or E. Significance. The testing systems showed good agreement for the experimental composites, but very few similarities for the commercial composites. Therefore, comparison of polymerization stress results from different devices must be done carefully. (c) 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Background: This pilot study aimed to verify whether glycemic control can be achieved in type 2 diabetes patients after acute myocardial infarction (AMI) using insulin glargine (iGlar) associated with regular insulin (iReg), compared with the standard intensive care unit protocol, which uses continuous intravenous insulin delivery followed by NPH insulin and iReg (St. Care). Patients and Methods: Patients (n = 20) within 24 h of AMI were randomized to iGlar or St. Care. Therapy was guided exclusively by capillary blood glucose (CBG), but glucometric parameters were also analyzed by a blinded continuous glucose monitoring system (CGMS). Results: Mean glycemia was 141 ± 39 mg/dL for St. Care and 132 ± 42 mg/dL for iGlar by CBG, or 138 ± 35 mg/dL for St. Care and 129 ± 34 mg/dL for iGlar by CGMS. The percentage of time in range (80-180 mg/dL) by CGMS was 73 ± 18% for iGlar and 77 ± 11% for St. Care. No severe hypoglycemia (≤ 40 mg/dL) was detected by CBG, but CGMS indicated 11 (St. Care) and seven (iGlar) excursions in four subjects from each group, mostly in sulfonylurea users (six of eight patients). Conclusions: This pilot study suggests that equivalent glycemic control without an increase in severe hyperglycemia may be achieved using iGlar with background iReg. Data outputs were controlled by both CBG and CGMS measurements in a real-life setting to ensure reliability. Based on CGMS measurements, there were significant numbers of glycemic excursions outside of the target range; however, these were not detected by CBG. In addition, the data indicate that previous use of sulfonylurea may be a major risk factor for severe hypoglycemia, irrespective of the type of insulin treatment.
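
As a sketch of one of the glucometric summaries quoted above (percentage of time in the 80-180 mg/dL target range from continuous monitoring readings), assuming a hypothetical array of CGMS samples taken at a fixed interval; the simulated readings are illustrative only.

import numpy as np

def percent_time_in_range(glucose_mg_dl, low=80.0, high=180.0):
    """Share of CGMS readings inside the target range, as a percentage.
    Assumes the readings are equally spaced in time."""
    g = np.asarray(glucose_mg_dl, float)
    in_range = (g >= low) & (g <= high)
    return 100.0 * in_range.mean()

# hypothetical 24 h of readings taken every 5 minutes
rng = np.random.default_rng(1)
readings = rng.normal(loc=135.0, scale=35.0, size=288)
print(f"time in range (80-180 mg/dL): {percent_time_in_range(readings):.1f}%")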

Relevance:

30.00%

Publisher:

Abstract:

The log-Burr XII regression model for grouped survival data is evaluated in the presence of many ties. The methodology for grouped survival data is based on life tables, in which the times are grouped into k intervals, and we fit discrete lifetime regression models to the data. The model parameters are estimated by maximum likelihood and jackknife methods. To detect influential observations in the proposed model, we use diagnostic measures based on case deletion, so-called global influence, and influence measures based on small perturbations of the data or of the model, referred to as local influence. In addition to these measures, total local influence and influential estimates are also used. We conduct Monte Carlo simulation studies to assess the finite-sample behavior of the maximum likelihood estimators of the proposed model for grouped survival data. A real data set is analyzed using a regression model for grouped data.
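
For context, a minimal sketch of the life-table (grouped-data) likelihood underlying such models, assuming intervals I_j = [a_{j-1}, a_j), a parametric survival function S(·|x_i) for covariates x_i, and the common convention that a censored individual is only known to have survived to the start of its interval (notation assumed, not taken from the paper):

L(\theta) \;=\; \prod_{i:\ \delta_i = 1} \bigl[\, S(a_{j(i)-1} \mid x_i) - S(a_{j(i)} \mid x_i) \,\bigr] \;\prod_{i:\ \delta_i = 0} S(a_{j(i)-1} \mid x_i),

where j(i) indexes the interval containing the i-th observed time and δ_i is the failure indicator. The log-Burr XII model then specifies S(·|x_i) through a regression structure on its location parameter.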

Relevance:

30.00%

Publisher:

Abstract:

In this paper we obtain asymptotic expansions, up to order n^{-1/2} and under a sequence of Pitman alternatives, for the nonnull distribution functions of the likelihood ratio, Wald, score and gradient test statistics in the class of symmetric linear regression models. This is a wide class of models which encompasses the t model and several other symmetric distributions with longer-than-normal tails. The asymptotic distributions of all four statistics are obtained for testing a subset of regression parameters. Furthermore, in order to compare the finite-sample performance of these tests in this class of models, Monte Carlo simulations are presented. An empirical application to a real data set is considered for illustrative purposes. (C) 2011 Elsevier B.V. All rights reserved.
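
For reference, the four statistics compared above have the standard forms below for testing H_0: θ_1 = θ_1^{(0)} in the presence of nuisance parameters θ_2; this is a generic sketch, and the paper's own notation and standardization may differ:

S_{LR} = 2\{\ell(\hat\theta) - \ell(\tilde\theta)\}, \quad
S_{W} = (\hat\theta_1 - \theta_1^{(0)})^{\top} [K^{11}(\hat\theta)]^{-1} (\hat\theta_1 - \theta_1^{(0)}), \quad
S_{R} = U_1(\tilde\theta)^{\top} K^{11}(\tilde\theta)\, U_1(\tilde\theta), \quad
S_{T} = U_1(\tilde\theta)^{\top} (\hat\theta_1 - \theta_1^{(0)}),

where ℓ is the log-likelihood, U_1 is the score vector for θ_1, K^{11} is the block of the inverse Fisher information corresponding to θ_1, and θ̂ and θ̃ denote the unrestricted and restricted maximum likelihood estimates, respectively.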

Relevance:

30.00%

Publisher:

Abstract:

Obtaining ecotoxicological data on pesticides in tropical regions is imperative for performing more realistic risk analyses, and avoidance tests have been proposed as a useful, fast and cost-effective tool. Therefore, the present study aimed to evaluate the avoidance behavior of Eisenia andrei towards a formulated product, Vertimec® 18 EC (a.i. abamectin), in tests performed on a reference tropical artificial soil (TAS), to derive ecotoxicological data under tropical conditions, and on a natural soil (NS), simulating crop field conditions. In the TAS tests, an adaptation of the substrate recommended by the OECD and ISO protocols was used, with residues of coconut fiber as the source of organic matter. Concentrations of the pesticide in the TAS tests ranged from 0 to 7 mg abamectin/kg (dry weight, d.w.). In the NS tests, earthworms were exposed to samples of soils sprayed in situ with 0.9 L of Vertimec® 18 EC/ha (RD), twice this dosage (2RD), and distilled water (Control), respectively, and to 2RD:control dilutions (12.5, 25, 50, 75%). All tests were performed at 25 ± 2 °C, to simulate tropical conditions, and under a 12 h light:12 h dark photoperiod. The organisms avoided contaminated TAS, with an EC50,48h of 3.918 mg/kg soil d.w., a LOEC of 1.75 mg/kg soil d.w. and a NOEC of 0.85 mg/kg soil d.w. No significant avoidance response occurred in any NS test. Abamectin concentrations in NS were lower than the EC50,48h and LOEC determined in the TAS tests. The results obtained contribute to overcoming the lack of ecotoxicological data on pesticides under tropical conditions, but more tests with different soil invertebrates are needed to improve pesticide risk analysis.
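
As an illustration of how an EC50 of the kind reported above can be derived from avoidance responses, a minimal sketch fitting a two-parameter log-logistic curve to avoidance fractions; the concentration series and response values below are hypothetical, not the study's data.

import numpy as np
from scipy.optimize import curve_fit

def avoidance_curve(conc, ec50, slope):
    """Two-parameter log-logistic avoidance response rising from 0 to 1."""
    conc = np.asarray(conc, float)
    return conc**slope / (conc**slope + ec50**slope)

# hypothetical avoidance fractions at an illustrative concentration series (mg a.i./kg d.w.)
conc = np.array([0.0, 0.44, 0.85, 1.75, 3.5, 7.0])
avoid = np.array([0.05, 0.10, 0.20, 0.40, 0.45, 0.80])

params, _ = curve_fit(avoidance_curve, conc, avoid, p0=[3.0, 2.0],
                      bounds=([0.01, 0.1], [20.0, 10.0]))
print(f"estimated EC50 ~ {params[0]:.2f} mg/kg d.w., slope ~ {params[1]:.2f}")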

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose a new three-parameter long-term lifetime distribution with decreasing, increasing and unimodal hazard functions, induced by a latent complementary risk framework: the long-term complementary exponential geometric distribution. The new distribution arises from latent complementary risk scenarios, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks, together with the presence of long-term survivors. The properties of the proposed distribution are discussed, including its probability density function and explicit algebraic formulas for its reliability, hazard and quantile functions and order statistics. Parameter estimation is based on the usual maximum-likelihood approach. A simulation study assesses the performance of the estimation procedure. We compare the new distribution with its particular cases, as well as with the long-term Weibull distribution, on three real data sets, observing its potential and competitiveness in comparison with some usual long-term lifetime distributions.
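
A minimal sketch of the latent structure described above, under the usual long-term (cure fraction) formulation: with probability p an individual is a long-term survivor; otherwise its lifetime is the maximum of a geometric number of exponential latent times. Parameter names and the censoring scheme here are illustrative assumptions, not the paper's notation.

import numpy as np

rng = np.random.default_rng(7)

def simulate_ltceg(n, p_cure, theta, lam, censor_time=30.0):
    """Simulate right-censored lifetimes from a long-term complementary
    exponential geometric structure (sketch; parametrization assumed)."""
    times = np.empty(n)
    for i in range(n):
        if rng.random() < p_cure:
            times[i] = np.inf                           # long-term survivor: never fails
        else:
            m = rng.geometric(theta)                    # latent number of complementary risks
            times[i] = rng.exponential(1.0 / lam, size=m).max()  # observed maximum
    observed = np.minimum(times, censor_time)
    event = (times <= censor_time).astype(int)          # 1 = failure observed, 0 = censored
    return observed, event

t, d = simulate_ltceg(1000, p_cure=0.2, theta=0.4, lam=0.1)
print(f"observed failures: {d.sum()} of {d.size}")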