992 results for regression discontinuity design
Abstract:
Background Regression to the mean (RTM) is a statistical phenomenon that can make natural variation in repeated data look like real change. It happens when unusually large or small measurements tend to be followed by measurements that are closer to the mean. Methods We give some examples of the phenomenon, and discuss methods to overcome it at the design and analysis stages of a study. Results The effect of RTM in a sample becomes more noticeable with increasing measurement error and when follow-up measurements are only examined on a sub-sample selected using a baseline value. Conclusions RTM is a ubiquitous phenomenon in repeated data and should always be considered as a possible cause of an observed change. Its effect can be alleviated through better study design and use of suitable statistical methods.
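A minimal simulation sketch (Python, with arbitrary simulated values rather than data from the paper) illustrating how selecting a sub-sample on an extreme baseline measurement produces an apparent change at follow-up through RTM alone:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_value = rng.normal(100, 10, n)            # stable underlying quantity
baseline = true_value + rng.normal(0, 5, n)    # measurement error at baseline
follow_up = true_value + rng.normal(0, 5, n)   # independent error at follow-up

# Select the sub-sample with unusually high baseline readings
high = baseline > 115
print(f"mean baseline (selected):  {baseline[high].mean():.1f}")
print(f"mean follow-up (selected): {follow_up[high].mean():.1f}")
# The follow-up mean is closer to 100 even though nothing changed:
# regression to the mean driven purely by measurement error and selection.
```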
Abstract:
Background. Children of alcoholics are significantly more likely to experience high-risk environmental exposures, including prenatal substance exposure, and are more likely to exhibit externalizing problems [e.g. attention deficit hyperactivity disorder (ADHD)]. While there is evidence that genetic influences and prenatal nicotine and/or alcohol exposure play separate roles in determining risk of ADHD, little has been done to determine the joint roles that genetic risk associated with maternal alcohol use disorder (AUD) and prenatal risk factors play in determining risk of ADHD. Method. Using a children-of-twins design, diagnostic telephone interview data from high-risk families (female monozygotic and dizygotic twins concordant or discordant for AUD as parents) and control families targeted from a large Australian twin cohort were analyzed using logistic regression models. Results. Offspring of twins with a history of AUD, as well as offspring of non-AUD monozygotic twins whose co-twin had AUD, were significantly more likely to exhibit ADHD than offspring of controls. This pattern is consistent with a genetic explanation for the association between maternal AUD and increased offspring risk of ADHD. Adjustment for prenatal smoking, which remained significantly predictive, did not remove the significant genetic association between maternal AUD and offspring ADHD. Conclusions. While maternal smoking during pregnancy probably contributes to the association between maternal AUD and offspring ADHD risk, the evidence for a significant genetic correlation suggests: (i) pleiotropic genetic effects, with some genes that influence risk of AUD also influencing vulnerability to ADHD; or (ii) that ADHD is a direct risk factor for AUD.
Abstract:
Count data with excess zeros relative to a Poisson distribution are common in many biomedical applications. A popular approach to the analysis of such data is to use a zero-inflated Poisson (ZIP) regression model. Often, because of the hierarchical study design or the data collection procedure, zero-inflation and lack of independence may occur simultaneously, which renders the standard ZIP model inadequate. To account for the preponderance of zero counts and the inherent correlation of observations, a class of multi-level ZIP regression models with random effects is presented. Model fitting is facilitated using an expectation-maximization algorithm, whereas variance components are estimated via residual maximum likelihood estimating equations. A score test for zero-inflation is also presented. The multi-level ZIP model is then generalized to cope with a more complex correlation structure. Application to the analysis of correlated count data from a longitudinal infant feeding study illustrates the usefulness of the approach.
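As a rough illustration of the single-level version of this model, the following Python sketch fits a zero-inflated Poisson regression to simulated data using statsmodels; the multi-level extension with random effects and the EM/REML estimation described in the abstract are not shown:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)

# Simulate zero-inflated counts: structural zeros with probability 0.3,
# otherwise Poisson with a log-mean depending on x.
structural_zero = rng.random(n) < 0.3
counts = np.where(structural_zero, 0, rng.poisson(np.exp(0.5 + 0.8 * x)))

# Single-level ZIP fit by maximum likelihood, with an intercept-only
# inflation component.
zip_model = ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1)))
result = zip_model.fit(disp=False)
print(result.summary())
```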
Abstract:
The kinematic mapping of a rigid open-link manipulator is a homomorphism between Lie groups. The homomorphism has solution groups that act on an inverse kinematic solution element. A canonical representation of solution group operators that act on a solution element of three and seven degree-of-freedom (dof) dextrous manipulators is determined by geometric analysis. Seven canonical solution groups are determined for the seven dof Robotics Research K-1207 and Hollerbach arms. The solution element of a dextrous manipulator is a collection of trivial fibre bundles with solution fibres homotopic to the torus. If fibre solutions are parameterised by a scalar, a direct inverse function that maps the scalar and Cartesian base space coordinates to solution element fibre coordinates may be defined. A direct inverse parameterisation of a solution element may be approximated by a local linear map generated by an inverse augmented Jacobian correction of a linear interpolation. The action of canonical solution group operators on a local linear approximation of the solution element of inverse kinematics of dextrous manipulators generates cyclical solutions. The solution representation is proposed as a model of inverse kinematic transformations in primate nervous systems. Simultaneous calibration of a composition of stereo-camera and manipulator kinematic models is under-determined by equi-output parameter groups in the composition of stereo-camera and Denavit-Hartenberg (DH) models. An error measure for simultaneous calibration of a composition of models is derived, and parameter subsets with no equi-output groups are determined by numerical experiments to simultaneously calibrate the composition of homogeneous or pan-tilt stereo-camera with DH models. For acceleration of exact Newton second-order re-calibration of DH parameters after a sequential calibration of stereo-camera and DH parameters, an optimal numerical evaluation of DH matrix first-order and second-order error derivatives with respect to a re-calibration error function is derived, implemented and tested. A distributed object environment for point-and-click image-based tele-command of manipulators and stereo-cameras is specified and implemented that supports rapid prototyping of numerical experiments in distributed system control. The environment is validated by a hierarchical k-fold cross-validated calibration to Cartesian space of a radial basis function regression correction of an affine stereo model. Basic design and performance requirements are defined for scalable virtual micro-kernels that broker inter-Java-virtual-machine remote method invocations between components of secure, manageable, fault-tolerant, open, distributed, agile, Total Quality Managed, ISO 9000+ conformant, Just-in-Time manufacturing systems.
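As a loose, self-contained illustration of one component mentioned above, a radial basis function regression scored by k-fold cross-validation, the following Python sketch uses simulated stand-in data; the variable names and values are hypothetical and do not come from the thesis:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Hypothetical stand-in data: image-plane coordinates mapped to the Cartesian
# residual left over after an affine stereo model (both simulated here).
image_coords = rng.uniform(-1, 1, size=(300, 4))       # (u, v) in two cameras
affine_residual = np.sin(image_coords @ rng.normal(size=4)) \
                  + 0.05 * rng.normal(size=300)

# RBF regression correction, scored by k-fold cross-validation.
rbf_correction = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0)
scores = cross_val_score(rbf_correction, image_coords, affine_residual,
                         cv=5, scoring="neg_mean_squared_error")
print("CV mean squared error per fold:", -scores)
```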
Abstract:
Purpose: The purpose of this paper is to examine the effect of the quality of senior management leadership on social support and job design, and to assess their main effects on strains and their moderating effects on the relationships between work stressors and strains. Design/methodology/approach: A survey involving distribution of questionnaires was carried out on a random sample of health care employees in acute hospital practice in the UK. The sample comprised 65,142 respondents. The work stressors tested were quantitative overload and hostile environment, whereas strains were measured through job satisfaction and turnover intentions. Structural equation modelling and moderated regression analyses were used in the analysis. Findings: Quality of senior management leadership explained 75 per cent and 94 per cent of the variance of social support and job design respectively, whereas work stressors explained 51 per cent of the variance of strains. Social support and job design predicted job satisfaction and turnover intentions, and significantly moderated the relationships between quantitative workload/hostility and job satisfaction/turnover intentions. Research limitations/implications: The findings are useful to management and to health employees working in acute/specialist hospitals. Further research could be done in other countries to take into account cultural differences and variations in health systems. The limitations included self-reported data and percept-percept bias due to same-source data collection. Practical implications: The quality of senior management leaders in hospitals has an impact on the social environment, the support given to health employees, their job design, as well as the work stressors and strains perceived. Originality/value: The study argues in favour of effective senior management leadership of hospitals, as well as ensuring adequate support structures and job design. The findings may be useful to health policy makers and human resources managers. © Emerald Group Publishing Limited.
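A minimal sketch of the moderated regression step described above, using simulated data and hypothetical variable names (workload as stressor, support as moderator, job satisfaction as strain), could look as follows in Python:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "workload": rng.normal(size=n),   # work stressor
    "support": rng.normal(size=n),    # social support (moderator)
})
# Simulated strain: the stressor effect is weakened when support is high.
df["job_satisfaction"] = (-0.5 * df["workload"] + 0.3 * df["support"]
                          + 0.2 * df["workload"] * df["support"]
                          + rng.normal(size=n))

# Moderated regression: main effects plus the interaction term.
model = smf.ols("job_satisfaction ~ workload * support", data=df).fit()
print(model.summary())
```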
Abstract:
Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
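As a simplified illustration, the following Python sketch fits a Gaussian process emulator to a design with replicate observations using an explicit (constant) noise term; it does not implement the paper's heteroscedastic, input-dependent noise model or the Fisher-information-based design criterion:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)

# Replicated design: a few input locations, each evaluated several times,
# mimicking repeated runs of a stochastic simulator.
design_points = np.linspace(0, 1, 6)
X = np.repeat(design_points, 5).reshape(-1, 1)   # 5 replicates per point
y = np.sin(2 * np.pi * X.ravel()) + rng.normal(0, 0.2, X.shape[0])

# GP with an explicit white-noise term; hyperparameters (including the
# noise variance) are estimated by maximum marginal likelihood.
kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
print("Fitted kernel:", gp.kernel_)
```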
Abstract:
2000 Mathematics Subject Classification: 62J12, 62K15, 91B42, 62H99.
Abstract:
Surface modification by means of nanostructures is of interest to enhance boiling heat transfer in various applications including the organic Rankine cycle (ORC). With the goal of obtaining rough and dense aluminum oxide (Al₂O₃) nanofilms, the optimal combination of process parameters for electrophoretic deposition (EPD) based on the uniform design (UD) method is explored in this paper. The detailed procedures for the EPD process and UD method are presented. Four main influencing conditions controlling the EPD process were identified as nanofluid concentration, deposition time, applied voltage and suspension pH. A series of tests were carried out based on the UD experimental design. A regression model and statistical analysis were applied to the results. Sensitivity analyses of the effect of the four main parameters on the roughness and deposited mass of Al₂O₃ films were also carried out. The results showed that Al₂O₃ nanofilms were deposited compactly and uniformly on the substrate. Within the range of the experiments, the preferred combination of process parameters was determined to be a nanofluid concentration of 2 wt.%, deposition time of 15 min, applied voltage of 23 V and suspension pH of 3, yielding roughness and deposited mass of 520.9 nm and 161.6 × 10⁻⁴ g/cm², respectively. A verification experiment was carried out at these conditions and gave values of roughness and deposited mass within 8% error of the expected ones as determined from the UD approach. It is concluded that uniform design is useful for the optimization of electrophoretic deposition, requiring only 7 tests compared to 49 using the orthogonal design method.
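A minimal sketch of the regression and sensitivity step, with entirely hypothetical run data standing in for the uniform-design table, might look as follows in Python; the fitted coefficients give only a crude indication of the direction and relative influence of each factor:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Hypothetical stand-in for a small uniform-design run table
# (the study used 7 runs; the values below are simulated, not measured).
df = pd.DataFrame({
    "concentration": rng.uniform(0.5, 2.5, 7),   # wt.%
    "time": rng.uniform(5, 25, 7),               # min
    "voltage": rng.uniform(10, 30, 7),           # V
    "pH": rng.uniform(2, 6, 7),
})
df["roughness"] = (150 * df["concentration"] + 8 * df["time"]
                   + 10 * df["voltage"] - 20 * df["pH"]
                   + rng.normal(0, 20, 7))

# First-order regression model over the four design factors.
model = smf.ols("roughness ~ concentration + time + voltage + pH", data=df).fit()
print(model.params)
```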
Abstract:
Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that for many transportation problems, including AADT estimation, spatial context is important. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR-based methods considered the influence of correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations further away. The study area was Broward County, Florida. Broward County lies on the Atlantic coast between Palm Beach and Miami-Dade counties. In this study, a total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal household) were selected to develop the models. To investigate the predictive powers of various AADT predictors over space, statistics including local r-square, local parameter estimates, and local errors were examined and mapped. The local variations in relationships among parameters were investigated, measured, and mapped to assess the usefulness of GWR methods. The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and predictors, which cannot be modeled in ordinary linear regression.
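A minimal GWR sketch on simulated station data, assuming the mgwr package and hypothetical predictors (lanes and speed), could look as follows; bandwidth selection and the local parameter estimates mirror the workflow described above:

```python
import numpy as np
from mgwr.gwr import GWR
from mgwr.sel_bw import Sel_BW

rng = np.random.default_rng(6)
n = 200

# Hypothetical count-station data: coordinates, two predictors, and an
# AADT-like response whose relationship to the predictors drifts in space.
coords = rng.uniform(0, 50, size=(n, 2))
lanes = rng.integers(2, 8, n).astype(float)
speed = rng.uniform(40, 110, n)
beta_lanes = 2000 + 40 * coords[:, 0]   # spatially varying coefficient
aadt = 5000 + beta_lanes * lanes + 30 * speed + rng.normal(0, 2000, n)

X = np.column_stack([lanes, speed])
y = aadt.reshape(-1, 1)

# Bandwidth selection followed by a geographically weighted regression fit.
bw = Sel_BW(coords, y, X).search()
gwr_results = GWR(coords, y, X, bw).fit()
print("Selected bandwidth:", bw)
print("Local parameter estimates (first 5 rows):\n", gwr_results.params[:5])
print("Mean local R2:", gwr_results.localR2.mean())
```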
Abstract:
The primary purpose of this research is to study the linkage between perceived job design characteristics and information system environment characteristics before and after the replacement of a legacy information system with a new type of information system (referred to as an Enterprise Resource Planning or ERP system). A public state university implementing an academic version of an ERP system was selected for the study. Three survey instruments were used to examine the perception of the information system, the job characteristics, and the organizational culture before and after the system implementation. The research participants included two large departments, resulting in a sample of 130 workers. Research questions were analyzed using multivariate procedures including factor analysis, path analysis, step-wise regression, and matched pair analysis. Results indicated that the ERP system has introduced new elements into the working environment that have changed how the job design characteristics and organizational culture dimensions are viewed by the workers. The understanding of how the perceived system characteristics align with an individual's perceived job design characteristics is supported, with each of the system characteristics significantly correlated in the proposed direction. Stronger support for this relationship becomes visible in the causal flow of the effects seen in the path diagram and in the step-wise regression. The alignment of perceived job design characteristics with dimensions of organizational culture is not as strong as the literature suggests. Although there are significant correlations between the job and culture variables, only one relationship can be seen in the causal flow. This research has demonstrated that system characteristics of ERP do contribute to the perception of change in an organization and do support organizational culture behaviors and job characteristics.
Abstract:
Quantile regression (QR) was first introduced by Roger Koenker and Gilbert Bassett in 1978. It is robust to outliers, which can strongly affect the least squares estimator in linear regression. Instead of modeling the mean of the response, QR provides an alternative way to model the relationship between quantiles of the response and covariates. Therefore, QR can be widely used to solve problems in econometrics, environmental sciences and health sciences. Sample size is an important factor in the planning stage of experimental design and observational studies. In ordinary linear regression, sample size may be determined based on either precision analysis or power analysis with closed-form formulas. There are also methods that calculate sample size for QR based on precision analysis, such as Jennen-Steinmetz and Wellek (2005). A method to estimate sample size for QR based on power analysis was proposed by Shao and Wang (2009). In this paper, a new method is proposed to calculate sample size based on power analysis under a hypothesis test of covariate effects. Even though an error distribution assumption is not necessary for QR analysis itself, researchers have to make assumptions about the error distribution and covariate structure in the planning stage of a study to obtain a reasonable estimate of sample size. In this project, both parametric and nonparametric methods are provided to estimate the error distribution. Since the proposed method is implemented in R, the user is able to choose either a parametric distribution or nonparametric kernel density estimation for the error distribution. The user also needs to specify the covariate structure and effect size to carry out the sample size and power calculation. The performance of the proposed method is further evaluated using numerical simulation. The results suggest that the sample sizes obtained from our method provide empirical powers that are close to the nominal power level, for example, 80%.
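The paper's implementation is in R; purely as an illustration of the simulation-based power idea, the following Python sketch estimates empirical power for testing a covariate effect in quantile regression, assuming a Student-t error distribution and a single covariate (both planning-stage assumptions, not prescriptions from the paper):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

def empirical_power(n, beta=0.5, tau=0.5, n_sims=200, alpha=0.05):
    """Simulated power for testing the covariate effect in quantile regression."""
    rejections = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        # Assumed error distribution for the planning-stage calculation.
        y = 1.0 + beta * x + rng.standard_t(df=3, size=n)
        res = sm.QuantReg(y, sm.add_constant(x)).fit(q=tau)
        rejections += res.pvalues[1] < alpha
    return rejections / n_sims

# Scan candidate sample sizes for the one reaching the target power (e.g. 80%).
for n in (50, 100, 200):
    print(n, empirical_power(n))
```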
Abstract:
Objectives: To determine if providing informal care to a co-resident with dementia symptoms places an additional risk on the likelihood of poor mental health or mortality compared to co-resident non-caregivers.
Design: A quasi-experimental design of caregiving and non-caregiving co-residents of individuals with dementia symptoms, providing a natural comparator for the additive effects of caregiving on top of living with an individual with dementia symptoms.
Methods: Census records, providing information on household structure, intensity of caregiving, presence of dementia symptoms and self-reported mental health, were linked to mortality records over the following 33 months. Multi-level regression models were constructed to determine the risk of poor mental health and death in co-resident caregivers of individuals with dementia symptoms compared to co-resident non-caregivers, adjusting for the clustering of individuals within households.
Results: The cohort consisted of 10,982 co-residents (55.1% caregivers), with 12.1% of non-caregivers reporting poor mental health compared to 8.4% of intense caregivers (>20 hours of care per week). During follow-up the cohort experienced 560 deaths (245 to caregivers). Overall, caregiving co-residents were at no greater risk of poor mental health but had lower mortality risk than non-caregiving co-residents (ORadj=0.93, 95% CI 0.79, 1.10 and ORadj=0.67, 95% CI 0.56, 0.81, respectively); this lower mortality risk was also seen amongst the most intensive caregivers (ORadj=0.65, 95% CI 0.53, 0.79).
Conclusion: Caregiving poses no additional risk to mental health over and above the risk associated with merely living with someone with dementia, and is associated with a lower mortality risk compared to non-caregiving co-residents.
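As one way to account for the clustering of co-residents within households described in the Methods above, the following hedged Python sketch fits a logistic model with an exchangeable working correlation via GEE on simulated data; the variable names and effect sizes are hypothetical, and the original study's multi-level specification may differ:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n_households = 2000

# Hypothetical co-resident data: two adults per household, one possibly caregiving.
df = pd.DataFrame({
    "household": np.repeat(np.arange(n_households), 2),
    "caregiver": rng.integers(0, 2, 2 * n_households),
    "age": rng.normal(60, 10, 2 * n_households),
})
logit_p = -2.0 - 0.1 * df["caregiver"] + 0.02 * (df["age"] - 60)
df["poor_mental_health"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic model with an exchangeable correlation structure within households,
# one common way to adjust for the clustering of co-residents.
model = smf.gee("poor_mental_health ~ caregiver + age", groups="household",
                data=df, family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```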
Abstract:
The flow rates of the drying and nebulizing gases, the heat block and desolvation line temperatures and the interface voltage are electrospray ionization parameters that can affect the sensitivity of the mass spectrometer. The conditions that give higher sensitivity for 13 pharmaceuticals were explored. First, a Plackett-Burman design was implemented to screen significant factors, and it was concluded that interface voltage and nebulizing gas flow were the only factors that influence the intensity signal for all pharmaceuticals. This fractional factorial design was then projected onto a full 2² factorial design with center points. The lack-of-fit test proved to be significant. Then, a central composite face-centered design was conducted. Finally, a stepwise multiple linear regression was fitted and an optimization problem was subsequently solved. Two main drug clusters were found concerning the signal intensities of all runs of the augmented factorial design. p-Aminophenol, salicylic acid, and nimesulide constitute one cluster as a result of showing much higher sensitivity than the remaining drugs. The other cluster is more homogeneous, with some sub-clusters comprising one pharmaceutical and its respective metabolite. It was observed that the instrumental signal increased when both significant factors increased, with the maximum signal occurring when both codified factors are set at level +1. It was also found that, for most of the pharmaceuticals, interface voltage influences the intensity of the instrument more than the nebulizing gas flow rate. The only exceptions are nimesulide, where the relative importance of the factors is reversed, and salicylic acid, where both factors equally influence the instrumental signal.
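A minimal sketch of the full 2² factorial design with center points and the subsequent regression step, using hypothetical intensities rather than the study's measurements, could look as follows in Python:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)

# 2^2 full factorial in coded units for the two significant factors
# (interface voltage, nebulizing gas flow), plus three center points.
design = pd.DataFrame(
    [(-1, -1), (-1, 1), (1, -1), (1, 1), (0, 0), (0, 0), (0, 0)],
    columns=["voltage", "gas_flow"],
)
# Hypothetical signal intensities; in the study the maximum occurred at (+1, +1).
design["intensity"] = (100 + 30 * design["voltage"] + 15 * design["gas_flow"]
                       + 5 * design["voltage"] * design["gas_flow"]
                       + rng.normal(0, 2, len(design)))

# First-order model with interaction; the replicated center points support a
# curvature / lack-of-fit check, which in the study motivated moving on to a
# central composite design.
model = smf.ols("intensity ~ voltage * gas_flow", data=design).fit()
print(model.summary())
```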
Abstract:
Objectives: To examine temporal trends, study-design-related determinants, and the quality of the response rates reported in case-control studies of cancer published over the last 30 years. Methods: A review of case-control studies of cancer was conducted. The inclusion criteria were publication (i) in one of 15 major targeted journals and (ii) during one of four publication periods (1984-1986, 1995, 2005 and 2013) spanning three decades. 370 studies were selected and examined. The methodology related to subject recruitment and data collection, the characteristics of the population, the participation rates and the reasons for non-participation were extracted from these studies. Descriptive statistics were used to summarise the quality of the reported response rates (as a function of the amount of information available), the temporal trends and the determinants of the response rates; linear regression models were used to analyse the temporal trends and the determinants of the participation rates. Results: Overall, the quality of the reported response rates and reasons for non-participation was very low, particularly among controls. Participation has declined over the past 30 years, and this decline is more marked in studies conducted after 2000. When response rates in recent studies are compared with those of studies conducted during 1971 to 1980, there is a larger decrease among population-based controls (-17.04%, 95% CI: -23.17%, -10.91%) than among cases (-5.99%, 95% CI: -11.50%, -0.48%). The statistically significant determinants of the response rate among cases were: the type of cancer examined, the geographic location of the study population, and the mode of data collection. The only statistically significant determinant of the response rate among hospital-based controls was their geographic location. The only statistically significant determinant of the participation rate among population-based controls was the type of respondent (subject alone or accompanied by a proxy). Conclusion: Participation rates in case-control studies of cancer appear to have declined over the past 30 years, and this decline appears to be more marked in recent studies. In order to assess the true extent of non-participation and its determinants, as well as the impact of non-participation on study validity, published studies need to use a standardised approach to calculate their participation rates and to report them transparently.