914 results for Random regression


Relevance: 20.00%

Publisher:

Abstract:

Objectives: We studied the association between cigarette smoking and ovarian cancer in a population-based case-control study. Methods: A total of 794 women with histologically confirmed epithelial ovarian cancer who were aged 18-79 years and resident in one of three Australian states were interviewed, together with 855 controls aged 18-79 years selected at random from the electoral roll from the same states. Information was obtained about cigarette smoking and other factors including age, parity, oral contraceptive use, and reproductive factors. We estimated the relative risk of ovarian cancer associated with cigarette smoking, accounting for histologic type, using multivariable logistic regression to adjust for confounding factors. Results: Women who had ever smoked cigarettes were more likely to develop ovarian cancer than women who had never smoked (adjusted odds ratio (OR) = 1.5; 95% confidence interval (CI) = 1.2-1.9). Risk was greater for ovarian cancers of borderline malignancy (OR = 2.4; 95% CI = 1.4-4.1) than for invasive tumors (OR = 1.7; 95% CI = 1.2-2.4) and the histologic subtype most strongly associated overall was the mucinous subtype among both current smokers (OR = 3.2; 95% CI = 1.8-5.7) and past smokers (OR = 2.3; 95% CI = 1.3-3.9). Conclusions: These data extend recent findings and suggest that cigarette smoking is a risk factor for ovarian cancer, especially mucinous and borderline mucinous types. From a public health viewpoint, this is one of the few reports of a potentially avoidable risk factor for ovarian cancer.
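The crude (unadjusted) odds ratio behind estimates like these can be computed directly from a 2x2 exposure table, with a Woolf confidence interval on the log scale. A minimal sketch with hypothetical counts, not the study's data; the adjusted ORs reported above additionally require multivariable logistic regression:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 case-control table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(300, 494, 250, 605)
```

An OR above 1 with a lower confidence limit above 1 is what supports a "risk factor" reading; adjustment for confounders such as parity and oral contraceptive use can move the crude estimate in either direction.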

Abstract:

Purpose: To determine whether constriction of proximal arterial vessels precedes involution of the distal hyaloid vasculature in the mouse under normal conditions, and whether this vasoconstriction is less pronounced when the distal hyaloid network persists, as it does in oxygen-induced retinopathy (OIR). Methods: Photomicrographs of the vasa hyaloidea propria were analysed from pre-term pups (1-2 days prior to birth) and on Days 1-11 post-birth. The OIR model involved exposing pups to ~90% O2 from D1-5, followed by return to ambient air. At sampling times pups were anaesthetised and perfused with India ink. Retinal flatmounts were also incubated with FITC-lectin (BS-1, G. simplicifolia); this labels all vessels, allowing identification of vessels not patent to the perfusate. Results: Mean diameter of proximal hyaloid vessels in preterm pups was 25.44 ± 1.98 μm (±1 SEM). Within 3-12 hrs of birth, significant vasoconstriction was evident (diameter: 12.45 ± 0.88 μm), and normal hyaloid regression subsequently occurred. Similar vasoconstriction occurred in the O2-treated group, but this was reversed upon return to room air, with significant dilation of proximal vessels by D7 (diameter: 31.75 ± 11.99 μm), and distal hyaloid vessels subsequently became enlarged and tortuous. Conclusions: Under normal conditions, vasoconstriction of proximal hyaloid vessels occurs at birth, preceding attenuation of distal hyaloid vessels. Vasoconstriction also occurs in O2-treated pups during treatment, but upon return to room air the remaining hyaloid vessels dilate proximally and the distal vessels become dilated and tortuous. These observations support the contention that regression of the hyaloid network depends, in the first instance, on proximal arterial vasoconstriction.

Abstract:

In many occupational safety interventions, the objective is to reduce the injury incidence as well as the mean claims cost once injury has occurred. The claims cost data within a period typically contain a large proportion of zero observations (no claim). The distribution thus comprises a point mass at 0 mixed with a non-degenerate parametric component. Essentially, the likelihood function can be factorized into two orthogonal components. These two components relate respectively to the effect of covariates on the incidence of claims and the magnitude of claims, given that claims are made. Furthermore, the longitudinal nature of the intervention inherently imposes some correlation among the observations. This paper introduces a zero-augmented gamma random effects model for analysing longitudinal data with many zeros. Adopting the generalized linear mixed model (GLMM) approach reduces the original problem to the fitting of two independent GLMMs. The method is applied to evaluate the effectiveness of a workplace risk assessment teams program, trialled within the cleaning services of a Western Australian public hospital.
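The factorization described above can be sketched as a log-likelihood that splits into a Bernoulli incidence term and a gamma magnitude term evaluated only on the positive claims. A minimal illustration with hypothetical parameters, not the random-effects GLMM fitted in the paper (which also models the longitudinal correlation):

```python
import math

def zag_loglik(y, p, shape, scale):
    """Log-likelihood of a zero-augmented gamma model: each observation
    is zero with probability 1 - p, and with probability p is drawn from
    a Gamma(shape, scale) density. The two terms share no parameters,
    so they can be maximized separately."""
    ll_incidence = 0.0  # Bernoulli component: claim vs no claim
    ll_magnitude = 0.0  # gamma component: claim size, positives only
    for yi in y:
        if yi == 0:
            ll_incidence += math.log(1 - p)
        else:
            ll_incidence += math.log(p)
            ll_magnitude += ((shape - 1) * math.log(yi) - yi / scale
                             - shape * math.log(scale) - math.lgamma(shape))
    return ll_incidence + ll_magnitude
```

Because the gamma term is unchanged by p, the incidence probability is maximized at the observed fraction of non-zero claims regardless of the magnitude parameters, which is the orthogonality the paper exploits to fit two independent GLMMs.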

Abstract:

A finite-element method is used to study the elastic properties of random three-dimensional porous materials with highly interconnected pores. We show that Young's modulus, E, is practically independent of the Poisson's ratio of the solid phase, ν_s, over the entire solid fraction range, and that Poisson's ratio, ν, becomes independent of ν_s as the percolation threshold is approached. We represent this behaviour of ν in a flow diagram. This interesting but approximate behaviour is very similar to the exactly known behaviour in two-dimensional porous materials. In addition, the behaviour of ν versus ν_s appears to imply that information in the dilute porosity limit can affect behaviour in the percolation threshold limit. We summarize the finite-element results in terms of simple structure-property relations, instead of tables of data, to make it easier to apply the computational results. Without accurate numerical computations, one is limited to various effective medium theories and rigorous approximations such as bounds and expansions. The accuracy of these equations is unknown for general porous media. To verify a particular theory it is important to check that it predicts both isotropic elastic moduli; i.e., prediction of Young's modulus alone is necessary but not sufficient. The subtleties of Poisson's ratio behaviour actually provide a very effective method for distinguishing between the theories and demonstrating their ranges of validity. We find that for moderate- to high-porosity materials none of the analytical theories is accurate and, at present, numerical techniques must be relied upon.

Abstract:

Recently, several groups have investigated quantum analogues of random walk algorithms, both on a line and on a circle. It has been found that the quantum versions have markedly different features from the classical versions: the variance on the line, and the mixing time on the circle, increase quadratically faster in the quantum versions than in the classical versions. Here, we propose a scheme to implement the quantum random walk on a line and on a circle in an ion trap quantum computer. With current ion trap technology, the number of steps that could be experimentally implemented will be relatively small. However, we show how the enhanced features of these walks could be observed experimentally. In the limit of strong decoherence, the quantum random walk tends to the classical random walk. By measuring the degree to which the walk "remains quantum", this algorithm could serve as an important benchmarking protocol for ion trap quantum computers.
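The quadratically enhanced spreading on the line can be reproduced with a standard discrete-time Hadamard-coin walk. A minimal simulation sketch, not the ion-trap protocol proposed in the paper; the symmetric initial coin state is chosen so the distribution stays centred at the origin:

```python
import math

def hadamard_walk(steps):
    """Discrete-time quantum walk on the line with a Hadamard coin.
    Returns the position probability distribution after `steps` steps."""
    # amp[(position, coin)] -> complex amplitude; coin 0 moves left, 1 moves right
    amp = {(0, 0): 1 / math.sqrt(2), (0, 1): 1j / math.sqrt(2)}  # symmetric start
    for _ in range(steps):
        new = {}
        for (pos, coin), a in amp.items():
            h = a / math.sqrt(2)
            # Hadamard coin flip followed by the conditional shift
            for c2, a2 in ((0, h), (1, h if coin == 0 else -h)):
                key = (pos - 1, c2) if c2 == 0 else (pos + 1, c2)
                new[key] = new.get(key, 0) + a2
        amp = new
    prob = {}
    for (pos, _), a in amp.items():
        prob[pos] = prob.get(pos, 0.0) + abs(a) ** 2
    return prob

def variance(prob):
    mean = sum(pos * q for pos, q in prob.items())
    return sum(q * (pos - mean) ** 2 for pos, q in prob.items())
```

After t steps the classical unbiased walk has variance t, while the quantum walk's variance grows like t squared, which is the contrast the proposed experiment is designed to observe.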

Abstract:

This study compared an enzyme-linked immunosorbent assay (ELISA) to a liquid chromatography-tandem mass spectrometry (LC/MS/MS) technique for measurement of tacrolimus concentrations in adult kidney and liver transplant recipients, and investigated how assay choice influenced pharmacokinetic parameter estimates and drug dosage decisions. Tacrolimus concentrations measured by both ELISA and LC/MS/MS from 29 kidney (n = 98 samples) and 27 liver (n = 97 samples) transplant recipients were used to evaluate the performance of these methods in the clinical setting. Tacrolimus concentrations measured by the two techniques were compared via regression analysis. Population pharmacokinetic models were developed independently using ELISA and LC/MS/MS data from 76 kidney recipients. Derived kinetic parameters were used to formulate typical dosing regimens for concentration targeting. Dosage recommendations for the two assays were compared. The relation between LC/MS/MS and ELISA measurements was best described by the regression equation ELISA = 1.02 × (LC/MS/MS) + 0.14 in kidney recipients, and ELISA = 1.12 × (LC/MS/MS) - 0.87 in liver recipients. ELISA displayed less accuracy than LC/MS/MS at lower tacrolimus concentrations. Population pharmacokinetic models based on ELISA and LC/MS/MS data were similar, with residual random errors of 4.1 ng/mL and 3.7 ng/mL, respectively. Assay choice gave rise to dosage prediction differences ranging from 0% to 30%. ELISA measurements of tacrolimus are not automatically interchangeable with LC/MS/MS values. Assay differences were greatest in adult liver recipients, probably reflecting periods of liver dysfunction and impaired biliary secretion of metabolites. While the majority of data collected in this study suggested that assay differences in adult kidney recipients were minimal, findings of ELISA dosage underpredictions of up to 25% in the long term must be investigated further.
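A between-assay regression line of the form reported above can be fitted by ordinary least squares. A minimal sketch on hypothetical paired concentrations; note that when the reference method itself carries measurement error, method-comparison studies often prefer Deming regression, since plain OLS attenuates the slope:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for paired assay
    measurements, regressing y (e.g. ELISA) on x (e.g. LC/MS/MS)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx
```

A slope near 1 and intercept near 0 would indicate interchangeable assays; the liver-recipient equation above (slope 1.12, intercept -0.87) departs from both.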

Abstract:

This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for simulations of any size. It can also facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals because, at each step, a small group of random variables is simulated with an LU decomposition of an updated conditional covariance matrix of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
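The core of LU simulation, generating a correlated Gaussian vector from a factor of the covariance matrix, can be sketched with a small Cholesky factorization (the symmetric special case of LU). The paper's successive-residuals scheme applies this idea group by group with updated conditional covariances rather than factoring the full matrix at once. A minimal sketch:

```python
import math
import random

def cholesky(c):
    """Lower-triangular L with L Lᵀ = C for a symmetric positive-definite
    covariance matrix C (the factor used in LU/Cholesky simulation)."""
    n = len(c)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(c[i][i] - s)
            else:
                L[i][j] = (c[i][j] - s) / L[j][j]
    return L

def simulate_field(L, rng=random):
    """One unconditional realization y = L z with z ~ N(0, I),
    so that Cov(y) = L Lᵀ = C."""
    n = len(L)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
```

The size limitation the paper addresses comes from having to factor the full n × n covariance matrix; partitioning L by columns lets each small group be simulated and then absorbed as kriging residuals for the next group.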

Abstract:

We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.

Abstract:

Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a ≥ 0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on a multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.

Abstract:

The objective of this dissertation is to analyze the relationship between executive compensation and performance in Brazilian publicly traded companies listed on the BM&FBOVESPA. The theoretical approach assumes that incentive contracts support the alignment of interests between shareholders and executives and act as a corporate governance mechanism to direct executives' efforts toward maximizing company value. The sample comprised the 100 most liquid companies, ranked by number of stock trades on the BM&FBOVESPA, during the period 2010-2012, totalling 296 observations. Data were extracted from the Reference Forms made available by the CVM and from the Economática® and Thomson Reuters® software. Eight research hypotheses were established, and multiple linear regression models were estimated with unbalanced panel data, employing total compensation and average individual compensation as dependent variables, and, as regressors, variables concerning operating performance, market value, size, ownership structure, and corporate governance, in addition to control variables. Logit regression models were estimated to identify the factors that explain the use of stock options, bonus programs, and a higher share of variable compensation. The results show that, in the selected sample, there is a positive relationship between executive compensation and market value. The mining, chemical, and oil and gas sectors were also found to exert a positive influence on executive compensation. Conversely, ownership concentration, state controlling ownership, and the company's listing in the Level 2 or Novo Mercado segments, as classified by BM&FBOVESPA, are inversely related to total compensation. Higher market value influences the use of stock options, as well as the use of bonuses, the latter also being affected by better accounting performance.
Robustness tests were also performed with random-effects estimations, regressions with clustered robust standard errors, and dynamic models, and the results were similar. We conclude that executive compensation is related to corporate value, generating wealth for shareholders, but that the absence of a relationship with operating performance suggests flaws in the compensation system, which still requires greater transparency and other governance mechanisms to align the interests of executives and shareholders.

Abstract:

Survival analysis is applied when the time until the occurrence of an event is of interest. Such data are routinely collected for plant diseases, although applications of the method are uncommon. The objective of this study was to use two studies on post-harvest diseases of peaches, considering the two harvests jointly and a random effect shared by fruits of the same tree, to describe the main techniques of survival analysis. The nonparametric Kaplan-Meier method, the log-rank test, and the semi-parametric Cox proportional hazards model were used to estimate the effect of cultivars and of the number of days after full bloom on survival until the appearance of brown rot symptoms, and on the instantaneous risk of expressing them, in two consecutive harvests. The joint analysis, with a baseline effect varying between harvests, and the confirmation of the tree as a grouping factor with a random effect, were appropriate for interpreting the phenomenon (disease) evaluated, and can be important tools to replace or complement conventional analysis, respecting the nature of the variable and of the phenomenon.
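The Kaplan-Meier estimator mentioned above multiplies, at each event time, the fraction of at-risk subjects that survive it, with censored observations leaving the risk set without contributing an event. A minimal sketch on toy follow-up times, not the peach data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. `times` are sorted follow-up times;
    `events[i]` is 1 if the event was observed at times[i], 0 if censored.
    Returns (time, survival probability) pairs at each event time."""
    s = 1.0
    curve = []
    at_risk = len(times)
    i, n = 0, len(times)
    while i < n:
        t = times[i]
        d = removed = 0
        while i < n and times[i] == t:  # group ties at the same time
            d += events[i]
            removed += 1
            i += 1
        if d:
            s *= 1 - d / at_risk  # survive this event time
            curve.append((t, s))
        at_risk -= removed        # events and censorings both leave the risk set
    return curve
```

A shared-tree random effect, as in the study, would enter at the modelling stage (a frailty term in the Cox model); the nonparametric curve itself treats fruits as independent.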

Abstract:

This paper analyses the commercial and socio-demographic antecedents of the importance of price in buyers' decisions. The study uses ordinal regression to analyze data obtained from a random sample of consumers of frequently purchased products, surveyed in different stores. The results demonstrate that shopping enjoyment and brand loyalty influence the importance of price, whereas responsibility for shopping (purchase frequency) does not show a significant relationship. Furthermore, some interesting socio-demographic characteristics emerged in the context of the study that can be analyzed in future research.

Abstract:

OBJECTIVE: A case-control study of patients with pneumonia was conducted to investigate whether wheezing diseases could be a risk factor. METHODS: A random sample was taken from a general university hospital in S. Paulo City between March and August 1994, comprising 51 cases of pneumonia paired by age and sex with 51 non-respiratory controls and 51 healthy controls. Data collection was carried out by two senior paediatricians. Diagnoses of pneumonia and the presence of wheezing disease were independently established by each paediatrician for both cases and controls. Pneumonia was radiologically confirmed, and the repeatability of information on wheezing diseases was measured. Logistic regression analysis was used to identify risk factors. RESULTS: Wheezing diseases, interpreted as proxies of asthma, were found to be an important risk factor for pneumonia, with an odds ratio of 7.07 (95% CI = 2.34-21.36), when the effects of bedroom crowding (odds ratio = 1.49 per person, 95% CI = 0.95-2.32) and of low family income (odds ratio = 5.59 against high family income, 95% CI = 1.38-22.63) were controlled. The risk of pneumonia attributable to wheezing diseases is tentatively calculated at 51.42%. CONCLUSION: It is concluded that, at the practice level, asthmatic patients deserve proper surveillance for infection and that, at the public health level, pneumonia incidence could be reduced if current World Health Organisation guidelines were revised to include comprehensive care for this illness.
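An attributable-risk figure of the kind reported above can be obtained from Miettinen's case-based formula for the population attributable fraction, which needs only the proportion of cases exposed and the odds ratio (standing in for the relative risk under a rare-disease assumption). A minimal sketch; the 60% exposure proportion among cases used below is a hypothetical value chosen for illustration, not a figure from the study:

```python
def attributable_fraction_population(p_cases_exposed, odds_ratio):
    """Population attributable fraction (Miettinen's formula): the share
    of all cases attributable to the exposure, from the proportion of
    cases exposed and the odds ratio as a relative-risk approximation."""
    return p_cases_exposed * (odds_ratio - 1) / odds_ratio
```

With the abstract's OR of 7.07 and a hypothetical 60% of cases exposed to wheezing disease, the fraction comes out near the 51% the authors report; an OR of 1 makes the fraction zero, as it should.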