889 results for Heterogeneous regression
Abstract:
BACKGROUND: We assessed the impact of a multicomponent worksite health promotion program for reducing cardiovascular risk factors (CVRF) with a short intervention, adjusting for regression towards the mean (RTM), which affects such nonexperimental studies without a control group. METHODS: A cohort of 4,198 workers (aged 42 +/- 10 years, range 16-76 years, 27% women) was analyzed at a 3.7-year interval and stratified by each CVRF risk category (low/medium/high blood pressure [BP], total cholesterol [TC], body mass index [BMI], and smoking) with RTM and secular trend adjustments. The intervention consisted of a 15-min CVRF screening and individualized counseling by health professionals for medium- and high-risk individuals, with eventual physician referral. RESULTS: Participants in high-risk groups improved diastolic BP (-3.4 mm Hg [95%CI: -5.1, -1.7]) in 190 hypertensive patients, TC (-0.58 mmol/l [-0.71, -0.44]) in 693 hypercholesterolemic patients, and smoking (-3.1 cig/day [-3.9, -2.3]) in 808 smokers, while systolic BP changes reflected RTM. Low-risk individuals without counseling deteriorated in TC and BMI. Body weight increased uniformly in all risk groups (+0.35 kg/year). CONCLUSIONS: Under real-world conditions, participants in the short intervention program in high-risk groups for diastolic BP, TC, and smoking improved their CVRF, whereas low-risk TC and BMI groups deteriorated. Future programs may include specific advice to low-risk groups to maintain a favorable CVRF profile.
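The RTM adjustment mentioned above can be sketched numerically. The classical single-correlation formulation below is purely illustrative; the function name, the test-retest correlation parameter, and the population mean are assumptions, not the study's actual adjustment (which also included secular trend corrections):

```python
import numpy as np

def rtm_adjusted_change(baseline, followup, pop_mean, test_retest_r):
    """Classical RTM correction (illustrative): the expected spurious change
    for a subject selected on a baseline value is
    (1 - r) * (pop_mean - baseline); subtracting it from the observed
    change leaves the RTM-adjusted effect."""
    baseline = np.asarray(baseline, dtype=float)
    followup = np.asarray(followup, dtype=float)
    observed_change = followup - baseline
    rtm_effect = (1.0 - test_retest_r) * (pop_mean - baseline)
    return observed_change - rtm_effect
```

With perfect test-retest correlation (r = 1) there is no RTM and the adjusted change equals the observed change; the lower the correlation, the larger the share of an extreme group's apparent improvement that is attributed to RTM rather than to the intervention.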
Abstract:
The main goal of CleanEx is to provide access to public gene expression data via unique gene names. A second objective is to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-data set comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of human genes and genes from model organisms. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resource, such as cDNA clones or Affymetrix probe sets. The web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria. CleanEx is accessible at: http://www.cleanex.isb-sib.ch/.
Abstract:
There is increasing evidence to suggest that the presence of mesoscopic heterogeneities constitutes an important seismic attenuation mechanism in porous rocks. As a consequence, centimetre-scale perturbations of the rock physical properties should be taken into account for seismic modelling whenever detailed and accurate responses of specific target structures are desired, which is, however, computationally prohibitive. A convenient way to circumvent this problem is to use an upscaling procedure to replace each of the heterogeneous porous media composing the geological model by corresponding equivalent visco-elastic solids and to solve the visco-elastic equations of motion for the inferred equivalent model. While the overall qualitative validity of this procedure is well established, there are as yet no quantitative analyses regarding the equivalence of the seismograms resulting from the original poro-elastic and the corresponding upscaled visco-elastic models. To address this issue, we compare poro-elastic and visco-elastic solutions for a range of marine-type models of increasing complexity. We found that despite the identical dispersion and attenuation behaviour of the heterogeneous poro-elastic and the equivalent visco-elastic media, the seismograms may differ substantially due to differing boundary conditions, for which additional options exist in the poro-elastic case. In particular, we observe that at the fluid/porous-solid interface, the poro- and visco-elastic seismograms agree for closed-pore boundary conditions, but differ significantly for open-pore boundary conditions. This is an important result with potentially far-reaching implications for wave-equation-based algorithms in exploration geophysics involving fluid/porous-solid interfaces, such as, for example, wavefield decomposition.
Abstract:
Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags, where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to have much greater control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
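The perturb-and-accept loop described above can be sketched in one dimension. This is a minimal illustration only: `conditional_sampler` stands in for the geophysics-conditioned distribution, and all names, the cooling schedule, and parameter values are assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def small_lag_covariance(x, max_lag):
    """Empirical covariance of a 1-D field at lags 1..max_lag."""
    xc = x - x.mean()
    n = len(x)
    return np.array([(xc[:n - h] * xc[h:]).mean() for h in range(1, max_lag + 1)])

def objective(x, target_cov, max_lag):
    """The only objective-function component: covariance misfit at small lags."""
    return np.sum((small_lag_covariance(x, max_lag) - target_cov) ** 2)

def anneal(x, conditional_sampler, target_cov, max_lag=3,
           n_iter=1500, t0=1.0, cooling=0.999):
    """Perturb one cell at a time by redrawing it from the geophysics-
    conditioned distribution (the only place geophysical information
    enters) and accept/reject with a Metropolis rule on the misfit."""
    x = x.copy()
    e = objective(x, target_cov, max_lag)
    t = t0
    for _ in range(n_iter):
        i = rng.integers(len(x))
        old = x[i]
        x[i] = conditional_sampler(i)          # draw from conditional pdf
        e_new = objective(x, target_cov, max_lag)
        if e_new <= e or rng.random() < np.exp((e - e_new) / t):
            e = e_new                          # accept the perturbation
        else:
            x[i] = old                         # reject, restore old value
        t *= cooling                           # geometric cooling schedule
    return x, e
```

Because the objective involves only a few small lags, each evaluation is cheap, which is consistent with the improved computational efficiency the abstract claims.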
Abstract:
We describe the case of a man with a history of complex partial seizures and severe language, cognitive and behavioural regression during early childhood (3.5 years), who underwent epilepsy surgery at the age of 25 years. His early epilepsy had clinical and electroencephalogram features of the syndromes of epilepsy with continuous spike waves during sleep and acquired epileptic aphasia (Landau-Kleffner syndrome), which we considered initially to be of idiopathic origin. Seizures recurred at 19 years and presurgical investigations at 25 years showed a lateral frontal epileptic focus with spread to Broca's area and the frontal orbital regions. Histopathology revealed a focal cortical dysplasia, not visible on magnetic resonance imaging. The prolonged but reversible early regression and the residual neuropsychological disorders during adulthood were probably the result of an active left frontal epilepsy, which interfered with language and behaviour during development. Our findings raise the question of the role of focal cortical dysplasia as an aetiology in the syndromes of epilepsy with continuous spike waves during sleep and acquired epileptic aphasia.
Abstract:
Spatial data analysis, mapping, and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on empirical data (measurements). A number of state-of-the-art methods can be used for this task: deterministic interpolations; geostatistical methods, namely the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANN) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. All the methods mentioned above can be used for solving the problem of spatial data mapping. Environmental empirical data are always contaminated/corrupted by noise, often of unknown nature. This is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as a realization of some spatial random process. To obtain an estimate with kriging, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if there is not a sufficient number of measurements, and the variogram is sensitive to outliers and extremes. ANN is a powerful tool, but it also suffers from a number of drawbacks. ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear and robust to noise in the measurements, that can deal with small empirical datasets, and that has a solid mathematical background is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression.
SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVM for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR to spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and basic equations for the nonlinear modeling are given in section 2. Section 3 discusses the application of SVR to spatial data mapping in a real case study: soil pollution by the 137Cs radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
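As a minimal illustration of the SVR approach to spatial mapping (not the case study's actual setup), an epsilon-insensitive SVR with an RBF kernel can be fitted to scattered 2-D measurements and evaluated on a grid. The synthetic field, the scikit-learn usage, and the hyperparameter values are all assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(42)

# Synthetic "measurements": a smooth spatial field plus noise, standing in
# for scattered environmental samples (hypothetical, not the 137Cs data).
coords = rng.uniform(0, 10, size=(200, 2))
values = np.sin(coords[:, 0]) + 0.5 * np.cos(coords[:, 1]) + rng.normal(0, 0.1, 200)

# Epsilon-insensitive SVR with an RBF kernel; in practice C, epsilon and
# gamma would be tuned, e.g. by cross-validation.
model = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=0.5)
model.fit(coords, values)

# Predict on a regular grid of coordinates to produce a map.
grid = np.array([[x, y] for x in np.linspace(0, 10, 5)
                        for y in np.linspace(0, 10, 5)])
pred = model.predict(grid)
```

The epsilon-insensitive loss is what gives SVR its robustness to measurement noise: residuals smaller than epsilon incur no penalty, so the fit is not dragged around by small fluctuations in the data.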
Abstract:
The objective of this work was to estimate the stability and adaptability of pod and seed yield in runner peanut genotypes based on nonlinear regression and AMMI analyses. Yield data from 11 trials, distributed in six environments and three harvests, carried out in the Northeast region of Brazil during the rainy season, were used. Significant effects of genotypes (G), environments (E), and GE interactions were detected in the analysis, indicating different behaviors among genotypes in favorable and unfavorable environmental conditions. The genotypes BRS Pérola Branca and LViPE‑06 are more stable and adapted to the semiarid environment, whereas LGoPE‑06 is a promising material for pod production, despite being highly dependent on favorable environments.
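The AMMI analysis mentioned above combines Additive Main effects with a Multiplicative (SVD-based) decomposition of the genotype-by-environment Interaction. The sketch below is a generic illustration of that decomposition, with assumed names, not the authors' software:

```python
import numpy as np

def ammi(Y):
    """AMMI sketch: Y is a genotype-by-environment mean-yield matrix.
    Additive main effects are removed by double centring; the remaining
    GxE interaction is decomposed by SVD into multiplicative terms
    (the IPCA axes used to judge stability and adaptability)."""
    grand = Y.mean()
    g_eff = Y.mean(axis=1) - grand          # genotype main effects
    e_eff = Y.mean(axis=0) - grand          # environment main effects
    interaction = Y - grand - g_eff[:, None] - e_eff[None, :]
    s = np.linalg.svd(interaction, compute_uv=False)  # IPCA singular values
    return g_eff, e_eff, interaction, s
```

A genotype whose rows of the interaction matrix (equivalently, its scores on the leading IPCA axes) are close to zero behaves consistently across environments, which is how stability is read off an AMMI fit.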
Abstract:
This paper presents the general regression neural network (GRNN) as a nonlinear regression method for the interpolation of monthly wind speeds in complex Alpine orography. GRNN is trained using data coming from Swiss meteorological networks to learn the statistical relationship between topographic features and wind speed. The terrain convexity, slope and exposure are considered by extracting features from the digital elevation model at different spatial scales using specialised convolution filters. A database of gridded monthly wind speeds is then constructed by applying GRNN in prediction mode during the period 1968-2008. This study demonstrates that using topographic features as inputs in GRNN significantly reduces cross-validation errors with respect to low-dimensional models integrating only geographical coordinates and terrain height for the interpolation of wind speed. The spatial predictability of wind speed is found to be lower in summer than in winter due to more complex and weaker wind-topography relationships. The relevance of these relationships is studied using an adaptive version of the GRNN algorithm, which allows the useful terrain features to be selected by eliminating the noisy ones. This research provides a framework for extending the low-dimensional interpolation models to high-dimensional spaces by integrating additional features accounting for the topographic conditions at multiple spatial scales. Copyright (c) 2012 Royal Meteorological Society.
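At its core, a GRNN is Nadaraya-Watson kernel regression: each prediction is a Gaussian-weighted average of the training targets. The sketch below is a minimal fixed-bandwidth version with assumed names; the study additionally feeds multi-scale terrain features into the input space:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """GRNN prediction = Nadaraya-Watson kernel regression: a Gaussian-
    weighted average of training targets, with bandwidth sigma (the
    GRNN smoothing parameter) controlling how local the average is."""
    # Squared Euclidean distances between every query and training point.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)        # normalized weighted average
```

With a small sigma the prediction reduces to the nearest observation; with a large sigma it tends to the global mean. Adding terrain features as extra input dimensions, as in the paper, lets the kernel distance reflect topographic similarity rather than geographic proximity alone.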
Abstract:
OBJECTIVES: Current indications for therapeutic hypothermia (TH) are restricted to comatose patients with cardiac arrest (CA) due to ventricular fibrillation (VF) and without circulatory shock. Additional studies are needed to evaluate the benefit of this treatment in more heterogeneous groups of patients, including those with non-VF rhythms and/or shock and to identify early predictors of outcome in this setting. DESIGN: Prospective study, from December 2004 to October 2006. SETTING: 32-bed medico-surgical intensive care unit, university hospital. PATIENTS: Comatose patients with out-of-hospital CA. INTERVENTIONS: TH to 33 +/- 1 degrees C (external cooling, 24 hrs) was administered to patients resuscitated from CA due to VF and non-VF (including asystole or pulseless electrical activity), independently from the presence of shock. MEASUREMENTS AND MAIN RESULTS: We hypothesized that simple clinical criteria available on hospital admission (initial arrest rhythm, duration of CA, and presence of shock) might help to identify patients who eventually survive and might most benefit from TH. For this purpose, outcome was related to these predefined variables. Seventy-four patients (VF 38, non-VF 36) were included; 46% had circulatory shock. Median duration of CA (time from collapse to return of spontaneous circulation [ROSC]) was 25 mins. Overall survival was 39.2%. However, only 3.1% of patients with time to ROSC > 25 mins survived, as compared to 65.7% with time to ROSC < or = 25 mins. Using a logistic regression analysis, time from collapse to ROSC, but not initial arrest rhythm or presence of shock, independently predicted survival at hospital discharge. CONCLUSIONS: Time from collapse to ROSC is strongly associated with outcome following VF and non-VF cardiac arrest treated with therapeutic hypothermia and could therefore be helpful to identify patients who benefit most from active induced cooling.
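The multivariable analysis described above can be illustrated with a plain gradient-ascent logistic regression on synthetic admission variables. Everything below (the data, the names, the optimizer settings) is an assumption for illustration, not the study's statistical software:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Logistic regression by gradient ascent on the log-likelihood,
    standing in for a multivariable model of survival on admission
    variables (time to ROSC, initial rhythm, presence of shock)."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))     # predicted survival probability
        beta += lr * X1.T @ (y - p) / len(y)     # log-likelihood gradient step
    return beta
```

With (standardized) time to ROSC as a covariate, the fitted coefficient would come out negative, matching the abstract's finding that longer collapse-to-ROSC time independently predicts lower survival at hospital discharge.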
Abstract:
OBJECTIVES: To evaluate the performance of the INTERMED questionnaire score, alone or combined with other criteria, in predicting return to work after a multidisciplinary rehabilitation program in patients with non-specific chronic low back pain. METHODS: The INTERMED questionnaire is a biopsychosocial assessment and clinical classification tool that separates heterogeneous populations into subgroups according to case complexity. We studied 88 patients with chronic low back pain who followed an intensive multidisciplinary rehabilitation program on an outpatient basis. Before the program, we recorded the INTERMED score, radiological abnormalities, subjective pain severity, and sick leave duration. Associations between these variables and return to full-time work within 3 months after the end of the program were evaluated using one-sided Fisher tests and univariate logistic regression followed by multivariate logistic regression. RESULTS: The univariate analysis showed a significant association between the INTERMED score and return to work (P<0.001; odds ratio, 0.90; 95% confidence interval, 0.86-0.96). In the multivariate analysis, prediction was best when the INTERMED score and sick leave duration were used in combination (P=0.03; odds ratio, 0.48; 95% confidence interval, 0.25-0.93). CONCLUSION: The INTERMED questionnaire is useful for evaluating patients with chronic low back pain. It could be used to improve the selection of patients for intensive multidisciplinary programs, thereby improving the quality of care, while reducing healthcare costs.
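The odds ratios with confidence intervals reported above are the standard way logistic-regression coefficients are presented: OR = exp(beta), with a Wald interval exp(beta ± 1.96·SE). A small helper (illustrative only, with hypothetical inputs) makes the conversion explicit:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval, the form in
    which results such as 'OR 0.90, 95% CI 0.86-0.96' are reported."""
    return (math.exp(beta),
            math.exp(beta - z * se),   # lower CI bound
            math.exp(beta + z * se))   # upper CI bound
```

An OR below 1 for the INTERMED score means that each additional point of case complexity lowers the odds of returning to full-time work, which is why the interval excluding 1.0 marks the association as significant.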
Abstract:
The objective of this work was to compare random regression models for the estimation of genetic parameters for Guzerat milk production, using orthogonal Legendre polynomials. Records (20,524) of test-day milk yield (TDMY) from 2,816 first-lactation Guzerat cows were used. TDMY grouped into 10 monthly classes were analyzed for additive genetic effect and for permanent environmental and residual effects (random effects), whereas the contemporary group, calving age (linear and quadratic effects), and mean lactation curve were analyzed as fixed effects. Trajectories for the additive genetic and permanent environmental effects were modeled by means of a covariance function employing orthogonal Legendre polynomials ranging from the second to the fifth order. Residual variances were considered in one, four, six, or ten variance classes. The best model had six residual variance classes. The heritability estimates for the TDMY records varied from 0.19 to 0.32. The random regression model that used a second-order Legendre polynomial for the additive genetic effect and a fifth-order polynomial for the permanent environmental effect was adequate according to the main criteria employed. The model with a second-order Legendre polynomial for the additive genetic effect and a fourth-order polynomial for the permanent environmental effect could also be employed in these analyses.
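The Legendre-polynomial covariates used in such random regression models are built by standardizing test days to [-1, 1] and evaluating the polynomials up to the chosen order. The sketch below uses assumed names and omits the normalization constants often applied in animal-breeding software:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_covariates(t, t_min, t_max, order):
    """Columns of a random-regression design matrix: time points (e.g.
    monthly test days) standardized to [-1, 1], then evaluated on the
    Legendre polynomials P0..P_order (one column per polynomial)."""
    x = 2.0 * (np.asarray(t, dtype=float) - t_min) / (t_max - t_min) - 1.0
    # legval with coefficient vector e_k selects the k-th Legendre polynomial.
    cols = [legendre.legval(x, [0.0] * k + [1.0]) for k in range(order + 1)]
    return np.column_stack(cols)
```

A "second-order" model in the abstract's sense thus fits three random regression coefficients per animal (for P0, P1, P2), and the covariance function over the lactation trajectory follows from the covariances among those coefficients.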