856 results for regression algorithm


Relevance: 20.00%

Abstract:

PURPOSE: Ipilimumab is a monoclonal antibody that blocks the immune-inhibitory interaction between CTL antigen 4 (CTLA-4) and its ligands on T cells. Clinical trials in cancer patients with ipilimumab have shown promising antitumor activity, particularly in patients with advanced melanoma. Often, tumor regressions in these patients are correlated with immune-related side effects such as dermatitis, enterocolitis, and hypophysitis. Although these reactions are believed to be immune-mediated, the antigenic targets for the cellular or humoral immune response are not known. EXPERIMENTAL DESIGN: We enrolled patients with advanced melanoma in a phase II study with ipilimumab. One of these patients experienced a complete remission of his tumor. The specificity and functional properties of CD8-positive T cells in his peripheral blood, in regressing tumor tissue, and at the site of an immune-mediated skin rash were investigated. RESULTS: Regressing tumor tissue was infiltrated with CD8-positive T cells, a high proportion of which were specific for Melan-A. The skin rash was similarly infiltrated with Melan-A-specific CD8-positive T cells, and a dramatic (>30-fold) increase in Melan-A-specific CD8-positive T cells was apparent in peripheral blood. These cells had an effector phenotype and lysed Melan-A-expressing tumor cells. CONCLUSIONS: Our results show that Melan-A may be a major target for both the autoimmune and antitumor reactions in patients treated with anti-CTLA-4, and describe for the first time the antigen specificity of CD8-positive T cells that mediate tumor rejection in a patient undergoing treatment with an anti-CTLA-4 antibody. These findings may allow a better integration of ipilimumab into other forms of immunotherapy.

Relevance: 20.00%

Abstract:

The aim of this work is to establish a relationship between schistosomiasis prevalence and social-environmental variables in the state of Minas Gerais, Brazil, through multiple linear regression. After a variable selection phase, the final regression model was established with a set of spatial variables comprising summer minimum temperature, human development index, and vegetation type. Based on this model, a schistosomiasis risk map was built for Minas Gerais.
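
As a minimal, illustrative sketch of the kind of model described above (ordinary least squares on the three retained covariates), the following Python fragment fits prevalence against summer minimum temperature, human development index, and vegetation type; the file name and column names are hypothetical, not the study's actual data.

```python
# Sketch only: multiple linear regression of prevalence on the three retained covariates.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical municipality-level table (file name and columns are assumptions).
df = pd.read_csv("municipality_data.csv")

# Vegetation type is categorical; the other covariates are continuous.
model = smf.ols("prevalence ~ min_temp_summer + hdi + C(vegetation_type)", data=df).fit()
print(model.summary())

# Predicted prevalence per municipality could then be classified and mapped as a risk map.
df["predicted_prevalence"] = model.predict(df)
```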

Relevance: 20.00%

Abstract:

Introduction: Coronary magnetic resonance angiography (MRA) is a medical imaging technique that involves collecting data from consecutive heartbeats, always at the same time in the cardiac cycle, in order to minimize heart motion artifacts. The technique relies on the assumption that the coronary arteries follow the same trajectory from heartbeat to heartbeat. Until now, the acquisition window in the cardiac cycle was chosen exclusively on the basis of the position of minimal coronary motion. The goal of this study was to test the hypothesis that there are time intervals during the cardiac cycle when coronary beat-to-beat repositioning is optimal. The repositioning uncertainty values in these intervals were then compared with the intervals of low coronary motion in order to propose an optimal acquisition window for coronary MRA.

Methods: Cine breath-hold x-ray angiograms with synchronous ECG were collected from 11 patients who underwent elective routine diagnostic coronarography. Twenty-three bifurcations of the left coronary artery were selected as markers to evaluate repositioning uncertainty and velocity during the cardiac cycle. Each bifurcation was tracked by two observers with the help of a user-assisted algorithm implemented in Matlab (The MathWorks, Natick, MA, USA) that compared the trajectories of the markers from consecutive heartbeats and computed the coronary repositioning uncertainty in steps of 50 ms up to 650 ms after the R-wave. Repositioning uncertainty was defined as the diameter of the smallest circle encompassing the points to be compared at the same time after the R-wave. Student's t-tests with a false discovery rate (FDR, q=0.1) correction for multiple comparisons were applied to determine whether coronary repositioning and velocity vary statistically during the cardiac cycle. Bland-Altman plots and linear regression were used to assess intra- and inter-observer agreement.

Results: The analysis of left coronary artery beat-to-beat repositioning uncertainty shows a tendency toward better repositioning in mid systole (less than 0.84±0.58 mm) and mid diastole (less than 0.89±0.6 mm) than in the rest of the cardiac cycle (highest value at 50 ms: 1.35±0.64 mm). According to Student's t-tests with FDR correction for multiple comparisons (q=0.1), two intervals, in mid systole (150-200 ms) and mid diastole (550-600 ms), provide statistically better repositioning than early systole and early diastole. Coronary velocity analysis reveals that the left coronary artery moves more slowly in end systole (14.35±11.35 mm/s at 225 ms) and mid diastole (11.78±11.62 mm/s at 625 ms) than in the rest of the cardiac cycle (highest value at 25 ms: 55.96±22.34 mm/s). This was confirmed by Student's t-tests with FDR correction for multiple comparisons (q=0.1, FDR-corrected p-value=0.054): the coronary velocity values at 225, 575 and 625 ms do not differ much from one another but are statistically lower than all the others. Bland-Altman plots and linear regression show that intra-observer agreement (y=0.97x+0.02 with R²=0.93 at 150 ms) is better than inter-observer agreement (y=0.8x+0.11 with R²=0.67 at 150 ms).

Discussion: The present study has demonstrated that there are two time intervals in the cardiac cycle, one in mid systole and one in mid diastole, where left coronary artery repositioning uncertainty reaches local minima. It has also been shown that velocity is lowest in end systole and mid diastole. Since systole is less influenced by heart rate variability than diastole, it is proposed to test an acquisition window between 150 and 200 ms after the R-wave.
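
The repositioning uncertainty defined above, the diameter of the smallest circle enclosing the marker positions observed at the same delay after the R-wave, can be computed directly for the small point sets involved (one position per heartbeat). The brute-force sketch below illustrates that definition only; it is not the authors' Matlab implementation and assumes 2-D marker coordinates.

```python
# Sketch only: repositioning uncertainty as the diameter of the smallest enclosing circle.
import numpy as np
from itertools import combinations

def _circle_from_two(p, q):
    center = (p + q) / 2.0
    return center, np.linalg.norm(p - center)

def _circle_from_three(p, q, r):
    # Circumscribed circle of a triangle; returns None for (nearly) collinear points.
    ax, ay = p; bx, by = q; cx, cy = r
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay) + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx) + (cx**2 + cy**2) * (bx - ax)) / d
    center = np.array([ux, uy])
    return center, np.linalg.norm(p - center)

def repositioning_uncertainty(points, tol=1e-9):
    """Diameter of the smallest circle enclosing the 2-D marker positions (>= 2 points)."""
    pts = np.asarray(points, dtype=float)
    # The minimal enclosing circle is defined by 2 or 3 of the points, so enumerate all candidates.
    candidates = [_circle_from_two(a, b) for a, b in combinations(pts, 2)]
    candidates += [c for c in (_circle_from_three(a, b, c)
                               for a, b, c in combinations(pts, 3)) if c is not None]
    best = min(r for ctr, r in candidates
               if np.all(np.linalg.norm(pts - ctr, axis=1) <= r + tol))
    return 2.0 * best

# Example: hypothetical marker positions (mm) from five heartbeats at the same delay after the R-wave.
print(repositioning_uncertainty([[0.0, 0.0], [0.4, 0.1], [0.2, 0.5], [0.1, 0.3], [0.35, 0.4]]))
```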

Relevance: 20.00%

Abstract:

BACKGROUND: The accumulation of mutations after long-lasting exposure to a failing combination antiretroviral therapy (cART) is problematic and severely reduces the options for further successful treatments. METHODS: We studied patients from the Swiss HIV Cohort Study who failed cART with nucleoside reverse transcriptase inhibitors (NRTIs) and either a ritonavir-boosted protease inhibitor (PI/r) or a non-nucleoside reverse transcriptase inhibitor (NNRTI). The loss of genotypic activity at <3, 3-6, and >6 months after virological failure was analyzed with the Stanford algorithm. Risk factors associated with early emergence of drug resistance mutations (<6 months after failure) were identified with multivariable logistic regression. RESULTS: Ninety-nine genotypic resistance tests from PI/r-treated and 129 from NNRTI-treated patients were analyzed. The risk of losing the activity of ≥1 NRTIs was lower among PI/r-treated compared to NNRTI-treated individuals at <3, 3-6, and >6 months after failure: 8.8% vs. 38.2% (p = 0.009), 7.1% vs. 46.9% (p<0.001) and 18.9% vs. 60.9% (p<0.001). The percentages of patients who had lost PI/r activity were 2.9%, 3.6% and 5.4% at <3, 3-6, and >6 months after failure, compared to 41.2%, 49.0% and 63.0% of those who had lost NNRTI activity (all p<0.001). The risk of accumulating an early NRTI mutation was strongly associated with NNRTI-containing cART (adjusted odds ratio: 13.3 (95% CI: 4.1-42.8), p<0.001). CONCLUSIONS: The loss of activity of PIs and NRTIs was low among patients treated with PI/r, even after long-lasting exposure to a failing cART. Thus, more options remain for second-line therapy. This finding is potentially of high relevance, in particular for settings with poor or absent virological monitoring.
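
As a hedged illustration of the multivariable logistic regression used here, the sketch below fits a binary early-mutation outcome against a treatment indicator plus adjustment covariates and reports adjusted odds ratios with 95% confidence intervals; the file name and all column names are hypothetical, not the cohort's actual variables.

```python
# Sketch only: multivariable logistic regression with adjusted odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level table; assumed columns: early_nrti_mutation (0/1),
# nnrti (1 = NNRTI-based regimen, 0 = PI/r-based), age, log_baseline_rna.
df = pd.read_csv("cohort.csv")

# Logistic regression for early NRTI mutation (<6 months after failure).
fit = smf.logit("early_nrti_mutation ~ nnrti + age + log_baseline_rna", data=df).fit()

# Adjusted odds ratios with 95% confidence intervals (exponentiated coefficients).
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["adjusted OR", "2.5%", "97.5%"]
print(or_table)
```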

Relevance: 20.00%

Abstract:

A wide range of numerical models and tools has been developed over recent decades to support decision making in environmental applications, ranging from physical models to a variety of statistically based methods. In this study, a landslide susceptibility map of part of the Three Gorges Reservoir region of China was produced using binary logistic regression analysis. The available information includes the digital elevation model of the region, the geological map, and several GIS layers, including land cover data obtained from satellite imagery. The landslides were observed and documented during field studies. A validation analysis was carried out to assess the quality of the resulting map.
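
A minimal sketch of how such a susceptibility map and its validation could be set up, assuming a hypothetical grid-cell table derived from the DEM, geological map and land-cover layers; the file name and feature names are illustrative, not the study's actual variables.

```python
# Sketch only: binary logistic regression susceptibility model with a simple AUC validation.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical grid-cell table; assumed columns: slope, elevation, lithology_code, land_cover, landslide.
cells = pd.read_csv("grid_cells.csv")
X = pd.get_dummies(cells[["slope", "elevation", "lithology_code", "land_cover"]],
                   columns=["lithology_code", "land_cover"])
y = cells["landslide"]  # 1 if a mapped landslide falls in the cell, else 0

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Susceptibility = predicted probability of the landslide class; AUC as a simple validation measure.
susceptibility = clf.predict_proba(X_test)[:, 1]
print("validation AUC:", roc_auc_score(y_test, susceptibility))
```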

Relevance: 20.00%

Abstract:

Time series regression models are especially suitable in epidemiology for evaluating the short-term effects of time-varying exposures on health. The problem is that the potential for confounding in time series regression is very high; it is therefore important that trend and seasonality are properly accounted for. Our paper reviews the statistical models commonly used in time-series regression, especially those allowing for serial correlation, which make them potentially useful for selected epidemiological purposes. In particular, we discuss time-series regression for counts using a wide range of Generalised Linear Models as well as Generalised Additive Models. In addition, critical points recently raised about the use of statistical software for GAMs are stressed, and reanalyses of time series data on air pollution and health were performed in order to update previously published results. Applications are illustrated through an example of the relationship between asthma emergency admissions and photochemical air pollutants.
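
A minimal sketch of the kind of count regression discussed above, assuming a hypothetical daily series of asthma admissions and photochemical pollutant levels; the spline term stands in for the smooth trend/seasonality adjustment that a GAM would provide, and the file name and column names are illustrative.

```python
# Sketch only: Poisson time-series regression for daily admission counts.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily series (file name and columns are assumptions).
df = pd.read_csv("daily_series.csv", parse_dates=["date"])
df["time"] = (df["date"] - df["date"].min()).dt.days
df["dow"] = df["date"].dt.dayofweek

# A regression spline in time absorbs long-term trend and seasonality (a GAM-style smooth),
# leaving the short-term association with the photochemical pollutant of interest.
fit = smf.glm("admissions ~ bs(time, df=12) + C(dow) + temperature + o3",
              data=df, family=sm.families.Poisson()).fit()
print(fit.summary())
# Overdispersion and residual serial correlation should still be checked before interpretation.
```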

Relevance: 20.00%

Abstract:

Robust Huber-type regression and testing of linear hypotheses are adapted to the statistical analysis of parallel line and slope ratio assays. They are applied to the evaluation of results from several experiments carried out to compare and validate alternatives to animal experimentation based on embryo and cell cultures. The computational procedures required for these robust analyses used the conversational statistical package ROBSYS. Special commands for the analysis of parallel line and slope ratio assays have been added to ROBSYS.
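
A minimal sketch of a Huber-type robust fit for a parallel line assay, using statsmodels rather than ROBSYS (whose commands are not reproduced here); the data layout and column names are assumptions for illustration.

```python
# Sketch only: Huber-type robust regression for a parallel line assay (common slope, shifted intercepts).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical assay table; assumed columns: response, log_dose, prep ("standard" or "test").
df = pd.read_csv("assay.csv")

# Robust fit with Huber weighting: common slope for both preparations, preparation-specific intercepts.
fit = smf.rlm("response ~ log_dose + C(prep)", data=df, M=sm.robust.norms.HuberT()).fit()
print(fit.summary())

# Log relative potency of the test preparation: intercept shift divided by the common slope.
shift = fit.params["C(prep)[T.test]"]
slope = fit.params["log_dose"]
print("log relative potency:", shift / slope)
```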

Relevance: 20.00%

Abstract:

The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with a very peculiar characteristic that the residual on the coarse grid is zero at any iteration (thus conservative fluxes can be obtained).
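
The iterative MsFV scheme itself is beyond a short sketch, but its interpretation as a preconditioned GMRES can be illustrated generically: the fragment below solves a sparse elliptic (Poisson-type) system with GMRES, using an incomplete-LU factorization purely as a stand-in for the MsFV-based preconditioning operator described in the paper.

```python
# Sketch only: preconditioned GMRES on a sparse elliptic system; the ILU factor is merely a
# stand-in for the MsFV (or two-step MsFV) operator that plays the preconditioner role above.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 2-D Poisson-type matrix on an n-by-n grid (stand-in for the fine-scale pressure problem).
n = 64
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
I = sp.identity(n)
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
b = np.ones(A.shape[0])

ilu = spla.spilu(A, drop_tol=1e-4)                    # stand-in preconditioner
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

x, info = spla.gmres(A, b, M=M)
print("GMRES info:", info, "residual norm:", np.linalg.norm(b - A @ x))
```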

Relevance: 20.00%

Abstract:

This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low level, where algorithms deal with large volumes of data. In a motion estimation algorithm, correspondences between two images have to be solved at this low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reducing computation time. Given the computational cost of the normalised correlation criterion, a new approach based on the parallel organisation of every processor in the architecture is proposed.
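
A minimal sketch of the zero-mean normalised correlation criterion referred to above, matching a template patch within a search window; this illustrates the criterion itself, not the proposed parallel architecture, and the function names are illustrative.

```python
# Sketch only: zero-mean normalised cross-correlation (NCC) for patch correspondence.
import numpy as np

def ncc(a, b):
    """Zero-mean NCC between two equally sized patches; robust to uniform brightness/contrast changes."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def best_match(template, image, row, col, radius):
    """Exhaustively search a (2*radius+1)^2 window around (row, col) for the best NCC score."""
    h, w = template.shape
    best_score, best_pos = -2.0, None
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            r, c = row + dr, col + dc
            if r < 0 or c < 0:
                continue  # skip positions falling outside the image
            window = image[r:r + h, c:c + w]
            if window.shape != template.shape:
                continue
            score = ncc(template, window)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```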

Relevance: 20.00%

Abstract:

In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources but also light interreflections. Such algorithms produce very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to these algorithms to make it possible to compute highly realistic images in a reasonable time. We introduce a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, taking advantage of the hierarchical nature of such an algorithm.
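
The radiosity formulation referred to above repeatedly gathers interreflected light according to B_i = E_i + rho_i * sum_j F_ij B_j. The sketch below iterates that equation for a toy three-patch scene with illustrative numbers; it shows the sequential core that a hierarchical, speculation-based parallel solution is designed to accelerate.

```python
# Sketch only: Jacobi-style gathering iteration for the radiosity equation B = E + diag(rho) @ F @ B.
import numpy as np

def solve_radiosity(emission, reflectance, form_factors, tol=1e-6, max_iters=1000):
    B = emission.copy()
    for _ in range(max_iters):
        B_new = emission + reflectance * (form_factors @ B)
        if np.max(np.abs(B_new - B)) < tol:
            return B_new
        B = B_new
    return B

# Toy 3-patch scene: one emitter and two purely reflective patches (illustrative numbers).
E = np.array([1.0, 0.0, 0.0])           # emitted radiosity per patch
rho = np.array([0.0, 0.5, 0.8])         # diffuse reflectance per patch
F = np.array([[0.0, 0.3, 0.2],          # F[i, j]: fraction of light leaving patch i that reaches patch j
              [0.3, 0.0, 0.4],
              [0.2, 0.4, 0.0]])
print(solve_radiosity(E, rho, F))
```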

Relevance: 20.00%

Abstract:

Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
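
The adaptive ingredient described above, the entropy of the diffusion tensor's eigenvalues, can be illustrated in isolation. The sketch below computes the Shannon entropy of the normalised eigenvalues of a 3x3 tensor; this is a generic formulation assumed for illustration, not necessarily the authors' exact definition.

```python
# Sketch only: Shannon entropy of the normalised eigenvalues of a 3x3 symmetric diffusion tensor.
import numpy as np

def eigenvalue_entropy(tensor):
    """Entropy near 0 indicates strongly anisotropic (ordered) diffusion; log(3) is fully isotropic."""
    eigvals = np.linalg.eigvalsh(tensor)        # real eigenvalues of the symmetric tensor
    eigvals = np.clip(eigvals, 1e-12, None)     # guard against zeros/negatives from noise
    p = eigvals / eigvals.sum()                 # normalise so they behave like probabilities
    return float(-(p * np.log(p)).sum())

# Example: a prolate ("cigar-shaped") tensor typical of a coherent fibre bundle (illustrative values, mm^2/s).
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
print(eigenvalue_entropy(D))
```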