996 results for dose optimization
Abstract:
Increasing evidence suggests that adoptive transfer of antigen-specific CD8(+) T cells could represent an effective strategy in the fight against chronic viral infections and malignancies such as melanoma. None the less, a major limitation in the implementation of such therapy resides in the difficulties associated with achieving rapid and efficient expansion of functional T cells in culture necessary to obtain the large numbers required for intravenous infusion. Recently, the critical role of the cytokines interleukin (IL)-2, IL-7 and IL-15 in driving T cell proliferation has been emphasized, thus suggesting their use in the optimization of expansion protocols. We have used major histocompatibility complex (MHC) class I/peptide multimers to monitor the expansion of antigen-specific CD8 T lymphocytes from whole blood, exploring the effect of antigenic peptide dose, IL-2, IL-7 and IL-15 concentrations on the magnitude and functional characteristics of the antigen-specific CD8(+) T cells generated. We show here that significant expansions of antigen-specific T cells, up to 50% of the CD8(+) T cell population, can be obtained after a single round of antigen/cytokine (IL-2 or IL-15) stimulation, and that these cells display good cytolytic and interferon (IFN)-gamma secretion capabilities. Our results provide an important basis for the rapid in vitro expansion of autologous T cells from the circulating lymphocyte pool using a simple procedure, which is necessary for the development of adoptive transfer therapies.
Abstract:
In this project, the satellite-to-aircraft link of a global aeronautical system has been analysed and optimized. This new system, called ANTARES, is designed to connect aircraft with ground stations via a satellite. It is an initiative involving official aviation institutions such as ECAC and is being developed through a European collaboration of universities and companies. The work carried out in the project essentially covers three aspects: the design and analysis of resource management; the suitability of using error correction at the link layer and, if necessary, the design of a preliminary coding option; and, finally, the study and analysis of the effect of co-channel interference in multibeam systems. All of these topics are considered only for the forward link. The project is structured so as to first present the overall characteristics of the system and then focus on and analyse the topics mentioned, in order to provide results and draw conclusions.
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton (TN) methods, which have been an effective approach for large-scale unconstrained optimization, we develop efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with that of a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse-grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
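As a rough illustration of the FMG/OPT idea described above, treating a prolongated coarse-grid correction as a search direction and scaling it with a line search, a minimal sketch follows. The objective f, the iterate u, and the correction e are placeholders, not the paper's actual variational models.

```python
# Minimal sketch, assuming a generic objective f, a fine-grid iterate u and a
# prolongated coarse-grid correction e (all illustrative, not from the paper).
import numpy as np

def scale_coarse_correction(f, u, e, alpha0=1.0, shrink=0.5, max_tries=20):
    """Backtracking line search along the coarse-grid correction direction."""
    f0 = f(u)
    alpha = alpha0
    for _ in range(max_tries):
        trial = u + alpha * e
        if f(trial) < f0:          # accept the first step that decreases f
            return trial
        alpha *= shrink            # otherwise shrink the step length
    return u                       # no improving step found: keep the iterate

# Toy usage with a quadratic objective standing in for a variational energy
f = lambda v: 0.5 * float(np.sum(v ** 2))
u = np.linspace(1.0, 2.0, 8)       # current fine-grid estimate
e = -u                             # stand-in for a prolongated coarse-grid correction
u_next = scale_coarse_correction(f, u, e)
```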
Abstract:
Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
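A minimal sketch of the replicate-based idea above, comparing genotype calls between two replicates of the same sample; the dictionary-of-calls representation and names are illustrative assumptions, not the Stacks output format.

```python
# Minimal sketch, assuming genotype calls per replicate are held as
# dicts mapping locus -> call (illustrative, not the Stacks output format).
def replicate_error_rate(rep_a, rep_b):
    """Mismatch rate over loci genotyped in both replicates."""
    shared = [loc for loc in rep_a
              if loc in rep_b and rep_a[loc] is not None and rep_b[loc] is not None]
    if not shared:
        return None, 0
    mismatches = sum(rep_a[loc] != rep_b[loc] for loc in shared)
    return mismatches / len(shared), len(shared)

# Toy usage: one discordant call out of three shared loci
rep1 = {"locus1": "A/A", "locus2": "A/G", "locus3": "G/G"}
rep2 = {"locus1": "A/A", "locus2": "A/A", "locus3": "G/G"}
rate, n_shared = replicate_error_rate(rep1, rep2)   # rate = 1/3, n_shared = 3
```

In the spirit of the abstract, such a per-pair error rate could then be minimized across candidate de novo assembly parameter values while tracking how many informative loci are retained.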
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding whether the pebbling number is at most k is Π₂P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than those given in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
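As a concrete illustration of the basic pebbling question described above (not of the Weight Function Lemma machinery itself), the following brute-force sketch checks whether a pebble distribution can place one pebble on a target vertex; the graph and distribution formats are illustrative.

```python
# Minimal sketch: exhaustive search over pebbling moves (take two pebbles from
# one endpoint of an edge, place one on the other). Feasible only for small graphs.
from functools import lru_cache

def can_reach(adj, dist, target):
    """adj: dict vertex -> iterable of neighbours; dist: dict vertex -> pebble count."""
    verts = sorted(adj)

    @lru_cache(maxsize=None)
    def search(state):
        d = dict(zip(verts, state))
        if d[target] >= 1:
            return True
        for u in verts:
            if d[u] >= 2:                      # a pebbling move needs two pebbles at u
                for v in adj[u]:
                    d2 = dict(d)
                    d2[u] -= 2                 # two pebbles leave u ...
                    d2[v] += 1                 # ... one arrives at v, one is the toll
                    if search(tuple(d2[w] for w in verts)):
                        return True
        return False

    return search(tuple(dist.get(v, 0) for v in verts))

# Toy usage: on the path a-b-c, four pebbles on a reach c (4 on a -> 2 on b -> 1 on c)
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(can_reach(adj, {"a": 4}, "c"))           # True
```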
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be obtained quickly, in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
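A minimal sketch of the general idea of biasing a classical greedy heuristic with a non-uniform (here geometric-style) distribution; the bias parameter beta, the names, and the toy problem are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch: biased-randomized greedy construction. A classical greedy
# heuristic always picks the best candidate; here position k in the cost-sorted
# list is chosen with geometrically decreasing probability, so good candidates
# are favoured but alternative solutions can still emerge across runs.
import math
import random

def biased_pick(sorted_candidates, beta=0.3):
    """Pick an index with geometrically decreasing probability (0 = best)."""
    u = 1.0 - random.random()                  # u in (0, 1]
    k = int(math.log(u) / math.log(1.0 - beta)) % len(sorted_candidates)
    return sorted_candidates[k]

def biased_randomized_greedy(items, cost, beta=0.3):
    """Classical greedy construction, except each choice is a biased-random pick."""
    remaining = sorted(items, key=cost)
    solution = []
    while remaining:
        choice = biased_pick(remaining, beta)
        solution.append(choice)
        remaining.remove(choice)
    return solution

# Toy usage: sequence jobs, favouring (but not forcing) the shortest first
jobs = {"j1": 4.0, "j2": 2.5, "j3": 7.0, "j4": 1.0}
sequence = biased_randomized_greedy(list(jobs), cost=jobs.get)
```

Re-running such a construction many times yields the pool of alternative good solutions mentioned in the abstract, without any parameter tuning beyond the single bias value.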
Abstract:
The paper develops a stability theory for the optimal value and the optimal set mapping of optimization problems posed in a Banach space. The problems considered in this paper have an arbitrary number of inequality constraints involving lower semicontinuous (not necessarily convex) functions and one closed abstract constraint set. The considered perturbations lead to problems of the same type as the nominal one (with the same space of variables and the same number of constraints), where the abstract constraint set can also be perturbed. The spaces of functions involved in the problems (objective and constraints) are equipped with the metric of uniform convergence on bounded sets, while in the space of closed sets we consistently consider the Attouch-Wets topology. The paper examines, in a unified way, the lower and upper semicontinuity of the optimal value function, and the closedness and the lower and upper semicontinuity (in the sense of Berge) of the optimal set mapping. This paper can be seen as a second part of the stability theory presented in [17], where we studied the stability of the feasible set mapping (completed here with the analysis of the Lipschitz-like property).
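For orientation, the setting described above can be written as follows; the notation is an illustrative sketch, not necessarily the paper's own symbols.

```latex
% Illustrative notation: the parametric problem in a Banach space X, its feasible
% set, its optimal value function, and its optimal set mapping.
\begin{aligned}
P(f,g,C):\quad & \inf_{x\in X} f(x) \quad\text{s.t.}\quad g_t(x)\le 0\ (t\in T),\ x\in C,\\
F(g,C) &:= \{x\in C : g_t(x)\le 0 \text{ for all } t\in T\},\\
\vartheta(f,g,C) &:= \inf\{f(x) : x\in F(g,C)\},\\
S(f,g,C) &:= \{x\in F(g,C) : f(x)=\vartheta(f,g,C)\}.
\end{aligned}
```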
Abstract:
This paper presents the Juste-Neige system for predicting the snow height on the ski runs of a resort using multi-agent simulation software. Its aim is to facilitate snow cover management in order to i) reduce the production cost of artificial snow and improve the profit margin of the companies managing the ski resorts; and ii) reduce water and energy consumption, and thus the environmental impact, by producing only the snow needed for a good skiing experience. The software provides maps with the predicted snow heights for up to 13 days. On these maps, the areas most exposed to snow erosion are highlighted. The software proceeds in three steps: i) interpolation of snow height measurements with a neural network; ii) local meteorological forecasts for every ski resort; iii) simulation of the impact caused by skiers using a multi-agent system. The software has been evaluated in the Swiss ski resort of Verbier and provides useful predictions.
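A minimal sketch of step i) only, neural-network interpolation of snow height measurements; the (x, y, elevation) features, the toy values, and the use of scikit-learn are assumptions for illustration, not details of the Juste-Neige implementation.

```python
# Minimal sketch: fit a small neural network to scattered snow height measurements
# and query it at an unmeasured location (illustrative features and library choice).
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy measurement points: coordinates + elevation -> measured snow height (cm)
X = np.array([[0.0, 0.0, 1500.0],
              [1.0, 0.0, 1600.0],
              [0.0, 1.0, 1700.0],
              [1.0, 1.0, 1800.0]])
y = np.array([40.0, 55.0, 70.0, 90.0])

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)

# Interpolate the snow height at an unmeasured point on the run
print(model.predict([[0.5, 0.5, 1650.0]]))
```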
Abstract:
Purpose/Objective(s): RT with TMZ is the standard for GBM. dd TMZ causes prolonged MGMT depletion in mononuclear cells and possibly in tumor. The RTOG 0525 trial (ASCO 2011) did not show an advantage from dd TMZ for survival or progression-free survival. We conducted exploratory, hypothesis-generating subset analyses to detect possible benefit from dd TMZ. Materials/Methods: Patients were randomized to std (150-200 mg/m2 x 5 d) or dd TMZ (75-100 mg/m2 x 21 d) q 4 weeks for 6-12 cycles. Eligibility included age ≥ 18, KPS ≥ 60, and ≥ 1 cm2 of tissue for prospective MGMT analysis for stratification. Further analyses were performed for all randomized patients ("intent-to-treat", ITT) and for all patients starting protocol therapy (SPT). Subset analyses were performed by RPA class (III, IV, V), KPS (90-100 vs. 60-80), resection (partial, total), gender (female, male), and neurologic dysfunction (nf = none, minor, moderate). Results: No significant difference was seen in median OS (16.6 vs. 14.9 months) or PFS (5.5 vs. 6.7 months, p = 0.06). MGMT methylation was linked to improved OS (21.2 vs. 14 months, p < 0.0001) and PFS (8.7 vs. 5.7 months, p < 0.0001). For the ITT population (n = 833), there was no OS benefit from dd TMZ in any subset. Two subsets showed a PFS benefit for dd TMZ: RPA class III (6.2 vs. 12.6 months, HR 0.69, p = 0.03) and nf = minor (HR 0.77, p = 0.01). For RPA III, dd dramatically delayed progression, but after progression dd patients died more quickly than std patients. A similar pattern was observed for nf = minor. For the SPT group (n = 714), there was neither a PFS nor an OS benefit for dd TMZ overall. For RPA class III and nf = minor, there was a PFS benefit for dd TMZ (HR 0.73, p = 0.08; HR 0.77, p = 0.02). For the nf = moderate subset, in both ITT and SPT, the std arm showed superior OS (14.4 vs. 10.9 months) compared with dd, without improved PFS (HR 1.46, p = 0.03; and HR 1.74, p = 0.01). In terms of methylation status within this subset, there were more methylated patients in the std arm of the ITT subset (n = 159; 32 vs. 24%). For the SPT subset (n = 124), methylation status was similar between arms. Conclusions: This study did not demonstrate improved OS for dd TMZ in any subgroup, but for two highly functional subgroups PFS was significantly increased. These data generate the testable hypothesis that intensive treatment may selectively improve disease control in those most likely able to tolerate dd therapy. Interpretation should be cautious given the small sample sizes, the multiplicity of comparisons, and other confounders. Acknowledgment: This project was supported by RTOG grant U10 CA21661 and CCOP grant U10 CA37422 from the National Cancer Institute (NCI).
Abstract:
OBJECTIVE: To assess the impact of nonuniform dose distribution within lesions and tumor-involved organs of patients receiving Zevalin, and to discuss possible implications of equivalent uniform biological effective doses (EU-BED) for treatment efficacy and toxicity. MATLAB-based software for voxel-based dosimetry was adopted for this purpose. METHODS: Eleven lesions from seven patients with either indolent or aggressive non-Hodgkin lymphoma were analyzed, along with four organs with disease. Absorbed doses were estimated by direct integration of single-voxel kinetic data from serial tomographic images. After proper corrections, differential BED distributions and surviving cell fractions were estimated, allowing for the calculation of EU-BED. To quantify dose uniformity in each target area, a heterogeneity index was defined. RESULTS: Average doses were below those prescribed by conventional radiotherapy to eradicate lymphoma lesions. Dose heterogeneity and the effect on tumor control varied among lesions, with no apparent relation to tumor mass. Although radiation doses to involved organs were safe, unexpected liver toxicity occurred in one patient who presented with a pattern of diffuse infiltration. CONCLUSION: Voxel-based dosimetry and radiobiologic modeling can be successfully applied to lesions and tumor-involved organs, representing a methodological advance over estimation of mean absorbed doses. However, effects on tumor control and organ toxicity still cannot be easily predicted.
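A minimal sketch of the equivalent-uniform-BED concept referred to above, assuming a per-voxel BED array and a radiosensitivity parameter alpha; the values and names are illustrative, not the MATLAB-based tool described in the abstract.

```python
# Minimal sketch: EU-BED as the uniform dose that would leave the same mean
# surviving cell fraction as the actual non-uniform voxel dose distribution.
import numpy as np

def eu_bed(bed_voxels, alpha=0.3):
    sf = np.exp(-alpha * np.asarray(bed_voxels, dtype=float))  # per-voxel survival
    return -np.log(sf.mean()) / alpha                          # uniform-equivalent BED

# Toy usage: a heterogeneous lesion is radiobiologically "worth" less than its mean dose
bed = np.array([5.0, 10.0, 40.0, 60.0])       # Gy per voxel (illustrative)
print(eu_bed(bed), bed.mean())                # EU-BED is well below the mean BED
```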