910 results for Multi-Objective Optimization
Abstract:
Offset printing is a common method to produce large amounts of printed matter. We consider a real-world offset printing process that is used to imprint customer-specific designs on napkin pouches. The printing technology used yields a number of specific constraints. The planning problem consists of allocating designs to printing-plate slots such that the given customer demand for each design is fulfilled, all technological and organizational constraints are met, and the total overproduction and setup costs are minimized. We formulate this planning problem as a mixed-binary linear program, and we develop a multi-pass matching-based savings heuristic. We report computational results for a set of problem instances devised from real-world data.
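The abstract does not give the full model, but a minimal mixed-binary sketch of this type of plate-allocation problem, under the simplifying assumptions that every plate has S identical slots and a fixed run length L_p, could look as follows (all symbols are illustrative, not the paper's notation):

```latex
\begin{align*}
\min\quad & \sum_{j} c^{\mathrm{over}}_{j}\, o_{j} \;+\; \sum_{p} c^{\mathrm{setup}}_{p}\, y_{p} \\
\text{s.t.}\quad
& \sum_{p} L_{p}\, a_{jp} = d_{j} + o_{j} && \forall j \quad \text{(demand plus overproduction)}\\
& \sum_{j} a_{jp} = S\, y_{p} && \forall p \quad \text{(a used plate fills all $S$ slots)}\\
& a_{jp} \in \mathbb{Z}_{\ge 0},\quad y_{p} \in \{0,1\},\quad o_{j} \ge 0,
\end{align*}
```

where a_jp counts the slots on plate p assigned to design j, y_p indicates whether plate p is used, d_j is the demand, and o_j the overproduction of design j. The technological and organizational constraints of the actual printing process would be added on top of this skeleton.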
Abstract:
PURPOSE Survivin is a member of the inhibitor-of-apoptosis family. Essential for tumor cell survival and overexpressed in most cancers, survivin is a promising target for anti-cancer immunotherapy. Immunogenicity has been demonstrated in multiple cancers. Nonetheless, few clinical trials have demonstrated survivin-vaccine-induced immune responses. EXPERIMENTAL DESIGN This phase I trial was conducted to test whether vaccine EMD640744, a cocktail of five HLA class I-binding survivin peptides in Montanide® ISA 51 VG, promotes anti-survivin T-cell responses in patients with solid cancers. The primary objective was to compare the immunologic efficacy of EMD640744 at doses of 30, 100, and 300 μg. Secondary objectives included safety, tolerability, and clinical efficacy. RESULTS In total, 49 patients who received ≥2 EMD640744 injections and had available baseline and ≥1 post-vaccination samples [immunologic-diagnostic (ID) intention-to-treat] were analyzed by ELISpot and peptide/MHC-multimer staining, revealing vaccine-activated peptide-specific T-cell responses in 31 patients (63%). This cohort included the per-protocol ID population relevant for the primary objective, i.e., T-cell responses by ELISpot within 17 weeks following first vaccination, as well as subjects who discontinued the study before week 17 but showed responses to the treatment. No dose-dependent effects were observed. In the majority of patients (61%), anti-survivin responses were detected only after vaccination, providing evidence for de novo induction. Best overall tumor response was stable disease (28%). EMD640744 was well tolerated; local injection-site reactions constituted the most frequent adverse event. CONCLUSIONS Vaccination with EMD640744 elicited T-cell responses against survivin peptides in the majority of patients, demonstrating the immunologic efficacy of EMD640744.
Abstract:
BACKGROUND High-risk prostate cancer (PCa) is an extremely heterogeneous disease. A clear definition of prognostic subgroups is mandatory. OBJECTIVE To develop a pretreatment prognostic model for PCa-specific survival (PCSS) in high-risk PCa based on combinations of unfavorable risk factors. DESIGN, SETTING, AND PARTICIPANTS We conducted a retrospective multicenter cohort study including 1360 consecutive patients with high-risk PCa treated at eight European high-volume centers. INTERVENTION Retropubic radical prostatectomy with pelvic lymphadenectomy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Two Cox multivariable regression models were constructed to predict PCSS as a function of dichotomization of clinical stage (<cT3 vs cT3-4), Gleason score (GS) (2-7 vs 8-10), and prostate-specific antigen (PSA; ≤20 ng/ml vs >20 ng/ml). The first "extended" model includes all seven possible combinations; the second "simplified" model includes three subgroups: a good prognosis subgroup (one single high-risk factor); an intermediate prognosis subgroup (PSA >20 ng/ml and stage cT3-4); and a poor prognosis subgroup (GS 8-10 in combination with at least one other high-risk factor). The predictive accuracy of the models was summarized and compared. Survival estimates and clinical and pathologic outcomes were compared between the three subgroups. RESULTS AND LIMITATIONS The simplified model yielded an R² of 33% with a 5-yr area under the curve (AUC) of 0.70, with no significant loss of predictive accuracy compared with the extended model (R²: 34%; AUC: 0.71). The 5- and 10-yr PCSS rates were 98.7% and 95.4%, 96.5% and 88.3%, and 88.8% and 79.7% for the good, intermediate, and poor prognosis subgroups, respectively (p = 0.0003). Overall survival, clinical progression-free survival, and histopathologic outcomes worsened significantly in a stepwise fashion from the good to the poor prognosis subgroups. Limitations of the study are the retrospective design and the long study period. CONCLUSIONS This study presents an intuitive and easy-to-use stratification of high-risk PCa into three prognostic subgroups. The model is useful for counseling and decision making in the pretreatment setting.
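The simplified three-subgroup rule can be written down directly from the definitions above; the sketch below is only a transcription of that rule into code (the function and argument names are ours, not the authors'):

```python
def prognosis_subgroup(clinical_stage_ct3_4: bool, gleason_8_10: bool, psa_over_20: bool) -> str:
    """Assign a high-risk PCa patient to the simplified prognostic subgroup.

    Inputs are the three dichotomized risk factors used in the study:
    clinical stage cT3-4, Gleason score 8-10, and PSA > 20 ng/ml.
    Assumes the patient is high risk, i.e. at least one factor is True.
    """
    n_factors = sum([clinical_stage_ct3_4, gleason_8_10, psa_over_20])
    if gleason_8_10 and n_factors >= 2:
        return "poor"          # GS 8-10 combined with at least one other factor
    if psa_over_20 and clinical_stage_ct3_4:
        return "intermediate"  # PSA > 20 ng/ml together with stage cT3-4
    return "good"              # one single high-risk factor


# Example: cT3-4 with PSA > 20 ng/ml but GS <= 7 -> intermediate prognosis
print(prognosis_subgroup(clinical_stage_ct3_4=True, gleason_8_10=False, psa_over_20=True))
```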
Abstract:
Recent studies of Schwinger pair production have demonstrated that the asymptotic particle spectrum is extremely sensitive to the applied field profile. We extend the idea of the dynamically assisted Schwinger effect from single pulse profiles to more realistic field configurations to be generated in an all-optical experiment searching for pair creation. We use the quantum kinetic approach to study the particle production and employ a multi-start method, combined with optimal control theory, to determine a set of parameters for which the particle yield in the forward direction in momentum space is maximized. We argue that this strategy can be used to enhance the signal of pair production on a given detector in an experimental setup.
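The quantum kinetic simulations and the optimal-control machinery are specific to the paper, but the multi-start idea itself is generic: launch a local optimizer from many random initial parameter sets and keep the best result. A minimal numpy/scipy sketch, with a stand-in objective in place of the particle yield, could look like this:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)


def negative_yield(params):
    """Stand-in for the (expensive) negative particle yield as a function of the
    field parameters; in the paper this would come from a quantum kinetic
    simulation of the laser pulse configuration."""
    x, y = params
    return -(np.exp(-(x - 1.0) ** 2 - (y + 0.5) ** 2)
             + 0.5 * np.exp(-(x + 2.0) ** 2 - y ** 2))


def multi_start(objective, bounds, n_starts=20):
    """Run a gradient-based local optimizer from several random starting points
    and return the best local optimum found."""
    best = None
    for _ in range(n_starts):
        x0 = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
        res = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
        if best is None or res.fun < best.fun:
            best = res
    return best


best = multi_start(negative_yield, bounds=[(-3.0, 3.0), (-3.0, 3.0)])
print("best parameters:", best.x, "maximized yield:", -best.fun)
```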
Abstract:
OBJECTIVE In this study, the "Progressive Resolution Optimizer PRO3" (Varian Medical Systems) is compared to the previous version "PRO2" with respect to its potential to improve dose sparing of the organs at risk (OAR) and dose coverage of the PTV for head and neck cancer patients. MATERIALS AND METHODS Volumetric modulated arc therapy (VMAT) treatment plans were generated for eight head and neck cancer patients. All cases have 2-3 phases, and the total prescribed dose (PD) was 60-72 Gy in the PTV. The study mainly focuses on the phase 1 plans, which all have an identical PD of 54 Gy and complex PTV structures overlapping the parotids. Optimization was performed based on planning objectives for the PTV according to ICRU 83, with minimal dose to the spinal cord and to the parotids outside the PTV. In order to assess the quality of the optimization algorithms, an identical set of constraints was used for both PRO2 and PRO3. The resulting treatment plans were investigated with respect to dose distribution based on an analysis of the dose-volume histograms. RESULTS For the phase 1 plans (PD = 54 Gy), the near-maximum dose D2% of the spinal cord could be minimized to 22±5 Gy with PRO3, as compared to 32±12 Gy with PRO2, averaged over all patients. The mean dose to the parotids was also lower in PRO3 plans than in PRO2 plans, but the differences were less pronounced. A PTV coverage of V95% = 97±1% could be reached with PRO3, as compared to 86±5% with PRO2. In clinical routine, these PRO2 plans would require modifications to obtain better PTV coverage at the cost of higher OAR doses. CONCLUSION A comparison between the PRO3 and PRO2 optimization algorithms was performed for eight head and neck cancer patients. In general, the quality of VMAT plans for head and neck patients is improved with PRO3 as compared to PRO2. The dose to OARs can be reduced significantly, especially for the spinal cord. These reductions are achieved with better PTV coverage as compared to PRO2. The improved spinal cord sparing offers new opportunities for all types of paraspinal tumors and for re-irradiation of recurrent tumors or second malignancies.
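For readers unfamiliar with the dose-volume metrics quoted above, the sketch below shows one common way such quantities can be computed from a voxel-wise dose array for a structure; it is a generic illustration, not the treatment planning system's implementation:

```python
import numpy as np


def d2_percent(dose_gy):
    """Near-maximum dose D2%: the minimum dose received by the hottest 2% of the
    structure's voxels, i.e. the 98th percentile of the dose array."""
    return np.percentile(dose_gy, 98)


def v95_percent(dose_gy, prescribed_dose_gy):
    """PTV coverage V95%: fraction of voxels receiving at least 95% of the
    prescribed dose, expressed in percent."""
    return 100.0 * np.mean(dose_gy >= 0.95 * prescribed_dose_gy)


# Illustrative example with synthetic PTV doses for a 54 Gy prescription
rng = np.random.default_rng(1)
ptv_dose = rng.normal(loc=54.5, scale=1.0, size=100_000)
print(f"D2%  = {d2_percent(ptv_dose):.1f} Gy")
print(f"V95% = {v95_percent(ptv_dose, prescribed_dose_gy=54.0):.1f} %")
```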
Abstract:
OBJECTIVE Cochlear implants (CIs) have become the gold standard treatment for deafness. These neuroprosthetic devices feature a linear electrode array, surgically inserted into the cochlea, and function by directly stimulating the auditory neurons located within the spiral ganglion, bypassing lost or non-functioning hair cells. Despite their success, some limitations remain, including poor frequency resolution and high energy consumption. In both cases, the anatomical gap between the electrode array and the spiral ganglion neurons (SGNs) is believed to be an important limiting factor. The final goal of the study is to characterize the response profiles of SGNs growing in intimate contact with an electrode array, in view of designing novel CI devices and stimulation protocols featuring a gapless interface with auditory neurons. APPROACH We characterized SGN responses to extracellular stimulation using multi-electrode arrays (MEAs). In our view, this setup allows many of the limiting interface aspects between CIs and SGNs to be optimized in vitro. MAIN RESULTS Early postnatal mouse SGN explants were analyzed after 6-18 days in culture. Different stimulation protocols were compared with the aim of lowering the stimulation threshold and the energy needed to elicit a response. In the best case, a four-fold reduction of the energy was obtained by lengthening the biphasic stimulus from 40 μs to 160 μs. Similarly, quasi-monophasic pulses were more effective than biphasic pulses, and the insertion of an interphase gap moderately improved efficiency. Finally, stimulation with an external electrode mounted on a micromanipulator showed that the energy needed to elicit a response could be reduced by a factor of five by decreasing the electrode's distance from the auditory neurons from 40 μm to 0 μm. SIGNIFICANCE This study is the first to show electrical activity of SGNs on MEAs. Our findings may help to improve stimulation by CIs and to reduce their energy consumption, and thereby contribute to the development of fully implantable devices with better auditory resolution in the future.
Abstract:
This work deals with the parallel optimization of expensive objective functions which are modelled as sample realizations of Gaussian processes. The study is formalized as a Bayesian optimization problem, or continuous multi-armed bandit problem, where a batch of q > 0 arms is pulled in parallel at each iteration. Several algorithms have been developed for choosing batches by trading off exploitation and exploration. As of today, the maximum Expected Improvement (EI) and Upper Confidence Bound (UCB) selection rules appear to be the most prominent approaches for batch selection. Here, we build upon recent work on the multipoint Expected Improvement criterion, for which an analytic expansion relying on Tallis' formula was recently established. Since the computational burden of this selection rule is still an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, which aims at facilitating its maximization using gradient-based ascent algorithms. Substantial computational savings are shown in application. In addition, our algorithms are tested numerically and compared to state-of-the-art UCB-based batch-sequential algorithms. Combining starting designs relying on UCB with gradient-based EI local optimization finally appears to be a sound option for batch design in distributed Gaussian process optimization.
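The closed-form gradient derived in the paper relies on Tallis' formula and is too long to reproduce here, but the quantity being maximized, the multipoint (q-point) Expected Improvement, is easy to state and to estimate by Monte Carlo. The sketch below is such a plain MC estimator for a minimization problem, given the GP posterior mean and covariance at a candidate batch; it deliberately does not use the analytic expansion:

```python
import numpy as np


def q_ei_monte_carlo(mu, cov, f_best, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the multipoint Expected Improvement
    qEI(X) = E[ max(0, f_best - min_i Y_i) ], where Y ~ N(mu, cov) is the GP
    posterior at the q batch points X and f_best is the best observed value."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mu, cov, size=n_samples)  # shape (n_samples, q)
    improvement = np.maximum(0.0, f_best - samples.min(axis=1))
    return improvement.mean()


# Toy batch of q = 3 points with correlated GP predictions (values invented)
mu = np.array([0.2, -0.1, 0.3])
cov = np.array([[0.30, 0.10, 0.05],
                [0.10, 0.25, 0.08],
                [0.05, 0.08, 0.40]])
print("qEI estimate:", q_ei_monte_carlo(mu, cov, f_best=0.0))
```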
Abstract:
Background. The CDC estimates that 40% of adults 50 years of age or older do not receive time-appropriate colorectal cancer screening. Sixty percent of colorectal cancer deaths could be prevented by regular screening of adults 50 years of age and older. Yet, in 2000 only 42.5% of adults age 50 or older in the U.S. had received recommended screening. Disparities by health care, nativity status, socioeconomic status, and race/ethnicity are evident. Disparities in minority, underserved populations prevent us from attaining Goal 2 of Healthy People 2010 to "eliminate health disparities." This review focuses on community-based screening research among underserved populations that includes multiple ethnic groups for appropriate disparities analysis. There is a gap in the colorectal cancer screening literature describing the effectiveness of community-based randomized controlled trials. Objective. To critically review the literature describing community-based colorectal cancer screening strategies that are randomized controlled trials and that include multiple racial/ethnic groups. Methods. The review includes a preliminary disparities analysis to assess whether interventions were appropriately targeted in communities to those groups experiencing the greatest health disparities. Review articles are from an original search using Ovid Medline and a cross-matching search in PubMed, both from January 2001 to June 2009. The Ovid Medline literature review is divided into eight exclusionary stages, seven electronic, with the last stage consisting of final manual review. Results. The final studies (n=15) are categorized into four categories: Patient mailings (n=3), Telephone outreach (n=3), Electronic/multimedia (n=4), and Counseling/community education (n=5). Of the 15 studies, 11 (73%) demonstrated that screening rates increased for the intervention group compared to controls, including all studies (100%) from the Patient mailings and Telephone outreach groups, 4 of 5 (80%) Counseling/community education studies, and 1 of 4 (25%) Electronic/multimedia interventions. Conclusions. Patient choice and tailoring education and/or messages to individuals have proven to be two important factors in improving colorectal cancer screening adherence rates. Technological strategies have not been overly successful with underserved populations in community-based trials. Based on the limited findings to date, future community-based colorectal cancer screening trials should include diverse populations who are experiencing incidence, survival, mortality, and screening disparities.
Abstract:
Next to leisure, sport, and household activities, the most common activity resulting in medically consulted injuries and poisonings in the United States is work, with an estimated 4 million workplace-related episodes reported in 2008 (U.S. Department of Health and Human Services, 2009). To address the risks inherent to various occupations, risk management programs are typically put in place that include worker training, engineering controls, and personal protective equipment. Recent studies have shown that such interventions alone are insufficient to adequately manage workplace risks, and that the climate in which the workers and safety program exist (known as the "safety climate") is an equally important consideration. The organizational safety climate is so important that many studies have focused on developing means of measuring it in various work settings. While safety climate studies have been reported for several industrial settings, published studies on assessing safety climate in the university work setting are largely absent. Universities are particularly unique workplaces because of the potential exposure to a diversity of agents representing both acute and chronic risks. Universities are also unique because readily detectable health and safety outcomes are relatively rare. The ability to measure safety climate in a work setting with rarely observed systemic outcome measures could serve as a powerful means for the evaluation of safety risk management programs. The goal of this research study was the development of a survey tool to measure safety climate specifically in the university work setting. The use of a standardized tool also allows for comparisons among universities throughout the United States. A specific study objective, to quantitatively assess safety climate at five universities across the United States, was accomplished: 971 participants at these universities completed an online questionnaire to measure the safety climate. The average safety climate score across the five universities was 3.92 on a scale of 1 to 5, with 5 indicating very high perceptions of safety at these universities. The two lowest overall dimensions of university safety climate were "acknowledgement of safety performance" and "department and supervisor's safety commitment". The results underscore how the perception of safety climate is significantly influenced at the local level. A second study objective, evaluating the reliability and validity of the safety climate questionnaire, was also accomplished. A third objective fulfilled was to provide executive summaries of the questionnaire results to the participating universities' health & safety professionals and collect feedback on usefulness, relevance, and perceived accuracy. Overall, the professionals found the survey and results to be very useful, relevant, and accurate. Finally, the safety climate questionnaire will be offered to other universities for benchmarking purposes at the annual meeting of a nationally recognized university health and safety organization. The ultimate goal of the project, the creation of a standardized tool that can be used for measuring safety climate in the university work setting and can facilitate meaningful comparisons amongst institutions, was accomplished.
EPANET Input Files of New York Tunnels and Pacific City used in a metamodel-based optimization study
Abstract:
Metamodels have proven to be very useful when it comes to reducing the computational requirements of Evolutionary Algorithm-based optimization by acting as quick-solving surrogates for slow-solving fitness functions. The relationship between metamodel scope and objective function varies between applications; that is, in some cases the metamodel acts as a surrogate for the whole fitness function, whereas in other cases it replaces only a component of the fitness function. This paper presents a formalized qualitative process to evaluate a fitness function to determine the most suitable metamodel scope, so as to increase the likelihood of calibrating a high-fidelity metamodel and hence obtain good optimization results in a reasonable amount of time. The process is applied to the risk-based optimization of water distribution systems, a very computationally intensive problem for real-world systems. The process is validated with a simple case study (modified New York Tunnels), and the power of metamodelling is demonstrated on a real-world case study (Pacific City) with a computational speed-up of several orders of magnitude.
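As a concrete, heavily simplified illustration of a metamodel acting as a surrogate for a slow fitness function, the sketch below pre-screens candidate solutions with a Gaussian-process metamodel and only sends the most promising ones to the expensive evaluator; the hydraulic/risk simulation of a real water-distribution problem is replaced here by a stand-in function, and nothing below reproduces the paper's actual process:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)


def expensive_fitness(x):
    """Stand-in for a slow-solving fitness function (e.g. one that runs a
    hydraulic or risk simulation for a water distribution design)."""
    return np.sum((x - 0.3) ** 2) + 0.01 * np.sin(20 * x).sum()


# Initial design: evaluate a small sample with the true (expensive) fitness
X_train = rng.uniform(0, 1, size=(30, 5))
y_train = np.array([expensive_fitness(x) for x in X_train])

# Fit the metamodel (surrogate) on the evaluated sample
surrogate = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)

# Generate a large batch of candidate solutions (e.g. EA offspring), rank them
# with the cheap surrogate, and evaluate only the top few with the true fitness
candidates = rng.uniform(0, 1, size=(500, 5))
predicted = surrogate.predict(candidates)
top = candidates[np.argsort(predicted)[:5]]
exact = np.array([expensive_fitness(x) for x in top])
print("best pre-screened candidate fitness:", exact.min())
```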
Abstract:
Providing accurate maps of coral reefs where the spatial scale and labels of the mapped features correspond to map units appropriate for examining biological and geomorphic structures and processes is a major challenge for remote sensing. The objective of this work is to assess the accuracy and relevance of the process used to derive geomorphic zone and benthic community zone maps for three western Pacific coral reefs produced from multi-scale, object-based image analysis (OBIA) of high-spatial-resolution multi-spectral images, guided by field survey data. Three Quickbird-2 multi-spectral data sets from reefs in Australia, Palau, and Fiji and georeferenced field photographs were used in a multi-scale segmentation and object-based image classification to map geomorphic zones and benthic community zones. A per-pixel approach was also tested for mapping benthic community zones. Validation of the maps and comparison to past approaches indicated that the multi-scale OBIA process enabled field data, operator field experience, and a conceptual hierarchical model of the coral reef environment to be linked to provide output maps at geomorphic zone and benthic community scales on coral reefs. The OBIA mapping accuracies were comparable with previously published work using other methods; however, the classes mapped were matched to a predetermined set of features on the reef.
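The mapping workflow itself is specific to the study, but the contrast between per-pixel and object-based classification can be sketched generically: segment the image into objects first, average the spectral bands per object, and classify those object features instead of individual pixels. The toy sketch below uses SLIC superpixels and a nearest-centroid rule purely as an illustration; it is not the segmentation or rule set used in the paper:

```python
import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(0)

# Toy 4-band "multi-spectral" image and two illustrative class centroids
image = rng.random((100, 100, 4))
centroids = np.array([[0.3, 0.3, 0.3, 0.3],   # e.g. "sand"
                      [0.7, 0.7, 0.7, 0.7]])  # e.g. "coral/algae"


def nearest_centroid(features):
    """Assign each feature vector (row) to the closest class centroid."""
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)


# Per-pixel classification: every pixel is classified independently
per_pixel = nearest_centroid(image.reshape(-1, 4)).reshape(100, 100)

# Object-based classification: segment first, then classify per-object means
segments = slic(image, n_segments=200, compactness=10, channel_axis=-1)
object_map = np.zeros_like(segments)
for seg_id in np.unique(segments):
    mask = segments == seg_id
    label = nearest_centroid(image[mask].mean(axis=0)[None, :])[0]
    object_map[mask] = label

print("per-pixel class counts:   ", np.bincount(per_pixel.ravel()))
print("object-based class counts:", np.bincount(object_map.ravel()))
```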
Abstract:
Despite the fact that input–output (IO) tables form a central part of the System of National Accounts, each individual country's national IO table exhibits more or less different features and characteristics, reflecting the country's socioeconomic idiosyncrasies. Consequently, the compilers of a multi-regional input–output table (MRIOT) are advised to thoroughly examine the conceptual as well as methodological differences among countries in the estimation of basic statistics for national IO tables and, if necessary, to carry out pre-adjustment of these tables into a common format prior to the MRIOT compilation. The objective of this study is to provide a practical guide for harmonizing national IO tables to construct a consistent MRIOT, referring to the adjustment practices used by the Institute of Developing Economies, JETRO (IDE-JETRO) in compiling the Asian International Input–Output Table.
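After harmonization, the national tables typically end up as the diagonal blocks of the MRIOT, with interregional trade filling the off-diagonal blocks. Schematically, for two countries r and s (the notation here is generic, not IDE-JETRO's):

```latex
\[
Z^{\mathrm{MRIO}} =
\begin{pmatrix}
Z^{rr} & Z^{rs} \\
Z^{sr} & Z^{ss}
\end{pmatrix},
\qquad
x^{r}_{i} = \sum_{j} z^{rr}_{ij} + \sum_{j} z^{rs}_{ij} + f^{rr}_{i} + f^{rs}_{i},
\]
```

where Z^{rr} and Z^{ss} are the harmonized domestic transaction matrices, Z^{rs} and Z^{sr} record interregional intermediate trade, and f^{rr}, f^{rs} are final-demand deliveries at home and abroad. Ensuring that both countries' tables use the same sector classification, valuation, and treatment of imports is what makes the row and column identities of this block matrix consistent.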
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
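CiaoPP's analyses are far richer than anything that fits here, but the core idea of abstract interpretation can be shown on a toy example: evaluate arithmetic expressions over an abstract sign domain instead of concrete numbers. The sketch below is only such a toy, written in Python rather than in the Ciao system, and is not CiaoPP's machinery:

```python
# Toy abstract interpretation over the sign domain {NEG, ZERO, POS, TOP}.
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"


def sign_of(n):
    return ZERO if n == 0 else (POS if n > 0 else NEG)


def abs_add(a, b):
    """Abstract addition: precise where possible, TOP (unknown sign) otherwise."""
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b else TOP


def abs_mul(a, b):
    """Abstract multiplication over signs."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG


def analyze(expr, env):
    """Abstractly evaluate a nested-tuple expression, e.g. ('+', 'x', ('*', 'y', 'y')).
    `env` maps variable names to abstract signs."""
    if isinstance(expr, int):
        return sign_of(expr)
    if isinstance(expr, str):
        return env[expr]
    op, lhs, rhs = expr
    l, r = analyze(lhs, env), analyze(rhs, env)
    return abs_add(l, r) if op == "+" else abs_mul(l, r)


# With x > 0 and y < 0 the analysis derives that x + y*y is positive,
# without ever touching concrete values.
print(analyze(("+", "x", ("*", "y", "y")), {"x": POS, "y": NEG}))  # -> 'pos'
```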
Abstract:
The work presented here aims to reduce the cost of multijunction solar cell technology by developing ways to manufacture them on cheap substrates such as silicon. In particular, our main objective is the growth of III-V semiconductors on silicon substrates for photovoltaic applications. The goal is to create a GaAsP/Si virtual substrate onto which other III-V cells could be integrated, with an interesting efficiency potential. This technology involves several challenges due to the difficulty of growing III-V materials on silicon. In this paper, our first work aimed at developing such a structure is presented. It focused on the development of phosphorus diffusion models in silicon and on the preparation of an optimal silicon surface on which to grow III-V materials.
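The abstract does not detail the diffusion model, but the standard starting point for phosphorus diffusion in silicon is Fick's second law with an Arrhenius diffusivity; for a constant-surface-source anneal the textbook solution is the complementary-error-function profile below (a generic reference model, not necessarily the one calibrated in the paper):

```latex
\[
\frac{\partial C}{\partial t} = D \,\frac{\partial^{2} C}{\partial x^{2}},
\qquad
D(T) = D_{0}\,\exp\!\left(-\frac{E_{a}}{k_{B} T}\right),
\qquad
C(x,t) = C_{s}\,\operatorname{erfc}\!\left(\frac{x}{2\sqrt{D t}}\right),
\]
```

with C_s the surface concentration, D_0 and E_a the pre-exponential factor and activation energy, and T the anneal temperature. High-concentration phosphorus diffusion deviates from this simple picture (concentration-dependent diffusivity, the characteristic kink-and-tail profile), which is why dedicated models need to be developed and calibrated.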
Abstract:
We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. This framework avoids the need to consider an specific encoder architecture for encoding latency analysis by assuming an unlimited processing capacity on the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times have to be considered, and the encoding latency is solved systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of low latency encoder design with low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add higher encoding latency to the encoder. Experimental results for JMVM prediction structures illustrate how low latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
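Under the unlimited-processing-capacity assumption, the latency bound for each frame reduces to a longest-path computation over the prediction (dependency) graph: a frame can start encoding only after all of its reference frames are finished. The sketch below computes that bound for a toy dependency graph; the frame names and processing times are invented for illustration, frame capture/arrival times are ignored, and this is not the paper's exact graph formulation:

```python
from functools import lru_cache

# Toy prediction structure: frame -> list of reference frames it depends on.
references = {
    "I0": [],
    "P4": ["I0"],
    "B2": ["I0", "P4"],
    "B1": ["I0", "B2"],
    "B3": ["B2", "P4"],
}
# Per-frame processing times (arbitrary units), assuming unlimited parallel encoders.
proc_time = {"I0": 3, "P4": 2, "B2": 1, "B1": 1, "B3": 1}


@lru_cache(maxsize=None)
def completion_time(frame):
    """Earliest time the frame is fully encoded: its own processing time plus
    the latest completion time among the frames it predicts from."""
    start = max((completion_time(d) for d in references[frame]), default=0)
    return start + proc_time[frame]


encoding_latency = max(completion_time(f) for f in references)
print("encoding latency lower bound:", encoding_latency)  # longest path in the DAG
```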