975 results for Optimization analysis
Abstract:
Current advanced cloud infrastructure management solutions allow scheduling actions for dynamically changing the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose using a dynamically generated scaling model for the VMs containing the services of distributed applications, which is able to react to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed in order to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the Service Level Agreements (SLAs) that control the scaling of distributed services, combining data analysis mechanisms with application benchmarking across multiple VM configurations. By processing the data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical SLA parameters. By combining this set of predictor metrics with a heuristic for selecting appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used to control the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems. We show how dynamically generated SLAs can be successfully used to control the management of distributed service scaling.
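Where the inferred scaling rules are applied at runtime, the control loop reduces to comparing a monitored predictor metric against thresholds derived from the benchmarks. A minimal sketch of such a rule, with hypothetical metric names and threshold values (the paper infers these from benchmark data):

```python
# Minimal sketch of a runtime scale-out/scale-in decision driven by one
# inferred SLA rule; the metric name and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class ScalingRule:
    metric: str             # monitored predictor metric for one service type
    scale_out_above: float  # threshold above which the SLA is at risk
    scale_in_below: float   # threshold below which capacity is wasted

def scaling_decision(rule: ScalingRule, observed: float, replicas: int) -> int:
    """Return the new replica count for one service under one SLA rule."""
    if observed > rule.scale_out_above:
        return replicas + 1                        # workload too high: scale out
    if observed < rule.scale_in_below and replicas > 1:
        return replicas - 1                        # spare capacity: scale in
    return replicas

# Example: a rule inferred from benchmark data for a hypothetical web tier.
rule = ScalingRule("avg_request_latency_ms", scale_out_above=250.0, scale_in_below=80.0)
print(scaling_decision(rule, observed=310.0, replicas=3))  # -> 4
```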
Abstract:
Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri; B.S., Birla Institute of Technology and Science, Pilani, India; M.S., University of Massachusetts Amherst. Directed by Professor Ian Grosse. Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this thesis is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge; the instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test-bed application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent KB tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, both methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD, and help reduce the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
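The model-switching behavior described above amounts to checking validity rules against the current design state and picking the cheapest trusted model. A minimal sketch of that idea, with hypothetical rules and thresholds standing in for the ontology-encoded modeling knowledge:

```python
# Minimal sketch of fidelity switching during optimization: use the cheap
# beam-element model when its validity rules hold for the current design
# state, otherwise fall back to the costly shell-element model. The rules
# and thresholds below are hypothetical stand-ins for the ontology-encoded
# modeling knowledge.
def choose_model(web_thickness_mm: float, span_to_depth_ratio: float) -> str:
    """Pick the cheapest analysis model whose validity rules hold."""
    beam_model_trusted = web_thickness_mm > 5.0 and span_to_depth_ratio > 10.0
    return "beam-element FEM" if beam_model_trusted else "shell-element FEM"

for state in [(8.0, 14.0), (3.2, 6.5)]:   # two optimization design states
    print(state, "->", choose_model(*state))
```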
Abstract:
Kriging-based optimization relying on noisy evaluations of complex systems has recently motivated contributions from various research communities. Five strategies have been implemented in the DiceOptim package. The corresponding functions constitute a user-friendly tool for solving expensive noisy optimization problems in a sequential framework, while offering some flexibility to advanced users. Moreover, the implementation is done in a unified environment, making this package a useful device for studying the relative performance of existing approaches depending on the experimental setup. An overview of the package structure and interface is provided, as well as a description of the strategies and some insight into the implementation challenges and the proposed solutions. The strategies are compared to some existing optimization packages on analytical test functions and show promising performance.
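DiceOptim itself is an R package; as a language-neutral illustration of the underlying idea, here is a minimal Python sketch of sequential Kriging-based optimization of a noisy function with an expected-improvement criterion (not the package's API; the objective and noise level are toy choices):

```python
# Sequential Kriging (Gaussian-process) optimization of a noisy 1-D function.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(0)

def f(x):
    return np.sin(3 * x) + x ** 2 - 0.7 * x    # toy expensive objective

X = rng.uniform(-1, 2, size=(6, 1))            # initial design
y = f(X).ravel() + rng.normal(0, 0.1, size=6)  # noisy evaluations

grid = np.linspace(-1, 2, 400).reshape(-1, 1)
for _ in range(20):                            # sequential evaluation budget
    # WhiteKernel absorbs observation noise, as appropriate for noisy Kriging.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5) + WhiteKernel()).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = mu.min()                            # plug-in best for the noisy setting
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).item() + rng.normal(0, 0.1))

print("estimated minimizer:", grid[np.argmin(mu)].item())
```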
Abstract:
Treatment of metastatic melanoma with tumor-reactive T cells (adoptive T cell therapy, ACT) is a promising approach associated with a high clinical response rate. However, further optimization of this treatment modality is required to increase the clinical response after this therapy. ACT in melanoma involves an initial phase (pre-REP) of tumor-infiltrating lymphocyte (TIL) expansion ex vivo from tumor isolates, followed by a second phase, the "rapid expansion protocol" (REP), generating the billions of cells used as the TIL infusion product. The main question addressed in this thesis was how the currently used REP affects the responsiveness of CD8+ T cells to defined melanoma antigens. We hypothesized that the REP drives the TIL to further differentiate and become hyporesponsive to antigen restimulation, and that proper cytokine treatment or other ways of expanding TIL are therefore required to improve upon this outcome. We evaluated the response of CD8+ TIL to melanoma antigen restimulation using MART-1 peptide-pulsed mature DC in vitro. Post-REP TIL were mostly hyporesponsive, with poor proliferation and higher apoptosis. Phenotypic analysis revealed that the expression of CD28 was significantly reduced in post-REP TIL. Through sorting experiments and microarray analysis, we confirmed that the few CD28+ post-REP TIL had superior survival capacity and proliferated after restimulation. We then investigated methods to maintain CD28 expression during the REP and improve TIL responsiveness. First, IL-15 and IL-21 were found to synergize in maintaining TIL CD28 expression and antigenic responsiveness during the REP. Second, we found IL-15 to be superior to IL-2 in supporting the long-term expansion of antigen-specific CD8+ TIL after restimulation. These results suggest that current expansion protocols used for adoptive T-cell therapy in melanoma yield largely hyporesponsive products containing CD8+ T cells unable to respond in vivo to restimulation with antigen. Modifying our current approaches by using IL-15+IL-21 as supporting cytokines in the REP, and/or administering IL-15 instead of IL-2 after TIL infusion, may enhance the anti-tumor efficacy and long-term persistence of infused T cells in vivo.
Abstract:
PURPOSE Management of ureteral stones remains controversial. To determine whether optimizing extracorporeal shock wave (SW) lithotripsy (ESWL) delivery rates improves treatment of solitary ureteral stones, we compared outcomes of two SW delivery rates in a prospective, randomized trial. MATERIALS AND METHODS From July 2010 to October 2012, 254 consecutive patients were randomized to undergo ESWL at SW delivery rates of either 60 pulses (n=130) or 90 pulses (n=124) per min. The primary endpoint was the stone-free rate at 3-month follow-up. Secondary endpoints included stone disintegration, treatment time, complications, and the rate of secondary treatments. Descriptive statistics were used to compare endpoints between the two groups. Adjusted odds ratios and 95% confidence intervals were calculated to assess predictors of success. RESULTS The stone-free rate at 3 months was significantly higher in patients who underwent ESWL at a SW delivery rate of 90 pulses per min than in those receiving 60 pulses (91% vs. 80%, p=0.01). Patients with proximal and mid-ureter stones, but not those with distal ureter stones, accounted for the observed difference (100% vs. 83%, p=0.005; 96% vs. 73%, p=0.03; and 81% vs. 80%, p=0.9, respectively). Treatment time, complications, and the rate of secondary treatments were comparable between the two groups. In multivariable analysis, a SW delivery rate of 90 pulses per min, proximal stone location, stone density, stone size, and the absence of an indwelling JJ stent were independent predictors of success. CONCLUSIONS Optimization of ESWL delivery rates can achieve excellent results for ureteral stones.
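The adjusted odds ratios reported above come from a multivariable logistic regression; a minimal sketch of that kind of analysis on synthetic data (the variable names and coefficients are illustrative, not the study's dataset):

```python
# Adjusted odds ratios with 95% CIs via multivariable logistic regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 254
df = pd.DataFrame({
    "rate_90":  rng.integers(0, 2, n),   # 90 vs 60 pulses/min (randomized arm)
    "proximal": rng.integers(0, 2, n),   # proximal stone location
    "size_mm":  rng.normal(8, 2, n),     # stone size
    "jj_stent": rng.integers(0, 2, n),   # indwelling JJ stent
})
# Synthetic outcome generated from assumed (illustrative) effect sizes.
logit = -1 + 0.8 * df.rate_90 + 0.6 * df.proximal - 0.1 * df.size_mm - 0.5 * df.jj_stent
df["stone_free"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["rate_90", "proximal", "size_mm", "jj_stent"]])
res = sm.Logit(df["stone_free"].astype(int), X).fit(disp=0)
print(np.exp(res.params))      # adjusted odds ratios
print(np.exp(res.conf_int()))  # 95% confidence intervals
```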
Abstract:
INTRODUCTION Spinal disc herniation, lumbar spinal stenosis and spondylolisthesis are known to be leading causes of lumbar back pain. The costs of low back pain management and related operations are continuously increasing in the healthcare sector. There are many studies on complications after spine surgery, but little is known about the factors predicting the length of stay in hospital. The purpose of this study was to identify these factors in lumbar spine surgery in order to adapt postoperative treatment. MATERIAL AND METHODS The current study was carried out as a post hoc analysis on the basis of the German spine registry. Included were patients who underwent lumbar spine surgery via posterior surgical access with posterior fusion and/or rigid stabilization; procedures with dynamic stabilization were excluded. Patient characteristics were tested for association with length of stay (LOS) using bivariate and multivariate analyses. RESULTS A total of 356 patients met the inclusion criteria. The average age of all patients was 64.6 years and the mean LOS was 11.9 ± 6.0 days, with a range of 2-44 days. Independent factors influencing LOS were increased age at the time of surgery, higher body mass index, male gender, blood transfusion of 1-2 erythrocyte concentrates, and the presence of surgical complications. CONCLUSION Identification of predictive factors for prolonged LOS may allow estimation of patient hospitalization time and optimization of postoperative care. In individual cases this may result in a reduction of the LOS.
Abstract:
Femoroacetabular impingement (FAI) before or after periacetabular osteotomy (PAO) is surprisingly frequent, and surgeons need to be aware of the risk preoperatively and be able to avoid it intraoperatively. In this paper we present a novel computer-assisted planning and navigation system for PAO with impingement analysis and range of motion (ROM) optimization. Our system starts with a fully automatic detection of the acetabular rim, which allows quantifying the acetabular morphology with parameters such as acetabular version, inclination and femoral head coverage ratio for computer-assisted diagnosis and planning. The planned situation was then optimized with impingement simulation by balancing acetabular coverage against ROM. Intra-operatively, navigation was conducted until the optimized planning situation was achieved. Our experimental results demonstrated that: 1) the fully automated acetabular rim detection was validated with an accuracy of 1.1 ± 0.7 mm; 2) the optimized PAO planning improved ROM significantly compared to planning without ROM optimization; 3) comparing the pre-operatively planned situation with the intra-operatively achieved situation, sub-degree accuracy was achieved for all directions.
Abstract:
Progress toward elucidating the 3D structures of eukaryotic membrane proteins has been hampered by the lack of appropriate expression systems. Recent work using the Xenopus oocyte as a novel expression system for structural analysis demonstrates its capability to provide not only the significant protein yields required for structural work but also expression of eukaryotic membrane proteins in a more native and functional conformation. There is a long history of using the oocyte expression system as an efficient tool for membrane transporter and channel expression in direct functional analysis, and improvements in robotic injection systems and protein yield optimization now allow expressed proteins to be rapidly scaled up, purified, and characterized in physiologically relevant structural states. Traditional overexpression systems (yeast, bacteria, and insect cells), by comparison, require chaotropic conditions over several steps for extraction, solubilization, and purification. By contrast, overexpression within the oocyte system for subsequent negative-staining transmission electron microscopy studies provides a single system that can functionally assess and purify eukaryotic membrane proteins in fewer steps while maintaining the physiological properties of the membrane protein.
Abstract:
OBJECTIVE In this study, the Progressive Resolution Optimizer PRO3 (Varian Medical Systems) is compared to the previous version, PRO2, with respect to its potential to improve dose sparing of the organs at risk (OAR) and dose coverage of the PTV for head and neck cancer patients. MATERIALS AND METHODS Volumetric modulated arc therapy (VMAT) treatment plans were generated for eight head and neck cancer patients. All cases had 2-3 phases, and the total prescribed dose (PD) was 60-72 Gy to the PTV. The study focuses mainly on the phase 1 plans, which all have an identical PD of 54 Gy and complex PTV structures overlapping the parotids. Optimization was performed based on planning objectives for the PTV according to ICRU 83, with minimal dose to the spinal cord and to the parotids outside the PTV. In order to assess the quality of the optimization algorithms, an identical set of constraints was used for both PRO2 and PRO3. The resulting treatment plans were investigated with respect to dose distribution based on analysis of the dose-volume histograms. RESULTS For the phase 1 plans (PD = 54 Gy), the near-maximum dose D2% of the spinal cord could be minimized to 22 ± 5 Gy with PRO3, compared to 32 ± 12 Gy with PRO2, averaged over all patients. The mean dose to the parotids was also lower in PRO3 plans than in PRO2 plans, but the differences were less pronounced. A PTV coverage of V95% = 97 ± 1% could be reached with PRO3, compared to 86 ± 5% with PRO2. In clinical routine, these PRO2 plans would require modifications to obtain better PTV coverage at the cost of higher OAR doses. CONCLUSION A comparison between the PRO3 and PRO2 optimization algorithms was performed for eight head and neck cancer patients. In general, the quality of VMAT plans for head and neck patients is improved with PRO3 as compared to PRO2. The dose to OARs can be reduced significantly, especially for the spinal cord, and these reductions are achieved with better PTV coverage than with PRO2. The improved spinal cord sparing offers new opportunities for all types of paraspinal tumors and for re-irradiation of recurrent tumors or second malignancies.
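The reported D2% and V95% values are read off cumulative dose-volume histograms; a minimal sketch of how both metrics can be computed from per-voxel doses (the dose arrays below are synthetic):

```python
# DVH point metrics: near-maximum dose D2% and coverage V95%.
import numpy as np

def d_percent(doses_gy: np.ndarray, p: float) -> float:
    """Dose received by the hottest p% of the volume, e.g. p=2 for D2%."""
    return float(np.percentile(doses_gy, 100 - p))

def v_percent(doses_gy: np.ndarray, pd_gy: float, p: float) -> float:
    """Percent of the volume receiving at least p% of the prescribed dose."""
    return float(np.mean(doses_gy >= p / 100 * pd_gy) * 100)

rng = np.random.default_rng(2)
cord = rng.normal(20, 4, 10_000)   # synthetic spinal-cord voxel doses (Gy)
ptv = rng.normal(54, 1.5, 50_000)  # synthetic PTV voxel doses, PD = 54 Gy
print(f"cord D2%  = {d_percent(cord, 2):.1f} Gy")
print(f"PTV  V95% = {v_percent(ptv, 54, 95):.1f} %")
```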
Abstract:
Diamonds are known for both their beauty and their durability. Jefferson National Lab in Newport News, VA has found a way to use the diamond's strength to view the beauty of the inside of the atomic nucleus, in the hope of finding exotic forms of matter. By firing very fast electrons at a diamond sheet no thicker than a human hair, high-energy particles of light known as photons are produced with a high degree of polarization that can illuminate the constituents of the nucleus known as quarks. The University of Connecticut Nuclear Physics group is responsible for crafting these extremely thin, high-quality diamond wafers. The wafers must be cut from larger stones that are about the size of a human finger and then carefully machined down to the final thickness. The thinning of these diamonds is extremely challenging, as the diamond's greatest strength also becomes its greatest weakness. The group has developed a novel technique, based on laser interferometry, to assist industrial partners in assessing the quality of the final machining steps. The images of the diamond surface produced by the interferometer encode its thickness and shape in a complex way that requires detailed analysis to extract. We have developed a novel software application to analyze these images based on the method of simulated annealing. Being able to image the surface of these diamonds without requiring costly X-ray diffraction measurements allows rapid feedback to the industrial partners as they refine their thinning techniques. Thus, by utilizing a material found to be beautiful by many, the beauty of nature can be brought more clearly into view.
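Simulated annealing, as used in the image-analysis application above, accepts occasional uphill moves so the fit can escape local minima while a temperature parameter cools. A minimal generic sketch that fits toy surface parameters by minimizing a misfit "energy" (the model and data here are stand-ins, not the group's software):

```python
# Generic simulated annealing for parameter fitting.
import math
import random

def energy(params, data):
    """Sum of squared residuals between a toy surface model and the data."""
    a, b = params
    return sum((y - (a * x + b * x * x)) ** 2 for x, y in data)

def anneal(data, steps=20_000, t0=1.0, cooling=0.9995):
    params, temp = [0.0, 0.0], t0
    e = energy(params, data)
    for _ in range(steps):
        trial = [p + random.gauss(0, 0.05) for p in params]   # random move
        e_trial = energy(trial, data)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if e_trial < e or random.random() < math.exp((e - e_trial) / temp):
            params, e = trial, e_trial
        temp *= cooling                                       # cool slowly
    return params, e

data = [(x / 10, 1.5 * (x / 10) + 0.3 * (x / 10) ** 2) for x in range(20)]
print(anneal(data))  # recovers approximately a=1.5, b=0.3
```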
Abstract:
Current shortcomings in cancer therapy require the generation of new, broadly applicable, potent, targeted treatments. Here, an adenovirus is engineered to replicate specifically in cells with active human telomerase by using a modified hTERT promoter fused to a CMV promoter element. The virus was also modified to contain a visible reporter transgene, GFP. The virus, Ad/hTC-GFP-E1, was characterized in vitro and demonstrated tumor-specific activity, both by dose and in time-course experiments, in a variety of cell lines. In vivo, Ad/hTC-GFP-E1 was effective at suppressing tumor growth and provided a survival benefit without causing any measurable toxicity. To increase the host range of the vector, the fiber region was modified to contain an RGD motif. The resulting vector, AdRGD/hTC-GFP-E1, was recharacterized in vitro, revealing heightened infectivity and toxicity while maintaining a therapeutic window between cancer and normal cell toxicity. AdRGD/hTC-GFP-E1 was administered in vivo by limb perfusion and was observed to be tumor-specific in both expression and replication. To further enhance the efficacy of viral vectors in lung delivery, asthma medications were investigated for their ability to enhance transgene delivery and expression. A combination of bronchodilators, mast cell inhibitors, and mucolytic agents was devised; the single agents demonstrated fold increases in expression in immunocompetent mouse lungs, and the combination of all agents produced more homogeneous, intense expression. To characterize the mechanisms by which some cancers are resistant, or may become resistant, to oncolytic treatments, several small-molecule inhibitors of metabolic pathways were applied in combination with oncolytic infection in vitro. SP600125 and PD 98059, JNK and ERK inhibitors respectively, successfully suppressed oncolytic toxicity but did not affect the infectivity or transgene expression of Ad/hTC-GFP-E1. JNK and ERK inhibition did, however, significantly suppress viral replication, as analyzed by lysate transfer and titration assays. In contrast, SB 203580, a p38 inhibitor, did not demonstrate any protective effect in infected cells. Flow cytometric analysis indicated a possible correlation between G1 arrest and suppressed viral production; however, more compounds must be investigated to clarify this observation.
Abstract:
Various airborne aldehydes and ketones (i.e., airborne carbonyls) present in outdoor, indoor, and personal air pose a risk to human health at present environmental concentrations. To date, there is no adequate, simple-to-use sampler for monitoring carbonyls at parts-per-billion concentrations in personal air. The Passive Aldehydes and Ketones Sampler (PAKS), originally developed for this purpose, has been found to be unreliable in a number of relatively recent field studies. The PAKS method uses dansylhydrazine (DNSH) as the derivatization agent to produce aldehyde derivatives that are analyzed by HPLC with fluorescence detection. The reasons for the poor performance of the PAKS are not known, but it is hypothesized that the chemical derivatization conditions and reaction kinetics, combined with a relatively low sampling rate, may play a role. This study evaluated the effect of absorption and emission wavelengths, pH of the DNSH coating solution, extraction solvent, and time post-extraction on the yield and stability of the formaldehyde, acetaldehyde, and acrolein DNSH derivatives. The results suggest the following optimum conditions for the analysis of DNS-hydrazones: the excitation and emission wavelengths for HPLC analysis should be 250 nm and 500 nm, respectively; the optimal pH of the coating solution appears to be pH 2, because it improves the formation of the di-derivatized acrolein DNS-hydrazone without affecting the response of the formaldehyde and acetaldehyde derivatives; acetonitrile is the preferable extraction solvent; and the optimal time to analyze the aldehyde derivatives is 72 hours post-extraction.
Abstract:
The influence of respiratory motion on patient anatomy poses a challenge to accurate radiation therapy, especially in lung cancer treatment. Modern radiation therapy planning uses models of tumor respiratory motion to account for target motion in targeting. The tumor motion model can be verified on a per-treatment-session basis with four-dimensional cone-beam computed tomography (4D-CBCT), which acquires an image set of the dynamic target throughout the respiratory cycle during the therapy session. 4D-CBCT is undersampled if the scan time is too short; however, a short scan time is desirable in clinical practice to reduce patient setup time. This dissertation presents the design and optimization of 4D-CBCT to reduce the impact of undersampling artifacts at short scan times. This work measures the impact of undersampling artifacts on the accuracy of target motion measurement under different sampling conditions and for various object sizes and motions. The results provide a minimum scan time such that the target tracking error is less than a specified tolerance. This work also presents new image reconstruction algorithms that reduce undersampling artifacts in undersampled datasets by taking advantage of the assumption that the relevant motion of interest is contained within a volume-of-interest (VOI). It is shown that the VOI-based reconstruction provides more accurate image intensity than standard reconstruction: in a study designed to simulate target motion, the VOI-based reconstruction produced 43% lower least-squares error inside the VOI and 84% lower error throughout the image. The VOI-based reconstruction approach can reduce acquisition time and improve image quality in 4D-CBCT.
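The quoted error reductions compare least-squares error inside the VOI against the whole image; a minimal sketch of that relative error metric on synthetic volumes (the arrays stand in for the reconstructed and reference images):

```python
# Relative least-squares error of a reconstruction, restricted to a mask.
import numpy as np

def relative_lsq_error(recon: np.ndarray, reference: np.ndarray, mask: np.ndarray) -> float:
    """Least-squares error of the reconstruction over the masked voxels."""
    diff = (recon - reference)[mask]
    return float(np.sum(diff ** 2) / np.sum(reference[mask] ** 2))

rng = np.random.default_rng(3)
reference = rng.random((64, 64, 64))
recon = reference + rng.normal(0, 0.05, reference.shape)  # artifact-laden image
voi = np.zeros(reference.shape, dtype=bool)
voi[24:40, 24:40, 24:40] = True                           # motion-containing VOI
print("error inside VOI :", relative_lsq_error(recon, reference, voi))
print("error whole image:", relative_lsq_error(recon, reference, np.ones_like(voi)))
```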
Abstract:
Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and inherent range calculation can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of the uncertainties during treatment planning of proton therapy has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific issues regarding setup and range uncertainties. Treatment plan design method adapted to proton therapy: Currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to the uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we have shown that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient. Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV), or of any other volume of interest, at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned earlier. Currently, the PTV is used as a surrogate of the CTV's worst-case scenario for target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed the use of statistical parameters to quantify uncertainties on both the dose-volume histogram and the dose distribution directly. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
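The robust plan evaluation described above boils down to propagating sampled setup and range errors through the dose model and summarizing a dosimetric parameter by its expectation and standard deviation. A minimal sketch with a toy dose surrogate (the surrogate function and uncertainty magnitudes are illustrative assumptions, not the dissertation's dose engine):

```python
# Expectation and standard deviation of a dosimetric parameter under
# sampled setup and range uncertainties.
import numpy as np

rng = np.random.default_rng(4)

def ctv_d95(setup_shift_mm: float, range_error_pct: float) -> float:
    """Toy surrogate: CTV D95 (Gy) degrading with setup and range errors."""
    nominal = 60.0
    return nominal - 0.8 * abs(setup_shift_mm) - 0.5 * abs(range_error_pct)

scenarios = [
    ctv_d95(rng.normal(0, 3), rng.normal(0, 3.5))  # assumed 3 mm / 3.5% SDs
    for _ in range(1000)
]
print(f"E[D95]  = {np.mean(scenarios):.1f} Gy")
print(f"SD[D95] = {np.std(scenarios):.1f} Gy")     # robustness indicator
```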
Abstract:
Nowadays, computing platforms consist of a very large number of components that need to be supplied with different voltage levels and power requirements. Even a very small platform, like a handheld computer, may contain more than twenty different loads and voltage regulators. The power delivery designers of these systems are required to provide, in a very short time, the right power architecture that optimizes performance and meets electrical specifications plus cost and size targets. The appropriate selection of the architecture and converters directly defines the performance of a given solution, so the designer needs to be able to evaluate a significant number of options in order to know with good certainty whether the selected solutions meet the size, energy efficiency and cost targets. The difficulty of selecting the right solution arises from the wide range of power conversion products provided by different manufacturers, ranging from discrete components (to build converters) to complete power conversion modules that employ different manufacturing technologies. Consequently, in most cases it is not possible to analyze all the alternatives (combinations of power architectures and converters) that can be built, and the designer has to select a limited number of converters in order to simplify the analysis. To overcome these difficulties, this thesis proposes a new design methodology for power supply systems that integrates evolutionary computation techniques in order to make it possible to analyze a large number of possibilities. This exhaustive analysis helps the designer quickly define a set of feasible solutions and select the best performance trade-off for each application. The proposed approach consists of two key steps: one for the automatic generation of architectures and another for the optimized selection of components. The implementation of these two steps is detailed in this thesis, and the usefulness of the methodology is corroborated by contrasting results on real problems and on experiments designed to test the limits of the algorithms.
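The two-step methodology pairs architecture generation with evolutionary component selection; a minimal sketch of the selection step as a genetic algorithm over a hypothetical converter catalog (the catalog entries and fitness weights are illustrative assumptions):

```python
# Genetic algorithm assigning one catalog converter to each load, scoring
# candidates on a weighted mix of efficiency, cost, and size.
import random

# (efficiency, cost $, area cm^2) per candidate converter; hypothetical values.
CATALOG = [(0.95, 3.0, 1.2), (0.90, 1.5, 0.8), (0.85, 0.9, 0.5)]
N_LOADS = 20

def fitness(genome):
    eff = sum(CATALOG[g][0] for g in genome) / N_LOADS
    cost = sum(CATALOG[g][1] for g in genome)
    area = sum(CATALOG[g][2] for g in genome)
    return eff * 100 - 0.5 * cost - 0.8 * area   # trade-off to maximize

def evolve(pop_size=50, generations=200, mutation=0.05):
    pop = [[random.randrange(len(CATALOG)) for _ in range(N_LOADS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_LOADS)    # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([random.randrange(len(CATALOG))
                             if random.random() < mutation else g
                             for g in child])     # per-gene mutation
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, f"fitness={fitness(best):.2f}")
```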