920 results for Model-based optimization
Abstract:
Knowledge-based radiation treatment is an emerging concept in radiotherapy. It mainly refers to techniques that can guide or automate clinical treatment planning by learning from prior knowledge. Different models have been developed to realize it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This model can automatically determine both the beam configuration and the optimization objectives for non-coplanar beams based on patient-specific anatomical information. Although plans automatically generated by this model demonstrate dosimetric quality equivalent to or better than that of clinically approved plans, its validity and generality are limited by the empirical assignment of a coefficient, called the angle spread constraint, defined in the beam efficiency index used for beam ranking. To eliminate these limitations, a systematic study of this coefficient is needed to provide evidence for its optimal value.
To achieve this purpose, eleven lung cancer patients with complex tumor shapes, whose clinically approved plans used non-coplanar beams, were retrospectively studied within the framework of the automatic lung IMRT treatment planning algorithm. The primary and boost plans used for three patients were treated as different cases because of their different target sizes and shapes. A total of 14 lung cases were therefore re-planned using the knowledge-based automatic lung IMRT planning algorithm, varying the angle spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency index was adopted to guide the beam selection. Great effort was made to ensure that the quality of the plans associated with every angle spread constraint was as good as possible. Important dosimetric parameters for the PTV and OARs, quantitatively reflecting plan quality, were extracted from the DVHs and analyzed as a function of the angle spread constraint for each case. These parameters were compared between the clinical plans and the model-based plans using two-sample Student's t-tests, and a regression analysis was performed on a composite index, built from the percentage errors between the dosimetric parameters of the model-based plans and those of the clinical plans, as a function of the angle spread constraint.
Results show that the model-based plans generally have quality equivalent to or better than the clinically approved plans, both qualitatively and quantitatively. All dosimetric parameters in the automatically generated plans, except those for the lungs, are statistically better than or comparable to those in the clinical plans. On average, reductions of more than 15% in the conformity and homogeneity indices for the PTV and in V40 and V60 for the heart are observed, together with increases of 8% and 3% in V5 and V20 for the lungs, respectively. The intra-plan comparison among the model-based plans demonstrates that plan quality does not change much for angle spread constraints larger than 0.4. Further examination of the variation of the composite index as a function of the angle spread constraint shows that 0.6 is the optimal value, yielding the statistically best achievable plans.
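The abstract does not give the exact form of the composite index; one plausible reading, stated here purely as an assumption, is a mean absolute percentage error over the monitored dosimetric parameters,

\[
C(a) = \frac{1}{N}\sum_{i=1}^{N}\left|\frac{p_i^{\mathrm{model}}(a) - p_i^{\mathrm{clinical}}}{p_i^{\mathrm{clinical}}}\right|,
\]

where a is the angle spread constraint and the p_i are the PTV and OAR dosimetric parameters; under this reading, the reported optimum is the value of a at which C(a) is statistically smallest.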
Abstract:
The work presented in this dissertation is focused on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in U.S. Navy divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. In Chapters 1 and 2 we present general background information relevant to the development of probabilistic models applied to predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.
The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique. Model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to those of previously published models. Results indicated that a novel hazard function definition, which included both ambient pressure scaling and individually fitted compartment exponent scaling terms, gave the best performance.
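For context, probabilistic decompression models of this kind are usually written in terms of an instantaneous risk (hazard) r(t) accumulated over an exposure; the generic form, given here only as background and not as the dissertation's exact definitions, is

\[
P(\mathrm{DCS}) = 1 - \exp\!\left(-\int_{0}^{\infty} r(t)\,dt\right), \qquad
\ln \mathcal{L} = \sum_{i}\Big[\delta_i \ln P_i + (1-\delta_i)\ln(1-P_i)\Big],
\]

where r(t) >= 0 is driven by the compartment model, delta_i indicates whether exposure i in the empirical data produced symptoms, and the model parameters are chosen to maximize the log-likelihood.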
We developed ten pharmacokinetic compartmental models that included explicit delay mechanics to determine if predictive quality could be improved through the inclusion of material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that symptoms are often reported after risk accumulation begins for many of our models, we hypothesized that the inclusion of delays might improve correlation between the model predictions and observed data. Model selection techniques identified two models as having the best overall performance, but comparison to the best performing model without delay and model selection using our best identified no delay pharmacokinetic model both indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.
Our final investigation explored parameter bounding techniques to identify parameter regions for which statistical model failure will not occur. Statistical model failure occurs when a model predicts no probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we successfully identify regions where model failure will not occur and locate the boundaries of those regions using a root bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.
Abstract:
Software development guidelines are a set of rules which can help improve the quality of software. These rules are defined on the basis of experience gained by the software development community over time. This paper discusses a set of design guidelines for model-based development of complex real-time embedded software systems. To be precise, we propose nine design conventions, three design patterns and thirteen antipatterns for developing UML-RT models. These guidelines have been identified based on our analysis of around 100 UML-RT models from industry and academia. Most of the guidelines are explained with the help of examples, and standard templates from the current state of the art are used for documenting the design rules.
Abstract:
The VLT-FLAMES Tarantula Survey (VFTS) has secured mid-resolution spectra of over 300 O-type stars in the 30 Doradus region of the Large Magellanic Cloud. A homogeneous analysis of such a large sample requires automated techniques, an approach that will also be needed for the upcoming analysis of the Gaia surveys of the Northern and Southern Hemisphere supplementing the Gaia measurements. We point out the importance of Gaia for the study of O stars, summarize the O star science case of VFTS, and present a test of the automated modeling technique using synthetically generated data. This method employs a genetic-algorithm-based optimization technique in combination with fastwind model atmospheres. The method is found to be robust and able to recover the main photospheric parameters accurately. Precise wind parameters can be obtained as well; however, as expected, for dwarf stars the rate of acceleration of the flow is poorly constrained.
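A minimal sketch of this kind of genetic-algorithm fit, with a toy synthetic_spectrum() placeholder standing in for actual fastwind model atmospheres and a simple chi-squared fitness (all names and numbers here are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)

def synthetic_spectrum(params, wav):
    # Placeholder stand-in for a model-atmosphere code such as fastwind:
    # one Gaussian absorption line whose depth and width depend on the parameters.
    teff, logg = params
    depth = 0.5 * (teff / 40000.0)
    width = 1.0 + 0.5 * logg
    return 1.0 - depth * np.exp(-0.5 * ((wav - 4686.0) / width) ** 2)

def fitness(params, wav, observed, sigma):
    # Lower chi-squared means higher fitness.
    return -np.sum(((synthetic_spectrum(params, wav) - observed) / sigma) ** 2)

def genetic_fit(wav, observed, sigma, bounds, pop_size=50, generations=100):
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(generations):
        scores = np.array([fitness(p, wav, observed, sigma) for p in pop])
        # Binary tournament selection of parents.
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # Uniform crossover with a shuffled partner, then Gaussian mutation within bounds.
        partners = parents[rng.permutation(pop_size)]
        mask = rng.random(pop.shape) < 0.5
        children = np.where(mask, parents, partners)
        children += rng.normal(0.0, 0.02 * (hi - lo), size=pop.shape)
        pop = np.clip(children, lo, hi)
    scores = np.array([fitness(p, wav, observed, sigma) for p in pop])
    return pop[np.argmax(scores)]

On a spectrum generated from known parameters plus noise, this toy setup recovers them to within the mutation scale; in the actual analysis the fitness would compare fastwind line profiles to the observed spectra.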
Abstract:
Emerging cybersecurity vulnerabilities in supervisory control and data acquisition (SCADA) systems are becoming urgent engineering issues for modern substations. This paper proposes a novel intrusion detection system (IDS) tailored for the cybersecurity of IEC 61850-based substations. The proposed IDS integrates physical knowledge, protocol specifications, and logical behaviours to provide a comprehensive and effective solution that is able to mitigate various cyberattacks. The proposed approach comprises access control detection, protocol whitelisting, model-based detection, and multi-parameter-based detection. This SCADA-specific IDS is implemented and validated using a comprehensive and realistic cyber-physical test-bed and data from a real 500 kV smart substation.
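As an illustration of the protocol-whitelisting idea only (the field names and allowed tuples below are hypothetical and not taken from the paper):

ALLOWED = {
    # (source, destination, protocol, operation) tuples expected on the station bus.
    ("hmi-01", "ied-07", "MMS", "read"),
    ("hmi-01", "ied-07", "MMS", "write"),
    ("ied-07", "merging-unit-2", "GOOSE", "publish"),
}

def check_packet(packet: dict) -> bool:
    # Return True if the packet matches the whitelist; False means raise an alert.
    key = (packet.get("src"), packet.get("dst"),
           packet.get("protocol"), packet.get("operation"))
    return key in ALLOWED

# Example: an unexpected write from an engineering laptop triggers an alert.
alert = not check_packet({"src": "laptop-9", "dst": "ied-07",
                          "protocol": "MMS", "operation": "write"})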
Abstract:
Reverse engineering is usually the stepping stone of a variety of attacks aiming at identifying sensitive information (keys, credentials, data, algorithms) or vulnerabilities and flaws for broader exploitation. Software applications are usually deployed as identical binary code installed on millions of computers, enabling an adversary to develop a generic reverse-engineering strategy that, if working on one code instance, could be applied to crack all the other instances. A solution to mitigate this problem is represented by Software Diversity, which aims at creating several structurally different (but functionally equivalent) binary code versions out of the same source code, so that even if a successful attack can be elaborated for one version, it should not work on a diversified version. In this paper, we address the problem of maximizing software diversity from a search-based optimization point of view. The program to protect is subject to a catalogue of transformations to generate many candidate versions. The problem of selecting the subset of most diversified versions to be deployed is formulated as an optimization problem, which we tackle with different search heuristics. We show the applicability of this approach on some popular Android apps.
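A minimal sketch of the subset-selection step with a greedy max-min search heuristic; the variant representation and the diversity measure are placeholders, not the formulation used in the paper:

def diversity(a: str, b: str) -> int:
    # Placeholder structural distance between two binary variants:
    # a byte-level mismatch count plus the length difference.
    return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

def select_diverse(variants: list[str], k: int) -> list[str]:
    # Greedy max-min selection: repeatedly add the variant farthest from everything chosen.
    chosen = [variants[0]]
    while len(chosen) < k:
        best = max((v for v in variants if v not in chosen),
                   key=lambda v: min(diversity(v, c) for c in chosen))
        chosen.append(best)
    return chosen

# Example: pick 2 of 4 candidate versions produced by the transformation catalogue.
versions = ["aabbcc", "aabbcd", "zzxxyy", "aazzcc"]
print(select_diverse(versions, 2))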
Abstract:
This paper proposes a novel demand response model using a fuzzy subtractive clustering approach. The model supports domestic consumers' decisions on the management of controllable loads, considering the consumers' consumption needs and the load shaping or rescheduling appropriate to achieve possible economic benefits. The model, based on the fuzzy subtractive clustering method, considers clusters of domestic consumption covering an adequate consumption range. An analysis of different scenarios is presented, considering the available electric power and electric energy prices. Simulation results are presented and conclusions of the proposed demand response model are discussed.
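A minimal sketch of subtractive clustering in its standard Chiu-style form (the radii, threshold, and toy consumption profiles are assumptions for illustration; the paper's exact settings are not reproduced):

import numpy as np

def subtractive_clustering(X, ra=0.5, rb=None, eps=0.15):
    # X is an (n_samples, n_features) array of consumption profiles scaled to [0, 1].
    X = np.asarray(X, dtype=float)
    rb = rb if rb is not None else 1.5 * ra
    alpha, beta = 4.0 / ra ** 2, 4.0 / rb ** 2
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    potential = np.exp(-alpha * d2).sum(axis=1)
    first_peak = potential.max()
    centres = []
    while True:
        k = int(np.argmax(potential))
        if potential[k] < eps * first_peak:
            break
        centres.append(X[k])
        # Subtract the influence of the new centre from the remaining potentials.
        potential = potential - potential[k] * np.exp(-beta * d2[:, k])
    return np.array(centres)

# Example: two obvious groups of (evening, morning) consumption levels.
profiles = np.array([[0.10, 0.20], [0.12, 0.22], [0.80, 0.90], [0.82, 0.88]])
print(subtractive_clustering(profiles, ra=0.5))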
Abstract:
A variety of physical and biomedical imaging techniques, such as digital holography, interferometric synthetic aperture radar (InSAR), or magnetic resonance imaging (MRI), enable measurement of the phase of a physical quantity in addition to its amplitude. However, the phase can commonly only be measured modulo 2π, as a so-called wrapped phase map. Phase unwrapping is the process of obtaining the underlying physical phase map from the wrapped phase. Tile-based phase unwrapping algorithms operate by first tessellating the phase map, then unwrapping individual tiles, and finally merging them into a continuous phase map. They can be implemented in a computationally efficient way and are robust to noise. However, they are prone to failure in the presence of phase residues or erroneous unwraps of single tiles. We sought to overcome these shortcomings by creating novel tile unwrapping and merging algorithms, as well as a framework that allows them to be combined in a modular fashion. To increase the robustness of the tile unwrapping step, we implemented a model-based algorithm that makes efficient use of linear algebra to unwrap individual tiles. Furthermore, we adapted an established pixel-based unwrapping algorithm to create a quality-guided tile merger. These original algorithms, as well as previously existing ones, were implemented in a modular phase unwrapping C++ framework. By examining different combinations of unwrapping and merging algorithms we compared our method to existing approaches. We could show that the appropriate choice of unwrapping and merging algorithms can significantly improve the unwrapped result in the presence of phase residues and noise. Beyond that, our modular framework allows for the efficient design and testing of new tile-based phase unwrapping algorithms. The software developed in this study is freely available.
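A minimal sketch of the merging step alone, assuming each tile has already been unwrapped internally: the 2π-multiple offset between horizontally adjacent tiles is estimated from their shared border. This illustrates the general tile-based idea, not the specific merging algorithms of the framework:

import numpy as np

def merge_offset(border_a, border_b):
    # Integer multiple of 2*pi that best aligns tile B's border column with tile A's.
    return 2 * np.pi * np.round(np.mean(border_a - border_b) / (2 * np.pi))

def merge_tiles(tile_a, tile_b):
    # Merge two horizontally adjacent unwrapped tiles into one phase map.
    offset = merge_offset(tile_a[:, -1], tile_b[:, 0])
    return np.hstack([tile_a, tile_b + offset])

# Example: a phase ramp split into two tiles, the right one unwrapped with a 3*2*pi offset.
truth = np.tile(np.linspace(0, 12, 8), (4, 1))
left, right = truth[:, :4], truth[:, 4:] - 3 * 2 * np.pi
print(np.allclose(merge_tiles(left, right), truth))  # True: the offset is recovered.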
Abstract:
The protein lysate array is an emerging technology for quantifying the protein concentration ratios in multiple biological samples. It is gaining popularity, and has the potential to answer questions about post-translational modifications and protein pathway relationships. Statistical inference for a parametric quantification procedure has been inadequately addressed in the literature, mainly due to two challenges: the increasing dimension of the parameter space and the need to account for dependence in the data. Each chapter of this thesis addresses one of these issues. In Chapter 1, an introduction to protein lysate array quantification is presented, followed by the motivations and goals for this thesis work. In Chapter 2, we develop a multi-step procedure for sigmoidal models, ensuring consistent estimation of the concentration level with full asymptotic efficiency. The results obtained in this chapter justify inferential procedures based on large-sample approximations. Simulation studies and real data analysis are used to illustrate the performance of the proposed method in finite samples. The multi-step procedure is simpler in both theory and computation than the single-step least squares method that has been used in current practice. In Chapter 3, we introduce a new model to account for the dependence structure of the errors by a nonlinear mixed effects model. We consider a method to approximate the maximum likelihood estimator of all the parameters. Using simulation studies on various error structures, we show that for data with non-i.i.d. errors the proposed method leads to more accurate estimates and better confidence intervals than the existing single-step least squares method.
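For illustration of a sigmoidal quantification curve only: a generic four-parameter logistic fitted to one dilution series by nonlinear least squares (the thesis's actual multi-step estimator and data layout are not reproduced here):

import numpy as np
from scipy.optimize import curve_fit

def four_pl(step, lower, upper, slope, x0):
    # Four-parameter logistic response of spot intensity to the dilution step.
    return lower + (upper - lower) / (1.0 + np.exp(-slope * (step - x0)))

# Toy dilution series (x) and observed spot intensities (y) for one sample.
x = np.arange(8, dtype=float)
y = four_pl(x, 0.1, 2.0, 1.2, 3.5) + np.random.default_rng(1).normal(0, 0.02, size=x.size)

params, _ = curve_fit(four_pl, x, y, p0=[0.0, y.max(), 1.0, x.mean()])
# params[3] plays the role of a relative concentration parameter for this sample.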
Abstract:
The purpose of this report is to present the Crossdock Door Assignment Problem, which involves assigning destinations to the outbound dock doors of crossdock centres such that the travel distance of material handling equipment is minimized. We propose a twofold solution: simulation, and optimization of the simulation model, i.e. simulation optimization. The novel aspect of our solution approach is that we intend to use simulation to derive a more realistic objective function and use Memetic Algorithms to find an optimal solution. The main advantage of Memetic Algorithms is that they combine a local search with Genetic Algorithms. The Crossdock Door Assignment Problem is a new application domain for Memetic Algorithms, and it is not yet known how they will perform.
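A minimal sketch of a memetic algorithm on a door-assignment-style problem: a genetic algorithm over permutations (door to destination) combined with a pairwise-swap local search. The random cost matrix is a placeholder for the simulation-derived objective the report proposes:

import numpy as np

rng = np.random.default_rng(0)
n_doors = 8
# Placeholder objective: cost[d, k] is the material-handling cost of serving destination k
# from door d (in the report this would come from the simulation model instead).
cost = rng.random((n_doors, n_doors))

def total_cost(assignment):
    return sum(cost[d, k] for d, k in enumerate(assignment))

def crossover(a, b):
    # Order crossover: keep a prefix of parent a, fill the rest in parent b's order.
    cut = int(rng.integers(1, n_doors))
    head = a[:cut]
    return head + [k for k in b if k not in head]

def local_search(assignment):
    # Hill climbing on pairwise swaps: the "memetic" refinement step.
    best = list(assignment)
    improved = True
    while improved:
        improved = False
        for i in range(n_doors):
            for j in range(i + 1, n_doors):
                cand = list(best)
                cand[i], cand[j] = cand[j], cand[i]
                if total_cost(cand) < total_cost(best):
                    best, improved = cand, True
    return best

def memetic(pop_size=20, generations=50):
    pop = [list(rng.permutation(n_doors)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)
        parents = pop[: pop_size // 2]
        children = [local_search(crossover(p, q))
                    for p, q in zip(parents, reversed(parents))]
        pop = parents + children
    return min(pop, key=total_cost)

print(total_cost(memetic()))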
Abstract:
This thesis considers a three-dimensional numerical model based on the 3-D Navier-Stokes and continuity equations, involving various wind speeds (northwest), water surface levels, horizontal shear stresses, eddy viscosity, and the densities of oil and gas condensate-water mixture flows. The model is used to simulate and predict the surface movement of oil and gas condensate slicks resulting from spill accidents along the northern coasts of the Persian Gulf.
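For reference, the governing equations in their generic incompressible form with an eddy-viscosity closure (the thesis's exact formulation, forcing terms, and boundary conditions are not reproduced here):

\[
\frac{\partial u_i}{\partial x_i} = 0, \qquad
\frac{\partial u_i}{\partial t} + u_j \frac{\partial u_i}{\partial x_j}
 = -\frac{1}{\rho}\frac{\partial p}{\partial x_i}
 + \frac{\partial}{\partial x_j}\!\left(\nu_t \frac{\partial u_i}{\partial x_j}\right) + g_i,
\]

where u_i are the velocity components, p the pressure, rho the mixture density, nu_t the eddy viscosity, and g_i the body forces; wind stress and the free-surface level enter through the boundary conditions.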
Abstract:
Municipal management in any country requires planning and a balanced allocation of resources. In Brazil, the Law of Budgetary Guidelines (LDO) guides municipal managers toward that balance. This research develops a model that seeks to find a balanced allocation of public resources in Brazilian municipalities, taking the LDO as a parameter. In a first step, statistical techniques and multicriteria analysis are used to define allocation strategies, based on technical aspects provided by the municipal manager. In a second step, a linear-programming-based optimization is presented, in which the objective function is derived from the preferences of the manager and his staff regarding the results. The statistical representation supports the multicriteria development in the definition of replacement rates through time series. The multicriteria analysis was structured by defining the criteria and alternatives and applying the UTASTAR method to calculate the replacement rates. After these initial settings, a linear programming application was developed to find the optimal allocation of resources in the execution of the municipal budget. Budget data from a municipality in southwestern Paraná were used in the application of the model and the analysis of the results.
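A minimal sketch of the linear-programming step under invented numbers: maximize a preference-weighted allocation across budget areas subject to the total budget and to per-area floors and ceilings (the weights, bounds, and area names are hypothetical, not LDO figures):

from scipy.optimize import linprog

# Preference weights per budget area (hypothetical, from the multicriteria step).
weights = [0.4, 0.3, 0.2, 0.1]            # health, education, infrastructure, culture
total_budget = 100.0                       # in millions

# linprog minimizes, so negate the weights to maximize the weighted allocation.
c = [-w for w in weights]
# Spend exactly the total budget.
A_eq, b_eq = [[1.0, 1.0, 1.0, 1.0]], [total_budget]
# Per-area floors and ceilings, e.g. minimum mandated shares (hypothetical values).
bounds = [(15.0, 60.0), (25.0, 60.0), (5.0, 40.0), (2.0, 20.0)]

result = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(result.x)   # optimal spending per area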
Abstract:
In this research work, a new routing protocol for opportunistic networks is presented. The proposed protocol is called PSONET (PSO for Opportunistic Networks), since the proposal uses a hybrid system built around a Particle Swarm Optimization (PSO) algorithm. The main motivation for using PSO is to take advantage of its search based on individuals and their learning adaptation. PSONET uses the Particle Swarm Optimization technique to drive the network traffic through a good subset of message forwarders. PSONET analyzes the network communication conditions, detecting whether each node has sparse or dense connections, and thus makes better decisions about routing messages. The PSONET protocol is compared with the Epidemic and PROPHET protocols in three different mobility scenarios: an activity-based mobility model, which simulates the everyday life of people in their work, leisure, and rest activities; a community-based mobility model, which simulates groups of people in their communities who eventually contact other people, who may or may not be part of their community, to exchange information; and a random mobility pattern, which simulates a scenario divided into communities where people choose a destination at random and, based on the restriction map, move to this destination using the shortest path. The simulation results, obtained with The ONE simulator, show that in the community-based and random mobility scenarios the PSONET protocol achieves a higher message delivery rate and lower message replication than the Epidemic and PROPHET protocols.
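For reference, the core particle swarm update that such a protocol builds on, shown here on a toy continuous objective rather than on the routing decisions encoded by PSONET (all constants are illustrative):

import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return np.sum(x ** 2, axis=-1)   # toy function; PSONET would score routing choices

def pso(dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5, 5, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), objective(x)
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Velocity update: inertia plus cognitive and social attraction terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = objective(x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest

print(pso())   # converges near the origin for the toy objective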
Abstract:
Introduction: A persistent infection with human papillomaviruses (HPV) is a necessary precondition for the development of cervical cancer. HPV types 16 and 18 cause about 70% of cervical carcinomas. Two vaccines against HPV 16 and 18 have been available since 2006/2007. Research questions: How effective is HPV vaccination in reducing cervical cancer and its precursors (CIN)? Is HPV vaccination a cost-effective addition to current screening practice? Are there differences in cost-effectiveness between the two available vaccines? Should HPV vaccination be recommended from a health-economic perspective? If so, what recommendations can be derived regarding the design of a vaccination strategy? Which ethical, social, and legal implications have to be considered? Methods: Based on a systematic literature search, randomised controlled trials on the efficacy of HPV vaccination for the prevention of cervical carcinoma and its precursors, cervical intraepithelial neoplasia, are identified. Health-economic models are used to answer the economic questions. The quality of the medical and economic studies is assessed using accepted standards for the systematic appraisal of scientific studies. Results: In women who were HPV 16/18 negative at study entry and received all vaccine doses, the efficacy of the vaccines against HPV 16/18-induced CIN 2 or higher is 98% to 100%. Adverse effects of vaccination are mainly injection-associated complaints (redness, swelling, pain). There are no significant differences in serious adverse events between the vaccination and placebo groups. When only direct cost components are considered, the base-case results of the health-economic models range from approximately 3,000 Euro to approximately 40,000 Euro per QALY (quality-adjusted life year), and from approximately 9,000 Euro to approximately 65,000 Euro per LYG (life year gained). Discussion: According to the results of the included trials, the available HPV vaccines are effective for the prevention of premalignant lesions of the cervix caused by HPV 16/18. The duration of vaccine protection is currently still unclear. With respect to adverse effects, the vaccination can be considered safe. However, the sample size of the trials is not large enough to reliably determine the occurrence of very rare adverse events. The extent to which HPV vaccination will reduce the incidence of and mortality from cervical cancer in Germany depends not only on the clinical efficacy of the vaccines but is also determined by a number of further factors such as vaccination coverage and the effects of vaccination on participation rates in the existing screening programmes. Owing to the heterogeneity of the methodological frameworks and input parameters, the results of the health-economic models vary considerably. However, almost all model analyses support the conclusion that introducing vaccination with a lifelong duration of protection, while continuing the current screening practice, can be considered cost-effective.
A comparison of the two vaccines showed that, when QALY are used as the outcome parameter, modelling the quadrivalent vaccine is generally associated with a lower (better) cost-effectiveness ratio than modelling the bivalent vaccine, because genital warts are also taken into account. In sensitivity analyses, both the duration of vaccine protection and the discount rate proved to be major determinants of cost-effectiveness. Conclusion: The introduction of HPV vaccination can lead to a reduced incidence of cervical cancer in vaccinated women. However, the vaccination programmes should be accompanied by further evaluations in order to assess long-term effectiveness and safety and to optimise the implementation of the programmes. High participation rates, both in the vaccination programmes and, also among vaccinated women, in the screening examinations, are of central importance. Since cost-effectiveness is decisively influenced by the duration of protection, which is so far uncertain, a final assessment of the cost-effectiveness of HPV vaccination is not possible. A long duration of protection is an important prerequisite for the cost-effectiveness of the vaccination. Concluding a risk-sharing agreement between payers and manufacturers is one option to limit the impact of the uncertainty about the duration of protection on cost-effectiveness.