877 results for MODELING SYSTEM


Relevance: 30.00%

Abstract:

Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system is still a challenging task. First, the complexity of application workloads, as well as the interference among competing workloads, makes it difficult to understand the VMs' resource demands for meeting their Quality of Service (QoS) targets. Second, the dynamics of the applications and the system also make it difficult to maintain the desired QoS targets while the environment changes. Third, the transparency of virtualization presents a hurdle for the guest-layer application and the host-layer VM scheduler to cooperate to improve application QoS and system efficiency. This dissertation proposes to address the above challenges through fuzzy-modeling- and control-theory-based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources automatically and online, based on the observed workload and resource usage. Second, to enable fast adaptation in resource management, the fuzzy modeling approach is integrated with a predictive controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach, which can quickly track the applications' QoS targets and optimize resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management solutions, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve application QoS and resource usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks including TPC-H, RUBiS, and TerraFly. The results demonstrate that the fuzzy-modeling-based approach improves the accuracy of resource prediction by up to 31.4% compared to conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets for an oversubscribed system, and it is able to manage dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when the resources are contended by dynamic workloads.
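Purely as an illustration of the general idea (not the dissertation's implementation), the sketch below pairs a tiny rule-based fuzzy estimate of a VM's CPU demand with a one-step receding-horizon allocation update; all membership functions, rule boundaries, and parameter values are hypothetical.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_cpu_demand(request_rate, working_set_mb):
    """Very small Takagi-Sugeno-style rule base: each rule maps fuzzy
    descriptions of the observed workload to a crisp CPU-share estimate."""
    rules = [
        (triangular(request_rate, 0, 0, 200) * triangular(working_set_mb, 0, 0, 512), 0.10),
        (triangular(request_rate, 100, 400, 800) * triangular(working_set_mb, 256, 1024, 2048), 0.45),
        (triangular(request_rate, 600, 1200, 1200) * triangular(working_set_mb, 1536, 4096, 4096), 0.90),
    ]
    weights = np.array([w for w, _ in rules])
    outputs = np.array([o for _, o in rules])
    return float(outputs @ weights / weights.sum()) if weights.sum() > 0 else 0.5

def mpc_step(predicted_demand, qos_error, gain=0.5):
    """One receding-horizon update: move the CPU cap toward the fuzzy
    prediction, corrected by the observed QoS tracking error."""
    return float(np.clip(predicted_demand + gain * qos_error, 0.05, 1.0))

demand = fuzzy_cpu_demand(request_rate=350, working_set_mb=900)
new_cap = mpc_step(predicted_demand=demand, qos_error=0.05)
print(f"predicted CPU demand = {demand:.2f}, next CPU cap = {new_cap:.2f}")
```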

Relevance: 30.00%

Abstract:

The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development and is widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address these difficulties. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e., objects and classes, and it is one of the most commonly used UML diagrams in practice. However, there has been little research on sequence-diagram modeling, and the current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would be greatly beneficial to novice analysts who, unlike experienced systems analysts, do not possess the prior experience needed to easily learn how to develop a sequence diagram. There is a need for an effective sequence-diagram modeling technique for novices. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of approaches provided in many textbooks as well as the practitioner literature. The results indicated that novice analysts performed better using the CHOP technique, an outcome that seems to have been enabled by the pattern-based heuristics the technique provides. Novice analysts also rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that CHOP is an effective sequence-diagram modeling technique for novice analysts.

Relevance: 30.00%

Abstract:

This dissertation focused on developing an integrated surface-subsurface hydrologic numerical simulation model by programming and testing the coupling of the USGS MODFLOW-2005 Groundwater Flow Process (GWF) package (USGS, 2005) with the 2D surface-water routing model FLO-2D (O'Brien et al., 1993). The coupling included the procedures necessary to numerically integrate and verify both models as a single computational software system, hereafter referred to as WHIMFLO-2D (Wetlands Hydrology Integrated Model). An improved physical formulation of flow resistance through vegetation in shallow waters, based on the concept of drag force, was also implemented for floodplain simulations, while the classical methods (e.g., Manning, Chezy, Darcy-Weisbach) of calculating flow resistance were retained for canals and deeper waters. A preliminary demonstration of WHIMFLO-2D at an existing field site was developed for the Loxahatchee Impoundment Landscape Assessment (LILA), an 80-acre area located at the Arthur R. Marshall Loxahatchee National Wildlife Refuge in Boynton Beach, Florida. After a number of simplifying assumptions were applied, the results illustrated the ability of the model to simulate the hydrology of a wetland. In this illustrative case, a comparison between measured and simulated stage levels showed an average error of 0.31% with a maximum error of 2.8%, and a comparison of measured and simulated groundwater head levels showed an average error of 0.18% with a maximum error of 2.9%.
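The abstract does not give the resistance formulas; as a sketch of the substitution it describes, the following compares a classical Manning friction slope with a generic drag-force resistance term for emergent vegetation, Sf = Cd·a·V²/(2g). The coefficient values and the frontal-area density `a` are illustrative assumptions, not WHIMFLO-2D parameters.

```python
G = 9.81  # gravitational acceleration, m/s^2

def manning_friction_slope(velocity, hydraulic_radius, n=0.035):
    """Classical Manning resistance (SI units): Sf = n^2 V^2 / R^(4/3)."""
    return (n ** 2) * velocity ** 2 / hydraulic_radius ** (4.0 / 3.0)

def vegetation_drag_slope(velocity, drag_coeff=1.0, frontal_area_density=0.5):
    """Generic drag-force resistance for emergent vegetation:
    Sf = Cd * a * V^2 / (2 g), with 'a' the frontal area per unit volume (1/m)."""
    return drag_coeff * frontal_area_density * velocity ** 2 / (2.0 * G)

v, R = 0.15, 0.3  # slow, shallow floodplain flow (m/s, m) -- illustrative only
print(f"Manning Sf         = {manning_friction_slope(v, R):.2e}")
print(f"Vegetation drag Sf = {vegetation_drag_slope(v):.2e}")
```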

Relevance: 30.00%

Abstract:

Shipboard power systems have different characteristics from utility power systems. In a shipboard power system, it is crucial that the systems and equipment operate at their peak performance levels. One of the most demanding aspects of simulating shipboard power systems is connecting the device under test to a real-time simulated dynamic equivalent in an environment with actual hardware in the loop (HIL). Real-time simulation can be achieved using a multi-distributed modeling concept, in which the global system model is distributed over several processors connected through a communication link. The advantage of this approach is that it permits a gradual change from pure simulation to the actual application. In order to perform system studies in such an environment, physical phase-variable models of different components of the shipboard power system were developed using operational parameters obtained from finite element (FE) analysis. These models were developed for two types of studies: low-frequency and high-frequency studies. Low-frequency studies are used to examine the shipboard power system's behavior under load switching and faults, while high-frequency studies were used to predict abnormal conditions due to overvoltages and the components' harmonic behavior. Different experiments were conducted to validate the developed models, and the simulation and experimental results show excellent agreement. The behavior of shipboard power system components under internal faults was investigated using FE analysis; this technique is crucial for shipboard power system fault detection because of the lack of comprehensive fault test databases. A wavelet-based methodology for feature extraction from the shipboard power system's current signals was developed for harmonic and fault-diagnosis studies. This modeling methodology can be used to evaluate and predict the future behavior of NPS components at the design stage, which will shorten development cycles, cut overall cost, prevent failures, and allow each subsystem to be tested exhaustively before it is integrated into the system.
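The wavelet family and features used in the dissertation are not specified in the abstract; as a generic illustration of wavelet-based feature extraction from a current signal, the sketch below (using the PyWavelets library) decomposes a simulated current waveform and takes the per-band energies as features. The db4 wavelet, the decomposition level, and the toy fault transient are assumptions.

```python
import numpy as np
import pywt  # PyWavelets

fs = 10_000                     # sampling rate, Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)   # 0.2 s analysis window
current = np.sin(2 * np.pi * 60 * t)              # 60 Hz fundamental
current += 0.2 * np.sin(2 * np.pi * 300 * t)      # 5th-harmonic content
current[1000:1050] += 1.5 * np.exp(-np.arange(50) / 10.0)  # toy fault transient

# Multilevel discrete wavelet decomposition of the current signal.
coeffs = pywt.wavedec(current, wavelet="db4", level=4)

# Per-band energy as a simple feature vector for harmonic/fault diagnosis.
energies = np.array([np.sum(c ** 2) for c in coeffs])
features = energies / energies.sum()

for name, f in zip(["A4", "D4", "D3", "D2", "D1"], features):
    print(f"{name}: relative energy = {f:.3f}")
```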

Relevance: 30.00%

Abstract:

Increasing dependence on groundwater in the Wakal River basin, India, jeopardizes water supply sustainability. A numerical groundwater model was developed to better understand the aquifer system and to evaluate its potential in terms of quantity and replenishment. Potential artificial recharge areas were delineated using landscape and hydrogeologic parameters, a Geographic Information System (GIS), and remote sensing. Groundwater models are powerful tools for recharge estimation when transmissivity is known, since the proper recharge must be applied to reproduce field-measured heads. The model showed that groundwater levels could decline significantly if two drought years in every four reduce recharge while groundwater withdrawal increases by 15%. The effect of such a drought is currently uncertain, however, because runoff from the basin is unknown. Remote sensing and GIS revealed that areas with slopes of less than 5%, forest cover, and a Normalized Difference Vegetation Index greater than 0.5 are suitable recharge sites.
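As an illustration of the kind of raster-overlay screening described, not the study's actual GIS workflow, the sketch below combines slope, land-cover, and NDVI layers into a recharge-suitability mask; the array values are invented.

```python
import numpy as np

# Toy 3x3 rasters standing in for GIS layers (values are illustrative only).
slope_pct = np.array([[2.0, 6.0, 4.0],
                      [3.5, 1.0, 8.0],
                      [4.5, 2.5, 5.5]])
ndvi = np.array([[0.62, 0.55, 0.48],
                 [0.70, 0.66, 0.30],
                 [0.58, 0.52, 0.61]])
is_forest = np.array([[True, True, False],
                      [True, True, False],
                      [True, False, True]])

# Recharge-suitability screen: slope < 5%, forest cover, and NDVI > 0.5.
suitable = (slope_pct < 5.0) & is_forest & (ndvi > 0.5)
print(suitable)
```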

Relevance: 30.00%

Abstract:

Database design is a difficult problem for non-expert designers, and it is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, they have several shortcomings. First, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Second, there does not seem to be any published empirical study that has experimentally tested the effectiveness of any of these KB tools. Third, the problem-solving behavior of the non-experts whom the systems were intended to assist has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver, 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether the knowledge-based systems are more effective than a system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved one task without using a system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance, or Restrictive (experimental task). Analysis of the experimental-task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the Control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. The subjects also perceived the Restrictive system to be easier to use than the Guidance system.

Relevance: 30.00%

Abstract:

Due to relative ground movement, buried pipelines experience geotechnical loads. The imposed geotechnical loads may initiate pipeline deformations that affect system serviceability and integrity. Engineering guidelines (e.g., ALA, 2005; Honegger and Nyman, 2001) provide the technical framework to develop idealized structural models for analyzing pipe-soil interaction events and assessing the pipe's mechanical response. The soil behavior is modeled using discrete springs that represent the geotechnical loads per unit pipe length developed during the interaction event, with soil forces defined along three orthogonal directions (i.e., axial, lateral, and vertical) to analyze the pipeline response. The nonlinear load-displacement relationship of the soil defined by each spring is independent of neighboring spring elements. However, recent experimental and numerical studies demonstrate significant coupling effects during oblique (i.e., not along one of the orthogonal axes) pipe-soil interaction events. In the present study, physical modeling using a geotechnical centrifuge was conducted to improve the current understanding of soil load coupling effects on buried pipes in loose and dense sand. A section of pipeline at shallow burial depth was translated through the soil at different oblique angles in the axial-lateral plane. The force exerted by the soil on the pipe is critically examined to assess the significance of load coupling effects and to establish a yield envelope. The displacements required to mobilize the soil yield force are also examined to assess potential coupling in mobilization distance. A set of laboratory tests was conducted on the sand used for centrifuge modeling to characterize its stress-strain behavior, which was used to examine the possible mechanisms in the centrifuge model tests. The yield envelope, deformation patterns, and interpreted failure mechanisms obtained from centrifuge modeling are compared with other physical modeling and numerical simulations available in the literature.
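As a rough sketch of the idealized spring-based structural model that the guidelines describe, not this study's centrifuge results, the code below represents the axial and lateral soil reactions as independent elastic-perfectly-plastic springs and evaluates the uncoupled resultant under an oblique displacement; all capacities and mobilization distances are placeholder values.

```python
import math

def soil_spring(displacement, yield_force, mobilization_disp):
    """Elastic-perfectly-plastic soil spring: force per unit pipe length."""
    if mobilization_disp <= 0:
        raise ValueError("mobilization displacement must be positive")
    return math.copysign(
        min(abs(displacement) / mobilization_disp, 1.0) * yield_force, displacement
    )

# Placeholder capacities (kN/m) and mobilization distances (m) for one spring station.
T_U, DT = 15.0, 0.005   # axial
P_U, DP = 60.0, 0.050   # lateral

# Oblique pipe movement of 40 mm at 30 degrees from the pipe axis.
d, angle = 0.040, math.radians(30.0)
fa = soil_spring(d * math.cos(angle), T_U, DT)   # axial reaction
fl = soil_spring(d * math.sin(angle), P_U, DP)   # lateral reaction

# With independent (uncoupled) springs the resultant is just the vector sum;
# the centrifuge tests in the study probe how far this assumption holds.
print(f"axial={fa:.1f} kN/m, lateral={fl:.1f} kN/m, "
      f"resultant={math.hypot(fa, fl):.1f} kN/m")
```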

Relevance: 30.00%

Abstract:

During radiotherapy treatment of head-and-neck oncology patients, the parotid glands (PGs) can be unduly irradiated as a result of inter-/intra-fraction volumetric and spatial changes caused by factors such as weight loss, exposure to ionizing radiation, and anatomical morphing of the organs involved in the irradiated regions. The present work, carried out at the Medical Physics and Radiation Oncology units of the A.O.U. of Modena as part of the Ministry of Health research project (MoH2010, GR-2010-2318757) "Dose warping methods for IGRT and Adaptive RT: dose accumulation based on organ motion and anatomical variations of the patients during radiation therapy treatments", develops a biomechanical model able to represent the deformation process of the PGs, taking into account their geometry, their elastic properties, and their evolution over the course of treatment. The organ-deformation model was implemented using finite element method (FEM) software. Multiple mesh surfaces, representing the geometry and evolution of the parotids during the treatment sessions, were created from the organ contours defined by the radiation oncologist on the planning CT image and generated automatically on the daily setup and re-positioning images by rigid/deformable registration algorithms. The anatomical constraints and the force field of the model were defined on the basis of simplifying assumptions, considering the structural alteration (loss of acinar cells) and the anatomical barriers formed by surrounding structures. Analysis of the meshes made it possible to study the dynamics of the deformation and to identify the regions most subject to change. The morphing predictions produced by the proposed model could be integrated into a treatment planning system for Adaptive Radiation Therapy.

Relevance: 30.00%

Abstract:

The northern Antarctic Peninsula is one of the fastest changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, lowering their surfaces, and increasing overall ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering pre-collapse ice shelf conditions and subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, with maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted to the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130±15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008; current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions; the most plausible results amount to -40.7±3.9 Gt. The contribution to sea level rise for the period 1995-2014 was estimated to be 18.8±1.8 Gt, corresponding to a sea level equivalent of 0.052±0.005 mm. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information; the second-largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates an improved comparison with GRACE data and provides input to modeling of glacio-isostatic uplift in this region. The study contributed to a better understanding of how glacier systems adjust to ice shelf disintegration.
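The fitted exponential form is not given in the abstract; a generic decaying-exponential fit of the kind described might look like the sketch below, using synthetic elevation anomalies and an assumed functional form h(t) = a·exp(-t/τ) + c.

```python
import numpy as np
from scipy.optimize import curve_fit

def elev_change(t, a, tau, c):
    """Assumed decaying-exponential surface-elevation change since the collapse."""
    return a * np.exp(-t / tau) + c

# Synthetic elevation anomalies (m) at years since 1995 -- illustrative only.
years = np.array([0.0, 3.0, 8.0, 13.0, 16.0, 19.0])
dh = np.array([0.0, -55.0, -100.0, -120.0, -126.0, -130.0])

popt, pcov = curve_fit(elev_change, years, dh, p0=(130.0, 5.0, -130.0))
a, tau, c = popt
print(f"amplitude = {a:.1f} m, e-folding time = {tau:.1f} a, offset = {c:.1f} m")
print(f"elevation-change rate at t = 15 a: {(-a / tau) * np.exp(-15 / tau):.2f} m/a")
```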

Relevance: 30.00%

Abstract:

A model of a reshaping and re-amplification (2R) regenerator based on a Highly Nonlinear Dispersion Imbalanced Loop Mirror (HN-DILM) was designed to examine its capability to reduce the required fiber loop length and input peak power by deploying Highly Nonlinear Fiber (HNLF) instead of Dispersion Shifted Fiber (DSF). The simulation results show that deploying HNLF as the nonlinear element in the Dispersion Imbalanced Loop Mirror (DILM) requires only 400 mW of peak power to reach peak transmission, whereas DSF requires a higher peak power of 2000 mW to obtain a given transmissivity. The results also show that HNLF requires a shorter fiber length to achieve the highest transmission. The 2R regenerator also increases the extinction ratio (ER) of the entire system. © 2010 IEEE.
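The abstract reports only power and length trends; as background on why a higher nonlinear coefficient shortens the required loop, the sketch below evaluates the textbook nonlinear-loop-mirror transmissivity T = 1 - 2α(1-α)[1 + cos((1-2α)γPL)] for HNLF-like versus DSF-like nonlinear coefficients. The coupler ratio and γ values are generic assumptions, and a real DILM derives its switching from dispersion imbalance rather than the coupler asymmetry used here.

```python
import numpy as np

def nolm_transmission(peak_power_w, gamma_per_w_km, length_km, alpha=0.45):
    """Textbook nonlinear optical loop mirror transmissivity:
    T = 1 - 2*alpha*(1-alpha)*(1 + cos((1-2*alpha)*gamma*P*L))."""
    dphi = (1.0 - 2.0 * alpha) * gamma_per_w_km * peak_power_w * length_km
    return 1.0 - 2.0 * alpha * (1.0 - alpha) * (1.0 + np.cos(dphi))

L = 1.0  # loop length, km (assumed)
powers = np.linspace(0.0, 5.0, 6)  # input peak power, W
for label, gamma in [("HNLF-like (gamma ~ 10 /W/km)", 10.0),
                     ("DSF-like  (gamma ~  2 /W/km)", 2.0)]:
    T = nolm_transmission(powers, gamma, L)
    print(label, np.round(T, 3))
```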

Relevance: 30.00%

Abstract:

The full-scale base-isolated structure studied in this dissertation is the only base-isolated building in the South Island of New Zealand. It sustained hundreds of earthquake ground motions from September 2010 well into 2012, and several large earthquake responses were recorded in December 2011 by NEES@UCLA and by a GeoNet recording station near Christchurch Women's Hospital. The primary focus of this dissertation is to advance the state of the art of methods for evaluating the performance of seismically isolated structures and the effects of soil-structure interaction, by developing new data-processing methodologies that overcome current limitations and by implementing advanced numerical modeling in OpenSees for direct analysis of soil-structure interaction.

This dissertation presents a novel method for recovering the force-displacement relations within the isolators of building structures with unknown nonlinearities from sparse seismic-response measurements of floor accelerations. The method requires only direct matrix calculations (factorizations and multiplications); no iterative trial-and-error procedures are needed. It requires a mass matrix, or at least an estimate of the floor masses; a stiffness matrix may be used but is not necessary. Essentially, the method operates on a matrix of incomplete measurements of floor accelerations. In the special case of complete floor measurements of systems with linear dynamics, real modes, and equal floor masses, the principal components of this matrix are the modal responses. In the more general case of partial measurements and nonlinear dynamics, the method extracts linearly dependent components from Hankel matrices of the measured horizontal response accelerations, assembles these components row-wise, and extracts principal components from the singular value decomposition of the resulting large matrix. These principal components are then interpolated between floors in a way that minimizes the curvature energy of the interpolation; this interpolation step can make use of a reduced-order stiffness matrix, a backward-difference matrix, or a central-difference matrix. The measured and interpolated floor acceleration components at all floors are then assembled and multiplied by a mass matrix. The recovered in-service force-displacement relations are then incorporated into the OpenSees soil-structure interaction model.
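A schematic NumPy illustration of the Hankel/SVD step described above (not the dissertation's implementation) is given below; the two synthetic "floor" channels, the lag depth, and the number of retained components are arbitrary choices.

```python
import numpy as np

def hankel_blocks(signals, depth=20):
    """Stack a block-Hankel matrix of lagged copies of each measured
    floor-acceleration channel (rows: channels x lags, columns: time)."""
    n_t = signals.shape[1]
    cols = n_t - depth + 1
    blocks = [np.stack([ch[i:i + cols] for i in range(depth)]) for ch in signals]
    return np.vstack(blocks)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0, 2000)
# Two "measured" floor channels built from two shared (mode-like) components
# plus noise -- stand-ins for sparse acceleration records.
comp1 = np.sin(2 * np.pi * 1.2 * t) * np.exp(-0.05 * t)
comp2 = np.sin(2 * np.pi * 3.4 * t) * np.exp(-0.10 * t)
floors = np.vstack([0.8 * comp1 + 0.3 * comp2, 0.4 * comp1 - 0.6 * comp2])
floors += 0.05 * rng.standard_normal(floors.shape)

H = hankel_blocks(floors, depth=25)
U, s, Vt = np.linalg.svd(H, full_matrices=False)
print("leading singular values:", np.round(s[:5], 2))
principal_components = Vt[:2]   # dominant time histories shared by the channels
```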

Numerical simulations of soil-structure interaction involving non-uniform soil behavior are conducted following the development of the complete soil-structure interaction model of Christchurch Women's Hospital in OpenSees. In these 2D OpenSees models, the superstructure is modeled as two-dimensional frames in the short-span and long-span directions, respectively. The lead rubber bearings are modeled as elastomeric bearing (Bouc-Wen) elements, and the soil underlying the concrete raft foundation is modeled with linear elastic plane-strain quadrilateral elements. The non-uniformity of the soil profile is incorporated by extracting and interpolating the shear-wave velocity profile from the Canterbury Geotechnical Database. The validity of the complete two-dimensional soil-structure interaction OpenSees model for the hospital is checked by comparing the peak floor responses and the force-displacement relations within the isolation system obtained from the OpenSees simulations with the recorded measurements. General explanations and implications, supported by displacement drifts, floor acceleration and displacement responses, and force-displacement relations, are described to address the effects of soil-structure interaction.

Relevance: 30.00%

Abstract:

Scatter in medical imaging is typically dismissed as image-related noise that detracts from meaningful diagnosis and is therefore rejected or removed from medical images. However, it has been found that every material, including cancerous tissue, has a unique X-ray coherent scatter signature that can be used to identify the material or tissue. Such scatter-based tissue identification offers an advantage over conventional anatomical imaging through X-ray radiography: it can both locate and identify particular materials. A coded aperture X-ray coherent scatter spectral imaging system has been developed in our group to classify different tissue types based on their unique scatter signatures. Previous experiments using our prototype have demonstrated that the depth-resolved coherent scatter spectral imaging system (CACSSI) can discriminate between healthy and cancerous tissue present in the path of a non-destructive X-ray beam. A key to the successful optimization of CACSSI as a clinical imaging method is obtaining anatomically accurate phantoms of the human body. This thesis describes the development and fabrication of 3D-printed anatomical scatter phantoms of the breast and lung.

The purpose of this work is to accurately model different breast geometries using tissue-equivalent phantoms and to classify these tissues in a coherent X-ray scatter imaging system. Tissue-equivalent anatomical phantoms were designed to assess the capability of the CACSSI system to classify different types of breast tissue (adipose, fibroglandular, malignant). These phantoms were 3D printed based on DICOM data obtained from CT scans of prone breasts and were tested by comparing measured scatter signatures with those of adipose and fibroglandular tissue from the literature. Tumors in the phantom were modeled using a variety of biological tissues, including actual surgically excised benign and malignant tissue specimens. Lung-based phantoms have also been printed for future testing. Our imaging system has been able to determine the location and composition of the various materials in the phantom, and the phantoms were used to characterize the CACSSI system in terms of beam width and imaging technique. The results of this work showed accurate modeling and characterization of the phantoms through comparison of the tissue-equivalent form factors with those from the literature. The physical construction of the phantoms, based on actual patient anatomy, was validated using mammography and computed tomography to visually compare the clinical images to those of actual patient anatomy.

Relevance: 30.00%

Abstract:

Petri nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows but are not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs result in compact system models that are easier to understand and are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, the framework integrates two formal languages: a type of HLPN called the Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is the development of a software tool to support the formal modeling capabilities of this framework. For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is straightforward and fast but covers only some execution paths of an HLPN model; explicit-state model checking covers all execution paths but suffers from the state-explosion problem; and BMC is a tradeoff, providing a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques into a software tool to support the formal analysis capabilities of this framework. The SAMTools suite developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
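The following is not part of the dissertation's toolset, but as a minimal illustration of the token-game execution semantics that the simulation and model-checking techniques build on, it fires enabled transitions of a tiny low-level place/transition net.

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds at least one token; firing moves tokens input -> output.
from typing import Dict, List, Tuple

Marking = Dict[str, int]
Transition = Tuple[List[str], List[str]]  # (input places, output places)

def enabled(marking: Marking, t: Transition) -> bool:
    return all(marking.get(p, 0) >= 1 for p in t[0])

def fire(marking: Marking, t: Transition) -> Marking:
    m = dict(marking)
    for p in t[0]:
        m[p] -= 1
    for p in t[1]:
        m[p] = m.get(p, 0) + 1
    return m

# Toy producer/consumer net.
transitions = {
    "consume": (["buffer"], ["done"]),
    "produce": (["idle"], ["buffer", "idle"]),
}
marking: Marking = {"idle": 1, "buffer": 0, "done": 0}

for _ in range(4):  # simple token game: fire the first enabled transition
    for name, t in transitions.items():
        if enabled(marking, t):
            marking = fire(marking, t)
            print(f"fired {name:8s} -> {marking}")
            break
```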

Relevance: 30.00%

Abstract:

Globally, the current state of freshwater resource management is insufficient and is impeding the prospect of a sustainable future. Human interference in the natural hydrologic cycle is becoming dangerously irreversible, and the need to redefine resource management approaches is pressing. This research involves the development of a coupled natural-human freshwater resource supply model using a System Dynamics approach. The model was applied to two case studies: Somalia in Africa and the Phoenix Active Management Area in Arizona, USA. It is suggested that System Dynamics modeling would be an invaluable tool for achieving sustainable freshwater resource management in individual watersheds. Through a series of thought experiments, freshwater resource managers and policy-makers can obtain a thorough understanding of the system's dynamic behavior and examine various courses of action for alleviating freshwater supply concerns. This thesis reviews the model, its development, and an analysis of several thought experiments applied to the case studies.
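As a sketch of the stock-and-flow structure a System Dynamics freshwater supply model typically contains (not the thesis's actual model), the following Euler-integrated loop tracks a single aggregate freshwater stock under recharge and growing demand; all parameter values are invented.

```python
# A minimal stock-and-flow System Dynamics sketch: one freshwater stock with
# a recharge inflow and a demand outflow that grows over time.
dt = 0.25             # time step (years)
years = 40
stock = 500.0         # freshwater stock (million m^3) -- illustrative
recharge = 60.0       # mean annual recharge (million m^3/yr)
demand0 = 50.0        # current annual demand (million m^3/yr)
demand_growth = 0.02  # fractional growth of demand per year

t = 0.0
while t < years:
    demand = demand0 * (1.0 + demand_growth) ** t
    withdrawal = min(demand, stock / dt)   # cannot withdraw more than is stored
    stock += (recharge - withdrawal) * dt
    t += dt
    if abs(t - round(t)) < 1e-9 and round(t) % 10 == 0:
        print(f"year {round(t):2d}: stock = {stock:7.1f} million m^3")
```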

Relevance: 30.00%

Abstract:

The Florida Everglades has a long history of anthropogenic changes that have impacted the quantity and quality of water entering the system. Since the construction of Tamiami Trail in the 1920s, overland flow to the Florida Everglades has decreased significantly, impacting ecosystems from the wetlands to the estuary. The MIKE Marsh Model of Everglades National Park (M3ENP) is a numerical model that simulates Everglades National Park (ENP) hydrology using MIKE SHE/MIKE 11 software. This model has been developed to determine the parameters that affect Everglades hydrology and to understand the impact of specific flow changes on the hydrology of the system. As part of the effort to return flows to historical levels, several changes to the existing water management infrastructure have been implemented or are in the design phase. Bridge construction scenarios were programmed into the M3ENP model to review the effect of these structural changes and to evaluate the potential impacts on water levels and hydroperiods in the receiving Northeast Shark Slough ecosystem. These scenarios showed critical water level increases in an area that has been in decline due to low water levels, and results from this work may help guide future decisions for restoration designs. Excess phosphorus entering Everglades National Park in South Florida may promote the growth of more phosphorus-opportunistic species and alter the food chain from the bottom up. Two phosphorus transport methods were incorporated into the M3ENP hydrodynamic model to determine the factors affecting phosphorus transport and the impact of bridge construction on water quality. Results showed that while phosphorus concentrations in surface waters decreased overall, some areas within the ENP interior may experience an increase in phosphorus loading with the addition of bridges to Tamiami Trail. Finally, phosphorus data and modeled water level data were used to evaluate the spectral response of Everglades vegetation to increasing phosphorus availability using Landsat imagery.
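The abstract does not describe the transport formulation used in MIKE SHE/MIKE 11; purely as an illustration of the kind of advection-dispersion calculation that underlies surface-water phosphorus transport, here is a small explicit finite-difference sketch with invented parameters.

```python
import numpy as np

# Explicit upwind finite-difference sketch of 1D advection-dispersion of
# phosphorus along a flow path: dC/dt = -u dC/dx + D d2C/dx2.
nx, dx = 200, 50.0          # cells and spacing (m)
u, D = 0.01, 0.5            # velocity (m/s) and dispersion (m^2/s) -- illustrative
dt = 0.5 * min(dx / u, dx**2 / (2 * D))   # respect advection and diffusion limits
steps = 400

C = np.zeros(nx)            # phosphorus concentration (ug/L)
C[20:25] = 10.0             # an upstream pulse of phosphorus

for _ in range(steps):
    adv = -u * (C[1:-1] - C[:-2]) / dx                 # upwind advection (u > 0)
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2  # central dispersion
    C[1:-1] += dt * (adv + disp)

print(f"dt = {dt:.0f} s, peak after {steps * dt / 3600:.1f} h: {C.max():.2f} ug/L")
print("plume center of mass (m):",
      round(float((C * np.arange(nx) * dx).sum() / C.sum()), 1))
```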