878 results for System-Level Models
Abstract:
Credit-for-prior-learning programs help students complete degrees more quickly and at lower cost. This report addresses the challenges of scaling up a credit-for-prior-learning program at the university-system level and explores the division of responsibilities between system staff and institutional staff.
Abstract:
The search for better performance of structural systems has led to more refined models involving the analysis of a growing number of details, which must be correctly formulated in order to define a representative model of the real system. Representative models demand extensive detailing of the design and call for new evaluation and analysis techniques. Model updating is one such technique; it can be used to improve the predictive capabilities of computer-based models. This paper presents an FRF-based finite element model updating procedure whose updating variables are physical parameters of the model. It includes damping effects in the updating procedure, assuming both proportional and non-proportional damping mechanisms. The updating parameters are defined at the element level or over macro regions of the model, so the parameters are adjusted locally, which facilitates the physical interpretation of the model adjustment. Different tests with simulated and experimental data are discussed with the aim of establishing the characteristics and potential of the methodology.
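A minimal sketch of what such an FRF-based updating loop can look like, using a hypothetical two-DOF spring-mass system: the element stiffnesses play the role of the physical updating parameters and are fitted by minimizing the complex FRF residual. The model, values, and solver choice are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of FRF-based updating for a 2-DOF spring-mass model.
# All names and values are illustrative assumptions, not the paper's code.
import numpy as np
from scipy.optimize import least_squares

m = np.diag([1.0, 1.0])                      # known mass matrix [kg]

def stiffness(k):
    k1, k2 = k                               # element-level updating parameters
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

freqs = np.linspace(0.1, 5.0, 200)           # Hz
omega = 2 * np.pi * freqs

def frf(k, eta=0.02):
    """Receptance H(w) = (K(1+i*eta) - w^2 M)^-1, proportional hysteretic damping."""
    K = stiffness(k) * (1 + 1j * eta)
    return np.array([np.linalg.inv(K - w**2 * m) for w in omega])

H_meas = frf([100.0, 150.0])                 # synthetic "measured" FRFs

def residual(k):
    r = frf(k) - H_meas                      # complex FRF residual
    return np.concatenate([r.real.ravel(), r.imag.ravel()])

sol = least_squares(residual, x0=[80.0, 120.0])
print("updated stiffnesses:", sol.x)         # should approach [100, 150]
```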
Abstract:
The Software Product Line (SPL) approach has become very promising, since it allows the production of customized systems on a large scale through product families. The Features Model is widely used for modeling these families; however, it has a low level of detail and may not be sufficient to guide an SPL development team. It is therefore recommended to complement the Features Model with other models that represent the system from other perspectives. The PL-AOVgraph goals model can assume this complementary role, since it is a language oriented to the SPL context that allows detailed requirements modeling and the identification of crosscutting concerns that may arise as a result of variability. In order to insert PL-AOVgraph into SPL development, this paper proposes a bi-directional mapping between PL-AOVgraph and the Features Model, automated by the ReqSys-MDD tool. This tool follows the Model-Driven Development (MDD) approach, which allows the construction of systems from high-level models through successive transformations. This enables the integration of ReqSys-MDD with other MDD tools that use its output models as input to further transformations, making it possible to keep the models involved consistent and to avoid loss of information in the transitions between development stages.
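As an illustration of the flavor of such a model-to-model transformation, here is a toy sketch of one direction of the mapping, turning a feature tree into a goals tree with variability annotations. The node types and the mapping rule are assumptions made for illustration; they are not the actual ReqSys-MDD transformation rules.

```python
# Illustrative one-way feature-model-to-goals-model transformation in the
# MDD spirit; node types and rules are hypothetical, not ReqSys-MDD's.
from dataclasses import dataclass, field

@dataclass
class Feature:
    name: str
    optional: bool = False
    children: list = field(default_factory=list)

@dataclass
class Goal:
    name: str
    cardinality: str          # variability annotation in the goals model
    subgoals: list = field(default_factory=list)

def feature_to_goal(f: Feature) -> Goal:
    # Optional features become variable goals; mandatory ones required goals.
    card = "0..1" if f.optional else "1..1"
    return Goal(f.name, card, [feature_to_goal(c) for c in f.children])

root = Feature("MediaPlayer", children=[
    Feature("Playback"),
    Feature("Equalizer", optional=True),
])
print(feature_to_goal(root))
```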
Abstract:
Software Transactional Memory (STM) systems have poor performance under high-contention scenarios. Since many transactions compete for the same data, most of them are aborted, wasting processor runtime. Contention management policies are typically used to avoid this, but they are passive approaches: they wait for an abort to happen before taking action. More proactive approaches have emerged that try to predict when a transaction is likely to abort so that its execution can be delayed. Such techniques are limited, as they do not replace the doomed transaction with another or, when they do, they rely on the operating system for that, having little or no control over which transaction should run. In this paper we propose LUTS, a Lightweight User-Level Transaction Scheduler, which is based on an execution context record mechanism. Unlike other techniques, LUTS provides the means for selecting another transaction to run in parallel, thus improving system throughput. Moreover, it avoids most of the issues caused by pseudo-parallelism, as it only launches as many system-level threads as there are available processor cores. We discuss the design of LUTS and present three conflict-avoidance heuristics built around its scheduling capabilities. Experimental results, conducted with the STMBench7 and STAMP benchmark suites, show the efficiency of LUTS when running high-contention applications and how conflict-avoidance heuristics can improve STM performance even further. In fact, our transaction scheduling techniques are capable of improving program performance even in overloaded scenarios. © 2011 Springer-Verlag.
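A toy sketch of the scheduling idea described above: one system-level worker thread per core, and a conflict-avoidance heuristic that, instead of merely delaying execution, picks a ready transaction whose write set does not overlap any in-flight transaction. The data structures and the heuristic are illustrative assumptions, not the LUTS implementation.

```python
# Toy user-level transaction scheduler: as many workers as cores, dispatching
# only transactions whose write sets do not conflict with running ones.
import os
import threading

class Txn:
    def __init__(self, name, write_set, work):
        self.name, self.write_set, self.work = name, set(write_set), work

class Scheduler:
    def __init__(self, txns):
        self.ready = list(txns)          # transactions waiting to run
        self.running = []                # write sets of in-flight transactions
        self.cv = threading.Condition()

    def next_txn(self):
        with self.cv:
            while True:
                for i, t in enumerate(self.ready):
                    # Heuristic: dispatch only if no write-set overlap.
                    if all(t.write_set.isdisjoint(s) for s in self.running):
                        self.running.append(t.write_set)
                        return self.ready.pop(i)
                if not self.ready:
                    return None          # nothing left to schedule
                self.cv.wait()           # every ready txn conflicts; wait

    def done(self, t):
        with self.cv:
            self.running.remove(t.write_set)
            self.cv.notify_all()

def worker(sched):
    while (t := sched.next_txn()) is not None:
        t.work()                         # run the transaction body
        sched.done(t)

sched = Scheduler(Txn(f"t{i}", {i % 2}, lambda: None) for i in range(8))
threads = [threading.Thread(target=worker, args=(sched,))
           for _ in range(os.cpu_count() or 2)]
for th in threads:
    th.start()
for th in threads:
    th.join()
```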
Abstract:
In this paper we discuss the potential of the new Galileo signals for pseudorange-based surveying and mapping in open areas under optimal reception conditions (open-sky scenarios) and suboptimal ones (multipath created by moderate to thick tree coverage). The paper reviews the main features of the Galileo E5 AltBOC and E1 CBOC signals; describes the simulation strategy, models, and algorithms used to generate realistic E5 and E1 pseudoranges with and without multipath sources; describes the ionosphere modeling strategy, models, and algorithms; and presents and discusses the expected positioning accuracy and precision results. According to the simulations performed, pseudoranges can be extracted from the Galileo E5 AltBOC signals with tracking errors (1-σ level) ranging from 0.02 m (open-sky scenarios) to 0.08 m (tree-covered scenarios), whereas for the Galileo E1 CBOC signals the tracking errors range from 0.25 m to 2.00 m, respectively. With these tracking errors, and with explicit estimation of the ionosphere parameters, the simulations indicate real-time open-sky cm-level horizontal and dm-level vertical positioning precision, and dm-level accuracy for both the horizontal and vertical position components.
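A minimal sketch of how scenario-dependent pseudorange noise of this kind can be simulated. The 1-σ values below are the tracking errors quoted above; the geometry and the ionospheric delay term are simplified assumptions.

```python
# Synthetic pseudoranges with scenario-dependent tracking noise, loosely
# following the simulation idea in the abstract; geometry is simplified.
import numpy as np

rng = np.random.default_rng(1)
SIGMA = {("E5", "open"): 0.02, ("E5", "trees"): 0.08,   # metres, from the text
         ("E1", "open"): 0.25, ("E1", "trees"): 2.00}

def pseudorange(true_range_m, iono_delay_m, signal, scenario):
    """True geometric range + ionospheric delay + Gaussian tracking noise."""
    return true_range_m + iono_delay_m + rng.normal(0.0, SIGMA[(signal, scenario)])

rho = [pseudorange(22_345_678.9, 3.2, "E5", "open") for _ in range(1000)]
print("empirical sigma [m]:", np.std(rho))   # ~0.02 m
```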
Abstract:
Malicious programs (malware) can cause severe damage to computer systems and data. The mechanism the human immune system uses to detect and protect against organisms that threaten the human body is efficient and can be adapted to detect malware attacks. In this paper we propose a system to perform distributed malware collection, analysis, and detection, the last of these inspired by the human immune system. After malware samples are collected from the Internet, they are dynamically analyzed to produce execution traces at the operating-system level and network flows, which are used to create a behavioral model and to generate a detection signature. Those signatures serve as input to a malware detector, acting as the antibodies in the antigen detection process. This allows us to understand the malware attack and aids in the infection removal procedures. © 2012 Springer-Verlag.
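A hedged sketch of the signature idea: behavioural "antibodies" built as system-call n-grams from analysed malware traces, filtered against benign behaviour (in the spirit of immune negative selection) and matched against new traces. The trace format, the n-gram scheme, and the threshold are illustrative assumptions, not the paper's detector.

```python
# Toy immune-inspired behavioural signatures over syscall traces.
def ngrams(trace, n=3):
    return {tuple(trace[i:i+n]) for i in range(len(trace) - n + 1)}

def make_signature(malware_trace, benign_traces, n=3):
    # Keep only n-grams never seen in benign behaviour (negative selection).
    benign = set().union(*(ngrams(t, n) for t in benign_traces))
    return ngrams(malware_trace, n) - benign

def detect(trace, signatures, threshold=1):
    grams = ngrams(trace)
    return sum(len(grams & sig) for sig in signatures) >= threshold

benign = [["open", "read", "close"], ["open", "write", "close"]]
mal = ["open", "write", "connect", "send", "delete"]
sigs = [make_signature(mal, benign)]
print(detect(["open", "write", "connect", "send", "exit"], sigs))  # True
```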
Abstract:
Low-level laser (LLL) irradiation has been used on peri-implant tissues to accelerate bone formation. However, the effect of a single LLL session on the strength of the bone-implant interface during the early healing process remains unclear. The present study aims to evaluate the removal torque of titanium implants irradiated with LLL during surgical preparation of the implant bed, in comparison to non-irradiation. Sixty-four Wistar rats were used. Half of the animals were included in the LLL group, while the other half served as controls. All animals had the tibia prepared with a 2 mm drill, and a titanium implant (2.2 × 4 mm) was inserted. Animals in the LLL group were irradiated with a gallium-aluminum-arsenide laser, with a wavelength of 808 nm and a measured power output of 50 mW, emitting radiation in collimated beams (0.4 cm²) for 1 min 23 s, at an energy density of 11 J/cm². Two applications (22 J/cm² in total) were performed immediately after bed preparation for implant installation. Flaps were sutured, and animals from both groups were sacrificed 7, 15, 30, and 45 days after implant installation, when the load necessary to remove the implant from bone was measured with a torquimeter. In both groups, torque values tended to increase over time, and at the 30- and 45-day periods the values were statistically higher for the LLL group than for the control (ANOVA, p < 0.0001). Thus, it can be suggested that a single session of LLL irradiation was beneficial in improving bone-implant interface strength, contributing to the osseointegration process. © 2012 Springer-Verlag London Ltd.
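As a quick consistency check (not part of the original abstract), the stated energy density follows from the quoted power, exposure time, and beam area:

```latex
% Energy per application: power x exposure time (1 min 23 s = 83 s)
E = P\,t = 0.05\,\mathrm{W} \times 83\,\mathrm{s} = 4.15\,\mathrm{J}
% Dose over the 0.4 cm^2 collimated beam
\frac{E}{A} = \frac{4.15\,\mathrm{J}}{0.4\,\mathrm{cm^2}} \approx 10.4\,\mathrm{J/cm^2}
```

This is in line with the abstract's stated 11 J/cm² per application (22 J/cm² over the two applications).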
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper presents the development of a mathematical model to optimize the management and operation of the Brazilian hydrothermal system. The system consists of a large set of individual hydropower plants and a set of aggregated thermal plants. The energy generated in the system is carried over an interconnected transmission network so it can be delivered to centers of consumption throughout the country. The optimization model is capable of handling different types of constraints, such as interbasin water transfers, water supply for various purposes, and environmental requirements. Its overall objective is to produce energy to meet the country's demand at minimum cost. Called HIDROTERM, the model integrates a database with the basic hydrological and technical information needed to run the optimization model and provides an interface to manage the input and output data. The optimization model uses the General Algebraic Modeling System (GAMS) package and can invoke different linear as well as nonlinear programming solvers. The model was applied to the Brazilian hydrothermal system, one of the largest in the world, which is divided into four subsystems with 127 active hydropower plants. Preliminary results under different scenarios of inflow, demand, and installed capacity demonstrate the efficiency and utility of the model. From this and other case studies in Brazil, the results indicate that the methodology is suitable for different applications, such as operation planning, capacity expansion, operational rule studies, and trade-off analysis among multiple water users. DOI: 10.1061/(ASCE)WR.1943-5452.0000149. (C) 2012 American Society of Civil Engineers.
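A hedged sketch of the core scheduling trade-off such a model optimizes, reduced to a single reservoir and one aggregated thermal plant: meet demand at minimum thermal cost subject to reservoir continuity and turbine limits. scipy's linprog stands in for the GAMS solvers here, and all numbers are illustrative assumptions.

```python
# Toy single-reservoir hydrothermal scheduling LP (not HIDROTERM itself).
import numpy as np
from scipy.optimize import linprog

T = 3
demand = np.array([100.0, 120.0, 90.0])            # energy demand per period
inflow = np.array([60.0, 40.0, 50.0])              # reservoir inflows
s0, smax, hmax, cth = 80.0, 150.0, 70.0, 25.0      # storage, turbine, thermal cost

# Decision vector x = [h_1..h_T, g_1..g_T] (hydro, thermal generation).
c = np.concatenate([np.zeros(T), np.full(T, cth)]) # only thermal energy costs

A_eq = np.hstack([np.eye(T), np.eye(T)])           # balance: h_t + g_t = demand_t

# Storage s_t = s0 + cumsum(inflow - h) must stay within [0, smax].
L = np.tril(np.ones((T, T)))                       # cumulative-sum operator
H = np.hstack([L, np.zeros((T, T))])
A_ub = np.vstack([H, -H])
b_ub = np.concatenate([s0 + np.cumsum(inflow),          # enforces s_t >= 0
                       smax - s0 - np.cumsum(inflow)])  # enforces s_t <= smax

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=demand,
              bounds=[(0, hmax)] * T + [(0, None)] * T)
print("hydro:", res.x[:T], "thermal:", res.x[T:], "cost:", res.fun)
```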
Abstract:
Systems Biology is an innovative way of doing biology that has recently arisen in bioinformatics contexts, characterised by the study of biological systems as complex systems, with a strong focus on the system level and on the interaction dimension. In other words, the objective is to understand biological systems as a whole, bringing to the foreground not only the study of the individual parts as standalone components, but also their interactions and the global properties that emerge at the system level from the interaction among the parts. This thesis focuses on the adoption of multi-agent systems (MAS) as a suitable paradigm for Systems Biology, for developing models and simulations of complex biological systems. Multi-agent systems have recently been introduced in computing contexts as a suitable paradigm for modelling and engineering complex systems. Roughly speaking, a MAS can be conceived as a set of autonomous and interacting entities, called agents, situated in some kind of environment, where they interact and coordinate so as to obtain a coherent global system behaviour. The claim of this work is that the general properties of MAS make them an effective approach for modelling and building simulations of complex biological systems, following the methodological principles identified by Systems Biology. In particular, the thesis focuses on cell populations as biological systems. In order to support the claim, the thesis introduces and describes (i) a MAS-based model conceived for modelling the dynamics of systems of cells interacting inside cell environments called niches, and (ii) a computational tool developed for implementing the models and executing the simulations. The tool is meant to work as a kind of virtual laboratory on top of which virtual experiments can be performed, characterised by the definition and execution of specific models implemented as MASs, so as to support the validation, falsification, and improvement of the models through the observation and analysis of the simulations. A hematopoietic stem cell system is taken as the reference case study for formulating a specific model and executing virtual experiments.
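As a minimal illustration of the MAS style of model the thesis describes, here is a toy sketch of autonomous cell agents interacting inside a capacity-limited niche. The rules and rates are illustrative assumptions, not the thesis's model.

```python
# Toy agent-based cell-population simulation: cell agents act autonomously
# inside a shared niche whose capacity constrains proliferation.
import random

random.seed(0)

class Cell:
    def __init__(self, p_divide=0.3, p_die=0.05):
        self.p_divide, self.p_die = p_divide, p_die

    def act(self, niche):
        if random.random() < self.p_die:
            niche.remove(self)                       # agent dies
        elif random.random() < self.p_divide and niche.has_room():
            niche.add(Cell(self.p_divide, self.p_die))  # agent divides

class Niche:
    def __init__(self, capacity):
        self.capacity, self.cells = capacity, []
    def has_room(self): return len(self.cells) < self.capacity
    def add(self, c): self.cells.append(c)
    def remove(self, c): self.cells.remove(c)

niche = Niche(capacity=100)
niche.add(Cell())
for step in range(50):
    for cell in list(niche.cells):       # iterate over a snapshot per step
        cell.act(niche)
print("population after 50 steps:", len(niche.cells))
```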
Abstract:
Application of biogeochemical models to the study of marine ecosystems is pervasive, yet objective quantification of these models' performance is rare. Here, 12 lower trophic level models of varying complexity are objectively assessed in two distinct regions (equatorial Pacific and Arabian Sea). Each model was run within an identical one-dimensional physical framework. A consistent variational adjoint implementation assimilating chlorophyll-a, nitrate, export, and primary productivity was applied and the same metrics were used to assess model skill. Experiments were performed in which data were assimilated from each site individually and from both sites simultaneously. A cross-validation experiment was also conducted whereby data were assimilated from one site and the resulting optimal parameters were used to generate a simulation for the second site. When a single pelagic regime is considered, the simplest models fit the data as well as those with multiple phytoplankton functional groups. However, those with multiple phytoplankton functional groups produced lower misfits when the models are required to simulate both regimes using identical parameter values. The cross-validation experiments revealed that as long as only a few key biogeochemical parameters were optimized, the models with greater phytoplankton complexity were generally more portable. Furthermore, models with multiple zooplankton compartments did not necessarily outperform models with single zooplankton compartments, even when zooplankton biomass data are assimilated. Finally, even when different models produced similar least squares model-data misfits, they often did so via very different element flow pathways, highlighting the need for more comprehensive data sets that uniquely constrain these pathways.
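To make the assessment procedure concrete, here is a hedged sketch of the calibrate-then-cross-validate step, with a toy saturating production curve standing in for a real biogeochemical model; the data, model, and error weights are synthetic assumptions.

```python
# Calibrate a toy model at site A, then cross-validate at site B.
import numpy as np
from scipy.optimize import least_squares

def model(params, light):
    vmax, k = params
    return vmax * light / (k + light)        # toy saturating production curve

def misfit(params, light, obs, sigma=0.1):
    return (model(params, light) - obs) / sigma   # weighted residuals

rng = np.random.default_rng(0)
light_a, light_b = np.linspace(1, 50, 30), np.linspace(5, 80, 30)
obs_a = model([2.0, 10.0], light_a) + rng.normal(0, 0.1, 30)
obs_b = model([2.0, 10.0], light_b) + rng.normal(0, 0.1, 30)

fit = least_squares(misfit, x0=[1.0, 5.0], args=(light_a, obs_a))
print("parameters optimised at A:", fit.x)
print("cross-validated cost at B:",
      0.5 * np.sum(misfit(fit.x, light_b, obs_b) ** 2))
```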
Abstract:
A search is presented for direct chargino production based on a disappearing-track signature using 20.3 fb⁻¹ of proton-proton collisions at √s = 8 TeV collected with the ATLAS experiment at the LHC. In anomaly-mediated supersymmetry breaking (AMSB) models, the lightest chargino is nearly mass degenerate with the lightest neutralino and its lifetime is long enough to be detected in the tracking detectors by identifying decays that result in tracks with no associated hits in the outer region of the tracking system. Some models with supersymmetry also predict charginos with a significant lifetime. This analysis attains sensitivity for charginos with a lifetime between 0.1 and 10 ns, and significantly surpasses the reach of the LEP experiments. No significant excess above the background expectation is observed for candidate tracks with large transverse momentum, and constraints on chargino properties are obtained. In the AMSB scenarios, a chargino mass below 270 GeV is excluded at 95% confidence level.
Abstract:
Vertical integration is grounded in economic theory as a corporate strategy for reducing cost and enhancing efficiency. There were three purposes for this dissertation. The first was to describe and understand vertical integration theory. The review of the economic theory established vertical integration as a corporate cost reduction strategy in response to environmental, structural, and performance dimensions of the market. The second purpose was to examine vertical integration in the context of the health care industry, which has greater complexity, higher instability, and more unstable demand than other industries, although many of the same dimensions of the market supported a vertical integration strategy. Evidence on the performance of health systems after integration revealed mixed results. Because the market continues to be turbulent, hybrid non-owned integration in the form of alliances has increased to over 40% of urban hospitals. The third purpose of the study was to examine the application of vertical integration in health care and evaluate its effects. The case studied was an alliance formed between a community hospital and a tertiary medical center to facilitate vertical integration of oncology services while maintaining effectiveness and preserving access. The economic benefits for 1,934 patients were evaluated in the delivery system before and after integration, with a more detailed economic analysis of breast, lung, colon/rectal, and non-malignant cases. A regression analysis confirmed the relationship between the independent variables of age, sex, location of services, race, stage of disease, and diagnosis, and the dependent variable, cost. The results of the basic regression model, as well as the regression with first-order interaction terms, were statistically significant. The study shows that vertical integration at an intermediate health care system level has economic benefits. If the pre-integration oncology group had been treated in the post-integration model, the expected cost savings from integration would be 31.5%. The quality indicators used were access to health care services and research treatment protocols, and access was preserved in the integrated model. Using survival as a direct quality outcome measure, the survival of lung cancer patients was statistically the same before and after integration.
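As a hedged illustration of the kind of regression described (cost against patient covariates, with first-order interaction terms), here is a toy OLS fit on synthetic data; the variables and coefficients are invented for illustration and bear no relation to the study's data.

```python
# Toy OLS with a first-order interaction term, echoing the cost regression.
import numpy as np

rng = np.random.default_rng(2)
n = 200
age = rng.uniform(30, 85, n)
stage = rng.integers(1, 5, n).astype(float)
post = rng.integers(0, 2, n).astype(float)     # 1 = post-integration delivery

cost = (5000 + 40 * age + 1500 * stage - 2000 * post
        - 300 * post * stage + rng.normal(0, 500, n))

# Design matrix: intercept, main effects, and a post*stage interaction.
X = np.column_stack([np.ones(n), age, stage, post, post * stage])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
print("coefficients [const, age, stage, post, post*stage]:", beta)
```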
Abstract:
This paper introduces a method to analyze and predict the stability and transient performance of a distributed system in which COTS (commercial off-the-shelf) modules share an input filter. The presented procedure is based on data measured at the input and output terminals of the power modules. The information required for the analysis is obtained by performing frequency-response measurements on each converter. The data obtained are used to compute special transfer functions, which partly determine the source and load interactions within the converters. The system-level dynamic description is constructed from the measured and computed transfer functions by introducing cross-coupling mechanisms within the system. System stability can then be studied based on the well-known impedance-related minor-loop gain at an arbitrary interface within the system.
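A minimal sketch of the impedance-based check mentioned above: form the minor-loop gain Zs/Zl at the filter-converter interface and test one classical sufficient condition (Middlebrook's |Zs/Zl| < 1). The impedances below are synthetic stand-ins for the measured frequency responses, not data from the paper.

```python
# Minor-loop gain check at a source/load interface (synthetic impedances).
import numpy as np

f = np.logspace(1, 4, 400)                 # Hz
s = 1j * 2 * np.pi * f

Zs = 0.05 + s * 10e-6                      # source side: R + sL filter output
Zl = 1.0 / (1.0 / 10.0 + s * 1e-6)         # load side: 10 ohm || 1 uF input

ml = Zs / Zl                               # minor-loop gain at the interface
print("max |Zs/Zl| =", np.abs(ml).max())
print("stable by Middlebrook's sufficient condition:",
      bool(np.all(np.abs(ml) < 1.0)))
```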