933 results for model driven system, semantic representation, semantic modeling, enterprise system development
Abstract:
Operational forecasting centres are currently developing data assimilation systems for coupled atmosphere-ocean models. Strongly coupled assimilation, in which a single assimilation system is applied to a coupled model, presents significant technical and scientific challenges. Hence weakly coupled assimilation systems are being developed as a first step, in which the coupled model is used to compare the current state estimate with observations, but corrections to the atmosphere and ocean initial conditions are then calculated independently. In this paper we provide a comprehensive description of the different coupled assimilation methodologies in the context of four-dimensional variational assimilation (4D-Var) and use an idealised framework to assess the expected benefits of moving towards coupled data assimilation. We implement an incremental 4D-Var system within an idealised single-column atmosphere-ocean model. The system has the capability to run both strongly and weakly coupled assimilations as well as uncoupled atmosphere-only or ocean-only assimilations, thus allowing a systematic comparison of the different strategies for treating the coupled data assimilation problem. We present results from a series of identical twin experiments devised to investigate the behaviour and sensitivities of the different approaches. Overall, our study demonstrates the potential benefits that may be expected from coupled data assimilation. When compared to uncoupled initialisation, coupled assimilation is able to produce more balanced initial analysis fields, thus reducing initialisation shock and its impact on the subsequent forecast. Single observation experiments demonstrate how coupled assimilation systems are able to pass information between the atmosphere and ocean and therefore use near-surface data to greater effect.
We show that much of this benefit may also be gained from a weakly coupled assimilation system, but that this can be sensitive to the parameters used in the assimilation.
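For reference, incremental 4D-Var as discussed above minimises, on each outer loop, a quadratic cost function of the standard textbook form (generic symbols, not necessarily the paper's notation):

```latex
J(\delta x_0) = \frac{1}{2}\,\delta x_0^{\mathrm{T}}\,\mathbf{B}^{-1}\,\delta x_0
  + \frac{1}{2}\sum_{i=0}^{N}
    \left(\mathbf{H}_i \mathbf{M}_i\,\delta x_0 - d_i\right)^{\mathrm{T}}
    \mathbf{R}_i^{-1}
    \left(\mathbf{H}_i \mathbf{M}_i\,\delta x_0 - d_i\right),
\qquad d_i = y_i - \mathcal{H}_i\!\left(\mathcal{M}_i(x_0^{b})\right)
```

where \(\delta x_0\) is the increment to the background state \(x_0^{b}\), \(\mathbf{B}\) and \(\mathbf{R}_i\) are the background and observation error covariances, \(\mathbf{M}_i\) and \(\mathbf{H}_i\) are the linearised model and observation operators, and \(d_i\) are the innovations. In a strongly coupled system, \(\mathbf{B}\) and \(\mathbf{M}_i\) span the joint atmosphere-ocean state, so information can pass across the interface during the minimisation; in a weakly coupled system the minimisation is carried out separately for each component.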
Abstract:
There are a number of factors that lead to non-linearity between precipitation anomalies and flood hazard; this non-linearity is a pertinent issue for applications that use a precipitation forecast as a proxy for imminent flood hazard. We assessed the degree of this non-linearity for the first time using a recently developed global-scale hydrological model driven by the ERA-Interim Land precipitation reanalysis (1980-2010). We introduced new indices to assess large-scale flood hazard, or floodiness, and quantified the link between monthly precipitation, river discharge, and floodiness anomalies at the global and regional scales. The results show that monthly floodiness is not well correlated with precipitation, thereby demonstrating the value of hydrometeorological systems for providing floodiness forecasts to decision-makers. A method is described for forecasting floodiness using the Global Flood Awareness System, building a climatology of regional floodiness from which to forecast floodiness anomalies out to two weeks.
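The idea of a regional floodiness index and its anomaly can be sketched as follows. This is an illustration only, not the paper's exact definition: here floodiness is taken as the fraction of river points whose discharge exceeds a local flood threshold, and the anomaly is standardised against a monthly climatology; all numbers are invented for the example.

```python
# Illustrative floodiness index and standardised anomaly (invented data).
import statistics

def floodiness(discharge, thresholds):
    """Fraction of river points exceeding their local flood threshold."""
    exceed = [q > t for q, t in zip(discharge, thresholds)]
    return sum(exceed) / len(exceed)

def floodiness_anomaly(value, climatology):
    """Standardised anomaly of a monthly index against its climatology."""
    mean = statistics.mean(climatology)
    sd = statistics.stdev(climatology)
    return (value - mean) / sd

# Four river points, two above their thresholds -> index 0.5
f = floodiness([120.0, 30.0, 250.0, 80.0], [100.0, 50.0, 200.0, 90.0])
anom = floodiness_anomaly(f, [0.10, 0.20, 0.30, 0.20])
```

A positive anomaly would flag a month that is "floodier" than its climatology, which is the quantity a forecast system would issue rather than raw precipitation.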
Abstract:
We study the exact solution of an N-state vertex model based on the representation of the U_q[SU(2)] algebra at roots of unity with diagonal open boundaries. We find that the respective reflection equation provides one general class of diagonal K-matrices with one free parameter. We determine the eigenvalues of the double-row transfer matrix and the respective Bethe ansatz equations within the algebraic Bethe ansatz framework. The structure of the Bethe ansatz equations combines a pseudomomentum function depending on a free parameter with scattering phase shifts that are fixed by the roots of unity and the boundary variables.
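For context, the reflection equation referred to above is the standard boundary Yang-Baxter equation that the K-matrix must satisfy (generic notation, with spectral parameters λ and μ):

```latex
R_{12}(\lambda-\mu)\,K_1(\lambda)\,R_{21}(\lambda+\mu)\,K_2(\mu)
  = K_2(\mu)\,R_{12}(\lambda+\mu)\,K_1(\lambda)\,R_{21}(\lambda-\mu)
```

where \(K_1 = K \otimes I\), \(K_2 = I \otimes K\), and \(R_{21} = P\,R_{12}\,P\) with \(P\) the permutation operator. Diagonal solutions \(K(\lambda) = \mathrm{diag}(k_1(\lambda), \dots, k_N(\lambda))\) with a single free parameter form the class of boundary matrices the abstract refers to.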
Abstract:
This paper describes a model from systems theory that can be used as a basis for better understanding different situations in a firm's evolution. This change model is derived from the theory of organic systems and divides the evolution of the system toward higher structural complexity into three distinct phases: a formative phase, a normative phase, and an integrative phase. After a summary of different types of models of the dynamics of the firm, the paper presents the model theoretically and shows how it can aid understanding of the need for change in strategic orientation, organizational form, and leadership style over time.
Abstract:
A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model based on grey numbers is presented. With the grey numbers technique, uncertainty is characterized by an interval; once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers, whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated so that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of the observed discharge values while being at the same time as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. However, these times can be significantly reduced by using a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual grey rainfall-runoff model is calibrated, and the uncertainty bands obtained downstream of both the calibration process and the validation process are compared with those obtained using a well-established approach for characterizing uncertainty, such as GLUE. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
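The core mechanism, propagating interval-valued (grey) parameters through the model so that the simulated discharge comes out as an interval band, can be sketched as follows. The linear-reservoir equation, parameter values, and names are illustrative assumptions, not the conceptual model calibrated in the paper.

```python
# Minimal grey-number (interval) arithmetic propagated through a toy
# linear-reservoir rainfall-runoff step: S <- a*S + P, Q = c*S.
class Grey:
    """A grey number represented as a closed interval [lo, hi]."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        if not isinstance(other, Grey):
            other = Grey(other, other)
        return Grey(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        if not isinstance(other, Grey):
            other = Grey(other, other)
        p = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Grey(min(p), max(p))

# Grey parameters: recession coefficient a and runoff coefficient c.
a = Grey(0.80, 0.90)
c = Grey(0.10, 0.20)
storage = Grey(10.0, 10.0)           # crisp initial storage
band = []                            # envelope of simulated grey discharges
for rain in [5.0, 0.0, 2.0]:
    storage = storage * a + rain     # interval water balance
    q = storage * c                  # grey discharge
    band.append((q.lo, q.hi))
# each (q.lo, q.hi) pair is one slice of the uncertainty band
```

Calibration in the paper's sense would then adjust the parameter intervals so that the band covers a target fraction of observations while staying as narrow as possible.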
Abstract:
Models and modeling techniques are, nowadays, fundamental in software engineering, owing to the complexity and sophistication of current information systems. The Unified Modeling Language (UML) [OMG, 2005a] [OMG, 2005b] has become a standard for modeling, in software engineering and in other areas and domains, but its lack of support for modeling interactivity and the user interface is well recognised [Nunes and Falcão e Cunha, 2000]. This work explores the connection between the areas of software engineering and human-computer interaction; to that end, the Wisdom development process [Nunes and Falcão e Cunha, 2000] [Nunes, 2001] was chosen. The Wisdom method is driven by essential use cases and by the principle of evolutionary prototyping, focusing on the design of user interfaces through the presentation structure, with the Canonical Abstract Prototypes (CAP) notation [Constantine and Lockwood, 1999] [Constantine, 2003], and through the interaction behaviour, with the ConcurTaskTrees (CTT) notation [Paternò, 1999] [Mori, Paternò, et al., 2004], in UML. This work also proposes a new step in the Wisdom process, defining a specific model built according to the requirements of the Model Driven Architecture (MDA) recommendation [Soley and OMG, 2000] [OMG, 2003] elaborated by the Object Management Group (OMG). This specific model acts as the intermediary between the design model and the implementation of the final user interface. The proposal aligns the Wisdom method with the MDA recommendation, making it possible to generate functional user interface prototypes automatically from the conceptual analysis and design models. The MetaSketch modeling and metamodeling tool [Nóbrega, Nunes, et al., 2006] was used for the definition and manipulation of the proposed models and elements.
The Model2Model and Model2Code applications were created to support the transformations between models and the generation of code from them. The Hydra framework, developed in the PHP language [PHP, 2006], was chosen as the implementation platform and was adapted with some concepts to support the approach advocated in this work.
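The model-to-code step can be illustrated with a toy transformation: walking abstract presentation elements and emitting user-interface stubs. The element kinds and emitted markup below are invented for the example; the actual Model2Code tool targets the Hydra PHP framework from Wisdom analysis and design models.

```python
# Toy model-to-code transformation over abstract UI model elements.
def model_to_code(elements):
    """Generate a UI stub from abstract presentation elements."""
    out = []
    for el in elements:
        kind, name = el["kind"], el["name"]
        if kind == "input":
            out.append(f'<input name="{name}" />')     # data-entry element
        elif kind == "action":
            out.append(f'<button name="{name}">{name}</button>')  # action element
        else:
            raise ValueError(f"unknown element kind: {kind}")
    return "\n".join(out)

model = [{"kind": "input", "name": "username"},
         {"kind": "action", "name": "login"}]
code = model_to_code(model)
```

In an MDA pipeline this generator would sit after a model-to-model transformation that maps the platform-independent design model onto a platform-specific model.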
Abstract:
This document proposes a traceability solution for model-driven development. There has already been previous work in this area, but so far there is no standardized way of exchanging traceability information; the goal of the project developed and documented here is therefore not to automate the traceability process, but to provide an approach to traceability that follows OMG standards, making traceability information exchangeable between tools that follow the same standards. To this end, we propose a traceability meta-model as an extension of the MetaObject Facility (MOF). Using the MetaSketch modeling language workbench, we present a modeling language for traceability information. This traceability information can then be used for tool cooperation. Using Meta.Tracer (the tool developed for this thesis), we enable users to establish traceability relationships between different traceability elements and offer a visualization of the traceability information. We then demonstrate the benefits of using a traceability tool in a software development life cycle through a case study. We conclude by commenting on the work developed.
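The kind of information such a traceability meta-model captures can be sketched as typed trace links between artefacts, queryable in both directions. The class and method names below are illustrative, not Meta.Tracer's actual API.

```python
# Sketch of a traceability store: typed links between model artefacts.
from collections import defaultdict

class TraceModel:
    def __init__(self):
        self._by_source = defaultdict(list)
        self._by_target = defaultdict(list)

    def add_link(self, source, target, link_type):
        """Record a typed traceability relationship."""
        self._by_source[source].append((target, link_type))
        self._by_target[target].append((source, link_type))

    def forward(self, source):
        """Artefacts traced from `source` (e.g. requirement -> design)."""
        return self._by_source[source]

    def backward(self, target):
        """Artefacts tracing to `target` (impact analysis)."""
        return self._by_target[target]

tm = TraceModel()
tm.add_link("REQ-1", "UseCase-A", "refines")
tm.add_link("UseCase-A", "ClassX", "realizes")
```

Serializing such links against a shared MOF-based meta-model is what would make them exchangeable between tools, which is the interoperability point the abstract emphasises.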
Abstract:
This thesis reports on research into the integration of eye tracking technology into virtual reality environments, with the goal of using it in the rehabilitation of patients who have suffered a stroke. Over the last few years, eye tracking has been a focus of medical research, used as an assistive tool to help people with disabilities interact with new technologies and as an assessment tool to track eye gaze during computer interactions. However, tracking more complex gaze behaviors and relating them to motor deficits in people with disabilities is an area that has not been fully explored, and it therefore became the focal point of this research. During the research, two exploratory studies were performed in which eye tracking technology was integrated into a newly created virtual reality task to assess the impact of stroke. Using an eye tracking device and a custom virtual task, the system developed is able to monitor changes in eye gaze patterns over time in patients with stroke, as well as allowing their eye gaze to function as an input for the task. Based on neuroscientific hypotheses of upper limb motor control, the studies aimed to verify the differences in gaze patterns during the observation and execution of the virtual goal-oriented task in stroke patients (N=10), and to assess normal gaze behavior in healthy participants (N=20). The results were consistent with and supported the hypotheses formulated, showing that eye gaze can be used as a valid assessment tool in these patients. However, the findings of this first exploratory approach are too limited to fully explain the effect of stroke on eye gaze behavior. Therefore, a novel model-driven paradigm is proposed to further understand the relation between the neuronal mechanisms underlying goal-oriented actions and eye gaze behavior.
Abstract:
Major progress in communication systems over recent decades has created the need for more precise characterization of the components used. S-parameter modeling has been used for the characterization, simulation, and testing of communication systems. However, the limitations of S-parameters in modeling nonlinear systems have motivated new modeling approaches that include nonlinear characteristics. Polyharmonic distortion modeling is a characterization technique for nonlinear systems that has been growing in use due to its practicality and its similarity to S-parameters. This work presents an analysis of polyharmonic distortion modeling, the development of a test bench for the simulation of planar structures, and the characterization of planar structures with X-parameters.
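For reference, the polyharmonic distortion (X-parameter) describing-function model extends S-parameters to large-signal operation. In the usual formulation (after Verspecht and Root; generic notation), the scattered wave at port p, harmonic m is

```latex
B_{pm} = X^{(F)}_{pm}\!\left(|A_{11}|\right) P^{m}
  + \sum_{(q,n)\neq(1,1)} X^{(S)}_{pm,qn}\!\left(|A_{11}|\right) P^{m-n}\,A_{qn}
  + \sum_{(q,n)\neq(1,1)} X^{(T)}_{pm,qn}\!\left(|A_{11}|\right) P^{m+n}\,A_{qn}^{*}
```

where \(A_{qn}\) are the incident waves, \(A_{11}\) is the large-signal drive, and \(P = e^{j\varphi(A_{11})}\) carries the drive phase. The \(X^{(T)}\) terms, acting on the conjugated incident waves, capture the nonlinear behavior that ordinary S-parameters cannot represent; in the small-signal limit the model reduces to classical S-parameters.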
Abstract:
The advantages offered by the LED (Light Emitting Diode) have led to the rapid and widespread adoption of this device as a replacement for incandescent lights. However, in combined applications, the relationship between the design variables and the desired effect or result is very complex, and it becomes difficult to model by conventional techniques. This work consists of the development of a technique, through comparative analysis of neuro-fuzzy architectures, to make it possible to obtain the luminous intensity values of LED brake lights from design data.
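The kind of input-output mapping a neuro-fuzzy architecture learns can be illustrated with a minimal zero-order Sugeno inference step. The membership functions, rules, and consequent values below are invented for the example; an ANFIS-like system would tune them from the design data rather than fix them by hand.

```python
# Minimal zero-order Sugeno fuzzy inference with two hand-written rules.
def triangle(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def luminous_intensity(current_ma):
    """Rules: low drive current -> low intensity, high -> high (values in mcd)."""
    w_low = triangle(current_ma, 0.0, 10.0, 20.0)    # rule 1 firing strength
    w_high = triangle(current_ma, 10.0, 20.0, 30.0)  # rule 2 firing strength
    z_low, z_high = 50.0, 400.0                      # rule consequents
    return (w_low * z_low + w_high * z_high) / (w_low + w_high)

# At 15 mA both rules fire equally, giving the midpoint 225.0
```

The "neuro" half of the approach replaces the hand-written parameters with ones fitted by gradient descent against measured intensities.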
Abstract:
One of the current challenges of Ubiquitous Computing is the development of complex applications, which are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop because they are composed of services provided by different middleware, and developers need to know the peculiarities of each of them, mainly their communication and context models. This thesis presents OpenCOPI, a platform that integrates various service providers, including context provision middleware. It provides a unified ontology-based context model, as well as an environment that enables the easy development of ubiquitous applications via the definition of semantic workflows containing the abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans. An execution plan consists of a workflow instance containing activities that are automated by a set of Web services. OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware in an independent and transparent way. Moreover, the platform supports execution adaptation in case of service failures, user mobility, and degradation of service quality. OpenCOPI is validated through the development of case studies, specifically applications from the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, compares it with the benefits provided, and assesses the efficiency of OpenCOPI's selection and adaptation mechanisms.
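The abstract-to-concrete workflow conversion can be sketched as a service-selection step: each abstract activity is bound to the best-scoring candidate Web service, and the resulting bindings form the execution plan. The scoring rule and all names below are assumptions for illustration, not OpenCOPI's actual algorithm.

```python
# Sketch: bind each abstract activity to its best-scoring concrete service.
def build_execution_plan(abstract_workflow, candidates, quality):
    """Map each abstract activity to the highest-quality candidate service."""
    plan = []
    for activity in abstract_workflow:
        best = max(candidates[activity], key=lambda s: quality[s])
        plan.append((activity, best))
    return plan

workflow = ["acquire-context", "notify-operator"]
candidates = {
    "acquire-context": ["mw1.sensor", "mw2.sensor"],
    "notify-operator": ["mw1.sms", "mw3.email"],
}
quality = {"mw1.sensor": 0.9, "mw2.sensor": 0.7, "mw1.sms": 0.6, "mw3.email": 0.8}
plan = build_execution_plan(workflow, candidates, quality)
```

Adaptation then amounts to re-running the selection for an activity whose bound service fails or whose quality degrades, replacing only that binding in the plan.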
Abstract:
This thesis presents πSOD-M (Policy-based Service Oriented Development Methodology), a methodology for modeling reliable service-based applications using policies. It proposes a model-driven method with: (i) a set of meta-models for representing non-functional constraints associated with service-based applications, from a use case model through to a service composition model; (ii) a platform providing guidelines for expressing the composition and the policies; (iii) model-to-model and model-to-text transformation rules for semi-automating the implementation of reliable service-based applications; and (iv) an environment that implements these meta-models and rules and enables the application of πSOD-M. This thesis also presents a classification and nomenclature of non-functional requirements for developing service-oriented applications. Our approach is intended to add value to the development of service-oriented applications that have quality requirements. This work draws on concepts from service-oriented development, non-functional requirements design, and model-driven development to propose a solution that minimizes the problem of modeling reliable services. Some examples are developed as proofs of concept.
Abstract:
The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the many systems that can benefit from these components, we highlight distributed systems, where communication must be enabled between software components located on different physical machines. An important issue in the communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware that provides an application with the abstraction of communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is the possibility of self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism, in case the specified quality-of-service restrictions are not being met. To this end, monitoring of the communication state is planned, applying techniques such as a feedback control loop and analyzing the related performance metrics. The Model Driven Development (MDD) paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, together with the configuration and reconfiguration policies related to the dynamic adaptation processes. In this context, the metamodel associated with the process of configuring a communication was defined. The MDD application also includes the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
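The feedback control loop described above can be sketched as follows: monitor a latency metric and, when the quality-of-service restriction is violated, first retune the communication mechanism's parameters, then replace the mechanism as a last resort. The names and the buffer-halving policy are illustrative assumptions, not the thesis's actual reconfiguration rules.

```python
# One iteration of a QoS feedback control loop: reconfigure, then replace.
from dataclasses import dataclass

@dataclass
class Mechanism:
    name: str
    buffer_size: int

def control_step(mechanism, latency_ms, max_latency_ms, fallback):
    """Compare the measured metric against the restriction, then adapt."""
    if latency_ms <= max_latency_ms:
        return mechanism                      # QoS restriction met: no change
    if mechanism.buffer_size > 1:             # first attempt: reconfiguration
        return Mechanism(mechanism.name, mechanism.buffer_size // 2)
    return fallback                           # last resort: replacement

tcp = Mechanism("tcp-stream", buffer_size=2)
udp = Mechanism("udp-stream", buffer_size=8)
m = control_step(tcp, latency_ms=120, max_latency_ms=100, fallback=udp)
# first violation: same mechanism, smaller buffer
m = control_step(m, latency_ms=120, max_latency_ms=100, fallback=udp)
# still violating with parameters exhausted: mechanism replaced
```

In the metamodel's terms, the first branch corresponds to updating configuration-parameter values and the second to swapping the communication mechanism.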
Abstract:
This dissertation presents a model-driven, integrated approach to the variability management, customization, and execution of software processes. The approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering provides support for the specification of software processes and their transformation into workflow specifications. Software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. In order to evaluate the feasibility of the approach, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow language specifications, in order to enable the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
Abstract:
The academic community and the software industry have shown, in recent years, substantial interest in approaches and technologies related to model-driven development (MDD). At the same time, industry continues its relentless pursuit of technologies that raise productivity and quality in the development of software products. This work explores these two trends through an experiment carried out using MDD technology and an evaluation of its use in solving an actual problem in the security context of enterprise systems. By building and using a tool, a visual DSL named CALV3, inspired by the software factory approach (a synergy between software product lines, domain-specific languages, and MDD), we evaluate the gains in abstraction and productivity through a systematic case study conducted in a development team. The results and lessons learned from the evaluation of this tool within industry are the main contributions of this work.