895 results for Model-driven Architecture, Goal-Oriented design, usability
Abstract:
This paper explores the context of, and developments in, Research by Design (RbD) as it is currently developing in Schools of Architecture. It starts from the observation that the design studio is the core of the bachelor and master curriculum. Extending this position to PhD research implies a search for research in which the design process is the main method of creating knowledge and understanding. These developments connect to similar developments in the arts. Mode 1 and mode 2 knowledge, reflection and other knowledge processes form the basis for developing knowledge for the field of architecture when practice and designing are the main method of research. The paper concludes by observing that many PhD and research projects building on design activities and practice are currently under way and supported by academia. They produce a specific type of knowledge and understanding, usually opening up problems and exploring boundaries.
Abstract:
Reanalysis data obtained from data assimilation are increasingly used for diagnostic studies of the general circulation of the atmosphere, for the validation of modelling experiments and for estimating energy and water fluxes between the Earth's surface and the atmosphere. Because fluxes are not specifically observed, but determined by the data assimilation system, they are influenced not only by the observations used but also by model physics and dynamics and by the assimilation method. In order to better understand the relative importance of humidity observations for the determination of the hydrological cycle, in this paper we describe an assimilation experiment using the ERA40 reanalysis system in which all humidity data have been excluded from the observational database. The surprising result is that the model, driven by the time evolution of wind, temperature and surface pressure, is able to almost completely reconstitute the large-scale hydrological cycle of the control assimilation without the use of any humidity data. In addition, analysis of individual weather systems in the extratropics and tropics using an objective feature-tracking analysis indicates that the humidity data have very little impact on these systems. We include a discussion of these results and their possible consequences for the way moisture information is assimilated, as well as the potential consequences for the design of observing systems for climate monitoring. It is further suggested, with support from a simple assimilation study with another model, that model physics and dynamics play a decisive role in the hydrological cycle, stressing the need to better understand these aspects of model parametrization.
Abstract:
Estimating the magnitude of Agulhas leakage, the volume flux of water from the Indian to the Atlantic Ocean, is difficult because of the presence of other circulation systems in the Agulhas region. Indian Ocean water in the Atlantic Ocean is vigorously mixed and diluted in the Cape Basin. Eulerian integration methods, where the velocity field perpendicular to a section is integrated to yield a flux, have to be calibrated so that only the flux due to Agulhas leakage is sampled. Two Eulerian methods for estimating the magnitude of Agulhas leakage are tested within a high-resolution two-way nested model, with the goal of devising a mooring-based measurement strategy. At the GoodHope line, a section halfway through the Cape Basin, the integrated velocity perpendicular to that line is compared to the magnitude of Agulhas leakage as determined from the transport carried by numerical Lagrangian floats. In the first method, integration is limited to the flux of water warmer and more saline than specific threshold values. These threshold values are determined by maximizing the correlation with the float-determined time series. By using the threshold values, approximately half of the leakage can be measured directly. The total amount of Agulhas leakage can then be estimated using a linear regression, within a 90% confidence band of 12 Sv. In the second method, a subregion of the GoodHope line is sought such that integration over that subregion yields an Eulerian flux as close to the float-determined leakage as possible. It appears that when integration is limited within the model to the upper 300 m of the water column within 900 km of the African coast, the time series have the smallest root-mean-square difference. This method yields a root-mean-square error of only 5.2 Sv, but the 90% confidence band of the estimate is 20 Sv. It is concluded that the optimum thermohaline threshold method leads to more accurate estimates, even though the directly measured transport is a factor of two lower than the actual magnitude of Agulhas leakage in this model.
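As a rough illustration of the first method, the sketch below integrates a model velocity section subject to thermohaline thresholds. It is a minimal sketch assuming gridded section data; the variable layout and the threshold values are placeholders, not the calibrated values from the paper.

```python
import numpy as np

def threshold_leakage_flux(v, T, S, dx, dz, T_min=14.0, S_min=35.0):
    """Eulerian flux across a section, counting only water warmer and
    more saline than the thresholds (illustrative values only).

    v  : (nz, nx) velocity normal to the section [m/s]
    T  : (nz, nx) temperature [deg C]
    S  : (nz, nx) salinity [psu]
    dx : (nx,)    along-section cell widths [m]
    dz : (nz,)    cell thicknesses [m]
    Returns transport in Sverdrups (1 Sv = 1e6 m^3/s).
    """
    area = np.outer(dz, dx)               # cell face areas
    mask = (T > T_min) & (S > S_min)      # thermohaline criterion
    return np.sum(v * area * mask) / 1e6
```

In the paper's approach, the thresholds are chosen by maximizing correlation with the Lagrangian float time series, and a linear regression then scales the directly measured transport (roughly half of the leakage) up to the total.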
Abstract:
This paper presents a multicriteria decision-making model for lifespan energy efficiency assessment of intelligent buildings (IBs). The decision-making model, called IBAssessor, is developed using an analytic network process (ANP) method and a set of lifespan performance indicators for IBs selected by a new quantitative approach called the energy-time consumption index (ETI). In order to improve the quality of decision-making, the authors make use of previous research achievements including a lifespan sustainable business model, the Asian IB Index, and a number of relevant publications. Practitioners can use the IBAssessor ANP model at different stages of an IB lifespan for either engineering- or business-oriented assessments. Finally, the paper presents an experimental case study demonstrating how to use the IBAssessor ANP model to solve real-world design tasks.
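At the heart of any ANP model is a supermatrix of pairwise-derived weights that is raised to powers until the priorities stabilize. The sketch below shows only that generic step, with a made-up three-element supermatrix; it is not the IBAssessor model or its indicator set.

```python
import numpy as np

def anp_limit_priorities(W, tol=1e-9, max_iter=10000):
    """Limit priorities of a column-stochastic ANP supermatrix W:
    repeatedly square W until its columns converge."""
    W = np.asarray(W, dtype=float)
    for _ in range(max_iter):
        W2 = W @ W
        W2 /= W2.sum(axis=0, keepdims=True)  # re-normalize columns
        if np.allclose(W2, W, atol=tol):
            break
        W = W2
    return W[:, 0]  # any column of the limit matrix

# Illustrative 3-element supermatrix (each column sums to 1)
W = [[0.2, 0.5, 0.3],
     [0.5, 0.3, 0.4],
     [0.3, 0.2, 0.3]]
print(anp_limit_priorities(W))
```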
Abstract:
Individuals with elevated levels of plasma low density lipoprotein (LDL) cholesterol (LDL-C) are considered to be at risk of developing coronary heart disease. LDL particles are removed from the blood by a process known as receptor-mediated endocytosis, which occurs mainly in the liver. A series of classical experiments delineated the major steps in the endocytotic process: apolipoprotein B-100, present on LDL particles, binds to a specific receptor (the LDL receptor, LDL-R) in specialized areas of the cell surface called clathrin-coated pits. The pit containing the LDL-LDL-R complex is internalized, forming a cytoplasmic endosome. Fusion of the endosome with a lysosome leads to degradation of the LDL into its constituent parts (that is, cholesterol, fatty acids, and amino acids), which are released for reuse by the cell, or are excreted. In this paper, we formulate a mathematical model of LDL endocytosis, consisting of a system of ordinary differential equations. We validate our model against existing in vitro experimental data, and we use it to explore differences in system behavior when a single bolus of extracellular LDL is supplied to cells, compared to when a continuous supply of LDL particles is available. Whereas the former situation is common in in vitro experimental systems, the latter better reflects the in vivo situation. We use asymptotic analysis and numerical simulations to study the long-time behavior of model solutions. The implications of model-derived insights for experimental design are discussed.
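To make the structure of such a model concrete, here is a minimal sketch of a three-compartment ODE system (extracellular, receptor-bound, internalized LDL) for the single-bolus scenario. The equations and rate constants are illustrative assumptions, not the paper's validated model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rate constants (per hour), not the paper's fitted values
k_bind, k_int, k_deg = 1.0, 0.5, 0.3

def ldl_rhs(t, y):
    """y = [L, B, I]: extracellular LDL, receptor-bound LDL,
    internalized LDL (endosome/lysosome pool)."""
    L, B, I = y
    dL = -k_bind * L             # binding to LDL-R in coated pits
    dB = k_bind * L - k_int * B  # internalization of LDL-LDL-R pits
    dI = k_int * B - k_deg * I   # lysosomal degradation
    return [dL, dB, dI]

# Single bolus of extracellular LDL (the in vitro scenario)
sol = solve_ivp(ldl_rhs, (0.0, 24.0), [1.0, 0.0, 0.0])
print(sol.y[:, -1])  # compartment levels after 24 hours
```

A continuous-supply (in vivo-like) variant would simply add a constant source term to dL.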
Abstract:
Industrial projects are often complex and burdened with time pressures and a lack of information. The term 'soft project' is used here for projects where the 'what' and/or the 'how' is uncertain, which is often the experience in projects involving software-intensive systems development. This thesis intertwines the disciplines of project management and requirements engineering in a goal-oriented application of the maxim 'keep all objectives satisfied', and proposes a method for appraising projects. In this method, a goal-oriented analysis establishes a framework within which expert judgements are collected so as to construct a confidence profile regarding the feasibility and adequacy of the project's planned outputs. It is hoped that this appraisal method will contribute to the activities of project 'shaping' and of aligning stakeholders' expectations, whilst helping project managers appreciate which parts of their project can be progressed and which parts should be held pending further analysis. This thesis offers the following original contributions: an appreciation of appraisal in the project context; a goal-oriented confidence profiling technique; and a technique for producing goal-refinement diagrams, referred to as Goal Sketching. Collectively these amount to a method for the Goal Refinement Appraisal of Soft-Projects (GRASP). The validity of the GRASP method is shown for two projects. In the first, it is used for shaping a business investigation project, in real time within the project. The second case is a retrospective study of an enterprise IT project, which tests the effectiveness of forecasting project difficulty from an initial confidence profile.
Abstract:
Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier, saving both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
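The abstract names the ingredients (confidence factors, an ordinal scale, a goal-oriented framework) without detailing them, so the sketch below is only a guess at the flavor of such a method: four hypothetical factor ratings per goal, propagated up a goal tree by weakest-link aggregation.

```python
# Ordinal confidence scale (illustrative; the paper's scale may differ)
SCALE = {"low": 0, "medium": 1, "high": 2}

def goal_confidence(goal):
    """Confidence in a goal = the weakest of its own factor ratings
    and the confidence of its subgoals (weakest-link propagation)."""
    own = min(SCALE[r] for r in goal["factors"].values())
    subs = [goal_confidence(g) for g in goal.get("subgoals", [])]
    return min([own] + subs)

# Hypothetical goal tree; the factor names are invented for illustration
root = {
    "factors": {"completeness": "high", "stability": "medium",
                "understanding": "high", "agreement": "high"},
    "subgoals": [
        {"factors": {"completeness": "low", "stability": "medium",
                     "understanding": "medium", "agreement": "high"}},
    ],
}
print(goal_confidence(root))  # -> 0 ("low"): not yet safe to proceed
```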
The Joint UK Land Environment Simulator (JULES), model description – part 1: energy and water fluxes
Abstract:
This manuscript describes the energy and water components of a new community land surface model called the Joint UK Land Environment Simulator (JULES), developed from the Met Office Surface Exchange Scheme (MOSES). It can be used as a stand-alone land surface model driven by observed forcing data, or coupled to an atmospheric global circulation model. The JULES model has been coupled to the Met Office Unified Model (UM), and as such provides a unique opportunity for the research community to contribute research that improves both world-leading operational weather forecasting and climate change prediction systems. In addition, JULES and its forerunner MOSES have been the basis for a number of very high-profile papers concerning the land surface and climate over the last decade. JULES has a modular structure aligned to physical processes, providing the basis for a flexible modelling platform.
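For orientation, the "energy and water components" of any such land surface scheme close balances of the general textbook form below; the notation is generic, not JULES-specific.

```latex
% Surface energy balance: net radiation partitioned into
% sensible heat, latent heat and ground heat fluxes
R_n = H + \lambda E + G
% Water balance of the soil column: storage change equals
% precipitation minus evapotranspiration minus runoff
\frac{\mathrm{d}S}{\mathrm{d}t} = P - E - R
```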
Abstract:
Semantic Analysis is a business analysis method designed to capture system requirements. While these requirements may be represented as text, the method also advocates the use of Ontology Charts to formally denote the system's required roles, relationships and forms of communication. Following model-driven engineering techniques, Ontology Charts can be transformed into temporal database schemas, class diagrams and component diagrams, which can then be used to produce software systems. A useful property of these transformations is that the resulting system design models accommodate substantial extensions without requiring changes to the existing design models. For example, the resulting databases can be extended with new types of data without modifying the database schema of the legacy system. Semantic Analysis is not widely used in software engineering, so there is a lack of experts in the field and no design patterns are available. This makes it difficult for analysts to pass organizational knowledge to engineers. This study describes an implementation that is readily usable by engineers, including an automated technique that can produce a prototype from an Ontology Chart. The use of such tools should enable developers to make use of Semantic Analysis with minimal expertise in ontologies and MDA.
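A toy sketch of the kind of transformation involved: in Semantic Analysis, each affordance in an Ontology Chart depends on its antecedents and holds over a period of time, which maps naturally to temporal tables. The chart encoding and the SQL layout below are assumptions for illustration, not the study's implementation.

```python
# Hypothetical encoding of an Ontology Chart: each affordance has a
# name and its antecedents (the things on whose existence it depends).
CHART = [
    {"name": "person",  "antecedents": []},
    {"name": "society", "antecedents": []},
    {"name": "employs", "antecedents": ["society", "person"]},
]

def to_temporal_schema(chart):
    """Emit one temporal table per affordance: antecedent foreign keys
    plus start/finish columns recording when the affordance holds."""
    for node in chart:
        cols = [f"{a}_id INTEGER REFERENCES {a}(id)"
                for a in node["antecedents"]]
        cols += ["starts TIMESTAMP", "finishes TIMESTAMP"]
        yield (f"CREATE TABLE {node['name']} (id INTEGER PRIMARY KEY, "
               + ", ".join(cols) + ");")

for ddl in to_temporal_schema(CHART):
    print(ddl)
```

Because new affordances only add tables that reference existing ones, the legacy schema needs no modification, which mirrors the extension property the abstract describes.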
Abstract:
Determining the provenance of data, i.e. the process that led to that data, is vital in many disciplines. For example, in science, the process that produced a given result must be demonstrably rigorous for the result to be deemed reliable. A provenance system supports applications in recording adequate documentation about process executions to answer queries regarding provenance, and provides functionality to perform those queries. Several provenance systems are being developed, but all focus on systems in which the components are reactive, for example Web Services that act on the basis of a request, job submission systems, etc. This limitation means that questions regarding the motives of autonomous actors, or agents, in such systems remain unanswerable in the general case. Such questions include: who was ultimately responsible for a given effect, what was their reason for initiating the process, and does the effect of a process match what those initiating the process intended to occur? In this paper, we address this limitation by integrating two solutions: a generic, re-usable framework for representing the provenance of data in service-oriented architectures, and a model for describing the goal-oriented delegation and engagement of agents in multi-agent systems. Using these solutions, we present algorithms to answer common questions regarding the responsibility for and success of a process, and we evaluate the approach with a simulated healthcare example.
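One way to picture the responsibility question: if delegations between agents are recorded alongside causal documentation, "who was ultimately responsible" becomes a walk up the delegation chain from the direct cause of an effect. The records below are a hypothetical toy encoding, not the paper's provenance model.

```python
# Hypothetical records: effect -> the actor that directly caused it,
# and actor -> the agent that delegated the goal to that actor.
CAUSED_BY = {"prescription_issued": "doctor_service"}
DELEGATED_BY = {"doctor_service": "doctor", "doctor": "patient"}

def ultimately_responsible(effect):
    """Follow the delegation chain upwards from the direct cause of an
    effect; the agent with no delegator is ultimately responsible."""
    agent = CAUSED_BY[effect]
    while agent in DELEGATED_BY:
        agent = DELEGATED_BY[agent]
    return agent

print(ultimately_responsible("prescription_issued"))  # -> 'patient'
```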
Abstract:
The Open Provenance Architecture (OPA) approach to the challenge was distinct in several regards. In particular, it is based on an open, well-defined data model and architecture, allowing different components of the challenge workflow to independently record documentation, and allowing the workflow to be executed in any environment. Another noticeable feature is that we distinguish between the data recorded about what has occurred, the process documentation, and the provenance of a data item, which is all that caused the data item to be as it is and is obtained as the result of a query over process documentation. This distinction allows us to tailor the system to separately best address the requirements of recording and of querying documentation. Other notable features include the explicit recording of causal relationships between both events and data items, an interaction-based world model, the intensional definition of data items in queries rather than reliance on explicit naming mechanisms, and the styling of documentation to support non-functional application requirements such as reducing storage costs or ensuring privacy of data. In this paper we describe how each of these features aids us in answering the challenge provenance queries.
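The documentation/provenance distinction can be pictured with a toy model: process documentation is a set of recorded causal edges, and the provenance of an item is obtained, at query time, as the transitive closure of its causes. This is a sketch of the concept only, not the OPA data model.

```python
# Process documentation as causal edges: item/event -> its direct causes
DOCUMENTATION = {
    "result": ["analysis_run"],
    "analysis_run": ["raw_data", "config"],
    "raw_data": ["sensor_reading"],
}

def provenance(item):
    """Provenance of an item = everything that transitively caused it,
    obtained as a query over the recorded documentation."""
    seen, stack = set(), [item]
    while stack:
        for cause in DOCUMENTATION.get(stack.pop(), []):
            if cause not in seen:
                seen.add(cause)
                stack.append(cause)
    return seen

print(provenance("result"))
# {'analysis_run', 'raw_data', 'config', 'sensor_reading'}
```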
Abstract:
The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modeled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial, and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design in which the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats where that is the researcher's preferred access arrangement. By decoupling data model and data persistence, it is much easier to swap in, for instance, relational databases to provide stricter provenance and audit trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate; a schema derived from CF conventions has been designed to handle time series for SWIFT efficiently.
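A minimal sketch of the decoupling described: the configuration is a plain data structure, and persistence back ends (JSON for research, a database with an audit trail for operations) are interchangeable behind one interface. Class and field names here are illustrative, not SWIFT's API.

```python
import json
from abc import ABC, abstractmethod

class ConfigStore(ABC):
    """Persistence mechanism, decoupled from the configuration model."""
    @abstractmethod
    def save(self, config: dict, target: str): ...

class JsonStore(ConfigStore):
    """Research-oriented back end: plain JSON on disk."""
    def save(self, config, target):
        with open(target, "w") as f:
            json.dump(config, f, indent=2)

class SqliteStore(ConfigStore):
    """Operational back end with a stricter audit trail (sketch only)."""
    def save(self, config, target):
        import sqlite3
        con = sqlite3.connect(target)
        con.execute("CREATE TABLE IF NOT EXISTS config (key TEXT, value TEXT)")
        con.executemany("INSERT INTO config VALUES (?, ?)",
                        [(k, json.dumps(v)) for k, v in config.items()])
        con.commit()
        con.close()

# The model knows nothing about how it is persisted
model = {"subareas": 12, "routing": "node-link", "timestep_minutes": 60}
JsonStore().save(model, "catchment.json")
```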
Abstract:
What we have come to understand as rational and logical since the modern era provides a mental scheme for taking action that carries a framework of premises and values with it. These rules aim at the utilitarian maximization of consequences, emptied of any subjective value. Weber (1994) classified this scheme as "instrumental rationality", characterized by being oriented toward the ends, means and consequences of action. In contrast, he also defined "substantive rationality", grounded in the values of the subject, which is not guided by any consequences of action. Many authors have built on these rationalities to represent the duality that afflicts the world given the centrality of the market and its instrumental logic, but it was Guerreiro Ramos (1989) who made a forceful contribution to the study of organizations by separating distinct social enclaves in which one or the other rationality would be more appropriate. In this context, the market is an important and legitimate enclave, but one set apart from others in which social relations exist to serve the subject. This work, grounded in Critical Theory, recognizes that NGOs (Non-Governmental Organizations) should belong to a field distinct from that of economic enterprises, because they are based on different rationalities. Field research was carried out with five non-profit organizations with declared social-action purposes (Harmonicanto, Reviverde, ACAM, Observatório de Favelas and Bola pra Frente), seeking to identify the deviating influences that the adoption of instrumental rationality imposes on the achievement of the objectives set for these organizations. It was observed that certain contingencies favor the use of instrumentality in these organizations, such as the need for self-sustainability, the area of activity, the size of the organization, the influence of the leader, etc. It is concluded that such organizations, although not spaces dedicated to the self-actualization of the subject (as defined by Guerreiro Ramos' Isonomy), betray their public purpose and become guided by consequences whenever they absorb, in raw form, the organizational dynamics of an economic enterprise.
Abstract:
This research aimed to explain whether the SEBRAE management model can help regional actors implement the actions arising from the guidelines of the Sustainable Amazon Plan (Plano Amazônia Sustentável, PAS). The literature review covered development and institutions; public policies for development in the Amazon; public management for development and the evolution of its models, with emphasis on the managerial model (results-oriented management); and management technologies. The research is descriptive and exploratory. Data were collected through bibliographic and documentary research (openly accessible sources, both bibliographic and electronic). SEBRAE was presented as a development agent; its management model, called Strategic Results-Oriented Management (Gestão Estratégica Orientada para Resultados, GEOR), was identified, along with its foundational principles and its management technologies. The data were treated using content analysis in a qualitative approach. The analysis showed that, owing to its flexibility, GEOR, with its foundational principles and its management technologies, via its backbone (the results-oriented project and its elaboration methodology), can contribute effectively to carrying out the actions arising from the PAS guidelines, more precisely with respect to productive chains and organized communities, since it is a basic premise of the model and of its project management technology that projects be elaborated with and by a defined target audience. This ensures that the actions to be carried out are structured and contracted: managed, monitored and evaluated by the actors directly interested in the results.
Abstract:
The development of robots has proven to be a very complex interdisciplinary research field. The predominant approach over recent decades has been based on the assumption that each robot is a fully bespoke project, with hardware and software technologies embedded directly in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has also imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at the hardware and software levels, to the expertise of local groups. Major advances could be achieved, for example, if the physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to establish standardization across all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.
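The dissociation between a robot part and its implementing technology can be sketched as an interface plus technology-specific drivers, so that a part can be swapped without touching the rest of the robot. The interface below is hypothetical, for illustration only; it is not the published TORP specification.

```python
from abc import ABC, abstractmethod

class JointActuator(ABC):
    """Technology-agnostic contract for one robot part: any motor
    technology can stand behind it."""
    @abstractmethod
    def set_angle(self, degrees: float): ...
    @abstractmethod
    def read_angle(self) -> float: ...

class HobbyServoActuator(JointActuator):
    """One possible technology binding (sketch; no real bus I/O)."""
    def __init__(self):
        self._angle = 0.0
    def set_angle(self, degrees):
        self._angle = max(-90.0, min(90.0, degrees))  # mechanical limits
    def read_angle(self):
        return self._angle

# A robot assembled against the interface, not the technology
elbow: JointActuator = HobbyServoActuator()
elbow.set_angle(45.0)
print(elbow.read_angle())
```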