881 results for Interaction modeling. Model-based development. Interaction evaluation.
Abstract:
This thesis presents a one-dimensional, semi-empirical dynamic model for the simulation and analysis of a calcium looping process for post-combustion CO2 capture. Reducing greenhouse gas emissions from fossil fuel power production requires rapid action, including the development of efficient carbon capture and sequestration technologies. The development of new carbon capture technologies can be expedited with modelling tools: techno-economic evaluation of new capture processes can be done quickly and cost-effectively with computational models before expensive pilot plants are built. Post-combustion calcium looping is a developing carbon capture process that uses fluidized bed technology with lime as a sorbent. The main objective of this work was to analyse the technological feasibility of the calcium looping process at different scales with a computational model. A one-dimensional dynamic model was applied to the calcium looping process, simulating the behaviour of the interconnected circulating fluidized bed reactors. The model couples fundamental mass and energy balance solvers with semi-empirical models describing solid behaviour in a circulating fluidized bed and the chemical reactions occurring in the calcium loop. In addition, fluidized bed combustion, heat transfer and core-wall layer effects were modelled. The calcium looping model framework was successfully applied to a 30 kWth laboratory-scale unit and a 1.7 MWth pilot-scale unit, and used to design a conceptual 250 MWth industrial-scale unit. Valuable information was gathered on the behaviour of the small-scale laboratory device. In addition, the interconnected behaviour of the pilot plant reactors and the effect of solid fluidization on the thermal and carbon dioxide balances of the system were analysed. The scale-up study provided practical information on the thermal design of an industrial-sized unit, the selection of particle size, and operability in different load scenarios.
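As a hedged illustration of the mass-balance idea mentioned above (not the thesis model, which is one-dimensional and also covers energy balances, combustion and core-wall effects), the sketch below integrates a zero-dimensional carbonator CO2 balance with assumed first-order carbonation kinetics; all parameter values are placeholders.

```python
# Illustrative sketch only: zero-dimensional carbonator CO2 balance with
# assumed first-order carbonation kinetics and assumed parameter values.
from scipy.integrate import solve_ivp

k_carb = 0.35     # assumed apparent carbonation rate constant, 1/s
X_max = 0.30      # assumed maximum CaO conversion of the cycled sorbent
F_CO2_in = 2.0    # assumed molar flow of CO2 fed to the carbonator, mol/s
n_CaO = 500.0     # assumed CaO inventory in the bed, mol

def carbonator(t, y):
    X, captured = y                                       # CaO conversion, cumulative CO2 captured
    rate_kinetic = k_carb * max(X_max - X, 0.0) * n_CaO   # kinetic uptake capacity, mol CO2/s
    capture_rate = min(rate_kinetic, F_CO2_in)            # cannot capture more CO2 than is fed
    return [capture_rate / n_CaO, capture_rate]

t_end = 600.0
sol = solve_ivp(carbonator, (0.0, t_end), [0.0, 0.0], max_step=1.0)
X_final, total_captured = sol.y[0, -1], sol.y[1, -1]
print(f"final CaO conversion: {X_final:.3f}")
print(f"CO2 capture efficiency over the period: {total_captured / (F_CO2_in * t_end):.1%}")
```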
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity, because it requires a great deal of manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or lack it altogether. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
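As a hedged illustration of generating tests from behavioral models (not the thesis toolchain, which works from UML models with requirements traceability), the sketch below derives test sequences from a toy labelled transition system by covering every transition at least once; the login-flow model and all names are assumptions.

```python
# Illustrative sketch only: transition-coverage test generation from a toy
# labelled transition system (an assumed login flow, not a real system model).
from collections import deque

transitions = {
    "Idle":      [("enter_credentials", "Submitted")],
    "Submitted": [("auth_ok", "LoggedIn"), ("auth_fail", "Idle")],
    "LoggedIn":  [("logout", "Idle")],
}

def shortest_path(model, start, goal):
    """Breadth-first search returning a shortest action sequence from start to goal."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        for action, nxt in model.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [action]))
    return None

def transition_coverage_tests(model, initial):
    """One test sequence per transition: a shortest prefix plus the transition itself."""
    tests = []
    for src, edges in model.items():
        for action, dst in edges:
            prefix = shortest_path(model, initial, src)
            if prefix is not None:
                tests.append(prefix + [action])
    return tests

for seq in transition_coverage_tests(transitions, "Idle"):
    print(" -> ".join(seq))
```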
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but they also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but they still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model’s qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation uses a vector autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation towards riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of the thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
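For readers unfamiliar with the Black–Litterman machinery, the sketch below evaluates the standard B–L posterior return formula on illustrative numbers; the covariance matrix, views and confidence levels are assumptions rather than the thesis data, and the VAR forecasting step that would supply the views is omitted.

```python
# Illustrative sketch only: Black-Litterman posterior expected returns,
# blending prior equilibrium returns with one relative view. All numbers assumed.
import numpy as np

Sigma = np.array([[0.0016, 0.0006],      # assumed covariance of two asset classes
                  [0.0006, 0.0025]])
pi = np.array([0.02, 0.035])             # assumed equilibrium (prior) returns
tau = 0.05                               # assumed scaling of prior uncertainty

P = np.array([[1.0, -1.0]])              # one relative view: asset 1 outperforms asset 2...
Q = np.array([0.01])                     # ...by 1 percentage point
Omega = np.array([[0.0004]])             # assumed uncertainty of the view

tau_Sigma_inv = np.linalg.inv(tau * Sigma)
posterior_cov = np.linalg.inv(tau_Sigma_inv + P.T @ np.linalg.inv(Omega) @ P)
mu_bl = posterior_cov @ (tau_Sigma_inv @ pi + P.T @ np.linalg.inv(Omega) @ Q)
print("posterior expected returns:", mu_bl)
```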
Abstract:
Introduction: Although the importance of transferring research findings into practice has been widely demonstrated, this process remains slow and faces several challenges, such as the conceptualization of evidence, the internal and external validity of scientific research, and the high cost of collecting large amounts of patient-centred data. Patients' dental records contain valuable information that would give clinical researchers the opportunity to use a wide range of quantitative or qualitative data. Standardizing the clinical record would make it possible to exchange and reuse data across different research fields. Objectives: The aim of this study was to design a research-oriented patient record for removable prosthodontics at the undergraduate clinic of the Université de Montréal. Methods: This study used action research methods with four sequential steps: problem identification, data collection and interpretation, action planning, and evaluation of the action. Study participants (n=14) included professors, clinical researchers and clinical instructors in the field of removable prosthodontics. Data collection was carried out through a focused and comprehensive literature review on prosthodontic outcomes as well as through focus group discussions and interviews. Qualitative data were analyzed using QDA Miner 3.2.3. Results: The study participants identified several items missing from the current prosthodontics form used at the undergraduate clinic. They shared their ideas for designing a new patient record based on three main objectives: clinical, educational and research objectives. The main topics of interest in removable prosthodontics, the appropriate instruments and the clinical parameters were selected by the research group. These results were integrated into a new form based on this consultation. The relevance of the new form was assessed by the same group of experts and the required modifications were made. The study participants agreed that the action research cycle should be continued in order to assess the feasibility of implementing this modified record in a university setting. Conclusion: This study is a first step toward developing a database in the field of removable prosthodontics. Action research is a useful research method in this process, and academic educators are well placed to conduct this type of research.
Abstract:
The solar and longwave environmental irradiance geometry (SOLWEIG) model simulates spatial variations of 3-D radiation fluxes and mean radiant temperature (T_mrt) as well as shadow patterns in complex urban settings. In this paper, a new vegetation scheme is included in SOLWEIG and evaluated. The new shadow casting algorithm for complex vegetation structures makes it possible to obtain continuous images of shadow patterns and sky view factors taking both buildings and vegetation into account. For the calculation of 3-D radiation fluxes and T_mrt, SOLWEIG requires only a limited number of inputs, such as global shortwave radiation, air temperature, relative humidity, geographical information (latitude, longitude and elevation) and urban geometry represented by high-resolution ground and building digital elevation models (DEMs). Trees and bushes are represented by separate DEMs. The model is evaluated using five days of integral radiation measurements at two sites within a square surrounded by low-rise buildings and vegetation in Göteborg, Sweden (57°N). There is good agreement between modelled and observed values of T_mrt, with an overall correspondence of R² = 0.91 (p < 0.01, RMSE = 3.1 K). A small overestimation of T_mrt is found at locations shadowed by vegetation. Given this good performance, a number of suggestions for future development are identified for applications including human comfort, building design, planning, and evaluation of instrument exposure.
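As a hedged illustration of the shadow-casting idea (not the SOLWEIG implementation), the sketch below casts building shadows on a DEM by repeatedly shifting the surface away from the sun and lowering it according to the solar altitude; the azimuth convention, grid orientation and the edge wrap-around of np.roll are simplifying assumptions.

```python
# Illustrative sketch only: shadow casting on a ground-and-building DEM by
# shifting the surface away from the sun and lowering it with the solar altitude.
import numpy as np

def cast_shadows(dem, azimuth_deg, altitude_deg, pixel_size=1.0, max_steps=100):
    """Return a boolean grid that is True where the surface lies in shadow."""
    az, alt = np.radians(azimuth_deg), np.radians(altitude_deg)
    shadow_volume = dem.copy()
    for step in range(1, max_steps + 1):
        dist = step * pixel_size
        # integer pixel offset pointing away from the sun (azimuth clockwise from north assumed)
        col_shift = int(round(-dist * np.sin(az) / pixel_size))
        row_shift = int(round(dist * np.cos(az) / pixel_size))
        shifted = np.roll(dem, (row_shift, col_shift), axis=(0, 1)) - dist * np.tan(alt)
        shadow_volume = np.maximum(shadow_volume, shifted)
    return shadow_volume > dem + 1e-6

# toy example: a single 10 m "building" on flat ground, sun low in the south-west
dem = np.zeros((50, 50))
dem[20:25, 20:25] = 10.0
shadow = cast_shadows(dem, azimuth_deg=225.0, altitude_deg=20.0)
print("shadowed cells:", int(shadow.sum()))
```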
Abstract:
We describe here the development and evaluation of an Earth system model suitable for centennial-scale climate prediction. The principal new components added to the physical climate model are the terrestrial and ocean ecosystems and gas-phase tropospheric chemistry, along with their coupled interactions. The individual Earth system components are described briefly and the relevant interactions between the components are explained. Because the multiple interactions could lead to unstable feedbacks, we go through a careful process of model spin-up to ensure that all components are stable and the interactions balanced. This spun-up configuration is evaluated against observed data for the Earth system components and is generally found to perform very satisfactorily. The reason for the evaluation phase is that the model is to be used for the core climate simulations carried out by the Met Office Hadley Centre for the Coupled Model Intercomparison Project (CMIP5), so it is essential that the addition of the extra complexity does not detract substantially from its climate performance. Localised changes in some specific meteorological variables can be identified, but the impacts on the overall simulation of present-day climate are slight. This model is proving valuable both for climate predictions and for investigating the strengths of biogeochemical feedbacks.
Abstract:
A pH indicator film based on cassava starch plasticized with sucrose and inverted sugar and incorporating grape and spinach extracts as pH indicator sources (anthocyanin and chlorophyll) has been developed, and its packaging properties have been assessed. A second-order central composite design (2²) with three central points and four star points was used to evaluate the mechanical properties (tensile strength, tensile strength at break, and elongation at break percentage), moisture barrier, and microstructure of the films, as well as their potential as a pH indicator packaging. The films were prepared by the casting technique and conditioned under controlled conditions (75% relative humidity and 23 °C) for at least 4 days before the analyses. The materials were exposed to solutions of different pH (0, 2, 7, 10, and 14) and their color parameters (L*, a*, b*, and haze) were measured by transmittance. The grape and spinach extracts affected the characteristics of the material. Film properties (mechanical properties and moisture barrier) were strongly influenced by extract concentration, showing lower values than for the control. Films containing a higher concentration of grape extract presented a greater color change at different pHs, suggesting that anthocyanins are more effective as pH indicators than chlorophyll or the mixture of both extracts. (C) 2010 Wiley Periodicals, Inc. J Appl Polym Sci 120: 1069-1079, 2011
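As a hedged illustration of the experimental design mentioned above, the sketch below generates the 11 runs of a 2² central composite design with four star points and three centre points in coded units; the factor names and the rotatable alpha = sqrt(2) are assumptions, not the study's actual factor levels.

```python
# Illustrative sketch only: 2^2 central composite design in coded units
# (4 factorial points, 4 star points, 3 centre points; alpha assumed rotatable).
import numpy as np

alpha = np.sqrt(2.0)
factorial = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
star = np.array([[-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha]])
center = np.zeros((3, 2))
design = np.vstack([factorial, star, center])   # columns: grape extract, spinach extract (coded)

for run, (x1, x2) in enumerate(design, start=1):
    print(f"run {run:2d}: grape = {x1:+.3f}, spinach = {x2:+.3f}")
```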
Abstract:
Neonatal anoxia is a worldwide clinical problem that has serious and lasting consequences. The diversity of existing models does not allow complete reproducibility, so a standardized model is needed. In this study, we developed a rat model of neonatal anoxia that utilizes a semi-hermetic system suitable for oxygen deprivation. The validity of this model was confirmed using pulse oximetry, arterial gasometry, observation of skin color and behavior, and analysis of Fos immunoreactivity in brain regions that function in respiratory control. For these experiments, 87 male albino neonate rats (Rattus norvegicus, Wistar lineage) aged approximately 30 postnatal hours were divided into anoxia and control groups. The pups were kept in a polycarbonate euthanasia chamber at 36 ± 1 °C, with continuous 100% nitrogen gas flow at 3 L/min and 101.7 kPa for 25 min. The peripheral arterial oxygen saturation of the anoxia group decreased by 75% from its initial value. Decreased pH and partial pressure of oxygen and increased partial pressure of carbon dioxide were observed in this group, indicating metabolic acidosis, hypoxia and hypercapnia, respectively. Analysis of neuronal activation showed Fos immunoreactivity in the solitary tract nucleus, the lateral reticular nucleus and the area postrema, confirming that these conditions activated areas related to respiratory control in the nervous system. Therefore, the proposed model of neonatal anoxia allows standardization and precise control of the anoxic condition, which should be of great value in identifying both the mechanisms underlying neonatal anoxia and novel therapeutic strategies to combat or prevent this widespread public health problem. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
This paper describes the development of a new approach to the use of ICT for the teaching of courses in the interpretation and evaluation of evidence. It is based on ideas developed for the teaching of science to school children, in particular the importance of models and qualitative reasoning skills. In the first part, we analyse the basis of current research into “evidence scholarship” and the demands such a system would have to meet. In the second part, we introduce the details of such a system, which we developed initially to assist police in the interpretation of evidence.
Abstract:
This thesis develops and evaluates a business model for connected full electric vehicles (FEVs) for the European market. Despite a supportive political environment, various barriers have thus far prevented the FEV from becoming a mass-market vehicle. Besides cost, the most noteworthy of these barriers is range anxiety, a product of FEVs’ limited range, the lacking availability of charging infrastructure, and long recharging times. Connected FEVs, which maintain a constant connection to the surrounding infrastructure, appear to be a promising means of overcoming drivers’ range anxiety. Yet their successful application requires a well-functioning FEV ecosystem, which can only be created through the collaboration of various stakeholders such as original equipment manufacturers (OEMs), first-tier suppliers (FTS), charging infrastructure and service providers (CISPs), utilities, communication enablers, and governments. This thesis explores and evaluates what a business model jointly created by these stakeholders could look like, i.e. how stakeholders could collaborate in the design of products, services, infrastructure, and advanced mobility management to offer drivers a sensible value proposition that is at least equivalent to that of internal combustion engine (ICE) cars. It suggests that this value proposition will be an end-to-end package provided by CISPs or OEMs that comprises mobility packages (incl. pay-per-mile plans, battery leasing, and charging and battery swapping (BS) infrastructure) and FEVs equipped with an on-board unit (OBU), combined with additional services targeted at reducing range anxiety. From a theoretical point of view, the thesis answers the question of which business model framework is suitable for the development of a holistic, i.e. all-stakeholder-comprising, business model for connected FEVs, and it defines such a business model. In doing so the thesis provides the first comprehensive business-model-related research findings on connected FEVs, as prior works focused on the much less complex scenario featuring only “offline” FEVs.
Abstract:
The software development processes proposed by the most recent approaches in Software Engineering make use of models. UML has been proposed as the standard language for modeling. The user interface is an important part of the software and is fundamental to improving its usability. Unfortunately, standard UML does not offer appropriate resources to model user interfaces. Some proposals have already been made to solve this problem: some authors have been using models in the development of interfaces (Model-Based Development) and some extensions to UML have been elaborated. But none of them considers the theoretical perspective of semiotic engineering, which holds that, through the system, the designer should be able to communicate to the user what the user can do and how to use the system itself. This work presents Visual IMML, a UML profile that emphasizes the aspects of semiotic engineering. This profile is based on IMML, a declarative textual language. Visual IMML aims to improve the specification process by using a visual (diagram-based) modeling language. It proposes a new set of modeling elements (stereotypes) specifically designed for the specification and documentation of user interfaces, considering the aspects of communication, interaction and functionality in an integrated manner.
Abstract:
Quark-model descriptions of the nucleon-nucleon interaction contain two main ingredients: a quark-exchange mechanism for the short-range repulsion and meson exchanges for the medium- and long-range parts of the interaction. We point out the special role played by higher partial waves, and in particular the ¹F₃ wave, as a very sensitive probe of the meson-exchange part employed in these interaction models. In particular, we show that the presently available models fail to provide a reasonable description of higher partial waves and we indicate the reasons for this shortcoming.
Abstract:
A fuzzy rule-based system was developed in this study, resulting in an index that indicates the level of uncertainty related to commercial transactions between cassava growers and their dealers. The fuzzy system was developed based on the Transaction Cost Economics approach, from input variables regarding information sharing between grower and dealer on “Demand/Purchase Forecasting”, “Production Forecasting” and “Production Innovation”. The output variable is the level of uncertainty regarding the transaction between seller and buyer, which may serve as a system for detecting inefficiencies. Evidence from 27 cassava growers registered in the Regional Development Offices of Tupa and Assis, São Paulo, Brazil, and 48 of their dealers supported the development of the system. The mathematical model indicated that 55% of the growers present a Very High level of uncertainty and 33% present a Medium or High level; the others present a Low or Very Low level of uncertainty. Using the model, simulations of external interventions can be run in order to reduce the degree of uncertainty and, thus, lower transaction costs.
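As a hedged illustration of how such a rule-based index can be computed (not the paper's membership functions or rule base), the sketch below maps two information-sharing inputs to an uncertainty score with a simplified Mamdani inference and centroid defuzzification; all shapes, rules and ranges are assumptions.

```python
# Illustrative sketch only: a two-input Mamdani-style fuzzy uncertainty index.
# Membership functions, rule base and scales are simplified assumptions.
import numpy as np

def low_mf(x):   # decreasing shoulder on the [0, 10] universe
    return np.clip((5.0 - x) / 5.0, 0.0, 1.0)

def high_mf(x):  # increasing shoulder on the [0, 10] universe
    return np.clip((x - 5.0) / 5.0, 0.0, 1.0)

def uncertainty_index(demand_sharing, production_sharing):
    """Inputs in [0, 10]: 0 = no information shared with the dealer, 10 = full sharing."""
    low_d, high_d = low_mf(demand_sharing), high_mf(demand_sharing)
    low_p, high_p = low_mf(production_sharing), high_mf(production_sharing)

    # assumed rules: poor sharing of either signal -> high uncertainty,
    # good sharing of both -> low uncertainty
    fire_high = max(low_d, low_p)
    fire_low = min(high_d, high_p)

    # Mamdani aggregation and centroid defuzzification on the output universe
    z = np.linspace(0.0, 10.0, 101)
    agg = np.maximum(np.minimum(high_mf(z), fire_high),
                     np.minimum(low_mf(z), fire_low))
    return float(np.sum(z * agg) / (np.sum(agg) + 1e-9))

print(uncertainty_index(demand_sharing=2, production_sharing=3))  # -> a high uncertainty score
```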
Abstract:
In this paper, a modeling technique for small-signal stability assessment of unbalanced power systems is presented. Since power distribution systems are inherently unbalanced, owing to the characteristics of their lines and loads, and the penetration of distributed generation into these systems is increasing, such a tool is needed to ensure their secure and reliable operation. The main contribution of this paper is the development of a phasor-based model for the study of dynamic phenomena in unbalanced power systems. Using an assumption on the net torque of the generator, it is possible to precisely define an equilibrium point for the phasor model of the system, thus enabling its linearization around this point and, consequently, its eigenvalue/eigenvector analysis for small-signal stability assessment. The modeling technique presented here was compared to the dynamic behavior observed in ATP simulations, and the results show that, for the generator and controller models used, the proposed modeling approach is adequate and yields reliable and precise results.
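As a hedged illustration of eigenvalue-based small-signal assessment (not the paper's unbalanced phasor model), the sketch below linearizes a classical single-machine swing equation around its equilibrium and inspects the eigenvalues of the state matrix; all parameter values are assumed.

```python
# Illustrative sketch only: eigenvalues of a linearised single-machine swing
# equation, used to judge small-signal stability. All parameters assumed.
import numpy as np

omega_s = 2 * np.pi * 60   # synchronous speed, rad/s (60 Hz system assumed)
H = 3.5                    # assumed inertia constant, s
D = 2.0                    # assumed damping coefficient, pu
Ks = 1.2                   # assumed synchronising torque coefficient, pu/rad

# Linearised state matrix for x = [rotor angle deviation, speed deviation]
A = np.array([[0.0,           omega_s],
              [-Ks / (2 * H), -D / (2 * H)]])

eigvals = np.linalg.eigvals(A)
for lam in eigvals:
    lam = complex(lam)                       # plain Python complex for formatting
    freq_hz = abs(lam.imag) / (2 * np.pi)    # oscillation frequency of the mode
    zeta = -lam.real / abs(lam)              # damping ratio
    print(f"mode {lam.real:.3f} {lam.imag:+.3f}j: f = {freq_hz:.2f} Hz, damping = {zeta:.3f}")
print("small-signal stable:", all(np.real(eigvals) < 0))
```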