983 results for modeling tools


Relevance: 30.00%

Abstract:

This tutorial is intended to be a "quick start" to creating simulations with GENESIS. It should give you the tools and enough information to quickly begin creating cells and networks, making use of the provided example simulations. Advanced topics are covered by appropriate links to the Advanced Tutorials on Realistic Neural Modeling.
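
To make the idea concrete (GENESIS itself is scripted in its own language, which is not reproduced here), the following Python sketch integrates the passive single-compartment membrane equation that such a tutorial's first cell model is built on; all parameter values are illustrative placeholders.

```python
import numpy as np

# Passive single-compartment membrane: Cm * dV/dt = (Em - V)/Rm + I_inject
# Parameter values are arbitrary placeholders, not GENESIS defaults.
Cm = 1e-10        # membrane capacitance (F)
Rm = 1e8          # membrane resistance (ohm)
Em = -0.07        # resting potential (V)
I_inject = 3e-10  # injected current (A)
dt, t_stop = 1e-5, 0.1  # time step and duration (s)

V = Em
trace = []
for _ in range(int(t_stop / dt)):
    V += ((Em - V) / Rm + I_inject) * dt / Cm  # forward-Euler step
    trace.append(V)

print(f"steady-state estimate: {trace[-1]:.4f} V "
      f"(analytic: {Em + I_inject * Rm:.4f} V)")
```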

Relevance: 30.00%

Abstract:

Mixed Reality (MR) aims to link virtual entities with the real world and has many applications in domains such as the military and medicine [JBL+00, NFB07]. In many MR systems, and more precisely in augmented scenes, the application must render the virtual part accurately and at the right time. To achieve this, such systems acquire data related to the real world from a set of sensors before rendering the virtual entities. A suitable system architecture should minimize delays so as to keep the overall system delay (also called end-to-end latency) within the requirements for real-time performance. In this context, we propose a compositional modeling framework for MR software architectures in order to specify, simulate, and formally validate the time constraints of such systems. Our approach is based first on a functional decomposition of such systems into generic components. The resulting elements, as well as their typical interactions, give rise to generic representations in terms of timed automata. A whole system is then obtained as a composition of these components. To write specifications, a textual language named MIRELA (MIxed REality LAnguage) is proposed, along with the corresponding compilation tools. The generated output contains timed automata in UPPAAL format for simulation and verification of time constraints. These automata may also be used to generate source-code skeletons for an implementation on an MR platform. The approach is illustrated first on a small example. A realistic case study is then developed; it is modeled by several timed automata synchronizing through channels and includes a large number of time constraints. Both systems have been simulated in UPPAAL and checked against the required behavioral properties.
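
As a rough illustration of the end-to-end latency reasoning the framework formalizes with timed automata (this is not MIRELA or UPPAAL syntax; component names and bounds are hypothetical), a sequential composition of latency intervals can be checked against a real-time deadline:

```python
# Toy compositional worst-case latency check (hypothetical bounds).
# Each component contributes a [best, worst] latency in milliseconds.
PIPELINE = {
    "sensor_acquisition": (2, 8),
    "tracking_fusion":    (1, 6),
    "scene_registration": (3, 10),
    "rendering":          (5, 12),
}

def sequential_bounds(components):
    """Sequential composition: interval bounds add up."""
    best = sum(lo for lo, hi in components.values())
    worst = sum(hi for lo, hi in components.values())
    return best, worst

DEADLINE_MS = 33  # e.g. one frame at roughly 30 Hz

best, worst = sequential_bounds(PIPELINE)
print(f"end-to-end latency in [{best}, {worst}] ms")
print("meets deadline" if worst <= DEADLINE_MS else "may violate deadline")
```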

Relevance: 30.00%

Abstract:

Despite the broad range of collaboration tools already available, enterprises continue to look for ways to improve internal and external communication. Microblogging is one such new communication channel, with considerable potential to improve intra-firm transparency and knowledge sharing. However, the adoption of such social software presents certain challenges to enterprises. Based on the results of four focus group sessions, we identified several new constructs that play an important role in the microblogging adoption decision. Examples include privacy concerns, communication benefits, perceptions regarding signal-to-noise ratio, as well as codification effort. Integrating these findings with common views on technology acceptance, we formulate a model to predict the adoption of a microblogging system in the workplace. Our findings serve as an important guideline for managers seeking to realize the potential of microblogging in their company.
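
The adoption model itself is not specified in this abstract; as a hedged sketch, the identified constructs could enter a logistic intention score along the following lines, with placeholder weights rather than estimates from the study:

```python
import math

# Hypothetical operationalization: focus-group constructs feed a logistic
# adoption-intention score. Weights are illustrative placeholders only.
WEIGHTS = {
    "communication_benefits": 0.9,
    "signal_to_noise":        0.6,
    "privacy_concerns":      -0.8,
    "codification_effort":   -0.5,
}

def adoption_probability(scores, intercept=-0.2):
    """Logistic adoption intention from centered construct scores."""
    z = intercept + sum(WEIGHTS[k] * v for k, v in scores.items())
    return 1.0 / (1.0 + math.exp(-z))

example = {"communication_benefits": 1.5, "signal_to_noise": 0.5,
           "privacy_concerns": 1.0, "codification_effort": 0.2}
print(f"predicted adoption probability: {adoption_probability(example):.2f}")
```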

Relevance: 30.00%

Abstract:

Ore-forming and geoenvironmental systems commonly involve coupled fluid-flow and chemical-reaction processes. Advanced numerical methods and computational modeling have become indispensable tools for simulating such processes in recent years, enabling many hitherto unsolvable geoscience problems to be addressed. For example, computational modeling has been successfully used to solve ore-forming and mine-site contamination/remediation problems, in which fluid flow and geochemical processes play important roles in the controlling dynamic mechanisms. The main purpose of this paper is to present a generalized overview of: (1) the various classes and models associated with fluid-flow/chemically-reacting systems, in order to highlight possible opportunities and developments for the future; (2) some more general issues that need attention in the development of computational models and codes for simulating ore-forming and geoenvironmental systems; (3) the progress achieved in geochemical modeling over the past 50 years or so; (4) the general methodology for modeling ore-forming and geoenvironmental systems; and (5) future development directions associated with modeling such systems.
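
As a minimal illustration of the coupled transport/reaction systems discussed here (a generic sketch, not any of the paper's models), a 1D advection-dispersion-reaction equation can be stepped explicitly:

```python
import numpy as np

# dC/dt = -v dC/dx + D d2C/dx2 - k C
# All parameters are illustrative, not calibrated to any ore system.
nx, dx = 200, 1.0          # grid
v, D, k = 0.5, 1.0, 0.01   # velocity, dispersion, first-order reaction rate
dt = 0.4 * min(dx / v, dx**2 / (2 * D))  # conservative explicit time step

C = np.zeros(nx)
C[0] = 1.0                 # fixed-concentration inlet boundary
for _ in range(2000):
    adv = -v * (C[1:-1] - C[:-2]) / dx                  # upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2   # centered dispersion
    C[1:-1] += dt * (adv + disp - k * C[1:-1])
    C[-1] = C[-2]          # open outflow boundary

print(f"concentration drops below 0.5 at x ~ {np.argmax(C < 0.5) * dx:.0f} m")
```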

Relevance: 30.00%

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU," lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrest, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables.

Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature.

In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data are represented by the standard one-value-per-variable paradigm and are widely employed in a host of clinical models and tools; they are often represented by a number in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to time series data elements. The first of these is the raw data elements: multiple values per variable, constituting the measured observations that are typically available to end users when they review time series data, often represented as dots on a graph. The final class of data results from performing time series analysis, and represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood of producing a representation of the time series data elements that is able to distinguish between two or more classes of outcomes.
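
As a hedged sketch of turning raw time series into a latent trend feature (one plausible operation; the manuscripts do not prescribe this exact one), a least-squares slope can be computed over trailing windows:

```python
import numpy as np

def trend_features(values, window):
    """Slope of a least-squares line over each trailing window:
    one simple way to turn a raw series into a latent 'trend' feature."""
    slopes = []
    t = np.arange(window)
    for end in range(window, len(values) + 1):
        seg = values[end - window:end]
        slope, _ = np.polyfit(t, seg, 1)  # degree-1 fit: slope, intercept
        slopes.append(slope)
    return np.array(slopes)

# Illustrative vital-sign series (placeholder numbers, not patient data):
hr = np.array([118, 117, 119, 116, 113, 109, 104, 97, 88, 76], float)
print(trend_features(hr, window=5))  # increasingly negative slopes flag deterioration
```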
The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU," provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time-series-based models are infeasible due to the relatively large number of data elements and the complexity of the preprocessing that must occur before data can be presented to the model. Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies for each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances.

The final manuscript, entitled "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit," presents the results obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the receiver operating characteristic curve increased from a baseline of 87% to 98% when the trend analysis was included. In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy compared to the baseline multivariate model, but diminished classification accuracy compared to adding just the trend-analysis features (i.e., without the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature-reduction strategies to counteract the overfitting problem, they failed to improve performance beyond what was achieved by excluding the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
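
Two of the preprocessing issues named above, reference-time alignment and imputation onto a fixed grid, can be sketched with pandas (timestamps and values are placeholders, not ICU data):

```python
import pandas as pd

# Align irregular observations to a reference time (here, the event time)
# and impute onto a fixed 5-minute grid, as the design phase would specify.
obs = pd.Series(
    [98, 97, 95, 90],
    index=pd.to_datetime(["2024-01-01 10:03", "2024-01-01 10:11",
                          "2024-01-01 10:22", "2024-01-01 10:41"]),
)
reference_time = pd.Timestamp("2024-01-01 10:45")  # e.g. time of arrest

# Fixed 5-minute resolution over the hour preceding the reference time.
grid = pd.date_range(end=reference_time, periods=12, freq="5min")
aligned = obs.reindex(obs.index.union(grid)).ffill().reindex(grid)
# Re-express the index as minutes relative to the reference time.
aligned.index = (aligned.index - reference_time).total_seconds() / 60
print(aligned)  # leading NaNs show where no observation precedes the grid point
```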

Relevance: 30.00%

Abstract:

In this paper we present a tool to perform guided HAZOP studies using a functional modeling framework: D-higraphs. This formalism gathers, in a single model, structural (ontological) and functional information about the process considered. The approach is applied to an industrial case, showing that the proposed methodology fits its purposes and fills some of the gaps and drawbacks of previously reported HAZOP assistant tools.
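
The deviation-enumeration step that a guided HAZOP assistant automates can be sketched as a guide-word/variable cross product (node and variable names are hypothetical, not from the industrial case):

```python
from itertools import product

# Cross standard HAZOP guide words with the process variables of each node.
GUIDE_WORDS = ["NO", "MORE", "LESS", "REVERSE", "AS WELL AS", "OTHER THAN"]
NODE_VARIABLES = {
    "feed_pump": ["flow", "pressure"],
    "reactor":   ["temperature", "level"],
}

for node, variables in NODE_VARIABLES.items():
    for variable, guide in product(variables, GUIDE_WORDS):
        print(f"{node}: {guide} {variable}")  # e.g. 'reactor: MORE temperature'
```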

Relevance: 30.00%

Abstract:

The Atomic Physics Group at the Institute of Nuclear Fusion (DENIM) in Spain has accumulated experience over the years in developing a collection of computational models and tools for determining relevant microscopic properties of, mainly, ICF and laser-produced plasmas under a variety of conditions. In this work, several applications of those models are presented.

Relevance: 30.00%

Abstract:

Nowadays, Software Product Line (SPL) engineering [1] has been widely adopted in software development due to the significant improvements it provides, such as reduced cost and time-to-market and the flexibility to respond to planned changes [2]. SPL takes advantage of common features among the products of a family through the systematic reuse of core assets and the effective management of variability across the products. SPL features are realized at the architectural level in product-line architecture (PLA) models. Therefore, suitable modeling and specification techniques are required to model variability. In fact, architectural variability modeling has become a challenge for SPL engineering, because PLA modeling requires modeling variability not only at the level of the external architecture configuration (see the literature reviews [3,4]) but also at the level of the internal specification of components [5]. In addition, PLA modeling requires preserving the traceability between features and PLAs. Finally, PLA modeling should guide architects in modeling the PLA core assets and variability, and in deriving the customized products. To address these needs, this demonstration presents the FPLA Modeling Framework.
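
As a toy sketch of the variability management described above (hypothetical feature names, not FPLA's notation), a feature model with requires/excludes constraints can validate a product configuration during derivation:

```python
# Feature model with requires/excludes constraints and a configuration check.
FEATURES = {"core", "gui", "cli", "encryption", "cloud_sync"}
MANDATORY = {"core"}
REQUIRES = {"cloud_sync": {"encryption"}}   # cloud_sync needs encryption
EXCLUDES = {("gui", "cli")}                 # alternative front ends

def valid_product(selection):
    """True if the selected feature set yields a derivable product."""
    if not MANDATORY <= selection or not selection <= FEATURES:
        return False
    if any(not deps <= selection for f, deps in REQUIRES.items() if f in selection):
        return False
    return all(not {a, b} <= selection for a, b in EXCLUDES)

print(valid_product({"core", "gui", "cloud_sync", "encryption"}))  # True
print(valid_product({"core", "gui", "cli"}))                       # False: gui excludes cli
```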

Relevance: 30.00%

Abstract:

The SESAR (Single European Sky ATM Research) program is an ambitious research and development initiative to design the future European air traffic management (ATM) system. The study of the behavior of ATM systems using agent-based modeling and simulation tools can help the development of new methods to improve their performance. This paper presents an overview of existing agent-based approaches in air transportation (paying special attention to the challenges that exist for the design of future ATM systems) and, subsequently, describes a new agent-based approach that we proposed in the CASSIOPEIA project, which was developed according to the goals of the SESAR program. In our approach, we use agent models for different ATM stakeholders, and, in contrast to previous work, our solution models new collaborative decision processes for flow traffic management, it uses an intermediate level of abstraction (useful for simulations at larger scales), and was designed to be a practical tool (open and reusable) for the development of different ATM studies. It was successfully applied in three studies related to the design of future ATM systems in Europe.
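
As a toy illustration of the flow-management interactions such agent-based studies simulate (not CASSIOPEIA's actual models; flights and capacities are invented), flights can request entry slots from a network-manager agent that enforces sector capacity by ground delay:

```python
from collections import defaultdict

# Flights request a 10-minute entry slot; the network manager shifts them
# to later slots whenever a slot is already at sector capacity.
SECTOR_CAPACITY = 2          # flights per 10-minute slot
requests = [("AF101", 0), ("BA202", 0), ("IB303", 0), ("LH404", 1)]

occupancy = defaultdict(int)
for flight, wanted in sorted(requests, key=lambda r: r[1]):
    slot = wanted
    while occupancy[slot] >= SECTOR_CAPACITY:
        slot += 1            # ground delay: shift to the next free slot
    occupancy[slot] += 1
    print(f"{flight}: slot {slot} (delay {(slot - wanted) * 10} min)")
```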

Relevance: 30.00%

Abstract:

Modeling is the process of idealizing real-world situations through simplifications in order to obtain a model. However, model estimations carry uncertainties that must be formally evaluated. Sensitivity analysis (SA) apportions model output uncertainty to the model inputs and can increase confidence in a model; however, it is often omitted in modeling, usually because of the effort it involves. In addition, the balance between accuracy and simplicity is not easy to assess: once a model has been developed, it is necessary to test it in order to understand its behavior and to include, if necessary, more complexity to get a better response.

Ecosystem services are the conditions and processes through which natural ecosystems, and their constituent species, sustain and fulfill human life. The relevance of ecosystem services, and the need to better manage them and their associated benefits, have stimulated the emergence of models and tools to measure them. InVEST (Integrated Valuation of Ecosystem Services and Tradeoffs) is one of these ecosystem-service-specific tools, developed by the Natural Capital Project (Stanford University, USA). As a result of the growing interest in measuring ecosystem services, the use of InVEST is expected to grow in the coming years. Beyond model development, however, modeling involves other crucial stages, such as evaluation and application, in order to validate estimations. The research developed in this thesis addresses this phase of the modeling process in two ways. The first is a sensitivity analysis of the model, conducted by choosing and applying a suitable methodology in a specific watershed and analyzing the results obtained. The second concerns the in-stream processes that the current model does not represent, and consists of creating and applying a methodology for testing the role of streams in the InVEST nutrient retention model in a case study.

The results of this thesis will contribute to understanding the uncertainty involved in the modeling process. They also illustrate the need to check the behavior of a model before using it, and the importance of understanding that behavior when interpreting results in light of uncertainty. The work in this thesis will help improve the InVEST platform, an important tool in the field of ecosystem services, and will benefit future users of the tool, whether researchers (in future research) or practitioners (in future ecosystem-management and decision-making work).
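
As a hedged sketch of a one-at-a-time sensitivity analysis of the kind applied here (the model function below is a generic stand-in, not InVEST's nutrient retention model):

```python
import numpy as np

# Perturb each input around a baseline and compare the relative change in
# output: a one-at-a-time (OAT) sensitivity index. Parameters are invented.
def model(p):
    return p["load"] * (1.0 - p["retention_eff"]) * np.exp(-p["decay"])

baseline = {"load": 100.0, "retention_eff": 0.4, "decay": 0.1}
y0 = model(baseline)

for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] *= 1.10                      # +10%, one input at a time
    s = ((model(perturbed) - y0) / y0) / 0.10    # normalized sensitivity index
    print(f"{name}: sensitivity ~ {s:+.2f}")
```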

Relevance: 30.00%

Abstract:

This article presents the first musculoskeletal model and simulation of an upper brachial plexus injury. From this model it is possible to analyse forces and ranges of movement in order to develop a robotic exoskeleton to improve rehabilitation. The software currently available for musculoskeletal modeling is varied, and most packages have advanced features for proper analysis and study of motion simulations. While the more powerful packages are usually expensive, free and open-source packages are also available which offer tools to perform animations and simulations and to obtain forces and moments of inertia. Among them, Musculoskeletal Modeling Software was selected to construct a model of the upper limb with 7 degrees of freedom and 10 muscles. These muscles are important for two of the movements simulated in this article, which are part of the post-surgery rehabilitation protocol. We performed several movement animations using an inertial measurement unit to capture real movement data from a human subject. We also simulated the forces produced in elbow flexion-extension and arm abduction-adduction for a healthy subject and for one with an upper brachial plexus injury in a postoperative state, to compare the force that can be produced in each case.
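
As a rough static sketch of the elbow flexion comparison (anthropometric and strength values are placeholders, not outputs of the article's model):

```python
import math

# Gravitational torque about the elbow versus available flexor torque.
forearm_mass  = 1.5   # kg
com_distance  = 0.15  # m, forearm center of mass from the elbow
hand_load     = 1.0   # kg held in the hand
hand_distance = 0.35  # m
g = 9.81

def gravity_torque(angle_deg):
    """Torque (N*m) the flexors must balance at a given elbow angle
    (0 deg = forearm horizontal, where the demand is largest)."""
    c = math.cos(math.radians(angle_deg))
    return g * c * (forearm_mass * com_distance + hand_load * hand_distance)

MAX_FLEXOR_TORQUE = {"healthy": 40.0, "upper_plexus_injury": 12.0}  # placeholders

for subject, capacity in MAX_FLEXOR_TORQUE.items():
    demand = gravity_torque(0)
    print(f"{subject}: demand {demand:.1f} N*m, capacity {capacity:.1f} N*m, "
          f"reserve {capacity - demand:+.1f}")
```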

Relevance: 30.00%

Abstract:

The Denver metropolitan area is facing rapid population growth that increases the stress on already limited resources. Research and advanced computer modeling show that trees, especially those in urban areas, have significant environmental benefits, including air-quality improvements, energy savings, greenhouse gas reduction, and possible water conservation. This Capstone Project applies statistical methods to analyze a small data set of residential homes and their energy and water consumption as a function of their individual landscapes. Results indicate that tree shade can influence water conservation and that irrigation method can be an influential factor as well. The Capstone is a preliminary analysis for a future study to be performed by the Institute for Environmental Solutions in 2007.
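
The regression underlying such an analysis can be sketched as follows (all numbers are placeholders for illustration, not the project's data):

```python
import numpy as np

# Ordinary least squares: water use as a function of tree shade and
# irrigation method. Values below are invented for illustration only.
shade_pct   = np.array([5, 10, 20, 35, 50, 60], float)   # canopy shade (%)
drip_system = np.array([0, 0, 1, 0, 1, 1], float)        # 1 = drip irrigation
water_kgal  = np.array([92, 88, 70, 74, 55, 50], float)  # seasonal water use

X = np.column_stack([np.ones_like(shade_pct), shade_pct, drip_system])
coef, *_ = np.linalg.lstsq(X, water_kgal, rcond=None)
print(f"intercept {coef[0]:.1f}, per %-shade {coef[1]:+.2f}, "
      f"drip effect {coef[2]:+.1f}")
```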

Relevance: 30.00%

Abstract:

The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups have contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of the mammalian cell cycle.
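
The basic computation behind synchronous attractor analysis can be sketched by exhaustive state-space enumeration of a toy Boolean network (the update rules are illustrative, not a published model):

```python
from itertools import product

# Toy 3-component Boolean network under synchronous updating.
def update(state):
    a, b, c = state
    return (b and not c,   # A activated by B, inhibited by C
            a,             # B activated by A
            a or c)        # C self-sustaining, activated by A

attractors = set()
for start in product([False, True], repeat=3):
    seen, state = {}, start
    while state not in seen:       # follow the trajectory until it revisits
        seen[state] = len(seen)
        state = update(state)
    cycle_start = seen[state]      # states from here on form the attractor
    cycle = tuple(sorted(s for s, i in seen.items() if i >= cycle_start))
    attractors.add(cycle)

for att in attractors:
    print("attractor:", att)
```

Exhaustive enumeration is exponential in the number of components, which is precisely why the surveyed methods for large networks rely on reduction and symbolic techniques instead.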

Relevance: 30.00%

Abstract:

The flanks of an oil-bearing structure were investigated, in preparation for a field equity redetermination, to determine the most likely reservoir geometry in an area where the seismic path forks. Two alternative hypotheses were evaluated: a "high fork" model, in which the reservoir top follows the higher of the two paths, and a "low fork" model, in which the reservoir follows the lower path. I took four approaches to evaluate the hypotheses: 1) depth conversion with multiple velocity models, to evaluate the fidelity of the picked horizon on models that did not contain a fork; 2) hand interpretation around the areas of high uncertainty, to eliminate their influence; 3) assessment of the effect of path choice on the plausibility of the environment of deposition; and 4) subsurface geometry modeling with synthetics, to compare calculated 1D seismic responses with the current data. The investigation established that neither fork interpretation can follow a continuous seismic reflector, but that the two are otherwise equally plausible. Interval modeling revealed several structural scenarios, supporting both the high and the low fork, that fit the seismic data. To support the lower-fork argument, a scenario with an additional sand interval off-structure is recommended for simplicity and reasonability.
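
The 1D synthetic-response step can be sketched as reflectivity from acoustic impedance contrasts convolved with a Ricker wavelet (layer values are illustrative placeholders, not the field's well data):

```python
import numpy as np

# Build a reflectivity series from layer impedances and convolve with a
# Ricker wavelet to get a 1D synthetic trace.
dt = 0.002                                          # sample interval (s)
impedance = np.concatenate([np.full(100, 6.0e6),    # overburden
                            np.full(40, 8.5e6),     # reservoir sand
                            np.full(100, 7.0e6)])   # underlying shale

refl = np.zeros_like(impedance)
refl[1:] = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

def ricker(freq, dt, half_len=0.064):
    """Zero-phase Ricker wavelet of dominant frequency `freq` (Hz)."""
    t = np.arange(-half_len, half_len + dt, dt)
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

synthetic = np.convolve(refl, ricker(30.0, dt), mode="same")
print(f"peak reflection amplitude: {np.abs(synthetic).max():.3f}")
```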