860 results for Model development guidelines
Abstract:
Modeling is the process of idealizing real-world situations through simplifications in order to obtain a model. However, model estimates carry uncertainties that must be evaluated formally. Sensitivity analysis (SA) apportions uncertainty in model output to the model inputs and can increase confidence in a model; nevertheless, this step is often omitted from modeling work, usually because of the effort such an analysis involves. In addition, the balance between accuracy and simplicity is not easy to strike: a model should give results that are as accurate as possible while remaining as simple as possible. For this reason, once a model has been developed, it is essential to test it in order to understand its behavior and to check whether processes initially left out need to be included to obtain a better response. Ecosystem services are the conditions and processes through which natural ecosystems, and their constituent species, sustain and fulfill human life. The relevance of ecosystem services, and the need to better manage them and their associated benefits, have stimulated the emergence of models and tools to quantify them. InVEST (Integrated Valuation of Ecosystem Services and Tradeoffs) is one such ecosystem-services tool, developed by the Natural Capital Project (Stanford University, USA). Given the growing interest in quantifying ecosystem services, the use of InVEST is anticipated to grow substantially in the coming years. Beyond model development, however, modeling involves other crucial stages, such as evaluation and application, that are needed to validate estimates. The research developed in this thesis addresses these later stages in two ways. The first is a sensitivity analysis of the model: selecting the most suitable methodology, applying it in a specific watershed, and analyzing the results obtained. The second concerns the in-stream processes not represented in the current model: a methodology is created and applied to test the role these processes play in the InVEST nutrient retention model in a case study. The results of this thesis contribute to understanding the uncertainties involved in the modeling process. They also illustrate the need to check the behavior of every model before putting it into production, and the importance of understanding that behavior when interpreting the results obtained in light of uncertainty. The work in this thesis will help improve the InVEST platform, an important tool in the field of ecosystem services, and will benefit future users of the tool, whether researchers (in future research) or practitioners (in future ecosystem management and decision-making).
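To make the sensitivity-analysis step concrete, here is a minimal one-at-a-time (OAT) sketch in Python; the model function, parameter names, and ranges are placeholders, not the thesis's actual InVEST setup or its chosen SA methodology:

```python
import numpy as np

def model(params):
    """Placeholder for one model run (e.g., an InVEST nutrient retention run)."""
    x1, x2, x3 = params
    return x1 * np.exp(0.5 * x2) + 0.1 * x3

baseline = np.array([1.0, 0.5, 2.0])   # nominal parameter values (illustrative)
perturbation = 0.10                    # +10% one-at-a-time change

y0 = model(baseline)
for i, name in enumerate(["x1", "x2", "x3"]):
    p = baseline.copy()
    p[i] *= 1.0 + perturbation
    # Normalized sensitivity: relative output change per relative input change
    s = ((model(p) - y0) / y0) / perturbation
    print(f"{name}: normalized sensitivity = {s:.3f}")
```

Ranking parameters by such indices (or by the variance-based measures a fuller SA would compute) shows which inputs dominate output uncertainty and therefore deserve the most careful estimation.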
Abstract:
Today a large share of industries use or develop platforms that integrate an ever larger number of complex systems. Centralized maintenance optimizes the maintenance of these platforms by integrating a system in charge of managing the maintenance of all the systems on the platform. This Master's Thesis (Trabajo Fin de Máster, TFM) develops the concept of centralized maintenance for complex systems, applicable to platforms composed of modular systems. It is motivated by the growing demand in the industries that use this type of platform, such as the aeronautics, railway, and automotive industries. To this end, the thesis reviews the state of the art of centralized maintenance systems in different industries, and describes the different types of system architectures, the applicable maintenance techniques, and the maintenance systems and techniques based on monitoring and self-diagnosis functions known as Built-In Test Equipment (BITE). Additionally, the thesis includes the development and implementation of a model of a Centralized Maintenance Environment in LabVIEW. This environment consists of a model of a Standard System, a model of the Centralized Maintenance System, and the interfaces between them. The Centralized Maintenance System model integrates different functions for fault diagnosis and isolation, as well as a function for the statistical analysis of the failure data stored by the system itself, with the aim of providing the systems in the environment with predictive maintenance capabilities. Among other resources, the implementation of the Centralized Maintenance Environment model uses TCP/IP communications, data modeling and storage in XML files, and automatic generation of HTML reports.
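As a rough illustration of two of the resources named above, XML storage of failure data and automatic HTML reporting, here is a minimal Python sketch (the thesis implements these in LabVIEW; the record fields and file name are assumptions for illustration):

```python
# Store fault records in an XML file and render them as an HTML report.
import xml.etree.ElementTree as ET

def store_fault(path, system_id, code, description):
    """Append a fault record to an XML log, creating the file if needed."""
    try:
        tree = ET.parse(path)
        root = tree.getroot()
    except FileNotFoundError:
        root = ET.Element("faults")
        tree = ET.ElementTree(root)
    fault = ET.SubElement(root, "fault", system=system_id, code=code)
    fault.text = description
    tree.write(path, encoding="unicode")

def html_report(path):
    """Render the stored fault log as a simple HTML table."""
    root = ET.parse(path).getroot()
    rows = "".join(
        f"<tr><td>{f.get('system')}</td><td>{f.get('code')}</td>"
        f"<td>{f.text}</td></tr>"
        for f in root.findall("fault")
    )
    return ("<table><tr><th>System</th><th>Code</th><th>Description</th></tr>"
            f"{rows}</table>")

store_fault("faults.xml", "SYS-1", "F042", "Sensor out of range")
print(html_report("faults.xml"))
```

A statistical layer like the one the thesis describes would then read the accumulated XML log to estimate failure rates per system and flag candidates for predictive maintenance.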
Abstract:
Wireless communications have deeply transformed the way people communicate on a daily basis and are undoubtedly one of the most rapidly evolving technologies of our time. This fast growth poses huge challenges for the underlying technology, owing, among other reasons, to the high capacity demanded by new wireless services. Multiple Input Multiple Output (MIMO) systems have attracted considerable interest as a means of enhancing overall system performance, thus satisfying these new requirements to some extent; indeed, the significant role of this technology in current international standardization efforts highlights its usefulness. MIMO systems exploit the spatial degrees of freedom available in the multipath environment to improve communication performance with remarkable spectral efficiency. To achieve this improvement, spatial and pattern diversity have traditionally been used to decrease the correlation between antenna elements, since low correlation is a necessary, though not sufficient, condition. Taking the techniques used to obtain pattern diversity as a starting point, this doctoral thesis pursues pattern diversity and/or spatial multiplexing through the multimode behavior of microstrip antennas, proposing an original quasi-analytical model for the analysis and design of reconfigurable multimode multiport microstrip antennas. This novel approach, instead of resorting to full-wave simulations with commercial tools as in the existing literature, significantly reduces the overall analysis and design effort, in the latter case through comprehensive design guidelines. To achieve this goal, and after a review of the main MIMO concepts used later, attention is first focused on finding, implementing, and verifying the correctness and accuracy of a base analytical model onto which the necessary enhancements can be added to obtain the desired features of the proposed quasi-analytical model. Next, starting from the selected base model, the pattern diversity and spatial multiplexing performance provided by the multimode behavior of unloaded microstrip patch antennas is fully explored in different multipath environments. Since each cavity mode has its own resonant frequency, ways must be found to shift the resonant frequency of each mode used so that all lie in the same frequency band, while keeping the modes as independent as possible. This can be accomplished by loading the cavity appropriately with reactive loads, or by altering the geometry of the radiating patch. Attention therefore turns to the design, implementation, and verification of a quasi-analytical model for the analysis of loaded multimode multiport microstrip patch antennas that can carry out this task, which is one of the main contributions of this thesis. Finally, based on the knowledge acquired through the quasi-analytical model, comprehensive guidelines for the design of reconfigurable multimode multiport microstrip antennas for MIMO systems are given and applied, with the aim of improving pattern diversity and/or capacity by means of the multimode behavior of microstrip patch antennas. It should be highlighted that the work presented in this thesis has given rise to a publication in an international technical journal with a high impact factor, and has also been presented at some of the most important international conferences in the antennas field.
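For orientation, the resonant frequencies at the heart of the multimode discussion follow, in the standard cavity model for a rectangular patch (a textbook estimate, not the thesis's enhanced quasi-analytical model), from the patch dimensions and substrate permittivity:

```latex
% Standard cavity-model estimate for the TM_{mn} resonance of a
% rectangular microstrip patch (length L, width W, relative
% permittivity \epsilon_r); c is the speed of light in vacuum.
f_{mn} \approx \frac{c}{2\sqrt{\epsilon_r}}
       \sqrt{\left(\frac{m}{L}\right)^{2} + \left(\frac{n}{W}\right)^{2}}
```

Shifting the f_{mn} of the modes in use into a common band, as the abstract describes, amounts to perturbing this relation through reactive loading of the cavity or changes to the patch geometry.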
Abstract:
The expansion of computing, new technologies, and the Internet in recent years is driven not only by the evolution of the underlying hardware but also by the evolution of software development and the growing number of developers. This growth has taken software from file-based management systems with practically no graphical interface and a few thousand lines of code to large multi-platform distributed systems. Developing such large systems requires many people, and development tools have likewise grown to support analysis, design, coding, testing, deployment, and maintenance. The foundation of these software tools is provided by the development platforms themselves, but developers' experience can contribute countless utilities and techniques that speed up development and satisfy software requirements through the reuse of sufficiently tested and optimized solutions. Grouping these tools in an orderly way produces custom frameworks, with tools of all kinds (classes, controls, interfaces, design patterns) that provide customized solutions to a wide range of problems and can be reused as often as needed, either by setting development guidelines through the use of patterns, or by encapsulating complexity so that developers have components that take on certain logic, easing the construction phase. This work addresses the base technologies and development platforms needed to undertake the creation of a custom framework, the needs to evaluate before undertaking it, and the techniques to employ, oriented toward the documentation, maintenance, and extension of the framework. The theoretical part presents and evaluates the requirements for creating a framework and the requirements on the development platform, explains how today's major development platforms work and which elements compose them, and sets out guidelines on structure and nomenclature that the development of a framework should follow for its maintenance and extension. On the methodological side, a subset of Métrica V3 has been used, since that methodology does not apply in its entirety to the development of controls, but it does cover the requirements catalogue, use cases, class diagrams, sequence diagrams, and so on. Besides the theoretical concepts, a case study with didactic aims shows how to parameterize and configure development on the .NET platform. The case study consists of extending a generic user control of the .NET platform, applying concepts that go beyond merely creating functions such as those an API provides: how to extend and modify existing controls that interact with other controls through events, so that the new control can become part of a widely distributed library of custom user controls. User controls have not only a functional part but also a visual part, and functional requirements different from those typical of management software, since they must handle events, manage rendering while those events occur, and meet non-functional requirements such as performance optimization. The case study uses the .NET Framework development platform, in all its versions, since the control to be extended, here made editable, is the ListView control, which is present in every version of the .NET Framework and enjoys a high degree of reuse. The extension also shows how easily extensions of this kind can be migrated across framework versions. Several versions of Visual Studio are used as development environments to demonstrate this compatibility, although the development accompanying this document was done in Visual Studio 2013.
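The extend-an-existing-control pattern the case study applies can be sketched generically. The following minimal example is written in Python for consistency with the other sketches in this listing (the thesis itself works in C# on the .NET ListView), and all names are illustrative:

```python
# Sketch of the "extend an existing control" pattern: a base,
# read-only ListView-like control is subclassed, and editing is
# added by hooking into its event mechanism.

class ListView:
    """Stand-in for a platform-provided, read-only list control."""
    def __init__(self, items):
        self.items = list(items)
        self._handlers = {}            # event name -> list of callbacks

    def on(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def _raise(self, event, *args):
        for handler in self._handlers.get(event, []):
            handler(*args)

class EditableListView(ListView):
    """Extension: adds in-place editing without modifying the base class."""
    def begin_edit(self, index, new_value):
        old_value = self.items[index]
        self.items[index] = new_value
        # Notify any interested controls, mirroring .NET-style events.
        self._raise("item_edited", index, old_value, new_value)

view = EditableListView(["alpha", "beta"])
view.on("item_edited", lambda i, old, new: print(f"row {i}: {old} -> {new}"))
view.begin_edit(1, "gamma")            # prints: row 1: beta -> gamma
```

The design point is the one the abstract makes: the base control stays untouched, the extension adds behavior, and other controls cooperate with it only through events.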
Abstract:
In this manuscript we describe the experimental procedure employed at the Alfred Wegener Institute in Germany in the preparation of the simulations for the Pliocene Model Intercomparison Project (PlioMIP). We present a description of the utilized Community Earth System Models (COSMOS, version: COSMOS-landveg r2413, 2009) and document the procedures that we applied to transfer the Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project mid-Pliocene reconstruction into model forcing fields. The model setup and spin-up procedure are described for both the paleo- and preindustrial (PI) time slices of PlioMIP experiments 1 and 2, and general results that depict the performance of our model setup for mid-Pliocene conditions are presented. The mid-Pliocene, as simulated with our COSMOS setup and PRISM boundary conditions, is both warmer and wetter in the global mean than the PI. The globally averaged annual mean surface air temperature in the mid-Pliocene standalone atmosphere (fully coupled atmosphere-ocean) simulation is 17.35 °C (17.82 °C), which implies a warming of 2.23 °C (3.40 °C) relative to the respective PI control simulation.
Abstract:
Includes papers describing research sponsored by the Office of Nuclear Regulatory Research, NRC.
Abstract:
Objective: Data collected during routine clinical care are typically sparse and unable to support the more complex pharmacokinetic (PK) models reported in previous rich-data studies. Informative priors may be a prerequisite for model development. The aim of this study was to estimate the population PK parameters of sirolimus using a fully Bayesian approach with informative priors. Methods: Informative priors, including the prior mean and the precision of the prior mean, were elicited from previously published studies using a meta-analytic technique. The precision of between-subject variability was determined by simulations from a Wishart distribution using MATLAB (version 6.5). Sirolimus concentration-time data retrospectively collected from kidney transplant patients were analysed using WinBUGS (version 1.3). The candidate models were either one- or two-compartment with first-order absorption and first-order elimination. Model discrimination was based on computation of the posterior odds supporting each model. Results: A total of 315 concentration-time points were obtained from 25 patients. Most data were clustered at trough concentrations, with sampling times ranging from 1.6 to 77 hours post-dose. Using informative priors, either a one- or a two-compartment model could be used to describe the data. When a one-compartment model was applied, information was gained from the data about the apparent clearance (CL/F = 18.5 L/h) and the apparent volume of distribution (V/F = 1406 L), but no information was gained about the absorption rate constant (ka). When a two-compartment model was fitted to the data, the data were informative about CL/F, the apparent inter-compartmental clearance, and the apparent volume of distribution of the peripheral compartment (13.2 L/h, 20.8 L/h, and 579 L, respectively). The posterior distributions of the volume of distribution of the central compartment and of ka were the same as their priors. The posterior odds for the two-compartment model were 8.1, indicating that the data supported the two-compartment model. Conclusion: The use of informative priors supported the choice of a more complex and informative model that would otherwise not have been supported by the sparse data.
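For reference, the one-compartment model with first-order absorption and elimination named among the candidates has a standard closed form (textbook parameterization, written here with the apparent parameters CL/F and V/F reported above):

```latex
% Concentration after an oral dose D at time t, one-compartment model
% with first-order absorption (k_a) and elimination (k = (CL/F)/(V/F)):
C(t) = \frac{D\,k_a}{(V/F)\,(k_a - k)}\left(e^{-k t} - e^{-k_a t}\right)
```

With data clustered at troughs, the absorption phase is barely observed, which is consistent with the finding that the data carried no information about ka beyond its prior.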
Abstract:
The thesis presents an experimentally validated modelling study of the flow of combustion air in an industrial radiant tube burner (RTB). The RTB is typically used in industrial heat-treating furnaces. The work was initiated by the need for improvements in burner lifetime and performance, which are related to the fluid mechanics of the combusting flow; a fundamental understanding of this flow is therefore necessary. To achieve this, a detailed three-dimensional Computational Fluid Dynamics (CFD) model has been used, validated with experimental air flow, temperature and flue gas measurements. Initially, the work programme is presented, together with the theory behind RTB design and operation and the theory behind swirling flows and methane combustion. NOx reduction techniques are discussed, and the numerical modelling of combusting flows is detailed, highlighting the importance of turbulence, radiation and combustion modelling as well as the numerical schemes that cover discretization, finite volume theory and convergence. The study first focuses on the combustion air flow and its delivery to the combustion zone. An isothermal computational model was developed to allow examination of the flow characteristics as the air enters the burner and progresses through the various sections prior to the discharge face in the combustion area. Important features identified include the air recuperator swirler coil, the step ring, the primary/secondary air splitting flame tube and the fuel nozzle. It was revealed that the effectiveness of the air recuperator swirler is significantly compromised by the need for a generous assembly tolerance. There is also a substantial circumferential flow maldistribution introduced by the swirler, but this is effectively removed by the positioning of a ring constriction in the downstream passage. Computations using the k-ε turbulence model show good agreement with experimentally measured velocity profiles in the combustion zone and validated the modelling strategy prior to the combustion study. Reasonable mesh independence was obtained with 200,000 nodes. Agreement was poorer with the RNG k-ε and Reynolds Stress models. The study then addresses the combustion process itself and the heat transfer processes internal to the RTB. A series of combustion and radiation model configurations were developed, and the optimum combination of the Eddy Dissipation (ED) combustion model and the Discrete Transfer (DT) radiation model was used successfully to validate a burner experimental test. The previously cold-flow-validated k-ε turbulence model was used, and reasonable mesh independence was obtained with 300,000 nodes. The combination showed good agreement with temperature measurements on the inner and outer walls of the burner, as well as with the flue gas composition measured at the exhaust. The inner tube wall temperature predictions matched the experimental measurements at most of the thermocouple locations, highlighting a small flame bias to one side, although the model slightly over-predicts the temperatures towards the downstream end of the inner tube. NOx emissions were initially over-predicted; however, the use of a combustion flame temperature limiting subroutine allowed convergence to the experimental value of 451 ppmv.
With the validated model, the effectiveness of certain RTB features identified previously is analysed, and an analysis of the energy transfers throughout the burner is presented to identify the dominant mechanisms in each region. The optimum turbulence-combustion-radiation model selection then served as the baseline for further model development. One of these models, with an eccentrically positioned flame tube, highlights the failure mode of the RTB during long-term operation. Other models were developed to address NOx reduction and improvement of the flame profile in the burner combustion zone. These included a modified fuel nozzle design with 12 circular-section fuel ports, which demonstrates a longer and more symmetric flame, although with limited success in NOx reduction. In addition, a zero-bypass swirler coil model was developed that highlights the effect of a stronger swirling combustion flow. Reduced-diameter and 20 mm forward-displaced flame tube models show limited success in NOx reduction, although the latter demonstrated improvements in the discharge face heat distribution and in flame symmetry. Finally, Flue Gas Recirculation (FGR) modelling attempts indicate the difficulty of applying this NOx reduction technique in the Wellman RTB. Recommendations for further work are made, including design mitigations for the fuel nozzle, and further burner modelling is suggested to improve computational validation. The introduction of fuel staging is proposed, as well as a modification of the inner tube to enhance the effect of FGR.
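For context, the high-Reynolds-number k-ε model used in the cold-flow validation solves transport equations of the standard textbook form (quoted generically here, not from the thesis):

```latex
% Standard k-epsilon transport equations with eddy viscosity mu_t;
% P_k is the turbulence production term, and sigma_k, sigma_eps,
% C_{1eps}, C_{2eps}, C_mu are the usual model constants.
\frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k \mathbf{u})
  = \nabla\cdot\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\nabla k\right]
    + P_k - \rho\varepsilon
\frac{\partial(\rho\varepsilon)}{\partial t} + \nabla\cdot(\rho\varepsilon\,\mathbf{u})
  = \nabla\cdot\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\nabla\varepsilon\right]
    + C_{1\varepsilon}\frac{\varepsilon}{k}P_k
    - C_{2\varepsilon}\rho\frac{\varepsilon^{2}}{k},
\qquad \mu_t = \rho C_\mu \frac{k^{2}}{\varepsilon}
```

The RNG k-ε and Reynolds Stress alternatives the thesis compares differ in how they close these terms, which is why their agreement with the measured velocity profiles can differ.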
Abstract:
This thesis introduces a flexible visual data exploration framework which combines advanced projection algorithms from the machine learning domain with visual representation techniques developed in the information visualisation domain, to help a user explore and understand large multi-dimensional datasets effectively. The advantage of such a framework over other techniques currently available to domain experts is that the user is directly involved in the data mining process and advanced machine learning algorithms are employed for better projection. A hierarchical visualisation model guided by a domain expert allows the expert to obtain an informed segmentation of the input space. Two other components of this thesis exploit properties of these principled probabilistic projection algorithms: a guided mixture of local experts algorithm which provides robust prediction, and a model to estimate feature saliency simultaneously with the training of a projection algorithm. Local models are useful since a single global model cannot capture the full variability of a heterogeneous data space such as the chemical space. Probabilistic hierarchical visualisation techniques provide an effective soft segmentation of an input space by a visualisation hierarchy whose leaf nodes represent different regions of the input space. We use this soft segmentation to develop a guided mixture of local experts (GME) algorithm which is appropriate for the heterogeneous datasets found in chemoinformatics problems. Moreover, in this approach the domain experts are more involved in the model development process, which is suitable for an intuition- and domain-knowledge-driven task such as drug discovery. We also derive a generative topographic mapping (GTM) based data visualisation approach which estimates feature saliency simultaneously with the training of a visualisation model.
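For orientation, the GTM on which this feature-saliency extension builds defines, in its standard form, a constrained Gaussian mixture whose centres are images of a regular latent grid z_1, ..., z_K under the mapping y(z; W) = Wφ(z):

```latex
% Standard GTM density (Bishop, Svensen & Williams): K latent grid
% points z_k, basis functions phi, weight matrix W, noise precision
% beta, data dimension D.
p(\mathbf{x}\mid \mathbf{W},\beta)
  = \frac{1}{K}\sum_{k=1}^{K}
    \left(\frac{\beta}{2\pi}\right)^{D/2}
    \exp\!\left(-\frac{\beta}{2}\,
      \bigl\lVert \mathbf{x} - \mathbf{W}\boldsymbol{\phi}(\mathbf{z}_k)\bigr\rVert^{2}\right)
```

The hierarchy of visualisation plots arises from fitting mixtures of such models to the responsibilities of a parent plot, which is what yields the soft segmentation exploited by the GME algorithm.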
Abstract:
The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation, the only technique capable of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and both the speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick, easy, and allows models to expand in line with design developments. However, current approaches to computer simulation are wholly inappropriate for supporting this hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterized by a requirement for very specialist expertise, a lengthy model development phase, and a correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally only applied to check or verify a final design proposal; rarely is its full potential utilized to aid, support or complement the manufacturing system design procedure. To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, as such systems require no specialist expertise, instead facilitating quick and easy model creation, execution and modification through simple data inputs. Previous generic simulators have tended to be too restricted, lacking the flexibility necessary to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
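As a loose illustration of the generic-simulator idea, where the model is plain data and a single generic engine executes it, here is a minimal Python sketch (ATOMS itself is far richer; the stations, routing and times are invented for illustration):

```python
# The "model" is pure data: change the dict and re-run, no new code needed.
model = {
    "route": ["saw", "mill", "drill"],            # stations each job visits
    "process_time": {"saw": 4.0, "mill": 6.0, "drill": 2.0},
    "jobs": 3,                                     # jobs released at time 0
}

def simulate(model):
    """Generic flow-line engine: runs whatever the data describes."""
    free_at = {s: 0.0 for s in model["route"]}     # next time each station is free
    finish = []
    for _ in range(model["jobs"]):
        t = 0.0
        for station in model["route"]:
            start = max(t, free_at[station])       # wait if the station is busy
            t = start + model["process_time"][station]
            free_at[station] = t
        finish.append(t)
    return finish

print(simulate(model))   # completion time of each job, e.g. [12.0, 18.0, 24.0]
```

The multi-level idea is then to start from such coarse data and progressively enrich it (breakdowns, buffers, schedules) as the design evolves, without rebuilding the model from scratch.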
Abstract:
Recently, we have developed the hierarchical Generative Topographic Mapping (HGTM), an interactive method for visualization of large high-dimensional real-valued data sets. In this paper, we propose a more general visualization system by extending HGTM in three ways, which allows the user to visualize a wider range of data sets and better support the model development process. 1) We integrate HGTM with noise models from the exponential family of distributions. The basic building block is the Latent Trait Model (LTM). This enables us to visualize data of inherently discrete nature, e.g., collections of documents, in a hierarchical manner. 2) We give the user a choice of initializing the child plots of the current plot in either interactive, or automatic mode. In the interactive mode, the user selects "regions of interest," whereas in the automatic mode, an unsupervised minimum message length (MML)-inspired construction of a mixture of LTMs is employed. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. 3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool to improve our understanding of the visualization plots, since they can highlight the boundaries between data clusters. We illustrate our approach on a toy example and evaluate it on three more complex real data sets. © 2005 IEEE.
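As a reference point for extension (1), the latent trait model building block replaces the GTM's Gaussian noise with an exponential-family model; for binary data such as document term vectors, the standard Bernoulli form is:

```latex
% Latent trait model with Bernoulli noise: w_d is the d-th row of the
% weight matrix W, phi the latent basis functions, sigma the logistic
% function; x_d is the d-th binary observation.
p(\mathbf{x}\mid \mathbf{z},\mathbf{W})
  = \prod_{d=1}^{D}
    \sigma\bigl(\mathbf{w}_d^{\top}\boldsymbol{\phi}(\mathbf{z})\bigr)^{x_d}
    \bigl(1-\sigma(\mathbf{w}_d^{\top}\boldsymbol{\phi}(\mathbf{z}))\bigr)^{1-x_d},
\qquad \sigma(a)=\frac{1}{1+e^{-a}}
```

Hierarchies of such LTMs, initialized interactively or by the MML-inspired construction, are what allow inherently discrete data to be visualized in the same manner as HGTM handles real-valued data.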
Abstract:
New global models of neural-network type for society are considered, together with the hierarchical structure of society and the mentality of the individual. A way of incorporating the anticipatory (prognostic) ability of individuals into the model is presented. Some implementations of the approach for real tasks, and further research problems, are described. The multivaluedness of models and solutions is discussed, as is the analogy with sensory-motor systems. New problems for the theory and applications of neural networks are described.
Abstract:
The goal of mangrove restoration projects should be to improve community structure and ecosystem function of degraded coastal landscapes. This requires the ability to forecast how mangrove structure and function will respond to prescribed changes in site conditions including hydrology, topography, and geophysical energies. There are global, regional, and local factors that can explain gradients of regulators (e.g., salinity, sulfides), resources (nutrients, light, water), and hydroperiod (frequency, duration of flooding) that collectively account for stressors that result in diverse patterns of mangrove properties across a variety of environmental settings. Simulation models of hydrology, nutrient biogeochemistry, and vegetation dynamics have been developed to forecast patterns in mangroves in the Florida Coastal Everglades. These models provide insight into mangrove response to specific restoration alternatives, testing causal mechanisms of system degradation. We propose that these models can also assist in selecting performance measures for monitoring programs that evaluate project effectiveness. This selection process in turn improves model development and calibration for forecasting mangrove response to restoration alternatives. Hydrologic performance measures include soil regulators, particularly soil salinity, surface topography of the mangrove landscape, and hydroperiod, including both the frequency and duration of flooding. Estuarine performance measures should include the salinity of the bay, tidal amplitude, and conditions of freshwater discharge (included in the salinity value). The most important performance measures from the mangrove biogeochemistry model should include soil resources (bulk density, total nitrogen, and phosphorus) and soil accretion. Mangrove ecology performance measures should include forest dimension analysis (transects and/or plots), sapling recruitment, leaf area index, and faunal relationships. Estuarine ecology performance measures should include the habitat function of mangroves, which can be evaluated with the growth rates of key species, habitat suitability analysis, isotope abundances of indicator species, and bird censuses. The list of performance measures can be modified according to the model output that is used to define the scientific goals during the restoration planning process that reflect specific goals of the project.
Abstract:
The underrepresentation of women in physics has been well documented and is a source of concern for both policy makers and educators. My dissertation focuses on understanding the role self-efficacy plays in retaining students, particularly women, in introductory physics. I use an explanatory mixed-methods approach to first investigate quantitatively the influence of self-efficacy in predicting success and then to qualitatively explore the development of self-efficacy. In the initial quantitative studies, I explore the utility of self-efficacy in predicting the success of introductory physics students, both women and men. Results indicate that self-efficacy is a significant predictor of success for all students. I then disaggregate the data to examine how self-efficacy develops differently for women and men in the introductory physics course. Results show women rely on different sources of self-efficacy than do men, and that a particular instructional environment, Modeling Instruction, has a positive impact on these sources of self-efficacy. In the qualitative phase of the project, this dissertation focuses on the development of self-efficacy. Using the qualitative tool of microanalysis, I introduce a methodology for understanding how self-efficacy develops moment by moment using the lens of self-efficacy opportunities. I then use the characterizations of self-efficacy opportunities to focus on a particular course environment and to identify and describe a mechanism by which Modeling Instruction impacts student self-efficacy. Results indicate that emphasizing the development and deployment of models affords opportunities to impact self-efficacy. The findings of this dissertation indicate that introducing key elements into the classroom, such as cooperative group work, model development and deployment, and interaction with the instructor, creates a mechanism by which instructors can impact the self-efficacy of their students. Results from this study indicate that creating a model to impact the retention rates of women in physics should include attending to self-efficacy and designing classroom activities that create self-efficacy opportunities.