959 results for simulation tools


Relevance: 60.00%

Abstract:

Prediction of radiated fields from transmission lines has not previously been studied from a panoptical power system perspective. The application of BPL technologies to overhead transmission lines would benefit greatly from the ability to simulate real power system environments, not limited to the transmission lines themselves. Presently, circuit-based transmission line models used by EMTP-type programs utilize Carson’s formula for a waveguide parallel to an interface. This formula is not valid for calculations at high frequencies, where the treatment of earth-return currents breaks down. This thesis explains the challenges of developing improved models, explores an approach to combining circuit-based and electromagnetics modeling to predict radiated fields from transmission lines, exposes inadequacies of simulation tools, and suggests methods of extending the validity of transmission line models into very high frequency ranges. Electromagnetics programs are commonly used to study radiated fields from transmission lines. However, the approach proposed here is also able to incorporate the components of a power system through the combined use of EMTP-type models. Carson’s formulas address the series impedance of electrical conductors above and parallel to the earth. These equations have been analyzed to expose their inherent assumptions and implications. Additionally, their lack of validity at higher frequencies has been demonstrated, showing the need to replace Carson’s formulas for these types of studies. This body of work leads to several conclusions about the relatively new study of BPL. Foremost, there is a gap in modeling capabilities which has been bridged through the integration of circuit-based and electromagnetics modeling, allowing more realistic prediction of BPL performance and radiated fields. The proposed approach is limited in its scope of validity by the formulas used by EMTP-type software. To extend the range of validity, a new set of equations must be identified and implemented in the approach. Several potential methods of implementation have been explored. Though an appropriate set of equations has not yet been identified, further research in this area will benefit from a clear depiction of the next important steps and how they can be accomplished.
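
For context, a standard textbook statement of the formula in question (not necessarily the exact variant analyzed in the thesis): Carson's earth-return-corrected series self-impedance of a conductor of radius r_i at height h_i above homogeneous earth of conductivity sigma_g can be written as

```latex
% Carson's series self-impedance of conductor i above homogeneous earth;
% the integral is the earth-return correction whose low-frequency
% expansions are what EMTP-type tools typically implement.
Z_{ii} = \frac{j\omega\mu_0}{2\pi}\ln\frac{2h_i}{r_i}
       + \frac{j\omega\mu_0}{\pi}
         \int_0^\infty \frac{e^{-2h_i\lambda}}
         {\lambda + \sqrt{\lambda^2 + j\omega\mu_0\sigma_g}}\,d\lambda
```

The correction term neglects displacement currents in the earth, which is one reason expansions of this form lose validity in the high-frequency range relevant to BPL.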

Relevance: 60.00%

Abstract:

Determining how an exhaust system will perform acoustically before a prototype muffler is built can save the designer a substantial amount of time and resources. To use the available simulation tools effectively, it is important to understand which tool is most effective for the intended analysis, as well as how typical elements in an exhaust system affect muffler performance. An in-depth look at the available tools and their most beneficial uses is presented in this thesis. A full parametric study of typical muffler elements was conducted using the finite element method (FEM) and correlated with experimental results. This thesis lays out the overall groundwork for accurately predicting free-field sound pressure levels for an exhaust system with the engine properties included. The accuracy of the model depends heavily on the correct temperature profile of the model as well as on the accuracy of the source properties. These factors are discussed in detail, and methods for determining them are presented. The secondary effects of mean flow, which affect both acoustical wave propagation and flow noise generation, are discussed, and effective ways of predicting them are described. Experimental models are tested on a flow rig that showcases these phenomena.
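
As a point of reference for such parametric studies, the simplest muffler element, a single expansion chamber, has a textbook closed-form transmission loss (an illustrative baseline only; the thesis covers more complex elements):

```latex
% Transmission loss of a simple expansion chamber with area ratio
% m = S_2/S_1, chamber length L, and wavenumber k = \omega/c; FEM
% predictions of this element are usually checked against this form.
TL = 10 \log_{10}\!\left[1 + \frac{1}{4}\left(m - \frac{1}{m}\right)^{2}
     \sin^{2}(kL)\right]
```

The periodic maxima at kL = pi/2, 3pi/2, ... and the nulls at kL = n*pi are the characteristic behavior a parametric FEM sweep over chamber length reproduces.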

Relevance: 60.00%

Abstract:

Healthcare professionals and the public have increasing concerns about the ability of emergency departments to meet current demands. Increased demand for emergency services, mainly caused by a growing number of minor and moderate injuries, has reached crisis proportions, especially in the United Kingdom. Numerous efforts have been made to explore the complex causes, because it is becoming more and more important to provide adequate healthcare within tight budgets. Optimisation of patient pathways in the emergency department is therefore an important factor. This paper explores the possibilities offered by dynamic simulation tools to improve patient pathways, using the emergency department of a busy university teaching hospital in Switzerland as an example.
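
A minimal sketch of the kind of dynamic (discrete-event) patient-pathway model such studies build, written here with the Python library simpy; the paper's actual tool is not named, so all capacities and timings below are illustrative assumptions:

```python
# Discrete-event sketch of an emergency-department pathway:
# arrival -> triage -> treatment. All numbers are assumed, not the paper's.
import random
import simpy

ARRIVAL_MEAN = 10.0   # mean minutes between arrivals (assumed)
TRIAGE_MEAN = 5.0     # mean triage duration (assumed)
TREAT_MEAN = 30.0     # mean treatment duration (assumed)

def patient(env, triage, treatment, waits):
    arrival = env.now
    with triage.request() as req:          # queue for the triage nurse
        yield req
        yield env.timeout(random.expovariate(1 / TRIAGE_MEAN))
    with treatment.request() as req:       # queue for a treatment bay
        yield req
        waits.append(env.now - arrival)    # door-to-treatment wait
        yield env.timeout(random.expovariate(1 / TREAT_MEAN))

def arrivals(env, triage, treatment, waits):
    while True:
        yield env.timeout(random.expovariate(1 / ARRIVAL_MEAN))
        env.process(patient(env, triage, treatment, waits))

env = simpy.Environment()
triage = simpy.Resource(env, capacity=1)      # one triage nurse (assumed)
treatment = simpy.Resource(env, capacity=3)   # three treatment bays (assumed)
waits = []
env.process(arrivals(env, triage, treatment, waits))
env.run(until=8 * 60)  # one simulated 8-hour shift
print(f"mean door-to-treatment wait: {sum(waits) / max(len(waits), 1):.1f} min")
```

Varying the capacities and re-running is exactly the kind of what-if experiment on patient pathways the paper refers to.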

Relevance: 60.00%

Abstract:

Wireless Mesh Networks (WMNs) have proven to be a key technology for increasing the network coverage of Internet infrastructures. The development process for new protocols and architectures in the area of WMNs is typically split into evaluation by network simulation and testing of a prototype in a test-bed. Testing a prototype in a real test-bed is time-consuming and expensive. Unavoidable external interference can occur, which makes debugging difficult. Moreover, the test-bed usually supports only a limited number of test topologies. Finally, mobility tests are impractical. Therefore, we propose VirtualMesh, a new testing architecture which can be used before going to a real test-bed. It provides instruments to test the real communication software, including the network stack, inside a controlled environment. VirtualMesh is implemented by capturing real traffic through a virtual interface at the mesh nodes. The traffic is then redirected to the network simulator OMNeT++. In our experiments, VirtualMesh has proven to be scalable and introduces only moderate delays. It is therefore suitable for pre-deployment testing of communication software for WMNs.
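
The capture-and-redirect idea can be illustrated with a short sketch: a user-space process reads Ethernet frames from a Linux TAP (virtual) interface and forwards them to a simulator endpoint. This is an illustration under assumed names and ports, not VirtualMesh's actual code:

```python
# Illustrative capture-and-redirect loop (requires root): frames that the
# real network stack writes to a virtual TAP interface are read here and
# forwarded to a simulator-side gateway. Interface name, address and port
# are assumptions, not VirtualMesh's.
import fcntl
import socket
import struct

TUNSETIFF = 0x400454ca   # Linux ioctl to configure a tun/tap device
IFF_TAP = 0x0002         # layer-2 (Ethernet) virtual interface
IFF_NO_PI = 0x1000       # no extra packet-information header

SIMULATOR_ADDR = ("127.0.0.1", 9999)  # assumed OMNeT++-side gateway

# Create the TAP device the mesh node's real stack will use.
tap = open("/dev/net/tun", "r+b", buffering=0)
ifr = struct.pack("16sH", b"mesh0", IFF_TAP | IFF_NO_PI)
fcntl.ioctl(tap, TUNSETIFF, ifr)

sim = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    frame = tap.read(2048)             # Ethernet frame from the real stack
    sim.sendto(frame, SIMULATOR_ADDR)  # hand it to the simulated channel
```

On the simulator side, the frame would be injected into the simulated wireless channel, so the unmodified communication software experiences a controlled, reproducible topology.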

Relevance: 60.00%

Abstract:

The biological function of neurons can often be understood only in the context of large, highly interconnected networks. These networks typically form two-dimensional topographic maps, such as the retinotopic maps in mammalian visual cortex. Computational simulations of these areas have led to valuable insights about how cortical topography develops and functions, but further progress has been hindered by the lack of appropriate simulation tools. This paper introduces the freely available Topographica map-level simulator, originally developed at the University of Texas at Austin and now maintained at the University of Edinburgh, UK. Topographica is designed to make large-scale, detailed models practical. The goal is to allow neuroscientists and computational scientists to work together to understand how topographic maps and their connections organize and operate. This understanding will be crucial for integrating experimental observations into a comprehensive theory of brain function.
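
The kind of input-driven topographic organization Topographica models can be illustrated, in a much reduced form, with a generic Kohonen self-organizing map. This sketch is not Topographica's algorithm or API, only the textbook idea of a sheet of units becoming topographically ordered:

```python
# Generic Kohonen self-organizing map: a 2-D sheet of units whose weight
# vectors become topographically ordered under input-driven learning.
# Illustrative only; Topographica's (LISSOM-style) models are more detailed.
import numpy as np

rng = np.random.default_rng(0)
N = 20                       # 20x20 sheet of units
W = rng.random((N, N, 2))    # each unit holds a 2-D input weight vector
ys, xs = np.mgrid[0:N, 0:N]  # grid coordinates of the units

for t in range(5000):
    x = rng.random(2)                       # random 2-D input sample
    d = ((W - x) ** 2).sum(axis=2)          # distance of input to each unit
    wy, wx = np.unravel_index(d.argmin(), d.shape)  # best-matching unit
    sigma = 3.0 * np.exp(-t / 2000)         # shrinking neighbourhood
    lr = 0.2 * np.exp(-t / 2000)            # decaying learning rate
    h = np.exp(-((ys - wy) ** 2 + (xs - wx) ** 2) / (2 * sigma ** 2))
    W += lr * h[..., None] * (x - W)        # pull neighbourhood toward input

# After training, neighbouring units respond to neighbouring inputs:
# the sheet has formed a topographic map of the input space.
```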

Relevance: 60.00%

Abstract:

Simulation tools aid in learning neuroscience by providing the student with an interactive environment in which to carry out simulated experiments and test hypotheses. The field of neuroscience is well suited to the use of simulation tools, since nerve cell signaling can be described by mathematical equations and solved by computer. Neural signaling entails the propagation of electrical current along the nerve membrane and transmission to neighboring neurons through synaptic connections. Action potentials and synaptic transmission can be simulated, and the results displayed for visualization and analysis. The neurosimulator SNNAP (Simulator for Neural Networks and Action Potentials) is a simulation environment that provides users with editors for model building, a simulation engine, and a visual display editor. This paper presents several modeling examples that illustrate some of the capabilities and features of SNNAP. First, the Hodgkin-Huxley (HH) model is presented and the threshold phenomenon is illustrated. Second, small neural networks of HH models are described using the various synaptic connections available in SNNAP; synaptic connections may be modulated through facilitation or depression. A study of vesicle pool dynamics is presented using an AMPA receptor model. Finally, a central pattern generator model of the Aplysia feeding circuit is illustrated as an example of a complex network that may be studied with SNNAP. Simulation code is provided for each case study described, and tasks are suggested for further investigation.
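
The equations underlying the first example are the standard Hodgkin-Huxley membrane equations, shown here in their textbook squid-axon form (SNNAP's notation may differ):

```latex
% Hodgkin-Huxley membrane equation with sodium, potassium and leak
% currents; m, h, n are voltage-dependent gating variables.
C_m \frac{dV}{dt} = I_{ext}
  - \bar{g}_{Na}\, m^3 h \,(V - E_{Na})
  - \bar{g}_{K}\, n^4 \,(V - E_{K})
  - g_L \,(V - E_L),
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x, \quad x \in \{m, h, n\}
```

The threshold phenomenon the paper illustrates arises from the nonlinear, voltage-dependent opening of the sodium gates: below a critical stimulus the membrane relaxes back, above it the m-gate dynamics regeneratively trigger a full action potential.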

Relevance: 60.00%

Abstract:

The development and evaluation of new algorithms and protocols for Wireless Multimedia Sensor Networks (WMSNs) are usually supported by a discrete-event network simulator, of which OMNeT++ is one of the most important. However, experiments involving multimedia transmission (video flows with different characteristics, genres, group-of-pictures lengths, and coding techniques) must also be evaluated with Quality of Experience (QoE) metrics that reflect the user's perception. Such experiments require the evaluation of video-related information, i.e., frame type, received/lost frames, delay, jitter, and decoding errors, as well as the inter- and intra-frame dependencies of received/distorted videos. However, existing OMNeT++ frameworks for WMSNs support neither QoE-aware video transmission nor a large set of mobility traces to enable evaluations under different multimedia/mobile situations. In this paper, we propose the Mobile MultiMedia Wireless Sensor Network OMNeT++ framework (M3WSN) to support the transmission, control and evaluation of real video sequences in mobile WMSNs.
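
The simplest of the objective measures that feed into such video-quality evaluation is per-frame PSNR. A minimal sketch follows (illustrative only, not M3WSN's evaluation pipeline):

```python
# Per-frame peak signal-to-noise ratio between a sent and a received
# (possibly distorted) frame; QoE toolchains combine measures like this
# with frame-type and dependency information.
import numpy as np

def psnr(sent: np.ndarray, received: np.ndarray, peak: float = 255.0) -> float:
    """PSNR in dB between two equally sized frames."""
    mse = np.mean((sent.astype(np.float64) - received.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")   # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

# Example: a synthetic frame damaged by transmission noise.
rng = np.random.default_rng(1)
frame = rng.integers(0, 256, size=(240, 320), dtype=np.uint8)
noisy = np.clip(frame + rng.normal(0, 5, frame.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(frame, noisy):.1f} dB")
```

Frame-dependency awareness matters because a lost I-frame corrupts every frame in its group of pictures, something a frame-by-frame measure like PSNR only captures after the dependency structure is applied.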

Relevance: 60.00%

Abstract:

The OPERA neutrino experiment is designed to perform the first observation of neutrino oscillations in direct appearance mode in the νμ→ντ channel, via the detection of the τ-leptons created in charged current ντ interactions. The detector, located in the underground Gran Sasso Laboratory, consists of an emulsion/lead target with an average mass of about 1.2 kt, complemented by electronic detectors. It is exposed to the CERN Neutrinos to Gran Sasso beam, with a baseline of 730 km and a mean energy of 17 GeV. The observation of the first ντ candidate event and the analysis of the 2008-2009 neutrino sample have been reported in previous publications. This work describes substantial improvements in the analysis and in the evaluation of the detection efficiencies and backgrounds using new simulation tools. The analysis is extended to a sub-sample of 2010 and 2011 data, resulting from an electronic detector-based pre-selection, in which an additional ντ candidate has been observed. The significance of the two events in terms of a νμ→ντ oscillation signal is 2.40 σ.

Relevance: 60.00%

Abstract:

Consideration of real operating conditions in the design and optimization of a multijunction solar cell receiver-concentrator assembly is indispensable. Such a requirement calls for suitable modeling and simulation tools to complement the experimental work and circumvent its well-known burdens and restrictions. Three-dimensional distributed models have been demonstrated in the past to be a powerful choice for analyzing distributed phenomena in single- and dual-junction solar cells, as well as for designing strategies to minimize solar cell losses under high concentrations. In this paper, we apply these models to the analysis of triple-junction solar cells under real operating conditions. The impact of different chromatic aberration profiles on the short-circuit current of triple-junction solar cells is analyzed in detail using the developed distributed model. Current spreading determines the impact of a given chromatic aberration profile on the solar cell I-V curve. The focus is on determining the role of current spreading in the connection between the photocurrent profile, subcell voltage and current, and the sheet resistance of the semiconductor layers.
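
For orientation, each subcell of such a stack is commonly described by a lumped single-diode model; the paper's distributed model resolves these quantities spatially across the cell, so the closed form below is only the textbook baseline:

```latex
% Single-diode model of subcell j in a series-connected multijunction
% stack. In series connection the same current I flows through every
% subcell, so the subcell with the lowest photocurrent I_{L,j} (e.g.
% due to chromatic aberration) limits the stack current.
I = I_{L,j} - I_{0,j}\left(e^{\,q(V_j + I R_{s,j})/(n_j k T)} - 1\right)
    - \frac{V_j + I R_{s,j}}{R_{sh,j}},
\qquad V = \sum_j V_j
```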

Relevance: 60.00%

Abstract:

The environmental impact of systems managing large (kg) amounts of tritium is a matter of public scrutiny for upcoming fusion facilities such as ITER and DEMO. Furthermore, potentially new dose limits imposed by international regulations (ICRP) would affect the designs of upcoming devices and the overall cost of deploying fusion technology. Refined schemes for assessing the environmental tritium dose impact are therefore essential. Detailed assessments can be obtained from knowledge of the real boundary conditions of the primary tritium discharge phase into the atmosphere (low levels) and into soils. Lagrangian dispersion models using real-time meteorological and topographic data provide a strong refinement, and advanced simulation tools are being developed in this sense. The tool integrates numerical model output records from the European Centre for Medium-Range Weather Forecasts (ECMWF) with a Lagrangian atmospheric dispersion model (FLEXPART). The results of the composite ECMWF/FLEXPART model can be coupled with tools for assessing the secondary-phase tritium dose pathway. Nominal operational tritium discharge reference values and source terms for selected incidental ITER-like plant system tritium forms have been assumed. The real-time daily data and mesh-refined records, together with the Lagrangian dispersion model approach, provide accurate results for doses to the population by inhalation or ingestion in the secondary phase.
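
As a baseline for what the Lagrangian particle approach refines, the steady-state Gaussian plume solution for a continuous release is the textbook idealization (shown for orientation; it is not the model used in this work):

```latex
% Ground-reflected Gaussian plume from a source of effective height H,
% emission rate Q, wind speed u; \sigma_y, \sigma_z are stability-dependent
% dispersion parameters. Lagrangian models replace this closed form with
% particle trajectories driven by real-time meteorological fields.
C(x, y, z) = \frac{Q}{2\pi u\, \sigma_y \sigma_z}
  \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
  \left[\exp\!\left(-\frac{(z - H)^2}{2\sigma_z^2}\right)
      + \exp\!\left(-\frac{(z + H)^2}{2\sigma_z^2}\right)\right]
```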

Relevance: 60.00%

Abstract:

The utilisation of biofuels in gas turbines is a promising alternative to fossil fuels for power generation. It would lead to a significant reduction of CO2 emissions using an existing combustion technology, although significant changes and further technological development appear to be necessary. The goal of this work is to perform energy and exergy analyses of the behaviour of gas turbines fired with biogas, ethanol and synthesis gas (bio-syngas), compared with natural gas. The global energy transformation process (i.e. from biomass to electricity) has also been studied. Furthermore, the potential reduction of CO2 emissions attained by the use of biofuels has been determined, considering the restrictions on biomass availability. Two different simulation tools have been used to accomplish the aims of this work. The results suggest both high interest in and the technical viability of Biomass Integrated Gasification Combined Cycle (BIGCC) systems for large-scale power generation.
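
The exergy side of such analyses rests on the specific flow exergy of each stream; the standard definition is shown below (the work's own formulation is not given here):

```latex
% Specific physical (flow) exergy of a stream relative to the dead state
% (T_0, p_0): the maximum work obtainable from the stream. h and s are
% specific enthalpy and entropy; chemical exergy of the fuel is added
% separately in a full analysis.
e_{ph} = (h - h_0) - T_0\,(s - s_0)
```

Balancing this quantity across each component locates the true thermodynamic losses (combustion, gasification), which an energy balance alone cannot distinguish.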

Relevance: 60.00%

Abstract:

Compilation techniques such as those portrayed by the Warren Abstract Machine (WAM) have greatly improved the speed of execution of logic programs. The research presented herein is geared towards providing additional performance to logic programs through the use of parallelism, while preserving the conventional semantics of logic languages. Two areas to which special attention is given are the preservation of sequential performance and storage efficiency, and the use of low-overhead mechanisms for controlling parallel execution. Accordingly, the techniques used for supporting parallelism are efficient extensions of those which have brought high inferencing speeds to sequential implementations. At a lower level, special attention is also given to design and simulation detail and to the architectural implications of the execution model's behavior. This paper offers an overview of the basic concepts and techniques used in the parallel design, the simulation tools used, and some of the results obtained to date.
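
The flavor of semantics-preserving parallelism involved can be illustrated with independent AND-parallelism, one of the main forms exploited in such systems: goals that share no variables can be solved concurrently without changing the program's sequential meaning. A Python sketch of the idea (illustrative only, unrelated to the WAM machinery):

```python
# Independent AND-parallelism: for the query ?- p(X), q(Y). the goals
# share no variables, so both can be solved concurrently and the answer
# set (the cross product) equals the sequential result.
from concurrent.futures import ThreadPoolExecutor

FACTS_P = {1, 2, 3}        # p(X) holds for these X
FACTS_Q = {20, 30}         # q(Y) holds for these Y

def solve_p():             # all solutions of goal p(X)
    return sorted(FACTS_P)

def solve_q():             # all solutions of goal q(Y)
    return sorted(FACTS_Q)

with ThreadPoolExecutor() as pool:
    xs, ys = pool.submit(solve_p), pool.submit(solve_q)
    answers = [(x, y) for x in xs.result() for y in ys.result()]
print(answers)   # same bindings a sequential left-to-right execution yields
```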

Relevance: 60.00%

Abstract:

Diabetes mellitus is the set of alterations caused by a defect in the amount of insulin secreted or by a suboptimal use of that insulin. It causes complications in the short, medium and long term that reduce the quality of life and the life expectancy of people with diabetes. Diabetes mellitus is currently one of the most important health problems: its prevalence has tripled in the past 20 years, and estimates indicate that it will affect almost 300 million people by 2025. Due to this increased prevalence, as well as to the morbidity and mortality associated with its micro- and macrovascular complications, diabetes has become a burden on health systems, their financial resources and their professionals, making the disease a major individual and public health problem. There is currently no cure for this disease, so the therapeutic goal of diabetes treatment focuses on normalizing blood glucose: minimizing hyper- and hypoglycemic events and avoiding, or at least delaying, the appearance and development of vascular complications, which are the main cause of morbidity and mortality among people with diabetes. A suitable, individualized treatment involves many factors that need to be considered for each patient: age, physical activity, eating habits, presence of complications related or unrelated to diabetes, cultural factors, etc. In the short term, however, the two most influential variables that the patient can manage to control his/her glycemic level are the administered insulin and the diet. Both act with a delay between the time of application and the onset of action, associated with their absorption. Therefore, the ability to predict the evolution of the glycemic profile in the near future can help the patient make appropriate decisions to maintain good control of the disease and avoid risky situations. This is the goal of glucose prediction in diabetes: anticipating the evolution of the glycemic profile in the near future to help the patient adapt his/her lifestyle and corrective actions, so that blood glucose levels approach those of a healthy person, thus avoiding the symptoms and complications of poor control.
The recent emergence of continuous glucose monitoring systems has provided new alternatives in this field. The availability of an exhaustive record of the variations of the glycemic profile, with a sampling period of between one and five minutes, has encouraged new models which seek to predict blood glucose using only previous glucose measurements, or at least significantly reducing the input information required by the algorithms. By requiring less intervention from the patient, glucose predictors open up new application possibilities, making their use feasible in real time: as decision support systems, as detectors of risky situations, or integrated into automatic control algorithms.
This doctoral thesis proposes several glucose prediction algorithms for patients with diabetes, based on the information recorded by a continuous glucose monitoring system and incorporating information on the administered insulin and carbohydrate intake. The proposed algorithms have been evaluated in simulation and with patient data recorded in different clinical studies. To this end, a comprehensive methodology has been developed which characterizes the performance of the prediction models from every point of view: accuracy, delay, noise, and ability to detect risky situations. The necessary simulation tools have been developed, and the patient databases have been analyzed and prepared. One of the proposed algorithms has also been tested to verify the validity of real-time prediction in a clinical setting. The tools required to carry out the defined experimental protocol, in which the patient consults the prediction on demand and retains control over his/her metabolic variables, have been developed. This experiment has made it possible to assess the impact of using glucose prediction on glycemic control.
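
A minimal sketch of the simplest family of CGM-only predictors mentioned above: an autoregressive model fitted to past glucose samples and iterated 30 minutes ahead. The model order, horizon and synthetic data are illustrative assumptions, not the algorithms proposed in the thesis:

```python
# Autoregressive (AR) glucose predictor over 5-minute CGM samples.
import numpy as np

def fit_ar(glucose: np.ndarray, order: int = 6) -> np.ndarray:
    """Least-squares AR coefficients from a history of CGM samples."""
    X = np.column_stack([glucose[i:len(glucose) - order + i]
                         for i in range(order)])
    y = glucose[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict(glucose, coeffs, steps):
    """Iterate the AR model 'steps' samples past the end of the record."""
    window = list(glucose[-len(coeffs):])
    for _ in range(steps):
        window.append(float(np.dot(coeffs, window[-len(coeffs):])))
    return window[len(coeffs):]

# Synthetic 5-minute CGM trace (mg/dL), for illustration only:
t = np.arange(200)
cgm = 120 + 30 * np.sin(t / 25) + np.random.default_rng(2).normal(0, 2, 200)
coeffs = fit_ar(cgm)
forecast = predict(cgm, coeffs, steps=6)   # 6 x 5 min = 30-minute horizon
print([f"{g:.0f}" for g in forecast])
```

Models of this kind use the CGM signal alone; the thesis's algorithms additionally incorporate insulin and carbohydrate information, which mainly improves accuracy around meals and boluses.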

Relevance: 60.00%

Abstract:

The study of the response of mechanical systems to external excitations, even in the simplest cases, involves solving second-order ordinary differential equations or systems thereof. Finding the natural frequencies of a system and understanding the effect of variations of the excitation frequencies on the response of the system are essential when designing mechanisms [1] and structures [2]. However, faced with the mathematical complexity of the problem, students tend to focus on the mathematical resolution rather than on the interpretation of the results. To overcome this difficulty, once the general theoretical problem and its solution through the state space [3] have been presented, Matlab® [4] and Simulink® [5] are used to simulate specific situations. Without them, the discussion of the effect of slight variations in input variables on the outcome of the model becomes burdensome due to the excessive calculation time required. Conversely, with the help of those simulation tools, students can easily reach practical conclusions, and their evaluation can be based on their interpretation of the results rather than on their mathematical skills.
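
For the single-degree-of-freedom case, the state-space formulation referred to above takes the standard form (a textbook statement, included for reference):

```latex
% State-space form of m\ddot{x} + c\dot{x} + kx = F(t), the kind of model
% the students simulate, with state vector q = [x, \dot{x}]^T:
\dot{q} =
\begin{bmatrix} 0 & 1 \\ -k/m & -c/m \end{bmatrix} q
+
\begin{bmatrix} 0 \\ 1/m \end{bmatrix} F(t),
\qquad
\omega_n = \sqrt{k/m}, \quad \zeta = \frac{c}{2\sqrt{km}}
```

Sweeping the excitation frequency across the natural frequency \omega_n in simulation makes the resonance behaviour visible directly, which is exactly the interpretive step the paper wants students to focus on.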

Relevance: 60.00%

Abstract:

The research group is currently developing a biological computing model to be implemented with Escherichia coli bacteria and M13 bacteriophages, but it has to be modelled and simulated before any experiment in order to reduce the number of failed attempts, as well as time and costs. The problem that gave rise to this project is that no software tools are able to simulate the biological process underlying that computational model, so such a tool needs to be developed before any experimental implementation. Several software tools can simulate most of the biological processes and bacterial interactions on which this model is based, so the task is to study the available simulation tools, compare them, and choose the most appropriate one to be extended with the functionality required by this design. Directed evolution is a method used in biotechnology to obtain proteins or nucleic acids with properties not found in nature. It consists of three steps: 1) creating a library of mutants, 2) selecting the mutants with the desired properties, and 3) replicating the variants identified in the selection step. The new software tool will be verified by simulating the selection step of a directed evolution process applied to bacteriophages.
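
A minimal sketch of what simulating the selection step can look like: phage variants carry a binding affinity, and each selection round retains variants stochastically in proportion to that affinity. All parameters below are illustrative assumptions, not the tool's design:

```python
# Stochastic simulation of the selection (panning) step of directed
# evolution: weak binders are progressively washed out of the library.
import random

random.seed(3)

# Library of mutants: (variant id, binding affinity in [0, 1]).
library = [(i, random.random()) for i in range(10_000)]

def selection_round(pool, stringency=0.8):
    """Keep each variant with probability affinity * stringency."""
    return [(vid, aff) for vid, aff in pool
            if random.random() < aff * stringency]

pool = library
for r in range(3):                       # three rounds of selection
    pool = selection_round(pool)
    mean_aff = sum(a for _, a in pool) / len(pool)
    print(f"round {r + 1}: {len(pool)} variants, mean affinity {mean_aff:.2f}")
# Mean affinity rises round after round, which is the enrichment signature
# a simulator of this step should reproduce.
```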