872 results for Heat pumps, load modelling, power quality, power system dynamics, power system simulation


Relevance:

100.00%

Publisher:

Abstract:

The development of models in the Earth Sciences, e.g. for earthquake prediction and for the simulation of mantle convection, is far from being finalized. Therefore there is a need for a modelling environment that allows scientists to implement and test new models in an easy but flexible way. After being verified, the models should be easy to apply within their scope, typically by setting input parameters through a GUI or web services. It should be possible to link certain parameters to external data sources, such as databases and other simulation codes. Moreover, as typically large-scale meshes have to be used to achieve appropriate resolutions, the computational efficiency of the underlying numerical methods is important. Conceptually this leads to a software system with three major layers: the application layer, the mathematical layer, and the numerical algorithm layer. The latter is implemented as a C/C++ library to solve a basic, computationally intensive linear problem, such as a linear partial differential equation. The mathematical layer allows the model developer to define his model and to implement high-level solution algorithms (e.g. the Newton-Raphson scheme or the Crank-Nicolson scheme) or to choose these algorithms from an algorithm library. The kernels of the model are generic, typically linear, solvers provided through the numerical algorithm layer. Finally, to provide an easy-to-use application environment, a web interface is (semi-automatically) built to edit the XML input file for the modelling code. In the talk, we will discuss the advantages and disadvantages of this concept in more detail. We will also present the modelling environment escript, a prototype implementation of such a software system in Python (see www.python.org). Key components of escript are the Data class and the PDE class. Objects of the Data class allow generating, holding, accessing, and manipulating data in such a way that the representation best suited to the particular context is transparent to the user. They are also the key to establishing connections with external data sources. PDE class objects describe (linear) partial differential equations to be solved by a numerical library. The current implementation of escript has been linked to the finite element code Finley to solve general linear partial differential equations. We will give a few simple examples which illustrate the usage of escript. Moreover, we show the usage of escript together with Finley for the modelling of interacting fault systems and for the simulation of mantle convection.
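
To make the Data/PDE class split concrete, the following is a minimal sketch in the spirit of the introductory escript examples, solving a Poisson-type problem on a Finley rectangle. The module layout (esys.escript, esys.escript.linearPDEs, esys.finley) and the LinearPDE coefficient names (A, Y, q, r) follow the escript documentation, but exact details may differ between releases.

```python
# Minimal escript/Finley sketch: solve -div(grad u) = 1 on the unit square
# with u = 0 on the left and bottom edges. Module and coefficient names
# follow the escript user guide; details may vary between releases.
from esys.escript import whereZero, kronecker, Lsup
from esys.escript.linearPDEs import LinearPDE
from esys.finley import Rectangle

# Finley finite element mesh of the unit square (40 x 20 elements)
domain = Rectangle(l0=1., l1=1., n0=40, n1=20)

# Characteristic function of the Dirichlet boundary x0 = 0 or x1 = 0
x = domain.getX()
gammaD = whereZero(x[0]) + whereZero(x[1])

# LinearPDE object: -div(A grad u) = Y, with u fixed to r where q > 0
pde = LinearPDE(domain)
pde.setValue(A=kronecker(domain), Y=1., q=gammaD, r=0.)

# The generic solver behind the PDE class returns the solution as a Data object
u = pde.getSolution()
print("max |u| =", Lsup(u))
```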

Relevance:

100.00%

Publisher:

Abstract:

Modelling human interaction and decision-making within a simulation presents a particular challenge. This paper describes a methodology under development known as 'knowledge based improvement'. The purpose of this methodology is to elicit decision-making strategies via a simulation model and to represent them using artificial intelligence techniques. Further to this, having identified an individual's decision-making strategy, the methodology aims to look for improvements in decision-making. The methodology is being tested on unplanned maintenance operations at a Ford engine assembly plant.
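
The abstract does not specify which artificial intelligence technique is used. Purely as an illustration of the general idea, the hypothetical sketch below logs (scenario state, decision) pairs presented to a decision-maker during simulation runs and fits a decision tree to them as an explicit, inspectable representation of the elicited strategy; scikit-learn is assumed to be available and all feature and action names are invented.

```python
# Hypothetical illustration: represent an elicited decision-making strategy
# as a decision tree learned from (scenario state, chosen action) pairs
# recorded during simulation sessions. Feature/action names are invented.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [queue_length, machine_age_years, fitter_available (0/1)]
states = [
    [0, 2, 1], [3, 2, 1], [5, 8, 1], [1, 8, 0],
    [6, 1, 0], [2, 5, 1], [7, 9, 1], [0, 9, 0],
]
# Decision recorded for each scenario: 'repair_now' or 'defer'
decisions = ["defer", "repair_now", "repair_now", "defer",
             "defer", "repair_now", "repair_now", "defer"]

# A shallow tree keeps the elicited strategy human-readable
strategy = DecisionTreeClassifier(max_depth=2, random_state=0)
strategy.fit(states, decisions)

# Inspect the rules the decision-maker appears to follow
print(export_text(strategy,
                  feature_names=["queue_length", "machine_age", "fitter_free"]))
```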

Relevance:

100.00%

Publisher:

Abstract:

The use of digital communication systems is increasing very rapidly. This is due to the lower system implementation cost compared to analogue transmission and, at the same time, the ease with which several types of data source (data, digitised speech, video, etc.) can be mixed. The emergence of packet broadcast techniques as an efficient type of multiplexing, especially with the use of contention random multiple access protocols, has led to a wide-spread application of these distributed access protocols in local area networks (LANs) and a further extension of them to radio and mobile radio communication applications. In this research, a modified version of the distributed access contention protocol which uses the packet broadcast switching technique is proposed. Carrier sense multiple access with collision avoidance (CSMA/CA) is found to be the most appropriate protocol, with the ability to satisfy equally the operational requirements of local area networks and of radio and mobile radio applications. The suggested version of the protocol is designed so that all desirable features of its precedents are maintained, while the shortcomings are eliminated and additional features are added to strengthen its ability to work with radio and mobile radio channels. Operational performance evaluation of the protocol has been carried out for the non-persistent and slotted non-persistent variants, through mathematical and simulation modelling of the protocol. The results obtained from the two modelling procedures validate the accuracy of both methods, and the protocol compares favourably with its precedent, CSMA/CD (carrier sense multiple access with collision detection). A further extension of the protocol has been suggested for operation with multichannel systems. Two multichannel systems based on the CSMA/CA protocol for medium access are therefore proposed: the dynamic multichannel system, which is based on two types of channel selection, random choice (RC) and idle choice (IC), and the sequential multichannel system. The latter has been proposed in order to suppress the effect of the hidden terminal, which always represents a major problem in the use of contention random multiple access protocols with radio and mobile radio channels. Verification of the operational performance evaluation has been carried out using mathematical modelling for the dynamic system and simulation modelling for the sequential system. Both systems are found to improve system operation and fault tolerance when compared to single channel operation.
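
The thesis's own analytical model for the modified CSMA/CA protocol is not reproduced in the abstract. Purely as a point of reference, the sketch below evaluates the classical Kleinrock-Tobagi throughput expressions for unslotted and slotted non-persistent CSMA, the two variants named above, as a function of offered load G and normalised propagation delay a; these formulas describe the precedent CSMA protocols, not the modified CSMA/CA version proposed here.

```python
# Classical non-persistent CSMA throughput curves (Kleinrock & Tobagi),
# shown only as a baseline for the two protocol variants mentioned above.
# G: offered load (attempts per packet time), a: normalised propagation delay.
import math

def throughput_np_csma(G: float, a: float) -> float:
    """Unslotted non-persistent CSMA: S = G e^{-aG} / (G(1+2a) + e^{-aG})."""
    return G * math.exp(-a * G) / (G * (1 + 2 * a) + math.exp(-a * G))

def throughput_slotted_np_csma(G: float, a: float) -> float:
    """Slotted non-persistent CSMA: S = a G e^{-aG} / (1 - e^{-aG} + a)."""
    return a * G * math.exp(-a * G) / (1 - math.exp(-a * G) + a)

if __name__ == "__main__":
    a = 0.01  # propagation delay small relative to packet length
    for G in (0.5, 1, 2, 5, 10, 50):
        print(f"G={G:5.1f}  unslotted S={throughput_np_csma(G, a):.3f}  "
              f"slotted S={throughput_slotted_np_csma(G, a):.3f}")
```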

Relevance:

100.00%

Publisher:

Abstract:

This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks which incorporate large concrete panel systems (LPS). The first step was to look at the U.K. multi-storey dwelling stock in general, and at the stock under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham and rearranged into a new database, using a suite of PC software called 'PROXIMA', for clarity and analysis. One hundred of the blocks in this stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies for this research, selected mainly on the height and age of the block. A new integrated appraisal technique has been created for LPS dwelling blocks, which takes into account the principal physical and social factors affecting the condition and acceptability of these blocks. The appraisal technique is built up in a hierarchical form moving from the general approach to particular elements (a tree model). It comprises two main approaches: physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, through which the condition of the block is analysed. A quality score system has been developed which depends mainly on the qualitative and quantitative condition of each category in the appraisal tree model, and which leads to a physical ranking order of the study blocks. In the social appraisal approach, the residents' satisfaction with and attitude toward their multi-storey dwelling block were analysed in relation to: a. biographical and housing-related characteristics; and b. social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case study blocks. The data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r-correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with the residents' satisfaction and attitude, in descending order, were: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared to alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority order for the multi-storey dwelling stock, among other advantages. A series of solution options to the problems of the blocks was sought for selection and testing before implementation. Traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has now emerged, which is particularly suited to structurally sound units. This 're-cycling' solution might incorporate the reuse of an entire block or part of it, by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
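
The abstract does not give the scoring formula itself. The sketch below is a purely hypothetical illustration of how a hierarchical (tree-model) quality score can roll element scores up through weighted categories to produce a physical ranking of blocks; all element names, weights and scores are invented for the example.

```python
# Hypothetical illustration of a hierarchical appraisal tree:
# element scores (0-10, lower = worse condition) roll up through
# weighted categories to a single block score used for ranking.
# All names, weights and scores here are invented for the example.

def tree_score(node) -> float:
    """Recursively compute the weighted score of an appraisal-tree node."""
    if "score" in node:                      # leaf element
        return node["score"]
    total_w = sum(child["weight"] for child in node["children"])
    return sum(child["weight"] * tree_score(child)
               for child in node["children"]) / total_w

def block_tree(structure, services, environment):
    """Build a small two-level tree for one block from three category scores."""
    return {"children": [
        {"weight": 0.5, "children": [{"weight": 1.0, "score": structure}]},
        {"weight": 0.3, "children": [{"weight": 1.0, "score": services}]},
        {"weight": 0.2, "children": [{"weight": 1.0, "score": environment}]},
    ]}

blocks = {
    "Block A": block_tree(structure=6.0, services=4.5, environment=7.0),
    "Block B": block_tree(structure=8.0, services=6.0, environment=5.5),
    "Block C": block_tree(structure=5.0, services=7.5, environment=6.0),
}

# Physical ranking order: worst-scoring blocks first (highest priority)
for name, tree in sorted(blocks.items(), key=lambda kv: tree_score(kv[1])):
    print(f"{name}: overall score {tree_score(tree):.2f}")
```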

Relevance:

100.00%

Publisher:

Abstract:

The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather station being used. We also list the numerous possible sources of error and uncertainty within citizen weather observations, before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather station. From this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed. The structure of this system was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original, uncorrected data. The system also attaches an uncertainty estimate to each observation, giving real-world applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work needed to bring it to an operational setting. Such a system will hopefully allow applications to leverage the additional value that citizen weather data bring to long-standing professional observing networks.
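
The full quality-control system is not described in the abstract. As a hedged illustration of the Bayesian idea only, the sketch below performs a conjugate normal-normal update of a single station's calibration bias from the differences between its readings and interpolated reference temperatures, then applies the learnt correction with an uncertainty attached; all priors, noise levels and data are invented and this is not the project's actual algorithm.

```python
# Hypothetical normal-normal Bayesian update of one station's calibration
# bias, using differences between citizen readings and an interpolated
# professional reference. Priors, noise levels and data are invented.
import math

prior_mean, prior_var = 0.0, 1.0 ** 2   # prior belief about the bias (deg C)
obs_var = 0.5 ** 2                      # assumed noise of a single difference

citizen_T   = [21.3, 19.8, 18.9, 22.4, 20.1]   # station readings (deg C)
reference_T = [20.6, 19.1, 18.3, 21.7, 19.5]   # interpolated reference (deg C)

mean, var = prior_mean, prior_var
for c, r in zip(citizen_T, reference_T):
    d = c - r                            # observed bias sample
    # Conjugate update: posterior precision is the sum of the precisions
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
    mean = post_var * (mean / var + d / obs_var)
    var = post_var

print(f"learnt bias: {mean:.2f} +/- {math.sqrt(var):.2f} deg C")

# Correct a new observation and attach an uncertainty to it
new_obs = 23.0
corrected = new_obs - mean
sigma = math.sqrt(var + obs_var)         # bias uncertainty + measurement noise
print(f"corrected: {corrected:.2f} deg C (1-sigma ~ {sigma:.2f} deg C)")
```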

Relevance:

100.00%

Publisher:

Abstract:

Heat sinks are widely used for cooling electronic devices and systems. Their thermal performance is usually determined by the material, shape, and size of the heat sink. With the assistance of computational fluid dynamics (CFD) and surrogate-based optimization, heat sinks can be designed and optimized to achieve a high level of performance. In this paper, the design and optimization of a plate-fin-type heat sink cooled by an impingement jet is presented. The flow and thermal fields are simulated using CFD, and the thermal resistance of the heat sink is then estimated. A Kriging surrogate model is developed to approximate the objective function (thermal resistance) as a function of the design variables. Surrogate-based optimization is implemented by adaptively adding infill points based on an integrated strategy combining the minimum-value, maximum mean-square-error, and expected-improvement approaches. The results show the influence of the design variables on the thermal resistance and yield the optimal heat sink with the lowest thermal resistance for the given jet impingement conditions.
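
The abstract names three infill criteria without giving formulas. As one concrete piece, the sketch below implements the standard expected-improvement criterion for a minimisation problem from a Kriging/Gaussian-process prediction (mean mu and standard deviation sigma at a candidate point), using the usual closed-form expression; the numerical values are invented and not taken from the paper.

```python
# Standard expected-improvement (EI) infill criterion for minimisation,
# computed from a Kriging/GP prediction at a candidate design point.
# EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z),  z = (f_min - mu) / sigma.
from scipy.stats import norm

def expected_improvement(mu: float, sigma: float, f_min: float) -> float:
    """EI of a candidate with predicted mean mu and std sigma, given the
    best (lowest) observed objective value f_min."""
    if sigma <= 0.0:
        return 0.0
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Invented example: best thermal resistance so far is 0.42 K/W; two candidate
# fin geometries with different Kriging predictions.
f_best = 0.42
print(expected_improvement(mu=0.45, sigma=0.05, f_min=f_best))  # uncertain, worse mean
print(expected_improvement(mu=0.41, sigma=0.01, f_min=f_best))  # confident, better mean
```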

Relevance:

100.00%

Publisher:

Abstract:

This work aims at modeling the power consumption of the nodes of a Wireless Sensor Network (WSN). To do so, a finite state machine was implemented by means of the SystemC-AMS and Stateflow modeling and simulation tools. To achieve this goal, communication data from a WSN were collected. Based on the collected data, a simulation environment for power consumption characterization, aimed at describing the network operation, was developed. Besides performing the power consumption simulation, this environment also takes into account a discharge model in order to analyze the battery charge level at any given moment. The analysis results in a graph illustrating the battery voltage variations as well as its state of charge (SOC). Finally, a case study of the WSN power consumption analyzes the acquisition mode and the network data communication. With this analysis, it is possible to make adjustments to the sensor nodes so as to reduce the total power consumption of the network.
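
The abstract does not give the state machine or the battery model itself. The sketch below is a hypothetical Python stand-in for the same idea: a node cycling through sleep/sense/transmit states, each with an assumed current draw, driving a simple coulomb-counting state-of-charge estimate. All currents, durations and the battery capacity are invented, and the actual work used SystemC-AMS/Stateflow rather than Python.

```python
# Hypothetical stand-in for a sensor-node power finite state machine driving
# a simple coulomb-counting battery state-of-charge (SOC) model. Current
# draws, state durations and battery capacity are invented values; the
# original work modelled this with SystemC-AMS/Stateflow, not Python.

CURRENT_A = {"sleep": 5e-6, "sense": 2.0e-3, "transmit": 17.0e-3}   # amperes
DURATION_S = {"sleep": 9.0, "sense": 0.8, "transmit": 0.2}          # seconds per cycle

BATTERY_CAPACITY_C = 2.4 * 3600      # 2400 mAh battery expressed in coulombs

def simulate(cycles: int) -> float:
    """Step the node through `cycles` duty cycles; return the final SOC (0..1)."""
    charge = BATTERY_CAPACITY_C
    for _ in range(cycles):
        for state in ("sleep", "sense", "transmit"):   # fixed state sequence
            charge -= CURRENT_A[state] * DURATION_S[state]
    return max(charge, 0.0) / BATTERY_CAPACITY_C

# One duty cycle lasts 10 s, so 8640 cycles correspond to roughly one day
for days in (1, 7, 30):
    print(f"after {days:2d} day(s): SOC = {simulate(days * 8640):.3f}")
```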

Relevance:

100.00%

Publisher:

Abstract:

The Model for Prediction Across Scales (MPAS) is a novel set of Earth system simulation components and consists of an atmospheric model, an ocean model and a land-ice model. Its distinct features are the use of unstructured Voronoi meshes and C-grid discretisation to address shortcomings of global models on regular grids and the use of limited area models nested in a forcing data set, with respect to parallel scalability, numerical accuracy and physical consistency. This concept allows one to include the feedback of regional land use information on weather and climate at local and global scales in a consistent way, which is impossible to achieve with traditional limited area modelling approaches. Here, we present an in-depth evaluation of MPAS with regards to technical aspects of performing model runs and scalability for three medium-size meshes on four different high-performance computing (HPC) sites with different architectures and compilers. We uncover model limitations and identify new aspects for the model optimisation that are introduced by the use of unstructured Voronoi meshes. We further demonstrate the model performance of MPAS in terms of its capability to reproduce the dynamics of the West African monsoon (WAM) and its associated precipitation in a pilot study. Constrained by available computational resources, we compare 11-month runs for two meshes with observations and a reference simulation from the Weather Research and Forecasting (WRF) model. We show that MPAS can reproduce the atmospheric dynamics on global and local scales in this experiment, but identify a precipitation excess for the West African region. Finally, we conduct extreme scaling tests on a global 3 km mesh with more than 65 million horizontal grid cells on up to half a million cores. We discuss necessary modifications of the model code to improve its parallel performance in general and specific to the HPC environment. We confirm good scaling (70 % parallel efficiency or better) of the MPAS model and provide numbers on the computational requirements for experiments with the 3 km mesh. In doing so, we show that global, convection-resolving atmospheric simulations with MPAS are within reach of current and next generations of high-end computing facilities.
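
As a small aid to reading the scaling numbers, the sketch below computes strong-scaling speed-up and parallel efficiency relative to a reference core count, which is the usual way a figure such as "70 % parallel efficiency or better" is derived; the timing values are invented placeholders, not results from the MPAS runs.

```python
# Strong-scaling speed-up and parallel efficiency relative to a reference
# core count: E(p) = (t_ref * p_ref) / (t_p * p). Timings below are invented
# placeholders, not actual MPAS measurements.

def parallel_efficiency(p_ref: int, t_ref: float, p: int, t_p: float) -> float:
    """Efficiency of a run on p cores (walltime t_p) vs. a reference run."""
    speedup = t_ref / t_p
    ideal_speedup = p / p_ref
    return speedup / ideal_speedup

runs = [             # (cores, walltime in seconds per simulated hour), invented
    (16384, 120.0),  # reference run
    (65536, 33.0),
    (262144, 10.5),
    (524288, 5.0),
]
p_ref, t_ref = runs[0]
for p, t in runs:
    eff = parallel_efficiency(p_ref, t_ref, p, t)
    print(f"{p:7d} cores: efficiency {eff:5.1%}")
```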

Relevance:

100.00%

Publisher:

Abstract:

This keynote presentation will report some of our research work and experience on the development and application of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:

- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)

Relevance:

100.00%

Publisher:

Abstract:

Context: The work presented below was prepared within the scope of a new concept in the management of organizational performance, multidimensional strategic management, in which neuroeconomics was combined with the theoretical perspective of strategic human resource management (Dessler, 2005) and with the application of the quality management system (QMS) of NP 9001:2008 (Saraiva et al., 2009) to form an effective method for designing the organization's operational pathways. Objectives: The aim was to build an anatomical form of organizational performance management from the three concepts mentioned, in a context of multidimensional strategic management, allowing it to be quantified by the level of implementation of the requirements for a quality management system, in order to verify its viability, with organizational resilience depending to a large extent on human resources. Method: From the three areas mentioned, an action guide was created covering the fundamental bases for the development of an organizational structure: verification of the incidence levels of organizational performance management; characterization of different types of intervention strategies; implementation of preventive and corrective strategies; use of the four PIPE dimensions (performance, innovation, processes and commitment) as corrective strategies; integration of preventive strategies with corrective strategies; continuous diagnosis of problem situations in organizational performance management; design of intervention pathways in organizational performance management; and design of a strategic management project. The final sample consists of a third-sector organization, namely an IPSS, the Associação Recreativa Cultural e Social das Gândaras, in Lousã, in the district of Coimbra. Results: It is believed that with this work it is possible to quantify the level of implementation of the requirements for a QMS in a context of multidimensional strategic management.

Relevance:

100.00%

Publisher:

Abstract:

The transfer of the right information at the right time, and high-quality work at every stage of the company's order-delivery chain, are key factors in fulfilling the value proposition and the quality promised to the customer. The goal of this Master's thesis is to develop, for an SME, tools for better information management and for high-quality work in the enterprise resource planning (ERP) system. The research method used was action research, in which the author took part in the daily work of the target company for four months. Data were also collected through semi-structured interviews and a questionnaire survey. The research approach is qualitative. The thesis consists of a theoretical part and an applied part, after which the results are summarised in the conclusions and the summary. ERP systems collect and store the information that employees, and the people working at the company's interfaces, feed into them. It is therefore extremely important that the company has documented, uniform operating models for the processes used to store information in its systems. This thesis examines the SME's current practices for storing information in the ERP system and then develops uniform instructions for the sales order contract entered into the ERP system. The theoretical part presents quality from different perspectives, explains what quality management systems are and how they are developed, and also covers the principles of make-to-order production and the significance of an ERP system for the business. It lays the groundwork for the applied part, in which, after a problem analysis, a company-specific quality management system and new working models for information exchange and storage are developed. A further result is more efficient use of the ERP system, implemented by the software vendor: redundant item nomenclatures were removed from the software and its configuration was streamlined. The work produced work instructions for carrying out the core processes, as well as a quality management system of the company's own to support its core and support processes and its information management.