926 results for Model-Data Integration and Data Assimilation


Relevance: 100.00%

Abstract:

Numerical weather prediction and climate simulation have been among the most computationally demanding applications of high performance computing ever since they began in the 1950s. Since the 1980s, the most powerful computers have featured an ever larger number of processors; by the early 2000s, this number was often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation but the amount of communication required between the processors, and the communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors for a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high performance computers. The results attained in these articles have already been widely applied: most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world currently use methods introduced in them. Some further studies extend parallelization opportunities into other parts of the weather forecasting environment, in particular to the data assimilation of satellite observations.

Relevance: 100.00%

Abstract:

Background: The G1-to-S transition of the cell cycle in the yeast Saccharomyces cerevisiae involves an extensive transcriptional program driven by the transcription factors SBF (Swi4-Swi6) and MBF (Mbp1-Swi6). Activation of these factors ultimately depends on the G1 cyclin Cln3. Results: To determine the transcriptional targets of Cln3 and their dependence on SBF or MBF, we first used DNA microarrays to interrogate gene expression upon Cln3 overexpression in synchronized cultures of strains lacking components of SBF and/or MBF. Second, we integrated this expression dataset with other heterogeneous data sources into a single probabilistic model based on Bayesian statistics. Our analysis produced more than 200 transcription factor-target assignments, validated by ChIP assays and by functional enrichment. Our predictions show higher internal coherence and predictive power than previous classifications. Our results support a model whereby SBF and MBF may be differentially activated by Cln3. Conclusions: Integration of heterogeneous genome-wide datasets is key to building accurate transcriptional networks. By such integration, we provide here a reliable transcriptional network of the G1-to-S transition in the budding yeast cell cycle. Our results suggest that, to improve the reliability of predictions, we need to feed our models with more informative experimental data.
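The Bayesian integration step described above can be sketched in miniature: under a naive independence assumption, each heterogeneous evidence source contributes a likelihood ratio that is multiplied onto the prior odds of a transcription factor-target assignment. The evidence sources and all numbers below are illustrative assumptions, not values from the study.

```python
import math

def combine_evidence(prior_odds, likelihood_ratios):
    """Combine independent evidence sources for one TF-target pair by
    multiplying likelihood ratios onto the prior odds (naive Bayes)."""
    log_odds = math.log(prior_odds) + sum(math.log(lr) for lr in likelihood_ratios)
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)  # posterior probability of a true assignment

# Hypothetical evidence for one SBF target: an expression change upon Cln3
# overexpression, a ChIP binding signal, and a promoter motif hit.
p = combine_evidence(prior_odds=0.05, likelihood_ratios=[4.0, 6.0, 2.5])
```

With these invented ratios the posterior rises from a 5:100 prior odds to 0.75, which is the basic mechanism by which weak individual data sources reinforce each other in such a model.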

Relevance: 100.00%

Abstract:

For a manufacturing company, raw materials and other materials are a central part of production, and a large amount of valuable capital is often tied up in them. This Master's thesis examined opportunities for developing materials management and inventory control in an industrial company that manufactures mineral-based insulation products. The goal of the work was to develop a new operating model for the organization, on the basis of which the company's materials management would be more efficient and systematic in terms of capital employed, and would serve production better. The work is based on an interview study carried out in the organization and on an analysis of its information systems. The interview study mapped the organization's internal materials management process; from the information system data, the starting situation was analyzed with an ABC analysis and new control parameters were calculated for the material items. As a result, a new, simplified operating model was created for the organization's materials management, including the selected control methods with their control parameters, information system integration, and recommendations for future operations.
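The ABC analysis used to assess the starting situation can be sketched as follows: items are ranked by annual consumption value, and classes are assigned by cumulative share of total value. The item names, values, and class cut-offs below are illustrative assumptions, not data from the company.

```python
def abc_classify(items, a_cut=0.8, b_cut=0.95):
    """Rank items by annual consumption value and assign ABC classes:
    class A covers roughly the top 80% of cumulative value, B the next
    15%, and C the remainder."""
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(items.values())
    classes, cumulative = {}, 0.0
    for name, value in ranked:
        cumulative += value
        share = cumulative / total
        classes[name] = "A" if share <= a_cut else ("B" if share <= b_cut else "C")
    return classes

# Hypothetical raw-material consumption values (EUR/year)
usage = {"binder": 50000, "mineral_wool": 120000, "foil": 9000,
         "glue": 3000, "tape": 1000}
```

Class A items would then get the tightest control parameters (frequent review, accurate reorder points), while class C items can be managed with simple rules, which is the usual motivation for the analysis.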

Relevance: 100.00%

Abstract:

This Master's thesis was written for Partek Oyj Abp to give the managers responsible for IT systems, and those who rely on them, an overview of IT application integration, and to create guidelines for integration projects. The first part of the thesis presents, at a general level and based on the literature, problem areas in business processes and the benefits that application integration brings to a business. Benefits at this general level arise from faster processes, better availability of information, and new ways of working for people. The next part presents what application integration means in practice, what the different integration alternatives are, and the advantages and disadvantages of each. Of the integration approaches, message-based integration has become the most popular because of its simplicity, reliability, and easy connectivity. Integration software can transfer, transform, process, and store messages, and these capabilities make it possible to build real-time collaboration networks. This part is based on the literature, articles, and interviews. The third part focuses on the characteristics of an integration project and creates a roadmap for the course of such a project. It presents the technical issues to be considered, the costs and benefits, and document templates for integration documentation; it is based on my own experience, interviews, and the literature. The fourth part presents an integration project carried out at Partek. The integration was built between a supplier register for buyers (PPM) and the ERP system (Baan), using one of the most popular integration tools, IBM WebSphere MQ. This part is based on the project documentation, my own experience, and the literature. The thesis closes with a summary.
Three main benefits can be achieved with the integrations and the roadmap: the reliability of information improves, the roadmap provides a model for future integrations, and thorough documentation together with standardized working practices reduces dependence on particular key individuals.
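The message-based integration pattern described above (transfer, transformation, and queuing of messages between applications) can be sketched with an in-process queue standing in for the actual middleware; the project itself used IBM WebSphere MQ. The field names and record shape below are hypothetical.

```python
import json
import queue

mq = queue.Queue()  # stands in for a WebSphere MQ message queue

def ppm_publish(supplier):
    """Source system (the PPM supplier register) serialises a record
    and puts it on the queue as a message."""
    mq.put(json.dumps(supplier))

def baan_consume():
    """Target system (the Baan ERP) receives the message and maps the
    field names to its own schema: the transform step of message-based
    integration."""
    record = json.loads(mq.get())
    return {"t_supp_id": record["supplier_id"], "t_name": record["name"]}

ppm_publish({"supplier_id": "S-100", "name": "Acme Oy"})
row = baan_consume()
```

The queue decouples the two systems: the sender does not need the receiver to be available at the same moment, which is a large part of why the message-based approach is considered simple and reliable.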

Relevance: 100.00%

Abstract:

The present study was performed in an attempt to develop an in vitro integrated testing strategy (ITS) to evaluate drug-induced neurotoxicity. A number of endpoints were analyzed using two complementary brain cell culture models and an in vitro blood-brain barrier (BBB) model after single and repeated exposure to selected drugs that covered the major biological, pharmacological and neurotoxicological responses. Furthermore, four drugs (diazepam, cyclosporine A, chlorpromazine and amiodarone) were tested in more depth as representatives of different classes of neurotoxicants that induce toxicity through different pathways. The developed in vitro BBB model allowed detection of toxic effects at the level of the BBB and evaluation of drug transport through the barrier for predicting free brain concentrations of the studied drugs. The measurement of neuronal electrical activity was found to be a sensitive tool for predicting the neuroactivity and neurotoxicity of drugs after acute exposure. The histotypic 3D re-aggregating brain cell cultures, containing all brain cell types, were found to be well suited for omics analyses after both acute and long-term treatment. The data obtained suggest that an in vitro ITS that combines information from BBB studies with metabolomics, proteomics and neuronal electrical activity measurements performed in stable in vitro neuronal cell culture systems has high potential to improve the current in vitro evaluation of drug-induced neurotoxicity.

Relevance: 100.00%

Abstract:

The Extended Kalman Filter (EKF) and the four-dimensional variational method (4D-Var) are both advanced data assimilation methods. The EKF is impractical in large-scale problems, and 4D-Var requires much effort in building the adjoint model. In this work we have formulated a data assimilation method that tackles both of these difficulties, called the Variational Ensemble Kalman Filter (VEnKF). The method has been tested with the Lorenz95 model: data were simulated from the solution of the Lorenz95 equations with normally distributed noise. Two experiments were conducted, the first with full observations and the second with partial observations. In each experiment we assimilated data with three-hour and six-hour time windows, and different ensemble sizes were tested. There is no strong difference between the results for the two time windows in either experiment. Experiment I gave similar results for all ensemble sizes tested, and a small ensemble was enough to produce good results; in Experiment II, larger ensembles produced better results and the ensemble size had to be greater. Computational speed is not yet as good as we would like; using the limited-memory BFGS method instead of the current BFGS method might improve it. The method has proven successful. Even though it is unable to match the quality of the EKF analyses, it attains significant skill in the forecasts ensuing from the analyses it produces. It has two advantages over the EKF: VEnKF does not require an adjoint model, and it can easily be parallelized.
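The Lorenz95 test setting can be sketched as follows: the model tendency, one integration step, and simulated observations with normally distributed noise. This shows only the experimental setup, not the VEnKF algorithm itself; the state size, forcing, step length, and noise level are typical choices and only assumptions here.

```python
import numpy as np

def lorenz95(x, F=8.0):
    """Lorenz95 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F,
    with cyclic indexing handled by np.roll."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step_rk4(x, dt=0.05):
    """One fourth-order Runge-Kutta step of the Lorenz95 model."""
    k1 = lorenz95(x)
    k2 = lorenz95(x + 0.5 * dt * k1)
    k3 = lorenz95(x + 0.5 * dt * k2)
    k4 = lorenz95(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Simulated truth plus Gaussian observation noise, as in the experiments
rng = np.random.default_rng(0)
truth = step_rk4(rng.standard_normal(40))
obs = truth + rng.normal(scale=0.1, size=40)
```

In the full-observation experiment every component of `obs` would be assimilated; in the partial-observation experiment only a subset of the 40 components would be passed to the filter.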

Relevance: 100.00%

Abstract:

This study expands existing research by considering both exports and tourism as potential influencing factors for economic growth. While trade in goods has been proven to be a means of growth for countries, inbound tourism, as a non-traditional export, has scarcely been examined in the literature. Using data for Italy and Spain over the periods 1954-2000 and 1964-2000 respectively, both exports of goods and tourism exports are included in the same model, and standard cointegration and Granger causality techniques are applied. The main results reveal the significance of both exports and tourism for long-term growth, with some peculiarities for each country.
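A minimal version of the Granger causality test applied in such studies can be sketched as follows: lagged values of a candidate cause are added to an autoregression of the effect, and an F-statistic compares the restricted and unrestricted residual sums of squares. The synthetic series below are illustrative only, not the Italian or Spanish data.

```python
import numpy as np

def granger_f(y, x, lags=1):
    """F-statistic: do lagged values of x improve a lag-`lags`
    autoregression of y? (A minimal form of the Granger causality test.)"""
    n = len(y)
    Y = y[lags:]
    # restricted model: constant plus lags of y
    A_r = np.column_stack([np.ones(n - lags)]
                          + [y[lags - k - 1 : n - k - 1] for k in range(lags)])
    # unrestricted model: additionally the lags of x
    A_u = np.column_stack([A_r]
                          + [x[lags - k - 1 : n - k - 1] for k in range(lags)])
    rss_r = np.sum((Y - A_r @ np.linalg.lstsq(A_r, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - A_u @ np.linalg.lstsq(A_u, Y, rcond=None)[0]) ** 2)
    df = n - lags - A_u.shape[1]
    return (rss_r - rss_u) / lags / (rss_u / df)

# Synthetic series in which x (say, tourism receipts) leads y (growth)
rng = np.random.default_rng(1)
x = rng.standard_normal(200)
y = np.concatenate([[0.0], 0.8 * x[:-1]]) + 0.1 * rng.standard_normal(200)
f_stat = granger_f(y, x)
```

A large F relative to the F(lags, df) critical value rejects the null that x does not Granger-cause y; in practice one would also test the reverse direction and check cointegration first, as the study does.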

Relevance: 100.00%

Abstract:

Data transmission between an electric motor and a frequency converter is required in variable-speed electric drives because of the sensors installed at the motor. Sensor information can be used in various applications to improve the reliability and other properties of the system. Traditionally, the communication medium is implemented with additional cabling; however, the cost of this traditional method may be an obstacle to the wider application of data transmission between a motor and a frequency converter. In any case, a power cable is always installed between the motor and the frequency converter for the power supply, and it may therefore be used as a communication medium for sensor-level data. This thesis considers power line communication (PLC) in inverter-fed motor power cables. The motor cable is studied as a communication channel in the frequency band of 100 kHz-30 MHz, and the communication channel and noise characteristics are described. All the individual components of a variable-speed electric drive are presented in detail. A channel model is developed and verified by measurements. A theoretical analysis of the channel information capacity is carried out to estimate the potential of the communication medium, and suitable communication and forward error correction (FEC) methods are suggested. A general method to implement a broadband, Ethernet-based communication medium between a motor and a frequency converter is proposed. A coupling interface is also developed that allows the communication device to be installed safely on a three-phase inverter-fed motor power cable. Practical tests are carried out and the results are analyzed, and possible applications of the proposed method are presented. A speed-feedback motor control application is verified in detail by simulations and laboratory tests because of the restrictions that PLC imposes on the delay in the feedback loop; other possible applications are discussed at a more general level.
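A theoretical channel capacity analysis of this kind commonly rests on the Shannon capacity summed over narrow sub-bands, C = sum_i B_i * log2(1 + SNR_i). The sketch below assumes that form; the SNR profile over the 100 kHz-30 MHz band is entirely hypothetical, not measured data from the thesis.

```python
import numpy as np

def channel_capacity(freqs_hz, snr_db):
    """Estimate total capacity (bit/s) by summing Shannon capacities of
    narrow sub-bands: C = sum(B_i * log2(1 + SNR_i)), where each band i
    spans freqs_hz[i]..freqs_hz[i+1] and has the SNR given at its start."""
    widths = np.diff(freqs_hz)
    snr_lin = 10.0 ** (np.asarray(snr_db[:-1]) / 10.0)  # dB -> linear
    return float(np.sum(widths * np.log2(1.0 + snr_lin)))

# Hypothetical SNR over the studied band of an inverter-fed motor cable
freqs = np.linspace(100e3, 30e6, 6)   # band edges, Hz
snr = [30, 25, 20, 15, 10, 5]         # dB, illustrative only
capacity = channel_capacity(freqs, snr)
```

With these invented numbers the estimate lands in the hundreds of Mbit/s, which shows why a broadband Ethernet-based medium over the power cable is plausible in principle; the real attainable rate depends on the measured channel and noise characteristics.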

Relevance: 100.00%

Abstract:

Changes in the cumulative cost of caries treatment in children and the association of caries treatment practices with costs. The aim of the study was to measure the cumulative costs of caries treatment in children treated at health centres and to compare them between two different treatment practices; the dental health of the children was also examined. The study was conducted from the perspective of a public service provider. The data were collected from the oral health care patient records of the Kemi and Tornio health centres. The Kemi cohorts of 1980, 1983 and 1986 (n = 600) and the Tornio cohorts of 1980 and 1992 (n = 400) represented the traditional practice, and the Kemi cohorts of 1989, 1992 and 1995 (n = 600) a new practice with respect to the division of labour and the timing of prevention. The cohorts and cities were compared in terms of dental health (dmft/DMFT = 0, and mean dmft and DMFT at ages 5 and 12) and the use of resources. Resource use was derived from visit counts via imputed working time, and cumulative costs were formed from this resource use together with provider-specific unit costs calculated from personnel expenditure. The relations between costs and health effects were assessed with a cost-effectiveness analysis. The early-prevention model that made use of dental hygienists achieved, at lower cost, better dental health before school age and equally good dental health at school age compared with the traditional model, which relied more on the work of dentists. The number of caries-related visits was smaller in the youngest birth cohorts than in the oldest, with visits to a dentist decreasing the most. The treatment practice had a significant effect on the total cost of a child's caries treatment: according to a sensitivity analysis, the costs of caries treatment were one third lower when the division of labour was exploited than if the care had been provided solely by a dentist-nurse pair.
The cost-effectiveness of children's caries treatment improved in both health centres in the younger cohorts compared with the older ones. Oral health care patient records should be exploited in developing the service, and with early prevention the work input of all oral health care professionals could be targeted cost-effectively.
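The cost-effectiveness comparison above reduces, per cohort, to a ratio of cumulative cost to health effect. The sketch below uses invented numbers purely to illustrate the comparison; they are not the Kemi or Tornio figures, and the effect measure is illustrative.

```python
def cost_effectiveness_ratio(total_cost_eur, health_effect):
    """Cost per unit of health effect, e.g. EUR per cavity-free child-year
    or per averted DMFT unit (the effect measure here is illustrative)."""
    return total_cost_eur / health_effect

# Hypothetical per-child figures for the two treatment practices
traditional = cost_effectiveness_ratio(420.0, 3.0)  # dentist-led model
new_model = cost_effectiveness_ratio(280.0, 3.5)    # hygienist-led early prevention
```

A lower ratio means more health per euro, so a practice that is both cheaper and at least as effective, as the early-prevention model was found to be, dominates the comparison.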

Relevance: 100.00%

Abstract:

Controlling the quality variables (such as basis weight and moisture) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The QCS is based on quality observations measured with a traversing scanner that follows a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine direction (MD) and cross direction (CD) variations, because the QCS controls MD and CD separately. Traditionally this is done simply by taking one scan of the zigzag path to be the CD profile and its mean value to be one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. To reach the frequency domain, the Fourier transform is utilized, and the Fourier components are then used as the state vector in a Kalman filter. The Kalman filter is a widely used data assimilation tool for combining noisy observations with a model; here the observations are the quality measurements and the model consists of the Fourier frequency components. By implementing the two-dimensional Fourier transform in the Kalman filter, we obtain an advanced tool for separating the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed, and this tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results are based on a very short sample of paper roll, the method appears to have great potential for later use as part of the quality control system.
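The core idea, using Fourier coefficients as the Kalman filter state and updating them from scattered zigzag observations, can be sketched in one dimension for the CD profile alone (the thesis uses the two-dimensional transform to cover MD as well). The basis size, noise level, and profile below are illustrative assumptions.

```python
import numpy as np

def kalman_update(xk, P, H, y, r):
    """Standard scalar-observation Kalman update: one measurement y taken
    at a zigzag point updates the Fourier-coefficient state xk."""
    S = H @ P @ H + r            # innovation variance (H is a 1-D row here)
    K = P @ H / S                # Kalman gain
    xk = xk + K * (y - H @ xk)
    P = P - np.outer(K, H @ P)   # (I - K H) P
    return xk, P

n_coef = 7                       # low-order real Fourier basis for the CD profile

def basis(c):
    """Real Fourier basis evaluated at CD position c in [0, 1)."""
    ks = np.arange(1, (n_coef - 1) // 2 + 1)
    return np.concatenate([[1.0],
                           np.cos(2 * np.pi * ks * c),
                           np.sin(2 * np.pi * ks * c)])

# True CD profile: a constant level plus one cosine bump
true_coef = np.zeros(n_coef)
true_coef[0], true_coef[1] = 1.0, 0.5

xk, P = np.zeros(n_coef), np.eye(n_coef)
rng = np.random.default_rng(2)
for c in rng.random(200):        # CD positions visited by the scanner
    y = basis(c) @ true_coef + 0.05 * rng.standard_normal()
    xk, P = kalman_update(xk, P, basis(c), y, r=0.05**2)
```

Because each observation constrains all coefficients at once, the filter recovers the whole profile from scattered points instead of waiting for a complete scan, which is exactly the advantage over the traditional scan-averaging separation.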

Relevance: 100.00%

Abstract:

The topic of this thesis is the simulation of a combination of several control and data assimilation methods intended for controlling the quality of paper in a paper machine. Paper making is a very complex process, and the information obtained from the web is sparse: a paper web scanner can only measure a zigzag path on the web. An assimilation method is therefore needed to produce estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web, and quality control is based on these estimates. There is an increasing need for intelligent methods to assist in data assimilation, and the target of this thesis is to study how such intelligent assimilation methods affect paper web quality. This work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project; the simulator is a valuable tool for comparing different assimilation methods. The thesis compares four assimilation methods: a first-order Bayesian model estimator, an ARMA model based on a higher-order Bayesian estimator, a Fourier transform based Kalman filter estimator, and a simple block estimator. The last can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, with the Kalman and ARMA estimators showing the best overall performance.
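Two of the compared approaches, the simple block estimator and a first-order Bayesian estimator of the MD level, can be contrasted in a few lines. The first-order Bayesian estimator is sketched here as a fixed-gain (exponential) update; the scan values and gain are invented for illustration.

```python
def block_estimate(scans):
    """Block estimator: the mean of the latest scan is taken directly as
    the MD level (close to the commercial practice described above)."""
    last = scans[-1]
    return sum(last) / len(last)

def bayes_estimate(scans, gain=0.3):
    """First-order Bayesian estimator, sketched as a fixed-gain update:
    each new scan mean pulls the MD level toward it, damping the
    scan-to-scan measurement noise that the block estimator passes through."""
    level = sum(scans[0]) / len(scans[0])
    for scan in scans[1:]:
        m = sum(scan) / len(scan)
        level += gain * (m - level)
    return level

# Hypothetical basis-weight readings (g/m^2) from three consecutive scans
scans = [[80.1, 79.9, 80.0], [80.4, 80.2, 80.3], [79.8, 80.0, 79.9]]
```

The block estimate jumps with every scan, while the recursive estimate moves only a fraction of each jump; the ARMA and Kalman estimators in the thesis refine this idea further with explicit noise and process models.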