919 results for Arrowhead, interoperability, soa, internet of things, smart spaces, api, simulation


Relevance:

100.00%

Publisher:

Abstract:

The purpose of the research was to investigate operational processes related to home care of the elderly, the use of assistive devices in smart home environments, and operational processes generally related to the use of assistive devices, from the point of view of productivity improvement. The themes were examined from the points of view of both the elderly and care personnel. In addition, the perspectives of the elderly people's near relatives, of the larger service system, and of the companies that provided assistive devices to the smart homes were taken into consideration. In the study of home care processes, 32 customer interviews and 17 employee interviews were carried out. This report contains a summary based on a separate report of the home care study, which was conducted in 2006. The use of technological and mechanical assistive devices and the related operational processes were investigated with the help of a smart home pilot in 2007–2008, which is described in this report. The smart home pilot was implemented in four different housing service units for elderly people in Lahti, Nastola and Hollola. The smart homes were in use during short-term housing periods related to, for instance, the end of hospitalisation, holidays of caring relatives, and assessment of living and housing conditions. More than 60 different assistive devices and technologies were brought to the smart homes. During the pilot period, the experiences of customers and personnel as well as the processes related to the use of assistive devices were investigated. The research material consisted of 20 survey questionnaires from personnel and customers, four interviews with customers, five interviews with personnel, feedback survey responses from 14 companies, and other data collected, for instance, at orientation events.
The research results highlighted the need for services tailored to an elderly person's needs and wishes while taking advantage of innovative and technological solutions. As in the earlier home care study, assistive device-related operational processes were examined with the help of the concepts of 'resource focus', 'lost motion' and 'intermediate landing'. The following were identified as the central operational processes in assistive device-related services (regardless of the service provider): (1) the acquisition process for technologies and assistive devices as well as for rearrangement and rebuilding work in the home, (2) the introduction and orientation process (for the elderly, their relatives and care personnel), (3) the information and communication process, and (4) the service and monitoring process. In addition, the research focused on the design and desirability of assistive devices as well as their costs, such as opportunity costs. The process-based points of view produced new knowledge that may be used in the future to develop service processes and clarify their ownership, so that separately managed cross-functional processes with participants from different sectors could be built to operate alongside elderly care organisations. Developing the functionality of assistive device-related services is a societally significant issue.

BACKGROUND: Available methods to simulate nucleotide or amino acid data typically use Markov models to simulate each position independently. These approaches are not appropriate for assessing the performance of combinatorial and probabilistic methods that look for coevolving positions in nucleotide or amino acid sequences. RESULTS: We have developed a web-based platform that gives user-friendly access to two phylogenetic methods implementing the Coev model: the evaluation of coevolving scores and the simulation of coevolving positions. We have also extended the capabilities of the Coev model by generalizing the alphabet used in the Markov model, which can now analyse both nucleotide and amino acid data sets. The simulation of coevolving positions is novel and builds upon the developments of the Coev model; it allows users to simulate pairs of dependent nucleotide or amino acid positions. CONCLUSIONS: The main focus of our paper is the new simulation method we present for coevolving positions. The implementation of this method is embedded within the web platform Coev-web, which is freely accessible at http://coev.vital-it.ch/ and was tested in most modern web browsers.
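The idea of simulating a pair of dependent positions can be illustrated with a toy discrete-time sketch (this is an illustration of the concept, not the authors' Coev implementation): coevolution is modelled by making substitutions into a small set of "coadapted" pair states, here taken to be the Watson-Crick pairs, more likely than all other substitutions.

```python
# Toy sketch (not the authors' Coev implementation) of simulating one
# pair of coevolving nucleotide positions: substitutions that move the
# pair into a "coadapted" state (here the Watson-Crick pairs) are
# favoured by a factor s over all other substitutions (weight d).
import random

NUCS = "ACGT"
COADAPTED = {"AT", "TA", "GC", "CG"}

def step(pair, s=10.0, d=1.0, rng=random):
    """Draw one substitution at either position of the pair."""
    candidates, weights = [], []
    for i in (0, 1):
        for n in NUCS:
            if n == pair[i]:
                continue
            new = pair[:i] + n + pair[i + 1:]
            candidates.append(new)
            weights.append(s if new in COADAPTED else d)
    return rng.choices(candidates, weights=weights, k=1)[0]

def coadapted_fraction(steps=5000, seed=1):
    """Fraction of time the simulated pair spends in coadapted states;
    under independence (s == d) this would be 4/16 = 0.25."""
    rng = random.Random(seed)
    pair, hits = "AA", 0
    for _ in range(steps):
        pair = step(pair, rng=rng)
        hits += pair in COADAPTED
    return hits / steps
```

With s=10 the pair sits in a coadapted state roughly 45 % of the time, well above the 25 % expected for independently evolving positions, which is the kind of signal the coevolution-detection methods look for.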

This thesis briefly presents the basic operation and use of centrifugal pumps and parallel pumping applications. The characteristics of parallel pumping applications are compared to electric circuits in order to find analogies between these technical fields. The purpose of studying circuit theory is to find out whether common software tools for solving circuit performance could be used to observe parallel pumping applications. The empirical part of the thesis introduces a simulation environment for parallel pumping systems based on the circuit components of Matlab Simulink software. The created simulation environment enables the observation of variable speed controlled parallel pumping systems under different control methods. The introduced simulation environment was evaluated by building a simulation model of an actual parallel pumping system at Lappeenranta University of Technology, and the simulated performance of the parallel pumps was compared to measured values from the actual system. The gathered information shows that if the initial data on the system and pump performance are adequate, the circuit-based simulation environment can be exploited to observe parallel pumping systems. The introduced simulation environment can represent the actual operation of parallel pumps with reasonable accuracy. Thereby, circuit-based simulation can be used as a research tool to develop new control methods for parallel pumps.
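The circuit analogy above can be sketched numerically: head plays the role of node voltage and flow that of branch current, so identical pumps in parallel deliver the sum of their flows at a common head. A minimal sketch with invented pump and system curve coefficients (not data from the thesis):

```python
# Minimal numerical sketch of the circuit analogy: identical parallel
# pumps deliver the sum of their flows at a common head, and the
# operating point is found where supply meets the system curve, much
# like solving for a node voltage. Coefficients are illustrative only.

def pump_flow(head, h0=40.0, a=0.002):
    """Flow one pump delivers at the given head, from the quadratic
    pump curve H = h0 - a*Q^2 (head in m, flow in m^3/h)."""
    return 0.0 if head >= h0 else ((h0 - head) / a) ** 0.5

def operating_point(n_pumps, h_static=10.0, k=0.0005):
    """Bisect for the head where total pump flow equals the system
    curve flow (H = h_static + k*Q^2)."""
    lo, hi = h_static, 40.0
    for _ in range(60):
        mid = (lo + hi) / 2
        q_pumps = n_pumps * pump_flow(mid)
        q_system = ((mid - h_static) / k) ** 0.5
        if q_pumps > q_system:
            lo = mid   # pumps supply more than the system takes: head rises
        else:
            hi = mid
    return mid, n_pumps * pump_flow(mid)
```

Adding a second pump raises the total flow, but by less than a factor of two, because the operating point climbs the system curve — the same behaviour a circuit solver would show for parallel sources feeding a nonlinear load.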

The self is a polysemic notion on which there is relative consensus in several fields, including developmental psychology. It accounts for the capacity to experience oneself as the same over time and to distinguish the 'I' that looks from the 'me' that is looked at. It guarantees a more or less coherent sense of self over time, despite the changes that occur during life, and combines processes of reflexivity and intersubjectivity; it thus contributes to maintaining coherence, a sort of Ariadne's thread in memory. To analyse the ontogeny of the self, we focused on three functional components: working memory, episodic memory and narration, in children aged 6 to 9 years (n=24, divided into two groups of 6-7 and 8-9 year-olds). We used a nonverbal visuospatial working memory test, complemented by a video game designed for our purpose, the 'elf game', which proposes a semi-guided journey through an imaginary world: children moved an elf through a landscape changing with the seasons in order to deliver a princess from a mischievous wizard. Two weeks after this 'adventure', the children's narration of their episodic memories of the story was collected. This self-narrative links up the temporalities and spaces to which the events refer, and the narrated episode was assessed for its coherence and continuity. Developmental differences affect the narrative dimensions of the memory of the game episode, as well as the effectiveness of visuospatial working memory. These developments show an increase in the 'temporal thickness of consciousness' between 6 and 9 years. The thickness of consciousness fundamentally refers to the ability of the self to live time in a cyclicity including the past, the present and the anticipated future. The observed development broadens the possibilities of linking memories and future scenarios, as well as of making sense of relations with others and with oneself.

The aim of the study is to build a set of metrics for monitoring production productivity at one of the company's plants, and to collect the data needed to use these metrics. The collected data cover 2006 and 2007 up to June, and are used to form a picture of the situation at the time of the study. The company uses a Balanced Scorecard for performance monitoring, with productivity measurement as one of its elements; however, productivity is to be examined in more detail at plant level, focusing on the productivity of production. The measurement system resulting from the study serves as a management tool and decision support. It is expected to produce up-to-date information on the plant's productivity and its development, and it must be clear and understandable, so that it shows which factors build productivity and how productivity can be influenced. The metrics are built using literature on productivity and performance measurement and on different types of metrics. In addition, company representatives are interviewed about productivity: thematic interviews give a picture of how productivity is understood in the company, how it should be measured, and what kind of information the metrics should produce. The end result of the study is a measurement system that measures the productivity of one of the company's plants through partial productivities. It gives a picture of the current productivity level, against which the situation can be compared in the future.
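The partial-productivity idea above can be sketched in a few lines: total output is divided by one input factor at a time and tracked against a baseline period. All figures below are invented examples, not data from the case company.

```python
# Illustrative sketch of partial productivities: output per unit of
# each input factor, indexed against a baseline period. All figures
# are invented examples, not data from the case company.

def partial_productivities(output, inputs):
    """Output per unit of each input factor."""
    return {name: output / qty for name, qty in inputs.items()}

def productivity_index(current, baseline):
    """Each partial productivity relative to the baseline period
    (1.0 = unchanged)."""
    return {name: current[name] / baseline[name] for name in baseline}

base = partial_productivities(1000.0, {"labour_h": 500, "material_t": 40, "energy_MWh": 20})
now = partial_productivities(1100.0, {"labour_h": 500, "material_t": 44, "energy_MWh": 20})
index = productivity_index(now, base)
# labour and energy productivity rise by 10 %; material productivity is unchanged
```

A breakdown like this shows which input factor drives an observed productivity change, which is the kind of visibility the measurement system is meant to give management.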

Electricity market parties have identified numerous shortcomings and challenges in data exchange between information systems. This thesis addresses the data exchange problem from the perspective of metering data flows. As data exchange has grown and messages have diversified, far from all necessary messages have been standardized, and the standards already created or proposed differ considerably, for example between countries. The thesis describes the current data exchange solution between an automatic meter reading system and the distribution network company's metering data warehouse, together with the related metering data flows. It also presents the benefits brought by smart meters, discusses the opportunities offered by new measurements, and considers the functionality, shortcomings and problems of the current coding practices for data flows. An example standard is drawn up for modelling a metering data flow configuration in the electricity market. The focus of the work is on energy metering data flows, starting from billing needs. The goal is to standardize the data flow caused and enabled by the new measurements of the smart meters introduced in automatic meter management projects. The thesis discusses how the data flow can be transferred intact between the systems of the metering service provider and the distribution network company, and how new data usage needs should be coded. The proposed new message standards are presented as XML models, and finally their functionality and further development needs are discussed.
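To make the XML modelling concrete, a minimal interval-metering message of the general kind discussed above can be built programmatically. The element names and the metering-point code below are hypothetical illustrations, not the schema proposed in the thesis.

```python
# A minimal sketch of an XML metering message for moving interval
# readings between the metering service provider's and the
# distribution company's systems. Element names and the metering-point
# code are hypothetical, not the thesis schema.
import xml.etree.ElementTree as ET

def metering_message(metering_point, unit, readings):
    """Build a time-series message; readings is a list of
    (ISO timestamp, value) pairs."""
    root = ET.Element("MeteringData")
    ET.SubElement(root, "MeteringPoint").text = metering_point
    series = ET.SubElement(root, "TimeSeries", unit=unit)
    for ts, value in readings:
        obs = ET.SubElement(series, "Observation", time=ts)
        obs.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_out = metering_message(
    "FI-0001", "kWh",
    [("2009-01-01T00:00", 1.25), ("2009-01-01T01:00", 1.10)],
)
```

Because the message is self-describing, the receiving system can validate the metering point, unit and timestamps before loading the readings into the metering data warehouse, which is one way the "intact transfer" requirement can be supported.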

There is an increasing reliance on computers to solve complex engineering problems, because computers, in addition to supporting the development and implementation of adequate and clear models, can especially minimize the financial support required. The ability of computers to perform complex calculations at high speed has enabled the creation of highly complex systems to model real-world phenomena. The complexity of fluid dynamics problems makes it difficult or impossible to solve the equations for an object in a flow exactly. Approximate solutions can be obtained by constructing and measuring prototypes placed in a flow, or by numerical simulation. Since the use of prototypes can be prohibitively time-consuming and expensive, many have turned to simulations to provide insight during the engineering process, as the simulation setup and parameters can be altered much more easily than in a real-world experiment. The objective of this research work is to develop numerical models for different suspensions (fiber suspensions, blood flow through microvessels and branching geometries, and magnetic fluids), as well as for fluid flow through porous media. The models have merit as scientific tools and also have practical applications in industry. Most of the numerical simulations were done with the commercial software Fluent, with user-defined functions added to apply a multiscale method and a magnetic field. The results from the simulation of fiber suspensions can elucidate the physics behind the break-up of a fiber floc, opening the possibility of developing a meaningful numerical model of fiber flow. The simulation of blood movement from an arteriole to a venule via a capillary showed that the model based on VOF can successfully predict the deformation and flow of RBCs in an arteriole; furthermore, the result corresponds to experimental observations illustrating that the RBC is deformed during the movement.
The concluding remarks provide a methodology and a mathematical and numerical framework for the simulation of blood flows in branching geometries. Analysis of the ferrofluid simulations indicates that the magnetic Soret effect can be even higher than the conventional one and that its strength depends on the strength of the magnetic field, as confirmed experimentally by Völker and Odenbach. It was also shown that when a magnetic field is perpendicular to the temperature gradient, there is an additional increase in heat transfer compared to the cases where the magnetic field is parallel to the temperature gradient. In addition, a statistical evaluation (the Taguchi technique) of the magnetic fluids showed that the temperature and the initial concentration of the magnetic phase make the maximum and minimum contributions to thermodiffusion, respectively. In the simulation of flow through porous media, the dimensionless pressure drop was studied at different Reynolds numbers, based on pore permeability and interstitial fluid velocity. The obtained results agreed well with the correlation of Macdonald et al. (1979) for the range of actual flow Reynolds numbers studied. Furthermore, the calculated dispersion coefficients in the cylinder geometry were found to be in agreement with those of Seymour and Callaghan.
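The benchmark correlation of Macdonald et al. (1979) mentioned above is a modified Ergun equation; a sketch of that form follows, with purely illustrative bed and fluid parameters (A = 180 and B = 1.8 are the commonly quoted Macdonald constants for smooth particles).

```python
# Sketch of the modified Ergun equation in the form of Macdonald et
# al. (1979): a viscous (Darcy) term plus an inertial term. The bed
# and fluid parameters below are illustrative; A = 180 and B = 1.8
# are the commonly quoted constants for smooth particles.

def pressure_gradient(u, eps, d_p, mu, rho, A=180.0, B=1.8):
    """Pressure drop per unit length (Pa/m) at superficial velocity u
    (m/s) through a bed of porosity eps and particle diameter d_p (m);
    mu is the fluid viscosity (Pa.s) and rho its density (kg/m^3)."""
    viscous = A * mu * (1 - eps) ** 2 / (eps ** 3 * d_p ** 2) * u
    inertial = B * rho * (1 - eps) / (eps ** 3 * d_p) * u ** 2
    return viscous + inertial

# At low Reynolds number the viscous term dominates, so the pressure
# gradient is nearly linear in u: doubling u roughly doubles it.
dp_slow = pressure_gradient(0.001, eps=0.4, d_p=1e-3, mu=1e-3, rho=1000.0)
dp_fast = pressure_gradient(0.002, eps=0.4, d_p=1e-3, mu=1e-3, rho=1000.0)
```

At higher velocities the quadratic inertial term takes over, which is why the dimensionless pressure drop in the study is reported as a function of the Reynolds number rather than as a single constant.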

This Master's thesis studied the integration of a wood chip pre-hydrolysis process and a logging residue hydrolysis process into a pulp mill for bioethanol production. A simulation model of such a biorefinery was created with the WinGEMS simulation program and used to study the effect of the bioethanol process on the pulp mill's mass and energy balances, as well as the preliminary profitability of the biorefinery. Three cases were simulated, each assuming softwood pulp production of 1000 tonnes per day and logging residue use equal to 10 % of the required amount of pulpwood: (1) pre-hydrolysis of wood chips and hydrolysis of logging residue for ethanol production; (2) pre-hydrolysis of wood chips, with the logging residue burned in the bark boiler; (3) no pre-hydrolysis, with the logging residue burned in the bark boiler. Compared with case 3, wood consumption grows by 16 % when the wood chips are pre-hydrolysed before cooking in cases 1 and 2. With this increased wood consumption, case 1 produces 149 tonnes of ethanol and 240 MWh more surplus electricity per day, while case 2 produces 68 tonnes of ethanol and 460 MWh more surplus electricity per day. This would yield additional annual cash flow of EUR 18.8 million in case 1 and EUR 9.4 million in case 2. Evaporating the hydrolysate (the product solution of the hydrolysis) and evaporating and burning the organic residues of the hydrolysis processes increase the loads on the evaporation plant and the recovery boiler. Compared with case 3, in cases 1 and 2 the number of evaporation stages must be increased from five to seven and the required heat transfer areas nearly doubled. The recovery boiler load grows by 39 % in case 1 and by 26 % in case 2.

Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of the problems are considered, simulation time tends to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating such simulations. The success of parallel simulation depends heavily on the combination of the simulation application, the algorithm and the environment. In this thesis a conservative parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. The thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. The distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. The novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified by using the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the overall simulation time. The null message cancellation technique reduces the processing time of null messages, as an arriving null message cancels other unprocessed null messages. Multiple message simulation forms groups of messages, as it simulates several messages before it releases the newly created messages.
If the message population in the simulation is sufficient, no additional delay is caused by this operation. A new technique for taking the simulation application into account is also presented: performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, where the properties of the application exclusively determine which connections are necessary when a certain accuracy of simulation results is required. Distributed simulation is also analyzed in order to find out the effect of the different elements in the implemented simulation environment. This analysis is performed using critical path analysis, which allows the determination of a lower bound for the simulation time. In this thesis critical times are computed for sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
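The null-message cancellation idea can be shown with a toy channel queue (an illustration of the concept, not the Diworse implementation): a newly arriving null message carries a newer timestamp lower bound, so older unprocessed null messages on the same channel become redundant and can be dropped before the receiver ever processes them.

```python
# Toy illustration of null-message cancellation in a conservative
# (Chandy-Misra style) simulation: a null message only promises "no
# event before t", so a newer null message supersedes any unprocessed
# ones queued behind it. Not the Diworse implementation.
from collections import deque

class Channel:
    def __init__(self):
        # (timestamp, payload); payload None marks a null message
        self.queue = deque()

    def send(self, timestamp, payload=None):
        if payload is None:
            # drop pending null messages superseded by the new bound
            while self.queue and self.queue[-1][1] is None:
                self.queue.pop()
        self.queue.append((timestamp, payload))

    def pending(self):
        return list(self.queue)

ch = Channel()
ch.send(1)            # null message: "no event before t=1"
ch.send(2)            # supersedes and cancels the t=1 null message
ch.send(3, "event")   # a real message is never cancelled
ch.send(4)            # null message queued behind the real message
```

The receiver ends up processing fewer null messages, which is exactly where the reduction in null-message processing time comes from.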

This paper presents a brief description of the archaeological research in the territory and the city of Tarraco, the ancient capital of provincia Hispania Tarraconensis and one of the main centres for the spread of Hispanic Christianity. Although Tarraco was the last capital under imperial control and the first Hispanic metropolitan see, the city had only a secondary role by comparison with other Hispanic cities during late antiquity. This evolution shaped the development of Tarraco during the 7th century, but archaeologists identify an important architectural vitality still in the 6th century, at the same time as other episcopal cities were evolving. During this period, the final Christianization of the symbolic spaces of ancient paganism took place, establishing the ideological basis of the medieval urbanism that is still preserved today. The paper also interprets the sites by raising key questions, and describes rural settlements, where archaeological knowledge is not so far advanced, due in part to the difficult nature of the archaeological research, and in part to the need to study new constructive models and to collect the relevant material culture systematically.

Patent records contain many kinds of information that is publicly available. By analysing patent data, many things can be determined, such as the numbers of patents filed by companies, the distribution of patent classifications, and the geographical distribution of patents. As late as the 1980s, analysing patent data on a larger scale was difficult, because the computing power available was insufficient. Thanks to the explosive growth in computing power, millions of patents can now be analysed at a time. This thesis studies patent databases, the setting up of a database server, and the exploitation of the data in further use.
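The kind of aggregate query involved can be illustrated with an in-memory SQLite table; the schema and rows below are hypothetical examples, not the database built in the thesis.

```python
# Toy illustration of a patent-statistics query: patents grouped by
# classification. The schema and rows are hypothetical, not the
# database built in the thesis.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patents (applicant TEXT, ipc_class TEXT, year INT)")
conn.executemany(
    "INSERT INTO patents VALUES (?, ?, ?)",
    [("ACME", "H04L", 2006), ("ACME", "H04L", 2007), ("Beta", "G06F", 2007)],
)
# counts per classification: the kind of aggregate that was hard to
# compute at scale before cheap computing power
per_class = dict(conn.execute(
    "SELECT ipc_class, COUNT(*) FROM patents GROUP BY ipc_class"
).fetchall())
```

The same GROUP BY pattern scales to counts per applicant, per country, or per year once the records are loaded onto a database server.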

The topic of this thesis is the simulation of a combination of several control and data assimilation methods, meant to be used for controlling the quality of paper in a paper machine. Paper making is a very complex process and the information obtained from the web is sparse: a paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to process estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web, and quality control is based on these estimates. There is an increasing need for intelligent methods to assist in data assimilation, and the target of this thesis is to study how such intelligent assimilation methods affect paper web quality. The work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project; the simulator is a valuable tool for comparing different assimilation methods. The thesis compares four data assimilation methods: a first-order Bayesian model estimator, a higher-order Bayesian estimator based on an ARMA model, a Fourier transform based Kalman filter estimator, and a simple block estimator. The last can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, with the Kalman and ARMA estimators performing best overall.
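The estimation problem can be sketched in miniature, in the spirit of the Kalman approach compared above: the scanner visits one CD bin at a time along its zigzag path, and each bin's estimate is updated recursively. The profile, noise levels and tuning values below are invented for illustration.

```python
# Minimal sketch of recursively estimating a Cross Direction (CD)
# profile from zigzag scanner samples, one CD bin per sample, using a
# scalar Kalman update per bin. Profile and noise values are invented.
import random

def estimate_cd_profile(n_bins, samples, q=0.001, r=0.25):
    """samples is a list of (bin_index, measured_value); q is the
    assumed profile drift variance between visits, r the measurement
    noise variance."""
    x = [0.0] * n_bins          # estimated CD profile
    p = [1.0] * n_bins          # estimate variance per bin
    for i, z in samples:
        p[i] += q               # predict: uncertainty grows between visits
        gain = p[i] / (p[i] + r)
        x[i] += gain * (z - x[i])   # correct with the scanner sample
        p[i] *= 1 - gain
    return x

rng = random.Random(0)
true_profile = [0.5, 1.0, 1.5, 1.0, 0.5]
samples = [(i % 5, true_profile[i % 5] + rng.gauss(0, 0.5))
           for i in range(200)]
est = estimate_cd_profile(5, samples)
# est approaches true_profile as scans accumulate
```

A simple block estimator would instead average recent samples per bin with equal weights; the Kalman form weights each new sample by the current uncertainty, which is one reason such estimators can outperform the block approach.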

The increasing usage of Web Services has been a result of efforts to automate Web Service discovery and interoperability. Semantic Web Service descriptions create the basis for automating Web Service information management tasks such as discovery and interoperability, and the opportunities enabled by such service descriptions have been discussed widely in recent years. Traditionally the end user has been considered only a consumer of services, with information flowing from a single service provider to the public. Social networking has changed the nature of services: the end user can no longer be seen only as a service consumer, because given a semantically rich environment and the right tools, end users will in the future also produce services. This study investigates ways to empower end users to create service descriptions on a mobile device, with special focus on the changed role of the end user in service creation. In addition, Web Services technologies are presented and different Semantic Web Service description approaches are compared. The main focus of the study is to investigate tools and techniques that enable service description creation and semantic information management on a mobile device.

This Master's thesis presents the characteristics of smart grids. The aim is to determine what benefits can be achieved by modernizing electricity networks, and the thesis also examines the reasons that have driven this modernization. Smart metering is an important part of smart grids, and the thesis reviews the state of smart electricity meters in different countries. It also examines the role of distributed small-scale generation in electricity production. The final part of the thesis presents a few smart grid projects from Europe. The projects were chosen from Europe because smart grids are an essential factor in the European Commission's 20/20/20 targets. Both projects culminate in integrating fully smart metering into consumers' electricity metering. The examples show the timescale on which smart grids can be realized and how large the network systems of the future will be.

The main objective of the study was to develop the target company's cost accounting; for this purpose, the actual costs of activities were determined and a new spreadsheet-based pricing model was built. The actual costs were determined using activity-based costing, whereas the company's previous cost accounting was based on traditional overhead absorption costing. The work was divided into two phases: an analysis of the current state of the company's cost accounting, and the implementation of activity-based costing. The theoretical part of the first phase presented the methods of traditional cost accounting and activity-based costing and compared them with each other. The empirical part covered the company's cost structure, product costing, pricing process and different pricing objects. Based on the current-state analysis, a list was drawn up of the things to improve in the current cost accounting and pricing, and it was decided to carry out the development using activity-based costing. The second phase presented the theory of implementing and deploying activity-based costing, after which the activity costs were calculated and the new pricing model was built. In the pricing model, speed was sought through a new way of calculating materials. The results showed that the actual costs of many activities differed from the costs calculated with absorption costing, which had distorted product pricing. With the introduction of activity-based costing, the company's cost accounting and product pricing were brought into line with actual costs, and the faster pricing produced significant cost savings.
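The activity-based costing logic applied in the study can be sketched generically: overhead is first traced to activities, then charged to products through cost drivers. The activities, rates and volumes below are invented examples, not the case company's data.

```python
# Generic sketch of activity-based costing (ABC): overhead pooled per
# activity is charged to products via cost-driver rates. Activities,
# rates and volumes are invented examples, not the case company's data.

def driver_rates(activity_costs, driver_volumes):
    """Cost per unit of cost driver for each activity."""
    return {a: activity_costs[a] / driver_volumes[a] for a in activity_costs}

def product_cost(direct_cost, driver_use, rates):
    """Direct costs plus the activity costs the product consumes."""
    return direct_cost + sum(rates[a] * n for a, n in driver_use.items())

rates = driver_rates(
    {"setup": 20000.0, "machining": 60000.0},   # total cost pooled per activity
    {"setup": 100, "machining": 3000},          # number of setups, machine hours
)
cost_a = product_cost(50.0, {"setup": 0.1, "machining": 2.0}, rates)
# 50 + 0.1 * 200 + 2 * 20 = 110.0 per unit
```

Unlike a uniform overhead percentage, the driver rates make products that consume many setups or machine hours carry correspondingly more overhead, which is how ABC removes the kind of pricing distortion the study found.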