961 results for Client-server distributed databases
Abstract:
The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result, and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load placed on trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and understanding the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
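The abstract above only names the trusted-third-party idea it builds on; as a rough, hypothetical illustration (not the thesis's protocol, which relies on tamperproof modules), the sketch below shows an escrow process that releases items only once every party has deposited, so that either everyone receives an item or no input is revealed. Names such as Escrow and deposit are invented for this example.

```python
# Minimal sketch of fair exchange through a trusted third party (escrow).
# Hypothetical illustration only: the thesis's actual protocol is not reproduced here.

class Escrow:
    """Trusted third party: releases items only when every party has deposited."""

    def __init__(self, parties):
        self.expected = set(parties)
        self.deposits = {}          # party -> item it offers

    def deposit(self, party, item):
        if party not in self.expected:
            raise ValueError(f"unknown party {party!r}")
        self.deposits[party] = item

    def settle(self):
        # Safety: either everyone receives the item it expected, or nothing
        # about the other inputs leaves the escrow.
        if set(self.deposits) != self.expected:
            return None             # abort: no information is released
        parties = sorted(self.expected)
        # Each party receives the item deposited by the next party (a ring swap).
        return {p: self.deposits[parties[(i + 1) % len(parties)]]
                for i, p in enumerate(parties)}


if __name__ == "__main__":
    ttp = Escrow({"alice", "bob"})
    ttp.deposit("alice", "item-A")
    ttp.deposit("bob", "item-B")
    print(ttp.settle())             # {'alice': 'item-B', 'bob': 'item-A'}
```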
Abstract:
The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of these environments, including hardware and software, has unique strengths and weaknesses. No single parallel environment can be identified as the best for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of these environments, the networks connecting their processors were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm combines the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize the overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm. The scheduling algorithm is executed by the communication thread, so scheduling does not interfere with the execution of the parallel application. Performance results show that MPIT achieves considerable improvements over conventional MPI applications.
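As a minimal sketch of the dedicated-communication-thread idea described above (not MPIT itself, which builds on MPI), the following fragment lets a main thread keep computing while a separate thread drains a queue and "sends" the finished results.

```python
# Sketch of the "dedicated communication thread" pattern: one thread handles
# message traffic while the main thread keeps computing. In MPIT the
# communication thread would issue MPI calls instead of printing.

import queue
import threading
import time

outbox = queue.Queue()

def communication_thread():
    """Drains the outbox and 'sends' results."""
    while True:
        item = outbox.get()
        if item is None:                 # shutdown signal from the main thread
            break
        unit, result = item
        time.sleep(0.01)                 # stands in for network latency
        print(f"sent result {result} for work unit {unit}")

def compute(work_units):
    """Main thread: computes and hands finished results to the communicator."""
    for unit in work_units:
        result = unit * unit             # placeholder computation
        outbox.put((unit, result))       # non-blocking: computation continues at once

if __name__ == "__main__":
    comm = threading.Thread(target=communication_thread)
    comm.start()
    compute(range(5))
    outbox.put(None)                     # tell the communication thread to stop
    comm.join()
```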
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic, and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since it became known that continents move, geologists have been trying to retrace their evolution through the ages. Although Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly across the oceanic realm but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea has not yet received sufficient echo within the reconstruction community. We are nevertheless convinced that, by applying certain methods and principles, it is possible to move beyond the traditional "Wegenerian" approach and finally reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, in all necessary detail, our tools and methods. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology that places the plates and their kinematics at the centre of the problem. Using continental assemblies (referred to as "key assemblies") as anchor points distributed over the whole span of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one assembly to the next, from the past towards the present. In between, the lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to or from the continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network across space. This "dynamic plate boundaries" approach integrates plate buoyancy, spreading rates at ridges, subsidence curves, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It thus offers good control on plate kinematics and provides strong constraints for the model. This multi-source approach requires efficient data organization and management. Prior to this study, the mass of necessary data had become an obstacle that was difficult to overcome. GIS (Geographic Information Systems) and geodatabases are software tools specifically devoted to storing, managing, and analyzing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted this mass of scattered data from the geological record into valuable geodynamic information that is easily accessible for creating reconstructions. At the same time, by programming dedicated tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly strengthening the kinematic control of plate motions through plate velocity models. Based on 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. With this unique dataset we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We began with one of the major unsolved questions of modern plate tectonics: the mechanism driving plate motions. We observed that, throughout Earth's history, the plate rotation poles (which describe plate motions across the Earth's surface) tend to cluster along a band running from the North Pacific through northern South America, the Central Atlantic, North Africa, and Central Asia up to Japan. Essentially, this means that plates tend to move away from this median plane. In the absence of any methodological bias we may have overlooked, we interpret this as a possible secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next; the crustal material is symbolized by synthetic isochrons of known age. We also reconstruct the margins (active or passive), the mid-ocean ridges, and the intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
Abstract:
This thesis examines the history and evolution of information system process innovation (ISPI) processes (adoption, adaptation, and unlearning) within information system development (ISD) work in an internal information system (IS) department and in two IS software house organisations in Finland over a 43-year period. The study offers insights into the influential actors and their dependencies in deciding over ISPIs. The research uses a qualitative research approach, and the methodology involves describing the ISPI processes, how the actors searched for ISPIs, and how the relationships between the actors changed over time. The existing theories were evaluated using conceptual models of the ISPI processes based on the innovation literature in the IS area. The main focus of the study was to observe changes in the main ISPI processes over time. The main contribution of the thesis is a new theory, where the term theory should be understood as 1) a new conceptual framework of the ISPI processes and 2) new ISPI concepts and categories, together with the relationships between the ISPI concepts inside the ISPI processes. The study provides a comprehensive and systematic account of the history and evolution of the ISPI processes; reveals the factors that affected ISPI adoption; studies ISPI knowledge acquisition, information transfer, and adaptation mechanisms; and reveals the mechanisms affecting ISPI unlearning, the changes in the ISPI processes, and the diverse actors involved in the processes. The results show that both the internal IS department and the two IS software houses sought opportunities to improve their technical skills and career paths, and this created an innovative culture. When new technology generations come to the market, the platform systems need to be renewed, and therefore the organisations invest in ISPIs in cycles. The extent of internal learning and experimentation was greater than that of external knowledge acquisition. Until the outsourcing event (1984), decision-making was centralised and the internal IS department was very influential over ISPIs. After outsourcing, decision-making became distributed between the two IS software houses, the IS client, and its internal IT department. The IS client wanted to ensure that the information systems would serve the business of the company and thus wanted to co-operate closely with the software organisations.
Abstract:
Technological development brings more and more complex systems to the consumer markets. The time required to bring a new product to market is crucial for the competitive edge of a company. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power, and distributed simulation can be used to meet these demands. Distributed simulation, however, has its own problems. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, the communication protocols, the partitioning and distribution of the problem, the capabilities of the computing and communication equipment, and the external load. Offices offer vast amounts of unused computing capacity in the form of idle workstations. Using this computing power for distributed simulation requires the simulation to adapt to a changing load situation: all or part of the simulation work must be removed from a workstation when its owner wishes to use the workstation again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied, different approaches to load balancing are discussed, and load balancing is shown to perform better than no load balancing.
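As a rough illustration of the load-balancing idea described above (and not the actual Diworse mechanism), the sketch below moves simulation partitions off any workstation whose external load exceeds a threshold and onto the currently least-loaded machine; the 0.1 weight per partition is an arbitrary choice for the example.

```python
# Rough sketch of load balancing for distributed simulation on idle workstations:
# when external load appears on a workstation (its owner returns), its simulation
# partitions are migrated to the least-loaded machine.

def rebalance(partitions, external_load, threshold=0.5):
    """partitions: workstation -> list of simulation partitions
    external_load: workstation -> current owner-induced load (0..1)."""
    overloaded = [w for w, load in external_load.items() if load > threshold]
    for w in overloaded:
        while partitions[w]:
            # Move work to the workstation with the smallest combined load.
            target = min(partitions,
                         key=lambda t: external_load[t] + 0.1 * len(partitions[t]))
            if target == w:
                break               # nowhere better to put the work
            partitions[target].append(partitions[w].pop())
    return partitions

if __name__ == "__main__":
    parts = {"ws1": ["p1", "p2"], "ws2": ["p3"], "ws3": []}
    load = {"ws1": 0.9, "ws2": 0.1, "ws3": 0.0}   # ws1's owner is back
    print(rebalance(parts, load))                 # ws1's partitions end up on ws3
```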
Abstract:
This article introduces a new interface for T-Coffee, a consistency-based multiple sequence alignment program. This interface provides easy and intuitive access to the most popular functionality of the package, including the default T-Coffee mode for protein and nucleic acid sequences, the M-Coffee mode that allows the output of other aligners to be combined, and the template-based modes of T-Coffee that deliver high-accuracy alignments using structural or homology-derived templates. The three available template modes are Expresso, for the alignment of proteins with a known 3D structure; R-Coffee, for aligning RNA sequences with conserved secondary structures; and PSI-Coffee, for accurately aligning distantly related sequences using homology extension. The new server benefits from recent improvements of the T-Coffee algorithm, can align up to 150 sequences of up to 10,000 residues, and is available from both http://www.tcoffee.org and its main mirror http://tcoffee.crg.cat.
Abstract:
This thesis examines VMware's Vix and VmCOM APIs and develops programs with which a user can manage VMware Server through these interfaces. Using the Vix API, a command-line program written in C++ communicates directly with VMware Server. A server is also built that uses TCP for connection management between the user and the server and talks to VMware Server through the VmCOM API. A client program is created for the Windows operating system, in two versions: one that connects directly to VMware Server and one that goes through the server. Dynamic web pages built with ASP.NET likewise use the connection-management server. The dynamic web pages can also be used to edit users' data tables stored in SQL Server. The Windows programs, the dynamic web pages, and the TCP server are written in the C# programming language.
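A minimal sketch of the middle-tier pattern described above: a small TCP server receives client commands and forwards them to a backend call. The thesis implements this tier in C# against the VmCOM API; the Python code and the backend_execute placeholder below are only illustrative.

```python
# Sketch of a middle-tier TCP command server that relays client commands to a
# backend. The backend call is a placeholder, not the real VmCOM API.

import socketserver

def backend_execute(command: str) -> str:
    """Placeholder for the call into the virtualization API."""
    return f"executed: {command}"

class CommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for raw in self.rfile:                  # one command per line
            command = raw.decode().strip()
            if not command or command == "quit":
                break
            reply = backend_execute(command)
            self.wfile.write((reply + "\n").encode())

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("127.0.0.1", 9000), CommandHandler) as srv:
        srv.serve_forever()                     # e.g. test with: nc 127.0.0.1 9000
```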
Abstract:
Controlling the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are kept independent of the negotiation processes while autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity, and authentication during the exchange of information between agents.
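As a hypothetical, much-simplified sketch of the interaction described above, the following code shows a monitoring agent requesting part of a patient's clinical history from a database broker agent; the message fields and agent names are invented for the example and do not reproduce the paper's architecture.

```python
# Minimal agent-interaction sketch: a monitoring agent asks a database broker
# agent for a patient's clinical history before checking a protocol step.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    receiver: str
    performative: str   # e.g. "request" or "inform"
    content: dict

class DatabaseBrokerAgent:
    """Domain agent mediating access to the clinical-history database."""
    def __init__(self, records):
        self.records = records

    def handle(self, msg: Message) -> Message:
        patient = msg.content["patient_id"]
        history = self.records.get(patient, [])
        return Message("broker", msg.sender, "inform",
                       {"patient_id": patient, "history": history})

class MonitoringAgent:
    """System agent supervising one step of a medical protocol."""
    def __init__(self, broker):
        self.broker = broker

    def check_step(self, patient_id, required_test):
        reply = self.broker.handle(
            Message("monitor", "broker", "request", {"patient_id": patient_id}))
        return required_test in reply.content["history"]

if __name__ == "__main__":
    broker = DatabaseBrokerAgent({"p-001": ["blood-test", "x-ray"]})
    monitor = MonitoringAgent(broker)
    print(monitor.check_step("p-001", "blood-test"))   # True: protocol step satisfied
```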
Abstract:
A variation of task analysis was used to build an empirical model of how therapists may facilitate the client assimilation process, as described in the Assimilation of Problematic Experiences Scale. A rational model was specified and considered in light of an analysis of therapist in-session performances (N = 117) drawn from six inpatient therapies for depression. The therapist interventions were measured with the Comprehensive Psychotherapeutic Interventions Rating Scale. Consistent with the rational model, confronting interventions were particularly useful in helping clients elaborate insight. However, rather than there being a small number of progress-related interventions at the lower levels of assimilation, therapists' use of interventions was broader than hypothesized and drew from a wide range of therapeutic approaches. Concerning the higher levels of assimilation, there were insufficient data to allow an analysis of the therapists' progress-related interventions.
Abstract:
The main objective of this study was to determine how national and organizational culture, and the norms and values associated with them, promote or hinder the development of trust in multicultural teams within a global organization. The study also sought to examine how trust develops in distributed multinational teams at WorldCom International. The empirical method is based on qualitative thematic interviews conducted with WorldCom employees. The study found that shared social norms are not as significant for the emergence of trust as expected, because WorldCom's uniform working practices and the dominant "home culture" of the American parent company establish consistent ways of operating within the teams. The continuous use of computer-mediated communication has also fostered the development of the employees' so-called social intelligence: the lack of face-to-face meetings correspondingly develops the skills needed to sense and interpret the other party's "invisible" cues conveyed in e-mails or conference calls.
Abstract:
This thesis deals with the design and implementation of web user interfaces. Particular attention is given to the Java programming language and to design patterns, which have attracted considerable attention in software design. Java has evolved substantially during its short lifetime, and the recent standard class libraries have made it possible to implement versatile and portable software. Although Java is not suited to every purpose, its reliability, portability, and the expressive power of its class libraries make it a very attractive language for building user interfaces. Design patterns are an important part of professional object-oriented design. They offer ready-made solutions to common design problems and thus save time in the design and implementation phases. Design patterns can be divided into three groups: creational, structural, and behavioral patterns. The practical part of the thesis examines the implementation of a Java user interface for Sonera's existing TradeXpress information system, a task in which Java and design patterns played a central role. The Java user interface, JavaGUI, consists of two parts: a client applet and a Jrls server. Communication between the two relies mainly on Java's RMI protocol. The Jrls server in turn acts as a client to the RLS server through a dedicated RLS protocol.
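As a generic illustration of one of the structural patterns mentioned above, and in the spirit of the JavaGUI setup in which the Jrls server forwards requests to the RLS server, the sketch below shows a simple Proxy that forwards lookups to a backend and caches the results. Python is used here only for brevity; the thesis itself works in Java with RMI, and the class names are invented.

```python
# Generic Proxy pattern sketch: a front object forwards calls to a backend
# service and adds caching, much like a middle tier standing in front of a
# remote server. Illustration only, not the thesis's Java code.

class RlsService:
    """Stands in for the real backend server."""
    def lookup(self, key: str) -> str:
        return f"value-of-{key}"

class RlsProxy:
    """Proxy that forwards calls to the backend and caches the answers."""
    def __init__(self, backend: RlsService):
        self._backend = backend
        self._cache = {}

    def lookup(self, key: str) -> str:
        if key not in self._cache:                 # only hit the backend once per key
            self._cache[key] = self._backend.lookup(key)
        return self._cache[key]

if __name__ == "__main__":
    proxy = RlsProxy(RlsService())
    print(proxy.lookup("customer-42"))   # fetched from the backend
    print(proxy.lookup("customer-42"))   # served from the proxy's cache
```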
Abstract:
The telecommunications industry is in a period of change. Old circuit-switched narrowband networks are giving way to next-generation networks. GPRS networks are arriving, and the development of 3G networks is at its most intense as equipment vendors work to solve the challenges these networks pose. It remains to be seen whether 3G networks can meet those challenges. This thesis covers the architecture, operation, and call control of a 3G network within the MSC Server concept, and walks through a few example calls. The 3G network architecture, in which call control is separated from the media layer, is presented, and the new network elements and their operation are discussed. The theory presented is based on the literature and on the generally accepted 3G standards, published by, among others, ITU-T and 3GPP together with their partner companies. Standardization work is still unfinished in many respects, which has significantly complicated the design of 3G networks. The purpose of the thesis is to examine call control in a 3G network and its implementation with the MSC Server concept. Implementation alternatives for the MSC Server are evaluated, and one of them is selected as the implementation model for call control in the 3G network. After this master's thesis, the work continues with the design of a more detailed implementation, with the goal of building a 3G network.
Abstract:
Background: The repertoire of statistical methods for the descriptive analysis of the burden of a disease has been expanded and implemented in statistical software packages in recent years. The purpose of this paper is to present a web-based tool, REGSTATTOOLS (http://regstattools.net), intended to provide analyses of the burden of cancer, or of other groups of diseases, from registry data. Three software applications are included in REGSTATTOOLS: SART (analysis of disease rates and their time trends), RiskDiff (analysis of percent changes in the rates due to demographic factors and to the risk of developing or dying from a disease), and WAERS (relative survival analysis). Results: We show a real-data application through the assessment of the burden of tobacco-related cancer incidence in two Spanish regions in the period 1995-2004. Using SART, we show that lung cancer is the most common of these cancers, with a rising incidence trend among women. We compared 2000-2004 data with those of 1995-1999 to assess percent changes in the number of cases as well as relative survival, using RiskDiff and WAERS, respectively. We show that the net increase in lung cancer cases among women was mainly attributable to an increased risk of developing lung cancer, whereas in men it was attributable to the increase in population size. Among men, lung cancer relative survival was higher in 2000-2004 than in 1995-1999, whereas it was similar among women when these time periods were compared. Conclusions: Unlike other similar applications, REGSTATTOOLS does not require local software installation, and it is simple to use, fast, and easy to interpret. It is a set of web-based statistical tools intended for automated calculation of population indicators that any professional in the health or social sciences may require.
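As a simplified sketch of the kind of decomposition RiskDiff performs, the code below splits the change in case counts between two periods into a component due to population size and a component due to the underlying risk; age structure, which RiskDiff also handles, is ignored here, and the numbers are illustrative rather than the paper's registry data.

```python
# Simplified decomposition of the change in case counts between two periods into
# a population-size component and a risk component (crude rates only).

def decompose_change(cases_1, pop_1, cases_2, pop_2):
    rate_1, rate_2 = cases_1 / pop_1, cases_2 / pop_2
    due_to_population = (pop_2 - pop_1) * rate_1        # more people, same risk
    due_to_risk = (rate_2 - rate_1) * pop_2             # same people, changed risk
    return {"total_change": cases_2 - cases_1,          # equals the sum of the two parts
            "due_to_population": due_to_population,
            "due_to_risk": due_to_risk}

if __name__ == "__main__":
    # Illustrative numbers only.
    print(decompose_change(cases_1=1000, pop_1=2_000_000,
                           cases_2=1200, pop_2=2_100_000))
```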
Abstract:
This master's thesis studies how to connect an embedded system to an Ethernet network and how the most common protocols of the TCP/IP suite operate. Based on the results, a laboratory exercise was designed for use in teaching at the department of electrical engineering. An Atmel embedded web server board (EWS) and an STK500 board for programming the server were acquired for the work. The server shipped with the source code of a TCP/IP stack enabling an Internet connection. The operation of the TCP/IP protocols was examined in theory, and their practical implementation was studied using the EWS. Features were added to the source code so that the device retrieves the time of day from a time server using the Time protocol and sends e-mail to a specified address via a mail server. The device was used both as a server and as a client. The thesis also covers the general characteristics of embedded systems and various network management utilities, and examines the network traffic travelling over Ethernet between the server and the client machine. The memory footprint of the protocol files was minimized, and the number of clock cycles consumed by the Internet connection was measured. Finally, the significance and implementation of security issues in embedded systems were considered.
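As a host-side illustration of the Time protocol (RFC 868) used above to fetch the clock reading, the sketch below connects to TCP port 37, reads a 4-byte big-endian count of seconds since 1900-01-01, and converts it to UTC. The thesis implements this on the Atmel EWS board in C; the host name in the example is only a placeholder.

```python
# RFC 868 Time protocol client: the server sends 4 bytes (seconds since
# 1900-01-01, big-endian) over TCP port 37 and then closes the connection.

import socket
import struct
from datetime import datetime, timezone

RFC868_OFFSET = 2208988800          # seconds between 1900-01-01 and 1970-01-01

def fetch_time(host: str, port: int = 37) -> datetime:
    with socket.create_connection((host, port), timeout=5) as sock:
        data = b""
        while len(data) < 4:                        # read exactly 4 bytes
            chunk = sock.recv(4 - len(data))
            if not chunk:
                raise ConnectionError("server closed before sending 4 bytes")
            data += chunk
    (seconds_since_1900,) = struct.unpack("!I", data)
    return datetime.fromtimestamp(seconds_since_1900 - RFC868_OFFSET, tz=timezone.utc)

if __name__ == "__main__":
    # Placeholder host: point this at any server that implements RFC 868.
    print(fetch_time("time.example.org"))
```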