21 results for distributed amorphous human intelligence genesis robust communication network

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of the gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of their applicability is lacking, and there are problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case and that popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?

o Accuracy, i.e. the probability of detecting deception successfully
o Ease of use, i.e. how easily the method can be applied correctly
o Time required to apply the method reliably
o No need for special equipment
o Unobtrusiveness of the method

In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? What kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A multi-criteria analysis utilizing the Analytic Hierarchy Process (AHP) was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of the different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence, and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. Discourse Analysis and the Polygraph were assessed to be the least applicable. The results of the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information (Content, Discourse, and Nonverbal Communication) can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach; there are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies concentrate on a scenario in which roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. Such a test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for modern ones that are still under development.
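
The abstract above mentions a multi-criteria analysis based on the Analytic Hierarchy Process (AHP). As a rough illustration of that kind of ranking step only (not the thesis's actual model or data), the Python sketch below derives priority weights for the five assessment criteria from a pairwise-comparison matrix using the standard principal-eigenvector method; the comparison values are invented for illustration.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for five criteria
# (Accuracy, Ease of use, Time, No special equipment, Unobtrusiveness).
# Entry [i, j] states how much more important criterion i is than j
# on Saaty's 1-9 scale; the values here are purely illustrative.
criteria = ["Accuracy", "Ease of use", "Time", "No equipment", "Unobtrusiveness"]
A = np.array([
    [1,   3,   5,   7,   5],
    [1/3, 1,   3,   5,   3],
    [1/5, 1/3, 1,   3,   1],
    [1/7, 1/5, 1/3, 1,   1/3],
    [1/5, 1/3, 1,   3,   1],
])

# The principal eigenvector of A gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.10 is conventionally considered acceptable).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 1.12  # random-consistency index for n = 5

for name, w in zip(criteria, weights):
    print(f"{name:16s} {w:.3f}")
print(f"consistency ratio: {cr:.3f}")
```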

Relevance:

100.00%

Publisher:

Abstract:

Communications play a key role in modern smart grids. The new functionalities that make the grids 'smart' require the communication network to function properly. Data transmission between the intelligent electronic devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications, such as smart metering, demand side management (DSM), and grid protection applied with communications, are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution for implementing the communication medium in power distribution grids is power line communication (PLC): power cables are already present in the distribution grids, and hence, they may be applied as a communication channel for the distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz–30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model comprising models for each channel component is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
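
The abstract refers to theoretical SNR and channel capacity analyses for the 100 kHz–30 MHz band. The sketch below is only a generic illustration of such an analysis (not the thesis's channel model or measured data): it numerically integrates the Shannon capacity C = ∫ log2(1 + SNR(f)) df over the band for an assumed, purely illustrative SNR profile.

```python
import numpy as np

# Frequency band studied for the PLC channel: 100 kHz ... 30 MHz.
f = np.linspace(100e3, 30e6, 10_000)          # Hz

# Purely illustrative SNR profile in dB: assume the SNR rolls off with
# frequency due to cable attenuation and rising noise (not measured data).
snr_db = 40.0 - 25.0 * np.log10(f / 100e3)
snr = 10.0 ** (snr_db / 10.0)

# Shannon capacity of a frequency-selective channel:
# C = integral over the band of log2(1 + SNR(f)) df, here as a Riemann sum.
df = f[1] - f[0]
capacity_bits_per_s = np.sum(np.log2(1.0 + snr)) * df

print(f"Illustrative channel capacity: {capacity_bits_per_s / 1e6:.1f} Mbit/s")
```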

Relevance:

100.00%

Publisher:

Abstract:

Recent developments in power electronics technology have made it possible to develop competitive and reliable low-voltage DC (LVDC) distribution networks. Further, islanded microgrids, that is, isolated small-scale localized distribution networks, have been proposed to supply power reliably using distributed generation. However, islanded operation faces many issues, such as power quality, voltage regulation, network stability, and protection. In this thesis, an energy management system (EMS) that ensures efficient energy and power balancing and voltage regulation is proposed for an LVDC island network that uses solar panels for electricity production and lead-acid batteries for energy storage. The EMS uses the master/slave method with a robust communication infrastructure to control production, storage, and loads. The logical basis for the EMS operations is established by proposing functionalities for the network components and by defining appropriate operation modes that cover all situations. During loss-of-power-supply periods, load prioritization and disconnections are employed to maintain the power supply to at least some loads. The proposed EMS ensures an optimal energy balance in the network. A sizing method based on discrete-event simulation is also proposed to obtain reliable capacities for the photovoltaic array and the battery. In addition, an algorithm has been developed to determine the number of hours of electric power supply that can be guaranteed to customers at any given location. The successful performance of all the proposed algorithms is demonstrated by simulations.
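
The abstract describes load prioritization and disconnection during loss-of-power-supply periods. As a simplified illustration of that idea only (not the thesis's actual EMS logic), the Python sketch below sheds the lowest-priority loads until the remaining demand fits within the power currently available from the photovoltaic array and the battery; all load names, priorities, and power figures are invented.

```python
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    power_kw: float
    priority: int  # 1 = most critical, larger numbers = less critical

def shed_loads(loads, available_kw):
    """Keep loads in priority order until the available power is used up.

    Returns (kept, shed) lists; a toy sketch of load prioritization
    during a loss-of-power-supply period.
    """
    kept, shed = [], []
    remaining = available_kw
    for load in sorted(loads, key=lambda l: l.priority):
        if load.power_kw <= remaining:
            kept.append(load)
            remaining -= load.power_kw
        else:
            shed.append(load)
    return kept, shed

# Illustrative example: PV plus battery can currently deliver 4.0 kW.
loads = [
    Load("heating", 3.0, priority=2),
    Load("refrigeration", 0.5, priority=1),
    Load("lighting", 0.8, priority=1),
    Load("sauna", 6.0, priority=3),
]
kept, shed = shed_loads(loads, available_kw=4.0)
print("kept:", [l.name for l in kept])
print("shed:", [l.name for l in shed])
```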

Relevance:

100.00%

Publisher:

Abstract:

Superheater corrosion causes vast annual losses for power companies. With a reliable corrosion prediction method, plants can be designed accordingly, and knowledge of fuel selection and the determination of process conditions may be utilized to minimize superheater corrosion. Growing interest in using recycled fuels creates additional demands for the prediction of corrosion potential. Models that depend on corrosion theories will fail if the relations between the inputs and the output are poorly known. A prediction model based on fuzzy logic and an artificial neural network is able to improve its performance as the amount of data increases. The corrosion rate of a superheater material can most reliably be detected with a test done in a test combustor or in a commercial boiler. The steel samples can be located in a special, temperature-controlled probe and exposed to the corrosive environment for a desired time. These tests give information about the average corrosion potential in that environment. Samples may also be cut from superheaters during shutdowns. The analysis of samples taken from probes or superheaters after exposure to a corrosive environment is a demanding task: if the corrosive contaminants can be reliably analyzed, the corrosion chemistry can be determined and an estimate of the material lifetime can be given. In cases where the reason for corrosion is not clear, determining the corrosion chemistry and estimating the lifetime is more demanding. In order to provide a laboratory tool for the analysis and prediction, a new approach was chosen. During this study, the following tools were generated:

· A model for the prediction of superheater fireside corrosion, based on fuzzy logic and an artificial neural network and built upon a corrosion database developed from fuel and bed material analyses and measured corrosion data. The developed model predicts superheater corrosion with high accuracy at the early stages of a project.

· An adaptive corrosion analysis tool based on image analysis, constructed as an expert system. This system utilizes the implementation of user-defined algorithms, which allows the development of an artificially intelligent system for the task. According to the results of the analyses, several new rules were developed for determining the degree and type of corrosion.

By combining these two tools, a user-friendly expert system for the prediction and analysis of superheater fireside corrosion was developed. This tool may also be used to minimize corrosion risks in the design of fluidized bed boilers.
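
The abstract describes a prediction model that combines fuzzy logic with an artificial neural network trained on fuel and bed material analyses and measured corrosion data. The sketch below is only a generic illustration of that kind of regression setup: the feature names, data, and network size are invented, and scikit-learn's MLPRegressor stands in for the thesis's actual fuzzy/neural model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Invented feature vectors: e.g. fuel chlorine content (wt-%), sulphur
# content (wt-%), alkali content (wt-%), and material temperature (deg C).
X = np.array([
    [0.05, 0.3, 0.1, 480],
    [0.30, 0.2, 0.4, 520],
    [0.80, 0.1, 0.9, 550],
    [0.10, 0.5, 0.2, 500],
    [0.60, 0.1, 0.7, 540],
])
# Invented measured corrosion rates (e.g. mm/year) for those conditions.
y = np.array([0.02, 0.15, 0.60, 0.05, 0.40])

# A small neural-network regressor as a stand-in for the thesis's
# fuzzy-logic / ANN prediction model.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, y)

# Predict the corrosion rate for a new, hypothetical fuel/temperature case.
print(model.predict([[0.4, 0.2, 0.5, 530]]))
```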

Relevance:

100.00%

Publisher:

Abstract:

Passengers consider the information provided on trains increasingly important. This challenge is addressed by automating routine announcements, such as station arrival announcements. The announcements are played as audio and shown as text on displays, so the information provided is clear and of consistent quality also in foreign languages. Train information, such as the destination and the coach number, is shown on the outside wall of the train next to the doors for boarding passengers. This Master's thesis gives an overview of the standards related to the train bus: the TCN standard and the UIC-556 leaflet. In addition to the overview, the parts related to the passenger information system are discussed in more detail. The UIC-176 leaflet, which covers the displays of the passenger information system, is also reviewed, as is the modular train control and management system developed by EKE-Elektroniikka Oy. Based on the customer's requirements and the standards, a passenger information system integrated into the train control and management system is designed in this work. The WTB train bus is used to transfer text data from one coach to another; communication on the bus follows the UIC-556 leaflet, which enables European interoperability. As a result of the work, the physical structure of the system, the software structure, the required databases, the way data is entered into the system, and the telegrams used on the train bus have been defined. As a particular challenge, the situation in which the train splits towards several destinations along the route has been solved.

Relevance:

100.00%

Publisher:

Abstract:

Technology scaling has proceeded into dimensions in which the reliability of manufactured devices is becoming endangered. The decrease in reliability is a consequence of physical limitations, a relative increase in variations, and decreasing noise margins, among others. A promising solution for bringing the reliability of circuits back to a desired level is the use of design methods that introduce tolerance against possible faults in an integrated circuit. This thesis studies and presents fault tolerance methods for the network-on-chip (NoC), a design paradigm targeted at very large systems-on-chip. In a NoC, resources such as processors and memories are connected to a communication network, comparable to the Internet. Fault tolerance in such a system can be achieved at many abstraction levels. The thesis studies the origin of faults in modern technologies and explains their classification into transient, intermittent, and permanent faults. A survey of fault tolerance methods is presented to demonstrate the diversity of the available methods. Networks-on-chip are approached by exploring their main design choices: the selection of a topology, the routing protocol, and the flow control method. Fault tolerance methods for NoCs are studied at different layers of the OSI reference model. The data link layer provides a reliable communication link over a physical channel. Error control coding is an efficient fault tolerance method, especially against transient faults at this abstraction level. Error control coding methods suitable for on-chip communication are studied and their implementations presented. Error control coding loses its effectiveness in the presence of intermittent and permanent faults, and therefore other solutions against them are presented. The introduction of spare wires and split transmissions is shown to provide good tolerance against intermittent and permanent errors, and their combination with error control coding is illustrated. At the network layer, positioned above the data link layer, fault tolerance can be achieved with the design of fault-tolerant network topologies and routing algorithms. Both of these approaches are presented in the thesis, together with realizations in both categories. The thesis concludes that an optimal fault tolerance solution contains carefully co-designed elements from different abstraction levels.
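
The abstract names error control coding as the main data-link-layer defence against transient faults. As a minimal, generic illustration of the idea (not the specific codes studied in the thesis), the sketch below encodes a 4-bit value with a Hamming(7,4) code and corrects a single flipped bit, such as one caused by a transient fault on an on-chip link.

```python
import numpy as np

# Hamming(7,4): generator and parity-check matrices over GF(2),
# systematic form (the four data bits come first in the codeword).
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def encode(data_bits):
    return data_bits @ G % 2

def correct(codeword):
    syndrome = H @ codeword % 2
    if syndrome.any():
        # The syndrome equals the column of H at the error position.
        error_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
        codeword = codeword.copy()
        codeword[error_pos] ^= 1
    return codeword[:4]  # return the recovered data bits

data = np.array([1, 0, 1, 1])
cw = encode(data)
cw_faulty = cw.copy()
cw_faulty[5] ^= 1                 # a single transient bit flip on the link
print(correct(cw_faulty))         # recovers [1 0 1 1]
```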

Relevance:

100.00%

Publisher:

Abstract:

This thesis is a preliminary study targeting South-Eastern Finland. The objective was to find out the financial and functional readiness and willingness of the small and medium-sized enterprises of the region to manufacture and sell distributed bioenergy solutions collaboratively as a business network. In this case, these solutions mean small-scale (0.5–3 MW) wood-chip-fired combined heat and power (CHP) plants. South-Eastern Finland has suffered from a decline in recent years, mostly due to the problems of the paper industry, traditionally the region's strong area of industrial know-how. Local small and medium-sized companies will have to find new ways to survive the toughening competition. A group of 40 companies from suitable industries was selected, and a financial and comparative analysis was performed on them. Additionally, 19 managing directors of the companies were selected for an interview to find out their views on networking, its requirements and advantages, and their general interest in it. The studied companies were found to be generally in fairly good financial condition and, in that sense, fit for networking activities. The interviews revealed that the companies were capable of producing all the elements needed for the plants in question, and the managers appeared to be very interested in and have a positive attitude towards such business networks. Thus it can be said that the small and medium-sized companies of the region are capable of and interested in manufacturing small bio-CHP plants as a production network.

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes the design and implementation of a graphical application on a mobile device for teleoperating a mobile robot. The department of information technology at Lappeenranta University conducts research in robotics, and the main motivation was to extend the available teleoperation applications for mobile devices. The challenge was to port an existing robot software library onto an embedded device platform and then develop a suitable user interface application that provides sufficient functionality to perform teleoperation tasks over a wireless communication network. The work involved investigating previous teleoperation applications and conducting similar experiments to test and evaluate the designed application's functionality and to measure its performance on a mobile device; the identified objectives were achieved. The implemented solution offered good results for navigation purposes, particularly for teleoperating a robot within view, and it suggests solutions for exploration when no environment map is available to the operator.

Relevance:

100.00%

Publisher:

Abstract:

Human intelligence as a topic has not previously been studied in Finnish military science research. Human intelligence, or HUMINT, is an increasingly used term among soldiers, even though few know what the work actually involves. The study examined the action competence of a HUMINT operator, framed by the military-pedagogical model of a soldier's action competence. In addition, the study sought to identify the subject areas that should be taken into account in the training of a HUMINT operator, with a view to the possible future development of human intelligence training. The study also gives the reader a general picture of human intelligence work before the chapters on action competence and learning. The study was carried out as qualitative research, with hermeneutic content analysis as the research method. The material was analyzed using theory-guided analysis. The research questions were formulated to examine the HUMINT operator's action competence and learning: 1) how is the four-field model of a soldier's action competence realized in the work of a HUMINT operator, and 2) what are the most important areas for development in Finnish human intelligence training? The study found that the most important components of a HUMINT operator's action competence are the psychological and social components. In addition, the results showed that the emphasis of the training should be developed specifically towards source handling and interviewing.

Relevance:

100.00%

Publisher:

Abstract:

The use of e-mail in Finnish workplaces has grown significantly since the beginning of the 2000s. Today, a large proportion of employees use e-mail at work daily. The aim of this study is to examine the employer's right to read an employee's e-mails. In addition, attention is paid to the employee's protection of privacy and to the employer's rights and obligations. The study was carried out as qualitative research. All writings concerning the research topic were selected as research material. The material was collected mainly from Finnish legal texts, and the empirical part of the thesis consisted of four Finnish court cases. Based on the study, it can be concluded that in Finland employers respect employees' privacy, and the use of e-mail in the workplace has rarely caused problems.

Relevance:

100.00%

Publisher:

Abstract:

Over recent years, smart grids have received great public attention. Many of the proposed functionalities rely on power electronics, which, together with the communication network, play a key role in the smart grid. However, 'smartness' is not the only driver motivating the research on distribution networks based on power electronics; the vulnerability of the network to natural hazards has resulted in tightening requirements for supply security, set by both electricity end-users and the authorities. Because of the favorable price development and advancements in the field, direct current (DC) distribution has become an attractive alternative for distribution networks. In this doctoral dissertation, power electronic converters for a low-voltage DC (LVDC) distribution system are investigated. These include the rectifier located at the beginning of the LVDC network and the customer-end inverter (CEI) on the customer premises. Rectifier topologies are introduced, and topologies are chosen for analysis according to the LVDC system requirements. Similarly, suitable CEI topologies are addressed and selected for study. Applying power electronics to electricity distribution poses some new challenges. Because the electricity end-user is supplied through the CEI, the CEI is responsible for the end-user voltage quality, but it also has to be able to supply adequate current in all operating conditions, including a short circuit, to ensure electrical safety. Supplying short-circuit current with power electronics requires additional measures, and therefore, the short-circuit behavior is described and methods to overcome the high-current supply to the fault are proposed. Power electronic converters also produce common-mode (CM) and radio-frequency (RF) electromagnetic interference (EMI), which is not present in AC distribution; hence, its magnitude is investigated. To enable comprehensive research in the LVDC distribution field, a research site was built in a public low-voltage distribution network. The implementation was a joint effort by the LVDC research team of Lappeenranta University of Technology and the power company Suur-Savon Sähkö Oy, which made it possible to conduct the measurements in an actual environment. This is important especially for the EMI studies. The main results of the work concern the short-circuit operation of the CEI and the EMI issues. The applicability of power electronic converters to electricity distribution is demonstrated, and suggestions for future research are proposed.

Relevance:

50.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

40.00%

Publisher:

Abstract:

The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of these environments, including its hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purposes of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Because of the parallel and distributed nature of these environments, the networks connecting the processors were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced: application-specific information is data about the workload extracted from an application and provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize this application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important when the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm incorporates the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize the overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm, which is executed by the communication thread, so the scheduling does not interfere with the execution of the parallel application. Performance results achieved with MPIT show considerable improvements over conventional MPI applications.
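
The abstract describes MPIT, which overlaps communication and computation by dedicating one thread to MPI communication. The Python sketch below only illustrates that general overlap idea with mpi4py and a background communication thread; it is not the MPIT implementation, and the message contents, tags, and workload are invented.

```python
import threading
import queue
from mpi4py import MPI  # requires an MPI library initialized with thread support

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
outbox = queue.Queue()

def comm_thread():
    """Dedicated communication thread: drains the outbox and sends results
    to rank 0 while the main thread keeps computing."""
    while True:
        item = outbox.get()
        if item is None:          # sentinel: no more results to send
            break
        comm.send(item, dest=0, tag=11)

if rank != 0:
    sender = threading.Thread(target=comm_thread)
    sender.start()
    # "Computation": each worker produces partial results and hands them to
    # the communication thread, overlapping computation and data transfer.
    for i in range(3):
        partial = sum(x * x for x in range(rank * 10_000 * (i + 1)))
        outbox.put((rank, i, partial))
    outbox.put(None)
    sender.join()
else:
    # Rank 0 simply collects the partial results from all workers.
    expected = 3 * (comm.Get_size() - 1)
    for _ in range(expected):
        print("received:", comm.recv(source=MPI.ANY_SOURCE, tag=11))
```

Run with, for example, `mpiexec -n 4 python overlap_sketch.py` (the file name is hypothetical); the point of the sketch is only that the workers never block on communication in their compute loop.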

Relevance:

40.00%

Publisher:

Abstract:

Technological development brings more and more complex systems to the consumer markets. The time required to bring a new product to market is crucial for the competitive edge of a company. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power, and distributed simulation can be used to meet these demands. Distributed simulation, however, has its own problems. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, the communication protocols, the partitioning of the problem, the distribution of the problem, the capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capacity in the form of idle workstations. Using this computing power for distributed simulation requires the simulation to adapt to a changing load situation: all or part of the simulation work must be removed from a workstation when its owner wishes to use it again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
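
The abstract explains that the simulation must migrate work away from a workstation when its owner returns or external load appears. The sketch below is a toy illustration of that reassignment decision only and has nothing to do with Diworse's actual algorithms; the workstation names, partition labels, and load threshold are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Workstation:
    name: str
    external_load: float          # load caused by the owner, 0.0 ... 1.0
    partitions: list = field(default_factory=list)

def rebalance(stations, load_threshold=0.5):
    """Move simulation partitions off workstations whose external load is high.

    A toy illustration of the idea that idle office workstations can host
    simulation work only as long as their owners do not need the machines.
    """
    busy = [w for w in stations if w.external_load > load_threshold]
    idle = [w for w in stations if w.external_load <= load_threshold]
    if not idle:
        return stations
    for w in busy:
        while w.partitions:
            # Hand each partition to the currently least-loaded idle machine.
            target = min(idle, key=lambda s: len(s.partitions))
            target.partitions.append(w.partitions.pop())
    return stations

stations = [
    Workstation("ws1", external_load=0.9, partitions=["p1", "p2"]),
    Workstation("ws2", external_load=0.1, partitions=["p3"]),
    Workstation("ws3", external_load=0.2, partitions=[]),
]
for w in rebalance(stations):
    print(w.name, w.partitions)
```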

Relevance:

40.00%

Publisher:

Abstract:

The main objective of the study was to find out how national and organizational culture, and the norms and values related to them, facilitate or hinder the development of trust in multicultural teams in a global organization. The study also sought to find out how trust develops in distributed multinational teams at WorldCom International. The empirical research method is based on qualitative thematic interviews conducted with WorldCom employees. The study found that shared social norms are not as significant for the emergence of trust as might be expected, because WorldCom's uniform working practices and the dominant 'home culture' of the American parent company create uniform ways of working in the teams. The constant use of computer-mediated communication has contributed to the development of the employees' so-called social intelligence: the absence of face-to-face meetings correspondingly develops the skills to sense and interpret the 'invisible' cues of the other party conveyed in e-mails or during telephone conferences.