833 results for Computational Intelligence


Relevance:

20.00%

Publisher:

Abstract:

Food processes must ensure safe, high-quality products for growing consumer demand, which creates the need for better knowledge of their unit operations. Computational Fluid Dynamics (CFD) has been widely used to improve the understanding of thermal processing, one of the safest and most frequently used methods of food preservation. However, no study in the literature describes the thermal processing of liquid foods in a brick-shaped package. The present study evaluated such a process and the influence of package orientation on process lethality. It demonstrated the potential of CFD for evaluating thermal processes of liquid foods and the importance of rheological characterization and convection in their thermal processing. It also showed that package orientation does not result in different sterilization values during the thermal processing of the evaluated fluids in the brick-shaped package.
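
To give an idea of what "process lethality" and "sterilization value" mean in practice, the following minimal sketch computes F0 from a time-temperature history using the general method. The heating curve and all parameter values are illustrative assumptions, not data from the study; only the formula F0 = ∫ 10^((T − Tref)/z) dt is standard.

```python
# Minimal sketch (illustrative values only, not from the study): process
# lethality F0 computed from a time-temperature history with the general
# method, F0 = integral of 10^((T - Tref)/z) dt, via the trapezoidal rule.
import numpy as np

def lethality_f0(times_min, temps_c, t_ref=121.1, z=10.0):
    """F0 (equivalent minutes at t_ref) for a temperature history at the cold spot."""
    times = np.asarray(times_min, dtype=float)
    temps = np.asarray(temps_c, dtype=float)
    lethal_rate = 10.0 ** ((temps - t_ref) / z)
    # trapezoidal integration of the lethal rate over time
    return float(np.sum((lethal_rate[:-1] + lethal_rate[1:]) / 2.0 * np.diff(times)))

# Hypothetical heating curve at the slowest-heating zone of the package.
t = np.linspace(0.0, 40.0, 81)                # minutes
T = 70.0 + 55.0 * (1.0 - np.exp(-t / 8.0))    # deg C, illustrative only
print(f"F0 = {lethality_f0(t, T):.2f} min")
```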

Relevance:

20.00%

Publisher:

Abstract:

Phenomena in the cyber domain, especially threats to security and privacy, have proven to be an increasingly heated topic addressed by writers and scholars at an increasing pace, both nationally and internationally. However, little public research has been done on the subject of cyber intelligence. The main research question of the thesis was: to what extent is the applicability of cyber intelligence acquisition methods circumstantial? The study was conducted in a sequential manner, starting with defining the concept of intelligence in the cyber domain and identifying its key attributes, followed by identifying the range of intelligence methods in the cyber domain, the criteria influencing their applicability, and the types of operatives utilizing cyber intelligence. The methods and criteria were refined into a hierarchical model. Existing conceptions of cyber intelligence were mapped through an extensive literature study of a wide variety of sources. The established understanding was further developed through 15 semi-structured interviews with experts of different backgrounds, whose wide range of viewpoints substantially broadened the perspective on the subject. Four of the interviewed experts participated in a relatively extensive survey based on the constructed hierarchical model of cyber intelligence, which was formulated into an AHP hierarchy and executed in the Expert Choice Comparion online application. It was concluded that intelligence in the cyber domain is an endorsing, cross-cutting intelligence discipline that adds value to all aspects of conventional intelligence, that it bears a substantial number of characteristic traits, both advantageous and disadvantageous, and that the applicability of cyber intelligence methods is partly limited by circumstances.
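
As a rough illustration of how an AHP hierarchy turns expert judgments into priorities, the sketch below derives criterion weights from a pairwise-comparison matrix with the row geometric-mean approximation. The criteria names and judgment values are hypothetical; the thesis itself used the Expert Choice Comparion application, not code like this.

```python
# Minimal AHP sketch: derive criterion weights from a pairwise-comparison
# matrix using the row geometric-mean approximation. The criteria and the
# judgment values below are hypothetical.
import numpy as np

criteria = ["accuracy", "cost", "deniability", "timeliness"]

# A[i, j] = how much more important criterion i is than criterion j (Saaty's 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 2.0, 1/2],
    [1/5, 1/2, 1.0, 1/3],
    [1/2, 2.0, 3.0, 1.0],
])

geo_means = A.prod(axis=1) ** (1.0 / A.shape[0])  # geometric mean of each row
weights = geo_means / geo_means.sum()             # normalize to sum to 1

for name, w in zip(criteria, weights):
    print(f"{name:12s} {w:.3f}")
```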

Relevance:

20.00%

Publisher:

Abstract:

Gravitational phase separation is a common unit operation found in most large-scale chemical processes. The need for phase separation can arise, for example, from product purification or the protection of downstream equipment. In gravitational phase separation, the phases separate without the application of an external force. This is achieved in vessels where the flow velocity is lowered substantially compared to pipe flow. If the velocity is low enough, the denser phase settles towards the bottom of the vessel while the lighter phase rises. To find optimal configurations for gravitational phase separator vessels, several geometrical and internal design features were evaluated based on simulations using the OpenFOAM computational fluid dynamics (CFD) software. The studied features included inlet distributors, vessel dimensions, demister configurations and gas phase outlet configurations. Simulations were conducted as single-phase steady-state calculations. For comparison, additional simulations were performed as dynamic single- and two-phase calculations. The steady-state single-phase calculations provided indications of preferred configurations for most of the above-mentioned features. The results of the dynamic simulations supported the use of the computationally faster steady-state model as a practical engineering tool. However, the two-phase model gives more realistic results, especially for flows whose characteristics are not determined by a single phase alone.
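
The basic design idea, that the vessel must slow the flow enough for the denser phase to settle out, can be illustrated with a simple terminal-velocity estimate. The sketch below uses Stokes' law with made-up fluid properties and droplet size; it is an illustration of the settling principle, not part of the OpenFOAM work described above.

```python
# Illustrative sketch (not part of the OpenFOAM study): estimate the terminal
# settling velocity of a small liquid droplet in gas with Stokes' law, and the
# residence time needed for it to fall through the gas space of a separator.
# All property values below are made up; Stokes' law is valid only for
# creeping flow (particle Reynolds number well below 1).

def stokes_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity [m/s] of a small sphere in creeping flow."""
    return g * d**2 * (rho_p - rho_f) / (18.0 * mu)

d = 30e-6          # droplet diameter, m
rho_liquid = 800.0 # kg/m^3
rho_gas = 5.0      # kg/m^3
mu_gas = 1.2e-5    # Pa*s

v_settle = stokes_velocity(d, rho_liquid, rho_gas, mu_gas)
gas_space_height = 0.8  # m, vertical distance the droplet must fall
print(f"settling velocity ~ {v_settle:.4f} m/s")
print(f"required residence time ~ {gas_space_height / v_settle:.1f} s")
```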

Relevance:

20.00%

Publisher:

Abstract:

The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides some guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second studies reaction systems, a modelling framework whose rationale is built upon the biochemical interactions happening within a cell.

The process of gene assembly in ciliates has attracted a lot of attention as a research topic in the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture the gene assembly process of ciliates, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathways recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs. We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model.

Reaction systems (RS) are another nature-inspired modelling framework studied in this thesis. Their rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modelling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state and periodicity, to enable model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties ranges from P through NP- and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
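
For readers unfamiliar with reaction systems, the following minimal sketch shows the core semantics: a reaction is enabled on a state if all its reactants are present and none of its inhibitors are, and the next state is the union of the products of all enabled reactions. The reactions below are toy examples, not the heat shock response model constructed in the thesis.

```python
# Minimal sketch of reaction-system semantics. A reaction fires on a state
# when all reactants are present and no inhibitor is; the result function
# returns the union of the products of all enabled reactions.
# The reactions below are toy examples, not the thesis's heat shock model.
from dataclasses import dataclass

@dataclass(frozen=True)
class Reaction:
    reactants: frozenset
    inhibitors: frozenset
    products: frozenset

    def enabled(self, state):
        return self.reactants <= state and not (self.inhibitors & state)

def step(reactions, state):
    """Result function: union of products of all reactions enabled on the state."""
    result = set()
    for r in reactions:
        if r.enabled(state):
            result |= r.products
    return frozenset(result)

toy_rs = [
    Reaction(frozenset({"a"}), frozenset({"c"}), frozenset({"b"})),
    Reaction(frozenset({"b"}), frozenset(),      frozenset({"a", "c"})),
]

state = frozenset({"a"})
for i in range(4):
    print(i, sorted(state))
    state = step(toy_rs, state)   # {a} -> {b} -> {a, c} -> {} -> {}
```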

Relevance:

20.00%

Publisher:

Abstract:

Social insects are known for their ability to display swarm intelligence, where the cognitive capabilities of the collective surpass those of the individuals forming it by orders of magnitude. The rise of crowdsourcing in recent years has sparked speculation as to whether something similar might be taking place on crowdsourcing sites, where hundreds or thousands of people interact with each other. The phenomenon has been dubbed collective intelligence. This thesis focuses on exploring the role of collective intelligence in crowdsourcing innovations. The task is approached through three research questions: 1) what is collective intelligence; 2) how is collective intelligence manifested in websites involved in crowdsourcing innovation; and 3) how important is collective intelligence for the functioning of the crowdsourcing sites? After developing a theoretical framework for collective intelligence, a multiple case study is conducted, relying for the most part on an ethnographic approach to data collection. A variety of qualitative, quantitative and simulation modelling methods are used to analyse the complex phenomenon from several theoretical viewpoints or ‘lenses’. Two possible manifestations of collective intelligence are identified: discussion, typical of web forums; and the wisdom of crowds in evaluating crowd submissions to websites. However, neither of these appears to be specific to crowdsourcing or critical for the functioning of the sites. Collective intelligence appears to play only a minor role in the cases investigated here. In addition, this thesis shows that feedback loops, which are found in all the cases investigated, reduce the accuracy of the crowd’s evaluations when a count of votes is used for aggregation.
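
The effect of feedback loops on vote-count aggregation can be illustrated with a toy simulation, which is not the thesis's actual model: when voters judge independently, the better submission usually collects more votes, but when each voter is partly swayed by the running tally, early random noise gets amplified and accuracy drops. The herding probability and preference probability below are invented for illustration.

```python
# Toy illustration (not the thesis's model): vote-count aggregation with
# independent voters vs. voters who partly follow the running tally.
# Submission A is "truly" better: an independent voter prefers it with p = 0.6.
import random

def run_trial(n_voters=100, p_prefer_a=0.6, herding=0.0, rng=random):
    votes_a = votes_b = 0
    for _ in range(n_voters):
        if rng.random() < herding and votes_a != votes_b:
            vote_a = votes_a > votes_b          # follow the current leader
        else:
            vote_a = rng.random() < p_prefer_a  # vote independently
        votes_a += vote_a
        votes_b += not vote_a
    return votes_a > votes_b                    # did the better submission win?

def accuracy(herding, trials=2000):
    rng = random.Random(42)
    return sum(run_trial(herding=herding, rng=rng) for _ in range(trials)) / trials

for h in (0.0, 0.3, 0.6):
    print(f"herding={h:.1f}  P(better submission wins) ~ {accuracy(h):.2f}")
```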

Relevance:

20.00%

Publisher:

Abstract:

Business intelligence (BI) is an information process that includes the activities and applications used to transform business data into valuable business information. Today’s enterprises collect increasingly detailed data, which has drastically increased the amount of available business data. In order to meet changing customer needs and gain competitive advantage, businesses try to leverage this information. However, IT departments are struggling to keep up with the increased reporting needs. Therefore, the recent shift in the BI market has been towards empowering business users with self-service BI capabilities. The purpose of this study was to understand how self-service BI could help businesses to meet increased reporting demands. The research problem was approached with an empirical single case study. Qualitative data was gathered with semi-structured, theme-based interviews. The study found that the case company’s BI system was mostly used for group performance reporting. Ad-hoc and business user-driven information needs were mostly fulfilled with self-made tools and manual work. It was felt that necessary business information was not easily available. The concept of self-service BI was perceived to be helpful in meeting such reporting needs. However, it was found that the available data is often too complex for an average user to fully understand. The respondents felt that in order for self-service BI to work, the data has to be simplified and described in a way that the average business user can understand. The results of the study suggest that BI programs struggle to meet all the information needs of today’s businesses. The concept of self-service BI tries to resolve this problem by allowing users easy self-service access to necessary business information. However, business data is often complex and hard to understand. Self-service BI has to overcome this challenge before it can reach its potential benefits.

Relevance:

20.00%

Publisher:

Abstract:

A company seeking competitive advantage must be able to refine information and use it to identify new future opportunities. To form images of the future, a company has to know its operating environment and be sensitive to change trends and other signals from that environment. These vital signals relate to competitors, technological development, changes in values, global population trends or even environmental change. Spatial relationships are fundamental pillars for conceptualizing our world. Pitney (2015) has estimated that 80% of all business data contains some reference to location. Despite this, location data is still poorly utilized in support of companies’ strategic decisions. The development of technologies, fast data transfer and the integration of positioning techniques into various devices have made it possible that services and solutions utilizing location data will be seen increasingly in the business field. The goal of the study was to find out whether location intelligence can support strategic decision-making and, if so, how. The work was carried out using the constructive research method, which aims to solve a relevant problem. The constructive research was conducted in close cooperation with three SMEs, and six persons responsible for strategy were interviewed. As a result of the study, it was found that location intelligence can be used to support strategic decision-making on several different levels. In the simplest map solution, the desired data are brought onto a map to create a visual presentation that makes it easier to draw conclusions. A second-level map solution contains both location and attribute data combined from different sources; this level is often descriptive analytics that enables the analysis of various phenomena. The third and highest-level map solution offers predictive analytics and models of the future. In this case, intelligence is coded into the application, and the relationships between pieces of information are defined using either data mining or statistical analyses. As a conclusion of the study, location intelligence can provide added value in support of strategic decision-making if it is useful for the company to understand the geographical differences in various phenomena, customer needs, competitors and market changes. At its best, a location intelligence solution offers a reliable analysis in which information is transmitted unchanged from one decision-maker to another, and the reasons that led to a conclusion can be revisited later when needed.
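
As a small illustration of the simplest kind of location-intelligence question described above, the sketch below combines customer records that carry coordinates with a candidate site and asks which customers fall within a given radius. The customer data, coordinates and radius are all hypothetical.

```python
# Illustrative sketch (hypothetical data): a basic location-intelligence query,
# i.e. which customer records fall within a given radius of a candidate site.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

customers = [                      # (id, latitude, longitude), made-up points
    ("c1", 61.05, 28.19),
    ("c2", 60.98, 28.40),
    ("c3", 61.20, 28.80),
]
site = (61.06, 28.19)              # candidate location, made up

nearby = [cid for cid, lat, lon in customers
          if haversine_km(lat, lon, site[0], site[1]) <= 15.0]
print("customers within 15 km:", nearby)
```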

Relevance:

20.00%

Publisher:

Abstract:

The last two decades have provided a vast opportunity to live in and explore compulsive imaginary, or virtual, worlds through massively multiplayer online role-playing games (MMORPGs). An MMORPG gives its users a wide range of opportunities to participate with other players on the same platform, to communicate and to perform real-time actions. These games have a virtual economy which is largely player-driven. In-game currency allows users to build up their avatars and to buy or sell the goods needed to play and survive in the game. As part of the virtual economy generated through EVE Online, this thesis focuses mainly on how the prices of minerals in EVE Online behave, by applying the Jabłonska-Capasso-Morale (JCM) mathematical simulation model. The aim is to verify to what degree the model can reproduce the behaviour of the virtual economy. The model is applied to the buy and sell prices of two minerals, namely isogen and morphite. The simulation results demonstrate that the JCM model fits the mineral prices reasonably well, which lets us conclude that virtual economies behave similarly to real ones.
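
The JCM model itself is not reproduced here. As a stand-in, the sketch below shows the general workflow of simulating a simple mean-reverting price process, the kind of exercise involved in reproducing a price series with a stochastic model; it is explicitly not the Jabłonska-Capasso-Morale model, and every parameter value is made up.

```python
# Stand-in sketch only: this is NOT the Jablonska-Capasso-Morale model, just a
# simple mean-reverting (Ornstein-Uhlenbeck) price simulation illustrating the
# general workflow of reproducing a price series with a stochastic model.
# All parameter values are made up.
import numpy as np

def simulate_ou(p0, mean, speed, sigma, n_steps, dt=1.0, seed=0):
    """Euler-Maruyama simulation of dP = speed*(mean - P)*dt + sigma*dW."""
    rng = np.random.default_rng(seed)
    prices = np.empty(n_steps)
    prices[0] = p0
    for t in range(1, n_steps):
        noise = rng.normal(0.0, np.sqrt(dt))
        prices[t] = prices[t - 1] + speed * (mean - prices[t - 1]) * dt + sigma * noise
    return prices

sim = simulate_ou(p0=95.0, mean=100.0, speed=0.05, sigma=2.0, n_steps=365)
print(f"simulated price range: {sim.min():.1f} - {sim.max():.1f}")
```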

Relevance:

20.00%

Publisher:

Abstract:

Finnish Defence Studies is published under the auspices of the National Defence College, and the contributions reflect the fields of research and teaching of the College. Finnish Defence Studies will occasionally feature documentation on Finnish Security Policy. Views expressed are those of the authors and do not necessarily imply endorsement by the National Defence College.

Relevance:

20.00%

Publisher:

Abstract:

Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of their applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, there exists an almighty lie detection method that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: what is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?

o Accuracy, i.e. the probability of detecting deception successfully
o Ease of use, i.e. how easy it is to apply the method correctly
o Time required to apply the method reliably
o No need for special equipment
o Unobtrusiveness of the method

In order to answer the main research question, the following supporting research questions were answered first: what kinds of interviewing and interrogation techniques exist and how could they be used in the intelligence interview context; what kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof; and what kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to get first-hand experience of the applicability of different lie detection and veracity assessment methods.

The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.

Since most current lie detection studies are concentrated around a scenario where roughly half of the assessed people are totally truthful and the other half are liars who present a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for modern ones that are still under development.
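
To make the aggregation step of such a multi-criteria comparison concrete, the sketch below combines criterion weights with per-method scores into an overall applicability ranking by a weighted sum. All weights and scores are hypothetical placeholders, not the thesis's actual judgments or results.

```python
# Minimal sketch of the multi-criteria aggregation step: combine criterion
# weights with per-method scores into an overall applicability ranking.
# All weights and scores below are hypothetical, not the thesis's judgments.
criteria = ["accuracy", "ease_of_use", "time", "no_equipment", "unobtrusive"]
weights = [0.35, 0.20, 0.15, 0.15, 0.15]          # assumed weights, sum to 1

scores = {   # method -> score per criterion on a 0-1 scale (hypothetical)
    "Features of Discourse":           [0.6, 0.8, 0.8, 1.0, 0.9],
    "Nonverbal Communication":         [0.5, 0.8, 0.9, 1.0, 0.9],
    "Criteria Based Content Analysis": [0.7, 0.4, 0.3, 1.0, 0.7],
    "Polygraph":                       [0.7, 0.3, 0.2, 0.0, 0.1],
}

overall = {m: sum(w * s for w, s in zip(weights, vals)) for m, vals in scores.items()}
for method, score in sorted(overall.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {method}")
```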

Relevance:

20.00%

Publisher:

Abstract:

This master’s thesis investigates, as a case study, best practices for establishing a Business Intelligence Competency Center (BICC). The work is done for LähiTapiola, which faces challenges in governing the BI area as development is scattered across different units and companies. The system environment is also diverse. The BICC aims at better visibility into business needs and, on the other hand, at more effective use of information in management and in operational-level work. A further goal is to lower costs by unifying system environments and BI tools as well as operating models. The work includes a literature review and interviews with experts in three companies. Based on the study, it can be concluded that business BI needs should be enabled on several levels, from basic reporting to ad-hoc reporting and advanced analytics, by taking these into account in operating models and system architecture. When establishing a BICC, responding to business needs takes priority.