29 results for Computational Intelligence System
Abstract:
This tactics research focuses on computational methods of computer-assisted simulation that can be used in tactical-level war games. The main outputs of the work are computational models for tactical-level combat simulators that enable probability-based analysis and can be used for comparative analysis in platoon-to-brigade-level scenarios. The computational models focus on the effects of fire. The models concern the probability of a damaging hit, on the basis of which the effect on a unit is modeled with state machines and Markov chains. The results of these are further transferred to an event tree analysis of the probability of operational success. The smallest computational unit is modeled at the platoon or squad level, so that computation time in brigade-level war game studies remains short enough while the results remain sufficiently accurate for Finnish terrain. Platoon personnel and weapon system strengths are given as distributions rather than single numbers. The numerical integration of the simulation can use weapon-system-specific predictor-corrector parameters, which makes it possible to model battlefield phenomena shorter than the time step. The weapon models are based on earlier studies and field experiments, some of which are part of this doctoral research. The programmability and usability of the computational models as part of a simulation tool have been demonstrated with the "Sandis" combat simulation software, programmed by a research group led by the author and developed and used at the Finnish Defence Forces Technical Research Centre (Puolustusvoimien Teknillinen Tutkimuslaitos). Sandis includes a map user interface and computational models that simulate the course of battle. A user or user group makes the tactical decisions and feeds them into the simulation through the map interface; the simulation outputs the loss distribution of each platoon-level game unit, the mean losses caused by each weapon system to each target, ammunition consumption, radio connections and their status, and the evacuation status of the wounded from platoon level up to the evacuation hospital. The key results (contributions) of the research are 1) a new computational model for brigade-level war game scenarios whose smallest unit is the platoon or squad; 2) determination of a unit's breaking point from its losses and from the number of soldiers tied up in evacuating the wounded; 3) the applicability of probability-based risk analysis to comparative studies; 4) experimentally tested models of the effects of fire; and 5) working numerical integration solutions. The work is limited to a computational model producing platoon-level probability distributions for land combat, a field medical care model, and an indirect fire model with its integration methods, as well as to the applicability of their results. Air-to-ground and sea-to-ground weapon effects can be examined, but not air or naval combat. The models, usage, and software engineering of the Sandis software applying these methods are being developed further. Significant topics for further modeling research include urban combat, tank duels, the effect of terrain on artillery fire, and the estimation of materiel consumption.
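The abstract does not spell out the transition model, but a minimal sketch of the kind of Markov-chain attrition step it describes could look like the following (Python; the per-soldier binomial hit model and all parameter values are illustrative assumptions, not Sandis internals):

```python
from math import comb
import numpy as np

def attrition_step(strength_dist, p_hit):
    """One time step of a Markov-chain attrition model.

    strength_dist[k] is the probability that the unit has k effective
    soldiers; each soldier independently suffers a damaging hit with
    probability p_hit during the step, so losses are binomial.
    """
    n_max = len(strength_dist) - 1
    new_dist = np.zeros_like(strength_dist)
    for k in range(n_max + 1):            # current strength
        if strength_dist[k] == 0.0:
            continue
        for losses in range(k + 1):       # transition k -> k - losses
            p = comb(k, losses) * p_hit**losses * (1 - p_hit)**(k - losses)
            new_dist[k - losses] += strength_dist[k] * p
    return new_dist

# A platoon of exactly 30 soldiers, evolved for 10 time steps.
dist = np.zeros(31)
dist[30] = 1.0
for _ in range(10):
    dist = attrition_step(dist, p_hit=0.02)
print("expected strength:", sum(k * p for k, p in enumerate(dist)))
```

The output is a full probability distribution over remaining strength, matching the abstract's point that unit strengths are carried as distributions rather than single numbers.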
Abstract:
This study combines several projects related to flows in vessels with complex shapes representing different chemical apparatuses. Three major cases were studied. The first is a two-phase plate reactor with a complex structure of intersecting microchannels engraved on one plate, which is covered by another plain plate. The second case is a tubular microreactor, consisting of two subcases. The first subcase is a multi-channel two-component commercial micromixer (slit interdigital) used to mix two liquid reagents before they enter the reactor. The second subcase is a micro-tube, in which the distribution of the heat generated by the reaction was studied. The third case is a conventionally packed column. Here, however, flow, reactions, and mass transfer were not modeled. Instead, the research focused on how to describe mathematically the realistic geometry of the column packing, which is rather random and cannot be created using conventional computer-aided design or engineering (CAD/CAE) methods. Several modeling approaches were used to describe the performance of the processes in the considered vessels. Computational fluid dynamics (CFD) was used to describe the details of the flow in the plate microreactor and the micromixer. A space-averaged mass transfer model based on Fick's law was used to describe the exchange of species through the gas-liquid interface in the microreactor. This model utilized data, namely the values of the interfacial area, obtained from the corresponding CFD model. A common heat transfer model was used to find the heat distribution in the micro-tube. To generate the column packing, an additional multibody dynamic model was implemented. An auxiliary simulation was carried out to determine the position and orientation of every packing element in the column. These data were then exported into a CAD system to generate the desired geometry, which could further be used for CFD simulations. The results demonstrated that the CFD model of the microreactor predicted the flow pattern well and agreed with experiments. The mass transfer model allowed the mass transfer coefficient to be estimated. Modeling of the second case showed that the flow in the micromixer and the heat transfer in the tube could be excluded from the larger model describing the chemical kinetics in the reactor. Results of the third case demonstrated that the auxiliary simulation could successfully generate complex random packing, not only for the column but also for other similar cases.
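As a rough illustration of the space-averaged Fick's-law mass transfer model described above, the sketch below relaxes the liquid-phase concentration toward its interfacial equilibrium value at a rate set by the mass transfer coefficient and the interfacial area density (which the thesis obtains from the CFD model); all numeric values here are illustrative assumptions, not values from the thesis:

```python
def integrate_mass_transfer(c0, c_star, k_l, a, dt, steps):
    """Explicit Euler integration of the space-averaged model
    dC/dt = k_L * a * (C* - C), where a is the interfacial area
    per unit volume supplied by a separate CFD computation."""
    c = c0
    for _ in range(steps):
        c += dt * k_l * a * (c_star - c)
    return c

c_out = integrate_mass_transfer(
    c0=0.0,        # initial dissolved concentration, mol/m^3
    c_star=1.2,    # equilibrium concentration at the interface, mol/m^3
    k_l=1e-4,      # liquid-side mass transfer coefficient, m/s
    a=5e3,         # interfacial area per unit volume (from CFD), 1/m
    dt=1e-3,       # time step, s
    steps=2000,
)
print(f"concentration after 2 s: {c_out:.3f} mol/m^3")
```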
Abstract:
The purpose of this study is to explore the possibilities of utilizing business intelligence (BI) systems in management control (MC). The topic is explored through four research questions. Firstly, what kind of management control systems (MCS) use or could use the data and information enabled by the BI system? Secondly, how is or how could the BI system be utilized? Thirdly, has the BI system enabled new forms of control or changed old ones? The fourth and final research question is whether the BI system supports some forms of control that the literature has not considered, or whether the BI system is not used for some forms of control for which the literature suggests it should be used. The study is conducted as an extensive case study; three different organizations were interviewed. For the theoretical basis of the study, central theories in the field of management control are introduced. The term business intelligence is discussed in detail and the mechanisms for governance of business intelligence are presented. A literature analysis of the uses of BI for management control is introduced. The theoretical part of the study ends with the construction of a framework for business intelligence in management control. In the empirical part of the study, the case organizations, their BI systems, and the ways they utilize these systems for management control are presented. The main findings of the study are that BI systems can be utilized in the fields suggested in the literature, namely in planning, cybernetic, reward, boundary, and interactive control. The systems are used both as data or information feeders and directly as tools. Using BI systems has also enabled entirely new forms of control in the studied organizations, most significantly in the area of interactive control. They have also changed the old control systems by making information more readily available to the whole organization. No evidence was found of the BI systems being used for forms of control that the literature had not suggested. The systems were mostly used for cybernetic control and interactive control, whereas support for other types of control was not as prevalent. The main contribution of the study to the existing literature is the insight provided into how BI systems, both theoretically and empirically, are used for management control. The framework for business intelligence in management control presented in the study can also be utilized in further studies on the subject.
Abstract:
This study examines supply chain management problems in practice and the reduction of perceived demand information distortion (the bullwhip effect) with an interfirm information system, delivered as a cloud service to a company operating in the telecommunications industry. The purpose is to shed light, in practice, on whether the interfirm information system has an impact on the performance of the supply chain and in particular on the reduction of the bullwhip effect. In addition, a holistic case study of the global telecommunications company's supply chain and the challenges it faces is presented, and some measures to improve the situation are proposed. The theoretical part covers the supply chain and its management as well as the improvement of its efficiency, introducing the relevant theories and previous research. In addition, the study presents performance metrics for bullwhip effect detection and tracking. The theoretical part ends by presenting the cloud-based business intelligence framework used as the background of this study. The research strategy is a qualitative case study, supported by quantitative data collected from the telecommunication sector company's databases. Qualitative data were gathered mainly through two open interviews and e-mail exchanges during the development project. In addition, other materials from the company were collected during the project, and the company's website was also used as a source. The data were collected into a specific case study database in order to increase reliability. The results show that the bullwhip effect can be reduced with the interfirm information system and with the use of CPFR and S&OP models, in particular by combining them into integrated business planning. According to this study, however, the interfirm information system does not solve all supply chain and effectiveness problems, because the company's processes and human activities also have a major impact.
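One standard metric for the bullwhip effect detection the abstract mentions is the ratio of the variance of orders placed upstream to the variance of customer demand; a minimal sketch (with made-up series, not case-company data) is:

```python
import statistics

def bullwhip_ratio(orders, demand):
    """Common bullwhip-effect metric: variance of orders placed
    upstream divided by variance of customer demand. A ratio > 1
    indicates demand distortion amplifying along the supply chain."""
    return statistics.variance(orders) / statistics.variance(demand)

# Illustrative weekly series: steady demand, oscillating orders.
demand = [100, 104, 98, 101, 99, 103, 97, 102]
orders = [100, 115, 80, 112, 90, 118, 78, 110]
print(f"bullwhip ratio: {bullwhip_ratio(orders, demand):.2f}")  # > 1
```

Tracking this ratio over time is one way an interfirm information system can make the distortion, and its reduction, visible.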
Abstract:
As technology has developed, the amount of data produced and collected from the business environment has increased. Over 80% of that data includes some sort of reference to geographical location. Individuals have used such information through Google Maps or various GPS devices; in business, however, it has remained largely unexploited. This thesis studies the use and utilization of geographically referenced data in capital-intensive business. It first provides theoretical insight into how data and data-driven management enable and enhance business and how geographically referenced data in particular adds value to the company; it then examines empirical case evidence of how geographical information can truly be exploited in capital-intensive business and what the value-adding elements of geographical information are. The study contains semi-structured interviews that are used to survey the attitudes and beliefs of an organization towards geographic information and to discover fields of application for a geographic information system within the case company. Additionally, geographical data is tested in order to illustrate how the data could be used in practice. Finally, the thesis provides an understanding of the elements of which the added value of geographical information in business consists and of how such data can be utilized in the case company and in capital-intensive business in general.
Abstract:
The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides some guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second one studies reaction systems, a modeling framework whose rationale is built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic in the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture the ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathways recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs. We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modeling framework studied in this thesis. The rationale of reaction systems is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modeling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady states, periodicity, etc., to perform model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties ranges from P to NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
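For readers unfamiliar with reaction systems, a minimal sketch of their standard semantics (due to Ehrenfeucht and Rozenberg) is given below; the toy reactions are illustrative and unrelated to the heat shock response model of the thesis:

```python
def rs_result(state, reactions):
    """One step of a reaction system: a reaction (R, I, P) is enabled
    on state W iff all its reactants R are in W (facilitation) and none
    of its inhibitors I are (inhibition); the next state is the union
    of the products of all enabled reactions."""
    result = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            result |= products
    return result

# Toy system: a produces b unless c is present; b produces a and c.
reactions = [
    ({"a"}, {"c"}, {"b"}),
    ({"b"}, set(), {"a", "c"}),
]
state = {"a"}
for _ in range(4):
    state = rs_result(state, reactions)
    print(sorted(state))  # ['b'], ['a', 'c'], [], []
```

Note the cause-effect (rather than concentration-based) character of the model: entities not produced by an enabled reaction simply vanish at the next step, which is why properties such as mass conservation, steady states, and periodicity become non-trivial model-checking questions.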
Abstract:
Business intelligence (BI) is an information process that includes the activities and applications used to transform business data into valuable business information. Today's enterprises are collecting detailed data, which has increased the available business data drastically. In order to meet changing customer needs and gain competitive advantage, businesses try to leverage this information. However, IT departments are struggling to meet the increased number of reporting needs. Therefore, a recent shift in the BI market has been towards empowering business users with self-service BI capabilities. The purpose of this study was to understand how self-service BI could help businesses meet increased reporting demands. The research problem was approached with an empirical single case study. Qualitative data was gathered with a semi-structured, theme-based interview. The study found that the case company's BI system was mostly used for group performance reporting. Ad-hoc and business user-driven information needs were mostly fulfilled with self-made tools and manual work. It was felt that necessary business information was not easily available. The concept of self-service BI was perceived to be helpful in meeting such reporting needs. However, it was found that the available data is often too complex for an average user to fully understand. The respondents felt that in order for self-service BI to work, the data has to be simplified and described in a way that the average business user can understand. The results of the study suggest that BI programs struggle to meet all the information needs of today's businesses. The concept of self-service BI tries to resolve this problem by allowing users easy self-service access to necessary business information. However, business data is often complex and hard to understand. Self-service BI has to overcome this challenge before it can reach its potential benefits.
Abstract:
In the new age of information technology, big data has grown to be a prominent phenomenon. As information technology evolves, organizations have begun to adopt big data and apply it as a tool throughout their decision-making processes. Research on big data has grown in recent years, however mainly from a technical stance, and there is a void in business-related cases. This thesis fills the gap in the research by addressing big data challenges and failure cases. The Technology-Organization-Environment framework was applied to carry out a literature review of trends in Business Intelligence and Knowledge Management information system failures. A review of extant literature was carried out using a collection of leading information system journals. Academic papers and articles on big data, Business Intelligence, Decision Support Systems, and Knowledge Management systems were studied from both failure and success aspects in order to build a model of big data failure. The contribution of the information system failure literature is then delineated, as it provides the principal dynamics behind the Technology-Organization-Environment framework. The gathered literature was then categorized and a failure model was developed from the identified critical failure points. The failure constructs were further categorized, defined, and tabulated into a contextual diagram. The developed model and table are designed to act as a comprehensive starting point and as general guidance for academics, CIOs, or other system stakeholders to facilitate decision-making in the big data adoption process by measuring the effect of technological, organizational, and environmental variables on perceived benefits, dissatisfaction, and discontinued use.
Abstract:
A review of intelligent machines shows that the demand for new ways of helping people perceive the real world is growing every year. This thesis provides information about the design and implementation of machine vision for a mobile assembly robot. The work has been done as part of an LUT project in the Laboratory of Intelligent Machines. The aim of this work is to create a working vision system, and both qualitative and quantitative research were done to complete this task. In the first part, the author presents the theoretical background of topics such as the working principles of digital cameras, the basics of wireless transmission, the creation of a live stream, and the methods used for pattern recognition. Formulas, dependencies, and previous research related to the topic are shown. In the second part, the equipment used for the project is described, including the brands, models, capabilities, and the requirements for implementation. In addition, the author describes the LabVIEW software, its add-ons, and OpenCV, which are used in the project. Results can be found in a further section of the thesis; they are mainly represented by screenshots from the cameras and the working station and by photos of the system. The key result of this thesis is the vision system created for the needs of the mobile assembly robot, and examples show graphically what was done. Future research in this field includes optimization of the pattern recognition algorithm, which will reduce the response time for recognizing objects. The system presented by the author can also be used for further activities that include the use of artificial intelligence.
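The abstract does not name the specific pattern recognition algorithm, but a minimal OpenCV sketch of one common approach, normalized template matching on a camera frame, might look like this (file names and the confidence threshold are illustrative assumptions):

```python
import cv2

# Load a camera frame and the template of the part to locate.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("part_template.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation: score of the template at every position.
scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

if best_score > 0.8:  # empirically chosen confidence threshold
    h, w = template.shape
    print(f"object found at {best_loc}, score {best_score:.2f}")
    cv2.rectangle(frame, best_loc,
                  (best_loc[0] + w, best_loc[1] + h), 255, 2)
    cv2.imwrite("match.png", frame)
else:
    print("object not found")
```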
Abstract:
This master's thesis investigates, as a case study, best practices for establishing a Business Intelligence Competency Center (BICC). The work is done for LähiTapiola, which faces challenges in governing the BI area as development is scattered across different units and companies. The system environment is also diverse. The BICC aims at better visibility into business needs and, on the other hand, at more effective use of information in management and in operational-level work. A further goal is to reduce costs by unifying system environments and BI tools as well as operating models. The work includes a literature review and interviews with experts in three companies. Based on the study, it can be concluded that business BI needs are best enabled at several levels, from basic reporting to ad-hoc reporting and advanced analytics, by taking these into account in operating models and system architecture. When establishing a BICC, responding to business needs comes first.
Abstract:
Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still surround many-core systems. These systems need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip based processors the network might get congested and the cores might work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to 45X speedup compared to a serial fault simulation approach. Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed on the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from the nominal values, which necessitates efficient calibration techniques before the sensor values are used. In addition, cores in modern many-core systems support dynamic voltage and frequency scaling. Thermal sensors located on the cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose software-based auto-calibration approach is also proposed for calibrating thermal sensors over a range of voltages.
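The thesis targets the SCC's own runtime environment, but the core idea of dynamic load balancing, letting idle workers pull the next fault from a shared queue instead of fixing the partition up front, can be sketched with standard Python multiprocessing (the worker body and the 48-process mapping to SCC cores are illustrative assumptions, not the SCC API):

```python
from multiprocessing import Pool

def simulate_fault(fault_id):
    """Placeholder for simulating one injected fault; in a real fault
    simulator each fault's simulation time varies widely, which is
    what makes dynamic (rather than static) balancing pay off."""
    # ... run the circuit simulation with this fault injected ...
    return fault_id, fault_id % 7 == 0  # dummy "detected" flag

if __name__ == "__main__":
    faults = range(10_000)
    # chunksize=1 lets each idle worker keep pulling the next fault
    # from a shared queue, so fast cores are never left waiting for
    # slow ones; 48 processes mirror the 48 SCC cores.
    with Pool(processes=48) as pool:
        detected = [fid for fid, hit
                    in pool.imap_unordered(simulate_fault, faults, chunksize=1)
                    if hit]
    print(f"{len(detected)} faults detected")
```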
Abstract:
Food safety has always been a social issue that draws great public attention. With the rapid development of wireless communication technologies and intelligent devices, more and more Internet of Things (IoT) systems are applied in the field of food safety tracking. However, the connection between things and the information system is usually established by pre-storing information about things in RFID tags, which is inapplicable to on-field food safety detection. Therefore, considering that pesticide residue is one of the severe threats to food safety, a new portable, high-sensitivity, low-power, on-field organophosphorus (OP) compound detection system is proposed in this thesis to realize on-field food safety detection. The system is based on an optical detection method using a customized photo-detection sensor. A Micro Controller Unit (MCU) and a Bluetooth Low Energy (BLE) module are used to quantize and transmit the detection results. An Android application (APP) is also developed for the system to process and display the detection results as well as to control the detection process. In addition, a quartz sample container and a black system box were designed and made for the system demonstration. Several optimizations were made in the wireless communication, circuit layout, Android APP, and industrial design to achieve mobility, low power, and intelligence.
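The thesis does not publish its sensor calibration, but optical concentration measurements of this kind are commonly based on the Beer-Lambert law; a hypothetical sketch of converting photo-detector ADC readings into an OP concentration estimate might be:

```python
import math

def absorbance(i_sample, i_reference):
    """Beer-Lambert absorbance from photo-detector readings:
    A = log10(I_ref / I_sample)."""
    return math.log10(i_reference / i_sample)

def op_concentration(a, epsilon, path_length):
    """Invert A = epsilon * c * l for the concentration c."""
    return a / (epsilon * path_length)

# Illustrative raw 12-bit ADC counts and calibration constants; these
# numbers are assumptions, not values measured by the thesis system.
a = absorbance(i_sample=2310, i_reference=4095)
c = op_concentration(a, epsilon=1.8e4, path_length=1.0)  # L/(mol*cm), cm
print(f"estimated OP concentration: {c:.2e} mol/L")
```

In the described architecture this conversion could run either on the MCU before BLE transmission or in the Android APP after it; the sketch leaves that split open.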