887 results for Scalable Intelligence
Abstract:
We describe infinitely scalable pipeline machines with perfect parallelism, in the sense that every instruction of an inline program is executed, on successive data, on every clock tick. Programs with shared data effectively execute in less than a clock tick. We show that pipeline machines are faster than single- or multi-core von Neumann machines for sufficiently many runs of a sufficiently time-consuming program. Our pipeline machines exploit the totality of transreal arithmetic and the known waiting time of statically compiled programs to deliver the interesting property that they need no hardware or software exception handling.
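As an illustration of why the totality of transreal arithmetic removes divide-by-zero exceptions, here is a minimal sketch, not taken from the paper: a total transreal division in which every operand pair, including a zero divisor, yields a defined result. The encoding of nullity as a string is my own assumption for illustration.

```python
INF = float("inf")
PHI = "nullity"  # transreal nullity, the result of 0/0 (hypothetical encoding)

def t_div(a, b):
    """Total transreal division: defined for every pair of operands,
    so no exception handling is ever required."""
    if a == PHI or b == PHI:
        return PHI           # nullity absorbs every operation
    if b != 0:
        return a / b         # ordinary real division
    if a > 0:
        return INF           # x/0 = +inf for x > 0
    if a < 0:
        return -INF          # x/0 = -inf for x < 0
    return PHI               # 0/0 = nullity
```

Because `t_div` is total, a statically compiled pipeline can execute it on every clock tick without any exception path.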
Abstract:
Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault-tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and perfect synchronization in achieving global consensus.
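The logarithmic scaling reported above is a general property of gossip dissemination. The following toy simulation is not the paper's algorithms, only an illustration of that property: a rumour (say, a failure notification) spreads by each informed process contacting one random peer per cycle, and the number of informed processes can at most double per cycle.

```python
import random

def gossip_rounds(n, seed=0):
    """Simulate push gossip among n processes: count the cycles until
    every process has heard a rumour started at process 0."""
    rng = random.Random(seed)
    informed = {0}
    rounds = 0
    while len(informed) < n:
        rounds += 1
        for _ in list(informed):
            informed.add(rng.randrange(n))  # each informed process gossips to a random peer
    return rounds
```

Since the informed set at most doubles per cycle, at least log2(n) cycles are needed, and with high probability only O(log n) cycles are used, which is the scaling behaviour the abstract describes.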
Abstract:
Despite revived notions of a ‘cultural divide’ between East and West, Edward Said's ‘Orientalism’ has received little attention from scholars of intelligence and diplomacy. This article brings to light for the first time a number of recently declassified documents of a different nature from the usual assessments produced by Anglo-American analytic bodies: those focussed primarily on the issue of ‘national character’. Using and critiquing Said's thesis of Western ‘Orientalism’, it reveals some critical and enduring conceptualizations articulated by the diplomatic and intelligence community about Arab culture, such as the role of Islam, rhetoric, political motivation and notions of ‘honour’. Such a critical approach demonstrates how diplomatic and intelligence history can also be a history of culture, ideas and institutional mentalité.
Abstract:
One of the top ten most influential data mining algorithms, k-means, is known for being simple and scalable. However, it is sensitive to initialization of prototypes and requires that the number of clusters be specified in advance. This paper shows that evolutionary techniques conceived to guide the application of k-means can be more computationally efficient than systematic (i.e., repetitive) approaches that try to get around the above-mentioned drawbacks by repeatedly running the algorithm from different configurations for the number of clusters and initial positions of prototypes. To do so, a modified version of a (k-means based) fast evolutionary algorithm for clustering is employed. Theoretical complexity analyses for the systematic and evolutionary algorithms under interest are provided. Computational experiments and statistical analyses of the results are presented for artificial and text mining data sets. (C) 2010 Elsevier B.V. All rights reserved.
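A minimal sketch of the "systematic" baseline the abstract refers to, under two simplifications of my own (1-D data, SSE as the quality measure): k-means is rerun for every candidate number of clusters and every restart, which is exactly the repeated cost that the evolutionary approach is claimed to reduce.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's k-means on 1-D data; returns (centroids, SSE)."""
    rng = random.Random(seed)
    cents = rng.sample(points, k)          # random initial prototypes
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in points:                   # assign each point to its nearest prototype
            clusters[min(range(k), key=lambda i: (x - cents[i]) ** 2)].append(x)
        # recompute prototypes; keep the old one if a cluster emptied
        cents = [sum(c) / len(c) if c else cents[i] for i, c in enumerate(clusters)]
    sse = sum(min((x - c) ** 2 for c in cents) for x in points)
    return cents, sse

def systematic_search(points, k_max, restarts=5):
    """Systematic (repetitive) baseline: rerun k-means for every candidate
    k and every restart, keeping the best SSE found per k."""
    return {k: min(kmeans(points, k, seed=s)[1] for s in range(restarts))
            for k in range(1, k_max + 1)}
```

The nested loop over k and restarts is the cost that grows quickly; the evolutionary variant instead evolves a population of candidate clusterings, amortizing the repeated runs.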
Abstract:
Document engineering is the computer science discipline that investigates systems for documents in any form and in all media. As with the relationship between software engineering and software, document engineering is concerned with principles, tools and processes that improve our ability to create, manage, and maintain documents (http://www.documentengineering.org). The ACM Symposium on Document Engineering is an annual meeting of researchers active in document engineering; it is sponsored by ACM through the ACM SIGWEB Special Interest Group. In this editorial, we first point to work carried out in the context of document engineering that is directly related to multimedia tools and applications. We conclude with a summary of the papers presented in this special issue.
Abstract:
Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, since security services have to cooperate and their configurations have to be consistent with each other, so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports a comfortable definition of an abstract high-level security policy and provides an automated derivation of the desired configuration files. It is an extension of policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units which more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with the derivation of configuration files. We describe the MBM and AS approaches, outline the tool functions and exemplify their applications and the results obtained. Copyright (C) 2010 John Wiley & Sons, Ltd.
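A hypothetical miniature of policy refinement in the MBM style (all names, hosts and config syntaxes below are invented for illustration and are not taken from MoBaSeC): one abstract rule is refined, via an object model of the managed system, into two tool-specific configuration fragments that stay mutually consistent because both are derived from the same model.

```python
# Invented object model of the managed system: service name -> details
# needed for refinement (addresses and ports are illustrative).
SYSTEM_MODEL = {
    "mail": {"host": "10.0.0.5", "port": 25},
    "web":  {"host": "10.0.0.7", "port": 443},
}

def refine(policy):
    """Derive heterogeneous config files from one abstract policy rule.
    The firewall rule and the service binding cannot drift apart,
    because both are generated from the same model entry."""
    svc = SYSTEM_MODEL[policy["allow"]]
    firewall = f"permit tcp any host {svc['host']} eq {svc['port']}"
    service = f"listen = {svc['host']}:{svc['port']}"
    return {"firewall.cfg": firewall, "service.cfg": service}

configs = refine({"allow": "web"})  # abstract rule: "allow access to web"
```

Changing the model entry for `web` would automatically change both derived files, which is the consistency argument behind automated policy refinement.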
Abstract:
Over the last decade the problem of surface inspection has received great attention from the scientific community, since quality control and the maintenance of products are key concerns in several industrial applications. Railway associations spend a great deal of money checking the railway infrastructure, a field in which periodic surface inspection can help the operator prevent critical situations; the maintenance and monitoring of this infrastructure is therefore an important concern for railway associations. Surface inspection also matters to the railroad authority, which must investigate track components, identify problems and find ways to solve them. In the railway industry, problems are usually found in the sleepers, overhead lines, fasteners, rail heads, switches and crossings, and the ballast section. In this thesis work I have reviewed research papers on AI techniques combined with NDT techniques, which collect data from the test object without causing any damage. The reviewed works demonstrate that, by adopting AI-based systems, almost all of these problems can be solved, and that such systems are reliable and efficient for diagnosing problems in this transportation domain. I have also reviewed solutions, products and white papers from several companies working with AI techniques. AI-based techniques such as machine vision, stereo vision, laser-based techniques and neural networks are used in most cases to solve problems otherwise handled by railway engineers. These problems are addressed through the NDT approach, a very broad, interdisciplinary field that plays a critical role in assuring that structural components and systems perform their function in a reliable and cost-effective fashion.
The NDT approach ensures the uniformity, quality and serviceability of materials without damaging the material being tested. Its methods include visual and optical testing, radiography, magnetic particle testing, ultrasonic testing, penetrant testing, electromechanical testing and acoustic emission testing. Inspection is performed periodically for the sake of better maintenance, and is carried out by railway engineers manually with the aid of AI-based techniques. The main idea of this thesis is to demonstrate how the problems of this transportation area can be reduced, based on the work done by different researchers and companies; I also provide ideas and comments on those works and propose better inspection methods where they are needed. The scope of this thesis is the automatic interpretation of data from NDT, with the goal of detecting flaws accurately and efficiently. AI techniques such as neural networks, machine vision, knowledge-based systems and fuzzy logic have been applied to a wide spectrum of problems in this area. A further aim is to provide insight into possible research methods for railway sleeper, fastener, ballast and overhead inspection by automatic interpretation of data. In this thesis I discuss problems that arise in railway sleepers, fasteners, and overhead and ballasted track; I review research papers related to these areas and demonstrate how the proposed systems work and what results they achieve, and I highlight the advantages of AI techniques over the manual systems that existed previously. This work thus aims to summarize the findings of a large number of research papers deploying artificial intelligence (AI) techniques for the automatic interpretation of data from non-destructive testing (NDT).
Problems in the rail transport domain are the main focus of this work, which as a whole addresses the inspection of railway sleepers, fasteners, ballast and overhead lines.
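As a toy stand-in for the automatic interpretation of NDT data discussed above (this is not taken from any of the reviewed papers, only an illustration of the idea): flag measurements in a sensor trace that deviate strongly from the rest, as a rail-surface defect would in an otherwise uniform signal.

```python
import statistics

def flag_defects(readings, z_thresh=3.0):
    """Flag indices whose measurement deviates strongly from the rest --
    a toy stand-in for automatic flaw detection in NDT sensor data."""
    mu = statistics.mean(readings)
    sd = statistics.pstdev(readings)
    return [i for i, r in enumerate(readings)
            if sd > 0 and abs(r - mu) / sd > z_thresh]
```

Real systems in the reviewed literature replace this simple z-score rule with neural networks, machine vision or fuzzy logic, but the pipeline shape is the same: raw NDT readings in, flagged defect locations out.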
Abstract:
This thesis introduces an intelligent cross-platform architecture with a multi-agent system, in order to equip simulation models with agents that have intelligent behaviour, a reactive and pro-active nature, and rationality in decision making.
Abstract:
Within Business Intelligence, the concept of Self-Service Business Intelligence (Self-Service BI) has emerged. Self-Service BI comprises tools that enable end users to perform analyses and create reports without technical support. One of these tools is Microsoft PowerPivot. The Railway Department of the Swedish Transport Agency (Transportstyrelsen) needs a Self-Service BI tool, and we were commissioned by Sogeti to investigate whether PowerPivot is a suitable tool for the agency. The goal of the thesis has been to test the technical possibilities and limitations of PowerPivot and to determine whether PowerPivot is useful for Transportstyrelsen. To gain a deeper understanding of Self-Service BI, we surveyed the possibilities and limitations of Self-Service BI tools described in the literature and then compared these with our test results, which has been the purpose of the thesis. The tests show that Transportstyrelsen's Railway Department initially needs technical support in order to use PowerPivot, and that some of Transportstyrelsen's requirements cannot be met, which reduces the tool's usefulness for the agency. We further conclude that Self-Service BI is not always easy for end users to use without technical support. The results also show that a BI infrastructure is required in order to easily create reports of good quality and the highest possible correctness.
Abstract:
This report examines which properties are important to consider when choosing a reporting tool in the field of Business Intelligence. The concept of BI is relatively broad and refers to the skills, technologies, applications and methods, of a systematic and scientific nature, that an organization uses to better understand its operations, its surroundings and the wider world. Reporting tools thus form a small part of a larger chain of processes supporting decision making. Landstinget Dalarna (the Dalarna county council) has engaged Sogeti, our client for this degree project, to implement BI in its organization, and our study originates in the fact that Landstinget Dalarna today has a great need for different types of reports in many parts of the organization. The reporting need has proved extensive, and to ease the workload of the system developers who create reports, the idea has emerged that it might be a good solution to let users within Landstinget Dalarna create some of their own reports. The goal of this work is to give the system developers on the project guidelines about the properties of different reporting tools, so that they can more easily decide which is most suitable. The tools compared in this study are Report Builder 3.0, PowerPivot and Dashboard Designer 2010, all from Microsoft. Making this comparison requires a good basis for understanding which properties are relevant to focus on and whether some properties weigh more heavily than others. After interviewing system developers who work with BI, we were able to form a clearer picture of this area. The properties were compiled for use in our comparison of the reporting tools, and their importance is partly confirmed by the theory in the field.
The properties that prove most important in the choice are the existing platform in use, the tool's ability to create interactive reports, and the type of user the tool targets. Other properties also prove important to consider, but mainly depending on the requirements at hand. The practical comparison of the reporting tools shows that they partly overlap in functionality while being adapted to different types of users and platforms. Together they form pieces of Microsoft's BI puzzle, each contributing in its own way to covering whatever requirements arise from different needs and conditions. At the same time, the compared reporting tools share certain general properties, so that they are broadly capable, albeit in different ways, of producing similar reports.
Abstract:
Increasingly, time is what differentiates one company from another. To succeed, companies need the right information, at the right moment, for the right people. Data once considered important for a company's survival must now be available as information in order to be used. That is the role of Business Intelligence tools, whose purpose is to model data so as to obtain information, in a way that distinguishes a company's actions and lets it be more promising than its competitors. Business Intelligence is a process of collecting, analysing and distributing data to improve business decisions, bringing information to a much larger number of users within the corporation. Several kinds of tools are offered for this purpose. This work aims to compare such tools through a study of dimensional modelling techniques, which are fundamental to informational structure projects and to support for Data Warehouses, Data Marts, Data Mining and others, as well as of the market, their advantages and disadvantages, and the technological architecture these products use. Accordingly, the Business Intelligence tool suites of Microsoft Corporation and Oracle Corporation were selected, given their magnitude in the computing world.
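As an illustration of the dimensional modelling that the comparison is based on (the tables and values below are invented for illustration): a minimal star schema with one fact table referencing two dimension tables by surrogate key, aggregated along a dimension attribute, which is the core pattern behind Data Warehouses and Data Marts.

```python
# Invented dimension tables: surrogate key -> descriptive attributes.
dim_product = {1: {"name": "notebook", "category": "hardware"},
               2: {"name": "licence",  "category": "software"}}
dim_time    = {10: {"year": 2010, "quarter": "Q1"}}

# Invented fact table: each row holds foreign keys into the dimensions
# plus the numeric measure being analysed.
fact_sales = [
    {"product_key": 1, "time_key": 10, "amount": 3500.0},
    {"product_key": 2, "time_key": 10, "amount": 1200.0},
]

def sales_by_category(facts, products):
    """Aggregate the fact table's measure along one dimension attribute."""
    totals = {}
    for row in facts:
        cat = products[row["product_key"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + row["amount"]
    return totals
```

The BI suites compared in the work implement this same join-and-aggregate pattern at scale, typically over relational or OLAP storage rather than in-memory dictionaries.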