981 results for Intelligence framework
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated workshops on astronomical data analysis techniques.
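To make the MapReduce pattern behind the hypercube generation concrete, the following minimal sketch (plain Python, framework-agnostic) shows how per-record bin keys could be emitted in a map step and aggregated into counts in a reduce step. All names here (bin_index, map_record, reduce_counts, the toy fields and bin ranges) are illustrative assumptions and not part of the Gaia framework, which in addition hides the Hadoop-specific interfaces from the user.

# Minimal sketch of hypercube (multidimensional histogram) generation
# in the MapReduce style described above. Names and ranges are hypothetical.
from collections import defaultdict
from typing import Dict, Iterable, Tuple

Record = Dict[str, float]   # e.g. one star: {"ra": ..., "parallax": ...}
Key = Tuple[int, ...]       # one hypercube bin, one index per dimension

def bin_index(value: float, lo: float, hi: float, nbins: int) -> int:
    """Clamp a value into one of nbins equal-width bins over [lo, hi)."""
    i = int((value - lo) / (hi - lo) * nbins)
    return min(max(i, 0), nbins - 1)

def map_record(rec: Record, dims) -> Tuple[Key, int]:
    """Map phase: emit a (bin key, 1) pair for each input record."""
    key = tuple(bin_index(rec[name], lo, hi, n) for name, lo, hi, n in dims)
    return key, 1

def reduce_counts(pairs: Iterable[Tuple[Key, int]]) -> Dict[Key, int]:
    """Reduce phase: sum the partial counts per bin key."""
    counts: Dict[Key, int] = defaultdict(int)
    for key, c in pairs:
        counts[key] += c
    return dict(counts)

# Toy example: a 2D hypercube over right ascension and parallax.
dims = [("ra", 0.0, 360.0, 36), ("parallax", 0.0, 10.0, 20)]
stars = [{"ra": 12.3, "parallax": 2.1}, {"ra": 181.0, "parallax": 2.2}]
print(reduce_counts(map_record(s, dims) for s in stars))  # {(1, 4): 1, (18, 4): 1}

In an actual Hadoop deployment the map and reduce phases would run distributed over the input splits; the sketch only illustrates the key/value pattern, while the choice of row- versus column-oriented storage and compression determines how cheaply the mapper can read just the dimensions it needs.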
Abstract:
Digital art interfaces present cognitive paradigms that deal with the recognition of symbols and representations through interaction. This paper presents an approximation of the bodily experience in that particular scenario and a new proposal that aims to contribute further ideas and criteria to the analysis of the learning process of a participant discovering an interactive space or interface. To that end, I propose a first approach in which I metaphorically extrapolate the stages of developmental psychology stated by Jean Piaget to the interface design domain.
Abstract:
[Les métamorphoses (Middle French). 1556]
Abstract:
Over the last 20 years, Business Intelligence has gone from being a whim of a few CIOs who could afford to set aside budget for it to a reality already present in many large companies, and an urgent need for those that have not yet implemented such a system. The first part of this document, entitled “Estudio del Business Intelligence” (“Study of Business Intelligence”), introduces the concept from the ground up. After explaining the key theoretical concepts needed to understand this type of solution, it discusses the technological components, ranging from the processes for extracting and integrating information to how information should be structured to facilitate analysis. Finally, it reviews the different types of applications on the market as well as the most current trends in this field. The second part of the document focuses on the implementation of a dashboard (Cuadro de Mandos) for analyzing a company's sales; it identifies the different phases of the project and goes into detail on the requirements identified. Lastly, it presents the dashboard developed with Xcelsius technology, which allows the result to be exported to Flash and viewed in any web browser.
Abstract:
Iowa Natural Resources Council's Water Plan 78 is the main report of the State Water Plan Framework Study. The report is the culmination of the three-year cooperative effort between this agency and the Department of Environmental Quality, the Conservation Commission, the Department of Soil Conservation, and the Geological Survey, funded by the Legislature in 1978.
Abstract:
The progress in prenatal medicine raises complex questions with respect to the physician-patient relationship. The physician needs to reconcile medical aspects, ethical principles, and judicial norms. Already during the first trimester, the physician has to put into practice the schedule combining, for each individual pregnancy, physical, laboratory, and other appropriate exams. Physicians are under the obligation to inform in a clear and comprehensive way without creating unnecessary anxiety for their patients. Legal requirements include informed consent, respect for the patient's right to self-determination, and compliance with the Swiss federal law on genetic testing, especially its articles on prenatal screening and diagnosis. This article discusses the complexity of obstetrical practice when it comes to delivering adequate information within the scope of ethical and legal requirements in Switzerland.
Abstract:
This contribution presents the first stage of a project to assist the transition from a traditional to a blended program in higher nursing education. We describe the goals and context of this project, present the evaluation framework, discuss some early results, and then discuss the usefulness of the first version of the evaluation framework.
Abstract:
At present, the maturity of the P2P research field has pushed forward new problems related to security. For this reason, security is starting to become one of the key issues when evaluating a P2P system, and it is important to provide security mechanisms for P2P systems. The JXTAOverlay project makes an effort to use JXTA technology to provide a generic set of functions that developers can use to deploy P2P applications. However, although its design focused on issues such as scalability or overall performance, it did not take security into account. This work proposes a security framework specifically adapted to the idiosyncrasies of JXTAOverlay.
Abstract:
In the wake of the success of Peer-to-Peer (P2P) networking, security has arisen as one of its main concerns, becoming a key issue when evaluating a P2P system. Unfortunately, some systems' design focus targeted issues such as scalability or overall performance, but not security. As a result, security mechanisms must be provided at a later stage, after the system has already been designed and partially (or even fully) implemented, which may prove a cumbersome proposition. This work exposes how a security layer was provided under such circumstances for a specific Java-based P2P framework: JXTA-Overlay.
Abstract:
The software development industry is constantly evolving. The rise of agile methodologies in the late 1990s, and new development tools and technologies, require growing attention from everybody working within this industry. Organizations have, however, had a mixture of various processes and different process languages, since a standard software development process language has not been available. A promising process meta-model called Software & Systems Process Engineering Meta-Model (SPEM) 2.0 has been released recently. It is applied by tools such as Eclipse Process Framework Composer, which is designed for implementing and maintaining processes and method content. Its aim is to support a broad variety of project types and development styles. This thesis presents the concepts of software processes, models, traditional and agile approaches, method engineering, and software process improvement. Some of the most well-known methodologies (RUP, OpenUP, OpenMethod, XP and Scrum) are also introduced, with a comparison provided between them. The main focus is on the Eclipse Process Framework and SPEM 2.0: their capabilities, usage, and modeling. As a proof of concept, I present a case study of modeling OpenMethod with EPF Composer and SPEM 2.0. The results show that the new meta-model and tool have made it possible to easily manage method content, publish versions with customized content, and connect project tools (such as MS Project) with the process content. Software process modeling also acts as a process improvement activity.
Abstract:
False identity documents represent a serious threat through their production and use in organized crime and by terrorist organizations. The present-day fight against this criminal problem and threats to national security does not appropriately address the organized nature of this criminal activity, treating each fraudulent document on its own during investigation and the judicial process, which causes linkage blindness and restrains the analysis capacity. Given the drawbacks of this case-by-case approach, this article proposes an original model in which false identity documents are used to inform a systematic forensic intelligence process. The process aims to detect links, patterns, and tendencies among false identity documents in order to support strategic and tactical decision making, thus sustaining a proactive intelligence-led approach to fighting identity document fraud and the associated organized criminality. This article formalizes both the model and the process, using practical applications to illustrate its powerful capabilities. This model has a general application and can be transposed to other fields of forensic science facing similar difficulties.
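As a purely hypothetical illustration of the link-detection step that such a forensic intelligence process relies on (the article formalizes the model at a conceptual level and does not prescribe an algorithm), the sketch below compares simplified document profiles and flags pairs whose weighted feature agreement exceeds a threshold; the feature names, weights, and threshold are invented for the example and are not taken from the article.

# Hypothetical sketch: flag candidate links between false identity documents
# by comparing simplified feature profiles. Features, weights, and threshold
# are illustrative assumptions only.
from itertools import combinations
from typing import Dict, List, Tuple

Profile = Dict[str, str]   # feature name -> observed value (simplified, categorical)

def similarity(a: Profile, b: Profile, weights: Dict[str, float]) -> float:
    """Weighted share of features on which two document profiles agree."""
    total = sum(weights.values())
    score = sum(w for f, w in weights.items() if a.get(f) == b.get(f))
    return score / total if total else 0.0

def detect_links(docs: Dict[str, Profile], weights: Dict[str, float],
                 threshold: float = 0.75) -> List[Tuple[str, str, float]]:
    """Return candidate links: document pairs scoring at or above the threshold."""
    links = []
    for (id_a, pa), (id_b, pb) in combinations(docs.items(), 2):
        s = similarity(pa, pb, weights)
        if s >= threshold:
            links.append((id_a, id_b, s))
    return sorted(links, key=lambda x: -x[2])

# Toy example with made-up seizures and features.
weights = {"printing_process": 2.0, "font": 1.0, "layout_variant": 1.0, "security_feature_flaw": 2.0}
docs = {
    "doc_001": {"printing_process": "inkjet", "font": "Arial", "layout_variant": "v2", "security_feature_flaw": "missing_UV"},
    "doc_002": {"printing_process": "inkjet", "font": "Arial", "layout_variant": "v2", "security_feature_flaw": "missing_UV"},
    "doc_003": {"printing_process": "laser", "font": "Times", "layout_variant": "v1", "security_feature_flaw": "none"},
}
print(detect_links(docs, weights))  # [('doc_001', 'doc_002', 1.0)]

Flagged pairs of this kind would then feed the analysis of patterns and tendencies that supports the strategic and tactical decision making described in the abstract.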