982 results for Data handling


Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Education - IBRC

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Electrical Engineering - FEIS

Relevance:

60.00%

Publisher:

Abstract:

Nowadays, data handling and data analysis in High Energy Physics require a vast amount of computational power and storage. In particular, the Worldwide LHC Computing Grid (LCG), an infrastructure and pool of services developed and deployed by a large community of physicists and computer scientists, proved to be a game changer for the efficiency of data analysis during Run-I at the LHC, playing a crucial role in the Higgs boson discovery. More recently, the Cloud computing paradigm has emerged and reached a considerable level of adoption by many scientific organizations and beyond. Clouds give access to large computing resources that are not owned by the user and are shared among many scientific communities. Given the challenging requirements of LHC physics in Run-II and beyond, the LHC computing community is interested in exploring Clouds to see whether they can provide a complementary approach, or even a valid alternative, to the existing Grid-based technological solutions. Within the LHC community, several experiments have been adopting Cloud approaches, and the experience of the CMS experiment is of particular relevance to this thesis. The LHC Run-II has just started, and Cloud-based solutions are already in production for CMS, while other approaches to Cloud usage, such as the work done in this thesis, are still at the prototype level. This effort is of paramount importance to equip CMS with the capability to elastically and flexibly access and utilize the computing resources needed to face the challenges of Run-III and Run-IV. The main purpose of this thesis is to present forefront Cloud approaches that allow the CMS experiment to extend onto on-demand resources, dynamically allocated as needed. Moreover, direct access to Cloud resources is presented as a use case suited to the needs of the CMS experiment.
Chapter 1 presents an overview of High Energy Physics at the LHC and of the CMS experience in Run-I, as well as the preparation for Run-II. Chapter 2 describes the current CMS Computing Model, and Chapter 3 surveys the Cloud approaches pursued and used within the CMS Collaboration. Chapters 4 and 5 discuss the original work done in this thesis to develop and test working prototypes of elastic extensions of CMS computing resources onto Clouds, and of HEP Computing "as a Service". The impact of this work on a benchmark CMS physics use case is also demonstrated.

Relevance:

60.00%

Publisher:

Abstract:

Several practical obstacles in data handling and evaluation complicate the use of quantitative localized magnetic resonance spectroscopy (qMRS) in clinical routine MR examinations. To overcome these obstacles, a clinically feasible MR pulse sequence protocol based on standard available MR pulse sequences for qMRS has been implemented, along with newly added functionalities in the free software package jMRUI-v5.0, to make qMRS attractive for clinical routine. This enables (a) easy and fast DICOM data transfer between the MR console and the qMRS computer, (b) visualization of combined MR spectroscopy and imaging, (c) creation and network transfer of spectroscopy reports in DICOM format, (d) integration of advanced water reference models for absolute quantification, and (e) setup of databases containing normal metabolite concentrations of healthy subjects. To demonstrate the work-flow of qMRS using these implementations, databases of normal metabolite concentrations in different regions of brain tissue were created using spectroscopic data acquired in 55 normal subjects (age range 6-61 years) on 1.5T and 3T MR systems, and the work-flow is illustrated in one clinical case of a typical brain tumor (primitive neuroectodermal tumor). The MR pulse sequence protocol and newly implemented software functionalities facilitate the incorporation of qMRS, with reference to normal metabolite concentration data, in daily clinical routine. Magn Reson Med, 2013. © 2012 Wiley Periodicals, Inc.
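The absolute quantification via a water reference mentioned in point (d) can be sketched as a simple signal-ratio calculation. This is a generic textbook-style formula, not code from jMRUI; the tissue water concentration and relaxation factors below are illustrative assumptions.

```python
def metabolite_concentration(s_met, s_water, n_protons_met,
                             water_conc_mm=35880.0,  # assumed tissue water concentration (mM)
                             r_met=1.0, r_water=1.0):
    """Estimate a metabolite concentration (mM) from the ratio of the
    fitted metabolite signal to an unsuppressed water reference signal.

    s_met, s_water : fitted signal amplitudes
    n_protons_met  : protons contributing to the metabolite resonance
    r_met, r_water : relaxation correction factors (assumed 1.0 here,
                     i.e. fully relaxed signals)
    """
    n_protons_water = 2  # H2O contributes two equivalent protons
    return (s_met / s_water) * (n_protons_water / n_protons_met) \
        * (r_water / r_met) * water_conc_mm

# Hypothetical NAA CH3 singlet (3 protons) against a water reference
naa_mm = metabolite_concentration(s_met=0.05, s_water=100.0, n_protons_met=3)
```

In practice the relaxation factors would be derived from the acquisition parameters and the advanced water reference models the paper describes; here they are left as plain multipliers.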

Relevance:

60.00%

Publisher:

Abstract:

As education providers increasingly integrate digital learning media into their education processes, the need for the systematic management of learning materials and learning arrangements becomes clearer. Digital repositories, often called Learning Object Repositories (LOR), promise to provide an answer to this challenge. This article is composed of two parts. In this part, we derive technological and pedagogical requirements for LORs from a concretization of information quality criteria for e-learning technology. We review the evolution of learning object repositories and discuss their core features in the context of pedagogical requirements, information quality demands, and e-learning technology standards. We conclude with an outlook in Part 2, which presents concrete technical solutions, in particular networked repository architectures.

Relevance:

60.00%

Publisher:

Abstract:

In Part 1 of this article we discussed the need for information quality and the systematic management of learning materials and learning arrangements. Digital repositories, often called Learning Object Repositories (LOR), were introduced as a promising answer to this challenge. We also derived technological and pedagogical requirements for LORs from a concretization of information quality criteria for e-learning technology. This second part presents technical solutions that particularly address the demands of open education movements, which aspire to a global reuse and sharing culture. From this viewpoint, we develop core requirements for scalable network architectures for educational content management. We then present edu-sharing, an advanced example of a network of homogeneous repositories for learning resources, and discuss related technology. We conclude with an outlook in terms of emerging developments towards open and networked system architectures in e-learning.

Relevance:

60.00%

Publisher:

Abstract:

Support systems for programming education are widespread, but common standards for the exchange of general (learning) content and tests do not meet the special requirements of programming assignments, such as handling complex submissions consisting of multiple files or combining different (automatic) grading procedures. As a result, assignments cannot be exchanged between systems, which would be desirable given the high effort involved in developing good assignments. This paper presents an extensible XML-based format for the exchange of programming assignments that is already being used prototypically by several systems. The specification of the exchange format is available online [PFMA].
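To illustrate the idea of such an exchange document, the sketch below assembles one with Python's standard `xml.etree.ElementTree`. All element and attribute names here are hypothetical illustrations, not the element names of the actual specification in [PFMA].

```python
import xml.etree.ElementTree as ET

def build_task(task_id, title, lang, files, graders):
    """Assemble a hypothetical programming-task exchange document:
    a task with a title, expected submission files, and a list of
    grading procedures to combine."""
    task = ET.Element("task", id=task_id, lang=lang)
    ET.SubElement(task, "title").text = title
    fs = ET.SubElement(task, "submission-files")
    for name in files:
        ET.SubElement(fs, "file", name=name)
    gs = ET.SubElement(task, "graders")
    for grader in graders:
        ET.SubElement(gs, "grader", type=grader)
    return task

# A multi-file submission combined with several automatic graders,
# the two requirements the paper says generic standards fail to cover.
task = build_task("t1", "Implement a stack", "java",
                  ["Stack.java", "StackTest.java"],
                  ["compile", "unit-test", "style-check"])
xml_text = ET.tostring(task, encoding="unicode")
```

An importing system would parse such a document and map the `grader` entries onto its own assessment back-ends, which is where an extensible format pays off.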

Relevance:

60.00%

Publisher:

Abstract:

This paper describes the spatial data handling procedures used to create a vector database of the Connecticut shoreline from Coastal Survey Maps. The appendix contains detailed information on how the procedures were implemented using Geographic Transformer Software 5 and ArcGIS 8.3. The project was a joint project of the Connecticut Department of Environmental Protection and the University of Connecticut Center for Geographic Information and Analysis.

Relevance:

60.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to describe the tools and strategies that were employed by C/W MARS to successfully develop and implement the Digital Treasures digital repository.
Design/methodology/approach – This paper outlines the planning and subsequent technical issues that arise when implementing a digitization project on the scale of a large, multi-type automated library network. Workflow solutions addressed include synchronous online metadata record submissions from multiple library sources and the delivery of collection-level use statistics to participating library administrators. The importance of standards-based descriptive metadata and the role of project collaboration are also discussed.
Findings – From the time of its initial planning, the Digital Treasures repository was fully implemented in six months. The discernible, statistically quantified online discovery and access of actual digital objects greatly assisted libraries that were unsure of their own staffing costs/benefits in deciding to join the repository.
Originality/value – This case study may serve as an example of initial planning, workflow, and final implementation strategies for new repositories in both the general and the library consortium environment.
Keywords – Digital repositories, Library networks, Data management
Paper type – Case study

Relevance:

60.00%

Publisher:

Abstract:

This paper examines the contribution of job matching to wage growth in the U.S. and Germany using data drawn from the Panel Study of Income Dynamics and the German Socio-Economic Panel from 1984 through 1992. Using a symmetrical set of variables and data handling procedures, real wage growth is found to be higher in the U.S. than in Germany during this period. Also, using two different estimators, job matches are found to enhance wage growth in the U.S. and retard it in Germany. The relationship of general skills to employment in each country appears responsible for this result.

Relevance:

60.00%

Publisher:

Abstract:

This article is a detailed review of published scientific studies dealing with the determination of rare earth elements (REEs) in the soil-plant system. The studies have been carried out mainly in European and Asian countries. It should be noted that research in Latin American countries is very scarce; however, there is growing interest in analyzing the contribution of these elements to soil and plants, owing to the application of fertilizers that contain high doses of these elements in their composition. Various sampling, experimental, and analytical techniques have been employed for the determination of REEs. Nevertheless, the handling of the data is considered to have been statistically incorrect. This article addresses: (i) general aspects of the REEs; (ii) an analysis of the available literature, 37 articles in total, to identify the most widely used sampling and analysis methodologies, pointing out aspects that are still considered deficient; (iii) two examples of the application of statistical techniques (confidence interval of the mean and significance tests based on Fisher's F and Student's t) using data reported in two articles. For the first article analyzed, the results showed that: (a) no statistical methodology was applied to evaluate data quality; (b) when statistics were applied, systematic differences were found between the data determined in the laboratory and the certified values. For the second article analyzed, significance tests showed that there are significant differences between the means of Ce and Eu (the two elements taken as examples) in plants from one site to another.
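The kind of significance testing the review recommends, comparing laboratory means against certified values with Student's t, can be sketched with Python's standard library. The Ce determinations and the certified value below are made-up numbers for illustration, not data from the articles reviewed.

```python
import math
import statistics

def one_sample_t(data, certified):
    """t statistic for H0: the sample mean equals the certified value."""
    n = len(data)
    mean = statistics.mean(data)
    sd = statistics.stdev(data)  # sample standard deviation (n - 1)
    return (mean - certified) / (sd / math.sqrt(n))

# Hypothetical laboratory Ce determinations (mg/kg) vs. a certified
# reference value of 50.0 mg/kg
ce_lab = [52.1, 51.8, 52.4, 51.9, 52.2]
t = one_sample_t(ce_lab, 50.0)

# |t| above the two-sided critical value t(0.975, df=4) ≈ 2.776
# indicates a systematic difference between lab and certified values
systematic_bias = abs(t) > 2.776
```

This is exactly the check the review found missing in the first article it analyzed: without it, a consistent offset between laboratory results and certified values goes undetected.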

Relevance:

60.00%

Publisher:

Abstract:

Simulation of satellite subsystem behaviour is extremely important at early design stages. Subsystems are normally simulated in two ways: in isolation, and as part of a more complex simulation that takes into account inputs from other subsystems (concurrent design). In the present work, a simple concurrent simulation of the power subsystem of a microsatellite, UPMSat-2, is described. The aim of the work is to obtain the performance profile of the system (battery charge level, power consumption by the payloads, power supply from the solar panels, etc.). Different situations, such as critically low or high battery levels, the effects of high charging currents due to the low temperature of the solar panels after eclipse, and DoD margins, were analysed, and different safety strategies were studied using the developed tool (simulator) to fulfil the mission requirements. Failure cases were also analysed in order to study the robustness of the system. The simulator has been programmed taking into account the power consumption performance (average and maximum consumption per orbit/day) of each part of the subsystem (SELEX GALILEO SPVS modular generators built with Azur Space solar cells, a SAFT VES16 6P4S Li-ion battery, SSBV magnetometers, and the TECNOBIT and DATSI/UPM On-Board Data Handling -OBDH-). The developed tool is thus intended to be a modular simulator, with the option of using other components that implement some standard data format.
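A toy version of such a power-budget simulation is sketched below: it steps a crude battery model through one orbit with an eclipse phase and records the state of charge, from which quantities like the worst-case DoD margin can be read off. All capacities, power levels, and the orbit/eclipse split are invented numbers, not UPMSat-2 figures.

```python
def simulate_orbit(battery_wh, capacity_wh, steps,
                   eclipse_fraction=0.35,  # assumed fraction of orbit in shadow
                   solar_w=60.0,           # assumed panel output in sunlight (W)
                   load_w=40.0,            # assumed constant platform load (W)
                   dt_h=0.025):            # time step (hours)
    """Step a crude battery energy balance through one orbit.

    Returns the state-of-charge history (0..1) and its minimum, from
    which the worst depth-of-discharge over the orbit follows.
    """
    history = []
    eclipse_steps = int(steps * eclipse_fraction)
    for i in range(steps):
        in_eclipse = i < eclipse_steps  # eclipse at the start of the orbit
        generated = 0.0 if in_eclipse else solar_w
        battery_wh += (generated - load_w) * dt_h
        battery_wh = min(max(battery_wh, 0.0), capacity_wh)  # clamp to physical range
        history.append(battery_wh / capacity_wh)
    return history, min(history)

soc, worst = simulate_orbit(battery_wh=80.0, capacity_wh=100.0, steps=40)
```

A concurrent version would replace the constant `load_w` with time profiles fed in from the payload and OBDH models, and the constant `solar_w` with an attitude- and temperature-dependent panel model, which is where the modularity described in the abstract comes in.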

Relevance:

60.00%

Publisher:

Abstract:

Online geographic information systems provide the means to extract a subset of desired spatial information from a larger remote repository. Data retrieved representing real-world geographic phenomena are then manipulated to suit the specific needs of an end-user. Often this extraction requires the derivation of representations of objects specific to a particular resolution or scale from a single original stored version. Currently, standard spatial data handling techniques cannot support the multi-resolution representation of such features in a database. In this paper, a methodology to store and retrieve versions of spatial objects at different resolutions with respect to scale, using standard database primitives and SQL, is presented. The technique involves heavy fragmentation of spatial features, which allows dynamic simplification into scale-specific object representations customised to the display resolution of the end-user's device. Experimental results comparing the new approach to traditional R-Tree indexing and external object simplification reveal that the former performs notably better for mobile and WWW applications, where client-side resources are limited and retrieved data loads are kept relatively small.
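The fragmentation idea can be sketched with plain SQL via Python's built-in `sqlite3`: each vertex fragment is tagged with the coarsest scale at which it should still appear, so a single range query reconstructs a scale-specific representation. The schema and column names are invented for illustration and are not the paper's actual design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE fragment (
    feature_id INTEGER,   -- owning spatial feature
    seq        INTEGER,   -- vertex order within the feature
    x REAL, y REAL,
    min_scale  INTEGER    -- coarsest scale that retains this vertex
)""")

# A polyline whose detail vertices only matter at fine scales
rows = [(1, 0, 0.0, 0.0, 1), (1, 1, 1.0, 0.2, 4),
        (1, 2, 2.0, 0.1, 8), (1, 3, 3.0, 0.0, 1)]
conn.executemany("INSERT INTO fragment VALUES (?, ?, ?, ?, ?)", rows)

def vertices_at_scale(feature_id, scale):
    """Return the simplified geometry appropriate for a display scale."""
    cur = conn.execute(
        "SELECT x, y FROM fragment "
        "WHERE feature_id = ? AND min_scale <= ? ORDER BY seq",
        (feature_id, scale))
    return cur.fetchall()

coarse = vertices_at_scale(1, 1)  # only the structurally essential vertices
fine = vertices_at_scale(1, 8)    # full detail
```

The appeal for thin clients is that simplification reduces to an indexable predicate (`min_scale <= ?`), so the server ships only the vertices the device can actually display.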

Relevance:

60.00%

Publisher:

Abstract:

Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain we have a requirement to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. In previous studies of the application of spatial rules (often called 'business logic') in GIS, emphasis has been placed on the representation of topological constraints and data integrity checks. There is very little GIS literature that describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML) based representation. In our application we have an additional requirement: the geometric rules are often changed and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses the feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1. © 2006 Springer-Verlag Berlin Heidelberg.
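The give-way-line regulation quoted above (1100-3000 mm from the crossing edge) can be expressed as a parameterized constraint whose bounds are data rather than code, which is what makes per-country rule sets swappable. The sketch below is our own illustration, not the paper's GML-based model.

```python
from dataclasses import dataclass

@dataclass
class OffsetRule:
    """Parameterized geometric rule: a feature must lie within
    [min_mm, max_mm] of a reference feature."""
    name: str
    min_mm: float
    max_mm: float

    def check(self, offset_mm):
        """True if the measured offset satisfies the rule."""
        return self.min_mm <= offset_mm <= self.max_mm

# UK give-way line rule, bounds taken from the text; a different
# country's rule set would simply load different parameters.
uk_give_way = OffsetRule("give-way line offset", 1100.0, 3000.0)

ok = uk_give_way.check(2500.0)   # compliant placement
bad = uk_give_way.check(900.0)   # too close to the crossing edge
```

In the paper's setting the measured offset would be computed from the stored geometries of the two features, and the rule parameters would live in the GML representation rather than in source code.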