70 results for common platform
Abstract:
Technological limitations and power constraints are resulting in high-performance parallel computing architectures based on large numbers of high-core-count processors. Commercially available processors now offer 8 and 16 cores, and experimental platforms, such as the many-core Intel Single-chip Cloud Computer (SCC), provide much higher core counts. These trends present new challenges to HPC applications, including programming complexity and the need for extreme energy efficiency. In this work, we first investigate the power behavior of scientific PGAS application kernels on the SCC platform and explore opportunities and challenges for power management within the PGAS framework. Results obtained via empirical evaluation of Unified Parallel C (UPC) applications on the SCC platform under different constraints show that, for specific operations, the potential for energy savings in PGAS is large, and that power/performance trade-offs can be effectively managed using a cross-layer approach. We investigate cross-layer power management using PGAS language extensions and runtime mechanisms that manipulate power/performance trade-offs. Specifically, we present the design, implementation and evaluation of such a middleware for application-aware cross-layer power management of UPC applications on the SCC platform. Finally, based on our observations, we provide a set of recommendations and insights that can be used to support similar power management for PGAS applications on other many-core platforms.
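As a schematic illustration of the cross-layer idea sketched in this abstract, the fragment below lets an application bracket a program phase with a hint that a mock runtime maps to a frequency level. The hint names, frequency table and API are invented for illustration only; the actual middleware exposes UPC-level extensions and drives the SCC's real voltage/frequency controls.

# Schematic sketch of application-aware, cross-layer power management.
# All names and values below are illustrative assumptions, not the paper's API.
from contextlib import contextmanager

FREQ_TABLE_MHZ = {"compute_bound": 800, "communication_bound": 533, "idle": 400}

class MockPowerRuntime:
    """Stands in for the runtime layer that would program SCC frequency domains."""
    def __init__(self):
        self.freq = max(FREQ_TABLE_MHZ.values())

    def set_frequency(self, mhz):
        self.freq = mhz
        print(f"[runtime] frequency -> {mhz} MHz")

runtime = MockPowerRuntime()

@contextmanager
def power_hint(phase_kind):
    """Application-level hint bracketing a program phase (the 'language extension' layer)."""
    previous = runtime.freq
    runtime.set_frequency(FREQ_TABLE_MHZ[phase_kind])
    try:
        yield
    finally:
        runtime.set_frequency(previous)

# A communication-heavy phase (e.g. bulk remote-memory traffic) can often run at a
# lower frequency with little performance loss but noticeable energy savings.
with power_hint("communication_bound"):
    pass  # communication-bound work would go here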
Abstract:
The EVS4CSCL project starts in the context of a Computer Supported Collaborative Learning (CSCL) environment. Previous UOC projects created a generic CSCL platform (CLPL) to facilitate the development of CSCL applications. A discussion forum (DF) was the first application developed over this framework. The DF differed from other products on the marketplace because of its focus on the learning process. It covered the specification and elaboration phases of the discussion learning process, but lacked support for the consensus phase. In a learning environment, consensus is not something to be achieved but something to be tested. Such tests are commonly carried out with Electronic Voting System (EVS) tools, but a consensus test is not an assessment test: we do not evaluate students by their answers but by their discussion activity. Our educational EVS can be used as a discussion catalyst, proposing a discussion about the results of an initial query, or after a discussion period to show how the discussion changed the students' minds (consensus). It can also be used by the teacher as a quick way to identify where students need reinforcement. This is important in a distance-learning environment, where there is no direct contact between teacher and student and it is difficult to detect learning gaps. In an educational environment, assessment is a must; the EVS provides direct assessment through peer usefulness ratings and teacher marks on every query created, and indirect assessment from statistics on user activity.
Abstract:
This working paper aims to assess the Common Strategy on the Mediterranean, taking into account its possible articulation as a coherent instrument of European Foreign Policy. The study seeks to answer several questions related to this instrument. The Common Strategy on the Mediterranean is an excellent case study and a potential source of many questions about the external action of the European Union. Specifically, this study addresses two main questions. Firstly, what are the main reasons behind the adoption of this instrument of European Foreign Policy? In other words, what was, and is, the rationale for this Common Strategy? Secondly, what is the real impact of the Common Strategy, and what are its real achievements?
Abstract:
The future of elections appears to lie in electronic voting systems, due to their advantages over traditional voting. Nowadays, there are several different paradigms to ensure the security and reliability of e-voting. This document is part of a wider project that presents an e-voting platform based on elliptic curve cryptography. It uses a hybrid combination of two of the main e-voting paradigms, mixnets and homomorphic protocols, to guarantee privacy and security in the counting phase. This document focuses on the description of the system and on the mathematics and programming needed to solve its homomorphic part. Later chapters compare a simple mixing system with our proposed system.
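As a rough, self-contained illustration of the homomorphic counting idea mentioned above, the sketch below tallies encrypted yes/no votes with exponential ElGamal over a tiny prime-field subgroup: ciphertexts are multiplied so that the product decrypts to the sum of the votes. The group parameters, key handling and vote encoding are toy assumptions for demonstration; the actual platform works over an elliptic-curve group with realistic parameters.

# Toy additively homomorphic tally with exponential ElGamal (illustration only).
# Parameters are deliberately tiny; a real system would use an elliptic-curve
# group or a large prime-order subgroup.
import random

P = 23          # safe prime (23 = 2*11 + 1)
Q = 11          # order of the subgroup of quadratic residues mod P
G = 4           # generator of that order-11 subgroup

def keygen():
    x = random.randrange(1, Q)              # private key
    return x, pow(G, x, P)                  # (private, public)

def encrypt(pk, vote):                      # vote is 0 or 1
    r = random.randrange(1, Q)
    return pow(G, r, P), (pow(G, vote, P) * pow(pk, r, P)) % P

def combine(c1, c2):                        # multiplying ciphertexts adds the votes
    return (c1[0] * c2[0]) % P, (c1[1] * c2[1]) % P

def decrypt_tally(sk, cipher, max_votes):
    a, b = cipher
    g_m = (b * pow(pow(a, sk, P), P - 2, P)) % P   # b / a^sk = G^tally
    for m in range(max_votes + 1):                 # tally is small: brute-force the discrete log
        if pow(G, m, P) == g_m:
            return m
    raise ValueError("tally out of range")

sk, pk = keygen()
ballots = [encrypt(pk, v) for v in (1, 0, 1, 1, 0)]
total = ballots[0]
for c in ballots[1:]:
    total = combine(total, c)
print(decrypt_tally(sk, total, len(ballots)))       # -> 3, without decrypting any single ballot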
Abstract:
This paper describes a failure alert system and a methodology for content reuse in a new instructional design system called InterMediActor (IMA). IMA provides an environment for instructional content design, production and reuse, and for student evaluation based on content specification through a hierarchical structure of competences. The student assessment process and the information-extraction process for content reuse are explained.
Abstract:
Understanding the molecular mechanisms responsible for the regulation of the transcriptome present in eukaryotic cells is one of the most challenging tasks in the postgenomic era. In this regard, alternative splicing (AS) is a key phenomenon contributing to the production of different mature transcripts from the same primary RNA sequence. As a plethora of different transcript forms is available in databases, a first step to uncover the biology that drives AS is to identify the different types of reflected splicing variation. In this work, we present a general definition of the AS event along with a notation system that involves the relative positions of the splice sites. This nomenclature univocally and dynamically assigns a specific "AS code" to every possible pattern of splicing variation. On the basis of this definition and the corresponding codes, we have developed a computational tool (AStalavista) that automatically characterizes the complete landscape of AS events in a given transcript annotation of a genome, thus providing a platform to investigate the transcriptome diversity across genes, chromosomes, and species. Our analysis reveals that a substantial part (in human, more than a quarter) of the observed splicing variations are ignored in common classification pipelines. We have used AStalavista to investigate and to compare the AS landscape of different reference annotation sets in human and in other metazoan species and found that proportions of AS events change substantially depending on the annotation protocol, species-specific attributes, and coding constraints acting on the transcripts. The AStalavista system therefore provides a general framework to conduct specific studies investigating the occurrence, impact, and regulation of AS.
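To make the notion of a relative-position-based "AS code" concrete, the following loose sketch numbers the splice sites that differ between two overlapping transcript variants and tags each as a donor or an acceptor. The function name, input representation and output format are simplifying assumptions that only mirror the spirit of the AStalavista nomenclature, not its exact notation.

# Loose illustration of deriving an event code from two transcript variants.
# Splice sites are (position, kind) pairs with kind in {"don", "acc"}.

def as_event_code(variant_a, variant_b):
    """Return a string code for the splicing variation between two variants.

    Sites common to both variants are dropped; the remaining (variable) sites
    are numbered 1..n by genomic position, and each variant is written as its
    numbered sites, with '^' for donors and '-' for acceptors (0 if it uses none).
    """
    common = variant_a & variant_b
    variable = sorted((variant_a | variant_b) - common)        # by position
    index = {site: i + 1 for i, site in enumerate(variable)}   # relative numbering
    mark = {"don": "^", "acc": "-"}

    def code(variant):
        sites = sorted(s for s in variant if s in index)
        return "".join(f"{index[s]}{mark[s[1]]}" for s in sites) or "0"

    return f"{code(variant_a)},{code(variant_b)}"

# Example: one internal acceptor/donor pair present only in variant A,
# i.e. a cassette-exon-like (exon skipping) pattern of variation.
A = {(100, "acc"), (200, "don")}
B = set()
print(as_event_code(A, B))   # -> "1-2^,0"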
Abstract:
The objective of PANACEA is to interconnect different advanced tools to build a Language Resource (LR) factory, a production line that automates the steps involved in the acquisition, production, updating and maintenance of the LRs that Machine Translation and other language technologies need.
Abstract:
The objective of PANACEA is to build a factory of LRs that automates the stages involved in the acquisition, production, updating and maintenance of LRs required by MT systems and by other applications based on language technologies, and that simplifies potential issues regarding intellectual property rights. This automation will cut down the cost, time and human effort significantly. These reductions in cost and time are the only way to guarantee the continuous supply of LRs that MT and other language technologies will be demanding in a multilingual Europe.
Abstract:
The objective of the PANACEA ICT-2007.2.2 EU project is to build a platform that automates the stages involved in the acquisition, production, updating and maintenance of the large language resources required by, among others, MT systems. The development of a Corpus Acquisition Component (CAC) for extracting monolingual and bilingual data from the web is one of the most innovative building blocks of PANACEA. The CAC, which is the first stage in the PANACEA pipeline for building Language Resources, adopts an efficient and distributed methodology to crawl for web documents with rich textual content in specific languages and predefined domains. The CAC includes modules that can acquire parallel data from sites with in-domain content available in more than one language. In order to extrinsically evaluate the CAC methodology, we have conducted several experiments that used crawled parallel corpora for the identification and extraction of parallel sentences using sentence alignment. The corpora were then successfully used for domain adaptation of Machine Translation systems.
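The kind of filtering such focused crawling performs when looking for rich, in-domain, in-language content can be illustrated with the minimal sketch below, which keeps a fetched page only if it passes a crude language cue and a domain-term threshold. The stopword list, domain terms, thresholds and function name are placeholder assumptions; the actual CAC relies on its own classifiers and a distributed crawling infrastructure.

# Minimal, illustrative page filter in the spirit of focused corpus acquisition.
# All lists and thresholds are placeholders, not PANACEA's actual components.
import re

TARGET_STOPWORDS = {"the", "and", "of", "to", "in", "is", "for", "on", "with"}   # crude English cue
DOMAIN_TERMS = {"environment", "emission", "climate", "pollution", "renewable"}  # e.g. an "ENV" domain

def keep_page(text, min_tokens=200, min_stopword_ratio=0.02, min_domain_hits=3):
    """Return True if the page looks like rich, in-language, in-domain text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    if len(tokens) < min_tokens:                   # require rich textual content
        return False
    stopword_ratio = sum(t in TARGET_STOPWORDS for t in tokens) / len(tokens)
    domain_hits = sum(t in DOMAIN_TERMS for t in tokens)
    return stopword_ratio >= min_stopword_ratio and domain_hits >= min_domain_hits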
Abstract:
This paper presents the platform developed in the PANACEA project, a distributed factory that automates the stages involved in the acquisition, production, updating and maintenance of Language Resources required by Machine Translation and other Language Technologies. We adopt a set of tools that have been successfully used in the bioinformatics field; they are adapted to the needs of our field and used to deploy web services, which can be combined to build more complex processing chains (workflows). This paper describes the platform and its different components (web services, registry, workflows, social network and interoperability). We demonstrate the scalability of the platform by carrying out a set of massive data experiments. Finally, a validation of the platform against a set of required criteria proves its usability for different types of users (non-technical users and providers).
Abstract:
We argue that in the development of the Western legal system, cognitive departures are the main determinant of the optimal degree of judicial rule-making. Judicial discretion, seen here as the main distinguishing feature between the two legal systems, is introduced in civil law jurisdictions to protect, rather than to limit, freedom of contract against potential judicial backlash. Such protection was unnecessary in common law countries, where free-market relations enjoyed safer judicial ground, mainly due to their relatively gradual evolution, their reliance on practitioners as judges, and the earlier development of institutional checks and balances that supported private property rights. In our framework, differences in costs and benefits associated with self-interest and lack of information require a cognitive failure to be active.
Abstract:
This paper studies the generation and transmission of international cycles in a multi-country model with production and consumption interdependencies. Two sources of disturbance are considered and three channels of propagation are compared. In the short run, the contemporaneous correlation of disturbances determines the main features of the transmission. In the medium run, production interdependencies account for the transmission of technology shocks, and consumption interdependencies account for the transmission of government shocks. Technology disturbances, which are mildly correlated across countries, are more successful than government expenditure disturbances in reproducing actual data. The model also accounts for the low cross-country consumption correlations observed in the data.
Abstract:
We study the price convergence of goods and services in the euro area in 2001-2002. To measure the degree of convergence, we compare the prices of around 220 items in 32 European cities. The width of the border is the price difference attributed to the fact that the two cities are in different countries. We find that the 2001 European borders are negative, which suggests that the markets were very integrated before the euro changeover. Moreover, we do not identify an integration effect attributable to the introduction of the euro. We then explore the determinants of the European borders. We find that different languages, wealth and population differences tend to split the markets. Historical inflation, though, tends to lead to price convergence.
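The border-width measurement described above can be illustrated with a stylized regression: absolute log price gaps between city pairs are regressed on a cross-border dummy (plus a distance control), and the dummy's coefficient is read as the width of the border. The city names, prices, coordinates and the exact specification below are invented for illustration and are not the paper's data or estimates.

# Stylized border-width regression on invented placeholder data.
import itertools, math
import numpy as np

# (city, country, price of one item, coordinate used for a toy distance)
cities = [("CityA", "ES", 1.10, 0.0), ("CityB", "ES", 1.05, 500.0),
          ("CityC", "FR", 1.30, 1000.0), ("CityD", "FR", 1.25, 800.0)]

X, y = [], []
for (c1, k1, p1, x1), (c2, k2, p2, x2) in itertools.combinations(cities, 2):
    y.append(abs(math.log(p1) - math.log(p2)))       # absolute log price gap
    border = 1.0 if k1 != k2 else 0.0                # cross-border dummy
    log_dist = math.log(abs(x1 - x2))                # toy distance control
    X.append([1.0, border, log_dist])

beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
print(f"estimated border width (log points): {beta[1]:.3f}")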
Abstract:
We propose a model, and solution methods, for locating a fixed number of multiple-server, congestible common service centers or congestible public facilities. Locations are chosen so as to minimize consumers' congestion (or queuing) and travel costs, considering that all the demand must be served. Customers choose the facilities to which they travel in order to receive service at minimum travel and congestion cost. As a proxy for this criterion, total travel and waiting costs are minimized. The travel cost is a general function of the origin and destination of the demand, while the congestion cost is a general function of the number of customers in queue at the facilities.
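A deliberately small sketch of the problem just described is given below: it enumerates every way of opening a fixed number of candidate sites, lets each customer pick the open facility with the lowest travel-plus-congestion cost, and keeps the cheapest configuration. The one-dimensional geometry, the linear congestion function and the greedy, order-dependent assignment are simplifying assumptions for illustration, not the solution methods proposed in the paper.

# Toy congested facility-location sketch (illustrative assumptions throughout).
from itertools import combinations

CUSTOMERS = [0.0, 1.0, 2.0, 6.0, 7.0, 8.0]   # customer positions on a line
SITES = [0.5, 4.0, 7.5]                       # candidate facility positions
P = 2                                         # number of facilities to open
CONGESTION = 0.8                              # extra cost per customer already queued

def travel(c, s):
    return abs(c - s)                         # travel cost: distance on the line

def total_cost(open_sites):
    """Total travel + congestion cost when customers greedily pick the cheapest facility."""
    load = {s: 0 for s in open_sites}
    cost = 0.0
    for c in CUSTOMERS:
        best = min(open_sites, key=lambda s: travel(c, s) + CONGESTION * load[s])
        cost += travel(c, best) + CONGESTION * load[best]
        load[best] += 1
    return cost

best_sites = min(combinations(SITES, P), key=total_cost)
print("open facilities at", best_sites, "with total cost", round(total_cost(best_sites), 2))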
Abstract:
The European Space Agency Soil Moisture and Ocean Salinity (SMOS) mission aims at obtaining global maps of soil moisture and sea surface salinity from space for large-scale and climatic studies. It uses an L-band (1400–1427 MHz) Microwave Interferometric Radiometer by Aperture Synthesis to measure the brightness temperature of the earth's surface at horizontal and vertical polarizations (h and v). These two parameters will be used together to retrieve the geophysical parameters. The retrieval of salinity is a complex process that requires the knowledge of other environmental information and an accurate processing of the radiometer measurements. Here, we present recent results obtained from several studies and field experiments that were part of the SMOS mission, and highlight the issues still to be solved.