949 results for Multi microprocessor applications
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from the data, providing the optimal mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide an efficient means of modelling local anomalies of the kind that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a potential limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs137 activity given the measurements taken in the region of Briansk following the Chernobyl accident.
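Conceptually, the multi-scale extension replaces the single kernel of standard SVR with a mixture of kernels acting at different spatial scales. Below is a minimal sketch of that idea using scikit-learn's SVR with a precomputed kernel; the bandwidths (gamma_large, gamma_short) and the mixing weight w are illustrative assumptions, whereas the paper learns the optimal mixture from the data.

```python
# Minimal sketch: a weighted sum of a large-scale and a short-scale RBF
# kernel in a single SVR. Parameter values are illustrative placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))            # sampling locations (x, y)
y = np.sin(X[:, 0]) + 0.2 * np.sin(8 * X[:, 1])  # large- plus short-scale signal

def multi_scale_kernel(A, B, gamma_large=0.1, gamma_short=5.0, w=0.7):
    """Weighted sum of a large-scale and a short-scale RBF kernel."""
    return w * rbf_kernel(A, B, gamma=gamma_large) + \
           (1 - w) * rbf_kernel(A, B, gamma=gamma_short)

model = SVR(kernel="precomputed", C=10.0, epsilon=0.05)
model.fit(multi_scale_kernel(X, X), y)

X_new = rng.uniform(0, 10, size=(5, 2))
pred = model.predict(multi_scale_kernel(X_new, X))  # rows: new points, cols: training points
```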
Abstract:
This document presents an integrated analysis of the performance of Catalonia, based on how energy consumption (measured at the societal level for Catalan society) is used within both the productive sectors of the economy and the household sector to generate added value and jobs and to guarantee a given material standard of living to the population. The trends found in Catalonia are compared with those of other European countries in order to contextualize Catalonia's performance with respect to societies that have followed different paths of economic development. In the first part of the document, the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach is used to provide this integrated analysis of Catalan society across different scales (starting from an analysis of the specific sectors of the Catalan economy as an Autonomous Community and scaling up to an intra-regional comparison with the European Union 14) and across different dimensions of energy consumption coupled with added-value generation. Within the scope of this study, we observe the trajectories of change in the metabolic pattern of Catalonia and the EU14 countries in the Paid Work sectors, namely the Agricultural Sector, the Productive Sector, and the Services and Government Sector, in comparison with the changes in the household sector. The flow intensities of exosomatic energy and of the added value generated in each specific sector are defined per hour of human activity, and thus characterized as the Exosomatic Metabolic Rate (MJ/hour) and the Economic Labour Productivity (€/hour) across multiple levels. In the second part of the document, a possible application of the MuSIASEM approach to land-use analysis (using a multi-level matrix of land-use categories) is explored.
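As a concrete illustration of the two flow intensities named above, the following sketch computes the Exosomatic Metabolic Rate and the Economic Labour Productivity per sector; the figures are made-up placeholders, not the Catalan data analysed in the document.

```python
# Minimal sketch of the MuSIASEM flow intensities: energy use and added
# value per hour of human activity in each sector. Numbers are placeholders.
sectors = {
    # sector: (energy use in MJ, added value in EUR, human activity in hours)
    "Agriculture": (4.0e9, 1.2e9, 6.0e7),
    "Productive":  (5.5e10, 4.8e10, 1.1e9),
    "Services":    (2.1e10, 9.6e10, 2.4e9),
}

for name, (energy_mj, value_eur, hours) in sectors.items():
    emr = energy_mj / hours   # Exosomatic Metabolic Rate (MJ/hour)
    elp = value_eur / hours   # Economic Labour Productivity (EUR/hour)
    print(f"{name}: EMR = {emr:.1f} MJ/h, ELP = {elp:.1f} EUR/h")
```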
Abstract:
Multi-core processors and hardware multi-threading make it possible to increase application performance. On the one hand, multi-core processors combine two or more processor cores on a single chip. On the other hand, hardware multi-threading is a technique that increases the utilization of processor resources. This work presents a performance analysis of the results obtained with two applications, dense matrix multiplication and the fast Fourier transform. Both applications were executed on multi-core architectures that exploit thread-level parallelism but with different multi-threading models. The results obtained show the importance of understanding and knowing how to analyze the effect of multi-core and multi-threading on performance.
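As a minimal sketch of the kind of thread-level parallelism exploited in the dense matrix multiplication benchmark, the following fragment splits the multiplication into row blocks across threads (NumPy releases the GIL inside BLAS calls); the worker count and block decomposition are illustrative, not the study's actual experimental setup.

```python
# Minimal sketch: blocked dense matrix multiplication across threads.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def matmul_rows(A, B, row_slice):
    """Multiply a horizontal block of A by B (NumPy releases the GIL here)."""
    return row_slice, A[row_slice] @ B

def parallel_matmul(A, B, workers=4):
    n = A.shape[0]
    step = (n + workers - 1) // workers
    C = np.empty((n, B.shape[1]))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(matmul_rows, A, B, slice(i, i + step))
                   for i in range(0, n, step)]
        for f in futures:
            rows, block = f.result()
            C[rows] = block
    return C

A = np.random.rand(1024, 1024)
B = np.random.rand(1024, 1024)
assert np.allclose(parallel_matmul(A, B), A @ B)
```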
Abstract:
Resource management in multi-core processors has gained importance with the evolution of applications and architectures, but this management is highly complex. For example, the same parallel application executed multiple times with the same input data on a single multi-core node can show highly variable execution times. Many hardware and software factors affect performance. The way hardware resources (compute and memory) are assigned to processes or threads, possibly belonging to several competing applications, is fundamental in determining this performance. The gap between allocating resources without knowing the application's true needs and allocating them with a specific goal keeps growing. The best way to perform this allocation is automatically, with minimal programmer intervention. It is important to note that the way an application runs on an architecture is not necessarily the most suitable one, and this situation can be improved through proper management of the available resources. Appropriate resource management can offer advantages both to the application developer and to the computing environment where the application runs, allowing a larger number of applications to execute with the same amount of resources. Moreover, this resource management would not require changes to the application or to its operating strategy. In order to propose resource-management policies, the behaviour of compute-intensive and memory-intensive applications was analyzed. This analysis was carried out by studying placement parameters across the cores, the need to use shared memory, the size of the input workload, the distribution of data within the processor, and the granularity of the work. Our goal is to identify how these parameters influence execution efficiency, to identify bottlenecks, and to propose possible improvements. A further proposal is to adapt the strategies already used by the scheduler in order to obtain better results.
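One of the placement parameters mentioned above, the assignment of processes to cores, can be illustrated with the Linux-specific os.sched_setaffinity call; the core sets below are illustrative and do not reproduce the thesis's actual policies.

```python
# Minimal sketch: pinning worker processes to disjoint cores (Linux only).
import os
from multiprocessing import Process

def compute_kernel(core_set):
    os.sched_setaffinity(0, core_set)          # pid 0 = the calling process
    s = sum(i * i for i in range(10_000_000))  # a compute-intensive stand-in
    print(f"cores {sorted(core_set)} -> done ({s % 97})")

if __name__ == "__main__":
    # Place two workers on disjoint cores so they do not compete for the
    # same core's functional units and private caches.
    procs = [Process(target=compute_kernel, args=({0},)),
             Process(target=compute_kernel, args=({1},))]
    for p in procs: p.start()
    for p in procs: p.join()
```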
Abstract:
This paper presents the Juste-Neige system for predicting snow height on the ski runs of a resort using multi-agent simulation software. Its aim is to facilitate snow-cover management in order to i) reduce the production cost of artificial snow and improve the profit margin of the companies managing the ski resorts; and ii) reduce water and energy consumption, and thus the environmental impact, by producing only the snow needed for a good skiing experience. The software provides maps with the predicted snow heights for up to 13 days, on which the areas most exposed to snow erosion are highlighted. The software proceeds in three steps: i) interpolation of snow height measurements with a neural network; ii) local meteorological forecasts for every ski resort; iii) simulation of the impact caused by skiers using a multi-agent system. The software has been evaluated in the Swiss ski resort of Verbier and provides useful predictions.
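As a minimal sketch of step iii), the following fragment lets simple skier agents erode snow on a gridded run over the 13-day horizon; the grid size, erosion rate, and random lateral movement are illustrative assumptions, not the Juste-Neige model itself.

```python
# Minimal sketch: skier agents descending a gridded run and eroding snow.
import numpy as np

rng = np.random.default_rng(1)
snow = np.full((50, 10), 80.0)    # snow height (cm) on a 50x10-cell run

def simulate_day(snow, n_skiers=500, erosion_cm=0.05):
    for _ in range(n_skiers):
        col = rng.integers(snow.shape[1])     # skier picks an entry lane
        for row in range(snow.shape[0]):      # ...and descends the run
            snow[row, col] -= erosion_cm      # each pass erodes the cell
            col = np.clip(col + rng.integers(-1, 2), 0, snow.shape[1] - 1)
    return snow

for day in range(13):                  # horizon matching the 13-day maps
    simulate_day(snow)
exposed = snow < 60.0                  # flag cells most exposed to erosion
print(f"{exposed.sum()} cells below the 60 cm threshold")
```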
Abstract:
Sequence alignment applications are an important tool for the scientific community. These bioinformatics applications are used in many different fields such as medicine, biology, pharmacology, and genetics. Today, sequence alignment algorithms have high computational complexity and must handle ever larger volumes of data. For this reason, alternatives must be sought so that these applications can cope with the growth that sequence databases are undergoing day by day. This project studies and investigates improvements to this type of application, such as the use of parallel systems, which can improve performance considerably.
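As a minimal sketch of the parallel approach suggested above, the following fragment distributes independent pairwise alignments across cores; the scoring function is a plain Needleman-Wunsch global-alignment score with a linear gap penalty, not a production aligner.

```python
# Minimal sketch: pairwise alignment scores computed in parallel.
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score with a linear gap penalty."""
    prev = [j * gap for j in range(len(b) + 1)]
    for i, ca in enumerate(a, 1):
        curr = [i * gap]
        for j, cb in enumerate(b, 1):
            curr.append(max(prev[j - 1] + (match if ca == cb else mismatch),
                            prev[j] + gap,       # gap in b
                            curr[j - 1] + gap))  # gap in a
        prev = curr
    return prev[-1]

if __name__ == "__main__":
    seqs = ["GATTACA", "GCATGCT", "GATTTACA", "ACTGACG"]
    pairs = list(combinations(range(len(seqs)), 2))
    with ProcessPoolExecutor() as pool:
        scores = pool.map(nw_score, (seqs[i] for i, _ in pairs),
                          (seqs[j] for _, j in pairs))
    for (i, j), s in zip(pairs, scores):
        print(f"seq{i} vs seq{j}: score {s}")
```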
Abstract:
A straightforward route is proposed for the multi-gram-scale synthesis of heterobifunctional poly(ethylene glycol) (PEG) oligomers combining a triethoxysilane extremity for the surface modification of metal oxides with amino or azido active end groups for further functionalization. The suitability of these PEG derivatives for conjugation to nanomaterials was shown by PEGylation of ultrasmall superparamagnetic iron oxide (USPIO) nanoparticles (NPs), followed by functionalization with small peptide ligands for biomedical applications.
Abstract:
The development of the field-scale Erosion Productivity Impact Calculator (EPIC) model was initiated in 1981 to support assessments of soil erosion impacts on soil productivity for soil, climate, and cropping conditions representative of a broad spectrum of U.S. agricultural production regions. The first major application of EPIC was a national analysis performed in support of the 1985 Resources Conservation Act (RCA) assessment. The model has continuously evolved since that time and has been applied in a wide range of field, regional, and national studies, both in the U.S. and in other countries. The range of EPIC applications has also expanded greatly over that time, including studies of (1) surface runoff and leaching estimates of nitrogen and phosphorus losses from fertilizer and manure applications, (2) leaching and runoff from simulated pesticide applications, (3) soil losses from wind erosion, (4) climate change impacts on crop yield and erosion, and (5) soil carbon sequestration assessments. The EPIC acronym now stands for Environmental Policy Integrated Climate, to reflect the greater diversity of problems to which the model is currently applied. The Agricultural Policy EXtender (APEX) model is essentially a multi-field version of EPIC that was developed in the late 1990s to address environmental problems associated with livestock and other agricultural production systems on a whole-farm or small-watershed basis. The APEX model also continues to evolve and to be utilized for a wide variety of environmental assessments. The historical development of both models will be presented, as well as example applications at several different scales.
Abstract:
This paper aims at illustrating some applications of Finite Random Set (FRS) theory to the design and analysis of wireless communication receivers, and at pointing out the similarities and differences between this scenario and that of multi-target tracking, where the use of FRS has traditionally been advocated. Two case studies are considered, i.e., multiuser detection in a dynamic environment and multicarrier (OFDM) transmission over a frequency-selective channel. Detector design and performance evaluation are discussed, along with the advantages of importing FRS-based estimation techniques into the context of wireless communications.
Abstract:
We have explored the possibility of obtaining first-order permeability estimates for saturated alluvial sediments based on the poro-elastic interpretation of the P-wave velocity dispersion inferred from sonic logs. Modern sonic logging tools designed for environmental and engineering applications allow P-wave velocity measurements at multiple emitter frequencies over a bandwidth covering 5 to 10 octaves. Methodological considerations indicate that, for saturated unconsolidated sediments in the silt-to-sand range and typical emitter frequencies ranging from approximately 1 to 30 kHz, the observable velocity dispersion should be sufficiently pronounced to allow reliable first-order estimation of the permeability structure. The corresponding predictions have been tested on, and verified for, a borehole penetrating a typical surficial alluvial aquifer. In addition to multifrequency sonic logs, a comprehensive suite of nuclear and electrical logs, an S-wave log, a litholog, and a limited number of laboratory measurements of permeability from retrieved core material were also available. This complementary information was found to be essential for parameterizing the poro-elastic inversion procedure and for assessing the uncertainty and internal consistency of the corresponding permeability estimates. Our results indicate that the permeability estimates thus obtained are largely consistent with those expected from the corresponding granulometric characteristics, as well as with the available evidence from laboratory measurements. These findings are also consistent with evidence from ocean acoustics, which indicates that, over a frequency range of several orders of magnitude, the classical theory of poro-elasticity is generally capable of explaining the observed P-wave velocity dispersion in medium- to fine-grained seabed sediments.
Abstract:
Granular flow phenomena are frequently encountered in the design of process and industrial plants in the traditional fields of the chemical, nuclear, and oil industries, as well as in other activities such as food and materials handling. Multi-phase flow is one important branch of granular flow. Granular materials behave unusually compared with normal materials, either solids or fluids. Although some of their characteristics are still not well known, one thing is confirmed: particle-particle interaction plays a key role in the dynamics of granular materials, especially dense granular materials. The first part of this thesis presents, in detail, the development of two models describing this interaction, based on the results of finite-element simulation, dimensional analysis, and numerical simulation. The first model describes the normal collision of viscoelastic particles. Building on existing models, additional parameters are introduced that allow the model to predict experimental results more accurately. The second model addresses oblique collisions and includes the effects of tangential velocity, angular velocity, and surface friction based on Coulomb's law. The theoretical predictions of this model agree with those of finite-element simulations. In the later chapters of the thesis, the models are used to predict industrial granular flows, and the agreement between simulations and experiments further validates the new models. The first case presents the simulation of granular flow passing over a circular obstacle. The simulations successfully predict the existence of a parabolic steady layer and show how particle characteristics, such as the coefficients of restitution and surface friction, affect the separation results. The second case is a spinning container filled with granular material. Employing the models above, the simulation also reproduces experimentally observed phenomena, such as a depression forming in the centre under high-frequency rotation. The third application concerns gas-solid mixed flow in a vertically vibrated device, in which the gas-phase motion is coupled with the particle motion. The governing equations of the gas phase are solved using large eddy simulation (LES), and the particle motion is predicted using the Lagrangian method. The simulation reproduces some of the pattern formation reported in experiments.
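As a minimal sketch of the kind of viscoelastic normal-contact model the thesis builds on, the following fragment integrates a linear spring-dashpot force law over a head-on collision and recovers an effective coefficient of restitution; the stiffness, damping, and mass values are illustrative, not the thesis's fitted parameters.

```python
# Minimal sketch: spring-dashpot normal contact between two equal spheres.
k, c, m = 1.0e5, 2.0, 1.0e-3     # spring stiffness, damping, particle mass
v0, dt = 1.0, 1.0e-6             # impact speed (m/s), time step (s)

delta, delta_dot = 0.0, v0       # overlap and overlap rate
while True:
    f = k * delta + c * delta_dot          # viscoelastic normal force
    delta_dot -= (f / (m / 2)) * dt        # reduced mass m/2 for equal spheres
    delta += delta_dot * dt
    if delta <= 0.0:                       # particles separate
        break

e = -delta_dot / v0                        # effective restitution coefficient
print(f"coefficient of restitution ~ {e:.3f}")
```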
Abstract:
In the modern business environment, the information systems supporting business operations have become critical resources for companies. The ability to exploit these resources depends on the reliability of the systems critical to the business in question and on the availability of the applications used. One situation in which the systems' ability to support real business processes is jeopardized is a disaster, whose impact can be local or cover large areas. Different types of disasters must be prepared for in the ways they require. One trend that influenced the architecture of critical information systems in the 1990s was the client/server approach. According to the client/server paradigm, an application is divided into tiers so that the presentation, application, and database layers can be physically separated while still forming a logically unified whole. From a business perspective, one of the revolutionary IT innovations of the 1990s was Enterprise Resource Planning (ERP) systems, which made it possible to manage the entire production chain and other process entities in near real time. The reliability of multi-tier ERP systems has proved challenging, since completely protecting all tiers against every possible disaster is impossible with current technology. To make compromises, the financial and business impacts of each lost process must be understood. This is precisely why ERP systems are interesting: they affect business processes throughout the company's entire process chain. Protecting multi-tier client/server-based ERP systems against disasters therefore requires applying several techniques and technologies and combining the whole into a process framework. In this way, a systematic part of the IT strategy can be created that addresses business continuity in a disaster situation and enables fast and complete recovery under all circumstances.
Abstract:
Within Data Envelopment Analysis, several alternative models allow for an environmental adjustment, and the majority of them deliver divergent results. Decision makers therefore face the difficult task of selecting the most suitable model. This study is performed to overcome this difficulty and thereby fills a research gap. First, a two-step web-based survey is conducted. It aims (1) to identify the selection criteria, (2) to prioritize and weight the selection criteria with respect to the goal of selecting the most suitable model, and (3) to collect preferences about which model is preferable for fulfilling each selection criterion. Second, the Analytic Hierarchy Process is used to quantify the preferences expressed in the survey. Results show that the understandability, applicability, and acceptability of the alternative models are valid selection criteria. The selection of the most suitable model depends on the preferences of the decision makers with regard to these criteria.
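The AHP step described above can be sketched as follows: criterion weights are derived as the principal eigenvector of a pairwise-comparison matrix, together with Saaty's consistency ratio; the 3x3 judgments below (understandability vs. applicability vs. acceptability) are illustrative, not the survey's actual responses.

```python
# Minimal sketch: AHP priority weights from a pairwise-comparison matrix.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()                      # normalize to sum to 1

# Consistency ratio (CR < 0.1 is conventionally acceptable); RI for n=3
n, RI = 3, 0.58
CI = (eigvals.real[principal] - n) / (n - 1)
print("weights:", weights.round(3), "CR:", round(CI / RI, 3))
```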
Abstract:
This Master's thesis examines threaded programming at the upper hierarchy level of parallel programming, focusing in particular on hyper-threading technology. The thesis examines the advantages and disadvantages of hyper-threading as well as its effects on parallel algorithms. The goal of the work was to understand the implementation of hyper-threading in the Intel Pentium 4 processor and to enable its exploitation where it brings a performance advantage. Performance data was collected and analyzed by running a large set of benchmarks under different conditions (memory handling, compiler settings, environment variables, etc.). Two types of algorithms were examined: matrix operations and sorting. These applications have a regular memory access pattern, which is a double-edged sword: it is an advantage in arithmetic-logic processing, but on the other hand it degrades memory performance. The reason is that modern processors have very good raw performance when processing regular data, whereas the memory architecture is limited by cache sizes and various buffers. When the problem size exceeds a certain limit, the actual performance can drop to a fraction of the peak performance.
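The cache effect described above can be sketched with a simple streaming benchmark whose working set grows past typical cache sizes; the sizes and the summation kernel are illustrative, far simpler than the thesis's benchmark suite.

```python
# Minimal sketch: throughput of a regular streaming kernel as the working
# set outgrows the caches. Expect a drop once sizes exceed the last-level cache.
import time
import numpy as np

for n_kib in (64, 512, 4096, 32768):        # working sets from L1-ish to RAM
    a = np.random.rand(n_kib * 1024 // 8)   # float64 array of n_kib KiB
    t0 = time.perf_counter()
    for _ in range(20):
        s = a.sum()                          # regular, streaming access
    dt = time.perf_counter() - t0
    gb_s = 20 * a.nbytes / dt / 1e9
    print(f"{n_kib:>6} KiB: {gb_s:.1f} GB/s")
```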
Abstract:
This paper presents an overview of the ground covered by atomic absorption spectrometry (AAS), tracing a line through the historical events in its development and its establishment as a multielement technique. Additionally, the efforts made by several researchers in pursuit of instrumental evolution are related, together with the advances, advantages, limitations, and trends of this approach. Several works focusing on its analytical applications are cited, employing simultaneous multielement determination by flame (FAAS) and/or graphite furnace (GF AAS), and fast sequential multielement determination using FAAS is also reported in the present review.