964 results for Ephemeral Computation


Relevance: 10.00%

Abstract:

Understanding the emplacement and growth of intrusive bodies in terms of mechanism, duration, thermal evolution and rates is a fundamental aspect of crustal evolution. Recent studies show that many plutons grow over several Ma by in-situ accretion of discrete magma pulses, which constitute small-scale magmatic reservoirs. The residence time of magmas, and hence their capacity to interact and differentiate, is controlled by the local thermal environment. The latter is highly dependent on 1) the emplacement depth, 2) the magma and country rock compositions, 3) the country rock thermal conductivity, 4) the rate of magma injection and 5) the geometry of the intrusion. In shallow-level plutons, where magmas solidify quickly, evidence for magma mixing and/or differentiation is considered by many authors to be inherited from deeper levels. This work shows, however, that in-situ differentiation and magma interactions occurred within basaltic and felsic sills at shallow depth (0.3 GPa) in the St-Jean-du-Doigt (SJDD) bimodal intrusion, France. This intrusion was emplaced ca. 347 Ma ago (ID-TIMS U-Pb on zircon) into the Precambrian crust of the Armorican Massif and preserves a remarkable record of sill-like emplacement of bimodal mafic-felsic magmas. Field evidence coupled with high-precision zircon U-Pb dating documents progressive thermal maturation within the incrementally built lopolith. Early metre-thick mafic sills (eastern part) form the roof of the intrusion and are homogeneous and fine-grained, with planar contacts against neighboring felsic sills; over a minimum 0.8 Ma time span, the system becomes warmer (western part). Sills are emplaced by under-accretion beneath the older eastern part, where they interact and mingle. A striking feature of this younger, warmer part is in-situ differentiation of the mafic sills in the top 40 cm of each layer, which suggests that liquids survived in the shallow crust. Rheological and thermal modelling was performed to determine the parameters required to allow the observed in-situ differentiation and accumulation processes. Strong constraints such as the total emplacement duration (ca. 0.8 Ma, TIMS dates) and pluton thickness (1.5 km, gravity model) allow a quantitative estimation of the parameters required (injection rates, incubation time, etc.). The results show that in-situ differentiation may be achieved in less than 10 years at such shallow depth, provided that: (1) The differentiating sills are injected beneath consolidated, yet still warm basalt sills, which act as low-conductivity insulating screens (formation of the eastern part of the SJDD intrusion). The latter are emplaced in a very short time (800 years) at a high injection rate (0.5 m/y) in order to create a "hot zone" in the shallow crust (incubation time). This implies that nearly one third of the pluton (400 m) is emplaced by sustained magmatic activity occurring on a short time scale at the very beginning of the system's history. (2) Once the incubation time is achieved, the calculations show that a small hot zone is created at the base of the sill pile, where new injections stay above their solidus temperature and may interact and differentiate. Differentiated residual liquids might eventually be extracted and mix with newly injected magma, as documented in active syn-emplacement shear zones within the "warm" part of the pluton.
(3) Finally, the models show that in order to maintain a permanent hot zone at shallow level, the injection rate must be about 0.03 m/y, with 5 m thick basaltic sills injected every 130 yr, implying formation of a 15 km thick pluton. As this thickness contradicts the one calculated for SJDD (1.5 km) and greatly exceeds the average thickness observed for many shallow-level plutons, I infer that there is no permanent hot zone (or magma chamber) at such shallow levels. I rather propose the formation of small, ephemeral (10-15 yr) reservoirs, which represent only small portions of the final size of the pluton. Thermal calculations show that, in the case of SJDD, 5 m thick basaltic sills emplaced every 1500 yr allow the formation of such ephemeral reservoirs. The latter are formed by several sills, which are in a mushy state and may interact and differentiate during a short time.

The mineralogical, chemical and isotopic data presented in this study suggest a signature intermediate between E-MORB and arc-like for the SJDD mafic sills and feeder dykes. The mantle source involved produced hydrated magmas and may be asthenosphere modified by "arc-type" components, probably related to a subducting slab. Combined fluid-mobile/immobile trace elements and Sr-Nd isotopes suggest that such subduction components are mainly fluids derived from altered oceanic crust, with a minor contribution from subducted sediments. The close match between the SJDD compositions and back-arc basin basalts (BABB) may point to a continental back-arc setting with little crustal contamination. If so, the SJDD intrusion is a major witness to an extensional tectonic regime during the Early Carboniferous, linked to the subduction of the Rheno-Hercynian Ocean beneath the Variscan terranes. Also of interest is the unusual association of cogenetic (same isotopic compositions) K-feldspar A-type granite and albite granite. The A-type granites may form by magma mixing between the mafic magma and crustal melts. Alternatively, they might derive from melting of a biotite-bearing quartzo-feldspathic crustal protolith triggered by early mafic injections at low crustal levels. The albite granite may form by remelting of plagioclase cumulates derived from A-type magma differentiation.
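
The feasibility argument hinges on how quickly thin basaltic sills lose heat by conduction at shallow depth. As a rough illustration (not the thermal model used in this work), the conductive cooling timescale t ~ L^2 / kappa for a sill of half-thickness L shows why metre-scale sills freeze within years unless insulated by earlier, still-warm injections; the diffusivity below is an assumed textbook value.

```python
# Illustrative conductive cooling timescales for basaltic sills.
# Rough scaling t ~ L^2 / kappa; kappa = 1e-6 m^2/s is an assumed typical
# thermal diffusivity, not a value taken from the SJDD study.

SECONDS_PER_YEAR = 3.15e7
KAPPA = 1e-6  # m^2/s, assumed thermal diffusivity

def cooling_time_years(thickness_m: float) -> float:
    """Order-of-magnitude conductive cooling time of a sill of given thickness."""
    half = thickness_m / 2.0
    return half ** 2 / KAPPA / SECONDS_PER_YEAR

for thickness in (1.0, 5.0, 40.0):  # metres
    print(f"{thickness:5.1f} m sill ~ {cooling_time_years(thickness):9.2f} yr to cool conductively")
```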

Relevance: 10.00%

Abstract:

Many eukaryote organisms are polyploid. However, despite their importance, evolutionary inference of polyploid origins and modes of inheritance has been limited by a need for analyses of allele segregation at multiple loci using crosses. The increasing availability of sequence data for nonmodel species now allows the application of established approaches for the analysis of genomic data in polyploids. Here, we ask whether approximate Bayesian computation (ABC), applied to realistic traditional and next-generation sequence data, allows correct inference of the evolutionary and demographic history of polyploids. Using simulations, we evaluate the robustness of evolutionary inference by ABC for tetraploid species as a function of the number of individuals and loci sampled, and the presence or absence of an outgroup. We find that ABC adequately retrieves the recent evolutionary history of polyploid species on the basis of both old and new sequencing technologies. The application of ABC to sequence data from diploid and polyploid species of the plant genus Capsella confirms its utility. Our analysis strongly supports an allopolyploid origin of C. bursa-pastoris about 80 000 years ago. This conclusion runs contrary to previous findings based on the same data set but using an alternative approach and is in agreement with recent findings based on whole-genome sequencing. Our results indicate that ABC is a promising and powerful method for revealing the evolution of polyploid species, without the need to attribute alleles to a homeologous chromosome pair. The approach can readily be extended to more complex scenarios involving higher ploidy levels.
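
For readers unfamiliar with the method, the core of ABC is simple: draw parameters from the prior, simulate data, and keep the draws whose summary statistics fall close to the observed ones. The sketch below is a generic rejection-ABC loop on a toy model (a single parameter estimated from a mean statistic); it illustrates the principle only, not the coalescent simulations or summary statistics used in this study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Toy data-generating model standing in for a coalescent simulator."""
    return rng.normal(loc=theta, scale=1.0, size=n)

def summary(x):
    """Toy summary statistic (the study would use many, e.g. diversity, divergence)."""
    return x.mean()

observed = simulate(theta=3.0)
s_obs = summary(observed)

# Rejection ABC: sample from the prior, simulate, keep draws whose summaries
# land within a tolerance of the observed summary.
tolerance = 0.1
accepted = []
for _ in range(20000):
    theta = rng.uniform(0.0, 10.0)          # prior draw
    s_sim = summary(simulate(theta))
    if abs(s_sim - s_obs) < tolerance:
        accepted.append(theta)

accepted = np.array(accepted)
print(f"posterior mean ~ {accepted.mean():.2f}, accepted draws = {accepted.size}")
```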

Relevance: 10.00%

Abstract:

Sensor networks have many applications in the monitoring and control of environmental properties such as sound, acceleration, vibration and temperature. Due to limited resources in computation capability, memory and energy, they are vulnerable to many kinds of attacks. The ZigBee specification, based on the IEEE 802.15.4 standard, defines a set of layers specifically suited to sensor networks. These layers support secure messaging using symmetric cryptography. This paper presents two different ways of grabbing the cryptographic key in ZigBee: a remote attack and a physical attack. It also surveys and categorizes some additional attacks which can be performed on ZigBee networks: eavesdropping, spoofing, replay and DoS attacks at different layers. From this analysis, it is shown that some vulnerabilities still exist in the security scheme of ZigBee technology.
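
ZigBee's link-layer security is built on AES in CCM* mode with a 128-bit symmetric network key, which is why the key-grabbing attacks above are so damaging: whoever holds the key can decrypt and forge all traffic. Below is a minimal sketch of AES-CCM authenticated encryption using Python's `cryptography` package; it shows generic CCM, not the exact CCM* framing and auxiliary headers of 802.15.4.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

key = AESCCM.generate_key(bit_length=128)   # the ZigBee network key would play this role
aesccm = AESCCM(key)

nonce = os.urandom(13)                      # 802.15.4 uses a 13-byte nonce
header = b"frame-header"                    # authenticated but not encrypted
payload = b"sensor reading: 21.5 C"

ciphertext = aesccm.encrypt(nonce, payload, header)
plaintext = aesccm.decrypt(nonce, ciphertext, header)
assert plaintext == payload

# An attacker who has extracted `key` (remotely or physically) can run exactly
# the same decrypt call on sniffed frames: confidentiality and integrity both
# collapse once the symmetric key is known.
```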

Relevance: 10.00%

Abstract:

The aim of this work was to develop an automatic optimization system for a small combined heat and power (CHP) plant owned by an energy company. The need for optimization stems from the company's electricity purchases from the power exchange, the purchase price of gas, the local electricity and heat loads at the site, and other factors affecting the economy of the plant. In the future, the optimization system is intended to manage several distributed energy production units in a centralized way. An algorithm was developed that optimizes the plant's economy using operating models that adjust the electric power output and a direct electric power setpoint. The benefits produced by the algorithm were evaluated using historical measurement data from the CHP plant of Harjun oppimiskeskus. For optimizing the operation of CHP plants, a system based on centralized computation and distributed control was created. It controls the CHP plants in real time and predicts their future operation with a time-series model based on historical data. The functionality of the optimization system and the benefit obtained were evaluated at the Harjun oppimiskeskus CHP plant by comparing the realized benefit calculated from measurements with the predicted benefit calculated by the optimization system.
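
The economic core of such an optimization is an hour-by-hour dispatch decision: run the engine when the value of the electricity (spot price) plus the heat it displaces exceeds the fuel cost, otherwise buy from the exchange and use backup heat. The sketch below is a deliberately simplified rule with made-up prices and plant parameters, not the algorithm or operating models developed in this work.

```python
from dataclasses import dataclass

@dataclass
class CHPPlant:
    # Assumed illustrative parameters, not those of the Harjun oppimiskeskus plant.
    p_el_mw: float = 0.8        # electric output at full load
    p_heat_mw: float = 1.2      # heat output at full load
    fuel_mw: float = 2.4        # fuel input at full load
    gas_price: float = 30.0     # EUR/MWh of fuel
    heat_value: float = 40.0    # EUR/MWh, cost of producing the same heat otherwise

def run_hour(plant: CHPPlant, spot_price: float) -> bool:
    """Return True if running the CHP unit this hour beats buying electricity."""
    revenue = plant.p_el_mw * spot_price + plant.p_heat_mw * plant.heat_value
    cost = plant.fuel_mw * plant.gas_price
    return revenue > cost

plant = CHPPlant()
for price in (10.0, 35.0, 80.0):  # EUR/MWh spot prices
    print(f"spot {price:5.1f} EUR/MWh -> run = {run_hour(plant, price)}")
```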

Relevance: 10.00%

Abstract:

This master's thesis examines the development of Oulun Energia's district heating operations in the near future. As part of the work, the heat load corresponding to the design outdoor temperature of -32 °C in the current situation was determined by statistical analysis, and a growth forecast of the district heating power demand was prepared for the next fifteen years. Based on the growth forecast, the adequacy of the district heating reserve capacity was assessed. The power transfer capability of the network under current and future load conditions was examined with Process Vision's Grades Heating network calculation software. According to this analysis, the transfer capability of the district heating network is reasonably good. The problem areas of the network are the transmission lines running in the west-east direction. The amount of reserve capacity will fall below the recommended level in the coming years unless new heat production capacity is built. In the first phase, the best way to remedy the situation would be to build new heating plants in both the southern and eastern parts of the city. In the 2010s, the need to build a new power plant to cover the district heating demand will grow.
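
Determining the heat load at the -32 °C design temperature typically means regressing observed peak loads against outdoor temperature and extrapolating to the design point. The sketch below does exactly that on made-up data; the data points, the linear model and the resulting figure are illustrative assumptions, not values from this thesis.

```python
import numpy as np

# Made-up observations of (outdoor temperature degC, peak district heat load MW).
temps = np.array([5.0, 0.0, -5.0, -10.0, -15.0, -20.0, -25.0])
loads = np.array([110.0, 140.0, 168.0, 195.0, 225.0, 250.0, 280.0])

# Least-squares linear fit: load ~ a * temperature + b.
a, b = np.polyfit(temps, loads, deg=1)

design_temp = -32.0
design_load = a * design_temp + b
print(f"fit: load = {a:.2f} * T + {b:.1f}")
print(f"estimated load at {design_temp} degC: {design_load:.0f} MW")
```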

Relevance: 10.00%

Abstract:

The decision-making process regarding drug dose, regularly used in everyday medical practice, is critical to patients' health and recovery. It is a challenging process, especially for drugs with narrow therapeutic ranges, in which the medical doctor decides the quantity (dose amount) and frequency (dose interval) on the basis of a set of available patient features and his or her clinical experience (a priori adaptation). Computer support in drug dose administration makes the prescription procedure faster, more accurate, more objective and less expensive, with a tendency to reduce the number of invasive procedures. This paper presents an advanced integrated Drug Administration Decision Support System (DADSS) to help clinicians and patients with dose computation. Based on a support vector machine (SVM) algorithm enhanced with the random sample consensus (RANSAC) technique, the system is able to predict drug concentration values and to compute the ideal dose amount and dose interval for a new patient. With an extension that combines the SVM method with an explicit analytical model, the advanced integrated DADSS is able to compute drug concentration-versus-time curves for a patient under different conditions. A feedback loop is enabled to update the curve with a newly measured concentration value to make it more personalized (a posteriori adaptation).
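
A minimal sketch of the underlying idea: a support vector regressor wrapped in a simple RANSAC-style loop that discards outlying measurements before the final fit, using scikit-learn and synthetic data. The feature set, kernel and thresholds below are assumptions made for illustration; the actual DADSS feature engineering and analytical model are not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Synthetic patient features (dose, body weight, age) -> measured drug concentration.
X = rng.uniform([100, 50, 20], [400, 100, 80], size=(80, 3))
y = 0.02 * X[:, 0] - 0.03 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.3, 80)
y[:8] += 8.0                                   # a few grossly erroneous measurements

def fit_svr(X, y):
    return make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)).fit(X, y)

def ransac_svr(X, y, n_iter=50, n_sample=20, threshold=1.0):
    """RANSAC-style loop: fit on random subsets, keep the largest consensus set."""
    best_inliers = np.zeros(len(y), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(y), n_sample, replace=False)
        model = fit_svr(X[idx], y[idx])
        inliers = np.abs(model.predict(X) - y) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if best_inliers.sum() < n_sample:          # degenerate case: fall back to all data
        best_inliers[:] = True
    return fit_svr(X[best_inliers], y[best_inliers])

model = ransac_svr(X, y)
new_patient = np.array([[250.0, 70.0, 45.0]])  # dose, weight, age of a new patient
print(f"predicted concentration: {model.predict(new_patient)[0]:.2f}")
```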

Relevance: 10.00%

Abstract:

A numerical computation of viscous, heat-conducting transonic flow over a generic commercial rocket profile with a symmetric oversized nose part was carried out. It was shown that, at zero angle of attack, the flow pattern loses its symmetry for some free-stream velocity values. This results in a non-uniform pressure distribution on the rocket surface in the angular direction, which may give rise to additional oscillating stresses on the rocket. It was also found that the non-symmetric flow patterns obtained are stable with respect to small velocity perturbations.

Relevance: 10.00%

Abstract:

The aim of this work is to develop a valuation model based on the Microsoft Excel spreadsheet program. With the model, analysts and investors doing equity research can determine the fundamental value of a share. The model is developed in particular as a tool for retail investors. The second aim of the work is to apply the developed valuation model to the valuation of the case company, F-Secure, and to determine with the model whether F-Secure's share is correctly priced on the stock exchange relative to its fundamentals. The theoretical part of the work presents the uses and history of valuation, the stages of the valuation process (strategic analysis, financial statement analysis, forecasting, calculation of company value), the determination of the cost of capital, and the valuation methods available to an investor, namely the models used in discounted cash flow valuation and the multiples of relative valuation. The empirical part comprises the development of the valuation model and a description of its structure, as well as the valuation process of F-Secure. Although F-Secure's future looks quite bright, the share is currently (23 February 2006) priced on the market higher than would be reasonable given these expectations. The different methods give the share values between EUR 2.25 and EUR 2.97. As the median of the different methods, the developed Excel model sets a target price of EUR 2.29 for the F-Secure share. As a result of the study, the F-Secure share can be considered overvalued, since its price on the stock exchange is EUR 3.05.
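
The discounted cash flow part of such a model reduces to a present-value calculation: forecast free cash flows for a few years, add a terminal value from a constant-growth (Gordon) assumption, and discount everything at the cost of capital. The sketch below shows that mechanic with invented numbers; it is not the Excel model built in this work, and the cash flows, discount rate, growth rate and share count are placeholders, not F-Secure figures.

```python
def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Present value of explicit cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv += terminal / (1 + discount_rate) ** len(cash_flows)
    return pv

# Placeholder forecast of free cash flows (MEUR) and parameters.
cash_flows = [18.0, 20.0, 22.0, 24.0, 26.0]
equity_value = dcf_value(cash_flows, discount_rate=0.10, terminal_growth=0.03)
shares_outstanding = 160.0   # millions, placeholder

print(f"equity value: {equity_value:.1f} MEUR")
print(f"value per share: {equity_value / shares_outstanding:.2f} EUR")
```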

Relevance: 10.00%

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as a paradigm species. This task was achieved in three steps.

A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and colonisation of new areas. Habitat suitability and dispersal obstacles therefore also had to be modelled. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, result validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species.

Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software package named HexaSpace was developed, which achieves two functions: 1° calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); 2° running simulations. It allows studying the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers and inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to build a complex, realistic model from raw data, and as they offer an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
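
The demographic core described above, an age-structured Leslie matrix projection with density dependence, can be illustrated in a few lines. The matrix entries, carrying capacity and damping form below are invented for illustration; SIM-Ibex's actual parameterisation (sex structure, environmental stochasticity, culling) is richer.

```python
import numpy as np

# Toy 4-age-class Leslie matrix: first row = fecundities, sub-diagonal = survival rates.
L = np.array([
    [0.0,  0.4,  0.9, 0.8],
    [0.75, 0.0,  0.0, 0.0],
    [0.0,  0.85, 0.0, 0.0],
    [0.0,  0.0,  0.9, 0.0],
])
K = 500.0                               # assumed carrying capacity
lam = max(np.linalg.eigvals(L).real)    # asymptotic growth rate of the matrix

def project(n, years=40):
    """Project total abundance with a Beverton-Holt style density-dependence term."""
    totals = [n.sum()]
    for _ in range(years):
        n = L @ n
        n = n / (1.0 + (lam - 1.0) * totals[-1] / K)   # damping toward K (an assumption)
        totals.append(n.sum())
    return totals

n0 = np.array([20.0, 10.0, 5.0, 5.0])   # small founder population, e.g. a reintroduction
print([int(t) for t in project(n0)[::5]])
```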

Relevance: 10.00%

Abstract:

The rapid development of new technologies such as digital medical imaging has led to an expansion of brain functional studies. One of the key methodological issues in such studies is comparing neuronal activation between individuals. In this context, the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparisons by normalising subjects' brains to a standard brain. The most widely used standard brains are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute (MNI) standard brain (SPM99). However, these methods are not precise enough for the superposition of the more variable portions of the cerebral cortex (e.g. the neocortex and the perisylvian zone) or of brain regions that are highly asymmetric between the two cerebral hemispheres (e.g. the planum temporale). The aim of this thesis is to evaluate a new image processing technique based on non-linear, model-based registration. Contrary to intensity-based registration, model-based registration uses spatial rather than intensity information to fit one image to another. We extract identifiable anatomical features (point landmarks) in both the deforming and the target images, and from their correspondence we determine the appropriate deformation in 3D. As landmarks, we use six control points: one on Heschl's gyrus, one on the motor hand area and one on the sylvian fissure, bilaterally. The evaluation of this model-based approach is performed on MRI and fMRI images of nine of the eighteen subjects who participated in a previous study by Maeder et al. Results on the anatomical (MRI) images show the movement of the deforming brain's control points to the locations of the reference brain's control points. The distance between the deforming brain and the reference brain is smaller after registration than before. Registration of the functional (fMRI) images does not show a significant variation: the small number of registration landmarks (six) is clearly not sufficient to produce significant modifications of the fMRI statistical maps. This thesis opens the way to a new computational technique for cortex registration, whose main direction will be the improvement of the registration algorithm, using not a single point as a landmark but many points representing a particular sulcus.
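
The principle, paired anatomical landmarks in the deforming and reference brains determining a 3D transformation that is then applied to the whole volume, can be sketched with a simple least-squares affine fit. This is only an illustration of landmark-driven registration; the method evaluated in the thesis is non-rigid, and the coordinates below are invented.

```python
import numpy as np

# Invented 3D coordinates (mm) of six paired control points
# (e.g. Heschl's gyrus, motor hand area, sylvian fissure, bilaterally).
moving = np.array([[42, -22, 10], [-40, -24, 9], [38, -20, 55],
                   [-36, -22, 54], [50, -5, 12], [-48, -7, 11]], float)
reference = 1.05 * moving + np.array([2.0, -1.5, 3.0])   # synthetic target positions

# Least-squares affine transform: reference ~ [moving, 1] @ params.
X = np.hstack([moving, np.ones((len(moving), 1))])        # homogeneous coordinates
params, *_ = np.linalg.lstsq(X, reference, rcond=None)    # (4, 3) parameter matrix

def apply_affine(points):
    """Apply the fitted affine transform to an (n, 3) array of points."""
    return np.hstack([points, np.ones((len(points), 1))]) @ params

residual = np.linalg.norm(apply_affine(moving) - reference, axis=1)
print("residual distance per landmark (mm):", np.round(residual, 3))
```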

Relevance: 10.00%

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result, and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
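
As a toy illustration of the kind of connectivity condition involved, the sketch below checks, on a directed trust graph, whether a majority of processes can reach some trusted process. This is only an assumed reading of the "reachable majority condition" used for illustration; the thesis's formal definition should be consulted for the actual statement.

```python
from collections import deque

def can_reach_trusted(graph, start, trusted):
    """BFS over directed trust edges: can `start` reach any trusted process?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node in trusted:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

def reachable_majority(graph, processes, trusted):
    """Assumed reading: a strict majority of processes can reach a trusted process."""
    count = sum(can_reach_trusted(graph, p, trusted) for p in processes)
    return count > len(processes) / 2

# Hypothetical 5-process system with one trusted module T.
graph = {"p1": ["T"], "p2": ["p1"], "p3": ["p2"], "p4": [], "p5": ["p4"]}
print(reachable_majority(graph, ["p1", "p2", "p3", "p4", "p5"], {"T"}))  # True: p1, p2, p3
```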

Relevance: 10.00%

Abstract:

Relief mapping gives good results for the creation of 3D impostor models. An impostor model is a simplification of an original geometric model that is used to replace it. The original volume can then be reproduced in a high-quality, highly compact representation with very few artifacts or cracks. We have studied the state of the art on relief impostors and some current techniques related to them. In particular, we have implemented the Omni-directional Relief Impostors (ORI) technique and its hierarchical extension (HORI), through the use of spatial partitioning methods. We present an alternative for the spatial distribution and selection of the impostors. Furthermore, we present a different computation of the rendering view distance that guarantees a minimal quality for the simplified representation. Finally, we discuss the results obtained and propose some new ideas and approaches to enhance the efficiency and quality of the final rendering using the ORI and HORI techniques. In addition, our implementation has involved a software engineering study in the Open Source field.
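
One common way to pick a minimum rendering view distance for an impostor is to require that a single impostor texel never covers more than one screen pixel, a pure perspective-projection bound. The sketch below computes that distance; it is a generic screen-space-error criterion given for illustration, not necessarily the computation proposed in this work, and the scene parameters are hypothetical.

```python
import math

def min_view_distance(texel_size, fov_y_deg, screen_height_px, max_pixels_per_texel=1.0):
    """Distance beyond which one texel projects to at most `max_pixels_per_texel` pixels."""
    # World-space size of one pixel at distance d: 2 * d * tan(fov/2) / screen_height.
    # Require texel_size <= max_pixels_per_texel * pixel_size(d) and solve for d.
    fov = math.radians(fov_y_deg)
    return texel_size * screen_height_px / (2.0 * math.tan(fov / 2.0) * max_pixels_per_texel)

# Hypothetical impostor: 512-texel relief map covering a 2 m tall object, 60 deg FOV, 1080p.
texel = 2.0 / 512
print(f"minimum view distance: {min_view_distance(texel, 60.0, 1080):.2f} m")
```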

Relevance: 10.00%

Abstract:

Numerical weather prediction and climate simulation have been among the computationally most demanding applications of high-performance computing ever since they began in the 1950s. Since the 1980s, the most powerful computers have featured an ever larger number of processors. By the early 2000s, this number was often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation, but the amount of necessary communication between the processors. The communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors for a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high-performance computers. The results attained in these articles have already been widely applied, so that currently most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world use methods introduced in them. Some further studies extend parallelization opportunities into other parts of the weather forecasting environment, in particular to the data assimilation of satellite observations.
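
The communication bottleneck mentioned above is largely a surface-to-volume effect of domain decomposition: per-processor computation shrinks with the subdomain area while halo communication shrinks only with its perimeter, so the communication share grows as more processors are used. A small back-of-the-envelope sketch of that scaling for a 2D grid decomposition (the grid size and processor counts are arbitrary examples, not figures from the articles):

```python
import math

def comm_to_comp_ratio(grid_points_per_side, processors, halo_width=1):
    """Rough perimeter/area ratio for a square 2D domain split into square subdomains."""
    side = grid_points_per_side / math.sqrt(processors)   # subdomain edge length in points
    computation = side * side                              # points updated per processor
    communication = 4 * side * halo_width                  # halo points exchanged per step
    return communication / computation

for p in (64, 1024, 16384):
    print(f"{p:6d} processors: comm/comp ~ {comm_to_comp_ratio(2000, p):.3f}")
```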