964 results for Ephemeral Computation
Abstract:
Characterizing geological features and structures in three dimensions on inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Indeed, detailed 3D data such as LiDAR point clouds make it possible to study accurately the hazard processes and the structure of geological features, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential to be applied to a wide range of geological investigations, both in research and in applied projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral / hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remote data. During the past ten years several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even when inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to increase the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability.
Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute the failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale. Similar procedures were already available to evaluate rockfall susceptibility based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The computed most probable rockfall source areas in granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared with inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its significant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last ten years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling.
In particular, I conducted a back-analysis of the large rockfall event of 2005 (265,000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique for obtaining vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of unstable rock compartments. Finally, the following points summarize the main outputs of my research: The new model to compute failure mechanisms and rockfall susceptibility with 3D point clouds makes it possible to accurately define the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed measurements of the fractures into geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geologic structures. The integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
-- Characterizing the geology of inaccessible rock walls in 3D is a necessary step for assessing natural hazards such as rockfalls and rockslides, and also for building stratigraphic or fold-structure models. 3D geological models have great potential to be applied to a wide range of geological work in research, but also in applied projects such as mines, tunnels or reservoirs. Recent developments in ground-based remote sensing tools (LiDAR, photogrammetry and multispectral / hyperspectral imaging) are revolutionizing the acquisition of geomorphological and geological information. Consequently, there is great potential for improving the modeling of geological objects, as well as of failure mechanisms and stability conditions, by integrating detailed remotely acquired data. To improve the possibilities of forecasting future rockfalls, it is fundamental to understand the current evolution of rock wall stability. Defining the zones theoretically most prone to rockfalls can be very useful for simulating block propagation trajectories and for producing hazard maps, which form the basis of land use planning in mountain regions. The most important questions to resolve in order to estimate rockfall hazard are: Where are the most probable sources of future rockfalls located? How frequently will these events occur? I therefore characterized the fracture networks in the field and with LiDAR point clouds. I then developed a model to compute the failure mechanisms directly on the point clouds, in order to evaluate the susceptibility to rockfall triggering at the scale of the wall.
The most probable rockfall source areas in the granitic walls of Yosemite Valley and the Mont-Blanc massif were computed and then compared with the inventories of events to verify the methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on wall stability. The impact of the degradation of rock bridges on the stability of large rock compartments in the west face of the Petit Dru was evaluated using finite element modeling. In particular, I analyzed the large 2005 rockfall (265,000 m3), which removed the entire southwest pillar. Into the model I integrated observations of joint conditions, the characteristics of the fracture network and the results of geomechanical tests on the intact rock. These analyses improved the estimation of the parameters that influence the stability of rock compartments and served to define probable volumes for future rockfalls. The point clouds obtained with terrestrial laser scanning were also successfully used to produce geological maps in 3D, using the intensity of the reflected signal. Another technique for obtaining geological maps of vertical zones consists in combining a LiDAR mesh with a 2D geological map. At El Capitan (Yosemite Valley) we were able to georeference a vertical map of the main plutonic rocks, which I then used to study the reasons for preferential erosion of certain zones of the wall. Further efforts to quantify the erosion rate were made at Monte Generoso (Ticino, Switzerland), where I tried to improve the estimate of long-term erosion by taking into account the volumes of unstable rock compartments.
The integration of these results on rock mass fracturing and composition with existing methods improves the assessment of rockfall hazard and widens the possibilities for interpreting the evolution of rock walls.
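The patch-based screening of failure mechanisms on a point cloud can be illustrated with a deliberately minimal sketch (not the thesis's actual model): estimate each patch's normal by PCA, derive the dip angle, and flag patches whose dip exceeds a friction angle as candidate planar-sliding sources.

```python
import numpy as np

def unit_normal(patch):
    """Estimate the unit normal of a small point-cloud patch by PCA:
    the eigenvector of the covariance matrix with the smallest
    eigenvalue is orthogonal to the best-fit plane."""
    centered = patch - patch.mean(axis=0)
    _, eigvecs = np.linalg.eigh(centered.T @ centered)
    n = eigvecs[:, 0]                   # eigh returns ascending eigenvalues
    return n if n[2] >= 0 else -n       # orient the normal upward

def dip_angle_deg(normal):
    """Dip of the local plane: angle between its normal and the vertical."""
    return float(np.degrees(np.arccos(min(1.0, abs(normal[2])))))

def is_potential_source(patch, friction_angle_deg=35.0):
    """Flag a patch as a potential planar-sliding source when its dip
    exceeds the friction angle (a deliberately crude criterion)."""
    return dip_angle_deg(unit_normal(patch)) > friction_angle_deg
```

The friction-angle threshold is illustrative only; the thesis's susceptibility model combines fracture sets and several failure mechanisms.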
Abstract:
We have investigated the behavior of bistable cells made up of four quantum dots and occupied by two electrons, in the presence of realistic confinement potentials produced by depletion gates on top of a GaAs/AlGaAs heterostructure. Such a cell represents the basic building block for logic architectures based on the concept of quantum cellular automata (QCA) and of ground state computation, which have been proposed as an alternative to traditional transistor-based logic circuits. We have focused on the robustness of the operation of such cells with respect to asymmetries derived from fabrication tolerances. We have developed a two-dimensional model for the calculation of the electron density in a driven cell in response to the polarization state of a driver cell. Our method is based on the one-shot configuration-interaction technique, adapted from molecular chemistry. From the results of our simulations, we conclude that an implementation of QCA logic based on simple "hole arrays" is not feasible, because of the extreme sensitivity to fabrication tolerances. As an alternative, we propose cells defined by multiple gates, where geometrical asymmetries can be compensated for by adjusting the bias voltages. Even though not immediately applicable to the implementation of logic gates and not suitable for large scale integration, the proposed cell layout should allow an experimental demonstration of a chain of QCA cells.
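The bistable, polarization-driven response of a QCA cell can be illustrated with the standard two-state model, a drastic simplification of the configuration-interaction calculation described above; `e_k` (electrostatic coupling) and `gamma` (tunnelling energy) are illustrative parameters, not values from the paper.

```python
import numpy as np

def driven_cell_polarization(p_driver, e_k=1.0, gamma=0.1):
    """Two-state model of a driven QCA cell: the driver polarization
    biases the two antipodal electron configurations of the driven
    cell, while tunnelling (gamma) couples them. The ground state of
    the 2x2 Hamiltonian gives the driven cell's polarization."""
    e = 0.5 * e_k * p_driver
    h = np.array([[-e, -gamma], [-gamma, e]])
    _, vecs = np.linalg.eigh(h)
    ground = vecs[:, 0]                  # eigh sorts eigenvalues ascending
    return ground[0]**2 - ground[1]**2   # difference of configuration weights
```

For small `gamma` the response saturates near plus or minus 1 even for weak driver polarization, which is the nonlinear, bistable behavior that makes ground-state computation possible.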
Abstract:
Lexical diversity measures are notoriously sensitive to variations of sample size and recent approaches to this issue typically involve the computation of the average variety of lexical units in random subsamples of fixed size. This methodology has been further extended to measures of inflectional diversity such as the average number of wordforms per lexeme, also known as the mean size of paradigm (MSP) index. In this contribution we argue that, while random sampling can indeed be used to increase the robustness of inflectional diversity measures, using a fixed subsample size is only justified under the hypothesis that the corpora that we compare have the same degree of lexematic diversity. In the more general case where they may have differing degrees of lexematic diversity, a more sophisticated strategy can and should be adopted. A novel approach to the measurement of inflectional diversity is proposed, aiming to cope not only with variations of sample size, but also with variations of lexematic diversity. The robustness of this new method is empirically assessed and the results show that while there is still room for improvement, the proposed methodology considerably attenuates the impact of lexematic diversity discrepancies on the measurement of inflectional diversity.
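The MSP index and its fixed-size random-subsampling variant can be sketched in a few lines; this is a minimal illustration, and the representation of tokens as (lexeme, wordform) pairs is an assumption, not the paper's data format.

```python
import random
from collections import defaultdict

def msp(sample):
    """Mean size of paradigm: distinct word forms per distinct lexeme,
    over a sample of (lexeme, wordform) token pairs."""
    forms = defaultdict(set)
    for lexeme, form in sample:
        forms[lexeme].add(form)
    return sum(len(s) for s in forms.values()) / len(forms)

def mean_msp(tokens, subsample_size, n_draws=100, seed=0):
    """Average MSP over random subsamples of fixed token size: the
    standard way of controlling for sample-size sensitivity that the
    abstract argues is only valid under equal lexematic diversity."""
    rng = random.Random(seed)
    return sum(msp(rng.sample(tokens, subsample_size))
               for _ in range(n_draws)) / n_draws
```

The paper's proposal goes further by also controlling for lexematic diversity; the sketch above only shows the fixed-size baseline it starts from.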
Abstract:
This paper is concerned with the contribution of forensic science to the legal process by helping reduce uncertainty. Although it is now widely accepted that uncertainty should be handled by probability because it is a safeguard against incoherent proceedings, there remain diverging and conflicting views on how probability ought to be interpreted. This is exemplified by the proposals in scientific literature that call for procedures of probability computation that are referred to as "objective," suggesting that scientists ought to use them in their reporting to recipients of expert information. I find such proposals objectionable. They need to be viewed cautiously, essentially because ensuing probabilistic statements can be perceived as making forensic science prescriptive. A motivating example from the context of forensic DNA analysis will be chosen to illustrate this. As a main point, it shall be argued that such constraining suggestions can be avoided by interpreting probability as a measure of personal belief, that is, subjective probability. Invoking references to foundational literature from mathematical statistics and philosophy of science, the discussion will explore the consequences of this interdisciplinary viewpoint for the practice of forensic expert reporting. It will be emphasized that, as an operational interpretation of probability, the subjectivist perspective enables forensic science to add value to the legal process, in particular by avoiding inferential impasses to which other interpretations of probability may lead. Moreover, understanding probability from a subjective perspective can encourage participants in the legal process to take on more responsibility in matters regarding the coherent handling of uncertainty. This would assure more balanced interactions at the interface between science and the law. This, in turn, provides support for ongoing developments that can be called the "probabilization" of forensic science.
Abstract:
Concurrent aims to be a different type of task distribution system from MPI-like systems. It adds a simple but powerful application abstraction layer that distributes the logic of an entire application onto a swarm of clusters, bearing similarities to volunteer computing systems. Traditional distributed task systems simply run individual tasks on the distributed system and wait for results. Concurrent goes one step further by letting the tasks and the application decide what to do. The programming paradigm is thus fully asynchronous: there is no waiting for results; instead, notifications are issued once a computation has been performed.
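The notification-driven style described above can be sketched in Python's asyncio; this is a hypothetical illustration of the paradigm, not Concurrent's actual API: tasks never block on each other's results, and a callback fires once a computation completes.

```python
import asyncio

async def compute(x):
    """Stand-in for a remotely executed task."""
    await asyncio.sleep(0)
    return x * x

def on_done(task, results):
    """Notification handler: the result arrives here, and the handler
    could itself decide to schedule further work."""
    results.append(task.result())

async def main():
    results = []
    tasks = [asyncio.create_task(compute(x)) for x in range(4)]
    for t in tasks:
        t.add_done_callback(lambda t: on_done(t, results))
    await asyncio.gather(*tasks)   # only the driver waits, not the tasks
    await asyncio.sleep(0)         # let any pending notifications run
    return sorted(results)
```

Running `asyncio.run(main())` collects every notification; the key point is that no task ever waits for another's result.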
Abstract:
The objective of this study is to show that bone strains due to dynamic mechanical loading during physical activity can be analysed using the flexible multibody simulation approach. Strains within the bone tissue play a major role in bone (re)modeling. Previous studies have shown that dynamic loading seems to be more important for bone (re)modeling than static loading. The finite element method has been used previously to assess bone strains. However, it may be limited to static analysis of bone strains because of the expensive computation required for dynamic analysis, especially for a biomechanical system consisting of several bodies. Further, in vivo implementation of strain gauges on bone surfaces has been used previously to quantify the mechanical loading environment of the skeleton. However, in vivo strain measurement requires invasive methodology, which is challenging and limited to certain regions of superficial bones only, such as the anterior surface of the tibia. In this study, an alternative numerical approach to analyzing in vivo strains, based on the flexible multibody simulation approach, is proposed. In order to investigate the reliability of the proposed approach, three 3-dimensional musculoskeletal models, in which the right tibia is assumed to be flexible, are used as demonstration examples. The models are employed in a forward dynamics simulation to predict the tibial strains during level walking. The flexible tibia model is developed using the actual geometry of the subject's tibia, obtained from 3-dimensional reconstruction of magnetic resonance images. Inverse dynamics simulation based on motion capture data obtained from walking at a constant velocity is used to calculate the desired contraction trajectory for each muscle.
In the forward dynamics simulation, a proportional-derivative servo controller is used to calculate each muscle force required to reproduce the motion, based on the desired muscle contraction trajectory obtained from the inverse dynamics simulation. Experimental measurements are used to verify the models and check their accuracy in replicating the realistic mechanical loading environment measured in the walking test. The strains predicted by the models are consistent with literature-based in vivo strain measurements. In conclusion, the non-invasive flexible multibody simulation approach may be used as a surrogate for experimental bone strain measurement, and thus be of use in detailed strain estimation of bones in different applications. Consequently, the information obtained from the present approach might be useful in clinical applications, including optimizing implant design and devising exercises to prevent bone fragility, accelerate fracture healing and reduce osteoporotic bone loss.
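The proportional-derivative servo step can be sketched as follows; a unit point mass stands in for a muscle-actuated segment, and the gains and time step are illustrative values, not those of the study.

```python
def pd_track(desired, kp=400.0, kd=40.0, dt=0.001, mass=1.0):
    """PD servo control: at each step the force is proportional to the
    position error plus the velocity error relative to the desired
    trajectory, and the resulting motion is integrated forward."""
    x, v, out = desired[0], 0.0, []
    for i in range(1, len(desired)):
        x_des = desired[i]
        v_des = (desired[i] - desired[i - 1]) / dt   # desired velocity
        f = kp * (x_des - x) + kd * (v_des - v)      # servo force
        v += (f / mass) * dt                         # semi-implicit Euler
        x += v * dt
        out.append(x)
    return out
```

With these gains the closed loop is critically damped, so the simulated coordinate settles onto the desired trajectory after a short transient.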
Abstract:
In recent years, new analytical tools have allowed researchers to extract historical information contained in molecular data, which has fundamentally transformed our understanding of processes ruling biological invasions. However, the use of these new analytical tools has been largely restricted to studies of terrestrial organisms despite the growing recognition that the sea contains ecosystems that are amongst the most heavily affected by biological invasions, and that marine invasion histories are often remarkably complex. Here, we studied the routes of invasion and colonisation histories of an invasive marine invertebrate Microcosmus squamiger (Ascidiacea) using microsatellite loci, mitochondrial DNA sequence data and 11 worldwide populations. Discriminant analysis of principal components, clustering methods and approximate Bayesian computation (ABC) methods showed that the most likely source of the introduced populations was a single admixture event that involved populations from two genetically differentiated ancestral regions - the western and eastern coasts of Australia. The ABC analyses revealed that colonisation of the introduced range of M. squamiger consisted of a series of non-independent introductions along the coastlines of Africa, North America and Europe. Furthermore, we inferred that the sequence of colonisation across continents was in line with historical taxonomic records - first the Mediterranean Sea and South Africa from an unsampled ancestral population, followed by sequential introductions in California and, more recently, the NE Atlantic Ocean. We revealed the most likely invasion history for world populations of M. squamiger, which is broadly characterized by the presence of multiple ancestral sources and non-independent introductions within the introduced range. 
The results presented here illustrate the complexity of marine invasion routes and identify a cause-effect relationship between human-mediated transport and the success of widespread marine non-indigenous species, which benefit from stepping-stone invasions and admixture processes involving different sources for the spread and expansion of their range.
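The ABC methodology mentioned above can be illustrated with its simplest variant, rejection sampling; the study used far richer model-choice machinery, and every name and number below is illustrative.

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance,
                  n_draws=10000, tol=0.1, seed=0):
    """Minimal rejection-ABC sketch: draw a parameter from the prior,
    simulate data under it, and keep the draw whenever the simulated
    summary lies within `tol` of the observed one. The accepted draws
    approximate the posterior without evaluating a likelihood."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), observed) < tol:
            accepted.append(theta)
    return accepted
```

For example, with a uniform prior on a normal mean and the sample mean as summary statistic, the accepted draws cluster around the observed mean.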
Abstract:
This work presents the results and conclusions of an analysis carried out with pupils from the first to the sixth year of primary school, with the aim of discovering which strategies pupils use when solving additions mentally and observing whether these evolve over the successive years. Through a detailed analysis of the strategies, we can observe whether sixth-year pupils use the same strategies as first-year pupils or not, and how these evolve throughout primary school, specifically at the public school El Bosc de la Pabordia. Finally, the results obtained are also contrasted with different authors and studies in this specific field, mental calculation.
Abstract:
The most suitable method for estimating size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, assuming that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained whether original or log-transformed data are used, and size measurements with different dimensionality (lengths, areas, volumes or biomasses) may be immediately compared with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation after data standardization by division by the sample geometric mean emerges as the most reliable and generalizable method of size diversity evaluation.
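The kernel recipe with geometric-mean standardization can be sketched directly; this is a minimal illustration using Silverman's bandwidth rule and simple grid integration, not the paper's exact estimator.

```python
import numpy as np

def size_diversity(sizes, grid_points=2048):
    """Shannon diversity of a continuous size variable: standardize by
    the sample geometric mean, estimate the pdf with a Gaussian kernel,
    then numerically integrate -p(x) * ln p(x)."""
    x = np.asarray(sizes, dtype=float)
    x = x / np.exp(np.log(x).mean())        # geometric-mean standardization
    n = x.size
    h = 1.06 * x.std() * n ** (-0.2)        # Silverman's rule-of-thumb bandwidth
    grid = np.linspace(x.min() - 5 * h, x.max() + 5 * h, grid_points)
    k = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    pdf = k.sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))
    p = np.clip(pdf, 1e-300, None)          # avoid log(0) on the tails
    return float(-np.sum(p * np.log(p)) * (grid[1] - grid[0]))
```

By construction, rescaling all the measurements leaves the result unchanged, since the geometric-mean standardization absorbs the scale factor; this is the unit-independence property the abstract emphasizes.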
Abstract:
This master's thesis investigates different methods for developing the medium-voltage network, and its planning, for sparsely populated areas. The planning methodology is based on comparing how the network's reliability indices and total costs evolve under different investment alternatives. Closer examination is given to different cabling methods and to automation devices such as field-installed circuit breakers and remote-controlled disconnectors. To compare the development alternatives, a computational model of the network is built that makes it possible to examine, among other things, the evolution of the reliability indices and of the network costs. The network costs take into account investment costs, operation and maintenance costs, fault repair costs and interruption costs. Separate calculation blocks are implemented for the interruption analysis so that interruption costs can be modelled accurately. The cabling strategy analysis compares different principles for implementing cabling. The cabling methods studied include area-by-area rollout, renewal of the most fault-prone sections, renewal of the oldest sections, complete cabling, and an optimal network solution that exploits, in addition to the cabling of medium-voltage lines, automation solutions and 1000 V technology. The comparison of cabling methods shows that renewal starting from the most fault-prone sections yields the best result if the optimal solution is left aside.
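A minimal sketch of the kind of lifecycle cost comparison described above; the figures, horizon and discount rate are hypothetical, and the thesis's cost model is far more detailed.

```python
def lifecycle_cost(investment, annual_om, annual_fault, annual_interruption,
                   years=40, rate=0.05):
    """Present-value total cost of one investment alternative, summing
    the cost components listed above: up-front investment plus the
    discounted stream of operation and maintenance, fault repair and
    interruption costs over the planning horizon."""
    discount = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    return investment + (annual_om + annual_fault + annual_interruption) * discount
```

Comparing alternatives then reduces to evaluating this total for each strategy (e.g. complete cabling versus renewing only the most fault-prone sections) and picking the smallest value.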
Abstract:
An unsupervised approach to image segmentation which fuses region and boundary information is presented. The proposed approach takes advantage of the combined use of three different strategies: the guidance of seed placement, the control of the decision criterion, and boundary refinement. The new algorithm uses the boundary information to initialize a set of active regions which compete for the pixels in order to segment the whole image. The method is implemented on a multiresolution representation which ensures noise robustness as well as computational efficiency. The accuracy of the segmentation results has been demonstrated through an objective comparative evaluation of the method.
Abstract:
Laser scanning is becoming an increasingly popular method for measuring 3D objects in industrial design. Laser scanners produce a cloud of 3D points. For CAD software to be able to use such data, however, this point cloud needs to be turned into a vector format. A popular way to do this is to triangulate the assumed surface of the point cloud using alpha shapes. Alpha shapes start from the convex hull of the point cloud and gradually refine it towards the true surface of the object. Often it is nontrivial to decide when to stop this refinement. One criterion for this is to do so when the homology of the object stops changing. This is known as the persistent homology of the object. The goal of this thesis is to develop a way to compute the homology of a given point cloud when processed with alpha shapes, and to infer from it when the persistent homology has been achieved. Practically, the computation of such a characteristic of the target might be applied to power line tower span analysis.
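The persistence idea can be illustrated for the simplest homology group, H0 (connected components), with a union-find over edges sorted by length; this growing-ball filtration is a simplified cousin of the alpha-shape filtration above, which also tracks loops (H1) and voids (H2).

```python
from itertools import combinations
import math

def h0_barcode(points):
    """Scales at which connected components of a point cloud merge
    under a growing-ball filtration. Components that merge late are
    'persistent'; the merge scales form the H0 part of the barcode."""
    parent = list(range(len(points)))
    def find(i):                       # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    edges = sorted((math.dist(p, q), i, j)
                   for (i, p), (j, q) in combinations(enumerate(points), 2))
    deaths = []
    for d, i, j in edges:              # process edges by increasing length
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)           # one component dies at scale d
    return deaths
```

A large gap in the returned merge scales is exactly the kind of signal one would use to decide when the homology has stabilized and the alpha-shape refinement can stop.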
Abstract:
To understand the experience of a loss and carry out an intervention with a holistic assessment of the grieving patient, it is useful to know that loss is present in human life and that every person is liable to experience it. With every step we take along our path we lose things, from people to a significant object, down to the most ephemeral things such as youth, dreams or the ideas that allow us to face the hard "realities of life". The objective of this study is to assess knowledge of the concepts of grief and pathological grief, and to present the grieving process on the basis of the stage theory of Elisabeth Kübler-Ross, with Parkes and Worden as the fathers of the tasks of this process. It also aims to present how grief is detected, so as to prevent it from becoming chronic and so that the patient can be well accompanied through it. The methodology will be based on a qualitative and quantitative, descriptive and bivariate study of the nursing professionals of the primary health care areas of the city of Girona: Girona 1 (CAP Santa Clara), Girona 2 (CAP Can Gibert del Pla), Girona 3 (CAP de Montilivi) and Girona 4 (CAP Taialà). For the descriptive and bivariate statistical analysis, the SPSS v20 program will be used for the quantitative analysis and the N-Vivo 10 program for the qualitative analysis, and the results will be obtained through a descriptive bivariate analysis. The results obtained in this study will help us to detect needs related to the training and learning of primary care professionals and the lack of resources in relation to grief.
Abstract:
In the world of transport management, the term 'anticipation' is gradually replacing 'reaction'. Indeed, the ability to forecast traffic evolution in a network should ideally form the basis of many traffic management strategies and multiple ITS applications. Real-time prediction capabilities are therefore becoming a concrete need for network management, in both urban and interurban environments, and today's road operator has increasingly complex and exacting requirements. Recognising temporal patterns in traffic, or the manner in which sequential traffic events evolve over time, has been an important consideration in short-term traffic forecasting. However, little work has been conducted on identifying or associating traffic pattern occurrence with prevailing traffic conditions. This paper presents a framework for traffic pattern identification based on finite mixture models, using the EM algorithm for parameter estimation. The computations were carried out using traffic data available from an urban network.
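The parameter-estimation step can be sketched with a bare-bones EM algorithm for a two-component 1-D Gaussian mixture; this is illustrative only, and the deterministic initialization from the data extremes is a simplification, not the paper's procedure.

```python
import math

def em_gmm_1d(data, iters=100):
    """EM for a two-component 1-D Gaussian mixture: alternate between
    computing each component's responsibility for each point (E step)
    and re-estimating weights, means and variances (M step)."""
    k = 2
    mu = [min(data), max(data)]        # crude but deterministic initialization
    var = [1.0] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            dens = [w[j] * math.exp(-(x - mu[j]) ** 2 / (2 * var[j]))
                    / math.sqrt(2 * math.pi * var[j]) for j in range(k)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M step: re-estimate weights, means and variances
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / len(data)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2
                         for r, x in zip(resp, data)) / nj + 1e-9
    return w, mu, var
```

On traffic data, each fitted component would correspond to one recurrent traffic regime, which is the association between pattern occurrence and prevailing conditions that the framework aims to identify.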