948 results for Multicast Application Level
Abstract:
Introduction: The field of connectomic research is growing rapidly, driven by methodological advances in structural neuroimaging at many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through connectome mapping pipelines (Hagmann et al, 2008), yielding so-called connectomes (Hagmann 2005, Sporns et al, 2005). These contain both spatial and topological information that constrains functional imaging studies and is relevant to their interpretation. The need has grown for a special-purpose software tool that supports both clinical researchers and neuroscientists in investigating such connectome data.
Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries.
Results: A flexible plugin architecture makes it easy to extend functionality for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch. Other layouts are possible.
* Picking functionality to select nodes and edges, retrieve additional node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Arbitrary metadata can be stored for networks, allowing e.g. group-based analysis or meta-analysis.
* Python shell for scripting. Application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008).
* Interface to TrackVis to visualize track data. Selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) processed 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface with this dataset, displaying connections that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org).
Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant data types and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
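As a rough illustration of the kind of scripting the embedded Python shell enables, the following sketch loads a connectome network from a GraphML file and computes a few complex-network measures with NetworkX; the file name is a placeholder, and the snippet illustrates library usage rather than the ConnectomeViewer API itself.

```python
import networkx as nx

# Load a network stored in the Connectome File Format's GraphML component;
# the file name is a placeholder for a real dataset file.
G = nx.Graph(nx.read_graphml("average_connectome.graphml"))

# A few complex-network measures NetworkX provides out of the box.
density = nx.density(G)
avg_clustering = nx.average_clustering(G)
avg_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()

print(f"nodes={G.number_of_nodes()} edges={G.number_of_edges()} "
      f"density={density:.3f} clustering={avg_clustering:.3f} degree={avg_degree:.1f}")
```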
Abstract:
A new aggregation method for decision making is presented, using induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can handle complex reordering processes in the aggregation, representing complex attitudinal characters of the decision maker such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization using hybrid averages and immediate weights is also presented. The key advantage of this approach over the previous model is that the weighted average and the ordered weighted average can be used in the same formulation, so that both the subjective attitude and the degree of optimism of the decision maker can be considered in the decision process. The paper ends with an application to a decision-making problem based on assignment theory.
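For readers unfamiliar with induced operators, the following minimal Python sketch shows one common formulation of an induced OWA (IOWA) aggregation and a simple convex blend with the weighted average; it illustrates the idea of reordering by an order-inducing variable, not the paper's exact operators or its index of maximum and minimum level.

```python
def iowa(values, inducers, w):
    # Induced OWA: arguments are reordered by decreasing order-inducing
    # variable u, then combined with the OWA weights w.
    ordered = [v for _, v in sorted(zip(inducers, values), key=lambda t: -t[0])]
    return sum(wi * vi for wi, vi in zip(w, ordered))

def hybrid(values, inducers, w, p, beta):
    # One simple way to use the OWA (attitudinal) and the weighted average
    # (subjective importances p) in the same formulation: a convex blend.
    wa = sum(pi * vi for pi, vi in zip(p, values))
    return beta * iowa(values, inducers, w) + (1 - beta) * wa

payoffs = [70, 40, 90]   # payoffs of one alternative under three states (illustrative)
u = [3, 1, 2]            # order-inducing variable, e.g. an attitudinal character
print(hybrid(payoffs, u, w=[0.5, 0.3, 0.2], p=[0.3, 0.3, 0.4], beta=0.6))
```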
Abstract:
Purpose: The application of IAS 32 in cooperatives has generated considerable controversy in recent years. To date, several studies have attempted to anticipate the possible effects of its application. This study analyzes the impact of the first application of IAS 32 in the cooperative sector. Design/methodology/approach: A sample of 98 cooperatives was selected, and a comparative analysis of their financial information before and after the application of IAS 32 was carried out to determine the existing differences. The Wilcoxon signed-rank test was used to check whether these differences are significant. The Mann-Whitney U test was also used to check for significant differences in the relative impact of applying IAS 32 across different groups of cooperatives. Finally, the effects of applying IAS 32 on the financial and economic position of the cooperatives, and on the evolution of their intangible assets, were analyzed using financial statement analysis techniques. Findings: The results confirm that the application of IAS 32 causes significant differences in some balance sheet and income statement items, as well as in the ratios analyzed. The main differences are a reduction in the capitalization level and an increase in the indebtedness of the cooperatives, together with a general worsening of the solvency and financial autonomy ratios. Limitations: The study was carried out on a sample of cooperatives that are required to have their annual accounts audited, so the results should be interpreted in the context of large cooperatives. The analysis compares the 2011 and 2010 annual accounts, which reveals the differences in the cooperatives' financial information before and after applying IAS 32, although some of these differences could also be caused by other factors such as the economic situation, changes in the application of accounting standards, etc. Originality/value: We believe this is the right moment for this research, since from 2011 all Spanish cooperatives must apply the accounting standards adapted to IAS 32. Moreover, to our knowledge, no similar studies exist based on annual accounts of cooperatives that have already applied these adapted standards. The results may be useful to several stakeholder groups. First, accounting standard setters can assess the reach of IAS 32 in cooperatives and propose improvements to the content of the standard. Second, cooperatives, federations, confederations and other cooperative bodies gain information on the economic impact of the first application of IAS 32 and can make whatever assessments they consider appropriate. Third, financial institutions, auditors, advisors of cooperatives and other stakeholders gain information on the changes in cooperatives' annual accounts, which they can take into account in their decision making.
Keywords: cooperatives, net equity, share capital, IAS 32, solvency, effects of accounting regulation, financial information, ratios.
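As a pointer for readers who want to reproduce this kind of analysis, the following Python sketch applies the two non-parametric tests used in the study via scipy.stats; all numbers are illustrative placeholders, not the study's data.

```python
from scipy import stats

# Paired equity-to-assets ratios for the same cooperatives before (2010) and
# after (2011) first application of IAS 32 -- illustrative numbers only.
before = [0.48, 0.52, 0.40, 0.61, 0.55, 0.47, 0.50, 0.44]
after  = [0.35, 0.41, 0.33, 0.50, 0.42, 0.36, 0.38, 0.31]

# Wilcoxon signed-rank test: are the paired differences significant?
w_stat, w_p = stats.wilcoxon(before, after)

# Mann-Whitney U test: does the relative impact differ between two
# hypothetical groups of cooperatives?
impact_group_a = [0.10, 0.12, 0.08, 0.15]
impact_group_b = [0.20, 0.18, 0.25, 0.22]
u_stat, u_p = stats.mannwhitneyu(impact_group_a, impact_group_b)

print(f"Wilcoxon p={w_p:.4f}, Mann-Whitney U p={u_p:.4f}")
```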
Abstract:
Purpose: Despite the fundamental role of ecosystem goods and services in sustaining human activities, there is no harmonized and internationally agreed method for including them in life cycle assessment (LCA). The main goal of this study was to develop a globally applicable and spatially resolved method for assessing land-use impacts on the erosion-regulation ecosystem service.
Methods: Soil erosion is highly location-dependent. Thus, unlike conventional LCA, the endpoint method was regionalized at the grid-cell level (5 arc-minutes, approximately 10×10 km²) to reflect the spatial conditions of the site. Spatially explicit characterization factors were not further aggregated at broader spatial scales.
Results and discussion: Life cycle inventory data of topsoil and topsoil organic carbon (SOC) losses were interpreted at the endpoint level in terms of the ultimate damage to soil resources and ecosystem quality. Human health damages were excluded from the assessment. The method was tested on a case study of five three-year agricultural rotations, two of them with energy crops, grown in several locations in Spain. A large variation in soil and SOC losses was recorded in the inventory step, depending on climatic and edaphic conditions. The case study demonstrates the importance of using a spatially explicit model and characterization factors.
Conclusions and outlook: The regionalized assessment takes into account the differences in erosion-related environmental impacts caused by the great variability of soils. Taking this regionalized framework as the starting point, further research should focus on testing the applicability of the method through the complete life cycle of a product and on determining an appropriate spatial scale at which to aggregate characterization factors, in order to deal with data gaps on the location of processes, especially in the background system. Additional research should also focus on improving the reliability of the method by quantifying and, insofar as possible, reducing uncertainty.
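A minimal sketch of the regionalized logic described above might look as follows: each inventory flow is matched to its 5 arc-minute grid cell and multiplied by that cell's characterization factor. The functions, field names and values are illustrative assumptions, not the paper's implementation.

```python
def grid_cell(lat, lon, res_deg=5 / 60):
    # Index of the 5 arc-minute cell containing a point.
    return (int(lat // res_deg), int(lon // res_deg))

def endpoint_impact(flows, cf_map):
    # flows: list of (lat, lon, kg soil lost); cf_map: cell -> characterization
    # factor (endpoint damage per kg of soil lost in that cell).
    return sum(kg * cf_map[grid_cell(lat, lon)] for lat, lon, kg in flows)

# Two hypothetical Spanish rotation sites with made-up CFs and losses.
cf_map = {grid_cell(41.6, -0.9): 2.1e-4, grid_cell(37.4, -5.9): 6.8e-4}
flows = [(41.6, -0.9, 1200.0), (37.4, -5.9, 950.0)]
print(endpoint_impact(flows, cf_map))  # damage in illustrative endpoint units
```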
Abstract:
In this paper we propose the inversion of nonlinear distortions in order to improve the recognition rates of a speaker recognition system. We study the effect of saturation on the test signals, considering real situations where the training material has been recorded under controlled conditions but the test signals present some mismatch with the input signal level (saturation). The experimental results for speaker recognition show that a combination of several strategies can improve the recognition rate with saturated test sentences from 80% to 89.39%, whereas the rate with clean speech (without saturation) is 87.76% for one microphone. For speaker identification, the proposed approach reduces the minimum detection cost function with saturated test sentences from 6.42% to 4.15%, whereas the value with clean speech is 5.74% for one microphone and 7.02% for the other.
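As an illustration of inverting a memoryless saturation, the following Python sketch models the distortion as y = tanh(g·x) and applies the analytic inverse to the test signal; the paper combines several strategies, and this particular distortion model is an assumption made for the example.

```python
import numpy as np

def invert_tanh_saturation(y, gain=3.0, eps=1e-6):
    # Clip slightly inside (-1, 1) so arctanh stays finite, then invert
    # the assumed saturation y = tanh(gain * x).
    y = np.clip(y, -1 + eps, 1 - eps)
    return np.arctanh(y) / gain

x = 0.8 * np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)  # clean test tone
y = np.tanh(3.0 * x)                                          # saturated version
x_hat = invert_tanh_saturation(y)
print(float(np.max(np.abs(x_hat - x))))  # near zero: distortion undone
```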
Abstract:
PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map indoor radon concentration (IRC) in Switzerland, taking into account architectural factors, spatial relationships between the measurements, and geological information. METHODS: We analyzed about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m³. Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation yielded a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimate. Furthermore, we assessed the mapping characteristics of kernel estimation both overall and by municipality. Overall, our model reproduces spatial IRC patterns obtained in earlier studies. At the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps: maps corresponding to detached houses with concrete foundations indicate systematically lower IRC than maps corresponding to farmhouses with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool to predict and analyze IRC at large scales as well as at the local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while accounting for geological information and spatial relationships between IRC measurements.
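For intuition, the following Python sketch shows a multivariate Nadaraya-Watson kernel regression with one bandwidth per predictor, mirroring the per-variable bandwidths the model estimates; the variables and numbers are illustrative, not the Swiss dataset.

```python
import numpy as np

def kernel_predict(X, y, x0, bandwidths):
    # Gaussian product kernel: each predictor column is scaled by its own
    # bandwidth, so the bandwidth encodes that variable's influence.
    z = (X - x0) / bandwidths
    w = np.exp(-0.5 * np.sum(z**2, axis=1))
    return np.sum(w * y) / np.sum(w)

# Predictors: [x-coord km, y-coord km, altitude m, year built] (illustrative).
X = np.array([[600.0, 200.0, 450.0, 1965],
              [601.5, 201.0, 470.0, 1980],
              [610.0, 195.0, 900.0, 1950]])
y = np.array([120.0, 95.0, 310.0])      # measured IRC in Bq/m³
h = np.array([2.0, 2.0, 100.0, 15.0])   # one bandwidth per variable
print(kernel_predict(X, y, np.array([600.5, 200.4, 460.0, 1970]), h))
```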
Abstract:
The present study describes some of the applications of ultrasound in bone surgery, based on the presentation of two clinical cases. The Piezosurgery® ultrasound device was used (Mectron Medical Technology, Carasco, Italy). In one case the instrument was used to harvest a chin bone graft for placement in a bone defect at position 1.2, while in the other case a bony window osteotomy was made in the external wall of the maxillary sinus, in the context of a sinus membrane lift procedure. The Piezosurgery® device produces specific ultrasound frequency modulation (25-29 kHz) and has been designed to provide increased precision in bone surgery. The instrument sections mineralized bone structures selectively and causes less intra- and postoperative bleeding. One of the advantages of the Piezosurgery® device is that it can be used for maxillary sinus lift procedures in dental implant placement. In this context it considerably lessens the risk of sinus mucosa laceration when preparing the bony window in the external wall of the upper maxilla, and it can be used to complete the lifting maneuver. The use of ultrasound on hard tissues remains a slow technique compared with conventional rotary instruments, since it requires special surgical skill and involves a certain learning curve.
Abstract:
The World Wide Web, the world's largest resource for information, has evolved from organizing information using controlled, top-down taxonomies to a bottom-up approach that assigns meaning to data via mechanisms such as the Social Web (Web 2.0). Tagging adds meta-data (weak semantics) to the content available on the web. This research investigates the potential for repurposing this layer of meta-data. We propose a multi-phase approach that exploits user-defined tags to identify and extract domain-level concepts. We operationalize this approach and assess its feasibility by applying it to a publicly available tag repository. The paper describes insights gained from implementing and applying the heuristics contained in the approach, as well as challenges and implications of repurposing tags for the extraction of domain-level concepts.
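One simple heuristic in the spirit of this multi-phase approach can be sketched as follows: tags that recur across many resources, and tag pairs that co-occur frequently, become candidate domain-level concepts and relations. The thresholds and data below are illustrative assumptions, not the paper's actual heuristics.

```python
from collections import Counter
from itertools import combinations

# Each entry is the tag set a user assigned to one resource (illustrative).
tagged = [
    {"python", "scripting", "tutorial"},
    {"python", "scripting", "reference"},
    {"java", "tutorial"},
    {"python", "reference"},
]

tag_freq = Counter(t for tags in tagged for t in tags)
pair_freq = Counter(p for tags in tagged for p in combinations(sorted(tags), 2))

# Candidate concepts: tags used on enough resources; candidate relations:
# pairs that co-occur often enough (threshold of 2 is arbitrary here).
concepts = [t for t, n in tag_freq.items() if n >= 2]
relations = [p for p, n in pair_freq.items() if n >= 2]
print(concepts, relations)
```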
Abstract:
The objective of this work was to evaluate the protective effect of different forms of insecticide application on the transmission of yellow dwarf disease in barley cultivars, as well as to determine the production costs and net profit of these managements. The experiments were carried out during the 2011 and 2012 growing seasons, using the following managements in the main plots: T1, seed treatment with insecticide (ST) + insecticide on shoots at 15-day intervals; T2, ST only; T3, insecticide applied on shoots when the aphid control level (CL) was reached; T4, no insecticide; and T5, ST + insecticide on shoots when CL was reached. Different barley cultivars - BRS Cauê, BRS Brau and MN 6021 - were arranged in the subplots. The insecticides lambda-cyhalothrin (pyrethroid) and thiamethoxam (neonicotinoid) were used. There were differences in the yellow dwarf disease index between treatments in both seasons, while damage to grain yield was influenced by year and aphid population. Production costs and net profit differed among treatments. Seed treatment with insecticide is sufficient to reduce the transmission of yellow dwarf disease in years with low aphid population pressure, while in years with larger populations the application of insecticide on shoots is also required.
Abstract:
The development of wireless broadband communication technology has raised interest in its professional use for public safety and crisis management. In emergencies, existing fixed telecommunication systems are often either unavailable or lack sufficient capacity. This has created a need for rapidly deployable, self-contained wireless broadband systems. The purpose of this Master's thesis is to study wireless ad hoc multi-hop networks from the standpoint of public safety requirements and to implement a test bed on which the operation of such a system can be demonstrated and studied in practice. Both point-to-point and, in particular, point-to-multipoint communication are examined. The measurements cover the test bed's data transfer rate, transmission power and receiver sensitivity. These results are used as simulator parameters, so that the simulator results are as realistic as possible and consistent with the test bed. A selection of applications and application models matching public safety requirements is then chosen, and their performance is measured under different routing schemes both on the test bed and in the simulator. The results are evaluated and compared. Multicast multi-hop video was chosen as the main subject of the study, and it and its characteristics are also to be examined in real field trials.
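As background, application-level multicast reception of the kind the video experiments exercise can be set up with a few lines of standard socket code; the group address and port below are illustrative, not values from the thesis test bed.

```python
import socket
import struct

GROUP, PORT = "239.1.2.3", 5004  # illustrative multicast group and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group on all interfaces.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, addr = sock.recvfrom(2048)  # e.g. one video packet per datagram
print(len(data), addr)
```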
Abstract:
Multimodal brain imaging is becoming a leading tool for understanding different aspects of brain structure and function. Thanks to advances in Magnetic Resonance Imaging (MRI) acquisition schemes and data processing techniques, it is now possible to measure different parameters sensitive to different tissue characteristics. This allows, for example, investigation of the anatomical substrates underlying cognitive processing, or disentangling degenerative and developmental processes at a purely structural level. This thesis highlights the importance of a multimodal approach for investigating different aspects of brain dynamics by applying it to two clinical studies: functional and structural assessment of the acute effects of cannabis smoking in regular and occasional users, and grey and white matter assessment in young FMR1 premutation carriers at risk of developing FXTAS (Fragile-X Tremor Ataxia Syndrome). We demonstrate that in occasional smokers, cannabis smoking, even at low blood concentrations of the main psychoactive component (THC), strongly decreases performance on a visuo-motor tracking task and globally alters the activity of the three brain networks involved in cognitive processing: the Salience, Executive Control and Default Mode networks. Subjects are unable to capture saliences in the environment and to orient attention to the task; the increased hemodynamic response in the anterior cingulate cortex suggests an increase in self-oriented mental activity. A further investigation of long-term exposure to cannabis shows persistent grey matter modification in brain regions associated with memory and affective processing. The degree of atrophy in these structures correlates with the estimated drug use in the three months prior to participation in the study. In the second study we demonstrate structural changes in young asymptomatic premutation carriers decades before the onset of FXTAS that might be related to two different mechanisms. Alterations of the cerebellar motor network and of the hippocampal fimbria/fornix may reflect a neurodevelopmental effect of the premutation. These include grey matter atrophy in lobule VI and modification of white matter tissue properties in the corresponding afferent projections through the middle cerebellar peduncles. Diffuse hemispheric white matter lesions, which seem to appear closer to the onset of FXTAS and to be related to a neurodegenerative phenomenon, may mark its imminent onset.
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher the past. Since they have known that continents move, geologists have been trying to retrace their distribution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea has yet to receive sufficient echo within the reconstruction community. Nevertheless, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and at last reach real plate tectonics. The main aim of this study is to defend this point of view by presenting, in all necessary detail, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the problem. Using assemblies of continents (referred to as "key assemblies") as anchors distributed over the whole scope of our study (ranging from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to or from the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time, and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become an almost insurmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to storing, managing and analyzing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS we converted this mass of scattered data from the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, with specially developed tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We began with one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, throughout the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band running from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Essentially, this means that plates tend to escape this median plane. Barring an unidentified methodological bias, we interpreted this as the potential secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
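For readers unfamiliar with plate kinematics, the velocity models mentioned above rest on rigid-plate rotation: the surface velocity of a point is v = ω × r, with ω the Euler rotation vector of the plate. The following Python sketch computes this; the pole and rate are illustrative numbers, not values from the PaleoDyn model.

```python
import numpy as np

R_EARTH = 6371.0  # mean Earth radius in km

def unit_vec(lat_deg, lon_deg):
    # Cartesian unit vector for a geographic position.
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def surface_velocity(pole_lat, pole_lon, deg_per_myr, pt_lat, pt_lon):
    omega = np.radians(deg_per_myr) * unit_vec(pole_lat, pole_lon)  # rad/Myr
    r = R_EARTH * unit_vec(pt_lat, pt_lon)                          # km
    return np.cross(omega, r)  # km/Myr, numerically equal to mm/yr

v = surface_velocity(59.0, -73.0, 0.62, 40.0, -30.0)  # illustrative pole/point
print(np.linalg.norm(v), "mm/yr")  # speed of the point on the rigid plate
```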
Abstract:
This thesis studies gray-level distance transforms, particularly the Distance Transform on Curved Space (DTOCS). The transform is produced by calculating distances on a gray-level surface. The DTOCS is improved by defining more accurate local distances, and by developing a faster transformation algorithm. The Optimal DTOCS enhances the locally Euclidean Weighted DTOCS (WDTOCS) with local distance coefficients, which minimize the maximum error from the Euclidean distance in the image plane, and produce more accurate global distance values. Convergence properties of the traditional mask operation, or sequential local transformation, and the ordered propagation approach are analyzed, and compared to the new efficient priority pixel queue algorithm. The Route DTOCS algorithm developed in this work can be used to find and visualize shortest routes between two points, or two point sets, along a varying-height surface. In a digital image, there can be several paths sharing the same minimal length, and the Route DTOCS visualizes them all. A single optimal path can be extracted from the route set using a simple backtracking algorithm. A new extension of the priority pixel queue algorithm produces the nearest neighbor transform, or Voronoi or Dirichlet tessellation, simultaneously with the distance map. The transformation divides the image into regions so that each pixel belongs to the region surrounding the reference point which is nearest according to the distance definition used. Applications and application ideas for the DTOCS and its extensions are presented, including obstacle avoidance, image compression and surface roughness evaluation.
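The priority pixel queue idea can be sketched in a few lines: pixels are expanded in order of smallest tentative distance, exactly as in Dijkstra's algorithm, with a local distance that grows with the gray-level difference along the path. This is a simplified sketch in the spirit of the DTOCS; the Optimal DTOCS local distance coefficients are omitted for brevity.

```python
import heapq
import numpy as np

def dtocs(image, seeds):
    # Priority pixel queue distance transform on a gray-level surface.
    h, w = image.shape
    dist = np.full((h, w), np.inf)
    heap = []
    for y, x in seeds:
        dist[y, x] = 0.0
        heapq.heappush(heap, (0.0, y, x))
    while heap:
        d, y, x = heapq.heappop(heap)
        if d > dist[y, x]:
            continue  # stale queue entry, already improved
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy or dx) and 0 <= ny < h and 0 <= nx < w:
                    # Local distance: one step plus the gray-level difference.
                    nd = d + 1.0 + abs(float(image[ny, nx]) - float(image[y, x]))
                    if nd < dist[ny, nx]:
                        dist[ny, nx] = nd
                        heapq.heappush(heap, (nd, ny, nx))
    return dist

img = np.array([[0, 2, 8], [1, 3, 9], [0, 1, 2]], dtype=float)
print(dtocs(img, seeds=[(0, 0)]))
```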
Abstract:
The objective of this study was to evaluate the effects of soil and leaf fertilization with micronutrients on the fruit size and quality of pineapple cv. Vitória under the environmental conditions of the Baixo Acaraú irrigated perimeter in northern Ceará State, Brazil, using two covers (bagana and black plastic) on sandy soil of low fertility. The experimental design was randomized split blocks with four levels of soil dressing and four levels of foliar fertilization, with five replications. Micronutrient soil dressing was applied as FTE-12 at doses of 0, 60, 120 and 180 kg ha-1. The four levels of foliar fertilization were: LF0 (no fertilizer); LF1 (15 leaf fertilizations, applying 1158.75 g ha-1 Fe, 844.65 g ha-1 Mn, 391.5 g ha-1 Zn, 322.65 g ha-1 Cu and 216 g ha-1 B); LF2 (15 leaf fertilizations, applying twice the quantities of LF1); and LF3 (15 leaf fertilizations, applying three times the quantities of LF1). Floral induction was carried out 13 months after planting the micropropagated plantlets, and the fruit was harvested five months later, determining the following variables: fruit weight and median diameter, soluble solids content (SS) and titratable acidity (TA). Both fruit weight and diameter increased with increasing doses of micronutrients applied to the soil and to the leaves, for plants grown on both bagana soil cover and plastic mulch. Fruit pulp quality, on the other hand, was little affected by the treatments: there was a small increase in SS content for plants grown on bagana soil cover and a small decrease in titratable acidity for those grown on plastic mulch, in both cases only in response to micronutrient foliar application.
Abstract:
Reusability has become an increasingly important factor in modern software engineering, mainly because object-orientation has brought methods that make reuse easier. Today, more and more application developers consider how they can reuse existing applications in their work. A developer who wants to use existing components outside the current project can turn to design patterns, class libraries or frameworks, which provide solutions to specific or general problems that have already been encountered. Application frameworks are collections of classes that provide a base for the developer. They are mostly implementation-phase tools, but can also be used in application design. The main purpose of a framework is to separate domain-specific functionality from application-specific functionality. Frameworks are usually divided into two categories, black box and white box, which differ in the way reuse is done. Application frameworks have properties that can be examined and compared across frameworks: extensibility, reusability, modularity and scalability. These describe how a framework handles different platforms, changes in the framework, increasing demand for resources, and so on. Application frameworks generally exhibit these properties to a good degree. When comparing a general-purpose framework with a more specific-purpose one, the main difference lies in reusability, mainly because a framework designed for a specific domain can be constrained by external systems and resources, whereas for a general-purpose framework these constraints are set by the application developed on top of it.
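To make the black-box/white-box distinction concrete, the following minimal Python sketch contrasts the two reuse styles: inheriting and overriding a hook method (white box) versus plugging a component into a fixed interface (black box). The class names are illustrative.

```python
class WhiteBoxFramework:
    def run(self):                 # template method fixed by the framework
        self.load()
        self.process()
    def load(self):
        print("framework: default load")
    def process(self):             # hook the application must override
        raise NotImplementedError

class MyApp(WhiteBoxFramework):    # white-box reuse: subclassing internals
    def process(self):
        print("app-specific processing")

class BlackBoxFramework:
    def __init__(self, component): # black-box reuse: composition behind an interface
        self.component = component
    def run(self):
        self.component()           # framework calls the plugged-in part

MyApp().run()
BlackBoxFramework(lambda: print("plugged-in processing")).run()
```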