985 results for Attributes Affecting Patterns
Abstract:
Background: The dose-response relationship between ultraviolet (UV) exposure patterns and skin cancer occurrence is not fully understood. Sun-protection messages often focus on acute exposure, implicitly assuming that direct UV radiation is the key contributor to the overall UV exposure. However, little is known about the relative contribution of the direct, diffuse and reflected radiation components. Objective: To investigate solar UV exposure patterns at different body sites with respect to the relative contribution of direct, diffuse and reflected radiation. Methods: A three-dimensional numerical model was used to assess exposure doses for various body parts and exposure scenarios of a standing individual (static and dynamic postures). The model was fed with erythemally weighted ground irradiance data for the year 2009 in Payerne, Switzerland. A year-round daily exposure (08:00-17:00 h) without protection was assumed. Results: For most anatomical sites, mean daily doses were high (typically 6.2-14.6 standard erythemal doses) and exceeded the recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer) but contributed moderately to the annual dose, ranging from 15% for vertical to 24% for horizontal body parts. Diffuse irradiation explained about 80% of the cumulative annual exposure dose. Acute diffuse exposures were also observed during cloudy summer days. Conclusions: The importance of diffuse UV radiation should not be underestimated when advocating preventive measures. Messages focused on avoiding acute direct exposures may be of limited efficiency in preventing skin cancers associated with chronic exposure.
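As a rough illustration of how such component-wise dose accumulation works (this is not the paper's three-dimensional model; the hourly irradiances and geometry factors below are hypothetical placeholders), a daily erythemal dose for one body site can be integrated from direct, diffuse and reflected contributions:

```python
# Hypothetical hourly erythemal irradiances (W/m^2), 08:00-17:00, one clear day.
# Geometry factors are illustrative, not taken from the paper's 3-D model.
SED_J_PER_M2 = 100.0  # 1 standard erythemal dose = 100 J/m^2

direct    = [0.02, 0.05, 0.09, 0.12, 0.13, 0.12, 0.09, 0.05, 0.02]
diffuse   = [0.04, 0.07, 0.10, 0.12, 0.12, 0.12, 0.10, 0.07, 0.04]
reflected = [0.002] * 9  # small ground-reflected component

# Fraction of each ambient component reaching a vertical body site (assumed).
f_direct, f_diffuse, f_reflected = 0.25, 0.45, 0.10

dose_j = 0.0
for d, s, r in zip(direct, diffuse, reflected):
    site_irradiance = f_direct * d + f_diffuse * s + f_reflected * r  # W/m^2
    dose_j += site_irradiance * 3600.0  # integrate over one hour (J/m^2)

print(f"daily dose at this site: {dose_j / SED_J_PER_M2:.1f} SED")
```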
Abstract:
The aim of this Master's thesis was to identify the factors that affect the diffusion of electronic invoicing in companies and to determine how widely electronic invoicing has spread in South Karelia. A particular goal was to identify the factors that slow down the diffusion. The theoretical part of the thesis briefly introduces the concept of diffusion; the main emphasis is on the factors, identified in the literature, that affect the adoption of innovations. Based on the theory, 16 hypotheses were formulated and tested in the empirical part. The empirical data were collected from small and medium-sized enterprises in South Karelia by means of a postal survey. The responding companies were divided into adopters and non-adopters of electronic invoicing. Of the responding companies, 7.5% used electronic invoicing and a further 17.8% were only testing the sending and/or receiving of electronic invoices; thus 25.3% of the companies had electronic invoicing capabilities in their information systems. Most respondents had not yet made a decision about moving to electronic invoicing. Seven hypotheses were supported and nine were rejected. In addition, two important factors outside the hypotheses were identified that may affect the adoption of electronic invoicing in companies: the companies felt that greater pressure from outside the company would speed up penetration, and the amount of information available about electronic invoicing was perceived as too low. The thesis concludes with proposed measures for accelerating the diffusion of electronic invoicing.
Abstract:
OBJECTIVES: Our analysis assessed the impact of information on patients' preferences in prescription versus over-the-counter (OTC) delivery systems. METHODS: A contingent valuation (CV) study was implemented, randomly assigning 534 lay people into the receipt of limited or extended information concerning new influenza drugs. In each information arm, people answered two questions: the first asked about willingness to pay (WTP) for the new prescription drug; the second asked about WTP for the same drug sold OTC. RESULTS: We show that WTP is higher for the OTC scenario and that the level of information plays a significant role in the evaluation of the OTC scenario, with more information being associated with an increase in the WTP. In contrast, the level of information provided has no impact on WTP for prescription medicine. Thus, for the kind of drug considered here (i.e. safe, not requiring medical supervision), a switch to OTC status can be expected to be all the more beneficial, as the patient is provided with more information concerning the capability of the drug. CONCLUSIONS: Our results shed light on one of the most challenging issues that health policy makers are currently faced with, namely the threat of a bird flu pandemic. Drug delivery is a critical component of pandemic influenza preparedness. Furthermore, the congruence of our results with the agency and demand theories provides an important test of the validity of using WTP based on CV methods.
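A hedged sketch of the kind of between-arm comparison such a contingent valuation design implies (entirely synthetic data; the arm sizes, currency and distributions are assumptions, not the study's):

```python
# Synthetic willingness-to-pay data; arm sizes, currency and distributions are
# assumptions made only to illustrate the between-arm comparison.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n = 267  # roughly half of the 534 respondents per information arm

otc_limited  = rng.gamma(2.0, 10.0, n)   # OTC scenario, limited information
otc_extended = rng.gamma(2.0, 12.0, n)   # OTC scenario, extended information
rx_limited   = rng.gamma(2.0, 9.0, n)    # prescription scenario, limited information
rx_extended  = rng.gamma(2.0, 9.0, n)    # prescription scenario, extended information

for label, a, b in (("OTC         ", otc_limited, otc_extended),
                    ("prescription", rx_limited, rx_extended)):
    t_stat, p_value = ttest_ind(a, b)
    print(f"{label}: mean WTP {a.mean():.1f} vs {b.mean():.1f}  (p = {p_value:.3f})")
```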
Abstract:
Comparative phylogeography seeks commonalities in the spatial demographic history of sympatric organisms in order to characterize the mechanisms that shaped such patterns. The unveiling of incongruent phylogeographic patterns in co-occurring species, on the other hand, may hint at overlooked differences in their life histories or microhabitat preferences. The woodlouse-hunter spiders of the genus Dysdera have undergone a major diversification on the Canary Islands. The species pair Dysdera alegranzaensis and Dysdera nesiotes are endemic to the island of Lanzarote and nearby islets, where they co-occur at most of their known localities. The two species stand in sharp contrast to other sympatric endemic Dysdera in showing no evidence of somatic (non-genitalic) differentiation. Mitochondrial cox1 sequences from an exhaustive sample of D. alegranzaensis and D. nesiotes specimens, together with additional mitochondrial (16S, L1, nad1) and nuclear (28S, H3) genes, were analysed with phylogenetic and population genetic methods to reveal the species' phylogeographic patterns and clarify their phylogenetic relationships. Relaxed molecular clock models using five calibration points were further used to estimate divergence times between species and populations. Striking differences in phylogeography and population structure between the two species were observed. Dysdera nesiotes displayed a metapopulation-like structure, while D. alegranzaensis was characterized by weaker geographical structure but greater genetic divergences among its main haplotype lineages, suggesting more complex population dynamics. Our study confirms that co-distributed sibling species may exhibit contrasting phylogeographic patterns in the absence of somatic differentiation. Further ecological studies, however, will be necessary to clarify whether the contrasting phylogeographies hint at an overlooked niche partitioning between the two species. In addition, comparisons with available phylogeographic data for other eastern Canarian Dysdera endemics confirm the key role of lava flows in structuring local populations on oceanic islands and identify localities that acted as refugia during volcanic eruptions.
Abstract:
Purpose: To describe (1) the clinical profiles and patterns of use of long-acting injectable (LAI) antipsychotics in patients with schizophrenia at risk of nonadherence with oral antipsychotics, and in those who started treatment with LAI antipsychotics, and (2) health care resource utilization and associated costs. Patients and methods: A total of 597 outpatients with schizophrenia at risk of nonadherence, according to the psychiatrist's clinical judgment, were recruited at 59 centers in a noninterventional, prospective observational study with 1-year follow-up when their treatment was modified. In a post hoc analysis, the profiles of patients starting LAI or continuing with oral antipsychotics were described, and descriptive analyses of treatments, health resource utilization, and direct costs were performed in those who started an LAI antipsychotic. Results: Therapy modifications involved the antipsychotic medications in 84.8% of patients, mostly because of insufficient efficacy of the prior regimen. Ninety-two (15.4%) patients started an LAI antipsychotic at recruitment. Of these, only 13 (14.1%) were prescribed first-generation antipsychotics. During 1 year, 16.3% of patients who started and 14.9% of patients who did not start an LAI antipsychotic at recruitment relapsed, contrasting with the 20.9% who had been hospitalized within the prior 6 months alone. After 1 year, 74.3% of patients who started an LAI antipsychotic continued concomitant treatment with oral antipsychotics. The mean (median) total direct health care cost per patient per month during the study year among the patients starting any LAI antipsychotic at baseline was 1,407 (897.7). Medication costs (including oral and LAI antipsychotics and concomitant medication) represented almost 44%, whereas nonmedication costs accounted for more than 55% of the mean total direct health care costs. Conclusion: LAI antipsychotics were infrequently prescribed in spite of a psychiatrist-perceived risk of nonadherence to oral antipsychotics. Mean medication costs were lower than nonmedication costs.
Abstract:
BACKGROUND: Differences in how populations living in low-, middle- or upper-income countries accumulate daily PA, i.e. its patterns and intensity, are an important part of addressing the global PA movement. We sought to characterize objective PA in 2,500 participants spanning the epidemiologic transition. The Modeling the Epidemiologic Transition Study (METS) is a longitudinal study in 5 countries. METS seeks to define the association between physical activity (PA), obesity and CVD risk in populations of African origin: Ghana (GH), South Africa (SA), Seychelles (SEY), Jamaica (JA) and the US (suburban Chicago). METHODS: Baseline measurements of objective PA, SES, anthropometrics and body composition were completed on 2,500 men and women aged 25-45 years. Moderate and vigorous PA (MVPA, min/d) on weekdays and weekend days was explored ecologically, by adiposity status and by engagement in manual labor. RESULTS: Among the men, obesity prevalence reflected the level of economic transition and was lowest in GH (1.7%) and SA (4.8%) and highest in the US (41%). SA (55%) and US (65%) women had the highest levels of obesity, compared to only 16% in GH. More men and women in developing countries engaged in manual labor, and this was reflected in an almost doubling of measured MVPA among the men in GH (45 min/d) and SA (47 min/d) compared to only 28 min/d in the US. Women in GH (25 min/d), SA (21 min/d), JA (20 min/d) and SEY (20 min/d) accumulated significantly more MVPA than women in the US (14 min/d), yet this difference was not reflected in differences in BMI between SA, JA, SEY and the US. Moderate PA constituted the bulk of the PA, with no study population except SA men accumulating >5 min/d of vigorous PA. Among the women, no site accumulated >2 min/d of vigorous PA. Overweight/obese men were 22% less likely to engage in manual occupations. CONCLUSION: While there is some association between PA and obesity, this relationship is inconsistent across the epidemiologic transition and suggests that PA policy recommendations should be tailored for each environment.
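For illustration only, the sketch below shows one common way of turning per-minute accelerometer counts into moderate and vigorous minutes per day; the cut-points follow the widely cited Freedson adult thresholds, and the simulated counts are placeholders rather than METS data:

```python
# Per-minute accelerometer counts for one simulated wear day; cut-points follow
# the widely cited Freedson adult thresholds (moderate >= 1952, vigorous >= 5725
# counts/min). All numbers are placeholders, not METS data.
import numpy as np

rng = np.random.default_rng(0)
counts_per_min = rng.gamma(shape=0.6, scale=900.0, size=14 * 60)  # 14-hour wear day

moderate = int(np.sum((counts_per_min >= 1952) & (counts_per_min < 5725)))
vigorous = int(np.sum(counts_per_min >= 5725))
print(f"moderate: {moderate} min/d, vigorous: {vigorous} min/d, "
      f"MVPA: {moderate + vigorous} min/d")
```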
Abstract:
The aim of this thesis was to develop the pricing strategy of a direct-sales cosmetics company. The purpose was to bring regular prices to the right level in line with the defined competitive position. As the work progressed, a need was also identified to redesign the entire pricing process, since the previous way of working no longer supported the current situation, the group's strict reporting requirements, or the analysis of the sales catalogues to the extent required. First, the factors affecting pricing, the different pricing strategy alternatives, and the role of price as a competitive tool in marketing were reviewed, along with the specific characteristics of the direct-sales channel. Based on the theory, the pricing strategy of Oriflame Finland Oy was clarified, and templates were created for product-level and sales-period pricing. The pricing template also had to support the company's finance department and serve as a useful reporting tool for sales forecasting. In particular, the new pricing model was designed to support the analysis of the company's main sales tool, the periodic sales catalogue. Pricing plays a central role in the overall corporate strategy, since it directly affects brand perceptions. In the future, pricing must better support the company's competitive and marketing objectives, and its regular development, together with updates of the competitive situation, will be a defined part of the pricing process.
Abstract:
PURPOSE: To describe the anatomical characteristics and patterns of neurovascular compression in patients suffering from classic trigeminal neuralgia (CTN), using high-resolution magnetic resonance imaging (MRI). MATERIALS AND METHODS: The anatomy of the trigeminal nerve, the brain stem and the vascular structures related to this nerve was analysed in 100 consecutive patients treated with Gamma Knife radiosurgery for CTN between December 1999 and September 2004. MRI studies (T1, T1 enhanced and T2-SPIR) with simultaneous axial, coronal and sagittal visualization were dynamically assessed using the GammaPlan software. Three-dimensional reconstructions were also developed in some representative cases. RESULTS: In 93 patients (93%), one or several vascular structures were in contact either with the trigeminal nerve or close to its origin in the pons. The superior cerebellar artery was involved in 71 cases (76%). Other vessels identified were the anterior inferior cerebellar artery, the basilar artery, the vertebral artery, and some venous structures. Vascular compression was found anywhere along the trigeminal nerve. The mean distance between the nerve compression and the origin of the nerve in the brainstem was 3.76 ± 2.9 mm (range 0-9.8 mm). In 39 patients (42%), the vascular compression was located proximally and in 42 (45%) it was located distally. Nerve dislocation or distortion by the vessel was observed in 30 cases (32%). CONCLUSIONS: The findings of this study are similar to those reported in surgical and autopsy series. This non-invasive MRI-based approach could be useful for diagnostic and therapeutic decisions in CTN, and it could help to understand its pathogenesis.
Abstract:
BACKGROUND AND OBJECTIVE: The Lausanne Stroke Registry has included, since 1979, all patients admitted to the Department of Neurology of the Lausanne University Hospital with the diagnosis of a first clinical stroke. Using the Lausanne Stroke Registry, we aimed to determine trends in risk factors, causes, localization and in-hospital mortality over 25 years in hospitalized stroke patients. METHODS: We assessed temporal trends in stroke patient characteristics across the following consecutive periods: 1979-1987, 1988-1995 and 1996-2003. Age-adjusted cardiovascular risk factors, etiologies, stroke localizations and mortality were compared between the three periods. RESULTS: Overall, 5,759 patients were included. Age differed significantly among the analyzed periods (p < 0.001), with an increasing proportion of older patients over time. After adjustment for age, hypercholesterolemia increased (p < 0.001), whereas cigarette smoking (p < 0.001), hypertension (p < 0.001) and diabetes and hyperglycemia (p < 0.001) decreased. In patients with ischemic strokes, there were significant changes in the distribution of causes, with an increase in cardioembolic strokes (p < 0.001), and in the localization of strokes, with an increase in entire middle cerebral artery (MCA) and posterior circulation strokes together with a decrease in superficial MCA strokes (p < 0.001). In patients with hemorrhagic strokes, thalamic localizations increased, whereas the proportion of striatocapsular hemorrhages decreased (p = 0.022). Except in the older patient group, the mortality rate decreased. CONCLUSIONS: This study shows major trends in the characteristics of stroke patients admitted to a department of neurology over a 25-year time span, which may result from referral biases, the development of acute stroke management and possibly the evolution of cerebrovascular risk factors.
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of excitation-emission matrices (EEMs), has become an indispensable tool for studying DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOMs) have been proposed as a tool to explore patterns in large EEM data sets. The SOM is a pattern recognition method which clusters the input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how a SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that a SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
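A minimal sketch of the general idea (not the authors' exact pipeline): train a SOM on flattened EEMs and then correlate its component planes. It assumes the third-party minisom package and uses synthetic data in place of real EEMs:

```python
# Assumes the third-party 'minisom' package; EEMs here are synthetic stand-ins.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
n_samples, n_ex, n_em = 200, 10, 15            # hypothetical 10 x 15 EEM grid
eems = rng.random((n_samples, n_ex * n_em))    # each row = one flattened EEM

som = MiniSom(8, 8, n_ex * n_em, sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(eems, 5000)

# Component planes: one 8 x 8 map per excitation/emission wavelength pair.
planes = som.get_weights().reshape(8 * 8, n_ex * n_em).T   # (variables, neurons)

# Correlating component planes: wavelength pairs whose planes co-vary strongly
# are candidates for belonging to the same fluorescence component.
corr = np.corrcoef(planes)
i, j = np.unravel_index(np.argmax(corr - np.eye(len(corr))), corr.shape)
print(f"most correlated wavelength pairs: {i} and {j} (r = {corr[i, j]:.2f})")

# Patterns among samples: cluster samples by their best-matching SOM unit.
bmus = [som.winner(e) for e in eems]
print("samples fall on", len(set(bmus)), "distinct SOM units")
```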
Abstract:
1. Species distribution models are increasingly used to address conservation questions, so their predictive capacity requires careful evaluation. Previous studies have shown how individual factors used in model construction can affect prediction. Although some factors probably have negligible effects compared to others, their relative effects are largely unknown. 2. We introduce a general "virtual ecologist" framework to study the relative importance of factors involved in the construction of species distribution models. 3. We illustrate the framework by examining the relative importance of five key factors (a missing covariate, spatial autocorrelation due to a dispersal process in presences/absences, sample size, sampling design and modeling technique) in a real study framework based on plants in a mountain landscape at regional scale, and show that, for the parameter values considered here, most of the variation in prediction accuracy is due to sample size and modeling technique. Contrary to repeatedly reported concerns, spatial autocorrelation has only comparatively small effects. 4. This study shows the importance of using a nested statistical framework to evaluate the relative effects of factors that may affect species distribution models.
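A hedged sketch of a bare-bones "virtual ecologist" experiment along these lines (not the authors' design): a virtual species is simulated from known covariates, and two of the factors mentioned above, sample size and modelling technique, are varied while prediction accuracy is scored on independent data. Covariates, models and sample sizes are illustrative choices:

```python
# Synthetic "virtual ecologist" experiment; covariates, models and sample sizes
# are illustrative choices, not the study's actual design.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def virtual_species(n):
    """Simulate presences/absences from a known response to three covariates."""
    env = rng.normal(size=(n, 3))
    p = 1.0 / (1.0 + np.exp(-(1.5 * env[:, 0] - env[:, 1])))  # covariate 3 is irrelevant
    return env, rng.binomial(1, p)

X_eval, y_eval = virtual_species(5000)  # large independent evaluation set

for n in (50, 200, 1000):                                # factor 1: sample size
    X, y = virtual_species(n)
    for name, model in (("GLM", LogisticRegression(max_iter=1000)),
                        ("RF ", RandomForestClassifier(n_estimators=200, random_state=0))):
        auc = roc_auc_score(y_eval, model.fit(X, y).predict_proba(X_eval)[:, 1])
        print(f"n={n:4d}  {name}  AUC={auc:.3f}")        # factor 2: modelling technique
```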
Abstract:
The aim of the study was to examine how cultural differences between Finland and Japan affect the complaint-handling process and perceptions of quality between the case company and its customers. The theoretical framework draws on views of culture, intercultural communication, complaint handling and quality perceptions. To examine the cultural differences, frameworks describing cultural dimensions and the influence of cultural factors on communication were presented, and the Finnish and Japanese cultures were also described in more detail in the light of previous research. The empirical part examined differences in views within the case company as well as between the company and its customers. The study was conducted as a qualitative case study, which also considered measures for improving the case company's business environment. The necessary information was gathered from the literature, articles and background interviews, and by interviewing company personnel in Finland and Japan as well as the company's Japanese customers. Japanese customer-supplier relationships are a challenging business environment for a foreign company. Building trust over the long term requires close communication in order to know the other party, so that products can be improved and complaint costs reduced. Quality thinking also needs to be aligned in order to improve the quality of products and services.
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrieve the distribution of the continents through the ages. If Wegener's view of continent motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are part of a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, mainly for historical and technical reasons, this idea still does not receive sufficient echo within the reconstruction community. However, we are convinced that, by applying specific methods and principles, it is possible to move beyond the traditional "Wegenerian" point of view and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, in full detail, our methods and tools. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology that places the plates and their kinematics at the centre of the problem.
Using assemblies of continents (referred to as "key assemblies") as anchors distributed along the whole scope of our study (ranging from Eocene to Cambrian time), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time and form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides strong constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are modern informatics tools specifically devoted to storing, analysing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in the ArcGIS software, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model (by greatly increasing the kinematic control of plate motions through plate velocity models). Based on 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we are now able to tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, all along the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band going from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Basically, this means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpret this as a possible secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons of known age. We also reconstruct the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
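As a small illustration of the rigid-plate kinematics underlying such velocity models (not the thesis' implementation; the Euler pole, rotation rate and evaluation point are hypothetical), the surface velocity of a point on a plate follows from v = ω × r:

```python
# Surface velocity v = omega x r for a rigid plate; pole, rate and point are
# hypothetical values chosen only to make the arithmetic concrete.
import numpy as np

R_EARTH_KM = 6371.0

def unit_vector(lat_deg, lon_deg):
    """Unit position vector for a point given in geographic coordinates."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

omega = np.radians(0.9) * unit_vector(60.0, -80.0)   # Euler vector, rad per Myr
r = R_EARTH_KM * unit_vector(10.0, -40.0)            # point on the plate, km

v = np.cross(omega, r)                               # km/Myr, numerically equal to mm/yr
print(f"plate velocity at the point: {np.linalg.norm(v):.1f} mm/yr")
```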
Abstract:
Quality inspection and assurance is a very important step when today's products are sold to markets. As products are produced in vast quantities, the interest in automating quality inspection tasks has increased correspondingly. Quality inspection tasks usually require the detection of deficiencies, defined as irregularities in this thesis. Objects containing regular patterns appear quite frequently in certain industries and sciences, e.g. half-tone raster patterns in the printing industry, crystal lattice structures in solid state physics, and solder joints and components in the electronics industry. In this thesis, the problem of regular patterns and irregularities is described in analytical form and three different detection methods are proposed. All the methods are based on the ability of the Fourier transform to represent regular information compactly. The Fourier transform enables the separation of the regular and irregular parts of an image, but the three methods presented are shown to differ in generality and computational complexity. The need to detect fine and sparse details is common in quality inspection tasks, e.g. locating small fractures in components in the electronics industry or detecting tearing in paper samples in the printing industry. In this thesis, a general definition of such details is given by defining sufficient statistical properties in the histogram domain. The analytical definition allows a quantitative comparison of methods designed for detail detection. Based on the definition, the use of existing thresholding methods is shown to be well motivated. A comparison of thresholding methods shows that minimum error thresholding outperforms the other standard methods. The results are successfully applied to a paper printability and runnability inspection setup. Missing dots in a repeating raster pattern are detected from Heliotest strips and small surface defects from IGT picking papers.
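A minimal sketch of the Fourier-based idea described above (not the thesis' actual methods): a regular raster concentrates its energy in a few sharp spectral peaks, so suppressing the strongest peaks and transforming back leaves mainly the irregular part, here a synthetic missing dot:

```python
# Synthetic half-tone raster (period 8 px) with one missing dot as the defect.
import numpy as np

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.arange(128), np.arange(128))
image = ((np.sin(2 * np.pi * x / 8) > 0.6) & (np.sin(2 * np.pi * y / 8) > 0.6)).astype(float)
image[60:68, 60:68] = 0.0                      # the irregularity: one missing dot
image += 0.05 * rng.normal(size=image.shape)   # mild sensor noise

spectrum = np.fft.fft2(image)
magnitude = np.abs(spectrum)

# Zero the strongest 1 % of frequency components (the regular raster's peaks),
# keeping the DC term so the overall level is preserved.
mask = magnitude < np.quantile(magnitude, 0.99)
mask[0, 0] = True
residual = np.real(np.fft.ifft2(spectrum * mask))

# The defect region should now deviate from the background much more strongly.
dev = np.abs(residual - residual.mean())
print(f"mean deviation in defect block: {dev[60:68, 60:68].mean():.3f}")
print(f"mean deviation elsewhere:       {dev[:32, :32].mean():.3f}")
```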