56 results for complex structures up to isometry


Abstract:

The depositional stratigraphy of within-channel deposits in sandy braided rivers is dominated by a variety of barforms (both singular 'unit' bars and complex 'compound' bars), as well as the infill of individual channels (herein termed 'channel fills'). The deposits of bars and channel fills define the key components of facies models for braided rivers and their within-channel heterogeneity, knowledge of which is important for reservoir characterization. However, few studies have sought to address the question of whether the deposits of bars and channel fills can be readily differentiated from each other. This paper presents the first quantitative study to achieve this aim, using aerial images of an evolving modern sandy braided river and geophysical imaging of its subsurface deposits. Aerial photographs taken between 2000 and 2004 document the abandonment and fill of a 1.3 km long, 80 m wide anabranch channel in the sandy braided South Saskatchewan River, Canada. Upstream river regulation traps the majority of very fine sediment, and there is little clay (<1%) in the bed sediments. Channel abandonment was initiated by a series of unit bars that stalled and progressively blocked the anabranch entrance, together with dune deposition and stacking at the anabranch entrance and exit. Complete channel abandonment and subsequent fill with up to 3 m of sediment took approximately two years. Thirteen kilometres of ground-penetrating radar surveys, coupled with 18 cores, were obtained over the channel fill and an adjacent 750 m long, 400 m wide compound bar, enabling a quantitative analysis of the channel and bar deposits. Results show that, in terms of grain-size trends, facies proportions and scale of deposits, there are only subtle differences between the channel-fill and bar deposits, rendering them effectively indistinguishable. Thus, it may be inappropriate to assign different geometric and sedimentological attributes to channel-fill and bar facies in object-based models of sandy braided river alluvial architecture.

Abstract:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this test site, covering an area of about 1 km².
In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun, combined with real-time control of navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, the streamers can thus be monitored for feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and the resulting binning errors. Aliasing observed in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber, bubble-canceling air gun for Survey II. A 15/15 Mini G.I. air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in a parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks differ for opposite shooting directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and was complemented by two computer programs that convert the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stacking and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra.
According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. Delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows the application of such sophisticated techniques even to high-resolution seismic surveys. In general, the adaptation of the 3-D marine seismic reflection method, which to date has been used almost exclusively by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a method for investigating the subsurface with very high resolving power. It consists of sending vibrations into the ground and recording the waves that reflect off geological discontinuities at different depths and then travel back up to the surface, where they are recorded. The signals collected in this way provide information not only on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For sedimentary rocks, for example, seismic reflection profiles make it possible to determine their mode of deposition, any deformation or faulting, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles that provide a two-dimensional image of the subsurface. The images obtained in this way are only partially accurate, since they do not take into account the three-dimensional nature of geological structures. Over the past few decades, three-dimensional (3-D) seismics has brought new momentum to the study of the subsurface. Although it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to the lacustrine or fluvial scale has so far been the subject of only a few studies. This thesis work consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, lighter to deploy and, above all, yields final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution on the order of ten metres, the instrument developed in this work makes it possible to see details on the order of one metre. The new system is based on the ability to record seismic reflections simultaneously on three seismic cables (or streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and receivers of the seismic waves) with great precision. Software was specially developed to control navigation and to trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with a precision of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the "La Paudèze" fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic recordings were then processed to transform them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, in particular with regard to positioning. After processing, the data reveal several main seismic facies corresponding notably to the lacustrine sediments (Holocene), the glaciolacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible in vertical and horizontal seismic sections. The excellent data quality and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, opening the way for its application in environmental and civil engineering contexts.
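The geometry figures quoted in this abstract are mutually consistent under the standard common-midpoint (CMP) relations, which the abstract does not spell out: the in-line bin is half the receiver-group interval, the cross-line bin is half the streamer (or sail-line) separation, and the nominal fold is the active spread length divided by twice the shot interval. The following minimal Python sketch of that arithmetic is an illustration added here, not part of the thesis:

# Textbook CMP binning and nominal-fold relations, applied to the acquisition
# parameters quoted above (illustrative check only).
def cmp_bin_and_fold(n_channels, group_interval_m, shot_interval_m, line_separation_m):
    inline_bin = group_interval_m / 2.0               # half the receiver-group interval
    xline_bin = line_separation_m / 2.0               # half the streamer / sail-line spacing
    nominal_fold = n_channels * group_interval_m / (2.0 * shot_interval_m)
    return inline_bin, xline_bin, nominal_fold

# Survey I: one 48-channel streamer, 2.5 m groups, 5 m shots, 7.5 m sail-line interval
print(cmp_bin_and_fold(48, 2.5, 5.0, 7.5))   # -> (1.25, 3.75, 12.0)
# Survey II: three 24-channel streamers spaced 7.5 m apart, same group and shot intervals
print(cmp_bin_and_fold(24, 2.5, 5.0, 7.5))   # -> (1.25, 3.75, 6.0), i.e. fold drops from 12 to 6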

Abstract:

OBJECTIVES: Many nanomaterials (materials with structures smaller than 100 nm) have chemical, physical and bioactive characteristics of interest for novel applications, and considerable research efforts have been launched in this field. This study aimed to characterize exposure scenarios commonly encountered in research settings. METHODS: We studied one of the leading Swiss universities and first identified all research units dealing with nanomaterials. After a preliminary evaluation of the quantities and process types used, a detailed analysis was conducted in units where more than a few micrograms were used per week. RESULTS: In the investigated laboratories, background levels were usually low, in the range of a few thousand particles per cubic centimeter. Powder applications resulted in concentrations of 10,000 to 100,000 particles/cm³ when measured inside fume hoods, but increases in the breathing zone of researchers were mostly absent or minimal. Mostly low exposures were observed for activities involving liquid applications. However, centrifugation and lyophilization of nanoparticle-containing solutions resulted in high particle number concentrations (up to 300,000 particles/cm³) in work spaces where researchers did not always wear respiratory protection. No significant increases were found for processes involving nanoparticles bound to surfaces, nor in laboratories visualizing the properties and structure of small amounts of nanomaterials. CONCLUSIONS: Research activities in modern laboratories equipped with control techniques were associated with minimal releases of nanomaterials into the working space. However, the focus should not only be on processes involving nanopowders but also on processes involving nanoparticle-containing liquids, especially if the work involves physical agitation, aerosolization or drying of the liquids.

Abstract:

The second scientific meeting of the European systems genetics network for the study of complex genetic human disease using genetic reference populations (SYSGENET) took place at the Center for Cooperative Research in Biosciences in Bilbao, Spain, December 10-12, 2012. SYSGENET is funded by the European Cooperation in the Field of Scientific and Technological Research (COST) and represents a network of scientists in Europe who use mouse genetic reference populations (GRPs) to identify complex genetic factors influencing disease phenotypes (Schughart, Mamm Genome 21:331-336, 2010). About 50 researchers working in the field of systems genetics attended the meeting, which consisted of 27 oral presentations, a poster session, and a management committee meeting. Participants exchanged results, set up future collaborations, and shared phenotyping and data analysis methodologies. This meeting was particularly instrumental for conveying the current status of the US, Israeli, and Australian Collaborative Cross (CC) mouse GRPs. The CC is an open-source project initiated nearly a decade ago by members of the Complex Trait Consortium to aid the mapping of multigenetic traits (Threadgill, Mamm Genome 13:175-178, 2002). In addition, representatives of the International Mouse Phenotyping Consortium were invited to present ongoing activities, to foster exchange between the knockout and complex genetics communities, and to discuss and explore potential fields for future interactions.

Abstract:

OBJECTIVE: To evaluate the effectiveness of a complex intervention implementing best practice guidelines recommending clinicians screen and counsel young people across multiple psychosocial risk factors, on clinicians' detection of health risks and patients' risk-taking behaviour, compared to a didactic seminar on young people's health. DESIGN: Pragmatic cluster randomised trial where volunteer general practices were stratified by postcode advantage or disadvantage score and billing type (private, free national health, community health centre), then randomised into either intervention or comparison arms using a computer-generated random sequence. Three months post-intervention, patients were recruited from all practices post-consultation for a Computer Assisted Telephone Interview and followed up three and 12 months later. Researchers recruiting, consenting and interviewing patients, and the patients themselves, were masked to allocation status; clinicians were not. SETTING: General practices in metropolitan and rural Victoria, Australia. PARTICIPANTS: General practices with at least one interested clinician (general practitioner or nurse) and their 14-24-year-old patients. INTERVENTION: This complex intervention was designed using evidence-based practice in learning and change in clinician behaviour and general practice systems, and included best practice approaches to motivating change in adolescent risk-taking behaviours. The intervention involved training clinicians (nine hours) in health risk screening, use of a screening tool and motivational interviewing; training all practice staff (receptionists and clinicians) in engaging youth; provision of feedback to clinicians of patients' risk data; and two practice visits to support new screening and referral resources. Comparison clinicians received one didactic educational seminar (three hours) on engaging youth and health risk screening. OUTCOME MEASURES: Primary outcomes were patient report of (1) clinician detection of at least one of six health risk behaviours (tobacco, alcohol and illicit drug use, risks for sexually transmitted infection (STI), unplanned pregnancy, and road risks); and (2) change in one or more of the six health risk behaviours, at three months or at 12 months. Secondary outcomes were likelihood of future visits, trust in the clinician after the exit interview, clinician detection of emotional distress and of fear and abuse in relationships, and emotional distress at three and 12 months. Patient acceptability of the screening tool was also described for the intervention arm. Analyses were adjusted for practice location and billing type, patients' sex, age, and recruitment method, and past health risks, where appropriate. An intention-to-treat analysis approach was used, which included multilevel multiple imputation for missing outcome data. RESULTS: 42 practices were randomly allocated to intervention or comparison arms. Two intervention practices withdrew post-allocation, prior to training, leaving 19 intervention (53 clinicians, 377 patients) and 21 comparison (79 clinicians, 524 patients) practices. 69% of patients in both the intervention (260) and comparison (360) arms completed the 12-month follow-up.
Intervention clinicians discussed more health risks per patient (59.7%) than comparison clinicians (52.7%) and thus were more likely to detect a higher proportion of young people with at least one of the six health risk behaviours (38.4% vs 26.7%, risk difference [RD] 11.6%, Confidence Interval [CI] 2.93% to 20.3%; adjusted odds ratio [OR] 1.7, CI 1.1 to 2.5). Patients reported less illicit drug use (RD -6.0, CI -11 to -1.2; OR 0.52, CI 0.28 to 0.96), and less risk for STI (RD -5.4, CI -11 to 0.2; OR 0.66, CI 0.46 to 0.96) at three months in the intervention relative to the comparison arm, and for unplanned pregnancy at 12 months (RD -4.4, CI -8.7 to -0.1; OR 0.40, CI 0.20 to 0.80). No differences were detected between arms on other health risks. There were no differences on secondary outcomes, apart from a greater detection of abuse (OR 13.8, CI 1.71 to 111). There were no reports of harmful events and intervention arm youth had high acceptance of the screening tool. CONCLUSIONS: A complex intervention, compared to a simple educational seminar for practices, improved detection of health risk behaviours in young people. Impact on health outcomes was inconclusive. Technology enabling more efficient, systematic health-risk screening may allow providers to target counselling toward higher risk individuals. Further trials require more power to confirm health benefits. TRIAL REGISTRATION: ISRCTN.com ISRCTN16059206.
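As an added illustration (not part of the trial report), the unadjusted risk difference and odds ratio implied by the quoted detection proportions can be recomputed directly and land close to the adjusted estimates above, which came from a multilevel model:

# Unadjusted effect measures recomputed from the quoted detection proportions
# (38.4% intervention vs 26.7% comparison); the trial reports adjusted estimates.
p_intervention, p_comparison = 0.384, 0.267
risk_difference = p_intervention - p_comparison
odds = lambda p: p / (1.0 - p)
odds_ratio = odds(p_intervention) / odds(p_comparison)
print(f"risk difference = {risk_difference:.1%}")   # ~11.7% (reported adjusted RD: 11.6%)
print(f"odds ratio      = {odds_ratio:.2f}")        # ~1.71 (reported adjusted OR: 1.7)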

Abstract:

Nowadays, Species Distribution Models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of applications is wide, from understanding the requirements of a single species, to the design of nature reserves based on species hotspots, to modeling the impact of climate change, etc. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used with resolutions below the kilometer scale and are then called high resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged, enabling precision up to 1 m x 1 m and thus allowing very high resolution modeling. However, these new variables are very costly and require a substantial amount of processing time, especially when they are used in complex calculations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood; some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models, using very high resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses to these variables for a subset of species living in this area at two specific elevation belts. During this thesis I showed that high resolution data require very good datasets (both species data and variables for the models) to produce satisfactory results. Indeed, in mountain areas, temperature is the most important factor driving species distributions, and it needs to be modeled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, buffers their apparent importance: topographic factors were shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models happens when edaphic variables are added. Indeed, adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in terms of importance in the models. To conclude, high resolution is very important in modeling but requires very good datasets. Only increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors was highlighted as fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
-- In recent years, the use of species distribution models (SDMs) has steadily increased. These models use various statistical tools to reconstruct the realized niche of a species from variables, notably climatic or topographic ones, and from presence data collected in the field. Their applications cover many domains, from the study of a species' ecology to the reconstruction of communities or the impact of global warming. Most of the time, these models use occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, hence to go below the kilometer scale and to work with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very high resolution data has appeared that allows work at the meter scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical calculation, such as projections of species distributions over large areas, demands powerful computers and a lot of time. Moreover, the factors governing species distributions at fine scale are still poorly known, and the importance in the models of high resolution variables such as microtopography or temperature is not certain. Other factors, such as competition or natural stochasticity, could have an equally strong influence. It is in this context that my thesis work is situated. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Préalpes vaudoises. I also sought to understand the local impact of certain variables that are potentially neglected because of confounding effects along the elevation gradient. During this thesis, I was able to show that high resolution variables, whether related to temperature or to microtopography, do not by themselves bring a substantial improvement to the models. To achieve a marked improvement, it is necessary to work with larger datasets, both in terms of species and of the variables used. For example, the usual interpolated climate layers must be replaced by temperature layers modeled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a resolution of 25 m. However, at a more local scale, high resolution topography is an extremely important variable in the subalpine belt. At the montane belt, by contrast, variables related to soils and land use are very important. Finally, the species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance equals or surpasses that of the topographic variables when they are added to the usual species distribution models.
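To make the modeling idea concrete, here is a minimal, hypothetical sketch of the kind of workflow described above, using synthetic data and a scikit-learn logistic regression rather than the thesis' actual species data or methods: presence/absence is modeled from topoclimatic predictors, then a soil variable (pH) is added to illustrate the gain reported in the conclusions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2000
temperature = rng.uniform(-2.0, 12.0, n)   # mean annual temperature along an elevation gradient (degC)
slope = rng.uniform(0.0, 45.0, n)          # topographic slope (degrees)
soil_ph = rng.uniform(3.5, 7.5, n)         # topsoil pH

# Hypothetical species: prefers cool sites on acidic soils (purely illustrative response)
logit = 2.0 - 0.4 * temperature - 0.02 * slope - 1.2 * (soil_ph - 5.0)
presence = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

for label, X in [("topoclimatic only", np.column_stack([temperature, slope])),
                 ("topoclimatic + soil pH", np.column_stack([temperature, slope, soil_ph]))]:
    model = LogisticRegression(max_iter=1000).fit(X, presence)
    auc = roc_auc_score(presence, model.predict_proba(X)[:, 1])
    print(f"{label}: training AUC = {auc:.3f}")   # the pH model scores noticeably higher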

Abstract:

Adoptive cell transfer using engineered T cells is emerging as a promising treatment for metastatic melanoma. Such an approach allows one to introduce T cell receptor (TCR) modifications that, while maintaining the specificity for the targeted antigen, can enhance the binding and kinetic parameters for the interaction with peptides (p) bound to major histocompatibility complexes (MHC). Using the well-characterized 2C TCR/SIYR/H-2K(b) structure as a model system, we demonstrated that a binding free energy decomposition based on the MM-GBSA approach provides a detailed and reliable description of the TCR/pMHC interactions at the structural and thermodynamic levels. Starting from this result, we developed a new structure-based approach to rationally design new TCR sequences and applied it to the BC1 TCR targeting the HLA-A2 restricted NY-ESO-1(157-165) cancer-testis epitope. Fifty-four percent of the designed sequence replacements exhibited improved pMHC binding as compared to the native TCR, with up to 150-fold increases in affinity, while preserving specificity. Genetically engineered CD8(+) T cells expressing these modified TCRs showed improved functional activity compared to those expressing the BC1 TCR. We measured maximum levels of activity for TCRs within the upper limit of natural affinity, K_D ≈ 1-5 μM. Beyond the affinity threshold at K_D < 1 μM we observed an attenuation in cellular function, in line with the "half-life" model of T cell activation. Our computer-aided protein-engineering approach requires the 3D structure of the TCR-pMHC complex of interest, which can be obtained from X-ray crystallography. We have also developed a homology modeling-based approach, TCRep 3D, to obtain accurate structural models of any TCR-pMHC complex when experimental data are not available. Since the accuracy of the models depends on the prediction of the TCR orientation over the pMHC, we have complemented the approach with a simplified rigid method to predict this orientation and successfully assessed it using all non-redundant TCR-pMHC crystal structures available. These methods potentially extend the use of our TCR engineering method to entire TCR repertoires for which no X-ray structure is available. We have also performed a steered molecular dynamics study of the unbinding of the TCR-pMHC complex to get a better understanding of how TCRs interact with pMHCs. This entire rational TCR design pipeline is now being used to produce rationally optimized TCRs for adoptive cell therapies of stage IV melanoma.
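For context on the affinity window quoted above, a dissociation constant maps onto a standard-state binding free energy via ΔG° = RT ln K_D, the quantity that MM-GBSA-type decompositions approximate. The short calculation below is standard thermodynamics added for illustration, not a result from the paper:

import math

R = 8.314        # gas constant, J/(mol*K)
T = 298.15       # temperature, K
KCAL = 4184.0    # J per kcal

def binding_free_energy_kcal(kd_molar):
    """Standard-state binding free energy dG = RT ln(K_D), in kcal/mol."""
    return R * T * math.log(kd_molar) / KCAL

for kd in (1e-6, 5e-6):   # the K_D ~ 1-5 uM window mentioned above
    print(f"K_D = {kd:.0e} M -> dG = {binding_free_energy_kcal(kd):.1f} kcal/mol")  # -8.2 and -7.2

# A 150-fold affinity improvement corresponds to a free-energy gain of about:
print(f"ddG = {R * T * math.log(150) / KCAL:.1f} kcal/mol")   # ~3.0 kcal/mol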

Abstract:

During a scientific field expedition to the Alai-Pamir range, five specimens of the genus Gloydius were collected in the wider Alai region. A morphological and genetic examination of the specimens showed that they are part of the G. halys complex but represent a new taxon, which is characterized by the following unique character combination: a slender and moderately stout small snake, up to 479 mm in total length; nine symmetrical plates on the upper head, 7 supralabial and 8-9 infralabial scales; body scales in 20-22 rows around midbody, 143-156 ventral and 35-45 usually paired subcaudal scales; the cloacal plate not divided. The general coloration consists of various tones of olive, tan and brown, with a distinct head pattern but an indistinct body pattern comprising, excluding the tail, 26-29 transverse crossbands that do not extend onto the sides of the body. The haplotype network places the new species within the G. halys complex, closely related to both G. h. halys and G. h. caraganus. So far the newly described species is known only from the Alai range. However, various Gloydius specimens have been found in Kyrgyzstan and, because of the complicated taxonomy, these specimens have to be re-identified to clarify their status and that of the new species.

Abstract:

Early readmission is the major success indicator of the transition between hospital and home. Patients admitted with heart failure reach a readmission rate of 20%. Potentially avoidable readmissions, defined as unpredictable and related to a condition known during the index hospitalization, represent the margin for improvement. For these, the implementation of specific interventions can be effective. Complex transition interventions, comprising several modalities and seeking to encourage patient autonomy, appear more effective than others. We describe two models: a pragmatic one developed in a regional hospital, and a more complex one developed in a university hospital during the LEAR-HF study. In both cases, it is imperative to address "medical liability": should it extend beyond discharge, up to the threshold of the private practice?

Abstract:

Illicit drug cutting is a complex problem that requires the sharing of knowledge from addiction studies, toxicology, criminology and criminalistics; as a result, cutting is not well known by the forensic community. This review therefore aims at deciphering the different aspects of cutting by gathering information mainly from criminology and criminalistics, focusing essentially on the specificities of cocaine and heroin cutting. The article presents the detected cutting agents (adulterants and diluents), their evolution in time and space, and the analytical methodology implemented by forensic laboratories. Furthermore, it discusses when, in the history of the illicit drug, cutting may take place. Studies investigating how much cutting occurs in the country of destination are also analysed. Lastly, the reasons for cutting are addressed. According to the literature, adulterants are added during production of the illicit drug or at a relatively high level of its distribution chain (e.g. before the product arrives in the country of destination or just after its importation into that country). Their addition seems hardly justified by the desire to increase profits or to harm consumers' health alone. Instead, adulteration appears to be performed to enhance or mimic the illicit drug's effects or to facilitate its administration. Nowadays, caffeine, diltiazem, hydroxyzine, levamisole, lidocaine and phenacetin are frequently detected in cocaine specimens, while paracetamol and caffeine are almost exclusively identified in heroin specimens. This may reveal differences in the respective structures of production and/or distribution of cocaine and heroin. As the relevant information about cutting is spread across different scientific fields, a close collaboration should be set up to collect essential and unified data to improve knowledge and provide information for monitoring, control and harm reduction purposes. More research, in several areas of investigation, should be carried out to gather relevant information.

Abstract:

BACKGROUND: Abdominoperineal resection (APR) following radiotherapy is associated with a high rate of perineal wound complications. The anterolateral thigh (ALT) flap, combined with the vastus lateralis (VL) muscle, can cover complex perineal and pelvic anteroposterior defects. Here it was used for the first time transabdominally, passed through the pelvis and the perineum (TAPP) in an infero-posterior direction; this technique is described and illustrated in this study. METHODS: Among more than 90 patients who underwent perineal reconstruction between May 2004 and June 2011, six patients presented with high-grade tumours invading the perineum, pelvis and sacrum, resulting in a continuous anteroposterior defect. ALT + VL TAPP reconstructions were performed after extended APR and subsequent sacrectomy. Patients were reviewed retrospectively to determine demographics, operative time, complications (general and flap-related), time to complete healing and length of hospital stay. Long-term flap coverage, flap volume stability and functional and aesthetic outcomes were assessed. RESULTS: Mean operating time of the reconstruction was 290 min. No deaths occurred. One patient presented partial flap necrosis. Another patient presented a new wound dehiscence after flap healing, due to secondary skin dissemination of the primary tumour. On volumetric flap analysis of serial post-operative CT scans, no significant flap atrophy was observed. All flaps fully covered the defects. No late complications such as fistulas or perineal hernias occurred. Donor-site recovery was uneventful, with no functional deficits. CONCLUSIONS: Transabdominal use of the ALT + VL flap is an innovative method to reconstruct exceptionally complex perineal and pelvic defects extending up to the lower back. This flap provides superior bulk, obliterating all pelvic dead space, with the fascia lata (FL) supporting the pelvic floor.