274 results for 3D modeling
Abstract:
Evidence is accumulating that biodiversity is facing the effects of global change. The most influential drivers of change in ecosystems are land-use change, alien species invasions and climate change impacts. Accurate projections of species' responses to these changes are needed to propose mitigation measures to slow the ongoing erosion of biodiversity. Niche-based models (NBM) currently represent one of the only tools for such projections. However, their application in the context of global change relies on restrictive assumptions, calling for cautious interpretation. In this thesis I assess the effectiveness and shortcomings of niche-based models for the study of global change impacts on biodiversity, by investigating specific, unsolved limitations and suggesting new approaches. Two studies investigating threats to rare and endangered plants are presented. I review the ecogeographic characteristics of 118 endangered plants with high conservation priority in Switzerland. The prevalence of rarity types among plant species is analyzed in relation to IUCN extinction risks. The review underlines the importance of regional versus global conservation and shows that a global assessment of rarity can be misleading for some species, because it may fail to account for the different degrees of rarity a species presents at different spatial scales. The second study tests a modeling framework that includes iterative steps of modeling and field surveys to improve the sampling of rare species. The approach is illustrated with a rare alpine plant, Eryngium alpinum, and shows promise for complementing conservation practices and reducing sampling time and costs. Two studies illustrate the impacts of climate change on African taxa.
The first one assesses the sensitivity of 277 African mammals to climate change by 2050 in terms of species richness and turnover. It shows that a substantial number of species could be critically endangered in the future, and that national parks situated in xeric ecosystems are not expected to meet their mandate of protecting current species diversity in the future. The second study models the 2050 distribution of 975 plant species endemic to southern Africa. The study proposes the inclusion of new methodological insights improving the accuracy and ecological realism of predictions in global change studies. It also investigates the possibility of estimating a priori the sensitivity of a species to climate change from its geographical distribution and ecological properties. Three studies illustrate the application of NBM in the study of biological invasions. The first one investigates the northward expansion of Lactuca serriola L. in Europe during the last 250 years in relation to climate change; in the last two decades, the species could not track climate change because of non-climatic influences. A second study analyses the potential invasion extent of spotted knapweed (Centaurea maculosa), a European weed first introduced into North America in the 1890s. The study provides one of the first pieces of empirical evidence that an invasive species can occupy a climatically distinct niche following its introduction into a new area. Models calibrated on the native range alone fail to predict the full current extent of the invasion, but correctly predict areas of introduction. An alternative approach, involving the calibration of models with pooled data from both ranges, is proposed to improve predictions of the extent of invasion. I finally present a review on the dynamic nature of ecological niches in space and time.
It synthesizes recent theoretical developments on the niche conservatism issue and proposes solutions to improve confidence in NBM predictions of the impacts of climate change and species invasions on species distributions.
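As an illustration of the niche-based modeling (NBM) idea summarized above, the sketch below fits a minimal presence/absence niche model (a logistic regression on two climate covariates) and projects it onto a warmer scenario. All occurrence data, covariates and coefficients are invented for illustration and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical occurrence data: presence/absence of a species against
# two climate covariates (mean temperature, annual precipitation).
n = 500
temp = rng.uniform(0, 20, n)          # degrees C
precip = rng.uniform(200, 2000, n)    # mm/year
# True (unknown) niche used to generate the data: cool, wet sites favored.
logit = 2.0 - 0.4 * temp + 0.002 * precip
presence = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit a logistic-regression niche model by gradient descent.
X = np.column_stack([np.ones(n), (temp - temp.mean()) / temp.std(),
                     (precip - precip.mean()) / precip.std()])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - presence) / n

def suitability(t, pr):
    """Project the fitted niche onto new climate values (0..1)."""
    x = np.array([1.0, (t - temp.mean()) / temp.std(),
                  (pr - precip.mean()) / precip.std()])
    return 1 / (1 + np.exp(-x @ w))

# A 2-degree warming at a formerly suitable site lowers predicted suitability.
print(suitability(5.0, 1500.0), suitability(7.0, 1500.0))
```

Projecting such a fitted model onto future climate grids is, in essence, how range contractions like those reported for the African taxa above are estimated.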
Abstract:
PURPOSE: To compare 3 different flow-targeted magnetization preparation strategies for coronary MR angiography (cMRA) that allow selective visualization of the vessel lumen. MATERIAL AND METHODS: The right coronary artery of 10 healthy subjects was investigated on a 1.5 Tesla MR system (Gyroscan ACS-NT, Philips Healthcare, Best, NL). A navigator-gated and ECG-triggered 3D radial steady-state free-precession (SSFP) cMRA sequence was performed with 3 different magnetization preparation schemes, referred to as projection SSFP (selective labeling of the aorta, subtraction of 2 data sets), LoReIn SSFP (double-inversion preparation, selective labeling of the aorta, 1 data set), and inflow SSFP (inversion preparation, selective labeling of the coronary artery, 1 data set). Signal-to-noise ratio (SNR) of the coronary artery and aorta, contrast-to-noise ratio (CNR) between the coronary artery and epicardial fat, vessel length, and vessel sharpness were analyzed. RESULTS: All cMRA sequences were successfully acquired in all subjects. Both projection SSFP and LoReIn SSFP allowed selective visualization of the coronary arteries with excellent background suppression. Scan time was doubled in projection SSFP because of the need to subtract 2 data sets. In inflow SSFP, background suppression was limited to the tissue included in the inversion volume. Projection SSFP (SNR(coro): 25.6 +/- 12.1; SNR(ao): 26.1 +/- 16.8; CNR(coro-fat): 22.0 +/- 11.7) and inflow SSFP (SNR(coro): 27.9 +/- 5.4; SNR(ao): 37.4 +/- 9.2; CNR(coro-fat): 24.9 +/- 4.8) yielded significantly higher SNR and CNR than LoReIn SSFP (SNR(coro): 12.3 +/- 5.4; SNR(ao): 11.8 +/- 5.8; CNR(coro-fat): 9.8 +/- 5.5; P < 0.05 for both). The longest visible vessel length was found with projection SSFP (79.5 mm +/- 18.9; P < 0.05 vs. LoReIn), whereas vessel sharpness was best in inflow SSFP (68.2% +/- 4.5%; P < 0.05 vs. LoReIn).
Consistently good image quality was achieved with inflow SSFP, likely because of its simple planning procedure and short scanning time. CONCLUSION: Three flow-targeted cMRA approaches are presented, which provide selective visualization of the coronary vessel lumen and, in addition, blood flow information without the need for contrast agent administration. Inflow SSFP yielded the highest SNR, CNR and vessel sharpness and may prove useful as a fast and efficient approach for assessing proximal and mid-vessel coronary blood flow, while requiring less planning skill than projection SSFP or LoReIn SSFP.
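SNR and CNR figures like those reported above are typically derived from region-of-interest (ROI) measurements: mean ROI signal divided by the background-noise standard deviation, and the difference of two ROI means divided by the same noise estimate. A minimal sketch follows, with purely illustrative intensity values, not data from the study (magnitude-image noise-correction factors are also ignored here).

```python
import numpy as np

def snr(roi_signal, noise_sd):
    """Signal-to-noise ratio: mean ROI signal over background-noise SD."""
    return np.mean(roi_signal) / noise_sd

def cnr(roi_a, roi_b, noise_sd):
    """Contrast-to-noise ratio between two tissues (e.g. coronary vs. fat)."""
    return abs(np.mean(roi_a) - np.mean(roi_b)) / noise_sd

# Illustrative ROI intensities (arbitrary units), not study data.
coronary = np.array([410., 395., 402., 399.])
fat = np.array([120., 131., 118., 125.])
noise_sd = 15.0
print(round(snr(coronary, noise_sd), 1), round(cnr(coronary, fat, noise_sd), 1))
```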
Abstract:
The authors compared radial steady-state free precession (SSFP) coronary magnetic resonance (MR) angiography, Cartesian k-space sampling SSFP coronary MR angiography, and gradient-echo coronary MR angiography in 16 healthy adults and four pilot-study patients. Standard gradient-echo MR imaging with a T2 preparatory pulse and Cartesian k-space sampling was the reference technique. Image quality was compared by using subjective motion artifact level and objective contrast-to-noise ratio and vessel sharpness. Radial SSFP, compared with Cartesian SSFP and gradient-echo MR angiography, resulted in reduced motion artifacts and superior vessel sharpness. Cartesian SSFP resulted in increased motion artifacts (P < .05). Contrast-to-noise ratio with radial SSFP was lower than that with Cartesian SSFP and similar to that with the reference technique. Radial SSFP coronary MR angiography appears preferable because of improved definition of vessel borders.
Abstract:
Exposure to various pesticides has been characterized in workers and the general population, but interpretation and assessment of biomonitoring data from a health risk perspective remain an issue. For workers, a Biological Exposure Index (BEI®) has been proposed for some substances, but most BEIs are based on urinary biomarker concentrations at Threshold Limit Value - Time Weighted Average (TLV-TWA) airborne exposure, while occupational exposure can potentially occur through multiple routes, particularly by skin contact (e.g. captan, chlorpyrifos, malathion). Similarly, several biomonitoring studies have been conducted to assess environmental exposure to pesticides in different populations, but the dose estimates or health risks related to these environmental exposures (mainly through the diet) were rarely characterized. Recently, biological reference values (BRVs), in the form of urinary pesticide metabolite concentrations, have been proposed for both occupationally exposed workers and children. These BRVs were established using toxicokinetic models developed for each substance, and correspond to safe levels of absorption in humans, regardless of the exposure scenario. The purpose of this chapter is to review a toxicokinetic modeling approach used to determine biological reference values, which are then used to facilitate health risk assessment and decision-making on occupational and environmental pesticide exposures. Such models have the ability to link the absorbed dose of the parent compound to exposure biomarkers and critical biological effects. To obtain the safest BRVs for the studied population, simulations of exposure scenarios were performed using a conservative reference dose such as a no-observed-effect level (NOEL). The various examples discussed in this chapter show the importance of knowledge of urine collections (i.e. spot samples versus complete 8-h, 12-h or 24-h collections), sampling strategies, metabolism, the relative proportions of the different metabolites in urine, the absorption fraction, the route of exposure and the background contribution of prior exposures. They also show that relying on urinary measurements of specific metabolites appears more accurate when applying this approach to occupational exposures. Conversely, relying on semi-specific metabolites (metabolites common to a category of pesticides) appears more accurate for the health risk assessment of environmental exposures, given that the precise pesticides to which subjects are exposed are often unknown. In conclusion, the modeling approach to defining BRVs for relevant pesticides may be useful to public health authorities for managing health risks resulting from environmental and occupational exposures to pesticides.
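As a deliberately minimal illustration of the toxicokinetic reasoning described above (not the chapter's actual models), the sketch below derives a candidate urinary BRV by simulating first-order elimination of a metabolite after absorption of a NOEL-level dose; every parameter value is a hypothetical placeholder for a generic pesticide.

```python
import numpy as np

# Hypothetical parameters for a generic pesticide; not values from the chapter.
noel_mg_per_kg = 0.1      # no-observed-effect level
body_weight_kg = 70.0
f_abs = 0.8               # absorbed fraction (route-dependent in reality)
f_met = 0.6               # fraction of absorbed dose excreted as the metabolite
k_elim = np.log(2) / 6.0  # first-order elimination, 6-h half-life (per hour)

dose_mg = noel_mg_per_kg * body_weight_kg * f_abs   # absorbed daily dose
amount_metab = dose_mg * f_met                       # metabolite eventually in urine

# Cumulative urinary excretion over a complete 24-h collection, single dose.
t = 24.0
excreted_24h = amount_metab * (1 - np.exp(-k_elim * t))
print(round(excreted_24h, 3))  # mg/24 h: a candidate biological reference value
```

A measured 24-h excretion below this threshold would then correspond, under the model's assumptions, to an absorbed dose below the NOEL.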
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor, in particular that of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but was not proved to be the sole determinant for spatial modeling. The statistical analysis of the data, at both univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformations and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for reproducing the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted to modeling high-threshold categorization and to automation, whereas support vector machines (SVM) performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to accomplish efficient indoor radon decision-making.
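Among the exploratory tools named above, the K Nearest Neighbors (KNN) estimator is the simplest to sketch: predict each unsampled location as the mean of its k nearest measured values. The example below interpolates a synthetic radon-like field; it illustrates only the idea, not the thesis's implementation or data.

```python
import numpy as np

def knn_interpolate(xy_obs, z_obs, xy_new, k=5):
    """Predict a value at each new location as the mean of the k nearest
    observed values (the plain KNN spatial estimator)."""
    xy_obs, z_obs = np.asarray(xy_obs), np.asarray(z_obs)
    preds = []
    for p in np.asarray(xy_new):
        d = np.linalg.norm(xy_obs - p, axis=1)   # Euclidean distances
        nearest = np.argsort(d)[:k]              # indices of k closest samples
        preds.append(z_obs[nearest].mean())
    return np.array(preds)

# Synthetic "indoor radon" field: concentrations rise along the x axis.
rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(200, 2))
z = 50 + 20 * xy[:, 0] + rng.normal(0, 5, 200)   # Bq/m3, illustrative only
est = knn_interpolate(xy, z, [[2.0, 5.0], [8.0, 5.0]], k=5)
print(est)  # the point at larger x should get the higher estimate
```

The choice of k plays the role of the neighborhood parameter discussed above: small k is noisy but local, large k smooths toward the global mean.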
Abstract:
A critical issue in brain energy metabolism is whether lactate produced within the brain by astrocytes is taken up and metabolized by neurons upon activation. Although there is ample evidence that neurons can efficiently use lactate as an energy substrate, at least in vitro, few experimental data exist to indicate that it is indeed the case in vivo. To address this question, we used a modeling approach to determine which mechanisms are necessary to explain typical brain lactate kinetics observed upon activation. On the basis of a previously validated model that takes into account the compartmentalization of energy metabolism, we developed a mathematical model of brain lactate kinetics, which was applied to published data describing the changes in extracellular lactate levels upon activation. Results show that the initial dip in the extracellular lactate concentration observed at the onset of stimulation can only be satisfactorily explained by a rapid uptake within an intraparenchymal cellular compartment. In contrast, neither blood flow increase, nor extracellular pH variation can be major causes of the lactate initial dip, whereas tissue lactate diffusion only tends to reduce its amplitude. The kinetic properties of monocarboxylate transporter isoforms strongly suggest that neurons represent the most likely compartment for activation-induced lactate uptake and that neuronal lactate utilization occurring early after activation onset is responsible for the initial dip in brain lactate levels observed in both animals and humans.
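The compartmental argument above can be illustrated with a deliberately minimal rate model (not the published model, and with invented rate constants): if a cellular uptake term switches on faster than the supply of lactate ramps up, the extracellular concentration first dips below baseline and then overshoots, reproducing the qualitative shape of the observed kinetics.

```python
import numpy as np

# Toy kinetics of extracellular lactate C (arbitrary units); all rate
# constants are illustrative placeholders, not fitted parameters.
dt, T = 0.01, 60.0                      # time step and duration (s)
t = np.arange(0, T, dt)
P0, k_clear = 0.5, 0.5                  # baseline supply and clearance rates
C = np.empty_like(t)
C[0] = P0 / k_clear                     # resting steady state
for i in range(1, len(t)):
    stim = 10.0 <= t[i] < 40.0          # stimulation window
    k_uptake = 0.4 if stim else 0.0     # neuronal uptake switches on quickly
    ramp = min(1.0, (t[i] - 10.0) / 20.0) if stim else 0.0
    supply = P0 + 0.6 * ramp            # astrocytic supply rises slowly
    C[i] = C[i - 1] + dt * (supply - (k_clear + k_uptake) * C[i - 1])
baseline = C[0]
print(C.min() < baseline, C.max() > baseline)  # initial dip, later overshoot
```

The dip appears only because uptake activates faster than supply, which is exactly the mechanistic point the modeling study makes about activation-induced lactate uptake.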
Abstract:
During conventional x-ray coronary angiography, multiple projections of the coronary arteries are acquired to define coronary anatomy precisely. Due to time constraints, coronary magnetic resonance angiography (MRA) usually provides only one or two views of the major coronary vessels. A coronary MRA approach that allowed for reconstruction of arbitrary isotropic orientations might therefore be desirable. The purpose of the study was to develop a three-dimensional (3D) coronary MRA technique with isotropic image resolution in a relatively short scanning time that allows for reconstruction of arbitrary views of the coronary arteries without constraints given by anisotropic voxel size. Eight healthy adult subjects were examined using a real-time navigator-gated and corrected free-breathing interleaved echoplanar (TFE-EPI) 3D-MRA sequence. Two 3D datasets were acquired for the left and right coronary systems in each subject, one with anisotropic (1.0 x 1.5 x 3.0 mm, 10 slices) and one with "near" isotropic (1.0 x 1.5 x 1.0 mm, 30 slices) image resolution. All other imaging parameters were maintained. In all cases, the entire left main (LM) and extensive portions of the left anterior descending (LAD) and the right coronary artery (RCA) were visualized. Objective assessment of coronary vessel sharpness was similar (41% +/- 5% vs. 42% +/- 5%; P = NS) between in-plane and through-plane views with "isotropic" voxel size but differed (32% +/- 7% vs. 23% +/- 4%; P < 0.001) with nonisotropic voxel size. In reconstructed views oriented in the through-plane direction, the vessel border was 86% more defined (P < 0.01) for isotropic compared with anisotropic images. A smaller (30%; P < 0.001) improvement was seen for in-plane reconstructions. Vessel diameter measurements were view independent (2.81 +/- 0.45 mm vs. 2.66 +/- 0.52 mm; P = NS) for isotropic, but differed (2.71 +/- 0.51 mm vs. 3.30 +/- 0.38 mm; P < 0.001) between anisotropic views. 
Average scanning time was 2:31 +/- 0:57 minutes for anisotropic and 7:11 +/- 3:02 minutes for isotropic image resolution (P < 0.001). We present a new approach for "near" isotropic 3D coronary artery imaging, which allows for reconstruction of arbitrary views of the coronary arteries. The good delineation of the coronary arteries in all views suggests that isotropic 3D coronary MRA might be a preferred technique for the assessment of coronary disease, although at the expense of prolonged scan times. Comparative studies with conventional x-ray angiography are needed to investigate the clinical utility of the isotropic strategy.
Abstract:
With the dramatic increase in the volume of experimental results in every domain of the life sciences, assembling pertinent data and combining information from different fields has become a challenge. Information is dispersed over numerous specialized databases and is presented in many different formats. Rapid access to experiment-based information about well-characterized proteins helps predict the function of uncharacterized proteins identified by large-scale sequencing. In this context, universal knowledgebases play essential roles in providing access to data from complementary types of experiments and serving as hubs with cross-references to many specialized databases. This review outlines how the value of experimental data is optimized by combining high-quality protein sequences with complementary experimental results, including information derived from protein 3D structures, using as an example the UniProt knowledgebase (UniProtKB) and the tools and links provided on its website (http://www.uniprot.org/). It also discusses the precautions necessary for successful predictions and extrapolations.
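As a small practical illustration of working with UniProtKB data (a sketch based on the documented FASTA header layout, not on any code from the review), the following parses the standard `>db|Accession|EntryName` header line into its fields.

```python
import re

def parse_uniprot_header(header):
    """Parse a UniProtKB FASTA header of the documented form
    >db|Accession|EntryName ProteinName OS=Organism OX=TaxID GN=Gene PE=.. SV=..
    into a dict (a simplified sketch; tolerant of missing optional fields)."""
    m = re.match(r'>(?P<db>sp|tr)\|(?P<acc>[^|]+)\|(?P<entry>\S+)\s+(?P<rest>.*)',
                 header)
    if not m:
        raise ValueError("not a UniProtKB FASTA header")
    rest = m.group('rest')
    # Key=value fields; each value runs until the next key or end of line.
    fields = dict(re.findall(r'(OS|OX|GN|PE|SV)=(.+?)(?=\s+(?:OS|OX|GN|PE|SV)=|$)',
                             rest))
    name = re.split(r'\s+(?:OS|OX|GN|PE|SV)=', rest)[0]
    return {'db': m.group('db'), 'accession': m.group('acc'),
            'entry_name': m.group('entry'), 'protein': name, **fields}

h = (">sp|P69905|HBA_HUMAN Hemoglobin subunit alpha "
     "OS=Homo sapiens OX=9606 GN=HBA1 PE=1 SV=2")
print(parse_uniprot_header(h)['accession'])  # P69905
```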
Abstract:
For radiotherapy treatment planning of retinoblastoma in childhood, computed tomography (CT) represents the standard method for tumor volume delineation, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but presents a low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is currently used only for diagnosis and follow-up and is not integrated in treatment planning. Our ultimate goal is an automatic segmentation of the gross tumor volume (GTV) in the 3D US, segmentation of the organs at risk (OAR) in the CT, and registration of both. In this paper, we present some preliminary results in this direction. We present 3D active-contour-based segmentation of the eyeball and the lens in CT images; the presented approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, for the fusion of 3D CT and US images, we present two approaches: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eyeball contour information in the CT and US images.
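The landmark-based transformation in approach (i) can be sketched as a least-squares rigid registration of paired CT and US landmarks; the implementation below uses the Kabsch algorithm and synthetic landmark coordinates, purely as an illustration of the technique rather than the paper's actual pipeline.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    paired landmarks src -> dst, via the SVD-based Kabsch algorithm."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic eye landmarks (e.g. lens center, eyeball apex) in "CT" space,
# mapped into a rotated and translated "US" space, then recovered.
rng = np.random.default_rng(2)
ct_pts = rng.normal(size=(6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
us_pts = ct_pts @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_register(ct_pts, us_pts)
err = np.abs(ct_pts @ R.T + t - us_pts).max()
print(err)  # near zero: the transform is recovered exactly for noise-free data
```

With real, noisy landmarks the same formula gives the best rigid fit in the least-squares sense, which is why it is a common first step for multimodal fusion.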
Abstract:
For some drugs, pharmacokinetic variability is a major determinant of treatment success, since sub- or supra-therapeutic concentrations might lead to inefficacy, toxic reactions or treatment discontinuation. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target patient population and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the frame of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism in CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens with respect to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. Quantifying and identifying the sources of variability is a rational approach to making optimal dosage decisions for certain chronically administered drugs.
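As an illustration of the kind of structural model underlying such population analyses (a generic sketch, not the study's actual model or parameter estimates), a one-compartment model with first-order absorption predicts the concentration-time profile from dose and individual parameters via the Bateman equation:

```python
import numpy as np

def conc_oral_1cmt(t, dose, F, ka, CL, V):
    """Plasma concentration after a single oral dose: one-compartment model,
    first-order absorption and elimination (Bateman equation)."""
    ke = CL / V
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# Illustrative parameters loosely in the range of reported nevirapine
# population values (assumptions for the sketch, not the study's estimates).
dose, F, ka, CL, V = 200.0, 0.9, 1.6, 3.0, 80.0   # mg, -, 1/h, L/h, L
t = np.linspace(0, 24, 97)
c = conc_oral_1cmt(t, dose, F, ka, CL, V)
print(round(c[-1], 2))  # trough-like concentration 24 h post-dose, mg/L
```

In a TDM setting, individual parameters (CL, V, ka) are estimated from measured concentrations given the population priors, and the same equation is then used to simulate candidate dosage regimens against the efficacy target.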