869 results for "Constraint based modeling"


Relevance:

30.00%

Publisher:

Abstract:

RÉSUMÉ Evidence that global change affects biodiversity is accumulating. The most influential drivers of this process are habitat change and destruction, the spread of invasive species, and the impact of climate change. A sound assessment of species' responses to these changes is essential for proposing measures to slow the current decline of biodiversity. Niche-based species distribution modeling (NBM) is one of the few tools available for such an assessment. Nevertheless, its application in the context of global change rests on restrictive assumptions and requires critical interpretation. This work presents a series of case studies investigating the possibilities and limitations of this approach for predicting the impact of global change. Two studies addressing threats to rare and endangered species are presented. The eco-geographic characteristics of 118 plants with high conservation priority are reviewed, and the prevalence of rarity types is analyzed in relation to their IUCN extinction risk. The review underlines the importance of conservation at the regional scale: a global assessment of rarity can be misleading for some species because it does not account for the different degrees of rarity a species exhibits at different spatial scales. The second study tests an approach for improving the sampling of rare species that includes iterative phases of modeling and field sampling. Applying the approach in conservation biology (illustrated here by the case of the alpine sea holly, Eryngium alpinum) could reduce sampling time and costs. Two studies on the impact of climate change on African fauna and flora are presented. The first evaluates the sensitivity of 277 African mammals to climate change by 2050. It shows that a substantial number of species could soon be threatened with extinction and that African national parks (mainly those located in xeric environments) might not fulfill their mandate of protecting biodiversity in the future. The second models the 2050 ranges of 975 plant species endemic to southern Africa. The study proposes the inclusion of methods that improve the prediction of climate-change-related risks, and it also proposes a method for estimating a priori a species' sensitivity to climate change from its ecological properties and the characteristics of its range. Three studies illustrate the use of these models in the study of biological invasions. The first describes the northward expansion of prickly lettuce (Lactuca serriola) in Europe in relation to climate change over the past 250 years. The second analyzes the invasion potential of spotted knapweed (Centaurea maculosa), a weed introduced into North America around 1890. The study provides evidence that an invasive species can occupy a different climatic niche after its introduction on another continent. Models based on the native range fail to predict the full extent of the invaded range but do identify potential areas of introduction.
An alternative method, in which the model is calibrated with data from both ranges where the species occurs, is proposed to improve predictions of the invasion in North America. Finally, I present a literature review on the dynamics of the ecological niche in time and space. It synthesizes recent theoretical developments concerning niche conservatism and proposes solutions to improve the reliability of predictions of the impacts of climate change and biological invasions. SUMMARY Evidence is accumulating that biodiversity is facing the effects of global change. The most influential drivers of change in ecosystems are land-use change, alien species invasions and climate change impacts. Accurate projections of species' responses to these changes are needed to propose mitigation measures to slow down the ongoing erosion of biodiversity. Niche-based models (NBM) currently represent one of the only tools for such projections. However, their application in the context of global change relies on restrictive assumptions, calling for cautious interpretation. In this thesis I aim to assess the effectiveness and shortcomings of niche-based models for the study of global change impacts on biodiversity by investigating specific, unsolved limitations and suggesting new approaches. Two studies investigating threats to rare and endangered plants are presented. I review the ecogeographic characteristics of 118 endangered plants with high conservation priority in Switzerland. The prevalence of rarity types among plant species is analyzed in relation to IUCN extinction risks. The review underlines the importance of regional versus global conservation and shows that a global assessment of rarity can be misleading for some species because it may fail to account for different degrees of rarity at a variety of spatial scales. The second study tests a modeling framework that includes iterative steps of modeling and field surveys to improve the sampling of rare species. The approach is illustrated with a rare alpine plant, Eryngium alpinum, and shows promise for complementing conservation practice and reducing sampling costs. Two studies illustrate the impacts of climate change on African taxa. The first assesses the sensitivity of 277 mammals at the African scale to climate change by 2050 in terms of species richness and turnover. It shows that a substantial number of species could be critically endangered in the future and that national parks situated in xeric ecosystems are not expected to meet their mandate of protecting current species diversity. The second study models the 2050 distributions of 975 endemic plant species in southern Africa. It proposes new methodological insights that improve the accuracy and ecological realism of predictions in global change studies, and it investigates the possibility of estimating a priori the sensitivity of a species to climate change from its geographical distribution and ecological properties. Three studies illustrate the application of NBM to the study of biological invasions. The first investigates the northward expansion of Lactuca serriola L. in Europe during the last 250 years in relation to climate change. Over the last two decades, the species has not tracked climate change, owing to non-climatic influences.
A second study analyses the potential invasion extent of spotted knapweed, a European weed first introduced into North America in the 1890s. The study provides some of the first empirical evidence that an invasive species can occupy a climatically distinct niche following its introduction into a new area. Models calibrated on the native range fail to predict the current full extent of the invasion, but correctly predict the areas of introduction. An alternative approach, involving the calibration of models with pooled data from both ranges, is proposed to improve predictions of the extent of invasion over models based solely on the native range. I finally present a review of the dynamic nature of ecological niches in space and time. It synthesizes recent theoretical developments concerning niche conservatism and proposes solutions to improve confidence in NBM predictions of the impacts of climate change and species invasions on species distributions.
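To make the core idea of a niche-based model concrete, the sketch below fits a presence/absence response to two climate covariates and re-projects the fitted niche under a warmer scenario. It is a minimal Python illustration on synthetic data, with assumed covariates, coefficients, and a 0.5 suitability threshold; the thesis itself relies on more elaborate modeling techniques and real occurrence records.

```python
# Minimal niche-based distribution model: fit presence/absence against climate
# covariates, then project the fitted niche onto a future climate.
# Synthetic data only; real NBM studies use curated occurrence records and
# ensembles of modeling techniques.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "current" climate at 1000 sites: mean temperature and annual precipitation.
temp = rng.normal(8.0, 4.0, 1000)          # deg C
precip = rng.normal(1200.0, 300.0, 1000)   # mm/yr
X_now = np.column_stack([temp, precip])

# Hypothetical species that prefers cool, wet sites (assumed "true" niche for the demo).
logit = -0.8 * (temp - 6.0) + 0.004 * (precip - 1100.0)
presence = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression().fit(X_now, presence)

# Project onto a warmer, slightly drier "2050" climate at the same sites.
X_2050 = np.column_stack([temp + 2.5, precip - 100.0])
suit_now = model.predict_proba(X_now)[:, 1]
suit_2050 = model.predict_proba(X_2050)[:, 1]

# A crude sensitivity indicator: loss of climatically suitable sites.
threshold = 0.5
n_now = int((suit_now >= threshold).sum())
n_2050 = int((suit_2050 >= threshold).sum())
print(f"suitable sites now: {n_now}, in 2050: {n_2050}, net loss: {n_now - n_2050}")
```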

Relevance:

30.00%

Publisher:

Abstract:

Predictive groundwater modeling requires accurate information about aquifer characteristics. Geophysical imaging is a powerful tool for delineating aquifer properties at an appropriate scale and resolution, but it suffers from problems of ambiguity. One way to overcome such limitations is to adopt a simultaneous multitechnique inversion strategy. We have developed a methodology for aquifer characterization based on structural joint inversion of multiple geophysical data sets followed by clustering to form zones and subsequent inversion for zonal parameters. Joint inversions based on cross-gradient structural constraints require less restrictive assumptions than, say, applying predefined petrophysical relationships and generally yield superior results. This approach has, for the first time, been applied to three geophysical data types in three dimensions. A classification scheme using maximum likelihood estimation is used to determine the parameters of a Gaussian mixture model that defines zonal geometries from joint-inversion tomograms. The resulting zones are used to estimate representative geophysical parameters of each zone, which are then used for field-scale petrophysical analysis. A synthetic study demonstrated how joint inversion of seismic and radar traveltimes and electrical resistance tomography (ERT) data greatly reduces misclassification of zones (down from 21.3% to 3.7%) and reduces the error in retrieved zonal parameters (from 1.8% to 0.3%) compared to individual inversions. We applied our scheme to a data set collected in northeastern Switzerland to delineate lithologic subunits within a gravel aquifer. The inversion models resolve three principal subhorizontal units along with some important 3D heterogeneity. Petrophysical analysis of the zonal parameters indicated approximately 30% variation in porosity within the gravel aquifer and an increasing fraction of finer sediments with depth.
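The zonation step, maximum likelihood fitting of a Gaussian mixture to co-located tomogram values followed by extraction of representative zonal parameters, can be sketched as below. The joint inversion itself is not reproduced, and the three "geophysical" parameters and zone values are synthetic stand-ins.

```python
# Sketch of the zonation step: cluster co-located geophysical parameters from
# (already inverted) tomograms with a Gaussian mixture model, then report the
# representative parameters of each zone. All values below are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Three synthetic lithologic zones, each with a typical seismic velocity (m/s),
# radar velocity (m/ns) and electrical resistivity (ohm-m).
centers = np.array([[1800.0, 0.09, 150.0],
                    [2300.0, 0.07, 400.0],
                    [2800.0, 0.06, 900.0]])
cells = np.vstack([c + rng.normal(0, [60.0, 0.004, 40.0], size=(500, 3)) for c in centers])

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(cells)
labels = gmm.predict(cells)

for k in range(3):
    vp, vr, rho = gmm.means_[k]
    n = int((labels == k).sum())
    print(f"zone {k}: {n} cells, Vp = {vp:.0f} m/s, Vradar = {vr:.3f} m/ns, rho = {rho:.0f} ohm-m")
```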

Relevance:

30.00%

Publisher:

Abstract:

Exposure to various pesticides has been characterized in workers and the general population, but the interpretation and assessment of biomonitoring data from a health risk perspective remains an issue. For workers, a Biological Exposure Index (BEI®) has been proposed for some substances, but most BEIs are based on urinary biomarker concentrations at Threshold Limit Value - Time Weighted Average (TLV-TWA) airborne exposure, while occupational exposure can potentially occur through multiple routes, particularly by skin contact (e.g., captan, chlorpyrifos, malathion). Similarly, several biomonitoring studies have been conducted to assess environmental exposure to pesticides in different populations, but dose estimates or health risks related to these environmental exposures (mainly through the diet) have rarely been characterized. Recently, biological reference values (BRVs) in the form of urinary pesticide metabolites have been proposed for both occupationally exposed workers and children. These BRVs were established using toxicokinetic models developed for each substance, and correspond to safe levels of absorption in humans regardless of the exposure scenario. The purpose of this chapter is to review the toxicokinetic modeling approach used to determine biological reference values, which are then used to facilitate health risk assessments and decision-making on occupational and environmental pesticide exposures. Such models can link the absorbed dose of the parent compound to exposure biomarkers and critical biological effects. To obtain the safest BRVs for the studied population, simulations of exposure scenarios were performed using a conservative reference dose such as a no-observed-effect level (NOEL). The various examples discussed in this chapter show the importance of knowledge about urine collection (i.e., spot samples and complete 8-h, 12-h or 24-h collections), sampling strategies, metabolism, the relative proportions of the different metabolites in urine, the absorption fraction, the route of exposure and the background contribution of prior exposures. They also show that relying on urinary measurements of specific metabolites appears more accurate when applying this approach to occupational exposures. Conversely, relying on semi-specific metabolites (metabolites common to a category of pesticides) appears more appropriate for the health risk assessment of environmental exposures, given that the precise pesticides to which subjects are exposed are often unknown. In conclusion, the modeling approach used to define BRVs for the relevant pesticides may be useful to public health authorities for managing issues related to health risks resulting from environmental and occupational exposures to pesticides.
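The logic of deriving a biological reference value from a toxicokinetic model can be illustrated with a deliberately simple sketch: simulate urinary metabolite excretion at a dose equal to a reference dose and read off the corresponding biomarker level. All parameters (absorption and elimination rate constants, absorbed and metabolized fractions, the NOEL) are hypothetical; the published BRVs rest on substance-specific, multi-compartment models.

```python
# Toy derivation of a biological reference value (BRV): a one-compartment
# toxicokinetic model is run at a dose equal to a reference dose (e.g., a NOEL)
# and the 24-h urinary metabolite excretion is taken as the BRV.
# All parameter values are hypothetical.
import numpy as np

def urinary_metabolite_24h(dose_umol, f_abs=0.8, f_met=0.6, ka=1.0, ke=0.3, hours=24):
    """Cumulative metabolite excreted in urine (umol) over `hours`.
    First-order absorption (ka, 1/h) and elimination (ke, 1/h); a fraction
    f_met of the eliminated amount appears in urine as the measured metabolite."""
    t = np.linspace(0.0, hours, 2401)
    dt = t[1] - t[0]
    # Amount in the body after a single oral dose (Bateman equation, ka != ke).
    body = f_abs * dose_umol * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))
    eliminated_rate = ke * body                       # umol/h leaving the body
    return float(np.sum(eliminated_rate) * dt * f_met)

noel_umol_per_kg = 1.5     # hypothetical reference dose
body_weight_kg = 70.0
brv = urinary_metabolite_24h(noel_umol_per_kg * body_weight_kg)
print(f"BRV (24-h urinary metabolite at the reference dose): {brv:.1f} umol")
```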

Relevance:

30.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor, and in particular the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but geology was not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. Existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase was accompanied by the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis followed a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized to cope with the stationarity conditions of geostatistical models. Common spatial modeling methods such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. An approach of increasing method complexity, from simpler to more complex methods, was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for reproducing the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Within the classification methods, probabilistic neural networks (PNN) proved better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions.
In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient decision-making on indoor radon.
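As a small illustration of one of the exploratory tools mentioned above, the sketch below applies K Nearest Neighbors regression with inverse-distance weighting to map indoor radon from scattered measurements. Locations and radon levels are synthetic; the thesis combines such tools with variography, GRNN, classification, and simulation methods.

```python
# K Nearest Neighbors interpolation of indoor radon from scattered measurements.
# Data are synthetic: a smooth regional trend plus noise, modeled on the log scale
# because radon concentrations are strongly skewed.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)

# Measurement locations (km) and log radon concentrations (log Bq/m3).
xy = rng.uniform(0, 100, size=(800, 2))
log_rn = 4.0 + 0.01 * xy[:, 0] + 0.5 * np.sin(xy[:, 1] / 15.0) + rng.normal(0, 0.4, 800)

knn = KNeighborsRegressor(n_neighbors=10, weights="distance").fit(xy, log_rn)

# Predict on a regular grid and back-transform to Bq/m3.
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
rn_map = np.exp(knn.predict(grid)).reshape(gx.shape)
print(f"predicted indoor radon range: {rn_map.min():.0f} to {rn_map.max():.0f} Bq/m3")
```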

Relevance:

30.00%

Publisher:

Abstract:

Pharmacokinetic variability in drug levels represents, for some drugs, a major determinant of treatment success, since sub- or supra-therapeutic concentrations might lead to inefficacy, toxic reactions or treatment discontinuation. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach represents a very useful tool for describing the dose-concentration relationship, quantifying variability in the target patient population and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustments based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the framework of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism of CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens with respect to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. The quantification and identification of the sources of variability is a rational approach to making optimal dosage decisions for certain drugs administered chronically.
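The way such a population model is then used for dosage simulations can be sketched with a one-compartment model whose clearance carries between-subject variability and a genotype covariate effect. All numbers (typical clearance, variability, genotype effect, target trough) are illustrative assumptions rather than the published nevirapine parameters, and the absorption phase is neglected.

```python
# Model-based comparison of dosage regimens with a one-compartment model:
# clearance varies log-normally between subjects and is reduced in carriers of
# a hypothetical slow-metabolizer genotype. All parameter values are assumed.
import numpy as np

def trough_ss(dose_mg, tau_h, cl_l_h, v_l=90.0, f_bio=0.93):
    """Steady-state trough (mg/L) of a one-compartment model, absorption phase
    neglected: Cmin = (F*D/V) * exp(-k*tau) / (1 - exp(-k*tau)), with k = CL/V."""
    k = cl_l_h / v_l
    return (f_bio * dose_mg / v_l) * np.exp(-k * tau_h) / (1.0 - np.exp(-k * tau_h))

rng = np.random.default_rng(3)
n = 1000
cl_pop = 3.0                                  # typical clearance, L/h (assumed)
slow = rng.random(n) < 0.15                   # carriers of a slow-metabolizer genotype
cl = cl_pop * np.exp(rng.normal(0, 0.3, n)) * np.where(slow, 0.65, 1.0)

target = 3.0                                  # efficacy-related trough, mg/L (assumed)
for dose, tau, label in [(200, 12, "200 mg twice daily"), (400, 24, "400 mg once daily")]:
    trough = trough_ss(dose, tau, cl)
    pct = 100.0 * np.mean(trough >= target)
    print(f"{label}: {pct:.0f}% of simulated patients above the target trough")
```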

Relevance:

30.00%

Publisher:

Abstract:

Identifying the geographic distribution of populations is a basic, yet crucial step in many fundamental and applied ecological projects, as it provides key information on which many subsequent analyses depend. However, this task is often costly and time consuming, especially where rare species are concerned and where most sampling designs generally prove inefficient. At the same time, rare species are those for which distribution data are most needed for their conservation to be effective. To enhance fieldwork sampling, model-based sampling (MBS) uses predictions from species distribution models: when looking for the species in areas of high habitat suitability, the chances of finding them should be higher. We thoroughly tested the efficiency of MBS by conducting a large survey in the Swiss Alps, assessing the detection rate of three rare and five common plant species. For each species, habitat suitability maps were produced following an ensemble modeling framework combining two spatial resolutions and two modeling techniques. We tested the efficiency of MBS and the accuracy of our models by sampling 240 sites in the field (30 sites × 8 species). Across all species, the MBS approach proved effective. In particular, the MBS design led to the discovery of six presence sites for one rare plant, increasing the chance of finding this species from 0% to 50%. For common species, MBS doubled the rate of new population discovery compared to random sampling. Habitat suitability maps obtained by combining four individual modeling methods predicted the species' distributions well, and more accurately than the individual models. In conclusion, using MBS for fieldwork could efficiently help increase our knowledge of rare species distributions. More generally, we recommend using habitat suitability models to support conservation plans.
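The sketch below captures the essence of model-based sampling: rank candidate sites by an ensemble habitat suitability score and spend the survey budget on the top-ranked ones. The data, model errors, and ensemble weights are synthetic assumptions; the study's ensembles combine two resolutions and two modeling techniques.

```python
# Model-based sampling (MBS) vs. random sampling on synthetic data: survey the
# sites ranked highest by an ensemble suitability score and compare detections.
import numpy as np

rng = np.random.default_rng(4)
n_sites = 5000

# True (unknown) occupancy of a rare species, driven by a latent suitability.
true_suit = rng.beta(1.0, 8.0, n_sites)
occupied = rng.binomial(1, 0.3 * true_suit)

# Two imperfect habitat suitability models and their weighted ensemble.
model_a = np.clip(true_suit + rng.normal(0, 0.10, n_sites), 0, 1)
model_b = np.clip(true_suit + rng.normal(0, 0.15, n_sites), 0, 1)
ensemble = 0.6 * model_a + 0.4 * model_b     # weights by model skill (assumed)

budget = 30                                  # field visits we can afford
mbs_sites = np.argsort(ensemble)[-budget:]
random_sites = rng.choice(n_sites, budget, replace=False)

print(f"MBS detections:    {occupied[mbs_sites].sum()} / {budget}")
print(f"random detections: {occupied[random_sites].sum()} / {budget}")
```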

Relevance:

30.00%

Publisher:

Abstract:

Changes of functional connectivity in prodromal and early Alzheimer's disease can arise from compensatory and/or pathological processes. We hypothesized that i) there is impairment of effective inhibition associated with early Alzheimer's disease that may lead to ii) a paradoxical increase of functional connectivity. To this end we analyzed effective connectivity in 14 patients and 16 matched controls using dynamic causal modeling of functional MRI time series recorded during a visual inter-hemispheric integration task. By contrasting co-linear with non-co-linear bilateral gratings, we estimated inhibitory top-down effects within the visual areas. The anatomical areas constituting the functional network of interest were identified with categorical functional MRI contrasts (Stimuli > Baseline and Co-linear gratings > Non-co-linear gratings), which implicated V1 and V3v in both hemispheres. A model with reciprocal excitatory intrinsic connections linking these four regions and modulatory inhibitory effects exerted by V3v on V1 optimally explained the functional MRI time series in both subject groups. However, Alzheimer's disease was associated with significantly weakened intrinsic and modulatory connections. Top-down inhibitory effects, previously detected as relative deactivations of V1 in young adults, were observed neither in our aged controls nor in patients. We conclude that effective inhibition weakens with age and more so in early Alzheimer's disease.
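For readers unfamiliar with dynamic causal modeling, the toy below integrates only the bilinear neural state equation for two coupled regions and shows how a modulatory inhibitory connection from V3v to V1 damps the V1 response. Connection strengths are illustrative; a full DCM additionally includes a hemodynamic model and Bayesian model inversion (as implemented, for example, in SPM).

```python
# Toy integration of the bilinear neural state equation used in dynamic causal
# modeling, dz/dt = (A + u_mod*B) z + C*u_stim, for two regions (V1, V3v).
# It only illustrates how a modulatory inhibitory V3v -> V1 influence damps the
# V1 response; all values are illustrative.
import numpy as np

A = np.array([[-1.0, 0.4],     # intrinsic (fixed) connections, 1/s
              [ 0.6, -1.0]])
B = np.array([[0.0, -0.8],     # modulation: co-linear gratings strengthen the
              [0.0,  0.0]])    # inhibitory V3v -> V1 influence
C = np.array([1.0, 0.0])       # visual input drives V1

def simulate(modulated, dt=0.01, t_end=4.0):
    n = int(t_end / dt)
    z = np.zeros(2)
    trace = np.zeros((n, 2))
    for i in range(n):
        t = i * dt
        u_stim = 1.0 if 0.5 <= t <= 2.5 else 0.0          # stimulus on/off
        u_mod = 1.0 if (modulated and u_stim) else 0.0    # modulatory input
        z = z + dt * ((A + u_mod * B) @ z + C * u_stim)   # forward Euler step
        trace[i] = z
    return trace

peak_plain = simulate(False)[:, 0].max()
peak_mod = simulate(True)[:, 0].max()
print(f"peak V1 activity without modulation: {peak_plain:.2f}")
print(f"peak V1 activity with V3v->V1 inhibition: {peak_mod:.2f}")
```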

Relevance:

30.00%

Publisher:

Abstract:

Compartmental and physiologically based toxicokinetic modeling coupled with Monte Carlo simulation were used to quantify the impact of biological variability (physiological, biochemical, and anatomical parameters) on the values of a series of bio-indicators of exposure to metals and organic industrial chemicals. A variability extent index was derived and the main parameters affecting each biological indicator were identified. Results show a large diversity in interindividual variability across the different categories of biological indicators examined. Measurement of the unchanged substance in blood, alveolar air, or urine is much less variable than the measurement of metabolites, whether in blood or urine. In most cases, alveolar ventilation and cardiac output were identified as the prime parameters determining biological variability, suggesting the influence of workload intensity on the absorbed dose of inhaled chemicals.
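A minimal Monte Carlo sketch of this idea is shown below: physiological parameters are sampled from assumed distributions, pushed through a simple steady-state inhalation model, and the spread of the resulting blood level is summarized together with its correlation to each parameter. The model and parameter distributions are illustrative, not those used in the study.

```python
# Monte Carlo propagation of physiological variability into a blood biomarker:
# a minimal steady-state inhalation model in which arterial concentration
# depends on alveolar ventilation, the blood:air partition coefficient, and
# hepatic clearance (well-stirred liver fed by a quarter of cardiac output).
# Parameter distributions are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n = 20000

# Physiological parameters for a 70-kg adult at rest (spreads assumed).
q_alv = rng.lognormal(np.log(300.0), 0.2, n)    # alveolar ventilation, L/h
q_co = rng.lognormal(np.log(330.0), 0.2, n)     # cardiac output, L/h
cl_int = rng.lognormal(np.log(100.0), 0.4, n)   # intrinsic metabolic clearance, L/h
p_blood_air = 10.0                              # blood:air partition coefficient
c_air = 0.2                                     # exposure concentration, mg/L air

q_liver = 0.25 * q_co                                  # hepatic blood flow
cl_hep = q_liver * cl_int / (q_liver + cl_int)         # well-stirred liver model
c_blood = q_alv * c_air / (q_alv / p_blood_air + cl_hep)

p5, p50, p95 = np.percentile(c_blood, [5, 50, 95])
print(f"steady-state blood level: median {p50:.2f} mg/L, P95/P5 = {p95 / p5:.2f}")
for name, par in [("alveolar ventilation", q_alv), ("cardiac output", q_co),
                  ("intrinsic clearance", cl_int)]:
    r = np.corrcoef(par, c_blood)[0, 1]
    print(f"correlation with {name}: {r:+.2f}")
```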

Relevance:

30.00%

Publisher:

Abstract:

Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and the real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a very large number of terrain features accounting for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, which makes the use of classical geostatistics cumbersome. The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speed find applications in renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
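A reduced version of this regionalization idea, support vector regression of wind speed on coordinates plus a few terrain descriptors, is sketched below with synthetic stations. The feature choices, kernel settings, and the underlying "true" wind field are assumptions; the thesis derives multiscale DEM features and optimizes model complexity explicitly.

```python
# Kernel-based regression of wind speed on geographic coordinates and terrain
# features, in the spirit of the SVM framework described above. Stations and
# terrain descriptors are synthetic.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 600

x, y = rng.uniform(0, 200, n), rng.uniform(0, 150, n)          # km
elevation = rng.uniform(300, 3500, n)                          # m
curvature = rng.normal(0, 1, n)                                # convex > 0
ridge_exposure = rng.uniform(0, 1, n)                          # 1 = exposed crest

# Synthetic "true" wind speed: faster at high, exposed, convex sites.
wind = (2.0 + 0.0015 * elevation + 1.5 * ridge_exposure
        + 0.8 * np.clip(curvature, 0, None) + rng.normal(0, 0.6, n))

X = np.column_stack([x, y, elevation, curvature, ridge_exposure])
X_tr, X_te, w_tr, w_te = train_test_split(X, wind, test_size=0.25, random_state=0)

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.2))
svr.fit(X_tr, w_tr)
rmse = np.sqrt(np.mean((svr.predict(X_te) - w_te) ** 2))
print(f"hold-out RMSE: {rmse:.2f} m/s")
```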

Relevance:

30.00%

Publisher:

Abstract:

Excitation-continuous music instrument control patterns are often not explicitly represented in current sound synthesis techniques when applied to automatic performance. Both physical model-based and sample-based synthesis paradigms would benefit from a flexible and accurate instrument control model, enabling the improvement of naturalness and realism. We present a framework for modeling bowing control parameters in violin performance. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing control parameter signals. We model the temporal contour of bow velocity, bow pressing force, and bow-bridge distance as sequences of short Bézier cubic curve segments. Considering different articulations, dynamics, and performance contexts, a number of note classes are defined. Contours of bowing parameters in a performance database are analyzed at note level by following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes in consideration. As a result, contour analysis of the bowing parameters of each note yields an optimal representation vector that is sufficient for reconstructing the original contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures suitable for both the analysis and synthesis of bowing parameter contours. By using the estimated models, synthetic contours can be generated through a bow planning algorithm able to reproduce possible constraints caused by the finite length of the bow. Rendered contours are successfully used in two preliminary synthesis frameworks: digital waveguide-based bowed-string physical modeling and sample-based spectral-domain synthesis.
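The representation of a bowing-parameter contour by cubic Bézier segments can be illustrated for a single segment: the endpoints are pinned to the first and last samples of the contour and the two inner control points are obtained by least squares. The contour below is synthetic, and the note-level grammar and segment sequencing of the paper are not reproduced.

```python
# Fit one cubic Bezier segment to a sampled bow-velocity contour: endpoints are
# fixed, the two inner control points are solved by least squares against the
# Bernstein basis. The contour is synthetic.
import numpy as np

# Synthetic bow-velocity contour over one note (cm/s vs. normalized time).
t = np.linspace(0.0, 1.0, 60)
velocity = 40.0 * np.sin(np.pi * t) ** 0.8 + np.random.default_rng(7).normal(0, 1.0, t.size)

p0, p3 = velocity[0], velocity[-1]                 # fixed endpoints
basis1 = 3 * (1 - t) ** 2 * t                      # Bernstein basis for P1
basis2 = 3 * (1 - t) * t ** 2                      # Bernstein basis for P2
residual = velocity - ((1 - t) ** 3 * p0 + t ** 3 * p3)
(p1, p2), *_ = np.linalg.lstsq(np.column_stack([basis1, basis2]), residual, rcond=None)

bezier = (1 - t) ** 3 * p0 + basis1 * p1 + basis2 * p2 + t ** 3 * p3
rmse = np.sqrt(np.mean((bezier - velocity) ** 2))
print(f"control points: {p0:.1f}, {p1:.1f}, {p2:.1f}, {p3:.1f}; fit RMSE = {rmse:.2f} cm/s")
```

The four control-point values per parameter per segment are the kind of compact representation vector on which a Gaussian mixture model can then be trained.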

Relevance:

30.00%

Publisher:

Abstract:

The work described in this report documents the activities performed for the evaluation, development, and enhancement of the Iowa Department of Transportation (DOT) pavement condition information as part of its pavement management system operation. The study covers all of the Iowa DOT's interstate and primary National Highway System (NHS) and non-NHS routes. A new pavement condition rating system that provides a consistent, unified approach to rating pavements in Iowa is proposed. The proposed 100-point rating system is based on five individual indices derived from specific distress data and pavement properties, and an overall pavement condition index, PCI-2, that combines the individual indices using weighting factors. The individual indices cover cracking, ride, rutting, faulting, and friction. The Cracking Index is formed by combining cracking data (transverse, longitudinal, wheel-path, and alligator cracking indices). The ride, rutting, and faulting indices utilize the International Roughness Index (IRI), rut depth, and fault height, respectively.
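The structure of such a composite index, individual 0-100 indices combined through weighting factors, is sketched below. The weights are placeholders chosen only for illustration; the actual factors and the formulas for each individual index are those defined in the report.

```python
# Structure of the composite rating: each distress-specific index is on a
# 0-100 scale and PCI-2 is their weighted combination. Weights are placeholders.
from typing import Dict

WEIGHTS: Dict[str, float] = {        # hypothetical weights, must sum to 1.0
    "cracking": 0.40,
    "ride": 0.25,
    "rutting": 0.15,
    "faulting": 0.10,
    "friction": 0.10,
}

def pci2(indices: Dict[str, float]) -> float:
    """Combine individual 0-100 condition indices into the overall PCI-2."""
    missing = set(WEIGHTS) - set(indices)
    if missing:
        raise ValueError(f"missing indices: {sorted(missing)}")
    return sum(WEIGHTS[k] * indices[k] for k in WEIGHTS)

segment = {"cracking": 72.0, "ride": 85.0, "rutting": 90.0, "faulting": 100.0, "friction": 65.0}
print(f"PCI-2 = {pci2(segment):.1f}")
```

Swapping in the report's calibrated weights and index definitions would make the same structure directly usable on network-level distress data.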

Relevance:

30.00%

Publisher:

Abstract:

We present a framework for modeling right-hand gestures in bowed-string instrument playing, applied to the violin. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing gesture parameter cues. We model the temporal contour of bow transversal velocity, bow pressing force, and bow-bridge distance as sequences of short segments, in particular Bézier cubic curve segments. Considering different articulations, dynamics, and contexts, a number of note classes is defined. Gesture parameter contours of a performance database are analyzed at note level by following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes under consideration. Based on dynamic programming, gesture parameter contour analysis provides an optimal curve parameter vector for each note. The information present in such a parameter vector is sufficient for reconstructing the original gesture parameter contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures, suitable for both analysis and synthesis of bowing gesture parameter contours. We show the potential of the model by synthesizing bowing gesture parameter contours from an annotated input score. Finally, we point out promising applications and developments.

Relevance:

30.00%

Publisher:

Abstract:

Pluripotency in human embryonic stem cells (hESCs) and induced pluripotent stem cells (iPSCs) is regulated by three transcription factors: OCT3/4, SOX2, and NANOG. To fully exploit the therapeutic potential of these cells it is essential to have a good mechanistic understanding of the maintenance of self-renewal and pluripotency. In this study, we demonstrate a powerful systems biology approach in which we first expand a literature-based network encompassing the core regulators of pluripotency by assessing the behavior of genes targeted by perturbation experiments. We focused our attention on highly regulated genes encoding cell surface and secreted proteins, as these can be more easily manipulated by the use of inhibitors or recombinant proteins. Qualitative modeling, combining Boolean networks and in silico perturbation experiments, was employed to identify novel pluripotency-regulating genes. We validated Interleukin-11 (IL-11) and demonstrate that this cytokine is a novel pluripotency-associated factor capable of supporting self-renewal in the absence of exogenously added bFGF in culture. To date, the various protocols for hESC maintenance require supplementation with bFGF to activate the Activin/Nodal branch of the TGFβ signaling pathway. Additional evidence supporting our findings is that IL-11 belongs to the same protein family as LIF, which is known to be necessary for maintaining pluripotency in mouse but not in human ESCs. These cytokines operate through the same gp130 receptor, which interacts with Janus kinases. Our finding might explain why mESCs are in a more naïve cell state compared to hESCs and how to convert primed hESCs back to the naïve state. Taken together, our integrative modeling approach has identified novel genes as putative candidates to be incorporated into the expansion of the current gene regulatory network responsible for inducing and maintaining pluripotency.
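The in silico perturbation idea can be illustrated with a toy Boolean network: a small core circuit is updated synchronously and a "knockout" clamps one node to False, after which the resulting attractor is inspected. The update rules and the generic SIGNAL input below are purely illustrative and are not the curated network used in the study.

```python
# Toy Boolean-network perturbation: synchronous updates of a small pluripotency
# core (OCT4, SOX2, NANOG) plus one external signal; a knockout clamps a node
# to False. Rules are illustrative, not the study's curated network.
from typing import Callable, Dict, Optional

Rules = Dict[str, Callable[[Dict[str, bool]], bool]]

rules: Rules = {
    "OCT4":   lambda s: (s["SOX2"] and s["NANOG"]) or s["SIGNAL"],
    "SOX2":   lambda s: s["OCT4"] or s["NANOG"],
    "NANOG":  lambda s: s["OCT4"] and s["SOX2"] and s["SIGNAL"],
    "SIGNAL": lambda s: s["SIGNAL"],          # treated as a constant input
}

def run(state: Dict[str, bool], knockout: Optional[str] = None, steps: int = 20) -> Dict[str, bool]:
    """Synchronous updates; an optional knockout keeps one node False."""
    state = dict(state)
    for _ in range(steps):
        new = {node: rule(state) for node, rule in rules.items()}
        if knockout:
            new[knockout] = False
        if new == state:                      # reached a fixed point
            break
        state = new
    return state

start = {"OCT4": True, "SOX2": True, "NANOG": True, "SIGNAL": True}
print("baseline attractor:", run(start))
print("SIGNAL knockout:   ", run(start, knockout="SIGNAL"))
print("NANOG knockout:    ", run(start, knockout="NANOG"))
```

In this toy circuit, removing the external signal collapses the whole core, while a NANOG knockout leaves OCT4 and SOX2 sustained by the signal; scanning such perturbations over a larger curated network is what singles out candidate regulators.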

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Hierarchical modeling has been proposed as a solution to the multiple exposure problem. We estimate associations between metabolic syndrome and different components of antiretroviral therapy using both conventional and hierarchical models. STUDY DESIGN AND SETTING: We use discrete-time survival analysis to estimate the association between metabolic syndrome and cumulative exposure to 16 antiretrovirals from four drug classes. We fit a hierarchical model in which the drug class provides a prior model of the association between metabolic syndrome and exposure to each antiretroviral. RESULTS: One thousand two hundred and eighteen patients were followed for a median of 27 months, with 242 cases of metabolic syndrome (20%) occurring at a rate of 7.5 cases per 100 patient-years. Metabolic syndrome was more likely to develop in patients exposed to stavudine, but less likely to develop in those exposed to atazanavir. The estimate for exposure to atazanavir increased from a hazard ratio of 0.06 per 6 months' use in the conventional model to 0.37 in the hierarchical model (or from 0.57 to 0.81 when using spline-based covariate adjustment). CONCLUSION: These results are consistent with trials that show the disadvantage of stavudine and the advantage of atazanavir relative to other drugs in their respective classes. The hierarchical model gave more plausible results than the equivalent conventional model.

Relevance:

30.00%

Publisher:

Abstract:

Surface geological mapping, laboratory measurements of rock properties, and seismic reflection data are integrated through three-dimensional seismic modeling to determine the likely cause of upper crustal reflections and to elucidate the deep structure of the Penninic Alps in eastern Switzerland. Results indicate that the principal upper crustal reflections recorded at the south end of Swiss seismic line NFP20-EAST can be explained by the subsurface geometry of stacked basement nappes. In addition, modeling results provide improvements to structural maps based solely on surface trends and suggest the presence of previously unrecognized rock units in the subsurface. Construction of the initial model is based upon extrapolation of plunging surface structures; velocities and densities are established by laboratory measurements of the corresponding rock units. Iterative modification produces a best-fit model that refines the definition of the subsurface geometry of the major structures. We conclude that most reflections from the upper 20 km can be ascribed to the presence of sedimentary cover rocks (especially carbonates) and ophiolites juxtaposed against crystalline basement nappes. Thus, in this area, reflections appear to be principally due to first-order lithologic contrasts. This study also demonstrates not only the importance of three-dimensional effects (sideswipe) in interpreting seismic data, but also that these effects can be considered quantitatively through three-dimensional modeling.
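The link between laboratory rock properties and reflectivity can be made explicit with normal-incidence reflection coefficients, R = (Z2 - Z1) / (Z2 + Z1), computed from acoustic impedances Z = velocity × density. The velocities and densities below are illustrative stand-ins, not the paper's laboratory values.

```python
# Normal-incidence reflection coefficients from acoustic impedance contrasts,
# the basic quantity behind "first-order lithologic contrasts" producing
# reflections. Rock property values are illustrative.
layers = [
    # (name, P-wave velocity in m/s, density in kg/m3): illustrative values
    ("crystalline basement (gneiss)", 5900.0, 2700.0),
    ("carbonate cover (dolomite)",    6500.0, 2800.0),
    ("ophiolite (metabasalt)",        6800.0, 3000.0),
    ("crystalline basement (gneiss)", 5900.0, 2700.0),
]

for (top, v1, rho1), (bottom, v2, rho2) in zip(layers, layers[1:]):
    z1, z2 = v1 * rho1, v2 * rho2                    # acoustic impedances
    rc = (z2 - z1) / (z2 + z1)                       # reflection coefficient
    print(f"{top} / {bottom}: R = {rc:+.3f}")
```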