999 results for Biodiversity modeling


Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but it was not proved to be the sole determinant for the spatial modeling.

The statistical analysis of the data, at both the univariate and the multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) was proposed as a procedure to evaluate data clustering as a function of the radon level. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was carried out with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common spatial modeling methods such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools.

In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions.

The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function used to harden the data classification. Within the classification methods, probabilistic neural networks (PNN) proved better adapted to the modeling of high-threshold categorization and to automation, whereas support vector machines (SVM) performed well under balanced category conditions.

In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making about indoor radon.
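As a concrete illustration of the kind of exploratory tool listed above, the sketch below interpolates synthetic indoor radon measurements with k-nearest neighbours and uses leave-one-out cross-validation to choose the neighbourhood size. It is not the thesis code: the data, coordinate range and candidate neighbourhood sizes are illustrative assumptions.

```python
# Minimal sketch: KNN interpolation of synthetic indoor radon data with
# leave-one-out cross-validation to select the neighbourhood size.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100_000, size=(300, 2))            # measurement coordinates [m] (synthetic)
radon = rng.lognormal(mean=4.0, sigma=0.8, size=300)   # hypothetical indoor radon [Bq/m^3]
log_radon = np.log(radon)                              # work on the log scale (skewed data)

best_k, best_score = None, -np.inf
for k in (4, 8, 16, 32):                               # candidate neighbourhood sizes (assumed)
    model = KNeighborsRegressor(n_neighbors=k, weights="distance")
    # mean negative MSE over leave-one-out folds; higher is better
    score = cross_val_score(model, xy, log_radon, cv=LeaveOneOut(),
                            scoring="neg_mean_squared_error").mean()
    if score > best_score:
        best_k, best_score = k, score
print(f"selected neighbourhood size: k = {best_k}")

# Interpolate onto a regular grid with the selected neighbourhood size
gx, gy = np.meshgrid(np.linspace(0, 100_000, 50), np.linspace(0, 100_000, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
estimate = (KNeighborsRegressor(n_neighbors=best_k, weights="distance")
            .fit(xy, log_radon).predict(grid))
print(f"grid estimate range (log Bq/m^3): {estimate.min():.2f} to {estimate.max():.2f}")
```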

Abstract:

A critical issue in brain energy metabolism is whether lactate produced within the brain by astrocytes is taken up and metabolized by neurons upon activation. Although there is ample evidence that neurons can efficiently use lactate as an energy substrate, at least in vitro, few experimental data exist to indicate that this is indeed the case in vivo. To address this question, we used a modeling approach to determine which mechanisms are necessary to explain typical brain lactate kinetics observed upon activation. On the basis of a previously validated model that takes into account the compartmentalization of energy metabolism, we developed a mathematical model of brain lactate kinetics, which was applied to published data describing the changes in extracellular lactate levels upon activation. Results show that the initial dip in the extracellular lactate concentration observed at the onset of stimulation can only be satisfactorily explained by a rapid uptake within an intraparenchymal cellular compartment. In contrast, neither the increase in blood flow nor the variation in extracellular pH can be a major cause of the initial lactate dip, whereas tissue lactate diffusion only tends to reduce its amplitude. The kinetic properties of monocarboxylate transporter isoforms strongly suggest that neurons represent the most likely compartment for activation-induced lactate uptake and that neuronal lactate utilization occurring early after activation onset is responsible for the initial dip in brain lactate levels observed in both animals and humans.
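To make the argument about the initial dip concrete, the sketch below integrates a deliberately simplified toy model (not the published compartmental model) in which activation increases neuronal Michaelis-Menten uptake immediately while astrocytic release rises only after a delay; this timing difference alone is enough to produce a dip followed by an overshoot. All parameter values are illustrative assumptions.

```python
# Toy single-pool model of extracellular lactate with delayed astrocytic release.
import numpy as np
from scipy.integrate import solve_ivp

T_ON, T_OFF = 60.0, 120.0      # stimulation window [s] (assumed)
DELAY = 15.0                   # lag of astrocytic release behind neuronal uptake [s] (assumed)

def dlac_dt(t, y):
    lac = y[0]                                            # extracellular lactate [mM]
    stim_uptake = 1.0 if T_ON <= t <= T_OFF else 0.0
    stim_release = 1.0 if (T_ON + DELAY) <= t <= (T_OFF + DELAY) else 0.0
    j_astro = 0.02 + 0.03 * stim_release                  # astrocytic release [mM/s]
    vmax_n = 0.02 + 0.04 * stim_uptake                    # neuronal uptake capacity [mM/s]
    j_neuron = vmax_n * lac / (0.5 + lac)                 # Michaelis-Menten uptake (low Km)
    j_cap = 0.01 * (lac - 1.0)                            # exchange with capillaries
    return [j_astro - j_neuron - j_cap]

# initial value 1.5 mM is the resting steady state of these (assumed) parameters
sol = solve_ivp(dlac_dt, (0.0, 300.0), [1.5], max_step=0.5)
print(f"initial dip relative to baseline: {sol.y[0].min() - 1.5:.3f} mM")
```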

Abstract:

Pharmacokinetic variability in drug levels represents, for some drugs, a major determinant of treatment success, since inadequate concentrations might lead to toxic reactions, treatment discontinuation or inefficacy. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target population of patients and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustments based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the framework of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism of CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens in relation to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. The quantification and identification of the sources of variability constitute a rational approach to making optimal dosage decisions for certain drugs administered chronically.
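The dose-concentration reasoning can be illustrated with a minimal sketch: a one-compartment oral model at steady state, with a hypothetical covariate factor reducing clearance, used to compare trough concentrations of two regimens against an illustrative target. The structural model, parameter values, covariate effect and target are assumptions, not the published NVP population model.

```python
# Steady-state troughs of a one-compartment oral model under two regimens
# and a hypothetical clearance-reducing covariate.
import numpy as np

KA = 1.6          # absorption rate constant [1/h] (assumed)
V = 100.0         # apparent volume V/F [L] (assumed)
CL_TYP = 3.0      # typical apparent clearance CL/F [L/h] (assumed)
TARGET = 3.0      # illustrative trough target [mg/L] (assumed)

def css(t, dose, tau, cl):
    """Steady-state concentration of a 1-compartment oral model (superposition)."""
    ke = cl / V
    coef = dose * KA / (V * (KA - ke))
    return coef * (np.exp(-ke * t) / (1 - np.exp(-ke * tau))
                   - np.exp(-KA * t) / (1 - np.exp(-KA * tau)))

for label, cl_factor in [("reference clearance", 1.0), ("hypothetical slow metabolizer", 0.7)]:
    cl = CL_TYP * cl_factor
    for dose, tau in [(200.0, 12.0), (400.0, 24.0)]:
        trough = css(tau, dose, tau, cl)   # concentration just before the next dose
        print(f"{label}: {dose:.0f} mg q{tau:.0f}h -> "
              f"Cmin {trough:.1f} mg/L (illustrative target >= {TARGET} mg/L)")
```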

Abstract:

Increasing anthropogenic pressures urge enhanced knowledge and understanding of the current state of marine biodiversity. This baseline information is pivotal to explore present trends, detect future modifications and propose adequate management actions for marine ecosystems. Coralligenous outcrops are a highly diverse and structurally complex deep-water habitat facing major threats in the Mediterranean Sea. Despite their ecological, aesthetic and economic value, the biodiversity patterns of coralligenous outcrops are still poorly understood. There is currently no single sampling method that has been demonstrated to be sufficiently representative to ensure adequate community assessment and monitoring in this habitat. We therefore propose a rapid, non-destructive protocol for biodiversity assessment and monitoring of coralligenous outcrops that provides good estimates of their structure and species composition, based on photographic sampling and the determination of the presence/absence of macrobenthic species. We used an extensive photographic survey covering several spatial scales (100s of m to 100s of km) within the NW Mediterranean and including 2 different coralligenous assemblages: the Paramuricea clavata assemblage (PCA) and the Corallium rubrum assemblage (CRA). This approach allowed us to determine the minimal sampling area for each assemblage (5000 cm² for PCA and 2500 cm² for CRA). In addition, we conclude that 3 replicates provide an optimal sampling effort to maximize the number of species recorded and to assess the main biodiversity patterns of the studied assemblages in variability studies requiring replicates. We contend that the proposed sampling approach provides a valuable tool for management and conservation planning, monitoring and research programs focused on coralligenous outcrops, and that it is potentially also applicable in other benthic ecosystems.
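The minimal-sampling-area reasoning can be sketched as follows on a synthetic presence/absence matrix of photo-quadrats: permute the quadrat order, accumulate species richness, and record the sampled area at which a chosen fraction of the total observed richness is reached. The quadrat size, species pool and occupancy values are assumptions for illustration only.

```python
# Species-accumulation sketch on synthetic presence/absence photo-quadrat data.
import numpy as np

rng = np.random.default_rng(1)
N_QUADRATS, N_SPECIES = 40, 120
QUADRAT_AREA_CM2 = 625                       # e.g. 25 x 25 cm photo-quadrats (assumed)

# presence/absence matrix: rarer species occur in fewer quadrats
occupancy = rng.uniform(0.02, 0.6, N_SPECIES)
pa = rng.random((N_QUADRATS, N_SPECIES)) < occupancy

def area_for_fraction(pa, frac=0.95, n_perm=200):
    """Mean sampled area needed to record `frac` of all observed species."""
    total = pa.any(axis=0).sum()
    areas = []
    for _ in range(n_perm):                  # average over random quadrat orders
        order = rng.permutation(pa.shape[0])
        seen = np.cumsum(pa[order], axis=0) > 0      # cumulative presence
        richness = seen.sum(axis=1)                   # richness after k quadrats
        k = np.argmax(richness >= frac * total) + 1   # first k reaching the fraction
        areas.append(k * QUADRAT_AREA_CM2)
    return float(np.mean(areas))

print(f"area for 95% of observed species: {area_for_fraction(pa):.0f} cm^2")
```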

Abstract:

This research work deals with the modeling and design of a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational and research tool. On the one hand, the interest in using the open mobile platform PRIM lies in integrating, in an educational context, several subjects closely related to automatic control theory, embracing communications, signal processing, sensor fusion and hardware design, amongst others. On the other hand, the idea is to implement useful navigation strategies such that the robot can serve as a mobile multimedia information point. It is in this context, when navigation strategies are oriented towards goal achievement, that a local model predictive control is developed. Hence, such studies are presented as a very interesting control strategy for developing the future capabilities of the system. In this context, the research also incorporates visual information as a meaningful source that allows the obstacle position coordinates to be detected and the obstacle-free trajectory that the robot should follow to be planned.
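A minimal sketch of the receding-horizon idea, assuming a first-order discrete wheel-speed model (not the identified PRIM model), is given below: at each step the controller evaluates candidate input sequences on a coarse grid over a short horizon, minimises tracking error plus control effort, and applies only the first input of the best sequence.

```python
# Brute-force receding-horizon speed controller for v[k+1] = a*v[k] + b*u[k].
import itertools
import numpy as np

A, B = 0.9, 0.1                       # first-order speed model (assumed values)
HORIZON = 3                           # prediction horizon (assumed)
U_GRID = np.linspace(-1.0, 1.0, 9)    # admissible normalised motor commands (assumed)
LAMBDA = 0.05                         # control-effort weight (assumed)

def mpc_step(v, v_ref):
    """Return the first input of the lowest-cost sequence over the horizon."""
    best_u, best_cost = 0.0, np.inf
    for seq in itertools.product(U_GRID, repeat=HORIZON):
        vk, cost = v, 0.0
        for u in seq:
            vk = A * vk + B * u
            cost += (vk - v_ref) ** 2 + LAMBDA * u ** 2
        if cost < best_cost:
            best_u, best_cost = seq[0], cost
    return best_u

v, v_ref = 0.0, 0.5
for k in range(30):                   # closed-loop simulation
    u = mpc_step(v, v_ref)
    v = A * v + B * u                 # plant assumed identical to the model here
print(f"speed after 30 steps: {v:.3f} (reference {v_ref})")
```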

Abstract:

The purpose of the present article is to take stock of a recent exchange in Organizational Research Methods between critics (Rönkkö & Evermann, 2013) and proponents (Henseler et al., 2014) of partial least squares path modeling (PLS-PM). The two target articles were centered on six principal issues, namely whether PLS-PM: (1) can be truly characterized as a technique for structural equation modeling (SEM); (2) is able to correct for measurement error; (3) can be used to validate measurement models; (4) accommodates small sample sizes; (5) is able to provide null hypothesis tests for path coefficients; and (6) can be employed in an exploratory, model-building fashion. We summarize and elaborate further on the key arguments underlying the exchange, drawing from the broader methodological and statistical literature in order to offer additional thoughts concerning the utility of PLS-PM and ways in which the technique might be improved. We conclude with recommendations as to whether and how PLS-PM serves as a viable contender to SEM approaches for estimating and evaluating theoretical models.