114 results for Data Streams Distribution


Relevance:

30.00%

Publisher:

Abstract:

In recent years, multi-atlas fusion methods have gained significant attention in medical image segmentation. In this paper, we propose a general Markov Random Field (MRF) based framework that can perform edge-preserving smoothing of the labels at the time of fusing the labels itself. More specifically, we formulate the label fusion problem with MRF-based neighborhood priors, as an energy minimization problem containing a unary data term and a pairwise smoothness term. We present how existing fusion methods such as majority voting, global weighted voting and local weighted voting can be reframed to profit from the proposed framework, generating segmentations that are both more accurate and more contiguous, free of holes and islands. The proposed framework is evaluated for segmenting lymph nodes in 3D head and neck CT images. A comparison of various fusion algorithms is also presented.
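As a rough illustration of the energy formulation described above, the sketch below fuses synthetic atlas votes using a unary log-vote data term and a Potts pairwise smoothness term, optimized with iterated conditional modes (ICM). The ICM optimizer, the Potts form, and all parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal sketch of MRF-based label fusion (illustrative, not the paper's code).
# votes[k, i, j]: number of atlases assigning label k to pixel (i, j).
# Unary term: negative log of the vote fraction (reduces to majority voting).
# Pairwise term: Potts penalty encouraging neighboring pixels to share a label.

def mrf_label_fusion(votes, beta=0.8, n_iters=10, eps=1e-6):
    n_labels, h, w = votes.shape
    frac = votes / np.maximum(votes.sum(axis=0), 1)
    unary = -np.log(frac + eps)                  # data term per label/pixel
    labels = frac.argmax(axis=0)                 # init = plain majority voting
    for _ in range(n_iters):                     # ICM: greedy local updates
        for i in range(h):
            for j in range(w):
                nbrs = [labels[x, y]
                        for x, y in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= x < h and 0 <= y < w]
                costs = unary[:, i, j].copy()
                for k in range(n_labels):        # Potts smoothness penalty
                    costs[k] += beta * sum(k != n for n in nbrs)
                labels[i, j] = costs.argmin()
    return labels

# Example: fuse 5 noisy binary "atlas" segmentations of a square.
rng = np.random.default_rng(0)
truth = np.zeros((32, 32), dtype=int); truth[8:24, 8:24] = 1
atlases = [(truth ^ (rng.random(truth.shape) < 0.15)).astype(int) for _ in range(5)]
votes = np.stack([sum(a == k for a in atlases) for k in (0, 1)]).astype(float)
fused = mrf_label_fusion(votes)
```

Compared with the majority-voting initialization, the pairwise term removes isolated misclassified pixels, which is the "holes and islands" effect the abstract refers to.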

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND AIMS: In critically ill patients, fractional hepatic de novo lipogenesis increases in proportion to carbohydrate administration during isoenergetic nutrition. In this study, we sought to determine whether this increase may be the consequence of continuous enteral nutrition and bed rest. We therefore measured fractional hepatic de novo lipogenesis in a group of 12 healthy subjects during near-continuous oral feeding (hourly isoenergetic meals with a liquid formula containing 55% carbohydrate). In eight subjects, near-continuous enteral nutrition and bed rest were applied over a 10 h period; in the other four subjects, they were extended to 34 h. Fractional hepatic de novo lipogenesis was measured by infusing (13)C-labeled acetate and monitoring VLDL-(13)C palmitate enrichment with mass isotopomer distribution analysis. Fractional hepatic de novo lipogenesis was 3.2% (range 1.5-7.5%) in the eight subjects after 10 h of near-continuous nutrition and 1.6% (range 1.3-2.0%) in the four subjects after 34 h of near-continuous nutrition and bed rest. This indicates that continuous nutrition and physical inactivity do not increase hepatic de novo lipogenesis. Fractional hepatic de novo lipogenesis previously reported in critically ill patients under similar nutritional conditions (9.3%, range 5.3-15.8%) was markedly higher than in healthy subjects (P<0.001). These data from healthy subjects indicate that the elevated fractional hepatic de novo lipogenesis of critically ill patients is attributable to their illness rather than to continuous nutrition or bed rest.
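For readers unfamiliar with mass isotopomer distribution analysis (MIDA), the sketch below works through the standard MIDA arithmetic for palmitate, treated as a polymer of eight acetyl subunits labeled binomially from a precursor pool. The input enrichments are hypothetical; this is not the study's analysis code.

```python
from math import comb

# Sketch of the MIDA arithmetic used to derive fractional de novo lipogenesis
# from VLDL-palmitate 13C enrichment. Newly synthesized palmitate carries a
# binomial distribution of labeled acetyl units with precursor enrichment p.
# The excess M+1/M+2 enrichments below are illustrative, not study data.

n = 8  # acetyl subunits per palmitate

def precursor_enrichment(em2_over_em1):
    # EM2/EM1 = [C(8,2) p^2 (1-p)^6] / [C(8,1) p (1-p)^7] = 3.5 * p / (1 - p)
    r = em2_over_em1 / (comb(n, 2) / comb(n, 1))
    return r / (1 + r)

def fractional_synthesis(em1, p):
    # f = observed excess M+1 / theoretical excess M+1 of a fully new pool
    em1_max = comb(n, 1) * p * (1 - p) ** (n - 1)
    return em1 / em1_max

em1, em2 = 0.010, 0.0018          # hypothetical excess enrichments
p = precursor_enrichment(em2 / em1)
f = fractional_synthesis(em1, p)
print(f"precursor enrichment p = {p:.3f}, fractional DNL = {100 * f:.1f}%")
```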

Relevance:

30.00%

Publisher:

Abstract:

Biological invasions and land-use changes are two major causes of the global modifications of biodiversity. Habitat suitability models are the tools of choice to predict potential distributions of invasive species. Although land-use is a key driver of alien species invasions, it is often assumed that land-use is constant in time. Here we combine historical and present-day information to evaluate whether land-use changes could explain the dynamics of invasion of the American bullfrog Rana catesbeiana (= Lithobates catesbeianus) in Northern Italy, from the 1950s to the present day. We used Maxent to build habitat suitability models on the basis of past (1960s, 1980s) and present-day data on land-use and species distribution. For example, we used models built using the 1960s data to predict distribution in the 1980s, and so on. Furthermore, we used land-use scenarios to project suitability into the future. Habitat suitability models predicted the spread of bullfrogs in the subsequent temporal step well. Models considering land-use changes predicted invasion dynamics better than models assuming constant land-use over the last 50 years. Scenarios of future land-use suggest that suitability will remain similar in the coming years. Habitat suitability models can help to understand and predict the dynamics of invasions; however, land-use is not constant in time: land-use modifications can strongly affect invasions; furthermore, both land management and the suitability of a given land-use class may vary in time. Integrating land-use changes into studies of biological invasions can help to improve management strategies.
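The train-on-one-decade, project-onto-the-next workflow can be sketched as follows. A logistic regression stands in for Maxent here (Maxent itself is a separate tool), and the predictor names, coefficients, and simulated data are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch of the temporal-projection workflow: fit on past land use and
# occurrences, then project suitability onto later land-use layers.

rng = np.random.default_rng(1)

def make_layers(n, wetland_shift):
    # Two predictors per cell: wetland cover and urban cover (fractions).
    wetland = np.clip(rng.beta(2, 5, n) + wetland_shift, 0, 1)
    urban = rng.beta(2, 8, n)
    return np.column_stack([wetland, urban])

X_1960 = make_layers(2000, 0.00)             # past land use
X_1980 = make_layers(2000, -0.05)            # later land use (wetlands lost)

# Simulated 1960s presences: bullfrogs favor wetlands, avoid urban areas.
logit = 4 * X_1960[:, 0] - 3 * X_1960[:, 1] - 1
y_1960 = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X_1960, y_1960)
suitability_1980 = model.predict_proba(X_1980)[:, 1]   # projection step
print("mean projected suitability, 1980s layers:", suitability_1980.mean().round(3))
```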

Relevance:

30.00%

Publisher:

Abstract:

The International Molecular Exchange (IMEx) consortium is an international collaboration between major public interaction data providers to share literature-curation efforts and make a nonredundant set of protein interactions available in a single search interface on a common website (http://www.imexconsortium.org/). Common curation rules have been developed, and a central registry is used to manage the selection of articles to enter into the dataset. We discuss the advantages of such a service to the user, our quality-control measures and our data-distribution practices.

Relevance:

30.00%

Publisher:

Abstract:

Peroxisome proliferator-activated receptors (PPARs) are members of the nuclear hormone receptor superfamily that can be activated by various xenobiotics and natural fatty acids. These transcription factors primarily regulate genes involved in lipid metabolism and also play a role in adipocyte differentiation. We present the expression patterns of the PPAR subtypes in the adult rat, determined by in situ hybridization using specific probes for PPAR-alpha, -beta and -gamma, and by immunohistochemistry using a polyclonal antibody that recognizes the three rat PPAR subtypes. In numerous cell types from either ectodermal, mesodermal, or endodermal origin, PPARs are coexpressed, with relative levels varying between them from one cell type to the other. PPAR-alpha is highly expressed in hepatocytes, cardiomyocytes, enterocytes, and the proximal tubule cells of kidney. PPAR-beta is expressed ubiquitously and often at higher levels than PPAR-alpha and -gamma. PPAR-gamma is expressed predominantly in adipose tissue and the immune system. Our results suggest new potential directions to investigate the functions of the different PPAR subtypes.

Relevance:

30.00%

Publisher:

Abstract:

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be effective, practical and robust means for enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth. Our results therefore indicate that spectral balancing techniques in general and spectral blueing in particular represent simple, yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
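A minimal sketch of band-limited spectral balancing on a single trace is shown below: the amplitude spectrum is flattened (whitening) or tilted toward higher frequencies (blueing) inside the signal band while the phase is preserved. The linear blueing ramp and all parameter values are assumptions for illustration, not the processing flow actually used in the study.

```python
import numpy as np

# Sketch of band-limited spectral balancing of a single georadar trace.

def spectral_balance(trace, dt, f_lo, f_hi, blue_slope=0.0):
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    amp, phase = np.abs(spec), np.angle(spec)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    # Whitening: flatten the amplitude spectrum inside the signal band.
    target = np.zeros_like(amp)
    target[band] = 1.0
    # Blueing: optionally tilt the target spectrum toward higher frequencies.
    target[band] *= 1.0 + blue_slope * (freqs[band] - f_lo) / (f_hi - f_lo)
    balanced = target * amp.mean() * np.exp(1j * phase)   # keep the phase
    balanced[~band] = spec[~band]     # leave out-of-band energy untouched
    return np.fft.irfft(balanced, n=len(trace))

# Example: a 100 MHz Ricker-like wavelet sampled at 0.8 ns.
dt = 0.8e-9
t = np.arange(512) * dt - 100e-9
trace = (1 - 2 * (np.pi * 1e8 * t) ** 2) * np.exp(-(np.pi * 1e8 * t) ** 2)
out = spectral_balance(trace, dt, f_lo=2e7, f_hi=2e8, blue_slope=0.5)
```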

Relevance:

30.00%

Publisher:

Abstract:

1. Landscape modification is often considered the principal cause of population decline in many bat species. Thus, schemes for bat conservation rely heavily on knowledge about species-landscape relationships. So far, however, few studies have quantified the possible influence of landscape structure on large-scale spatial patterns in bat communities. 2. This study presents quantitative models that use landscape structure to predict (i) spatial patterns in overall community composition and (ii) individual species' distributions through canonical correspondence analysis and generalized linear models, respectively. A geographical information system (GIS) was then used to draw up maps of (i) overall community patterns and (ii) distribution of potential species' habitats. These models relied on field data from the Swiss Jura mountains. 3. Eight descriptors of landscape structure accounted for 30% of the variation in bat community composition. For some species, more than 60% of the variance in distribution could be explained by landscape structure. Elevation, forest or woodland cover, lakes and suburbs were the most frequent predictors. 4. This study shows that community composition in bats is related to landscape structure through species-specific relationships to resources. Due to their nocturnal activities and the difficulties of remote identification, a comprehensive bat census is rarely possible, and we suggest that predictive modelling of the type described here provides an indispensable conservation tool.
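The GLM step of such an analysis might look like the following sketch: a binomial GLM with a logit link relating simulated presences to landscape descriptors named after those in the abstract. The data, coefficients, and variable coding are illustrative, not the study's.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the GLM step: predicting a bat species' presence from landscape
# descriptors (binomial GLM, logit link). Data are simulated.

rng = np.random.default_rng(2)
n = 500
elevation = rng.uniform(300, 1600, n)          # m a.s.l.
forest = rng.uniform(0, 1, n)                  # fractional cover
lakes = rng.binomial(1, 0.2, n)                # lake present in cell
suburbs = rng.uniform(0, 1, n)

eta = -2 + 0.002 * elevation + 2.5 * forest - 1.5 * suburbs + 0.8 * lakes
presence = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(float)

X = sm.add_constant(np.column_stack([elevation, forest, lakes, suburbs]))
glm = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
print(glm.params)  # fitted coefficients, mapped back onto the landscape in a GIS
```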

Relevance:

30.00%

Publisher:

Abstract:

The sensitivity of altitudinal and latitudinal tree-line ecotones to climate change, particularly that of temperature, has received much attention. To improve our understanding of the factors affecting tree-line position, we used the spatially explicit dynamic forest model TreeMig. Although well-suited because of its landscape dynamics functions, TreeMig features a parabolic temperature growth response curve, which has recently been questioned, and its species parameters are not specifically calibrated for cold temperatures. Our main goals were to improve the theoretical basis of the temperature growth response curve in the model and to develop a method for deriving that curve's parameters from tree-ring data. We replaced the parabola with an asymptotic curve, calibrated for the main species at the subalpine (Swiss Alps: Pinus cembra, Larix decidua, Picea abies) and boreal (Fennoscandia: Pinus sylvestris, Betula pubescens, P. abies) tree-lines. After fitting new parameters, the growth curve matched observed tree-ring widths better. For the subalpine species, the minimum degree-day sum allowing growth (kDDMin) was lowered by around 100 degree-days; in the case of Larix, the maximum potential ring-width was increased to 5.19 mm. At the boreal tree-line, the kDDMin for P. sylvestris was lowered by 210 degree-days and its maximum ring-width increased to 2.943 mm; for Betula (new in the model) kDDMin was set to 325 degree-days and the maximum ring-width to 2.51 mm; the values from the only boreal sample site for Picea were similar to the subalpine ones, so the same parameters were used. However, adjusting the growth response alone did not improve the model's output concerning species' distributions and their relative importance at the tree-line. Minimum winter temperature (MinWiT, mean of the coldest winter month), which controls seedling establishment in TreeMig, proved more important for determining distribution. Picea, P. sylvestris and Betula did not previously have minimum winter temperature limits, so these values were set to the 95th percentile of each species' coldest MinWiT site (-7, -11 and -13, respectively). In a case study for the Alps, the original and newly calibrated versions of TreeMig were compared with biomass data from the National Forest Inventory (NFI). Both models gave similar, reasonably realistic results. In conclusion, this method of deriving temperature responses from tree-rings works well. However, regeneration and its underlying factors seem more important for controlling species' distributions than previously thought. More research on regeneration ecology, especially at the upper limit of forests, is needed to further improve predictions of tree-line responses to climate change.
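A sketch of the asymptotic growth response is given below. The saturating-exponential form and the shape constant k are assumptions for illustration (the abstract only states that the parabola was replaced by an asymptotic curve), while the Betula pubescens parameter values (kDDMin = 325 degree-days, maximum ring width 2.51 mm) follow the abstract.

```python
import numpy as np

# Sketch of an asymptotic temperature growth response: ring width rises with
# the growing-season degree-day sum above a species-specific threshold kDDMin
# and saturates toward the maximum potential ring width. Functional form and
# the shape constant k are illustrative assumptions.

def ring_width(dd_sum, kdd_min, rw_max, k=1e-3):
    # Zero growth below the degree-day threshold, saturating above it.
    excess = np.maximum(dd_sum - kdd_min, 0.0)
    return rw_max * (1.0 - np.exp(-k * excess))

# Betula pubescens at the boreal tree-line (parameter values from the abstract):
dd = np.linspace(0, 2000, 5)
print(ring_width(dd, kdd_min=325, rw_max=2.51))   # ring widths in mm
```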

Relevance:

30.00%

Publisher:

Abstract:

Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence-environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence-environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building 'underfit' models, having insufficient flexibility to describe observed occurrence-environment relationships, we risk misunderstanding the factors shaping species distributions. By building 'overfit' models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing underfitting with overfitting and, consequently, how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
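The under-/overfitting trade-off can be made concrete with the sketch below, which fits the same simulated occurrence-environment data with increasingly flexible response shapes and scores them by cross-validated AUC. Polynomial degree is only a stand-in for the broader notion of model complexity discussed here; the data are simulated with a unimodal niche, so intermediate flexibility should win.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Sketch of the complexity trade-off in an occurrence-environment model.
rng = np.random.default_rng(3)
temp = rng.uniform(-2, 2, 400)                       # standardized gradient
p_true = 1 / (1 + np.exp(-(1.5 - 3 * temp ** 2)))    # unimodal niche
y = (rng.random(400) < p_true).astype(int)
X = temp.reshape(-1, 1)

for degree in (1, 2, 6):                             # under-, well-, over-fit
    model = make_pipeline(PolynomialFeatures(degree),
                          LogisticRegression(C=1e3, max_iter=1000))
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"degree {degree}: CV AUC = {auc:.3f}")
```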

Relevance:

30.00%

Publisher:

Abstract:

Herpes simplex ocular infection is a major cause of corneal blindness. Local antiviral treatments exist but are associated with corneal toxicity, and resistance has become an issue. We evaluated the biodistribution and efficacy of a humanized anti-herpes simplex virus (anti-HSV) IgG Fab fragment (AC-8; 53 kDa) following repeated topical administration. AC-8 was found in the corneal epithelium, anterior stroma, subepithelial stromal cells, and retinal glial cells, with preferential entry through the ocular limbus. AC-8 was active against 13 different strains of HSV-1, with 50% and 90% mean effective concentrations (MEC50 and MEC90, respectively) ranging from 0.03 to 0.13 μg/ml, indicating broad-spectrum activity. The in vivo efficacy of AC-8 was evaluated in a mouse model of herpes-induced ocular disease. Treatment with low-dose AC-8 (1 mg/ml) slightly reduced the ocular disease scores. A greater reduction of the disease scores was observed in the 10-mg/ml AC-8-treated group, but not as much as with trifluridine (TFT). AC-8 treatment reduced viral titers, although less effectively than TFT. AC-8 did not display any toxicity to the cornea or other structures in the eye. In summary, topical instillation of an anti-HSV Fab can be used on both intact and ulcerated corneas. It is well tolerated and does not alter reepithelialization. Further studies to improve the antiviral effect are needed for AC-8 to be considered for therapeutic use.
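Effective concentrations such as MEC50 and MEC90 are typically obtained by fitting a dose-response curve. The sketch below fits a Hill equation to hypothetical plaque-reduction-style data; it illustrates the general calculation, not the assay analysis used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of estimating 50%/90% effective concentrations from a dose-response
# curve via a Hill fit. Concentrations and responses are hypothetical.

def hill(c, ec50, slope):
    # Fraction of viral growth inhibited at antibody concentration c (ug/ml).
    return c ** slope / (ec50 ** slope + c ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])        # ug/ml
inhib = np.array([0.08, 0.30, 0.55, 0.80, 0.93, 0.98])   # fraction inhibited

(ec50, slope), _ = curve_fit(hill, conc, inhib, p0=[0.1, 1.0])
ec90 = ec50 * 9 ** (1 / slope)     # solve hill(c) = 0.9 for c
print(f"MEC50 ~ {ec50:.3f} ug/ml, MEC90 ~ {ec90:.3f} ug/ml")
```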

Relevance:

30.00%

Publisher:

Abstract:

Prepro-RFRP-containing neurons have recently been described in the mammalian brain. These neurons are found only in the tuberal hypothalamus. In this work, we provide a detailed analysis of the distribution of cells expressing the RFRP mRNA, which we found in seven anatomical structures of the tuberal hypothalamus. No co-expression with melanin-concentrating hormone (MCH) or hypocretin (Hcrt), which are also expressed in neurons of the tuberal hypothalamus, was observed. Using the BrdU method, we found that all RFRP cell bodies are generated between E13 and E14. Thus, RFRP neurons form a specific cell population with a complex distribution pattern in the tuberal hypothalamus, yet they are generated in a single peak. These observations are discussed in relation to data on the distribution and genesis of the MCH and Hcrt cell populations, which are also distributed in the tuberal hypothalamus.

Relevance:

30.00%

Publisher:

Abstract:

A factor limiting preliminary rockfall hazard mapping at the regional scale is often the lack of knowledge of potential source areas. Nowadays, high-resolution topographic data (LiDAR) can account for realistic landscape details even at large scale. With such fine-scale morphological variability, quantitative geomorphometric analyses become a relevant approach for delineating potential rockfall instabilities. Using the digital elevation model (DEM)-based "slope families" concept over areas of similar lithology, together with the cliff and scree zones available from the 1:25,000 topographic map, a rockfall hazard susceptibility map was drawn up for the canton of Vaud, Switzerland, in order to provide a relevant hazard overview. Slope surfaces steeper than morphometrically defined threshold angles were considered rockfall source zones. 3D modelling (CONEFALL) was then applied to each of the estimated source zones in order to assess the maximum runout length. Comparisons with known events and other rockfall hazard assessments show good agreement, demonstrating that it is possible to assess rockfall activity over large areas from DEM-based parameters and topographical elements.
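The morphometric thresholding step can be sketched as below: slope angles are computed from a DEM and cells steeper than a threshold are flagged as candidate source zones. The toy DEM and the 45° threshold are illustrative assumptions; the study derives lithology-specific thresholds from "slope family" distributions, and runout is modelled separately with CONEFALL.

```python
import numpy as np

# Sketch of DEM-based delineation of potential rockfall source zones.

def slope_angle_deg(dem, cell_size):
    dz_dy, dz_dx = np.gradient(dem, cell_size)         # first-order gradients
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Toy DEM: a 250 m high Gaussian ridge sampled on a 25 m grid.
x = np.linspace(0, 1000, 41)
dem = 250 * np.exp(-((x[None, :] - 500) / 150) ** 2) * np.ones((41, 1))

slopes = slope_angle_deg(dem, cell_size=25.0)
sources = slopes > 45.0                                # candidate source cells
print(f"{sources.sum()} candidate source cells")
```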

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. 
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proved highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
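A gradual-deformation proposal can be sketched as a prior-preserving rotation between the current model and an independent prior realization. Because the move preserves a Gaussian prior (the same mechanism as the preconditioned Crank-Nicolson proposal), the Metropolis acceptance test reduces to a likelihood ratio. The toy forward model and all parameter values below are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

# Sketch of gradual deformation as an MCMC proposal: a rotation between the
# current Gaussian field m and an independent realization u preserves the
# prior mean/covariance for any angle theta, so the acceptance test only
# needs the likelihood ratio. Forward model and "data" are toys.

rng = np.random.default_rng(4)

def gradual_deformation(m, theta, rng):
    u = rng.standard_normal(m.shape)            # independent prior realization
    return m * np.cos(theta) + u * np.sin(theta)

def log_likelihood(m, d_obs, sigma=0.1):
    d_pred = m[:10]                             # trivial forward model (toy)
    return -0.5 * np.sum((d_pred - d_obs) ** 2) / sigma ** 2

m = rng.standard_normal(200)                    # current model (white prior)
d_obs = 0.5 * np.ones(10)
ll = log_likelihood(m, d_obs)
theta = 0.2                                     # perturbation strength
for it in range(2000):                          # MCMC chain
    m_prop = gradual_deformation(m, theta, rng)
    ll_prop = log_likelihood(m_prop, d_obs)
    if np.log(rng.random()) < ll_prop - ll:     # prior-preserving move
        m, ll = m_prop, ll_prop
print("final log-likelihood:", round(ll, 2))
```

The angle theta tunes the perturbation strength directly, which is the flexibility the abstract highlights relative to sequential resampling.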

Relevance:

30.00%

Publisher:

Abstract:

The monocarboxylate transporter MCT2 belongs to a large family of membrane proteins involved in the transport of lactate, pyruvate and ketone bodies. Although its expression in rodent brain has been well documented, the presence of MCT2 in the human brain has been questioned on the basis of low mRNA abundance. In this study, the distribution of the monocarboxylate transporter MCT2 was investigated in the cortex of normal adult human brain using an immunohistochemical approach. Widespread neuropil staining in all cortical layers was observed by light microscopy. This distribution was very similar across the three different cortical areas investigated. At the cellular level, the expression of MCT2 could be observed in a large number of neurons, in fibers in both grey and white matter, as well as in some astrocytes, mostly localized in layer I and in the white matter. Double staining experiments combined with confocal microscopy confirmed the neuronal expression but also suggested that synaptic MCT2 expression is preferentially postsynaptic. A few astrocytes in the grey matter appeared to exhibit MCT2 labelling, but at low levels. Electron microscopy revealed strong MCT2 expression at asymmetric synapses in the postsynaptic density and also within the spine head, but not in the presynaptic terminal. These data not only demonstrate neuronal MCT2 expression in the human brain, but, since a portion of it exhibits a distinct synaptic localization, they further support a putative role for MCT2 in adjusting energy supply to levels of activity.

Relevance:

30.00%

Publisher:

Abstract:

Predictive species distribution modelling (SDM) has become an essential tool in biodiversity conservation and management. The choice of grain size (resolution) of environmental layers used in modelling is one important factor that may affect predictions. We applied 10 distinct modelling techniques to presence-only data for 50 species in five different regions, to test whether: (1) a 10-fold coarsening of resolution affects the predictive performance of SDMs, and (2) any observed effects are dependent on the type of region, modelling technique, or species considered. Results show that a 10-fold change in grain size does not severely affect predictions from species distribution models. The overall trend is towards degradation of model performance, but improvement can also be observed. Changing grain size does not equally affect models across regions, techniques, and species types. The strongest effects are on regions and species types, with tree species in the data sets (regions) with the highest locational accuracy being most affected. Changing grain size had little influence on the ranking of techniques: boosted regression trees remain best at both resolutions. The number of occurrences used for model training had an important effect, with larger sample sizes resulting in better models, which tended to be more sensitive to grain. The effect of grain change was only noticeable for models reaching sufficient performance and/or with initial data that have an intrinsic error smaller than the coarser grain size.
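The grain-coarsening experiment can be sketched as follows: aggregate an environmental raster by a factor of 10 using block means, re-extract predictor values at the occurrence points, and refit the same model at both resolutions. The simulated raster, sample sizes, and the logistic model are illustrative stand-ins for the 10 techniques compared in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Sketch of a 10-fold grain-size coarsening experiment with simulated data.
rng = np.random.default_rng(5)
env = rng.normal(size=(500, 500)).cumsum(0).cumsum(1)   # smooth-ish gradient
env = (env - env.mean()) / env.std()

def coarsen(raster, factor=10):
    # Block-mean aggregation: one coarse cell per factor x factor block.
    h, w = raster.shape
    return raster[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

env_coarse = coarsen(env)

# Simulated occurrences driven by the fine-grained environment.
rows, cols = rng.integers(0, 500, 1000), rng.integers(0, 500, 1000)
p = 1 / (1 + np.exp(-2 * env[rows, cols]))
y = (rng.random(1000) < p).astype(int)

for grain, raster, scale in (("fine", env, 1), ("coarse", env_coarse, 10)):
    x = raster[rows // scale, cols // scale].reshape(-1, 1)
    auc = roc_auc_score(y, LogisticRegression().fit(x, y).predict_proba(x)[:, 1])
    print(f"{grain} grain: AUC = {auc:.3f}")
```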