961 results for density-dependent space use


Relevance: 30.00%

Abstract:

In the past, culvert pipes were made only of corrugated metal or reinforced concrete. In recent years, several manufacturers have made pipe of lightweight plastic, for example high-density polyethylene (HDPE), which is considered viscoelastic in its structural behavior. There appear to be several highway applications in which HDPE pipe would be an economically favorable alternative. However, the newness of plastic pipe requires evaluation of its performance, integrity, and durability. A review of the Iowa Department of Transportation Standard Specifications for Highway and Bridge Construction reveals limited information on the use of plastic pipe for state projects. The objective of this study was to review and evaluate the use of HDPE pipe in roadway applications. Structural performance, soil-structure interaction, and the sensitivity of the pipe to installation were investigated. Comprehensive computerized literature searches were undertaken to define the state of the art in the design and use of HDPE pipe in highway applications. A questionnaire was developed and sent to all Iowa county engineers to learn of their use of HDPE pipe. Responses indicated that the majority of county engineers were aware of the product but were not confident in its ability to perform as well as conventional materials. Counties currently using HDPE pipe generally restrict it to driveway crossings. Originally, we intended to survey states on their usage of HDPE pipe; however, a few weeks after initiation of the project, it was learned that the Tennessee DOT was in the process of making a similar survey of state DOTs. Results of the Tennessee survey of states have been obtained and included in this report. In an effort to develop more confidence in the pipe's performance parameters, this research included laboratory tests to determine the ring and flexural stiffness of HDPE pipe provided by various manufacturers. Parallel-plate tests verified that all specimens were in compliance with ASTM specifications. Flexural testing revealed that pipe profile has a significant effect on longitudinal stiffness and that strength cannot be accurately predicted on the basis of diameter alone. Recognizing that the soil around a buried HDPE pipe contributes to the pipe's stiffness, the research team completed a limited series of tests on buried 3-ft-diameter HDPE pipe. The tests simulated the effects of truck wheel loads above the pipe and were conducted with two feet of cover. These tests indicated that the type and quality of backfill significantly influence the performance of HDPE pipe: the soil envelope clearly affects in-situ performance, but beyond a certain point no additional strength is gained by increasing the quality of the backfill.
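
Where the laboratory program above refers to ring stiffness from parallel-plate tests, the underlying calculation is the standard one from ASTM D2412. The sketch below illustrates it with made-up numbers; the load value and the use of the inside radius are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the ASTM D2412 parallel-plate pipe-stiffness calculation; the
# load value and radius below are illustrative, not results from this study.

def pipe_stiffness(load_per_length, deflection):
    """Pipe stiffness PS = F / delta_y (lbf per inch of pipe length, per inch of deflection)."""
    return load_per_length / deflection

def stiffness_factor(ps, mean_radius):
    """Stiffness factor SF = E*I = 0.149 * r**3 * PS."""
    return 0.149 * mean_radius ** 3 * ps

inside_diameter = 36.0                    # in, e.g. a 3-ft-diameter pipe
deflection_5pct = 0.05 * inside_diameter  # PS is evaluated at 5% deflection
load = 42.0                               # lbf per inch of pipe length at 5% deflection (made up)

ps = pipe_stiffness(load, deflection_5pct)
sf = stiffness_factor(ps, inside_diameter / 2.0)
print(f"PS = {ps:.1f} psi, stiffness factor EI = {sf:.0f} lbf*in^2")
```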

Relevance: 30.00%

Abstract:

The objective of this work was to evaluate the use of basic density and pulp yield, and their correlations with some chemical parameters, in order to differentiate a homogeneous eucalyptus tree population in terms of its potential for pulp production or other technological applications. Basic density and kraft pulp yield were determined for 120 Eucalyptus globulus trees, and the values were plotted as frequency distributions. Homogenized samples from the first and fourth density quartiles and from the first and fourth yield quartiles were analyzed for total phenols, total sugars, and methoxyl groups. Syringyl/guaiacyl (S/G) and syringaldehyde/vanillin (S/V) ratios were determined on the kraft lignins from wood of the same quartiles. The results show the similarity between samples from the high-density and low-yield quartiles, both with lower S/G (3.88-4.12) and S/V (3.99-4.09) ratios and higher total phenols (13.3-14.3 g gallic acid kg-1). Wood from the high-yield quartile is statistically distinguished from all the others by its higher S/G (5.15) and S/V (4.98) ratios and lower total phenols (8.7 g gallic acid kg-1). Methoxyl group content and total sugars are more suitable for distinguishing wood samples of lower density.
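
As an illustration of the quartile-based comparison described above, the following sketch splits trees into basic-density and pulp-yield quartiles and compares mean chemical parameters between the extreme density quartiles. The file and column names are hypothetical, not taken from the study.

```python
# Illustrative quartile split and group comparison; data file and columns are hypothetical.
import pandas as pd

trees = pd.read_csv("eucalyptus_trees.csv")  # hypothetical: one row per tree
trees["density_q"] = pd.qcut(trees["basic_density"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
trees["yield_q"] = pd.qcut(trees["kraft_pulp_yield"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Compare chemistry between the first and fourth density quartiles.
extremes = trees[trees["density_q"].isin(["Q1", "Q4"])]
summary = extremes.groupby("density_q", observed=True)[
    ["total_phenols", "total_sugars", "methoxyl", "sg_ratio"]
].mean()
print(summary)
```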

Relevance: 30.00%

Abstract:

Aging is associated with an increased risk of depression in humans. To elucidate the underlying mechanisms of depression and its dependence on aging, here we study signs of depression in male SAMP8 mice. For this purpose, we used the forced swimming test (FST). The total floating time in the FST was greater in SAMP8 than in SAMR1 mice at 9 months of age; however, this difference was not observed in 12-month-old mice, when both strains are considered elderly. Of the two strains, only the SAMP8 animals responded to imipramine treatment. We also applied the dexamethasone suppression test (DST) and studied changes in the dopamine and serotonin (5-HT) uptake systems, the 5-HT2a/2c receptor density in the cortex, and levels of TPH2. The DST showed a significant difference between SAMR1 and SAMP8 mice at old age. SAMP8 mice exhibited an increase in 5-HT transporter density, with only slight changes in 5-HT2a/2c receptor density. In conclusion, SAMP8 mice presented depression-like behavior that is dependent on the senescence process, since it differs from SAMR1, the senescence-resistant strain.

Relevance: 30.00%

Abstract:

This study evaluated the use of electromagnetic gauges to determine the adjusted densities of hot mix asphalt (HMA) pavements. Field measurements were taken with two electromagnetic gauges, the Pavement Quality Indicator (PQI) 301 and the PaveTracker Plus 2701B. Seven projects were included in the study, each with three to five consecutive paving days. For each day/lot, 20 randomly selected locations were tested along with seven core locations. The analysis of PaveTracker and PQI densities consisted of determining which factors are statistically significant, examining core density residuals, and performing a regression analysis of core density as a function of PaveTracker and PQI readings. The following key conclusions can be stated: 1. Core density, traffic, and binder content were all found to be significant for both electromagnetic gauges studied; 2. Core density residuals are normally distributed and centered at zero for both electromagnetic gauges; 3. For PaveTracker readings, statistically one third of the lots do not have an intercept of zero and two thirds of the lots do not rule out a scalar correction factor of zero; 4. For PQI readings, the 95% confidence interval rules out the intercept being zero for all seven projects, and six of the seven projects do not rule out the scalar correction factor being zero; 5. The PQI 301 gauge should not be used for quality control or quality assurance; and 6. The PaveTracker 2701B gauge can be used for quality control but not quality assurance. With the limited sample size, the adjusted density equations for both electromagnetic gauges were determined to be inadequate. The PaveTracker Plus 2701B was determined to be better than the PQI 301. The PaveTracker 2701B could still be applicable for quality assurance if the number of core locations per day is reduced and supplemented with additional PaveTracker 2701B readings. Further research should be done to determine the minimum number of core locations needed to calibrate the gauges each day/lot and the number of additional PaveTracker 2701B readings required.
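
The per-lot regression described above (core density as a function of a gauge reading, with the intercept and the scalar correction factor checked against zero via confidence intervals) can be sketched as follows; the file and column names are assumptions, not the study's data.

```python
# Minimal sketch of the per-lot regression of core density on a gauge reading;
# file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

lot = pd.read_csv("lot_readings.csv")        # hypothetical: one row per test location
X = sm.add_constant(lot["pavetracker"])      # gauge readings plus an intercept column
fit = sm.OLS(lot["core_density"], X).fit()

print(fit.params)                 # intercept and slope (scalar correction factor)
print(fit.conf_int(alpha=0.05))   # 95% CIs: does the intercept CI include 0? does the slope CI include 0?
```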

Relevance: 30.00%

Abstract:

OBJECTIVES: This study aimed to measure the lipophilicity and ionization constants of diastereoisomeric dipeptides, to interpret them in terms of conformational behavior, and to develop statistical models to predict them. METHODS: A series of 20 dipeptides of general structure NH2-L-X-(L or D)-His-OMe was designed and synthesized. Their experimental ionization constants (pK1, pK2, and pK3) and lipophilicity parameters (log PN and log D7.4) were measured by potentiometry. Molecular modeling in three media (vacuum, water, and chloroform) was used to explore and sample their conformational space and, for each stored conformer, to calculate the radius of gyration, virtual log P (written log P(MLP), i.e., obtained by the molecular lipophilicity potential (MLP) method), and polar surface area (PSA). Means and ranges were calculated for these properties, as was their sensitivity (i.e., the ratio between the property range and the number of rotatable bonds). RESULTS: Marked differences between diastereoisomers were seen in their experimental ionization constants and lipophilicity parameters. These differences are explained by molecular flexibility, configuration-dependent differences in intramolecular interactions, and accessibility of functional groups. Multiple linear equations correlated the experimental lipophilicity parameters and ionization constants with the PSA range and other calculated parameters. CONCLUSION: This study documents the differences in lipophilicity and ionization constants between diastereoisomeric dipeptides. Such configuration-dependent differences are shown to depend markedly on differences in conformational behavior and to be amenable to multiple linear regression. Chirality 24:566-576, 2012. © 2012 Wiley Periodicals, Inc.
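
A small illustration of the ensemble descriptors defined above, i.e. the mean, the range, and the sensitivity (range divided by the number of rotatable bonds) of a conformer-dependent property such as the PSA. The numbers are invented for the sketch.

```python
# Illustrative calculation of mean, range, and sensitivity over a conformer ensemble;
# the property values and bond count are made up.
import numpy as np

psa_conformers = np.array([78.3, 85.1, 92.6, 88.0, 81.4])  # PSA of stored conformers (hypothetical)
n_rotatable_bonds = 7                                       # hypothetical for one dipeptide

psa_mean = psa_conformers.mean()
psa_range = psa_conformers.max() - psa_conformers.min()
psa_sensitivity = psa_range / n_rotatable_bonds

print(f"mean PSA = {psa_mean:.1f}, range = {psa_range:.1f}, sensitivity = {psa_sensitivity:.2f}")
```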

Relevance: 30.00%

Abstract:

BACKGROUND: The risk of osteoporosis and fracture influences the selection of adjuvant endocrine therapy. We analyzed bone mineral density (BMD) in Swiss patients of the Breast International Group (BIG) 1-98 trial [treatment arms: A, tamoxifen (T) for 5 years; B, letrozole (L) for 5 years; C, 2 years of T followed by 3 years of L; D, 2 years of L followed by 3 years of T]. PATIENTS AND METHODS: Dual-energy X-ray absorptiometry (DXA) results were collected retrospectively. Patients without DXA served as the control group. Repeated-measures models using covariance structures that allow for unequal intervals between DXA scans were used to estimate changes in BMD. Prospectively defined covariates were considered as fixed effects in the multivariable models. RESULTS: Two hundred sixty-one of 546 patients had one or more DXA scans, with 577 lumbar and 550 hip measurements. Weight, height, prior hormone replacement therapy, and hysterectomy were positively correlated with BMD; the correlation was negative for the letrozole arms (B/C/D versus A), known osteoporosis, time on trial, age, chemotherapy, and smoking. Treatment did not influence the occurrence of osteoporosis (T-score < -2.5 standard deviations). CONCLUSIONS: All aromatase inhibitor regimens reduced BMD. The sequential schedules were as detrimental to bone density as L monotherapy.
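
A rough sketch of a repeated-measures analysis in the spirit of the one described above, using a mixed model with a random intercept per patient as a stand-in for the authors' covariance-structure models; column names are hypothetical and this is not the trial's actual code.

```python
# Stand-in repeated-measures analysis: mixed model with random intercept per patient.
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

bmd = pd.read_csv("dxa_measurements.csv")   # hypothetical: one row per DXA scan
model = smf.mixedlm(
    "lumbar_bmd ~ letrozole + weight + height + age + months_on_trial + chemotherapy + smoking",
    data=bmd,
    groups=bmd["patient_id"],
)
print(model.fit().summary())
```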

Relevance: 30.00%

Abstract:

CONTEXT: In the Health Outcomes and Reduced Incidence with Zoledronic Acid Once Yearly - Pivotal Fracture Trial (HORIZON-PFT), zoledronic acid (ZOL) 5 mg significantly reduced fracture risk. OBJECTIVE: The aim of the study was to identify factors associated with greater efficacy during ZOL 5 mg treatment. DESIGN, SETTING, AND PATIENTS: We conducted a subgroup analysis (preplanned and post hoc) of a multicenter, double-blind, placebo-controlled, 36-month trial in 7765 women with postmenopausal osteoporosis. INTERVENTION: A single infusion of ZOL 5 mg or placebo was administered at baseline, 12, and 24 months. MAIN OUTCOME MEASURES: Primary endpoints were new vertebral fracture and hip fracture. Secondary endpoints were nonvertebral fracture and change in femoral neck bone mineral density (BMD). Baseline risk-factor subgroups were age, BMD T-score and vertebral fracture status, total hip BMD, race, weight, geographical region, smoking, height loss, history of falls, physical activity, prior bisphosphonates, creatinine clearance, body mass index, and concomitant osteoporosis medications. RESULTS: Greater ZOL-induced effects on vertebral fracture risk were seen with younger age (treatment-by-subgroup interaction, P = 0.05), normal creatinine clearance (P = 0.04), and body mass index ≥ 25 kg/m2 (P = 0.02). There were no significant treatment-factor interactions for hip or nonvertebral fracture or for change in BMD. CONCLUSIONS: ZOL appeared more effective in preventing vertebral fracture in younger women, overweight/obese women, and women with normal renal function. ZOL had similar effects irrespective of fracture risk factors or femoral neck BMD.
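
One common way to test a treatment-by-subgroup interaction of the kind reported above is a regression with an interaction term. The sketch below uses logistic regression on hypothetical patient-level columns; it is not the trial's actual analysis, which may have used different models.

```python
# Sketch of a treatment-by-subgroup interaction test via logistic regression;
# file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

pts = pd.read_csv("horizon_pft.csv")               # hypothetical patient-level data
pts["bmi_ge_25"] = (pts["bmi"] >= 25).astype(int)  # baseline subgroup indicator

fit = smf.logit("new_vertebral_fracture ~ zol * bmi_ge_25", data=pts).fit()
print(fit.summary())   # the zol:bmi_ge_25 term is the treatment-by-subgroup interaction
```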
Relevance: 30.00%

Abstract:

Research into the anatomical substrates and "principles" for integrating inputs from separate sensory surfaces has yielded divergent findings. This suggests that multisensory integration is flexible and context dependent and underlines the need for dynamically adaptive neuronal integration mechanisms. We propose that flexible multisensory integration can be explained by a combination of canonical, population-level integrative operations, such as oscillatory phase resetting and divisive normalization. These canonical operations subsume multisensory integration into a fundamental set of principles as to how the brain integrates all sorts of information, and they are being used proactively and adaptively. We illustrate this proposition by unifying recent findings from different research themes such as timing, behavioral goal, and experience-related differences in integration.
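
Divisive normalization, one of the canonical operations invoked above, has a standard form: each unit's response is its driven input divided by a constant plus the pooled drive of the population. A minimal numerical sketch follows, with illustrative drive and parameter values.

```python
# Minimal sketch of the standard divisive-normalization equation; values are illustrative.
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Normalize each unit's drive by the summed drive of the population."""
    d = np.asarray(drive, dtype=float) ** n
    return d / (sigma ** n + d.sum())

unisensory = divisive_normalization([4.0, 0.5, 0.2])
multisensory = divisive_normalization([4.0, 3.5, 0.2])  # a second modality adds pooled drive
print(unisensory, multisensory)
```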

Relevance: 30.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features ("geo-features"). They are well suited to be implemented as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to the software implementation. The main algorithms and models considered are the multilayer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. 
In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both a traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations, which helps to detect the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a currently hot topic, the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed during the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user friendly and easy to use.
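
The GRNN proposed above for automatic mapping is essentially a Gaussian-kernel (Nadaraya-Watson) weighted average of the training targets. A minimal sketch on toy spatial data follows; the kernel width sigma would normally be tuned, for example by cross-validation, and the data here are synthetic.

```python
# Minimal GRNN (Nadaraya-Watson) sketch on synthetic 2D spatial data.
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma):
    """Gaussian-kernel weighted average of training targets."""
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Toy data: coordinates in km, arbitrary measured values.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))
values = np.sin(coords[:, 0] / 20.0) + 0.1 * rng.standard_normal(200)

grid = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 90.0]])
print(grnn_predict(coords, values, grid, sigma=5.0))
```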

Relevance: 30.00%

Abstract:

PURPOSE: At high magnetic field strengths (B0 ≥ 3 T), the shorter radiofrequency wavelength produces an inhomogeneous distribution of the transmit magnetic field. This can lead to variable contrast across the brain, which is particularly pronounced in T2-weighted imaging that requires multiple radiofrequency pulses. To obtain T2-weighted images with uniform contrast throughout the whole brain at 7 T, short (2-3 ms) 3D tailored radiofrequency pulses (kT-points) were integrated into a 3D variable flip angle turbo spin echo sequence. METHODS: The excitation and refocusing "hard" pulses of a variable flip angle turbo spin echo sequence were replaced with kT-point pulses. Spatially resolved extended phase graph simulations and in vivo acquisitions at 7 T, utilizing both single-channel and parallel-transmit systems, were used to test different kT-point configurations. RESULTS: Simulations indicated that an extended optimized k-space trajectory ensured a more homogeneous signal throughout images. In vivo experiments showed that high quality T2-weighted brain images with uniform signal and contrast were obtained at 7 T by using the proposed methodology. CONCLUSION: This work demonstrates that T2-weighted images devoid of artifacts resulting from B1+ inhomogeneity can be obtained at high field through the optimization of extended kT-point pulses. Magn Reson Med 71:1478-1488, 2014. © 2013 Wiley Periodicals, Inc.
Relevance: 30.00%

Abstract:

The expansion of a recovering population, whether re-introduced or spontaneously returning, is shaped by (i) biological (intrinsic) factors such as the land tenure system or dispersal, (ii) the distribution and availability of resources (e.g. prey), (iii) habitat and landscape features, and (iv) human attitudes and activities. In order to develop efficient conservation and recovery strategies, we need to understand all these factors, to predict the potential distribution, and to explore ways to reach it. An increased number of lynx in the north-western Swiss Alps in the nineties led to a new controversy about the return of this cat. When the large carnivores were given legal protection in many European countries, most organizations and individuals promoting their protection did not foresee the consequences. Management plans describing how to handle conflicts with large predators are needed to find a balance between "overabundance" and extinction. Wildlife and conservation biologists need to evaluate the various threats confronting populations so that adequate management decisions can be taken. I developed a GIS probability model for the lynx, based on habitat information and radio-telemetry data from the Swiss Jura Mountains, in order to predict the potential distribution of the lynx in this mountain range, which is presently only partly occupied. Three of the 18 variables tested for each square kilometre, describing land use, vegetation, and topography, qualified to predict the probability of lynx presence. The resulting map was evaluated with data from dispersing subadult lynx. Young lynx that were not able to establish home ranges in what was identified as good lynx habitat did not survive their first year of independence, whereas the only one that died in good lynx habitat was illegally killed. Radio-telemetry fixes are often used as input data to calibrate habitat models. Radio-telemetry is the only way to gather accurate and unbiased data on habitat use by elusive larger terrestrial mammals. However, it is time-consuming and expensive, and can therefore only be applied in limited areas. Habitat models extrapolated over large areas can in turn be problematic, as habitat characteristics and availability may change from one area to another. I analysed the predictive power of Ecological Niche Factor Analysis (ENFA) in Switzerland with the lynx as focal species. According to my results, the optimal sampling strategy to predict species distribution in an Alpine area lacking data would be to pool presence cells from contrasted regions (Jura Mountains, Alps), whereas in regions with low ecological variance (Jura Mountains), only local presence cells should be used to calibrate the model. Dispersal influences the dynamics and persistence of populations and the distribution and abundance of species, and gives communities and ecosystems their characteristic texture in space and time. Between 1988 and 2001, the spatio-temporal behaviour of subadult Eurasian lynx in two re-introduced populations in Switzerland was studied, based on 39 juvenile lynx, 24 of which were radio-tagged, to understand the factors influencing dispersal. Subadults become independent from their mothers at the age of 8-11 months. No sex bias was detected in either the dispersal rate or the distance moved. Lynx are conservative dispersers compared to bears and wolves, and settled within or close to known lynx occurrences. 
Dispersal distances reached in the high-density lynx population, shorter than those reported in other Eurasian lynx studies, are limited by habitat restrictions hindering connections with neighbouring metapopulations. I postulated that high lynx density would lead to an expansion of the population and validated my predictions with data from the north-western Swiss Alps, where a strong increase in lynx abundance took place around 1995. The general hypothesis that high population density fosters the expansion of the population was not confirmed. This has consequences for the re-introduction and recovery of carnivores in a fragmented landscape. Establishing a strong source population in one place might not be an optimal strategy; rather, population nuclei should be founded in several neighbouring patches. Exchange between established neighbouring subpopulations will take place later on, as adult lynx show a higher propensity than subadults to cross barriers. To estimate the potential population size of the lynx in the Jura Mountains and to assess possible corridors between this population and adjacent areas, I adapted a habitat probability model for lynx distribution in the Jura Mountains with new environmental data and extrapolated it over the entire mountain range. The model predicts a breeding population of 74-101 individuals, or 51-79 individuals when continuous habitat patches < 50 km2 are disregarded. The Jura Mountains could one day be part of a metapopulation, as potential corridors exist to the adjoining areas (Alps, Vosges Mountains, and Black Forest). Monitoring of population size and spatial expansion, and genetic surveillance, must be continued in the Jura Mountains, as the status of the population is still critical. ENFA was used to predict the potential distribution of lynx in the Alps. The resulting model divided the Alps into 37 suitable habitat patches ranging from 50 to 18,711 km2 and covering a total area of about 93,600 km2. Using the range of lynx densities found in field studies in Switzerland, the Alps could host a population of 961 to 1,827 residents. The results of the cost-distance analysis revealed that all patches were within the reach of dispersing lynx, as the connection costs were in the range of the dispersal costs of radio-tagged subadult lynx moving through unfavorable habitat. Thus, the whole of the Alps could one day be considered a metapopulation. But experience suggests that only a few dispersers will cross unsuitable areas and barriers. This low migration rate may seldom allow the spontaneous foundation of new populations in unsettled areas. As an alternative to natural dispersal, artificial transfer of individuals across the barriers should be considered. Wildlife biologists can play a crucial role in developing adaptive management experiments that help managers learn by trial. The case of the lynx in Switzerland is a good example of fruitful cooperation between wildlife biologists, managers, decision makers and politicians in an adaptive management process. This cooperation resulted in a Lynx Management Plan, which was implemented in 2000 and updated in 2004, giving the cantons directives on how to handle lynx-related problems. The plan has already been put into practice, for example with regard to the translocation of lynx into unsettled areas.
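
A rough sketch, not the thesis's actual model, of a per-square-kilometre habitat probability model of the kind described above, followed by the sort of back-of-envelope population estimate obtained by combining suitable area with observed lynx densities. File names, predictor columns, and the density values are placeholders.

```python
# Illustrative grid-cell habitat probability model and population estimate;
# file, columns, and densities are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

cells = pd.read_csv("jura_1km_cells.csv")          # hypothetical: one row per 1-km2 cell
predictors = ["forest_cover", "terrain_ruggedness", "distance_to_settlements"]

model = LogisticRegression().fit(cells[predictors], cells["lynx_present"])
cells["p_lynx"] = model.predict_proba(cells[predictors])[:, 1]

suitable_km2 = (cells["p_lynx"] > 0.5).sum()       # each cell is 1 km2
density_low, density_high = 1.0, 2.0               # residents per 100 km2 (placeholder values)
print(suitable_km2 * density_low / 100, suitable_km2 * density_high / 100)
```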

Relevance: 30.00%

Abstract:

Tissue protein hypercatabolism (TPH) is one of the most important features of cancer cachexia, particularly with regard to the skeletal muscle. The rat ascites hepatoma Yoshida AH-130 is a very suitable model system for studying the mechanisms involved in the processes that lead to tissue depletion, since it induces in the host a rapid and progressive muscle waste mainly due to TPH (Tessitore, L., G. Bonelli, and F. M. Baccino. 1987. Biochem. J. 241:153-159). Detectable plasma levels of tumor necrosis factor-alpha, associated with marked perturbations in hormonal homeostasis, have been shown to concur in forcing metabolism into a catabolic setting (Tessitore, L., P. Costelli, and F. M. Baccino. 1993. Br. J. Cancer. 67:15-23). The present study investigated whether beta 2-adrenergic agonists, which are known to favor skeletal muscle hypertrophy, could effectively antagonize the enhanced muscle protein breakdown in this cancer cachexia model. One such agent, clenbuterol, indeed largely prevented skeletal muscle waste in AH-130-bearing rats by restoring protein degradative rates close to control values. This normalization of protein breakdown rates was achieved through a decrease in the hyperactivation of the ATP-ubiquitin-dependent proteolytic pathway, as previously demonstrated in our laboratory (Llovera, M., C. García-Martínez, N. Agell, M. Marzábal, F. J. López-Soriano, and J. M. Argilés. 1994. FEBS (Fed. Eur. Biochem. Soc.) Lett. 338:311-318). By contrast, the drug did not exert any measurable effect on various parenchymal organs, nor did it modify the plasma levels of corticosterone and insulin, which were increased and decreased, respectively, in the tumor hosts. The present data give new insights into the mechanisms by which clenbuterol exerts its preventive effect on muscle protein waste and seem to warrant the implementation of experimental protocols involving the use of clenbuterol or similar drugs in the treatment of pathological states involving TPH, particularly in skeletal muscle and heart, such as the present model of cancer cachexia.