932 results for Ensemble of classifiers


Relevance: 80.00%

Abstract:

The newly developed atmosphere–ocean-chemistry-climate model SOCOL-MPIOM is presented by demonstrating the influence of the interactive chemistry module on the climate state and its variability. To this end, we compare pre-industrial control simulations with (CHEM) and without (NOCHEM) interactive chemistry. In general, the influence of the chemistry on the mean state and the variability is small and mainly restricted to the stratosphere and mesosphere. The largest differences are found in the atmospheric dynamics of the polar regions, with slightly stronger northern and southern winter polar vortices in CHEM. The strengthening of the vortex is related to larger stratospheric temperature gradients, which are attributed to a parametrization of the absorption of ozone and oxygen in the Lyman-alpha, Schumann–Runge, Hartley, and Huggins bands. This effect is parametrized only in the version with interactive chemistry. A second reason for the temperature differences between CHEM and NOCHEM is related to diurnal variations in the ozone concentrations in the upper atmosphere, which are missing in NOCHEM. Furthermore, stratospheric water vapour concentrations differ substantially between the two experiments, but their effect on the temperatures is small. In both setups, the simulated intensity and variability of the northern polar vortex are inside the range of present-day observations. Sudden stratospheric warming events are well reproduced in terms of their frequency, but their distribution amongst the winter months is too uniform. Additionally, the performance of SOCOL-MPIOM under changing external forcings is assessed for the period 1600–2000 using an ensemble of simulations driven by a spectral solar forcing reconstruction. The amplitude of the reconstruction is large in comparison to other state-of-the-art reconstructions, providing an upper limit for the importance of the solar signal.
In the pre-industrial period (1600–1850) the simulated surface temperature trends are in reasonable agreement with temperature reconstructions, although the multi-decadal variability is more pronounced. This enhanced variability can be attributed to the variability in the solar forcing. The simulated temperature reductions during the Maunder Minimum are in the lowest probability range of the proxy records. During the Dalton Minimum, when volcanic forcing is also an important driver of temperature variations, the agreement is better. In the industrial period from 1850 onward, SOCOL-MPIOM overestimates the temperature increase in comparison to observational data sets. Sensitivity simulations show that this overestimation can be attributed to the increasing trend in the solar forcing reconstruction used in this study and to an additional warming induced by the simulated ozone changes.

Relevance: 80.00%

Abstract:

The Earth's carbon and hydrologic cycles are intimately coupled by gas exchange through plant stomata [1–3]. However, uncertainties in the magnitude [4–6] and consequences [7,8] of the physiological responses [9,10] of plants to elevated CO2 in natural environments hinder modelling of terrestrial water cycling and carbon storage [11]. Here we use annually resolved long-term δ13C tree-ring measurements across a European forest network to reconstruct the physiologically driven response of intercellular CO2 (Ci) caused by atmospheric CO2 (Ca) trends. After removing meteorological signals from the δ13C measurements, we find that trees across Europe regulated gas exchange so that for each ppmv of atmospheric CO2 increase, Ci increased by ~0.76 ppmv, most consistent with moderate control towards a constant Ci/Ca ratio. This response corresponds to twentieth-century intrinsic water-use efficiency (iWUE) increases of 14 ± 10% and 22 ± 6% at broadleaf and coniferous sites, respectively. An ensemble of process-based global vegetation models shows similar CO2 effects on iWUE trends. Yet, when these models are operated with climate drivers reintroduced, 5% increases in European forest transpiration over the twentieth century are calculated despite decreased stomatal opening. This counterintuitive result arises from lengthened growing seasons, enhanced evaporative demand in a warming climate, and increased leaf area, which together oppose the effects of CO2-induced stomatal closure. Our study questions changes to the hydrological cycle, such as reductions in transpiration and air humidity, hypothesized to result from plant responses to anthropogenic emissions.
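The gas-exchange arithmetic behind the reported iWUE increases can be sketched from the standard relation iWUE = (Ca − Ci)/1.6, where 1.6 is the ratio of the diffusivities of water vapour and CO2 through stomata. The values below are illustrative stand-ins, not the paper's data; only the ~0.76 ppmv/ppmv Ci response comes from the abstract.

```python
def iwue(ca, ci):
    """Intrinsic water-use efficiency (A/gs) from the standard
    Fickian relation: iWUE = (ca - ci) / 1.6."""
    return (ca - ci) / 1.6

# Illustrative numbers: Ca rising from 296 to 370 ppmv over the 20th
# century; Ci tracks it with the ~0.76 ppmv/ppmv slope from the abstract.
ca0, ca1 = 296.0, 370.0
ci0 = 0.7 * ca0                      # assumed initial Ci/Ca ratio of 0.7
ci1 = ci0 + 0.76 * (ca1 - ca0)       # Ci response reported in the abstract

increase_pct = 100 * (iwue(ca1, ci1) / iwue(ca0, ci0) - 1)
print(f"iWUE increase: {increase_pct:.0f}%")
```

With these assumed values the computed increase of about 20% lands between the 14% and 22% the abstract reports for broadleaf and coniferous sites.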

Relevance: 80.00%

Abstract:

Air was sampled from the porous firn layer at the NEEM site in northern Greenland. We use an ensemble of ten reference tracers of known atmospheric history to characterise the transport properties of the site. By analysing uncertainties in both the data and the reference gas atmospheric histories, we can objectively assign weights to each of the gases used for the depth-diffusivity reconstruction. We define an objective root-mean-square criterion that is minimised in the model tuning procedure. Each tracer constrains the firn profile differently through its unique atmospheric history and free-air diffusivity, making our multiple-tracer characterisation method a clear improvement over the commonly used single-tracer tuning. Six firn air transport models are tuned to the NEEM site; all models successfully reproduce the data within a 1σ Gaussian distribution. A comparison between two replicate boreholes drilled 64 m apart shows differences in measured mixing ratio profiles that exceed the experimental error. We find evidence that diffusivity does not vanish completely in the lock-in zone, as is commonly assumed. The ice age–gas age difference (Δage) at the firn-ice transition is calculated to be 182 (+3/−9) yr. We further present the first intercomparison study of firn air models, in which we introduce diagnostic scenarios designed to probe specific aspects of the model physics. Our results show that there are major differences in the way the models handle advective transport. Furthermore, diffusive fractionation of isotopes in the firn is poorly constrained by the models, which has consequences for attempts to reconstruct the isotopic composition of trace gases back in time using firn air and ice core records.
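The multiple-tracer tuning criterion can be sketched as a weighted root-mean-square of uncertainty-normalised residuals, with one weight per tracer. This is a minimal illustration of the idea, not the authors' exact formulation; the tracer names, weights and residuals below are hypothetical.

```python
import math

def weighted_rms(residuals_by_tracer, weights):
    """Objective of the kind minimised in diffusivity tuning: the RMS of
    uncertainty-normalised residuals, with an objective weight per tracer.
    Sketch only, not the authors' exact formulation."""
    total, n = 0.0, 0
    for tracer, resid in residuals_by_tracer.items():
        w = weights[tracer]
        for r in resid:              # r = (model - data) / sigma
            total += w * r * r
            n += 1
    return math.sqrt(total / n)

# Hypothetical normalised residuals for three of the ten reference tracers
resid = {"CO2": [0.5, -0.3, 0.2], "CH4": [1.1, 0.9, -0.4], "SF6": [0.1, 0.0, 0.2]}
wts = {"CO2": 1.0, "CH4": 0.5, "SF6": 1.0}   # down-weight the less certain tracer
print(round(weighted_rms(resid, wts), 3))
```

Minimising this value over the depth-diffusivity profile lets every tracer pull on the solution in proportion to how well its atmospheric history is known.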

Relevance: 80.00%

Abstract:

The spatial context is critical when assessing present-day climate anomalies, attributing them to potential forcings and making statements regarding their frequency and severity in a long-term perspective. Recent international initiatives have expanded the number of high-quality proxy records and developed new statistical reconstruction methods. These advances allow more rigorous regional past-temperature reconstructions and, in turn, the possibility of evaluating climate models on policy-relevant, spatio-temporal scales. Here we provide a new proxy-based, annually resolved, spatial reconstruction of European summer (June–August) temperature fields back to 755 CE based on Bayesian hierarchical modelling (BHM), together with estimates of the European mean temperature variation since 138 BCE based on BHM and composite-plus-scaling (CPS). Our reconstructions compare well with independent instrumental and proxy-based temperature estimates, but suggest a larger amplitude in summer temperature variability than previously reported. Both CPS and BHM reconstructions indicate that the mean 20th-century European summer temperature was not significantly different from that of some earlier centuries, including the 1st, 2nd, 8th and 10th centuries CE. The 1st century (in BHM also the 10th century) may even have been slightly warmer than the 20th century, but the difference is not statistically significant. Comparing each 50-year period with the 1951–2000 period reveals a similar pattern. Recent summers, however, have been unusually warm in the context of the last two millennia, and there are no 30-year periods in either reconstruction that exceed the mean European summer temperature of the last three decades (1986–2015 CE). A comparison with an ensemble of climate model simulations suggests that the reconstructed European summer temperature variability over the period 850–2000 CE reflects changes in both internal variability and external forcing on multi-decadal time scales.
For pan-European temperatures we find slightly better agreement between the reconstruction and the model simulations with high-end estimates for total solar irradiance. Temperature differences between the medieval period, the recent period and the Little Ice Age are larger in the reconstructions than in the simulations. This may indicate inflated variability in the reconstructions, a lack of sensitivity of the simulated European climate to changes in external forcing, and/or an underestimation of internal variability on centennial and longer time scales.
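Of the two reconstruction methods, composite-plus-scaling (CPS) is simple enough to sketch: standardise each proxy series, average them into a composite, and rescale the composite to the instrumental mean and variance over the calibration period. The series below are synthetic; the BHM approach is considerably more involved and is not shown.

```python
# Composite-plus-scaling (CPS), a minimal sketch with synthetic data.
from statistics import mean, stdev

def cps(proxies, instrumental, calib):
    """proxies: equal-length series; instrumental: target series (values
    needed only over `calib`); calib: slice of overlapping indices."""
    z = [[(x - mean(p)) / stdev(p) for x in p] for p in proxies]
    comp = [mean(col) for col in zip(*z)]            # unscaled composite
    scale = stdev(instrumental[calib]) / stdev(comp[calib])
    shift = mean(instrumental[calib]) - scale * mean(comp[calib])
    return [scale * c + shift for c in comp]

# Two synthetic proxies over six "years"; instrumental data cover the last three.
proxies = [[0.1, 0.4, 0.2, 0.8, 1.0, 0.9],
           [0.0, 0.5, 0.1, 0.7, 1.1, 1.0]]
instr = [None, None, None, 14.2, 14.9, 14.6]
rec = cps(proxies, instr, slice(3, 6))
print([round(r, 2) for r in rec])
```

By construction the reconstruction matches the instrumental mean and variance over the calibration window, while the pre-instrumental values inherit the composite's shape.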

Relevance: 80.00%

Abstract:

Sentinel-5 (S5) and its precursor (S5P) are future European satellite missions aiming at global monitoring of methane (CH4) column-average dry air mole fractions (XCH4). The spectrometers to be deployed onboard the satellites record spectra of sunlight backscattered from the Earth's surface and atmosphere. In particular, they exploit CH4 absorption in the shortwave infrared spectral range around 1.65 μm (S5 only) and 2.35 μm (both S5 and S5P). Given an accuracy goal of better than 2% for XCH4 to be delivered on regional scales, assessment and reduction of potential sources of systematic error such as spectroscopic uncertainties is crucial. Here, we investigate how spectroscopic errors propagate into retrieval errors on the global scale. To this end, absorption spectra of a ground-based Fourier transform spectrometer (FTS) operating at very high spectral resolution serve as an estimate of the quality of the spectroscopic parameters. When the FTS fitting residuals are fed as a perturbation into a global ensemble of simulated S5- and S5P-like spectra at relatively low spectral resolution, XCH4 retrieval errors exceed 0.6% in large parts of the world and show systematic correlations on regional scales, calling for improved spectroscopic parameters.
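The error-propagation idea can be illustrated with a toy one-parameter retrieval: fit a scaling of the absorption signature to a spectrum, then refit after adding a residual that stands in for spectroscopic error and compare the retrieved values. The signature and residual below are synthetic, not FTS data.

```python
# Toy perturbation experiment: how a spectral residual biases a retrieval.
def retrieve_scale(y, k):
    """Least-squares scale a minimising ||y - a*k||^2 (1-parameter retrieval)."""
    return sum(yi * ki for yi, ki in zip(y, k)) / sum(ki * ki for ki in k)

k = [0.0, 0.2, 0.9, 1.0, 0.9, 0.2, 0.0]       # synthetic CH4 absorption signature
truth = [1.0 * ki for ki in k]                # unperturbed spectrum, true a = 1
resid = [0.01, -0.02, 0.03, 0.01, -0.03, 0.02, 0.0]  # stand-in spectroscopy error
perturbed = [t + r for t, r in zip(truth, resid)]

err_pct = 100 * (retrieve_scale(perturbed, k) - 1.0)
print(f"XCH4 retrieval error: {err_pct:.2f}%")
```

The bias depends on how strongly the residual projects onto the absorption signature, which is why fitting residuals, rather than just their amplitude, are the relevant perturbation.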

Relevance: 80.00%

Abstract:

We compare the ocean temperature evolution of the Holocene as simulated by climate models and reconstructed from marine temperature proxies. We use transient simulations from a coupled atmosphere-ocean general circulation model, as well as an ensemble of time-slice simulations from the Paleoclimate Modelling Intercomparison Project. The general pattern of sea surface temperature (SST) in the models shows a high-latitude cooling and a low-latitude warming. The proxy dataset comprises a global compilation of marine alkenone- and Mg/Ca-derived SST estimates. Independently of the choice of climate model, we observe significant mismatches between modelled and estimated SST amplitudes in the trends for the last 6000 years. Alkenone-based SST records show a similar pattern to the simulated annual mean SSTs, but the simulated SST trends underestimate the alkenone-based SST trends by a factor of two to five. For Mg/Ca, no significant relationship between model simulations and proxy reconstructions can be detected. We tested whether such discrepancies can be caused by too-simplistic interpretations of the proxy data: comparing the proxy trends against different seasons and depths in the model reconciles only part of the mismatches on a regional scale. We therefore considered additional environmental factors, namely changes in the planktonic organisms' habitat depth and a shift in the recording season, to diagnose whether invoking them can help reconcile the proxy records and the model simulations. We find that invoking shifts in the living season and habitat depth can remove some of the model-data discrepancies in SST trends.
Regardless of whether such adjustments in the environmental parameters during the Holocene are realistic, they indicate that even when modeled temperature trends are set up to allow drastic shifts in the ecological behavior of planktonic organisms, they do not capture the full range of reconstructed SST trends. Our findings indicate that climate-model and reconstructed temperature trends are to a large degree only qualitatively comparable, thus providing a challenge for the interpretation of proxy data as well as for the models' sensitivity to orbital forcing.
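Model-data trend comparisons of this kind come down to comparing least-squares slopes fitted to the SST series. A sketch with synthetic numbers chosen to reproduce a factor-of-three mismatch of the kind reported:

```python
def trend(t, y):
    """Ordinary least-squares slope of y against t (e.g. K per kyr)."""
    n = len(t)
    tm, ym = sum(t) / n, sum(y) / n
    return sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y)) / \
           sum((ti - tm) ** 2 for ti in t)

# Hypothetical 6 kyr series sampled every kyr: a proxy cooling of 0.3 K/kyr
# against a simulated cooling of 0.1 K/kyr, i.e. a factor-3 underestimate.
t = [0, 1, 2, 3, 4, 5]                     # kyr since 6 ka
proxy = [18.0, 17.7, 17.4, 17.1, 16.8, 16.5]
model = [18.0, 17.9, 17.8, 17.7, 17.6, 17.5]
print(round(trend(t, proxy) / trend(t, model), 1))
```

The ratio of the two slopes is the "factor of two to five" statistic quoted for the alkenone records; the series here are invented to make that ratio exactly three.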

Relevance: 80.00%

Abstract:

Antarctica is a continent with a strong character: high wind speeds, very low temperatures and heavy snowstorms. All these parameters are well known from observations and measurements, but precipitation measurements are still rare because the number of manned stations in Antarctica is very limited. In such a polar snow region, many wind-driven phenomena associated with snowfall exist, such as snow drift, blowing snow and sastrugi. Snow drift is defined as a layer of snow formed by the wind during a snowstorm, with horizontal visibility below eye level. Blowing snow is specified as an ensemble of snow particles raised by the wind to moderate or great heights above the ground; the horizontal visibility at eye level is generally very poor (National Snow and Ice Data Center (NSIDC), 2013). Sastrugi are complex, fragile and sharp ridges or grooves formed on land or over sea ice; they arise from wind erosion, saltation of snow particles and deposition. To learn more about these processes, better instruments than the conventional stake array are required. This short report introduces a new measuring technique and thereby offers a previously unavailable dataset of snow heights. It is common to measure snow height with a stake array in Antarctica (e.g. at Neumayer Station and Kohnen Station) but not with a laser beam. Thus the idea was born to install a new instrument at Neumayer Station in December 2012.

Relevance: 80.00%

Abstract:

High-performance computing offers significant promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff Real-time Interactive Basin Simulator (RIBS) model is selected to simulate the hydrological process in the basin. Although the RIBS model is deterministic, it is run probabilistically using the results of a calibration, developed in previous work by the authors, that identified the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, which uses genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic in that they generate an ensemble of hydrographs, taking into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive, as it requires simulating many replicas of the ensemble in real time.
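Running a deterministic model probabilistically, as described above, amounts to drawing each calibrated parameter from its distribution and simulating one replica per draw. The toy linear-reservoir "model" below merely stands in for RIBS; the parameter distribution and rainfall series are invented.

```python
import random

def run_model(params, rainfall):
    """Stand-in for one deterministic run (a toy linear reservoir,
    not the real distributed RIBS model)."""
    k = params["k"]                       # storage coefficient, k > 1
    q, out = 0.0, []
    for r in rainfall:
        q = q * (1 - 1 / k) + r / k       # reservoir routing step
        out.append(q)
    return out

# Probabilistic use of a deterministic model: one replica per parameter draw.
random.seed(1)
rain = [0, 5, 20, 12, 4, 0, 0]            # hypothetical storm, mm per step
ensemble = [run_model({"k": random.lognormvariate(1.0, 0.3)}, rain)
            for _ in range(100)]
peaks = sorted(max(h) for h in ensemble)
print(f"~90% band on peak flow: {peaks[5]:.2f} to {peaks[94]:.2f}")
```

The spread of the hydrograph ensemble, here summarised by the band on the peak, is what carries the forecast uncertainty to the user.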

Relevance: 80.00%

Abstract:

The characteristics of the power-line communication (PLC) channel are difficult to model due to the heterogeneity of the networks and the lack of common wiring practices. To capture the full variability of the PLC channel, random channel generators are of great importance for the design and testing of communication algorithms. In this respect, we propose a random channel generator based on the top-down approach. Essentially, we describe the multipath propagation and the coupling effects with an analytical model, introduce variability into a restricted set of parameters and, finally, fit the model to a set of measured channels. The proposed model enables a closed-form description of both the mean path-loss profile and the statistical correlation function of the channel frequency response. As an example of application, we apply the procedure to a set of in-home channels measured in the 2-100 MHz band whose statistics are available in the literature. The measured channels are divided into nine classes according to their channel capacity. We provide the parameters for the random generation of channels for all nine classes, and we show that the results are consistent with the experimental ones. Finally, we merge the classes to capture the entire heterogeneity of in-home PLC channels: we introduce the class occurrence probability and present a random channel generator that targets the ensemble of all nine classes. The statistics of the composite set of channels are also studied and compared with the results of experimental measurement campaigns in the literature.
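The class-based generation step can be sketched as: draw a class from its occurrence probability, then synthesise a multipath frequency response whose richness depends on the class. The class probabilities, path counts, gains and delays below are illustrative placeholders, not the fitted parameters from the paper.

```python
import cmath
import random

# Hypothetical occurrence probabilities for the nine capacity classes
CLASS_PROB = [0.05, 0.08, 0.10, 0.12, 0.15, 0.17, 0.14, 0.11, 0.08]

def draw_channel(freqs_hz, rng):
    """Draw a capacity class, then a multipath frequency response for it."""
    cls = rng.choices(range(9), weights=CLASS_PROB)[0]
    n_paths = 3 + cls                     # richer multipath in higher classes
    paths = [(rng.gauss(0, 1) / n_paths, rng.uniform(0, 2e-6))
             for _ in range(n_paths)]     # (gain, delay) per path
    H = [sum(g * cmath.exp(-2j * cmath.pi * f * tau) for g, tau in paths)
         for f in freqs_hz]
    return cls, H

rng = random.Random(7)
freqs = [f * 1e6 for f in range(2, 101)]  # the 2-100 MHz band, 1 MHz grid
cls, H = draw_channel(freqs, rng)
print(cls, len(H))
```

Sampling the class first, with its occurrence probability, is what lets one generator target the whole ensemble of nine classes instead of a single class at a time.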

Relevance: 80.00%

Abstract:

Performing activity recognition using the information provided by the different sensors embedded in a smartphone faces limitations due to the capabilities of those devices when the computations are carried out on the terminal. In this work, a fuzzy inference module is implemented to decide which classifier is the most appropriate to use at a given moment, considering the application requirements and the device context characterised by its battery level, available memory and CPU load. The set of classifiers considered is composed of decision tables and trees that have been trained using different numbers of sensors and features. In addition, some classifiers perform activity recognition regardless of the on-body device position, while others rely on the previous recognition of that position and use a classifier trained with measurements gathered with the mobile placed in that specific position. The implemented modules show that an evaluation of the classifiers allows them to be ranked, so that the fuzzy inference module can periodically choose the one that best suits the device context and application requirements.
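The selection logic can be approximated with a crisp weighted score standing in for the fuzzy inference module: under low resource pressure the most accurate classifier wins, under high pressure a cheaper one does. All classifier names, accuracies and costs below are hypothetical.

```python
# Crisp stand-in for the fuzzy classifier-selection module (hypothetical data).
CLASSIFIERS = [
    # (name, accuracy, relative CPU cost, relative memory cost)
    ("tree_all_sensors", 0.93, 0.9, 0.8),
    ("table_accel_only", 0.85, 0.3, 0.2),
    ("tree_position_aware", 0.90, 0.6, 0.5),
]

def pick(battery, cpu_load, free_mem):
    """Choose a classifier from the device context (all inputs in [0, 1])."""
    pressure = ((1 - battery) + cpu_load + (1 - free_mem)) / 3
    def score(c):
        _, acc, cpu, mem = c
        # accuracy reward minus a resource penalty that grows with pressure
        return 2 * acc - pressure * (cpu + mem) / 2
    return max(CLASSIFIERS, key=score)[0]

print(pick(battery=0.9, cpu_load=0.2, free_mem=0.8))  # resources plentiful
print(pick(battery=0.2, cpu_load=0.8, free_mem=0.3))  # constrained device
```

A real fuzzy module would replace the `pressure` average with membership functions and rules ("battery is low AND CPU load is high -> prefer cheap classifier"), but the trade-off it encodes is the same.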

Relevance: 80.00%

Abstract:

This article is a U.S. government work and is not subject to copyright in the United States.

Potential consequences of climate change on crop production can be studied using mechanistic crop simulation models. While a broad variety of maize simulation models exist, it is not known whether different models diverge on grain yield responses to changes in climatic factors, or whether they agree in their general trends related to phenology, growth, and yield. With the goal of analyzing the sensitivity of simulated yields to changes in temperature and atmospheric carbon dioxide concentration [CO2], we present the largest maize crop model intercomparison to date, including 23 different models. These models were evaluated at four locations representing a wide range of maize production conditions in the world: Lusignan (France), Ames (USA), Rio Verde (Brazil) and Morogoro (Tanzania). While individual models differed considerably in absolute yield simulation at the four sites, an ensemble of a minimum number of models was able to simulate absolute yields accurately at the four sites even with limited calibration data, suggesting that using an ensemble of models has merit. Temperature increase had a strong negative influence on modeled yield, a response of roughly 0.5 Mg/ha per °C. Doubling [CO2] from 360 to 720 μmol/mol increased grain yield by 7.5% on average across models and sites. This would make temperature the main factor altering maize yields at the end of this century. Furthermore, there was large uncertainty in the yield response to [CO2] among models. Model responses to temperature and [CO2] did not differ between simulations run with low and with high levels of calibration information.
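The core ensemble argument, that individual models scatter widely in absolute yield while their mean lands close to the observation, is easy to illustrate. The yields below are invented; only the sensitivities of roughly 0.5 Mg/ha lost per °C of warming and +7.5% for doubled CO2 come from the abstract.

```python
# Ensemble-mean illustration with invented model yields (Mg/ha).
obs = 10.2
models = [8.1, 9.5, 10.8, 11.9, 12.6, 9.0, 10.5, 11.2]

ens_mean = sum(models) / len(models)
err_ens = abs(ens_mean - obs)
better = sum(abs(m - obs) < err_ens for m in models)
print(f"ensemble mean {ens_mean:.2f} Mg/ha; "
      f"{better}/{len(models)} single models beat it")

# Projection using the reported mean sensitivities: +2 degC warming and
# doubled CO2 applied to the ensemble-mean yield.
dT = 2.0
y_future = ens_mean * 1.075 - 0.5 * dT
print(f"projected yield: {y_future:.2f} Mg/ha")
```

Individual model errors here range from 0.3 to 2.4 Mg/ha while the ensemble mean is off by only 0.25 Mg/ha, mirroring the intercomparison's conclusion that the ensemble has merit even when no single model is reliable.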

Relevance: 80.00%

Abstract:

This work analyses the vulnerability of viticulture in Spain to climate change, with the aim of improving the wine sector's capacity to respond to the challenges of globalisation. The risks to quality and quantity are explored through agro-climatic indices, with specific emphasis on the Protected Designation of Origin areas that produce premium winegrapes. The selected indices represent, first, risks caused by adverse climatic events related to climate extremes and, second, the variety and vintage-quality requirements captured by the Geoviticulture Multicriteria Climatic Classification System (MCCS), which classifies zones according to their climatic potential. To study the climatic conditions, regionalised climate change scenarios from an ensemble of Regional Climate Models (RCMs) of the ESCENA project have been used; these were developed within the Spanish National Climate Change Adaptation Plan (PNACC) to promote initiatives of anticipation and response to climate change up to the year 2050. As a key part of the vulnerability study, risks and opportunities are linked to adaptation needs across the Spanish territory: adaptation needs are measured for 56 Protected Designations of Origin, defined by the impacts and in accordance with a sensitivity analysis developed in this work.

This analysis shows that adaptation efforts should focus on maintaining quality, above all by improving conditions during the ripening period, in the vineyards of the northern half of the Iberian Peninsula, while in the southern half and the Mediterranean arc they should also seek to maintain the productivity of viticulture. Efforts would need to be more intense in the south and would also be subject to more limitations: irrigation, for example, which could become almost mandatory to sustain the crop, would face a context of greater competition for, and scarcity of, water resources. The capacity to meet these adaptation needs will determine the vulnerability of the vineyard in each region in the future. This capacity is defined by the needs themselves and by a number of social factors and legal or environmental constraints, such as those imposed by the Designations of Origin or restrictions on water use. The development of strategies that ensure sustainable use of water resources, together with support schemes under the new Common Agricultural Policy (CAP), can improve this adaptive capacity and thereby reduce vulnerability.

Relevance: 80.00%

Abstract:

The human face undoubtedly conveys much more information than we think. Without our consent, the face transmits non-verbal cues through facial interactions that reveal our affective state, cognitive activity, personality and diseases. Recent studies [OFT14, TODMS15] show that many of our social and interpersonal decisions derive from a prior analysis of the face that lets us judge whether a person is trustworthy, hardworking, intelligent, and so on. This error-prone interpretation derives from the innate ability of human beings to find these signals and interpret them. That ability is itself an object of study, with particular interest in developing methods that can compute automatically these signals, or attributes associated with the face. Interest in facial-attribute estimation has thus grown rapidly in recent years because of the many applications in which such methods can be used: targeted marketing, security systems, human-computer interaction, etc. These methods are, however, far from perfect and robust across problem domains. The main difficulty is the high intra-class variability caused by changes in imaging conditions (illumination changes, occlusions, facial expressions) and in the subjects themselves (age, gender, ethnicity), frequently found in images acquired in uncontrolled environments.

This research studies image-analysis techniques for estimating facial attributes such as gender, age and pose, using linear methods and exploiting the statistical dependencies between these attributes. In addition, our proposal focuses on building estimators with a good balance between performance and computational cost. Regarding the latter, we study a set of strategies for gender classification and compare them with a proposal based on a Bayesian classifier and a suitable feature extraction based on Linear Discriminant Analysis. We analyse in depth why linear techniques have failed to deliver competitive results to date and show how to obtain performance similar to that of the best non-linear techniques. A second algorithm is proposed for age estimation, based on a K-NN regressor and a feature selection analogous to that proposed for gender classification. From our experiments we observe that classifier performance drops significantly when classifiers are trained and tested on different databases. We have found that one cause is the existence of dependencies between facial attributes that were not considered when the classifiers were built. Our results demonstrate that intra-class variability can be reduced by considering the statistical dependencies between the facial attributes gender, age and pose, improving the performance of our facial-attribute classifiers at a small computational cost.
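The proposed age estimator can be sketched as a plain K-NN regressor over extracted face features: predict the mean age of the k training faces nearest in feature space. The two-dimensional features and ages below are hypothetical, not the thesis data.

```python
# Minimal K-NN age regressor over (hypothetical) face feature vectors.
def knn_age(features, train, k=3):
    """train: list of (feature_vector, age) pairs; returns the mean age of
    the k nearest neighbours under squared Euclidean distance."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda fa: dist(features, fa[0]))[:k]
    return sum(age for _, age in nearest) / k

# Invented 2-D features: younger faces cluster near (0.1, 0.9),
# older faces near (0.8, 0.2).
train = [([0.10, 0.90], 21), ([0.20, 0.80], 25), ([0.80, 0.30], 54),
         ([0.70, 0.20], 60), ([0.15, 0.85], 23), ([0.75, 0.25], 57)]
print(knn_age([0.72, 0.28], train))   # query lands in the older cluster
```

In the actual proposal the features would come from the selection procedure developed for gender classification rather than from raw coordinates, but the regression step is this simple.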

Relevance: 80.00%

Abstract:

Prion protein consists of an ensemble of glycosylated variants or glycoforms. The enzymes that direct oligosaccharide processing, and hence control the glycan profile for any given glycoprotein, are often exquisitely sensitive to other events taking place within the cell in which the glycoprotein is expressed. Alterations in the populations of sugars attached to proteins can reflect changes caused, for example, by developmental processes or by disease. Here we report that normal (PrPC) and pathogenic (PrPSc) prion proteins (PrP) from Syrian hamsters contain the same set of at least 52 bi-, tri-, and tetraantennary N-linked oligosaccharides, although the relative proportions of individual glycans differ. This conservation of structure suggests that the conversion of PrPC into PrPSc is not confined to a subset of PrPs that contain specific sugars. Compared with PrPC, PrPSc contains decreased levels of glycans with bisecting GlcNAc residues and increased levels of tri- and tetraantennary sugars. This change is consistent with a decrease in the activity of N-acetylglucosaminyltransferase III (GnTIII) toward PrPC in cells where PrPSc is formed and argues that, in at least some cells forming PrPSc, the glycosylation machinery has been perturbed. The reduction in GnTIII activity is intriguing both with respect to the pathogenesis of the prion disease and the replication pathway for prions.

Relevance: 80.00%

Abstract:

Deciphering the information that eyes, ears, and other sensory organs transmit to the brain is important for understanding the neural basis of behavior. Recordings from single sensory nerve cells have yielded useful insights, but single neurons generally do not mediate behavior; networks of neurons do. Monitoring the activity of all cells in a neural network of a behaving animal, however, is not yet possible. Taking an alternative approach, we used a realistic cell-based model to compute the ensemble of neural activity generated by one sensory organ, the lateral eye of the horseshoe crab, Limulus polyphemus. We studied how the neural network of this eye encodes natural scenes by presenting to the model movies recorded with a video camera mounted above the eye of an animal that was exploring its underwater habitat. Model predictions were confirmed by simultaneously recording responses from single optic nerve fibers of the same animal. We report here that the eye transmits to the brain robust “neural images” of objects having the size, contrast, and motion of potential mates. The neural code for such objects is not found in ambiguous messages of individual optic nerve fibers but rather in patterns of coherent activity that extend over small ensembles of nerve fibers and are bound together by stimulus motion. Integrative properties of neurons in the first synaptic layer of the brain appear well suited to detecting the patterns of coherent activity. Neural coding by this relatively simple eye helps explain how horseshoe crabs find mates and may lead to a better understanding of how more complex sensory organs process information.