1000 results for Lemhi Range


Relevance: 20.00%

Publisher:

Abstract:

Many studies have reported long-range synchronization of neuronal activity between brain areas, in particular in the beta and gamma bands with frequencies in the range of 14–30 and 40–80 Hz, respectively. Several studies have reported synchrony with zero phase lag, which is remarkable considering the synaptic and conduction delays inherent in the connections between distant brain areas. This result has led to many speculations about the possible functional role of zero-lag synchrony, such as for neuronal communication, attention, memory, and feature binding. However, recent studies using recordings of single-unit activity and local field potentials report that neuronal synchronization may occur with non-zero phase lags. This raises the question of whether zero-lag synchrony can occur in the brain and, if so, under which conditions. We used analytical methods and computer simulations to investigate which connectivity between neuronal populations allows or prohibits zero-lag synchrony. We did so for a model in which two oscillators interact via a relay oscillator. Analytical results and computer simulations were obtained for both type I Mirollo–Strogatz neurons and type II Hodgkin–Huxley neurons. We investigated the dynamics of the model for various types of synaptic coupling and, importantly, considered the potential impact of spike-timing-dependent plasticity (STDP) and its learning window. We confirm previous results that zero-lag synchrony can be achieved in this configuration. It is much easier to achieve with Hodgkin–Huxley neurons, which have a biphasic phase response curve, than with type I neurons. STDP facilitates zero-lag synchrony by adjusting the synaptic strengths such that zero-lag synchrony is feasible over a much larger range of parameters than without STDP.
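As a toy illustration of the relay configuration, the sketch below simulates three identical pulse-coupled phase oscillators A–R–B with a Mirollo–Strogatz-style absorption rule (a received pulse that pushes a unit past threshold makes it fire immediately). This is a hedged stand-in, not the authors' model; all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate(phases, eps=0.2, steps=20000, dt=1e-3):
    """Pulse-coupled phase oscillators A - R - B (relay in the middle).

    Each unit's phase grows at unit rate; on crossing 1 it fires, resets,
    and advances its neighbours' phases by eps, with a unit firing
    immediately if a pulse pushes it past threshold (absorption).
    Returns the final phase lag |phi_A - phi_B|, wrapped to [0, 0.5].
    """
    phi = np.array(phases, dtype=float)        # [A, R, B]
    neighbours = {0: [1], 1: [0, 2], 2: [1]}   # relay topology
    for _ in range(steps):
        phi += dt
        fired = set(np.where(phi >= 1.0)[0])
        pending = list(fired)
        while pending:                         # cascade of triggered firings
            unit = pending.pop()
            for nb in neighbours[unit]:
                if nb not in fired:
                    phi[nb] += eps
                    if phi[nb] >= 1.0:
                        fired.add(nb)
                        pending.append(nb)
        for unit in fired:
            phi[unit] = 0.0
    lag = abs(phi[0] - phi[2]) % 1.0
    return min(lag, 1.0 - lag)
```

Because A and B receive identical input from the relay, a zero-lag state, once reached, is preserved by the common drive.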


During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than the net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases.
Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). It is therefore a useful new method for simulating air mass evolution and variability, and their sensitivity to process parameters.
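The decomposition of ΔO3 into chemical and physical components can be illustrated with a toy box model (a minimal sketch, not the Lagrangian photochemical model used in the study): net production minus loss is booked as ΔO3chem, relaxation towards a background mixing ratio as ΔO3phys, and the two terms sum to the total change by construction.

```python
def integrate(o3, o3_bg, prod, loss, k_mix, dt, steps):
    """Toy box model for ozone along an air mass trajectory.

    Accumulates the chemical tendency (production - loss) and the physical
    tendency (mixing towards a background value) separately, so that the
    net change deconstructs exactly as dO3 = dO3_chem + dO3_phys.
    """
    d_chem = d_phys = 0.0
    for _ in range(steps):
        chem = (prod - loss * o3) * dt       # photochemical term
        phys = k_mix * (o3_bg - o3) * dt     # mixing term
        o3 += chem + phys
        d_chem += chem
        d_phys += phys
    return o3, d_chem, d_phys
```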


By combining electrostatic measurements of lightning-induced field changes with radio-frequency lightning location, field changes from some exceptionally distant lightning events are apparent that are inconsistent with the usual inverse-cube decrease with distance. Furthermore, by using two measurement sites, a transition zone can be identified beyond which the electric field response reverses polarity. For these severe lightning events, we infer a horizontally extensive charge sheet above a thunderstorm, consistent with a mesospheric halo several hundred kilometers in extent.
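The "usual inverse cube of distance" refers to the far-field fall-off of the field change from an elevated point charge and its image in the conducting ground; the sketch below (with illustrative values, not the paper's data) shows that doubling the distance reduces the field change by a factor of eight once the distance is large compared to the charge height.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def field_change(q, h, d):
    """Vertical electric-field change at the ground, horizontal distance d
    from a point charge q at altitude h, including its image charge in the
    conducting ground: E = 2*q*h / (4*pi*eps0 * (d**2 + h**2)**1.5).
    For d >> h this falls off as 1/d**3."""
    return 2.0 * q * h / (4.0 * math.pi * EPS0 * (d**2 + h**2) ** 1.5)
```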


The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’, which lead to spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, therefore, we can identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition, we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
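The link between the cross-correlation coefficient and leaked power can be sketched with synthetic Gaussian time series (illustrative data, not the article's algorithm): for a contaminated gate y = s_b + a·s_a with independent, equal-variance signals, the squared correlation with s_a equals the leaked power fraction a²/(1+a²).

```python
import numpy as np

def leaked_power_fraction(gate_a, gate_b):
    """Estimate the fraction of gate_b's power that leaked in from gate_a.
    Meteorological echoes decorrelate between gates, so a non-zero
    cross-correlation flags sidelobe contamination; its square estimates
    the leaked power fraction."""
    rho = np.corrcoef(gate_a, gate_b)[0, 1]
    return rho ** 2

# synthetic echo time series, unique to each range gate
rng = np.random.default_rng(42)
clean = rng.standard_normal(200_000)   # gate A's own echo
other = rng.standard_normal(200_000)   # gate B's own echo
contaminated = other + 0.5 * clean     # 20% of gate B's power is leakage
```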


Providing probabilistic forecasts using ensemble prediction systems has become increasingly popular in both the meteorological and hydrological communities. Compared to conventional deterministic forecasts, probabilistic forecasts may provide more reliable forecasts from a few hours to a number of days ahead, and hence are regarded as better tools for taking uncertainties into consideration and hedging against weather risks. It is essential to evaluate the performance of raw ensemble forecasts and their potential value in forecasting extreme hydro-meteorological events. This study evaluates ECMWF’s medium-range ensemble forecasts of precipitation over the period 2008/01/01–2012/09/30 for a selected mid-latitude large-scale river basin, the Huai river basin (ca. 270,000 km²) in central-east China. The evaluation unit is the sub-basin, in order to assess forecast performance in a hydrologically relevant way. The study finds that forecast performance varies with sub-basin properties, between flooding and non-flooding seasons, and with the forecast properties of aggregated time steps and lead times. Although the study does not evaluate any hydrological applications of the ensemble precipitation forecasts, its results have direct implications for hydrological forecasting, should these ensemble precipitation forecasts be employed in hydrology.
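A standard way to evaluate such ensemble precipitation forecasts at the sub-basin scale is a probabilistic verification score; the sketch below computes the Brier score for a threshold-exceedance event (the choice of score and threshold here is an illustrative assumption, not necessarily what the study used).

```python
import numpy as np

def brier_score(ensemble, obs, threshold):
    """Brier score for the event 'precipitation exceeds threshold'.

    ensemble: (n_days, n_members) forecast precipitation for one sub-basin;
    obs: (n_days,) observed precipitation. The forecast probability for
    each day is the fraction of ensemble members exceeding the threshold;
    0 is a perfect score."""
    p = (ensemble > threshold).mean(axis=1)
    o = (obs > threshold).astype(float)
    return float(np.mean((p - o) ** 2))
```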


With advances in technology, terahertz imaging and spectroscopy are beginning to move out of the laboratory and find applications in areas as diverse as security screening, medicine, art conservation and field archaeology. Nevertheless, there is still a need to improve upon the performance of existing terahertz systems to achieve greater compactness and robustness, enhanced spatial resolution, more rapid data acquisition times and operation at greater standoff distances. This chapter will review recent technological developments in this direction that make use of nanostructures in the generation, detection and manipulation of terahertz radiation. The chapter will also explain how terahertz spectroscopy can be used as a tool to characterize the ultrafast carrier dynamics of nanomaterials.


The pollen beetle, Meligethes aeneus, is a significant pest of oilseed rape crops and there is considerable research effort focused on developing novel, sustainable methods of integrated control. These insects rely on flight for all dispersal movements and we have investigated their flight patterns using a novel combination of data from suction traps, vertical-looking radar and field counts. Analysis of these preliminary data will help determine the best timing for different control measures within an integrated pest management strategy.


Although no GM crops are currently licensed for commercial production in the UK, this could change quickly as consumer opposition to GM crops softens. While past studies have examined the attitudes of UK farmers toward GM technologies in general, there has been little work on the impact of possible coexistence measures on their attitudes toward GM crop production. This could be because the UK Government has not engaged in any public dialogue on the coexistence measures that might be applied on farms. Based on a farm survey, this article examines farmers’ attitudes toward GM technologies and planting intentions for three crops (maize, oilseed rape, and sugar beet) under a GM availability scenario. The article then nuances this analysis with a review of farmer perceptions of the level of constraint associated with a suite of notional farm-level coexistence measures and issues, based on current European Commission guidelines and practice in other EU Member States.


Incomplete understanding of three aspects of the climate system (equilibrium climate sensitivity, the rate of ocean heat uptake and historical aerosol forcing) and of the physical processes underlying them leads to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century [1,2]. Explorations of these uncertainties have so far relied on scaling approaches [3,4], large ensembles of simplified climate models [1,2], or small ensembles of complex coupled atmosphere–ocean general circulation models [5,6], which under-represent uncertainties in key climate system properties derived from independent sources [7–9]. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere–ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4–3 K by 2050, relative to 1961–1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report [10], but extends towards larger warming than observed in the ensembles-of-opportunity [5] typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range ‘no mitigation’ scenario for greenhouse-gas emissions.
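The constrain-then-project step described above can be sketched as follows (synthetic numbers; the study's actual ensemble screening against observations is considerably more involved): keep only members whose simulated historical trend matches observations, then report the spread of their projected warming.

```python
import numpy as np

def constrained_range(past_trend, warming_2050, obs_trend, tol):
    """History-matching sketch: keep ensemble members whose simulated
    past-50-year warming trend is within tol of the observed trend, then
    report the 5-95% range of their projected 2050 warming (relative to
    a 1961-1990 baseline)."""
    keep = np.abs(past_trend - obs_trend) <= tol
    lo, hi = np.percentile(warming_2050[keep], [5, 95])
    return float(lo), float(hi)
```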


Wall plaster sequences from the Neolithic town of Çatalhöyük have been analysed and compared to three types of natural sediment found in the vicinity of the site, using a range of analytical techniques. Block samples containing the plaster sequences were removed from the walls of several different buildings on the East Mound. Sub-samples were examined by IR spectroscopy, X-ray diffraction and X-ray fluorescence to determine the overall mineralogical and elemental composition, whilst thin sections were studied using optical polarising microscopy, IR Microscopy and Environmental Scanning Electron Microscopy with Energy Dispersive X-ray analysis. The results of this study have shown that there are two types of wall plaster found in the sequences and that the sediments used to produce these were obtained from at least two distinct sources. In particular, the presence of clay, calcite and magnesian calcite in the foundation plasters suggested that these were prepared predominantly from a marl source. On the other hand, the finishing plasters were found to contain dolomite with a small amount of clay and no calcite, revealing that softlime was used in their preparation. Whilst marl is located directly below and around Çatalhöyük, the nearest source of softlime is 6.5 km away, an indication that the latter was important to the Neolithic people, possibly due to the whiter colour (5Y 8/1) of this sediment. Furthermore, the same two plaster types were found on each wall of Building 49, the main building studied in this research, and in all five buildings investigated, suggesting that the use of these sources was an established practice for the inhabitants of several different households across the site.


Many of the next generation of global climate models will include aerosol schemes which explicitly simulate the microphysical processes that determine the particle size distribution. These models enable aerosol optical properties and cloud condensation nuclei (CCN) concentrations to be determined by fundamental aerosol processes, which should lead to a more physically based simulation of aerosol direct and indirect radiative forcings. This study examines the global variation in particle size distribution simulated by 12 global aerosol microphysics models to quantify model diversity and to identify any common biases against observations. Evaluation against size distribution measurements from a new European network of aerosol supersites shows that the mean model agrees quite well with the observations at many sites on the annual mean, but there are some seasonal biases common to many sites. In particular, at many of these European sites, the accumulation mode number concentration is biased low during winter and Aitken mode concentrations tend to be overestimated in winter and underestimated in summer. At high northern latitudes, the models strongly underpredict Aitken and accumulation particle concentrations compared to the measurements, consistent with previous studies that have highlighted the poor performance of global aerosol models in the Arctic. In the marine boundary layer, the models capture the observed meridional variation in the size distribution, which is dominated by the Aitken mode at high latitudes, with an increasing concentration of accumulation particles with decreasing latitude. Considering vertical profiles, the models reproduce the observed peak in total particle concentrations in the upper troposphere due to new particle formation, although modelled peak concentrations tend to be biased high over Europe. 
Overall, the multi-model-mean data set simulates the global variation of the particle size distribution with a good degree of skill, suggesting that most of the individual global aerosol microphysics models are performing well, although the large model diversity indicates that some models are in poor agreement with the observations. Further work is required to better constrain size-resolved primary and secondary particle number sources, and an improved understanding of nucleation and growth (e.g. the role of nitrate and secondary organics) will improve the fidelity of simulated particle size distributions.
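Aerosol microphysics schemes commonly represent the particle size distribution as a sum of lognormal modes (e.g. Aitken and accumulation); the sketch below gives one mode's dN/dlog10(D), with illustrative parameter values rather than any specific model's settings.

```python
import numpy as np

def dn_dlogd(d, n_tot, d_med, sigma_g):
    """Number size distribution dN/dlog10(D) of a single lognormal aerosol
    mode with total number n_tot (e.g. cm^-3), median diameter d_med (nm)
    and geometric standard deviation sigma_g. Integrating over log10(D)
    recovers n_tot."""
    log_sig = np.log10(sigma_g)
    return (n_tot / (np.sqrt(2.0 * np.pi) * log_sig)
            * np.exp(-np.log10(d / d_med) ** 2 / (2.0 * log_sig ** 2)))
```

A full scheme would sum several such modes (nucleation, Aitken, accumulation, coarse) and evolve their parameters through nucleation, condensational growth and coagulation.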


Current UK intake of non-milk extrinsic sugars (NMES) is above recommendations. Reducing the sugar content of processed high sugar foods through reformulation is one option for reducing consumption of NMES at a population level. However, reformulation can alter the sensory attributes of food products and influence consumer liking. This study evaluated consumer acceptance of a selection of products that are commercially available in the UK; these included regular and sugar-reduced baked beans, strawberry jam, milk chocolate, cola and cranberry & raspberry juice. Sweeteners were present in the reformulated chocolate (maltitol), cola (aspartame and acesulfame-K) and juice (sucralose) samples. Healthy, non-smoking consumers (n = 116; 55 men, 61 women, age: 33 ± 9 years; BMI: 25.7 ± 4.6 kg/m²) rated the products for overall liking and on liking of appearance, flavor and texture using a nine-point hedonic scale. There were significant differences between standard and reduced sugar products in consumers’ overall liking and on liking of each modality (appearance, flavor and texture; all P < 0.0001). For overall liking, only the regular beans and cola were significantly more liked than their reformulated counterparts (P < 0.0001). Cluster analysis identified three consumer clusters that were representative of different patterns of consumer liking. For the largest cluster (cluster 3: 45%), there was a significant difference in mean liking scores across all products, except jam. Differences in liking were predominantly driven by sweet taste in 2 out of 3 clusters. The current research has demonstrated that a high proportion of consumers prefer conventional products over sugar-reduced products across a wide range of product types (45%) or across selected products (27%), when tasted unbranded, and so there is room for further optimization of the commercial reduced-sugar products evaluated in the current study.
Future work should evaluate strategies to facilitate compliance with dietary recommendations on NMES and free sugars, such as the impact of exposure to sugar-reduced foods on their acceptance.
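The cluster analysis of consumer liking can be illustrated with a minimal k-means sketch over hedonic scores (toy data; the specific clustering method used in the study is not stated in the abstract, so k-means here is an assumption).

```python
import numpy as np

def kmeans(scores, k, iters=50, seed=0):
    """Minimal k-means: cluster consumers (rows) by their nine-point
    hedonic liking scores for each product (columns)."""
    rng = np.random.default_rng(seed)
    # initialise centres from k randomly chosen consumers
    centres = scores[rng.choice(len(scores), k, replace=False)].astype(float)
    for _ in range(iters):
        # squared Euclidean distance of every consumer to every centre
        dists = ((scores[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = scores[labels == j].mean(axis=0)
    return labels, centres
```

Mean liking per product within each cluster (the centres) then characterises each segment's preference pattern.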


An analysis of diabatic heating and moistening processes in 12–36-hour lead-time forecasts from 12 general circulation models is presented as part of the "Vertical structure and physical processes of the Madden–Julian Oscillation (MJO)" project. A lead time of 12–36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations, while avoiding being too close to the initial spin-up as the models adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and the YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics are reasonably constrained, the moistening and heating profiles have large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time-step behaviour shows that some models exhibit strong intermittency in rainfall, and that the relationship between precipitation and dynamics differs between models. The wealth of model output archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process-model experiments, and inform the priorities for field experiments and future observing systems.


Free range egg producers face continuing problems from injurious pecking (IP), which has financial consequences for farmers and poor welfare implications for birds. Beak trimming has been practised for many years to limit the damage caused by IP, but with the UK Government giving notification that it intends to ban beak trimming in 2016, considerable efforts have been made to devise feasible housing, range and management strategies to reduce IP. A recent research project investigated the efficacy of a range of IP-reducing management strategies, the mean costs of which came to around 5 pence per bird. Here, the results of the above project’s consumer survey are presented: consumers’ attitudes to free range egg production are detailed, showing that, whilst consumers had a very positive attitude towards free range eggs, they were notably uninformed about some aspects of free range egg production. The contingent valuation technique was used to estimate the price premium consumers would be prepared to pay to ensure that hens do not suffer from IP: this was calculated as just over 3% on top of the prevailing retail price of free range eggs. These findings reinforce other studies that have found that, whilst consumers are not generally well informed about certain specific welfare problems faced by animals under free range conditions, they are prepared to pay to improve animal welfare. Indeed, the study findings suggest that producers could obtain an additional price premium if they demonstrate the welfare provenance of their eggs, perhaps through marketing the eggs as coming from birds with intact beaks. This welfare provenance issue could usefully be assured to consumers by the introduction of a mandatory, single, accredited EU-wide welfare-standards labelling scheme.