132 results for Range-finding



Abstract:

Many studies have reported long-range synchronization of neuronal activity between brain areas, in particular in the beta and gamma bands with frequencies in the range of 14–30 and 40–80 Hz, respectively. Several studies have reported synchrony with zero phase lag, which is remarkable considering the synaptic and conduction delays inherent in the connections between distant brain areas. This result has led to many speculations about the possible functional role of zero-lag synchrony, such as for neuronal communication, attention, memory, and feature binding. However, recent studies using recordings of single-unit activity and local field potentials report that neuronal synchronization may occur with non-zero phase lags. This raises the questions of whether zero-lag synchrony can occur in the brain and, if so, under which conditions. We used analytical methods and computer simulations to investigate which connectivity between neuronal populations allows or prohibits zero-lag synchrony. We did so for a model in which two oscillators interact via a relay oscillator. Analytical results and computer simulations were obtained for both type I Mirollo–Strogatz neurons and type II Hodgkin–Huxley neurons. We investigated the dynamics of the model for various types of synaptic coupling and, importantly, considered the potential impact of spike-timing-dependent plasticity (STDP) and its learning window. We confirm previous results that zero-lag synchrony can be achieved in this configuration. It is much easier to achieve with Hodgkin–Huxley neurons, which have a biphasic phase response curve, than with type I neurons. STDP facilitates zero-lag synchrony, as it adjusts the synaptic strengths such that zero-lag synchrony is feasible for a much larger range of parameters than without STDP.
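As a toy illustration of the relay configuration (using simple Kuramoto phase oscillators, not the Mirollo–Strogatz or Hodgkin–Huxley models of the study), one can couple two outer oscillators to each other only through a relay, with an explicit conduction delay on every connection. By symmetry, the outer pair can still lock at zero lag even though each lags the relay. All parameter values below are illustrative.

```python
import numpy as np

def relay_network(T=300.0, dt=0.01, K=1.0, omega=1.0, tau=0.3):
    """Three Kuramoto phase oscillators: nodes 0 and 2 interact only
    via relay node 1; every connection carries a conduction delay tau."""
    n = int(T / dt)
    d = int(tau / dt)
    theta = np.zeros((n, 3))
    theta[0] = [0.0, 1.0, 2.5]           # distinct initial phases
    for t in range(n - 1):
        td = max(t - d, 0)               # index of the delayed phases
        th, thd = theta[t], theta[td]
        dth = np.full(3, omega)
        dth[0] += K * np.sin(thd[1] - th[0])   # relay -> outer 0
        dth[2] += K * np.sin(thd[1] - th[2])   # relay -> outer 2
        dth[1] += K * (np.sin(thd[0] - th[1]) + np.sin(thd[2] - th[1]))
        theta[t + 1] = th + dt * dth     # forward Euler step
    return theta

theta = relay_network()
# phase lag between the two outer oscillators, wrapped to (-pi, pi]
lag = np.angle(np.exp(1j * (theta[-1, 0] - theta[-1, 2])))
print(abs(lag))  # near zero: the outer pair synchronizes at zero lag
```

The zero-lag state is attracting here because the delayed relay input is common to both outer nodes, so any phase difference between them decays; the relay itself settles at a non-zero offset.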


Abstract:

During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3(chem)) over the whole simulation is greater than the net physical processing (ΔO3(phys)) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower-tropospheric transport) or production (an upper-tropospheric biomass burning case). However, physical processes influence O3 indirectly through the addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3(chem) than on ΔO3(phys). Despite its smaller magnitude, the physical processing distinguishes the lower-tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases.
Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). It is therefore a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
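The budget decomposition ΔO3 = ΔO3(chem) + ΔO3(phys) can be sketched with invented numbers (not the study's data): a steady photochemical destruction of 5 ppbv per day dominates the net change because an oscillating mixing tendency largely cancels over whole days, illustrating how mixing cancellation makes the chemical term larger than the physical one.

```python
import numpy as np

# hypothetical hourly O3 tendencies (ppbv/h) along a 5-day trajectory
hours = np.arange(120)
chem = np.full(120, -5.0 / 24.0)                    # steady net photochemical destruction
mixing = 0.8 * np.sin(2.0 * np.pi * hours / 24.0)   # entrainment/detrainment, cancels over each day
deposition = np.full(120, -0.01)                    # slow loss to the surface

d_o3_chem = chem.sum()                  # net chemical processing
d_o3_phys = (mixing + deposition).sum() # net physical processing
d_o3 = d_o3_chem + d_o3_phys            # net change in the O3 mixing ratio
print(d_o3_chem, d_o3_phys, d_o3)       # chem: -25 ppbv; phys: about -1.2 ppbv
```

Even though the instantaneous mixing tendency is comparable in size to the chemical one, its time integral nearly vanishes, which is the cancellation effect described above.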


Abstract:

In addition to the Hamiltonian functional itself, non-canonical Hamiltonian dynamical systems generally possess integral invariants known as ‘Casimir functionals’. In the case of the Euler equations for a perfect fluid, the Casimir functionals correspond to the vortex topology, whose invariance derives from the particle-relabelling symmetry of the underlying Lagrangian equations of motion. In a recent paper, Vallis, Carnevale & Young (1989) have presented algorithms for finding steady states of the Euler equations that represent extrema of energy subject to given vortex topology, and are therefore stable. The purpose of this note is to point out a very general method for modifying any Hamiltonian dynamical system into an algorithm that is analogous to those of Vallis et al. in that it will systematically increase or decrease the energy of the system while preserving all of the Casimir invariants. By incorporating momentum into the extremization procedure, the algorithm is able to find steadily translating as well as steady stable states. The method is applied to a variety of perfect-fluid systems, including Euler flow as well as compressible and incompressible stratified flow.
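A minimal finite-dimensional analogue (our illustration, not the paper's fluid algorithm) is the free rigid body: replacing the Hamiltonian flow dm/dt = m × ∇H with the modified "double bracket" flow dm/dt = m × (m × ∇H) monotonically decreases the energy H = Σ mᵢ²/(2Iᵢ) while exactly preserving the Casimir C = |m|², so the system relaxes to a stable steady rotation (about the axis of largest inertia).

```python
import numpy as np

I = np.array([1.0, 2.0, 3.0])   # principal moments of inertia (illustrative)

def grad_H(m):
    return m / I                 # gradient of H = sum m_i^2 / (2 I_i)

def double_bracket(m):
    # m x (m x grad H): dH/dt = -|m x grad H|^2 <= 0, while d|m|^2/dt = 0
    return np.cross(m, np.cross(m, grad_H(m)))

def rk4_step(m, dt):
    k1 = double_bracket(m)
    k2 = double_bracket(m + 0.5 * dt * k1)
    k3 = double_bracket(m + 0.5 * dt * k2)
    k4 = double_bracket(m + dt * k3)
    return m + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

m = np.array([0.5, 1.0, 0.7])
C0 = m @ m                       # Casimir: squared angular momentum
H = [m @ grad_H(m) / 2.0]
for _ in range(5000):
    m = rk4_step(m, 0.01)
    H.append(m @ grad_H(m) / 2.0)

C_drift = abs(m @ m - C0) / C0   # should be near machine precision
print(H[0], H[-1], C_drift)      # energy falls toward the minimum C0 / (2 * I_max)
```

The final state is the energy minimum on the Casimir level set, i.e. a stable steady state, which is exactly the logic of the extremization algorithm described above.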


Abstract:

The role of atmospheric general circulation model (AGCM) horizontal resolution in representing the global energy budget and hydrological cycle is assessed, with the aim of improving the understanding of model uncertainties in simulating the hydrological cycle. We use two AGCMs from the UK Met Office Hadley Centre: HadGEM1-A at resolutions ranging from 270 to 60 km, and HadGEM3-A ranging from 135 to 25 km. The models exhibit a stable hydrological cycle, although it is too intense compared with reanalyses and observations. This over-intensity is explained by excess surface shortwave radiation, a common error in general circulation models (GCMs), and is insensitive to resolution. However, as resolution increases, precipitation decreases over the ocean and increases over the land. This is associated with an increase in atmospheric moisture transport from ocean to land, which shifts the partitioning of the moisture fluxes that contribute to precipitation over land from local toward more non-local moisture sources. The results start to converge at 60-km resolution, which underlines the excessive reliance of the mean hydrological cycle on physical parametrization (local unresolved processes) rather than model dynamics (large-scale resolved processes) in the coarser HadGEM1 and HadGEM3 GCMs. This finding may hold for other GCMs, showing the need to analyze other families of GCMs with such a range of horizontal resolutions as they become available. Our finding supports the hypothesis that heterogeneity in model parametrization is one of the underlying causes of model disagreement in the Coupled Model Intercomparison Project (CMIP) exercises.


Abstract:

By combining electrostatic measurements of lightning-induced electrostatic field changes with radio-frequency lightning location, field changes from some exceptionally distant lightning events are apparent that are inconsistent with the usual inverse-cube dependence on distance. Furthermore, by using two measurement sites, a transition zone can be identified beyond which the electric field response reverses polarity. For these severe lightning events, we infer a horizontally extensive charge sheet above the thunderstorm, consistent with a mesospheric halo of several hundred kilometers’ extent.
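For reference, the usual inverse-cube baseline follows from modelling the lightning charge change as a point charge plus its image in the conducting ground; the geometry and numbers below are illustrative, not the paper's data.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def field_change(q, h, d):
    """Vertical electrostatic field change at the ground, a horizontal
    distance d from a charge change q at height h, including the image
    charge in the conducting ground."""
    return 2.0 * q * h / (4.0 * np.pi * EPS0 * (h**2 + d**2) ** 1.5)

q, h = -10.0, 10e3                 # e.g. 10 C lowered from 10 km altitude
e100 = field_change(q, h, 100e3)   # field change at 100 km
e200 = field_change(q, h, 200e3)   # field change at 200 km
ratio = e200 / e100
print(ratio)  # ~0.126, close to the point-dipole value (1/2)**3 = 0.125
```

For d much larger than h the expression falls off as d⁻³; measured field changes decaying more slowly than this, or reversing sign beyond some distance, are the signatures that point to a horizontally extensive elevated charge sheet rather than a compact charge change.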


Abstract:

Recent research into sea ice friction has focussed on ways to provide a model which maintains much of the clarity and simplicity of Amontons' law, yet also accounts for memory effects. One promising avenue of research has been to adapt the rate- and state-dependent models which are prevalent in rock friction. In such models it is assumed that there is some fixed critical slip displacement, which is effectively a measure of the displacement over which memory effects might be considered important. Here we show experimentally that a fixed critical slip displacement is not a valid assumption in ice friction, whereas a constant critical slip time appears to hold across a range of parameters and scales. As a simple rule of thumb, memory effects persist to a significant level for 10 s. We then discuss the implications of this finding for modelling sea ice friction and for our understanding of friction in general.
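The contrast between the two assumptions can be sketched with a toy state-evolution model (illustrative parameters, not the experimental values): with a fixed critical slip displacement D_c, memory decays over a fixed sliding distance, so the relaxation time scales as D_c/V; with a fixed critical slip time t_c, the relaxation time is independent of the sliding velocity V.

```python
import numpy as np

def efold_time(dphi_dt, phi0, phi_ss, dt=1e-4, t_max=50.0):
    """Integrate dphi/dt and return the time at which phi has closed
    all but 1/e of the gap to its steady state phi_ss."""
    phi, t = phi0, 0.0
    target = abs(phi0 - phi_ss) / np.e
    while abs(phi - phi_ss) > target and t < t_max:
        phi += dt * dphi_dt(phi)
        t += dt
    return t

D_c, t_c = 1.0, 1.0   # critical slip distance (m) / critical slip time (s), illustrative

results = {}
for V in (1.0, 10.0):
    # Dieterich-type 'aging' law: state relaxes over a fixed slip distance D_c
    slip_law = efold_time(lambda phi: 1.0 - V * phi / D_c, 0.0, D_c / V)
    # fixed-critical-slip-time law: state relaxes over a fixed time t_c
    time_law = efold_time(lambda phi: (D_c / V - phi) / t_c, 0.0, D_c / V)
    results[V] = (slip_law, time_law)
    print(V, slip_law, time_law)
```

A tenfold increase in V shortens the slip-law relaxation time tenfold (constant slip distance), while the time-law relaxation stays near t_c, which is the behaviour the ice experiments favour.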


Abstract:

The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’, which lead to the spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, we can therefore identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
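A toy simulation of the cross-correlation idea on synthetic gate time series (the leakage fraction and gate powers are invented): because the leaked signal is a scaled copy of the strong gate's echo, the squared correlation coefficient between the two gates estimates the leaked fraction of the observed power, which can then be subtracted.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000  # number of pulses in each echo time series

def echo(power, n):
    """Complex Gaussian echo time series with the given mean power."""
    return np.sqrt(power / 2.0) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

strong = echo(100.0, n)                # gate with heavy precipitation
weak_true = echo(1.0, n)               # neighbouring near-clear gate
alpha = 0.3                            # amplitude leakage via a range sidelobe
weak_obs = weak_true + alpha * strong  # observed, contaminated gate

# magnitude of the complex correlation coefficient between the two gates
rho = np.vdot(strong, weak_obs) / np.sqrt(
    np.vdot(strong, strong).real * np.vdot(weak_obs, weak_obs).real
)
leak_frac = abs(rho) ** 2              # fraction of observed power that leaked in
p_obs = np.mean(np.abs(weak_obs) ** 2)
p_corrected = p_obs * (1.0 - leak_frac)
print(p_obs, p_corrected)  # ~10 before correction, ~1 after
```

Since the true echoes of different gates are uncorrelated, |ρ|² = α²P_strong / (P_weak + α²P_strong), i.e. exactly the contaminated power fraction, so a large correlation both flags the contamination and supplies the correction factor.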