68 results for low frequency motion
in CentAUR: Central Archive University of Reading - UK
Abstract:
Experiments have been performed using a simplified, Newtonian-forced global circulation model to investigate how variability of the tropospheric jet can be characterized by examining the combined fluctuations of the two leading modes of annular variability. Eddy forcing of this variability is analyzed in the phase space of the leading modes using the vertically integrated momentum budget. The nature of the annular variability and eddy forcing depends on the time scale. At low frequencies the zonal flow and baroclinic eddies are in quasi-equilibrium and anomalies propagate poleward. The eddies are shown primarily to reinforce the anomalous state and are closely balanced by the linear damping, leaving slow evolution as a residual. At high frequencies the flow is strongly evolving and anomalies are initiated on the poleward side of the tropospheric jet and propagate equatorward. The eddies are shown to drive this evolution strongly: eddy location and amplitude reflect the past baroclinicity, while eddy feedback on the zonal flow may be interpreted in terms of wave breaking associated with baroclinic life cycles in lateral shear.
Abstract:
An isentropic potential vorticity (PV) budget analysis is employed to examine the role of synoptic transients, advection, and nonconservative processes as forcings for the evolution of the low-frequency PV anomalies locally and those associated with the North Atlantic Oscillation (NAO) and the Pacific–North American (PNA) pattern. Specifically, the rate of change of the low-frequency PV is expressed as a sum of tendencies due to divergence of eddy transport, advection by the low-frequency flow (hereafter referred to as advection), and the residual nonconservative processes. The balance between the variances and covariances of these terms is illustrated using a novel vector representation. It is shown that for most locations, as well as for the PNA pattern, the PV variability is dominantly driven by advection. The eddy forcing explains a small amount of the tendency variance. For the NAO, the role of synoptic eddy fluxes is found to be stronger, explaining on average 15% of the NAO tendency variance. Previous studies have not assessed quantitatively how the various forcings balance the tendency. Thus, such studies may have overestimated the role of eddy fluxes for the evolution of teleconnections by examining, for example, composites and regressions that indicate maintenance, rather than evolution driven by the eddies. The authors confirm this contrasting view by showing that during persistent blocking (negative NAO) episodes the eddy driving is relatively stronger.
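The tendency-variance bookkeeping described above can be illustrated with a toy calculation. The sketch below uses synthetic series (not the paper's data) standing in for the three forcings: a "tendency" is built as their sum, and the fraction of tendency variance attributable to each term is recovered as cov(term, tendency)/var(tendency), which sums to one by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic forcing series standing in for advection, eddy flux
# divergence, and the nonconservative residual (hypothetical data).
advection = rng.normal(0.0, 1.0, n)
eddy = rng.normal(0.0, 0.4, n)
residual = rng.normal(0.0, 0.2, n)

# The PV tendency is the sum of its forcings, as in the budget.
tendency = advection + eddy + residual

# Fraction of tendency variance "explained" by each forcing:
# cov(term, tendency) / var(tendency).  By bilinearity of the
# covariance these fractions sum exactly to one.
var_t = tendency.var(ddof=1)
fractions = {name: np.cov(term, tendency)[0, 1] / var_t
             for name, term in [("advection", advection),
                                ("eddy", eddy),
                                ("residual", residual)]}
print(fractions)
```

With these variances the advection term dominates the budget, mirroring the paper's finding that advection drives most of the PV tendency variance.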
Abstract:
A connection is shown to exist between the mesoscale eddy activity around Madagascar and the large-scale interannual variability in the Indian Ocean. We use the combined TOPEX/Poseidon-ERS sea surface height (SSH) data for the period 1993–2003. The SSH fields in the Mozambique Channel and east of Madagascar exhibit a significant interannual oscillation. This is related to the arrival of large-scale anomalies that propagate westward along 10°–15°S in response to Indian Ocean dipole (IOD) events. Positive (negative) SSH anomalies associated with a positive (negative) IOD phase induce a shift in the intensity and position of the tropical and subtropical gyres. As a result, the intensity of the South Equatorial Current and its branches along east Madagascar weakens (strengthens). In addition, the flow through the narrows of the Mozambique Channel around 17°S increases (decreases) during periods of a stronger and northward (southward) extension of the subtropical (tropical) gyre. Interaction between the currents in the narrows and southward-propagating eddies from the northern Channel leads to interannual variability in the eddy kinetic energy of the central Channel, in phase with that of the SSH field.
Abstract:
Changes in the effective potential function of a low-frequency large-amplitude molecular vibration, resulting from excitation of a high-frequency vibration, are discussed. It is shown that in some situations a significant contribution to such changes may arise from failure of the Born-Oppenheimer separation of the low-frequency mode. In the particular example of the HF dimer, recent evidence that the tunneling barrier increases on exciting either of the H-stretching vibrations is probably due to this effect.
Abstract:
In this paper we are mainly concerned with the development of efficient computer models capable of accurately predicting the propagation of low-to-middle frequency sound in the sea, in axially symmetric (2D) and in fully 3D environments. The major physical features of the problem, i.e. a variable bottom topography, elastic properties of the subbottom structure, volume attenuation and other range inhomogeneities are efficiently treated. The computer models presented are based on normal mode solutions of the Helmholtz equation on the one hand, and on various types of numerical schemes for parabolic approximations of the Helmholtz equation on the other. A new coupled mode code is introduced to model sound propagation in range-dependent ocean environments with variable bottom topography, where the effects of an elastic bottom, of volume attenuation, surface and bottom roughness are taken into account. New computer models based on finite difference and finite element techniques for the numerical solution of parabolic approximations are also presented. They include an efficient modeling of the bottom influence via impedance boundary conditions, they cover wide angle propagation, elastic bottom effects, variable bottom topography and reverberation effects. All the models are validated on several benchmark problems and versus experimental data. Results thus obtained were compared with analogous results from standard codes in the literature.
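As a hedged illustration of the parabolic-approximation approach (not the authors' codes), the sketch below marches the narrow-angle parabolic equation 2ik0 psi_r + psi_zz + k0^2 (n^2 - 1) psi = 0 in range with a Crank-Nicolson scheme; the grid, sound-speed profile and starting field are invented for illustration. Because the discrete operator is real symmetric, the Crank-Nicolson step is unitary and the field norm is conserved during the march.

```python
import numpy as np

# Depth grid: z in (0, H) with pressure-release (Dirichlet) boundaries.
H, nz = 100.0, 200            # water depth (m), interior grid points
dz = H / (nz + 1)
z = np.linspace(dz, H - dz, nz)

f, c0 = 100.0, 1500.0         # frequency (Hz), reference sound speed (m/s)
k0 = 2.0 * np.pi * f / c0
n2 = (c0 / (1500.0 + 0.1 * z)) ** 2   # toy refraction profile n^2(z)

# Second-derivative matrix and the narrow-angle PE operator
# B = (D2 + k0^2 (n^2 - 1) I) / (2 k0), real and symmetric.
D2 = (np.diag(-2.0 * np.ones(nz)) + np.diag(np.ones(nz - 1), 1)
      + np.diag(np.ones(nz - 1), -1)) / dz**2
B = (D2 + np.diag(k0**2 * (n2 - 1.0))) / (2.0 * k0)

# Crank-Nicolson range step:
# (I - i dr B / 2) psi_new = (I + i dr B / 2) psi_old
dr = 5.0                      # range step (m)
I = np.eye(nz)
lhs = I - 0.5j * dr * B
rhs = I + 0.5j * dr * B

# Gaussian starting field centred on a 30 m source depth.
psi = np.exp(-((z - 30.0) / 10.0) ** 2).astype(complex)
norm0 = np.linalg.norm(psi)

for _ in range(100):          # march 500 m in range
    psi = np.linalg.solve(lhs, rhs @ psi)

print(abs(np.linalg.norm(psi) - norm0))   # unitarity check
```

A production code would use a banded solver and impedance or absorbing bottom conditions, as the abstract describes; the dense solve here simply keeps the sketch short.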
Abstract:
Starting from the classical Saltzman two-dimensional convection equations, we derive, via a severe spectral truncation, a minimal 10-ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with the ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (Eckert number Ec) is different from zero, an additional time scale of O(Ec^-1) is introduced in the system, as shown with standard multiscale analysis and made clear by extensive numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex systems dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and that cause the model to lose the ability to describe intrinsically multiscale processes.
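The 10-ODE system itself is not reproduced in the abstract, but the kind of integration involved can be sketched with the classical Lorenz-63 system that the truncated model generalizes (the parameter values below are the standard chaotic choice, an assumption for illustration only):

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the classical Lorenz-63 system."""
    x, y, z = state
    return np.array([sigma * (y - x),
                     x * (rho - z) - y,
                     x * y - beta * z])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

dt, steps = 0.01, 5000
traj = np.empty((steps, 3))
state = np.array([1.0, 1.0, 1.0])
for i in range(steps):
    state = rk4_step(lorenz, state, dt)
    traj[i] = state
```

The trajectory remains bounded on the chaotic attractor; the paper's system adds further slow and fast variables coupled through the Eckert-number feedback, which this minimal sketch does not attempt to reproduce.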
Abstract:
Real estate securities have a number of distinct characteristics that differentiate them from stocks generally. Key amongst them is that the firms are underpinned by assets that are both real and investment assets. The connections between the underlying macro-economy and listed real estate firms are therefore clearly demonstrated and of heightened importance. To consider the linkages with the underlying macro-economic fundamentals, we extract the ‘low-frequency’ volatility component from aggregate volatility shocks in 11 international markets over the 1990-2014 period. This is achieved using Engle and Rangel’s (2008) Spline-Generalized Autoregressive Conditional Heteroskedasticity (Spline-GARCH) model. The estimated low-frequency volatility is then examined together with low-frequency macro data in a fixed-effect pooled regression framework. The analysis reveals that the low-frequency volatility of real estate securities has a strong and positive association with most of the macroeconomic risk proxies examined. These include interest rates, inflation, GDP and foreign exchange rates.
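A crude stand-in for this low-frequency/high-frequency volatility split can be sketched on synthetic returns. The actual Spline-GARCH model estimates an exponential quadratic spline jointly with a GARCH short-run component by maximum likelihood; here a simple moving average of squared returns plays the role of the slow component, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Hypothetical slow-moving (low-frequency) variance component tau_t,
# standing in for the spline term of the Spline-GARCH model.
t = np.arange(n)
tau = np.exp(0.8 * np.sin(2 * np.pi * t / n))   # one slow cycle

# Returns whose unconditional variance follows tau_t.
returns = np.sqrt(tau) * rng.standard_normal(n)

# Crude low-frequency estimate: centred moving average of squared
# returns over roughly one trading year.
window = 250
kernel = np.ones(window) / window
tau_hat = np.convolve(returns**2, kernel, mode="same")

corr = np.corrcoef(tau, tau_hat)[0, 1]
print(corr)
```

Even this naive smoother tracks the slow variance component closely, which is the quantity the paper relates to macroeconomic risk proxies.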
Abstract:
In this article we assess the abilities of a new electromagnetic (EM) system, the CMD Mini-Explorer, for prospecting of archaeological features in Ireland and the UK. The Mini-Explorer is an EM probe which is primarily aimed at the environmental/geological prospecting market for the detection of pipes and geology. It has long been evident from the use of other EM devices that such an instrument might be suitable for shallow soil studies and applicable for archaeological prospecting. Of particular interest for the archaeological surveyor is the fact that the Mini-Explorer simultaneously obtains both quadrature (‘conductivity’) and in-phase (relative to ‘magnetic susceptibility’) data from three depth levels. As the maximum depth range is probably about 1.5 m, a comprehensive analysis of the subsoil within that range is possible. As with all EM devices the measurements require no contact with the ground, thereby negating the problem of high contact resistance that often besets earth resistance data during dry spells. The use of the CMD Mini-Explorer at a number of sites has demonstrated that it has the potential to detect a range of archaeological features and produces high-quality data that are comparable in quality to those obtained from standard earth resistance and magnetometer techniques. In theory the ability to measure two phenomena at three depths suggests that this type of instrument could reduce the number of poor outcomes that are the result of single measurement surveys. The high success rate reported here in the identification of buried archaeology using a multi-depth device that responds to the two most commonly mapped geophysical phenomena has implications for evaluation style surveys. Copyright © 2013 John Wiley & Sons, Ltd.
Abstract:
Every winter, the high-latitude oceans are struck by severe storms that are considerably smaller than the weather-dominating synoptic depressions [1]. Accompanied by strong winds and heavy precipitation, these often explosively developing mesoscale cyclones, termed polar lows [1], constitute a threat to offshore activities such as shipping or oil and gas exploitation. Yet owing to their small scale, polar lows are poorly represented in the observational and global reanalysis data [2] often used for climatological investigations of atmospheric features and cannot be assessed in coarse-resolution global simulations of possible future climates. Here we show that in a future anthropogenically warmed climate, the frequency of polar lows is projected to decline. We used a series of regional climate model simulations to downscale a set of global climate change scenarios [3] from the Intergovernmental Panel on Climate Change. In this process, we first simulated the formation of polar low systems in the North Atlantic and then counted the individual cases. A previous study [4] using NCEP/NCAR reanalysis data [5] revealed that polar low frequency from 1948 to 2005 did not systematically change. Now, in projections for the end of the twenty-first century, we found a significantly lower number of polar lows and a northward shift of their mean genesis region in response to elevated atmospheric greenhouse gas concentrations. This change can be related to changes in the North Atlantic sea surface temperature and mid-troposphere temperature; the latter is found to rise faster than the former, so that the resulting stability is increased, hindering the formation or intensification of polar lows. Our results provide a rare example of a climate change effect in which a type of extreme weather is likely to decrease, rather than increase.
Abstract:
Embodied theories of cognition propose that neural substrates used in experiencing the referent of a word, for example perceiving upward motion, should be engaged in weaker form when that word, for example ‘rise’, is comprehended. Motivated by the finding that the perception of irrelevant background motion at near-threshold, but not supra-threshold, levels interferes with task execution, we assessed whether interference from near-threshold background motion was modulated by its congruence with the meaning of words (semantic content) when participants completed a lexical decision task (deciding if a string of letters is a real word or not). Reaction times for motion words, such as ‘rise’ or ‘fall’, were slower when the direction of visual motion and the ‘motion’ of the word were incongruent, but only when the visual motion was at near-threshold levels. When motion was supra-threshold, the distribution of error rates, not reaction times, implicated low-level motion processing in the semantic processing of motion words. As the perception of near-threshold signals is not likely to be influenced by strategies, our results support close contact between semantic information and perceptual systems.
Abstract:
Current force feedback, haptic interface devices are generally limited to the display of low-frequency, high-amplitude spatial data. A typical device consists of a low-impedance framework of one or more degrees of freedom (dof), allowing a user to explore a pre-defined workspace via an end effector such as a handle, thimble, probe or stylus. The movement of the device is then constrained using high-gain positional feedback, thus reducing the apparent dof of the device and conveying the illusion of hard contact to the user. Such devices are, however, limited to a narrow bandwidth of frequencies, typically below 30 Hz, and are not well suited to the display of surface properties, such as object texture. This paper details a device to augment an existing force feedback haptic display with a vibrotactile display, thus providing a means of conveying low-amplitude, high-frequency spatial information of object surface properties.
1. Haptics and Haptic Interfaces
Haptics is the study of human touch and interaction with the external environment via touch. Information from the human sense of touch can be classified into two categories, cutaneous and kinesthetic. Cutaneous information is provided via the mechanoreceptive nerve endings in the glabrous skin of the human hand. It is primarily a means of relaying information regarding small-scale details in the form of skin stretch, compression and vibration.
Abstract:
Measurements from ground-based magnetometers and riometers at auroral latitudes have demonstrated that energetic (~30–300 keV) electron precipitation can be modulated in the presence of magnetic field oscillations at ultra-low frequencies. It has previously been proposed that an ultra-low frequency (ULF) wave would modulate field and plasma properties near the equatorial plane, thus modifying the growth rates of whistler-mode waves. In turn, the resulting whistler-mode waves would mediate the pitch-angle scattering of electrons, resulting in ionospheric precipitation. In this paper, we investigate this hypothesis by quantifying the changes to the linear growth rate expected due to a slow change in the local magnetic field strength for parameters typical of the equatorial region around 6.6 RE radial distance. To constrain our study, we determine the largest possible ULF wave amplitudes from measurements of the magnetic field at geosynchronous orbit. Using nearly ten years of observations from two satellites, we demonstrate that the variation in magnetic field strength due to oscillations at 2 mHz does not exceed ±10% of the background field. Modifications to the plasma density and temperature anisotropy are estimated using idealised models. For low temperature anisotropy, there is little change in the whistler-mode growth rates even for the largest ULF wave amplitude. Only for large temperature anisotropies can whistler-mode growth rates be modulated sufficiently to account for the changes in electron precipitation measured by riometers at auroral latitudes.
Abstract:
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. 
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
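The subsampling experiment described above can be mimicked on synthetic data. The sketch below uses an invented hourly series (not the rivers' data) to compare the spread of annual-mean estimates under weekly versus monthly sampling at a fixed clock hour, and to show how strongly a diel cycle makes the answer depend on when in the day the sample is taken.

```python
import numpy as np

rng = np.random.default_rng(1)
hours = 24 * 364                 # one year of hourly values (52 whole weeks)
t = np.arange(hours)

# Synthetic hourly record: seasonal cycle + diel cycle + noise,
# a stand-in for a high-frequency river chemistry determinand.
series = (8.0
          + 1.0 * np.sin(2 * np.pi * t / hours)    # seasonal cycle
          + 1.5 * np.sin(2 * np.pi * t / 24.0)     # diel cycle (e.g. DO)
          + 0.5 * rng.standard_normal(hours))

def spread(interval_hours, hour_of_day=12):
    """Std of annual-mean estimates across all possible sampling
    start days, always sampling at the same clock hour."""
    daily = series[hour_of_day::24]          # one value per day
    step = interval_hours // 24              # sampling interval in days
    means = [daily[off::step].mean() for off in range(step)]
    return np.std(means)

weekly = spread(24 * 7)                      # ~52 samples per year
monthly = spread(24 * 30)                    # ~12 samples per year

# Diel sensitivity: the annual mean depends on the clock hour sampled.
hourly_means = [series[h::24].mean() for h in range(24)]
diel_range = max(hourly_means) - min(hourly_means)
print(weekly, monthly, diel_range)
```

With ~52 samples instead of ~12, the weekly estimates scatter far less between equally valid sampling schedules, while the diel term shifts the apparent annual mean by its full amplitude depending on sampling hour, which is why fixing the time of day (as recommended above) matters.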
Abstract:
The behavior of the Asian summer monsoon is documented and compared using the European Centre for Medium-Range Weather Forecasts (ECMWF) Reanalysis (ERA) and the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) Reanalysis. In terms of seasonal mean climatologies the results suggest that, in several respects, the ERA is superior to the NCEP-NCAR Reanalysis. The overall better simulation of the precipitation, and hence the diabatic heating field, over the monsoon domain in ERA means that the analyzed circulation is probably nearer reality. In terms of interannual variability, inconsistencies in the definition of weak and strong monsoon years based on typical monsoon indices such as All-India Rainfall (AIR) anomalies and the large-scale wind-shear-based dynamical monsoon index (DMI) still exist. Two dominant modes of interannual variability have been identified that together explain nearly 50% of the variance. Individually, they have many features in common with the composite flow patterns associated with weak and strong monsoons, when defined in terms of regional AIR anomalies and the large-scale DMI. The reanalyses also show a common dominant mode of intraseasonal variability that describes the latitudinal displacement of the tropical convergence zone from its oceanic-to-continental regime and essentially captures the low-frequency active/break cycles of the monsoon. The relationship between interannual and intraseasonal variability has been investigated by considering the probability density function (PDF) of the principal component of the dominant intraseasonal mode. Based on the DMI, there is an indication that in years with a weaker monsoon circulation, the PDF is skewed toward negative values (i.e., break conditions). Similarly, the PDFs for El Niño and La Niña years suggest that El Niño predisposes the system to more break spells, although the sample size may limit the statistical significance of the results.
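The mode decomposition referred to above is typically an EOF (principal component) analysis, which can be sketched as an SVD of a time-by-space anomaly matrix. The field below is synthetic, standing in for monsoon circulation anomalies; the two prescribed patterns and their amplitudes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
ntime, nspace = 600, 80

# Synthetic anomaly field: two prescribed spatial patterns with
# random time-varying amplitudes, plus noise (hypothetical data).
x = np.linspace(0.0, np.pi, nspace)
pattern1, pattern2 = np.sin(x), np.sin(2 * x)
amp1 = 3.0 * rng.standard_normal(ntime)
amp2 = 1.5 * rng.standard_normal(ntime)
field = (np.outer(amp1, pattern1) + np.outer(amp2, pattern2)
         + 0.3 * rng.standard_normal((ntime, nspace)))

# EOF analysis = SVD of the time-by-space anomaly matrix.
anom = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)

explained = s**2 / np.sum(s**2)   # variance fraction per mode
pcs = u * s                       # principal-component time series
eofs = vt                         # spatial patterns (rows)
print(explained[:3])
```

The leading two modes recover the prescribed patterns and account for nearly all the variance; in the paper, the PDF of the leading intraseasonal principal component (a column of `pcs`) is what distinguishes active from break conditions.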