992 results for Coherent Vortices

Relevance: 10.00%

Publisher:

Abstract:
Elucidating the controls on the location and vigor of ice streams is crucial to understanding the processes that lead to fast flow and disintegration of ice sheets. In the former North American Laurentide ice sheet, ice stream occurrence appears to have been governed by topographic troughs or areas of soft-sediment geology. This paper reports robust evidence of a major paleo-ice stream over the northwestern Canadian Shield, an area previously assumed to be incompatible with fast ice flow because of the low relief and relatively hard bedrock. A coherent pattern of subglacial bedforms (drumlins and megascale glacial lineations) demarcates the ice stream flow set, which exhibits a convergent onset zone, a narrow main trunk with abrupt lateral margins, and a lobate terminus. Variations in bedform elongation ratio within the flow set match theoretical expectations of ice velocity. In the center of the ice stream, extremely parallel megascale glacial lineations tens of kilometers long with elongation ratios in excess of 40:1 attest to a single episode of rapid ice flow. We conclude that while bed properties are likely to be influential in determining the occurrence and vigor of ice streams, contrary to established views, widespread soft-bed geology is not an essential requirement for those ice streams without topographic control. We speculate that the ice stream acted as a release valve on ice-sheet mass balance and was initiated by the presence of a proglacial lake that destabilized the ice-sheet margin and propagated fast ice flow through a series of thermomechanical feedbacks involving ice flow and temperature.


This paper describes a novel numerical algorithm for simulating the evolution of fine-scale conservative fields in layer-wise two-dimensional flows, the most important examples of which are the Earth's atmosphere and oceans. The algorithm combines two radically different algorithms, one Lagrangian and the other Eulerian, to achieve an unexpected gain in computational efficiency. The algorithm is demonstrated for multi-layer quasi-geostrophic flow, and results are presented for a simulation of a tilted stratospheric polar vortex and of nearly-inviscid quasi-geostrophic turbulence. The turbulence results contradict previous arguments and simulation results that have suggested an ultimate two-dimensional, vertically-coherent character of the flow. Ongoing extensions of the algorithm to the generally ageostrophic flows characteristic of planetary fluid dynamics are outlined.


Using a novel numerical method at unprecedented resolution, we demonstrate that structures of small to intermediate scale in rotating, stratified flows are intrinsically three-dimensional. Such flows are characterized by vortices (spinning volumes of fluid), regions of large vorticity gradients, and filamentary structures at all scales. It is found that such structures have predominantly three-dimensional dynamics below a horizontal scale L < LR, where LR is the so-called Rossby radius of deformation, equal to the characteristic vertical scale of the fluid H divided by the ratio of the rotational and buoyancy frequencies f/N. The breakdown of two-dimensional dynamics at these scales is attributed to the so-called "tall-column instability" [D. G. Dritschel and M. de la Torre Juárez, J. Fluid Mech. 328, 129 (1996)], which is active on columnar vortices that are tall after scaling by f/N, or, equivalently, that are narrow compared with LR. Moreover, this instability eventually leads to a simple relationship between typical vertical and horizontal scales: for each vertical wave number (apart from the vertically averaged, barotropic component of the flow) the average horizontal wave number is equal to f/N times the vertical wave number. The practical implication is that three-dimensional modeling is essential to capture the behavior of rotating, stratified fluids. Two-dimensional models are not valid for scales below LR. ©1999 American Institute of Physics.
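The scale relations stated in this abstract can be written compactly (a sketch in the abstract's own notation, with f the rotational frequency, N the buoyancy frequency, H the characteristic vertical scale and k_z the vertical wave number):

```latex
L_R \;=\; \frac{H}{f/N} \;=\; \frac{NH}{f},
\qquad
\overline{k_h} \;=\; \frac{f}{N}\,k_z \quad (k_z \neq 0).
```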


Mostly because of a lack of observations, fundamental aspects of the St. Lawrence Estuary's wintertime response to forcing remain poorly understood. The results of a field campaign over the winter of 2002/03 in the estuary are presented. The response of the system to tidal forcing is assessed through the use of harmonic analyses of temperature, salinity, sea level, and current observations. The analyses confirm previous evidence for the presence of semidiurnal internal tides, albeit at greater depths than previously observed for ice-free months. The low-frequency tidal streams were found to be mostly baroclinic in character and to produce an important neap tide intensification of the estuarine circulation. Despite stronger atmospheric momentum forcing in winter, the response is found to be less coherent with the winds than seen in previous studies of ice-free months. The tidal residuals show the cold intermediate layer in the estuary is renewed rapidly (~14 days) in late March by the advection of a wedge of near-freezing waters from the Gulf of St. Lawrence. In situ processes appeared to play a lesser role in the renewal of this layer. In particular, significant wintertime deepening of the estuarine surface mixed layer was prevented by surface stability, which remained high throughout the winter. The observations also suggest that the bottom circulation was intensified during winter, with the intrusion in the deep layer of relatively warm Atlantic waters, such that the 3°C isotherm rose from below 150 m to near 60 m.
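Harmonic analysis of the kind applied here to temperature, salinity, sea level, and current records is conventionally a least-squares fit of sinusoids at known tidal constituent frequencies. A minimal sketch of that idea (the constituent set, function names, and synthetic record are illustrative assumptions, not the paper's method or data):

```python
import numpy as np

# Angular frequencies (rad/h) of two common tidal constituents:
# M2 (principal lunar semidiurnal) and K1 (lunisolar diurnal).
OMEGA = {"M2": 2 * np.pi / 12.4206, "K1": 2 * np.pi / 23.9345}

def harmonic_fit(t, x, constituents=("M2", "K1")):
    """Least-squares fit of x(t) = mean + sum_i [A_i cos(w_i t) + B_i sin(w_i t)].

    Returns {constituent: (amplitude, phase_rad)}.
    """
    cols = [np.ones_like(t)]
    for name in constituents:
        w = OMEGA[name]
        cols += [np.cos(w * t), np.sin(w * t)]
    G = np.column_stack(cols)               # design matrix
    coef, *_ = np.linalg.lstsq(G, x, rcond=None)
    result = {}
    for i, name in enumerate(constituents):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        result[name] = (float(np.hypot(a, b)), float(np.arctan2(b, a)))
    return result

# Synthetic hourly sea-level record: a 1.2 m M2 signal plus weak noise.
t = np.arange(0.0, 24 * 30)  # 30 days of hourly samples
rng = np.random.default_rng(0)
x = 1.2 * np.cos(OMEGA["M2"] * t - 0.5) + 0.05 * rng.normal(size=t.size)
amp, phase = harmonic_fit(t, x)["M2"]  # recovers amplitude ~1.2, phase ~0.5
```

The same fit applies to any scalar series (temperature, salinity, or a current component); record length controls how well nearby constituents can be separated.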


Three interrelated climate phenomena are at the center of the Climate Variability and Predictability (CLIVAR) Atlantic research: tropical Atlantic variability (TAV), the North Atlantic Oscillation (NAO), and the Atlantic meridional overturning circulation (MOC). These phenomena produce a myriad of impacts on society and the environment on seasonal, interannual, and longer time scales through variability manifest as coherent fluctuations in ocean and land temperature, rainfall, and extreme events. Improved understanding of this variability is essential for assessing the likely range of future climate fluctuations and the extent to which they may be predictable, as well as understanding the potential impact of human-induced climate change. CLIVAR is addressing these issues through prioritized and integrated plans for short-term and sustained observations, basin-scale reanalysis, and modeling and theoretical investigations of the coupled Atlantic climate system and its links to remote regions. In this paper, a brief review of the state of understanding of Atlantic climate variability and achievements to date is provided. Considerable discussion is given to future challenges related to building and sustaining observing systems, developing synthesis strategies to support understanding and attribution of observed change, understanding sources of predictability, and developing prediction systems in order to meet the scientific objectives of the CLIVAR Atlantic program.


A suite of climate change indices derived from daily temperature and precipitation data, with a primary focus on extreme events, were computed and analyzed. By setting an exact formula for each index and using specially designed software, analyses done in different countries have been combined seamlessly. This has enabled the presentation of the most up-to-date and comprehensive global picture of trends in extreme temperature and precipitation indices using results from a number of workshops held in data-sparse regions and high-quality station data supplied by numerous scientists worldwide. Seasonal and annual indices for the period 1951-2003 were gridded. Trends in the gridded fields were computed and tested for statistical significance. Results showed widespread significant changes in temperature extremes associated with warming, especially for those indices derived from daily minimum temperature. Over 70% of the global land area sampled showed a significant decrease in the annual occurrence of cold nights and a significant increase in the annual occurrence of warm nights. Some regions experienced a more than doubling of these indices. This implies a positive shift in the distribution of daily minimum temperature throughout the globe. Daily maximum temperature indices showed similar changes but with smaller magnitudes. Precipitation changes showed a widespread and significant increase, but the changes are much less spatially coherent compared with temperature change. Probability distributions of indices derived from approximately 200 temperature and 600 precipitation stations, with near-complete data for 1901-2003 and covering a very large region of the Northern Hemisphere midlatitudes (and parts of Australia for precipitation) were analyzed for the periods 1901-1950, 1951-1978 and 1979-2003. Results indicate a significant warming throughout the 20th century. Differences in the distributions of the temperature indices are particularly pronounced between the most recent two periods and for those indices related to minimum temperature. An analysis of those indices for which seasonal time series are available shows that these changes occur for all seasons although they are generally least pronounced for September to November. Precipitation indices show a tendency toward wetter conditions throughout the 20th century.
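The "annual occurrence of cold nights and warm nights" indices mentioned in this abstract are percentile-based counts of daily minimum temperature. A simplified sketch in the spirit of the ETCCDI TN10p/TN90p definitions (the real indices use calendar-day windowed base-period percentiles; the data here are synthetic):

```python
import numpy as np

def night_indices(tmin_base, tmin_year):
    """Annual occurrence (%) of cold and warm nights.

    Thresholds are the 10th and 90th percentiles of daily minimum
    temperature over a base period (simplified relative to ETCCDI,
    which uses calendar-day windowed percentiles).
    """
    p10, p90 = np.percentile(tmin_base, [10, 90])
    cold = 100.0 * np.mean(tmin_year < p10)   # % of cold nights
    warm = 100.0 * np.mean(tmin_year > p90)   # % of warm nights
    return cold, warm

# Synthetic example: one year that is 1.5 degrees warmer than the base climate.
rng = np.random.default_rng(1)
base = rng.normal(5.0, 3.0, size=365 * 30)  # 30-year base period of daily Tmin
year = rng.normal(6.5, 3.0, size=365)       # a recent, warmer year
cold, warm = night_indices(base, year)      # expect cold well below 10%, warm above
```

A warming shift moves both tails the same way, which is exactly the asymmetry (fewer cold nights, more warm nights) that the gridded trends in the abstract describe.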


We present results from fast-response wind measurements within and above a busy intersection between two street canyons (Marylebone Road and Gloucester Place) in Westminster, London taken as part of the DAPPLE (Dispersion of Air Pollution and Penetration into the Local Environment; www.dapple.org.uk) 2007 field campaign. The data reported here were collected using ultrasonic anemometers on the roof-top of a building adjacent to the intersection and at two heights on a pair of lamp-posts on opposite sides of the intersection. Site characteristics, data analysis and the variation of intersection flow with the above-roof wind direction (θref) are discussed. Evidence of both flow channelling and recirculation was identified within the canyon, only a few metres from the intersection for along-street and across-street roof-top winds respectively. Results also indicate that for oblique rooftop flows, the intersection flow is a complex combination of bifurcated channelled flows, recirculation and corner vortices. Asymmetries in local building geometry around the intersection and small changes in the background wind direction (changes in 15-min mean θref of 5–10 degrees) were also observed to have profound influences on the behaviour of intersection flow patterns. Consequently, short time-scale variability in the background flow direction can lead to highly scattered in-street mean flow angles masking the true multi-modal features of the flow and thus further complicating modelling challenges.
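The "highly scattered in-street mean flow angles" noted in this abstract reflect a general pitfall: arithmetically averaging wind directions fails across the 0/360° wrap. A minimal sketch of the standard vector-mean remedy (values are illustrative, not DAPPLE data):

```python
import math

def circular_mean_deg(angles_deg):
    """Vector-mean direction (degrees) of a set of angles.

    Naive arithmetic averaging fails across the 360/0 wrap (350 and 10
    average to 180 instead of 0); summing unit vectors avoids this.
    """
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360.0

# Directions clustered around north, straddling the wrap:
mean_dir = circular_mean_deg([350.0, 5.0, 10.0, 355.0])  # ~0 (north), not 180
```

Even a circular mean cannot represent genuinely multi-modal in-street flow (e.g. bifurcated channelling), which is why short-time-scale variability in θref can mask those features in long averages.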


The Military Intelligence (Research) Department of the British War Office was tasked in 1940 with encouraging and supporting armed resistance in occupied Europe and the Axis-controlled Middle East. The major contention of this paper is that, in doing so, MI(R) performed a key role in British strategy in 1940-42 and in the development of what are now known as covert operations. MI(R) developed an organic, but coherent doctrine for such activity which was influential upon the Special Operations Executive (SOE) and its own sub-branch, G(R), which applied this doctrine in practice in East Africa and the Middle East in 1940-41. It was also here that a number of key figures in the development of covert operations and special forces first cut their teeth, the most notable being Major Generals Colin Gubbins and Orde Wingate.


The Bureau International des Poids et Mesures, the BIPM, was established by Article 1 of the Convention du Mètre, on 20 May 1875, and is charged with providing the basis for a single, coherent system of measurements to be used throughout the world. The decimal metric system, dating from the time of the French Revolution, was based on the metre and the kilogram. Under the terms of the 1875 Convention, new international prototypes of the metre and kilogram were made and formally adopted by the first Conférence Générale des Poids et Mesures (CGPM) in 1889. Over time this system developed, so that it now includes seven base units. In 1960 it was decided at the 11th CGPM that it should be called the Système International d’Unités, the SI (in English: the International System of Units). The SI is not static but evolves to match the world’s increasingly demanding requirements for measurements at all levels of precision and in all areas of science, technology, and human endeavour. This document is a summary of the SI Brochure, a publication of the BIPM which is a statement of the current status of the SI. The seven base units of the SI, listed in Table 1, provide the reference used to define all the measurement units of the International System. As science advances, and methods of measurement are refined, their definitions have to be revised. The more accurate the measurements, the greater the care required in the realization of the units of measurement.


More data will be produced in the next five years than in the entire history of human kind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report which can be found at: http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole, with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. 
The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals. To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders.
A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.


The Indian Ocean water that ends up in the Atlantic Ocean detaches from the Agulhas Current retroflection predominantly in the form of Agulhas rings and cyclones. Using numerical Lagrangian float trajectories in a high-resolution numerical ocean model, the fate of coherent structures near the Agulhas Current retroflection is investigated. It is shown that within the Agulhas Current, upstream of the retroflection, the spatial distributions of floats ending in the Atlantic Ocean and floats ending in the Indian Ocean are to a large extent similar. This indicates that Agulhas leakage occurs mostly through the detachment of Agulhas rings. After the floats detach from the Agulhas Current, the ambient water quickly loses its relative vorticity. The Agulhas rings thus seem to decay and lose much of their water in the Cape Basin. A cluster analysis reveals that most water in the Agulhas Current is within clusters of 180 km in diameter. Halfway in the Cape Basin there is an increase in the number of larger clusters with low relative vorticity, which carry the bulk of the Agulhas leakage transport through the Cape Basin. This upward cascade with respect to the length scales of the leakage, in combination with a power law decay of the magnitude of relative vorticity, might be an indication that the decay of Agulhas rings is somewhat comparable to the decay of two-dimensional turbulence.


The propagation velocity and propagation mechanism for vortices on a β plane are determined for a reduced-gravity model by integrating the momentum equations over the β plane. Isolated vortices, vortices in a background current, and initial vortex propagation from rest are studied. The propagation mechanism for isolated anticyclones as well as cyclones, which has been lacking up to now, is presented. It is shown that, to first order, the vortex moves to generate a Coriolis force on the mass anomaly of the vortex to compensate for the force on the vortex due to the variation of the Coriolis parameter. Only the mass anomaly of the vortex is of importance, because the Coriolis force due to the motion of the bulk of the layer moving with the vortex is almost fully compensated by the Coriolis force on the motion of the exterior flow. Because the mass anomaly of a cyclone is negative the force and acceleration have opposite sign. The role of dipolar structures in steadily moving vortices is discussed, and it is shown that their overall structure is fixed by the steady westward motion of the mass anomaly. Furthermore, it is shown that reduced-gravity vortices are not advected with a background flow. The reason for this behavior is that the background flow changes the ambient vorticity gradient such that the vortex obtains an extra self-propagation term that exactly cancels the advection by the background flow. Last, it is shown that a vortex initially at rest will accelerate equatorward first, after which a westward motion is generated. This result is independent of the sign of the vortex.


The definition of coherent derived units in the International System of Units (SI) is reviewed, and the important role of the equations defining physical quantities is emphasized in obtaining coherent derived units. In the case of the dimensionless quantity plane angle, the choice between alternative definitions is considered, leading to a corresponding choice between alternative definitions of the coherent derived unit - the radian, degree or revolution. In this case the General Conference on Weights and Measures (CGPM) has chosen to adopt the definition that leads to the radian as the coherent derived unit in the SI. In the case of the quantity logarithmic decay (or gain), also sometimes called decrement, and sometimes called level, a similar choice of defining equation exists, leading to a corresponding choice for the coherent derived unit - the neper or the bel. In this case the CGPM has not yet made a choice. We argue that for the quantity logarithmic decay the most logical choice of defining equation is linked to that of the radian, and is that which leads to the neper as the corresponding coherent derived unit. This should not prevent us from using the bel and decibel as units of logarithmic decay. However, it is an important part of the SI to establish in a formal sense the equations defining physical quantities, and the corresponding coherent derived units.
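The two candidate defining equations discussed in this abstract can be made concrete: level in nepers is the natural logarithm of a field-quantity ratio, while level in decibels uses 20 log10 of the same ratio, so 1 Np = 20/ln(10) dB ≈ 8.686 dB. A small sketch (function names are illustrative):

```python
import math

def level_np(ratio):
    """Logarithmic level of a field-quantity ratio, in nepers: ln(x/x0)."""
    return math.log(ratio)

def level_db(ratio):
    """The same ratio expressed in decibels: 20 log10(x/x0)."""
    return 20.0 * math.log10(ratio)

NP_TO_DB = 20.0 / math.log(10.0)  # 1 Np ~ 8.6859 dB

# A ratio of e is exactly one neper; in decibels it is 20/ln(10):
db = level_db(math.e)
```

The choice of the neper as the coherent unit mirrors the radian: both correspond to defining equations with no extraneous numerical factor, leaving the bel and decibel as accepted but non-coherent alternatives.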


The effects of uniform straining and shearing on the stability of a surface quasi-geostrophic temperature filament are investigated. Straining is shown to stabilize perturbations for wide filaments but only for a finite time until the filament thins to a critical width, after which some perturbations can grow. No filament can be stabilized in practice, since there are perturbations that can grow large for any strain rate. The optimally growing perturbations, defined as solutions that reach a certain threshold amplitude first, are found numerically for a wide range of parameter values. The radii of the vortices formed through nonlinear roll-up are found to be proportional to θ/s, where θ is the temperature anomaly of the filament and s the strain rate, and are not dependent on the initial size of the filament. Shearing is shown to reduce the normal-mode growth rates, but it cannot stabilize them completely when there are temperature discontinuities in the basic state; smooth filaments can be stabilized completely by shearing and a simple scaling argument provides the shear rate required. Copyright © 2010 Royal Meteorological Society


A methodology is presented for the development of a combined seasonal weather and crop productivity forecasting system. The first stage of the methodology is the determination of the spatial scale(s) on which the system could operate; this determination has been made for the case of groundnut production in India. Rainfall is a dominant climatic determinant of groundnut yield in India. The relationship between yield and rainfall has been explored using data from 1966 to 1995. On the all-India scale, seasonal rainfall explains 52% of the variance in yield. On the subdivisional scale, correlations vary between r² = 0.62 (significance level p < 10⁻⁴) and a negative correlation with r² = 0.1 (p = 0.13). The spatial structure of the relationship between rainfall and groundnut yield has been explored using empirical orthogonal function (EOF) analysis. A coherent, large-scale pattern emerges for both rainfall and yield. On the subdivisional scale (~300 km), the first principal component (PC) of rainfall is correlated well with the first PC of yield (r² = 0.53, p < 10⁻⁴), demonstrating that the large-scale patterns picked out by the EOFs are related. The physical significance of this result is demonstrated. Use of larger averaging areas for the EOF analysis resulted in lower and (over time) less robust correlations. Because of this loss of detail when using larger spatial scales, the subdivisional scale is suggested as an upper limit on the spatial scale for the proposed forecasting system. Further, district-level EOFs of the yield data demonstrate the validity of upscaling these data to the subdivisional scale. Similar patterns have been produced using data on both of these scales, and the first PCs are very highly correlated (r² = 0.96). Hence, a working spatial scale has been identified, typical of that used in seasonal weather forecasting, that can form the basis of crop modeling work for the case of groundnut production in India.
Last, the change in correlation between yield and seasonal rainfall during the study period has been examined using seasonal totals and monthly EOFs. A further link between yield and subseasonal variability is demonstrated via analysis of dynamical data.
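Computationally, the EOF/PC analysis used in this study is a principal-component decomposition of the anomaly field, commonly obtained via an SVD. A minimal sketch with synthetic data (field sizes and the spatial pattern are illustrative assumptions, not the study's rainfall or yield data):

```python
import numpy as np

def eof_analysis(field):
    """EOF decomposition of a (time x space) anomaly field via SVD.

    Returns spatial patterns (rows of `eofs`), principal-component time
    series (columns of `pcs`), and the variance fraction of each mode.
    """
    anom = field - field.mean(axis=0)      # anomalies about the time mean
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    pcs = u * s                            # PC time series, one per column
    var_frac = s**2 / np.sum(s**2)
    return vt, pcs, var_frac

# Synthetic field: one coherent large-scale pattern plus weak noise.
rng = np.random.default_rng(2)
nt, nx = 30, 50
pattern = np.sin(np.linspace(0.0, np.pi, nx))   # a single spatial mode
amplitude = rng.normal(size=nt)                 # its time evolution
field = np.outer(amplitude, pattern) + 0.1 * rng.normal(size=(nt, nx))
eofs, pcs, var_frac = eof_analysis(field)       # mode 1 should dominate
```

Correlating the leading PC of one field (e.g. rainfall) with the leading PC of another (e.g. yield), as in the study, then reduces to a correlation of two such time series; note the arbitrary sign of each SVD mode.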