28 results for clear air turbulence
Abstract:
A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period, using 1-h forecasts from the 1500 UTC run of the Rapid Update Cycle 2 (RUC2) model, leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
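The leading-order source term identified by the scale analysis, advection of relative vorticity, can be illustrated with a simple finite-difference computation. This is a minimal sketch, not the paper's implementation; the function name, grid layout, and sign convention are assumptions:

```python
import numpy as np

def vorticity_advection(u, v, zeta, dx, dy):
    """Horizontal advection of relative vorticity, -(u dzeta/dx + v dzeta/dy),
    the leading-order source term in the scale analysis described above.

    u, v : 2-D wind components (m/s); zeta : 2-D relative vorticity (1/s)
    dx, dy : grid spacings (m)
    """
    dzeta_dx = np.gradient(zeta, dx, axis=1)  # east-west vorticity gradient
    dzeta_dy = np.gradient(zeta, dy, axis=0)  # north-south vorticity gradient
    return -(u * dzeta_dx + v * dzeta_dy)
```

A uniform vorticity field gives zero advection, and a field varying linearly in x with a uniform westerly wind gives a constant negative tendency, as expected from the formula.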
Abstract:
Recent research has shown that Lighthill–Ford spontaneous gravity wave generation theory, when applied to numerical model data, can help predict areas of clear-air turbulence. It is hypothesized that this is the case because spontaneously generated atmospheric gravity waves may initiate turbulence by locally modifying the stability and wind shear. As an improvement on the original research, this paper describes the creation of an 'operational' algorithm (ULTURB) with three modifications to the original method: (1) extending the altitude range for which the method is effective downward to the top of the boundary layer, (2) adding turbulent kinetic energy production from the environment to the locally produced turbulent kinetic energy production, and (3) transforming turbulent kinetic energy dissipation into eddy dissipation rate, the turbulence metric that is becoming the worldwide 'standard'. In a comparison of ULTURB with the original method and with the Graphical Turbulence Guidance second version (GTG2) automated procedure for forecasting mid- and upper-level aircraft turbulence, ULTURB performed better for all turbulence intensities. Since ULTURB, unlike GTG2, is founded on a self-consistent dynamical theory, it may offer forecasters better insight into the causes of the clear-air turbulence and may ultimately enhance its predictability.
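The third modification, converting turbulent kinetic energy dissipation to eddy dissipation rate (EDR), is conventionally the cube root of the dissipation rate. A minimal sketch under that assumption (the function name and the threshold values in the comment are illustrative, not from the abstract):

```python
def edr_from_dissipation(epsilon):
    """Eddy dissipation rate, EDR = epsilon**(1/3).

    epsilon : TKE dissipation rate (m^2 s^-3)
    returns : EDR (m^(2/3) s^-1), the standard aviation turbulence metric
    """
    return epsilon ** (1.0 / 3.0)

# Illustrative: a dissipation of 0.008 m^2 s^-3 maps to an EDR near 0.2,
# a value often associated with light-to-moderate turbulence for airliners.
```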
Abstract:
The variation of wind-optimal transatlantic flight routes and their turbulence potential is investigated to understand how upper-level winds and large-scale flow patterns can affect the efficiency and safety of long-haul flights. In this study, the wind-optimal routes (WORs) that minimize the total flight time by considering wind variations are modeled for flights between John F. Kennedy International Airport (JFK) in New York, New York, and Heathrow Airport (LHR) in London, United Kingdom, during two distinct winter periods of abnormally high and low phases of North Atlantic Oscillation (NAO) teleconnection patterns. Eastbound WORs approximate the JFK–LHR great circle (GC) route following northerly shifted jets in the +NAO period. Those WORs deviate southward following southerly shifted jets during the −NAO period, because eastbound WORs fly closely to the prevailing westerly jets to maximize tailwinds. Westbound WORs, however, spread meridionally to avoid the jets near the GC in the +NAO period to minimize headwinds. In the −NAO period, westbound WORs are north of the GC because of the southerly shifted jets. Consequently, eastbound WORs are faster but have higher probabilities of encountering clear-air turbulence than westbound ones, because eastbound WORs are close to the jet streams, especially near the cyclonic shear side of the jets in the northern (southern) part of the GC in the +NAO (−NAO) period. This study suggests how predicted teleconnection weather patterns can be used for long-haul strategic flight planning, ultimately contributing to minimizing aviation's impact on the environment.
Abstract:
Atmospheric turbulence causes most weather-related aircraft incidents [1]. Commercial aircraft encounter moderate-or-greater turbulence tens of thousands of times each year worldwide, injuring probably hundreds of passengers (occasionally fatally), costing airlines tens of millions of dollars and causing structural damage to planes [1–3]. Clear-air turbulence is especially difficult to avoid, because it cannot be seen by pilots or detected by satellites or on-board radar [4,5]. Clear-air turbulence is linked to atmospheric jet streams [6,7], which are projected to be strengthened by anthropogenic climate change [8]. However, the response of clear-air turbulence to projected climate change has not previously been studied. Here we show using climate model simulations that clear-air turbulence changes significantly within the transatlantic flight corridor when the concentration of carbon dioxide in the atmosphere is doubled. At cruise altitudes within 50–75° N and 10–60° W in winter, most clear-air turbulence measures show a 10–40% increase in the median strength of turbulence and a 40–170% increase in the frequency of occurrence of moderate-or-greater turbulence. Our results suggest that climate change will lead to bumpier transatlantic flights by the middle of this century. Journey times may lengthen and fuel consumption and emissions may increase. Aviation is partly responsible for changing the climate [9], but our findings show for the first time how climate change could affect aviation.
Abstract:
Several previous studies have attempted to assess the sublimation depth-scales of ice particles from clouds into clear air. Upon examining the sublimation depth-scales in the Met Office Unified Model (MetUM), it was found that the MetUM has evaporation depth-scales 2–3 times larger than radar observations. Similar results can be seen in the European Centre for Medium-Range Weather Forecasts (ECMWF), Regional Atmospheric Climate Model (RACMO) and Météo-France models. In this study, we use radar simulation (converting model variables into radar observations) and one-dimensional explicit microphysics numerical modelling to test and diagnose the cause of the deep sublimation depth-scales in the forecast model. The MetUM data and parametrization scheme are used to predict terminal velocity, which can be compared with the observed Doppler velocity. This can then be used to test hypotheses as to why the sublimation depth-scale is too large within the MetUM: turbulence could lead to dry-air entrainment and higher evaporation rates; the particle density may be wrong; the particle capacitance may be too high, leading to incorrect evaporation rates; or the humidity within the sublimating layer may be incorrectly represented. We show that the most likely cause of deep sublimation zones is an incorrect representation of model humidity in the layer. This is tested further by using a one-dimensional explicit microphysics model, which tests the sensitivity of ice sublimation to key atmospheric variables and is capable of including sonde and radar measurements to simulate real cases. Results suggest that the MetUM grid resolution at ice cloud altitudes is not sufficient to maintain the sharp drop in humidity that is observed in the sublimation zone.
Abstract:
Despite the importance of microphysical cloud processes to the climate system, some topics are under-explored. For example, few measurements of droplet charges in nonthunderstorm clouds exist. Balloon-carried charge sensors can be used to provide new measurements. A charge sensor is described for use with meteorological balloons, which has been tested over a range of atmospheric temperatures from −60 to +20 °C, in cloudy and clear air. The rapid time response of the sensor (>10 V s−1) permits charge densities from 100 fC m−3 to 1 nC m−3 to be determined, which is sufficient for it to act as a cloud edge charge detector at weakly charged horizontal cloud boundaries.
Abstract:
Foggy air and clear air have appreciably different electrical conductivities. The conductivity gradient at horizontal droplet boundaries causes droplet charging, as a result of vertical current flow in the global atmospheric electrical circuit. The charging is poorly known, as both the current flow through atmospheric water droplet layers and the air conductivity are poorly characterised experimentally. Surface measurements during three days of continuous fog using new instrument techniques show that a shallow (of order 100 m deep) fog layer still permits the vertical conduction current to pass. Further, the conductivity in the fog is estimated to be approximately 20% lower than in clear air. Assuming a fog transition thickness of one metre, this implies a vertical conductivity gradient of order 10 fS m−2 at the boundary. The actual vertical conductivity gradient at a cloud boundary would probably be greater, due to the presence of larger droplets in clouds compared to fog, and cleaner, more conductive clear air aloft.
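The order-of-magnitude gradient quoted above follows from simple arithmetic: the fractional conductivity drop divided by the transition thickness. A minimal sketch, where the clear-air conductivity value used in the example is an illustrative assumption rather than a figure from the study:

```python
def conductivity_gradient(sigma_clear, reduction=0.20, transition_m=1.0):
    """Vertical conductivity gradient across a fog-top transition.

    sigma_clear  : clear-air conductivity (e.g. in fS/m)
    reduction    : fractional drop in conductivity inside the fog
    transition_m : assumed transition-layer thickness (m)
    returns      : gradient in (units of sigma_clear) per metre
    """
    return reduction * sigma_clear / transition_m

# With an assumed clear-air conductivity of 50 fS/m, a 20% drop over a
# 1 m transition gives a gradient of 10 fS/m per metre, i.e. 10 fS m^-2,
# matching the order of magnitude stated in the abstract.
```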
Abstract:
The cloud-air transition zone at stratiform cloud edges is an electrically active region where droplet charging has been predicted. Cloud edge droplet charging is expected from vertical flow of cosmic ray generated atmospheric ions in the global electric circuit. Experimental confirmation of stratiform cloud edge electrification is presented here, through charge and droplet measurements made within an extensive layer of supercooled stratiform cloud, using a specially designed electrostatic sensor. Negative space charge up to 35 pC m−3 was found in a thin (<100 m) layer at the lower cloud boundary associated with the clear-air–cloud conductivity gradient, agreeing closely with space charge predicted from the measured droplet concentration using ion-aerosol theory. Such charge levels carried by droplets are sufficient to influence collision processes between cloud droplets.
Abstract:
A low-cost, disposable instrument for measuring solar radiation during meteorological balloon flights through cloud layers is described. Using a photodiode detector and low thermal drift signal conditioning circuitry, the device showed less than 1% drift for temperatures varying from +20 °C to −35 °C. The angular response to radiation, which declined less rapidly than the cosine of the angle between the incident radiation and normal incidence, is used for cloud detection exploiting the motion of the platform. Oriented upwards, the natural motion imposed by the balloon allows cloud and clear air to be distinguished by the absence of radiation variability within cloud, where the diffuse radiation present is isotropic. The optical method employed by the solar radiation instrument has also been demonstrated to provide higher resolution measurements of cloud boundaries than relative humidity measurements alone.
Abstract:
Measurements of the electrical characteristics of the atmosphere above the surface have been made for over 200 years, from a variety of different platforms, including kites, balloons, rockets and aircraft. From these measurements, a great deal of information about the electrical characteristics of the atmosphere has been gained, assisting our understanding of the global atmospheric electric circuit, thunderstorm electrification and lightning generation mechanisms, discovery of transient luminous events above thunderstorms, and many other electrical phenomena. This paper surveys the history of atmospheric electrical measurements aloft, from the earliest manned balloon ascents to current day observations with free balloons and aircraft. Measurements of atmospheric electrical parameters in a range of meteorological conditions are described, including clear air conditions, polluted conditions, non-thunderstorm clouds, and thunderstorm clouds, spanning a range of atmospheric conditions, from fair weather, to the most electrically active.
Abstract:
The DIAMET (DIAbatic influences on Mesoscale structures in ExTratropical storms) project aims to improve forecasts of high-impact weather in extratropical cyclones through field measurements, high-resolution numerical modeling, and improved design of ensemble forecasting and data assimilation systems. This article introduces DIAMET and presents some of the first results. Four field campaigns were conducted by the project, one of which, in late 2011, coincided with an exceptionally stormy period marked by an unusually strong, zonal North Atlantic jet stream and a succession of severe windstorms in northwest Europe. As a result, December 2011 had the highest monthly North Atlantic Oscillation index (2.52) of any December in the last 60 years. Detailed observations of several of these storms were gathered using the UK's BAe146 research aircraft and extensive ground-based measurements. As an example of the results obtained during the campaign, observations are presented of cyclone Friedhelm on 8 December 2011, when surface winds with gusts exceeding 30 m s−1 crossed central Scotland, leading to widespread disruption to transportation and electricity supply. Friedhelm deepened 44 hPa in 24 hours and developed a pronounced bent-back front wrapping around the storm center. The strongest winds at 850 hPa and the surface occurred in the southern quadrant of the storm, and detailed measurements showed these to be most intense in clear air between bands of showers. High-resolution ensemble forecasts from the Met Office showed similar features, with the strongest winds aligned in linear swaths between the bands, suggesting that there is potential for improved skill in forecasts of damaging winds.
Abstract:
The ITCT-Lagrangian-2K4 (Intercontinental Transport and Chemical Transformation) experiment was conceived with an aim to quantify the effects of photochemistry and mixing on the transformation of air masses in the free troposphere away from emissions. To this end, attempts were made to intercept and sample air masses several times during their journey across the North Atlantic using four aircraft based in New Hampshire (USA), Faial (Azores) and Creil (France). This article begins by describing forecasts from two Lagrangian models that were used to direct the aircraft into target air masses. A novel technique then identifies Lagrangian matches between flight segments. Two independent searches are conducted: for Lagrangian model matches and for pairs of whole air samples with matching hydrocarbon fingerprints. The information is filtered further by searching for matching hydrocarbon samples that are linked by matching trajectories. The quality of these "coincident matches" is assessed using temperature, humidity and tracer observations. The technique pulls out five clear Lagrangian cases covering a variety of situations and these are examined in detail. The matching trajectories and hydrocarbon fingerprints are shown, and the downwind minus upwind differences in tracers are discussed.
Abstract:
Magnetic sensors have been added to a standard weather balloon radiosonde package to detect motion in turbulent air. These measure the terrestrial magnetic field and return data over the standard UHF radio telemetry. Variability in the magnetic sensor data is caused by motion of the instrument package. A series of radiosonde ascents carrying these sensors has been made near a Doppler lidar measuring atmospheric properties. Lidar-retrieved quantities include the vertical velocity (w) profile and its standard deviation (σw). σw determined over 1 h is compared with the radiosonde motion variability at the same heights. Vertical motion in the radiosonde is found to be robustly increased when σw > 0.75 m s−1 and is linearly proportional to σw. © 2009 American Institute of Physics
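The reported linear proportionality between radiosonde motion variability and the lidar-derived vertical velocity standard deviation can be checked with a least-squares fit through the origin. A minimal sketch; the function name and the sample data are illustrative, not values from the study:

```python
def fit_proportionality(sigma_w, motion_var):
    """Least-squares slope k for motion_var ~ k * sigma_w, forced through
    the origin (a pure proportionality, as the comparison above assumes).

    sigma_w    : sequence of lidar vertical-velocity standard deviations (m/s)
    motion_var : sequence of radiosonde motion-variability values at the
                 same heights (arbitrary sensor units)
    """
    numerator = sum(x * y for x, y in zip(sigma_w, motion_var))
    denominator = sum(x * x for x in sigma_w)
    return numerator / denominator

# Synthetic, perfectly proportional data recovers the slope:
# fit_proportionality([0.8, 1.0, 1.5], [1.6, 2.0, 3.0]) -> 2.0
```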