40 results for mixing model
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
A Lagrangian model of photochemistry and mixing is described (CiTTyCAT, stemming from the Cambridge Tropospheric Trajectory model of Chemistry And Transport), which is suitable for transport and chemistry studies throughout the troposphere. Over the last five years, the model has been developed in parallel at several different institutions and here those developments have been incorporated into one "community" model and documented for the first time. The key photochemical developments include a new scheme for biogenic volatile organic compounds and updated emissions schemes. The key physical development is to evolve composition following an ensemble of trajectories within neighbouring air-masses, including a simple scheme for mixing between them via an evolving "background profile", both within the boundary layer and free troposphere. The model runs along trajectories pre-calculated using winds and temperature from meteorological analyses. In addition, boundary layer height and precipitation rates, output from the analysis model, are interpolated to trajectory points and used as inputs to the mixing and wet deposition schemes. The model is most suitable in regimes when the effects of small-scale turbulent mixing are slow relative to advection by the resolved winds so that coherent air-masses form with distinct composition and strong gradients between them. Such air-masses can persist for many days while stretching, folding and thinning. Lagrangian models offer a useful framework for picking apart the processes of air-mass evolution over inter-continental distances, without being hindered by the numerical diffusion inherent to global Eulerian models. The model, including different box and trajectory modes, is described and some output for each of the modes is presented for evaluation. The model is available for download from a Subversion-controlled repository by contacting the corresponding authors.
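As a rough illustration of the mixing scheme this abstract describes, the Python sketch below relaxes an air parcel's composition towards a background profile with a fixed timescale. The function name, the forward-Euler step and the timescale are assumptions for illustration, not the CiTTyCAT formulation.

    import numpy as np

    def mix_towards_background(c_parcel, c_background, tau_mix, dt):
        """Relax parcel concentrations (ppbv) towards a background profile."""
        return c_parcel + (dt / tau_mix) * (c_background - c_parcel)

    c = np.array([40.0, 0.1])       # e.g. O3 and NOx in the parcel (ppbv)
    c_bg = np.array([30.0, 0.05])   # background values at the parcel's level
    for _ in range(24):             # 24 one-hour steps along the trajectory
        c = mix_towards_background(c, c_bg, tau_mix=48 * 3600.0, dt=3600.0)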
Abstract:
The INtegrated CAtchment (INCA) model has been developed to simulate the impact of mine discharges on river systems. The model accounts for the key kinetic chemical processes operating as well as the dilution, mixing and redistribution of pollutants in rivers downstream of mine discharges or acid rock drainage sites. The model is dynamic and simulates the day-to-day behaviour of hydrology and eight metals (cadmium, mercury, copper, zinc, lead, arsenic, manganese and chromium) as well as cyanide and ammonia. The model is semi-distributed and can simulate catchment, sub-catchment and in-stream river behaviour. The model has been applied to the Roșia Montană Mine in Transylvania, Romania, and used to assess the impacts of old mine adits on the local catchments as well as on the downstream Arieș and Mureș river system. The question of mine restoration is investigated through a set of clean-up scenarios. It is shown that the planned restoration will generate much improved water quality from the mine and also alleviate the metal pollution of the river system.
Abstract:
The entropy budget of the coupled atmosphere–ocean general circulation model HadCM3 is calculated. Estimates of the different entropy sources and sinks of the climate system are obtained directly from the diabatic heating terms, and an approximate estimate of the planetary entropy production is also provided. The rate of material entropy production of the climate system is found to be ∼50 mW m−2 K−1, a value intermediate in the range 30–70 mW m−2 K−1 previously reported from different models. The largest part of this is due to sensible and latent heat transport (∼38 mW m−2 K−1). Another 13 mW m−2 K−1 is due to dissipation of kinetic energy in the atmosphere by friction and Reynolds stresses. Numerical entropy production in the atmosphere dynamical core is found to be about 0.7 mW m−2 K−1. The material entropy production within the ocean due to turbulent mixing is ∼1 mW m−2 K−1, a very small contribution to the material entropy production of the climate system. The rate of change of entropy of the model climate system is about 1 mW m−2 K−1 or less, which is comparable with the typical size of the fluctuations of the entropy sources due to interannual variability, and a more accurate closure of the budget than achieved by previous analyses. Results are similar for FAMOUS, which has a lower spatial resolution but similar formulation to HadCM3, while more substantial differences are found with respect to other models, suggesting that the formulation of the model has an important influence on the climate entropy budget. Since this is the first diagnosis of the entropy budget in a climate model of the type and complexity used for projection of twenty-first century climate change, it would be valuable if similar analyses were carried out for other such models.
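A minimal sketch of the kind of diagnostic described here: material entropy production estimated as the sum of diabatic heating divided by temperature over grid boxes, normalised by planetary surface area. The variable names and the per-box masses passed in are assumptions for illustration, not the HadCM3 diagnostics.

    import numpy as np

    def entropy_production(heating, temperature, box_mass, planet_area):
        """Entropy production (W m-2 K-1) from per-box diabatic heating
        (W kg-1), temperature (K), box mass (kg) and planet area (m2)."""
        return np.sum(heating * box_mass / temperature) / planet_area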
Abstract:
(From author). Comments: First 3D stochastic/fractal model of cirrus; first detailed analysis & explanation of power spectra of ice water content, including first observations of 50-km scale break and mixing-induced steepening of spectrum; first demonstration of the potential effect of wind shear on radiative fluxes by changing fall-streak orientation. Has spawned work on the effect of 3D photon transport on the radiative effects of cirrus clouds.
Abstract:
In this study, the processes affecting sea surface temperature variability over the 1992–98 period, encompassing the very strong 1997–98 El Niño event, are analyzed. A tropical Pacific Ocean general circulation model, forced by a combination of weekly ERS1–2 and TAO wind stresses, and climatological heat and freshwater fluxes, is first validated against observations. The model reproduces the main features of the tropical Pacific mean state, despite a weaker than observed thermal stratification, a 0.1 m s−1 too strong (weak) South Equatorial Current (North Equatorial Countercurrent), and a slight underestimate of the Equatorial Undercurrent. Good agreement is found between the model dynamic height and TOPEX/Poseidon sea level variability, with correlation/rms differences of 0.80/4.7 cm on average in the 10°N–10°S band. The model sea surface temperature variability is a bit weak, but reproduces the main features of interannual variability during the 1992–98 period. The model compares well with the TAO current variability at the equator, with correlation/rms differences of 0.81/0.23 m s−1 for surface currents. The model therefore reproduces well the observed interannual variability, with wind stress as the only interannually varying forcing. This good agreement with observations provides confidence in the comprehensive three-dimensional circulation and thermal structure of the model. A close examination of mixed layer heat balance is thus undertaken, contrasting the mean seasonal cycle of the 1993–96 period and the 1997–98 El Niño. In the eastern Pacific, cooling by exchanges with the subsurface (vertical advection, mixing, and entrainment), the atmospheric forcing, and the eddies (mainly the tropical instability waves) are the three main contributors to the heat budget. In the central–western Pacific, the zonal advection by low-frequency currents becomes the main contributor. Westerly wind bursts (in December 1996 and March and June 1997) were found to play a decisive role in the onset of the 1997–98 El Niño. They contributed to the early warming in the eastern Pacific because the downwelling Kelvin waves that they excited diminished subsurface cooling there. But it is mainly through eastward advection of the warm pool that they generated temperature anomalies in the central Pacific. The end of El Niño can be linked to the large-scale easterly anomalies that developed in the western Pacific and spread eastward, from the end of 1997 onward. In the far-western Pacific, because of the shallower than normal thermocline, these easterlies cooled the SST by vertical processes. In the central Pacific, easterlies pushed the warm pool back to the west. In the east, they led to a shallower thermocline, which ultimately allowed subsurface cooling to resume and to quickly cool the surface layer.
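The comparison statistics quoted above (correlation and rms difference between model and observed fields) can be computed along the following lines; this is a generic sketch with placeholder names, not the study's own diagnostics.

    import numpy as np

    def validation_stats(model_field, obs_field):
        """Correlation and rms difference between two co-located fields."""
        corr = np.corrcoef(model_field.ravel(), obs_field.ravel())[0, 1]
        rms = np.sqrt(np.mean((model_field - obs_field) ** 2))
        return corr, rms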
Abstract:
Intercontinental Transport of Ozone and Precursors (ITOP) (part of the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT)) was an intense research effort to measure long-range transport of pollution across the North Atlantic and its impact on O3 production. During the aircraft campaign, plumes were encountered containing large concentrations of CO plus other tracers and aerosols from forest fires in Alaska and Canada. A chemical transport model, p-TOMCAT, and new biomass burning emissions inventories are used to study the long-range transport of the emissions and their impact on the tropospheric O3 budget. The fire plume structure is modeled well over long distances until it encounters convection over Europe. The CO values within the simulated plumes closely match aircraft measurements near North America and over the Atlantic and agree well with MOPITT CO data. O3 and NOx values were initially too high in the model plumes. However, by including additional vertical mixing of O3 above the fires, and using a lower NO2/CO emission ratio (0.008) for boreal fires, O3 concentrations are reduced closer to aircraft measurements, with NO2 closer to SCIAMACHY data. Too little PAN is produced within the simulated plumes, and the simplicity of our VOC scheme may be another reason for the O3 and NOx model-data discrepancies. In the p-TOMCAT simulations the fire emissions lead to increased tropospheric O3 over North America, the North Atlantic and western Europe from photochemical production and transport. The increased O3 over the Northern Hemisphere in the simulations reaches a peak in July 2004 in the range 2.0 to 6.2 Tg over a baseline of about 150 Tg.
Abstract:
This paper presents a first attempt to estimate mixing parameters from sea level observations using a particle method based on importance sampling. The method is applied to an ensemble of 128 members of model simulations with a global ocean general circulation model of high complexity. Idealized twin experiments demonstrate that the method is able to accurately reconstruct mixing parameters from an observed mean sea level field when mixing is assumed to be spatially homogeneous. An experiment with inhomogeneous eddy coefficients fails because of the limited ensemble size. This is overcome by the introduction of local weighting, which is able to capture spatial variations in mixing qualitatively. As the sensitivity of sea level to variations in mixing is higher for low values of the mixing coefficients, the method works relatively well in regions of low eddy activity.
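A hedged Python sketch of parameter estimation by importance sampling, in the spirit of the particle method described above: each ensemble member carries a candidate mixing coefficient, is weighted by how well its simulated sea level matches the observations, and the weighted mean gives the estimate. The Gaussian likelihood and all names below are assumptions.

    import numpy as np

    def importance_weights(sea_level_members, sea_level_obs, obs_error):
        """Normalised weights for an ensemble (n_members, n_points), given an
        observed sea level field and a Gaussian observation error (m)."""
        misfit = np.sum((sea_level_members - sea_level_obs) ** 2, axis=1)
        log_w = -0.5 * misfit / obs_error ** 2
        log_w -= log_w.max()                 # guard against underflow
        weights = np.exp(log_w)
        return weights / weights.sum()

    # weighted-mean mixing coefficient from the ensemble, e.g.:
    # kappa_hat = np.sum(importance_weights(h_members, h_obs, 0.05) * kappa_members)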
Abstract:
Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of construction of the nonparametric mixture model estimator are reviewed and set into perspective. Construction of the maximum likelihood estimator of the mixing distribution is done for any number of components up to the global nonparametric maximum likelihood bound using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered with some generalisations of Zelterman’s estimator. All computations are done with CAMCR, a special software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems using the mixture model-based estimators are highlighted.
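For reference, the two closed-form estimators named in this abstract take the following standard forms, computed from the number of distinct units observed (n) and the frequencies of units seen once (f1) and twice (f2); this is an illustrative sketch, not the CAMCR implementation.

    from math import exp

    def chao_estimate(n, f1, f2):
        """Chao's lower-bound estimator of population size."""
        return n + f1 ** 2 / (2.0 * f2)

    def zelterman_estimate(n, f1, f2):
        """Zelterman's estimator: Poisson rate from f1 and f2, then a
        truncated-Poisson correction for the unseen units."""
        lam = 2.0 * f2 / f1
        return n / (1.0 - exp(-lam))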
Abstract:
In designing modern office buildings, building spaces are frequently zoned by introducing internal partitioning, which may have a significant influence on the room air environment. This internal partitioning was studied by means of model tests, numerical simulation and, as the final stage, statistical analysis. In this paper, the results produced from the statistical analysis are summarized and presented.
Effect of internal partitioning on indoor air quality of rooms with mixing ventilation - basic study
Abstract:
Internal partitioning, which is frequently introduced in open-space planning due to its flexibility, was tested to study its effects on room air quality as well as on ventilation performance. For the study, physical tests using a small model room and numerical modeling using CFD computation were used to evaluate different test conditions employing mixing ventilation from the ceiling. The partition parameters (location, height and the gap underneath), as well as the contaminant source location, were tested under isothermal conditions. This paper summarizes the results from the study.
Abstract:
The development and performance of a three-stage tubular model of the large human intestine is outlined. Each stage comprises a membrane fermenter where flow of an aqueous polyethylene glycol solution on the outside of the tubular membrane is used to control the removal of water and metabolites (principally short chain fatty acids) from, and thus the pH of, the flowing contents on the fermenter side. The three-stage system gave a fair representation of conditions in the human gut. Numbers of the main bacterial groups were consistently higher than in an existing three-chemostat gut model system, suggesting the advantages of the new design in providing an environment for bacterial growth that represents the actual colonic microflora. Concentrations of short chain fatty acids and pH levels throughout the system were similar to those associated with corresponding sections of the human colon. The model was able to achieve considerable water transfer across the membrane, although the values were not as high as those in the colon. The model thus goes some way towards a realistic simulation of the colon, although it makes no pretence to simulate the pulsating nature of the real flow. The flow conditions in each section are characterized by low Reynolds numbers: mixing due to Taylor dispersion is significant, and the implications of Taylor mixing and biofilm development for the stability of the system, that is its ability to operate without washout, are briefly analysed and discussed. It is concluded that both phenomena are important for stabilizing the model and the human colon.
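The two flow diagnostics this abstract appeals to, the tube-flow Reynolds number and the Taylor-Aris effective dispersion coefficient, can be evaluated as below; the default property values are water-like placeholders rather than the fermenter's operating conditions.

    def reynolds_number(velocity, diameter, density=1000.0, viscosity=1e-3):
        """Reynolds number for tube flow (SI units; water-like defaults)."""
        return density * velocity * diameter / viscosity

    def taylor_dispersion(velocity, radius, molecular_diffusivity):
        """Taylor-Aris effective axial dispersion coefficient (m2 s-1)
        for laminar flow in a circular tube."""
        return molecular_diffusivity + (velocity ** 2 * radius ** 2) / (48.0 * molecular_diffusivity)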
Abstract:
A cross-platform field campaign, OP3, was conducted in the state of Sabah in Malaysian Borneo between April and July of 2008. Among the suite of observations recorded, the campaign included measurements of NOx and O3 – crucial outputs of any model chemistry mechanism. We describe the measurements of these species made from both the ground site and aircraft. We then use the output from two resolutions of the chemistry transport model p-TOMCAT to illustrate the ability of a global model chemical mechanism to capture the chemistry at the rainforest site. The basic model performance is good for NOx and poor for ozone. A box model containing the same chemical mechanism is used to explore the results of the global model in more depth and make comparisons between the two. Without some parameterization of the nighttime boundary layer – free troposphere mixing (i.e. the use of a dilution parameter), the box model does not reproduce the observations, pointing to the importance of adequately representing physical processes for comparisons with surface measurements. We conclude with a discussion of box model budget calculations of chemical reaction fluxes, deposition and mixing, and compare these results to output from p-TOMCAT. These show the same chemical mechanism behaves similarly in both models, but that emissions and advection play particularly strong roles in influencing the comparison to surface measurements.
Abstract:
In this paper the meteorological processes responsible for transporting tracer during the second ETEX (European Tracer EXperiment) release are determined using the UK Met Office Unified Model (UM). The UM-predicted distribution of tracer is also compared with observations from the ETEX campaign. The dominant meteorological process is a warm conveyor belt which transports large amounts of tracer away from the surface up to a height of 4 km over a 36 h period. Convection is also an important process, transporting tracer to heights of up to 8 km. Potential sources of error when using an operational numerical weather prediction model to forecast air quality are also investigated. These potential sources of error include model dynamics, model resolution and model physics. In the UM a semi-Lagrangian monotonic advection scheme is used with cubic polynomial interpolation. This can predict unrealistic negative values of tracer, which are subsequently set to zero, and hence results in an overprediction of tracer concentrations. In order to conserve mass in the UM tracer simulations it was necessary to include a flux-corrected transport method. Model resolution can also affect the accuracy of predicted tracer distributions. Low-resolution simulations (50 km grid length) were unable to resolve a change in wind direction observed during ETEX 2; this led to an error in the transport direction and hence an error in the tracer distribution. High-resolution simulations (12 km grid length) captured the change in wind direction and hence produced a tracer distribution that compared better with the observations. The representation of convective mixing was found to have a large effect on the vertical transport of tracer. Turning off the convective mixing parameterisation in the UM significantly reduced the vertical transport of tracer. Finally, air quality forecasts were found to be sensitive to the timing of synoptic-scale features. Errors of only 1 h in the position of the cold front relative to the tracer release location resulted in changes in the predicted tracer concentrations that were of the same order of magnitude as the absolute tracer concentrations.
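A toy illustration of the problem described above: a non-monotone interpolation can leave small negative tracer values, and simply resetting them to zero adds mass, which is why a conservation fix is needed. The global rescaling below is only a stand-in for a flux-corrected transport scheme, and the numbers are invented.

    import numpy as np

    advected = np.array([0.8, 0.3, -0.05, 0.0, -0.02])    # tracer after an advection step
    clipped = np.maximum(advected, 0.0)                   # negatives reset to zero
    spurious_mass = clipped.sum() - advected.sum()        # mass created by the clipping
    conserved = clipped * advected.sum() / clipped.sum()  # rescaled to restore the original total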
Abstract:
The Cambridge Tropospheric Trajectory model of Chemistry and Transport (CiTTyCAT), a Lagrangian chemistry model, has been evaluated using atmospheric chemical measurements collected during the East Atlantic Summer Experiment 1996 (EASE '96). This field campaign was part of the UK Natural Environment Research Council's (NERC) Atmospheric Chemistry Studies in the Oceanic Environment (ACSOE) programme, conducted at Mace Head, Republic of Ireland, during July and August 1996. The model includes a description of gas-phase tropospheric chemistry, and simple parameterisations for surface deposition, mixing from the free troposphere and emissions. The model generally compares well with the measurements and is used to study the production and loss of O3 under a variety of conditions. The mean difference between the hourly O3 concentrations calculated by the model and those measured is 0.6 ppbv with a standard deviation of 8.7 ppbv. Three specific air-flow regimes were identified during the campaign – westerly, anticyclonic (easterly) and south westerly. The westerly flow is typical of background conditions for Mace Head. However, on some occasions there was evidence of long-range transport of pollutants from North America. In periods of anticyclonic flow, air parcels had collected emissions of NOx and VOCs immediately before arriving at Mace Head, leading to O3 production. The level of calculated O3 depends critically on the precise details of the trajectory, and hence on the emissions into the air parcel. In several periods of south westerly flow, low concentrations of O3 were measured which were consistent with deposition and photochemical destruction inside the tropical marine boundary layer.