925 results for Travel Time Prediction
Abstract:
The DAPPLE (Dispersion of Air Pollutants and their Penetration into the Local Environment) project seeks to characterise near-field urban atmospheric dispersion using a multidisciplinary approach. In this paper we report on the first tracer dispersion experiment, carried out in May 2003. Results of concurrent meteorological measurements are presented, as are variations of receptor tracer concentration with time. Meteorological observations suggest that in-street channelling and flow-switching at intersections take place. A comparison between rooftop and surface measurements suggests that rapid vertical mixing occurs, and a comparison between a simple dispersion model and the maximum concentrations observed is presented.
Abstract:
The initial condition effect on climate prediction skill over a 2-year hindcast time-scale has been assessed from ensemble HadCM3 climate model runs using anomaly initialization over the period 1990–2001; comparisons are made with runs without initialization (equivalent to climatological conditions) and with anomaly persistence. It is shown that the assimilation improves the prediction skill in the first year globally, and in a number of limited areas out into the second year. Skill in hindcasting surface air temperature anomalies is most marked over ocean areas, and is coincident with areas of high sea surface temperature and ocean heat content skill. Skill improvement over land areas is much more limited but is still detectable in some cases. We found little difference in the skill of hindcasts using three different sets of ocean initial conditions, and we obtained the best results by combining these to form a grand ensemble hindcast set. Results are also compared with the idealized predictability studies of Collins (Clim. Dynam. 2002; 19: 671–692), which used the same model. The maximum lead time for which initialization gives enhanced skill over runs without initialization varies in different regions but is very similar to lead times found in the idealized studies, therefore strongly supporting the process representation in the model as well as its use for operational predictions. The limited 12-year period of the study, however, means that the regional details of model skill should probably be further assessed under a wider range of observational conditions.
Abstract:
This study investigated the potential application of mid-infrared spectroscopy (MIR 4,000–900 cm−1) for the determination of milk coagulation properties (MCP), titratable acidity (TA), and pH in Brown Swiss milk samples (n = 1,064). Because MCP directly influence the efficiency of the cheese-making process, there is strong industrial interest in developing a rapid method for their assessment. Currently, the determination of MCP involves time-consuming laboratory-based measurements, and it is not feasible to carry out these measurements on the large numbers of milk samples associated with milk recording programs. Mid-infrared spectroscopy is an objective and nondestructive technique providing rapid real-time analysis of food compositional and quality parameters. Analysis of milk rennet coagulation time (RCT, min), curd firmness (a30, mm), TA (SH°/50 mL; SH° = Soxhlet-Henkel degree), and pH was carried out, and MIR data were recorded over the spectral range of 4,000 to 900 cm−1. Models were developed by partial least squares regression using untreated and pretreated spectra. The MCP, TA, and pH prediction models were improved by using the combined spectral ranges of 1,600 to 900 cm−1, 3,040 to 1,700 cm−1, and 4,000 to 3,470 cm−1. The root mean square errors of cross-validation for the developed models were 2.36 min (RCT, range 24.9 min), 6.86 mm (a30, range 58 mm), 0.25 SH°/50 mL (TA, range 3.58 SH°/50 mL), and 0.07 (pH, range 1.15). The most successfully predicted attributes were TA, RCT, and pH. The model for the prediction of TA provided approximate prediction (R2 = 0.66), whereas the predictive models developed for RCT and pH could discriminate between high and low values (R2 = 0.59 to 0.62). It was concluded that, although the models require further development to improve their accuracy before their application in industry, MIR spectroscopy has potential application for the assessment of RCT, TA, and pH during routine milk analysis in the dairy industry. 
The implementation of such models could be a means of improving MCP through phenotype-based selection programs and of amending milk payment systems to incorporate MCP into their payment criteria.
Abstract:
The potential of a fibre optic sensor, detecting light backscatter in a cheese vat during coagulation and syneresis, to predict curd moisture, fat losses and curd yield was examined. Temperature, cutting time and calcium levels were varied to assess the strength of the predictions over a range of processing conditions. Equations were developed using a combination of independent variables, milk compositional and light backscatter parameters. Fat losses, curd yield and curd moisture content were predicted with a standard error of prediction (SEP) of ±2.65 g 100 g−1 (R2 = 0.93), ±0.95% (R2 = 0.90) and ±1.43% (R2 = 0.94), respectively. These results were used to develop a model for predicting curd moisture as a function of time during syneresis (SEP = ±1.72%; R2 = 0.95). By monitoring coagulation and syneresis, this sensor technology could be employed to control curd moisture content, thereby improving process control during cheese manufacture. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Many numerical models for weather prediction and climate studies are run at resolutions that are too coarse to resolve convection explicitly, but too fine to justify the local equilibrium assumed by conventional convective parameterizations. The Plant-Craig (PC) stochastic convective parameterization scheme, developed in this paper, solves this problem by removing the assumption that a given grid-scale situation must always produce the same sub-grid-scale convective response. Instead, for each timestep and gridpoint, one of the many possible convective responses consistent with the large-scale situation is randomly selected. The scheme requires as input the large-scale state as opposed to the instantaneous grid-scale state, but must nonetheless be able to account for genuine variations in the large-scale situation. Here we investigate the behaviour of the PC scheme in three-dimensional simulations of radiative-convective equilibrium, demonstrating in particular that the necessary space-time averaging required to produce a good representation of the input large-scale state is not in conflict with the requirement to capture large-scale variations. The resulting equilibrium profiles agree well with those obtained from established deterministic schemes, and with corresponding cloud-resolving model simulations. Unlike the conventional schemes, the statistics for mass flux and rainfall variability from the PC scheme also agree well with relevant theory and vary appropriately with spatial scale. The scheme is further shown to adapt automatically to changes in grid length and in forcing strength.
Abstract:
This paper provides a solution for predicting moving/moving and moving/static collisions of objects within a virtual environment. Feasible prediction in real-time virtual worlds can be obtained by encompassing moving objects within a sphere and static objects within a convex polygon. Fast solutions are then attainable by describing the movement of objects parametrically in time as a polynomial.
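The approach described above can be illustrated for the sphere/sphere case: with constant velocities, the squared separation between two bounding spheres is a polynomial (quadratic) in time, so the first contact time is its smaller non-negative root. This is a minimal sketch under that assumption, not the paper's implementation.

```python
# Hypothetical sketch: earliest collision time of two moving bounding
# spheres, found by solving |dp + dv*t| = r1 + r2 as a quadratic in t.
import math

def sphere_collision_time(p1, v1, r1, p2, v2, r2):
    """Return the earliest t >= 0 at which the spheres touch, or None."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    a = sum(c * c for c in dv)
    b = 2.0 * sum(p * v for p, v in zip(dp, dv))
    c = sum(p * p for p in dp) - (r1 + r2) ** 2
    if a == 0.0:                           # no relative motion
        return 0.0 if c <= 0.0 else None
    disc = b * b - 4.0 * a * c
    if disc < 0.0:                         # paths never come close enough
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a) # earlier of the two roots
    if t >= 0.0:
        return t
    return 0.0 if c <= 0.0 else None       # already overlapping, or past
```

The moving/static case follows by setting one velocity to zero; the paper's convex-polygon case replaces the sphere test with per-face distance polynomials.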
Abstract:
We study the feasibility of using the singular vector technique to create initial condition perturbations for short-range ensemble prediction systems (SREPS) focussing on predictability of severe local storms and in particular deep convection. For this a new final time semi-norm based on the convective available potential energy (CAPE) is introduced. We compare singular vectors using the CAPE-norm with SVs using the more common total energy (TE) norm for a 2-week summer period in 2007, which includes a case of mesoscale extreme rainfall in the south west of Finland. The CAPE singular vectors perturb the CAPE field by increasing the specific humidity and temperature of the parcel and increase the lapse rate above the parcel in the lower troposphere consistent with physical considerations. The CAPE-SVs are situated in the lower troposphere. This is in contrast to TE-SVs with short optimization times, which predominantly remain in the high troposphere. By examining the time evolution of the CAPE singular values we observe that the convective event in the south west of Finland is clearly associated with high CAPE singular values.
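For reference, the quantity underlying the norm above, CAPE, is the vertically integrated buoyancy of a lifted parcel. The sketch below uses the standard textbook definition, ignoring virtual-temperature and entrainment corrections; the profiles are illustrative, not taken from the paper.

```python
# Minimal sketch of CAPE = g * integral of (Tp - Te)/Te dz over
# positively buoyant layers, on a discrete height grid (trapezoidal rule).
import numpy as np

def cape(z, t_parcel, t_env):
    """CAPE (J/kg) from height levels z (m) and parcel/environment
    temperatures (K), counting only positively buoyant layers."""
    g = 9.81
    buoy = np.maximum((t_parcel - t_env) / t_env, 0.0)
    return float(g * np.sum(0.5 * (buoy[1:] + buoy[:-1]) * np.diff(z)))
```

A CAPE-based semi-norm then measures perturbation amplitude by the change in this integral rather than by total energy.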
Abstract:
The World Weather Research Programme (WWRP) and the World Climate Research Programme (WCRP) have identified collaborations and scientific priorities to accelerate advances in analysis and prediction at subseasonal-to-seasonal time scales, which include i) advancing knowledge of mesoscale–planetary-scale interactions and their prediction; ii) developing high-resolution global–regional climate simulations, with advanced representation of physical processes, to improve the predictive skill of subseasonal and seasonal variability of high-impact events, such as seasonal droughts and floods, blocking, and tropical and extratropical cyclones; iii) contributing to the improvement of data assimilation methods for monitoring and predicting used in coupled ocean–atmosphere–land and Earth system models; and iv) developing and transferring diagnostic and prognostic information tailored to socioeconomic decision making. The document puts forward specific underpinning research, linkage, and requirements necessary to achieve the goals of the proposed collaboration.
Abstract:
Predictability is considered in the context of the seamless weather-climate prediction problem, and the notion is developed that there can be predictive power on all time-scales. On every scale there are phenomena whose evolution, combined with longer time-scale variations and external conditions, should give some predictability. To what extent this theoretical predictability may actually be realised and, further, to what extent it may be useful is not clear. However, the potential should provide a stimulus to, and high profile for, our science and its application for many years.
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. 
However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
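The skill measure quoted above (pattern correlation of 0.62 versus 0.31) is the centred spatial correlation between forecast and observed anomaly fields. A minimal sketch follows; the optional area weighting (e.g. by the cosine of latitude) and the field values are assumptions for illustration.

```python
# Sketch of a centred (anomaly) pattern correlation between two fields,
# optionally area-weighted. Illustrative only; not the paper's code.
import numpy as np

def pattern_correlation(forecast, observed, weights=None):
    """Weighted centred correlation between two gridded anomaly fields."""
    f = np.asarray(forecast, dtype=float).ravel()
    o = np.asarray(observed, dtype=float).ravel()
    w = np.ones_like(f) if weights is None else np.asarray(weights, float).ravel()
    w = w / w.sum()
    fa = f - np.sum(w * f)   # remove weighted spatial mean
    oa = o - np.sum(w * o)
    return float(np.sum(w * fa * oa) /
                 np.sqrt(np.sum(w * fa * fa) * np.sum(w * oa * oa)))
```

A value of 1 indicates the forecast reproduces the observed spatial pattern exactly (up to scale and offset), 0 indicates no pattern agreement.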
Abstract:
We examine to what degree we can expect to obtain accurate temperature trends for the last two decades near the surface and in the lower troposphere. We compare temperatures obtained from surface observations and radiosondes as well as satellite-based measurements from the Microwave Sounding Units (MSU), which have been adjusted for orbital decay and non-linear instrument-body effects, and reanalyses from the European Centre for Medium-Range Weather Forecasts (ERA) and the National Centers for Environmental Prediction (NCEP). In regions with abundant conventional data coverage, where the MSU has no major influence on the reanalysis, temperature anomalies obtained from microwave sounders, radiosondes and from both reanalyses agree reasonably. Where coverage is insufficient, in particular over the tropical oceans, large differences are found between the MSU and either reanalysis. These differences apparently relate to changes in the satellite data availability and to differing satellite retrieval methodologies, to which both reanalyses are quite sensitive over the oceans. For NCEP, this results from the use of raw radiances directly incorporated into the analysis, which make the reanalysis sensitive to changes in the underlying algorithms, e.g. those introduced in August 1992. For ERA, the bias-correction of the one-dimensional variational analysis may introduce an error when the satellite relative to which the correction is calculated is biased itself or when radiances change on a time scale longer than a couple of months, e.g. due to orbit decay. ERA inhomogeneities are apparent in April 1985, October/November 1986 and April 1989. These dates can be identified with the replacements of satellites. It is possible that a negative bias in the sea surface temperatures (SSTs) used in the reanalyses may have been introduced over the period of the satellite record. 
This could have resulted from a decrease in the number of ship measurements, a concomitant increase in the importance of satellite-derived SSTs, and a likely cold bias in the latter. Alternatively, a warm bias in SSTs could have been caused by an increase in the percentage of buoy measurements (relative to deeper ship intake measurements) in the tropical Pacific. No indications for uncorrected inhomogeneities of land surface temperatures could be found. Near-surface temperatures have biases in the boundary layer in both reanalyses, presumably due to the incorrect treatment of snow cover. The increase of near-surface compared to lower tropospheric temperatures in the last two decades may be due to a combination of several factors, including high-latitude near-surface winter warming due to an enhanced NAO and upper-tropospheric cooling due to stratospheric ozone decrease.
Abstract:
This work presents a description of the 1979–2002 tropical Atlantic (TA) SST variability modes coupled to the anomalous West African (WA) rainfall during the monsoon season. The time-evolving SST patterns, with an impact on WA rainfall variability, are analyzed using a new methodology based on maximum covariance analysis. The enhanced Climate Prediction Center (CPC) Merged Analysis of Precipitation (CMAP) dataset, which includes measurements over the ocean, gives a complete picture of the interannual WA rainfall patterns for the Sahel dry period. The leading TA SST pattern, related to the Atlantic El Niño, is coupled to anomalous precipitation over the coast of the Gulf of Guinea, which corresponds to the second WA rainfall principal component. The thermodynamics and dynamics involved in the generation, development, and damping of this mode are studied and compared with previous works. The SST mode starts at the Angola/Benguela region and is caused by alongshore wind anomalies. It then propagates westward via Rossby waves and damps because of latent heat flux anomalies and Kelvin wave eastward propagation from an off-equatorial forcing. The second SST mode includes the Mediterranean and the Atlantic Ocean, showing how the Mediterranean SST anomalies are those that are directly associated with the Sahelian rainfall. The global signature of the TA SST patterns is analyzed, adding new insights about the Pacific–Atlantic link in relation to WA rainfall during this period. Also, this global picture suggests that the Mediterranean SST anomalies are a fingerprint of large-scale forcing. This work updates the results given by other authors, whose studies are based on different datasets dating back to the 1950s, including both the wet and the dry Sahel periods.
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, this is done over both land and ocean using night-time (infrared) imagery. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 87% and 48% for ocean and land, respectively, using the Bayesian technique, compared to 74% and 39%, respectively, for the threshold-based techniques associated with the validation dataset.
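The Bayesian step described above can be sketched for a single channel: given assumed likelihoods for the observed brightness temperature under the clear and cloudy hypotheses, Bayes' theorem yields a per-pixel cloud probability, which a threshold turns into a mask. The Gaussian likelihoods and every number below are illustrative assumptions, not the SEVIRI scheme's actual distributions.

```python
# Hypothetical single-channel Bayesian cloud probability: P(cloudy | y)
# from Gaussian likelihoods p(y | clear) and p(y | cloudy) and a prior.
import math

def cloud_probability(obs, clear_mean, clear_sd, cloudy_mean, cloudy_sd,
                      p_cloudy):
    """Posterior probability that a pixel with observation obs is cloudy."""
    def gauss(y, mu, sd):
        return math.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    num = gauss(obs, cloudy_mean, cloudy_sd) * p_cloudy
    den = num + gauss(obs, clear_mean, clear_sd) * (1.0 - p_cloudy)
    return num / den

# Application-specific masks come from thresholding the probability,
# e.g. a conservative mask: cloudy if P(cloudy | y) > 0.9.
```

In the operational setting the likelihood of the clear observation would come from NWP-simulated radiances rather than a fixed Gaussian, which is what ties the mask to the forecast fields.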
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, the technique is shown to be suitable for daytime applications over land and sea, using visible and near-infrared imagery, in addition to thermal infrared. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 89% and 73% for ocean and land, respectively, using the Bayesian technique, compared to 90% and 70%, respectively, for the threshold-based techniques associated with the validation dataset.