48 results for Dedicated Short-Range Communication
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Four perfluorocarbon tracer dispersion experiments were carried out in central London, United Kingdom, in 2004. These experiments were supplementary to the Dispersion of Air Pollution and its Penetration into the Local Environment (DAPPLE) campaign and consisted of ground-level releases, roof-level releases and mobile releases; the latter are believed to be the first such experiments to be undertaken. A detailed description of the experiments including release, sampling, analysis and wind observations is given. The characteristics of dispersion from the fixed and mobile sources are discussed and contrasted, in particular the decay in concentration levels away from the source location and the additional variability that results from the non-uniformity of vehicle speed. Copyright © 2009 Royal Meteorological Society
Abstract:
At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has continued practically uninterrupted since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the latest El Niño event in 1997/1998 is an example of such a successful prediction. This great achievement is due to a number of factors, including progress in computational technology and the establishment of global observing systems, combined with a systematic research program with an overall strategy towards building comprehensive prediction systems for climate and weather. In this article, I will discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to the efficient exploitation of computing power and of observations from new types of observing systems. Weather prediction is not an exact science, owing to unavoidable errors in initial data and in the models. Quantifying the reliability of a forecast is therefore essential, and probably more so the longer the forecast range. Ensemble prediction is thus a new and important concept in weather and climate prediction, which I believe will become a routine aspect of weather prediction in the future. The limit between weather and climate prediction is becoming more and more diffuse, and in the final part of this article I will outline the way I think development may proceed in the future.
Abstract:
An analysis of diabatic heating and moistening processes from 12-36 hour lead time forecasts from 12 Global Circulation Models is presented as part of the "Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)" project. A lead time of 12-36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding being too close to the initial spin-up for the models as they adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics is reasonably constrained, moistening and heating profiles have large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time step behaviour shows that some models exhibit strong intermittency in rainfall, and that the relationship between precipitation and dynamics differs between models. The wealth of model outputs archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process model experiments, and inform the priorities for field experiments and future observing systems.
Abstract:
In the last two decades substantial advances have been made in the understanding of the scientific basis of urban climates. These are reviewed here with attention to sustainability of cities, applications that use climate information, and scientific understanding in relation to measurements and modelling. Consideration is given from street (micro) scale to neighbourhood (local) to city and region (meso) scale. Those areas where improvements are needed in the next decade to ensure more sustainable cities are identified. High-priority recommendations are made in the following six strategic areas: observations, data, understanding, modelling, tools and education. These include the need for more operational urban measurement stations and networks; for an international data archive to aid translation of research findings into design tools, along with guidelines for different climate zones and land uses; to develop methods to analyse atmospheric data measured above complex urban surfaces; to improve short-range, high-resolution numerical prediction of weather, air quality and chemical dispersion through improved modelling of the biogeophysical features of the urban land surface; to improve education about urban meteorology; and to encourage communication across scientific disciplines at a range of spatial and temporal scales.
Abstract:
Almost all modern cars can be controlled remotely using a personal communicator (keyfob). However, the degree of interaction between currently available personal communicators and cars is very limited. The communication link is unidirectional and the communication range is limited to a few dozen meters. However, there are many interesting applications that could be supported if a keyfob were able to support energy-efficient bidirectional longer-range communication. In this paper we investigate off-the-shelf transceivers in terms of their usability for bidirectional longer-range communication. Our evaluation results show that existing transceivers can generally support the required communication ranges but that links tend to be very unreliable. This high unreliability must be handled in an energy-efficient way by the keyfob-to-car communication protocol in order to make off-the-shelf transceivers a viable solution.
Abstract:
ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions. The observing system changed considerably over this re-analysis period, with assimilable data provided by a succession of satellite-borne instruments from the 1970s onwards, supplemented by increasing numbers of observations from aircraft, ocean-buoys and other surface platforms, but with a declining number of radiosonde ascents since the late 1980s. The observations used in ERA-40 were accumulated from many sources. The first part of this paper describes the data acquisition and the principal changes in data type and coverage over the period. It also describes the data assimilation system used for ERA-40. This benefited from many of the changes introduced into operational forecasting since the mid-1990s, when the systems used for the 15-year ECMWF re-analysis (ERA-15) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis were implemented. Several of the improvements are discussed. General aspects of the production of the analyses are also summarized. A number of results indicative of the overall performance of the data assimilation system, and implicitly of the observing system, are presented and discussed. The comparison of background (short-range) forecasts and analyses with observations, the consistency of the global mass budget, the magnitude of differences between analysis and background fields and the accuracy of medium-range forecasts run from the ERA-40 analyses are illustrated. Several results demonstrate the marked improvement that was made to the observing system for the southern hemisphere in the 1970s, particularly towards the end of the decade. 
In contrast, the synoptic quality of the analysis for the northern hemisphere is sufficient to provide forecasts that remain skilful well into the medium range for all years. Two particular problems are also examined: excessive precipitation over tropical oceans and a too strong Brewer-Dobson circulation, both of which are pronounced in later years. Several other aspects of the quality of the re-analyses revealed by monitoring and validation studies are summarized. Expectations that the second-generation ERA-40 re-analysis would provide products that are better than those from the first-generation ERA-15 and NCEP/NCAR re-analyses are found to have been met in most cases. © Royal Meteorological Society, 2005. The contributions of N. A. Rayner and R. W. Saunders are Crown copyright.
Abstract:
For many networks in nature, science and technology, it is possible to order the nodes so that most links are short-range, connecting near-neighbours, and relatively few long-range links, or shortcuts, are present. Given a network as a set of observed links (interactions), the task of finding an ordering of the nodes that reveals such a range-dependent structure is closely related to some sparse matrix reordering problems arising in scientific computation. The spectral, or Fiedler vector, approach for sparse matrix reordering has successfully been applied to biological data sets, revealing useful structures and subpatterns. In this work we argue that a periodic analogue of the standard reordering task is also highly relevant. Here, rather than encouraging nonzeros only to lie close to the diagonal of a suitably ordered adjacency matrix, we also allow them to inhabit the off-diagonal corners. Indeed, for the classic small-world model of Watts & Strogatz (1998, Collective dynamics of ‘small-world’ networks. Nature, 393, 440–442) this type of periodic structure is inherent. We therefore devise and test a new spectral algorithm for periodic reordering. By generalizing the range-dependent random graph class of Grindrod (2002, Range-dependent random graphs and their application to modeling large small-world proteome datasets. Phys. Rev. E, 66, 066702-1–066702-7) to the periodic case, we can also construct a computable likelihood ratio that suggests whether a given network is inherently linear or periodic. Tests on synthetic data show that the new algorithm can detect periodic structure, even in the presence of noise. Further experiments on real biological data sets then show that some networks are better regarded as periodic than linear. Hence, we find both qualitative (reordered networks plots) and quantitative (likelihood ratios) evidence of periodicity in biological networks.
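The linear spectral reordering step that the periodic algorithm generalises can be sketched as follows. This is an illustrative implementation of the standard Fiedler-vector approach, not the authors' code, and the shuffled path graph used to exercise it is a hypothetical example:

```python
import numpy as np

def fiedler_ordering(A):
    """Order nodes by the Fiedler vector of the graph Laplacian.

    A: symmetric adjacency matrix (NumPy array). Returns a permutation
    that tends to place linked nodes close together, pushing the
    nonzeros of the reordered matrix towards the diagonal.
    """
    L = np.diag(A.sum(axis=1)) - A     # combinatorial Laplacian L = D - A
    vals, vecs = np.linalg.eigh(L)     # eigenvalues in ascending order
    fiedler = vecs[:, 1]               # eigenvector of 2nd-smallest eigenvalue
    return np.argsort(fiedler)

# A path graph 0-1-2-3-4 whose node labels have been shuffled:
# matrix node j corresponds to path vertex perm[j].
perm = [2, 0, 4, 1, 3]
A = np.zeros((5, 5))
for i in range(4):                     # connect consecutive path vertices
    a, b = perm.index(i), perm.index(i + 1)
    A[a, b] = A[b, a] = 1

order = fiedler_ordering(A)
B = A[np.ix_(order, order)]            # reordered adjacency is tridiagonal
```

For a path graph the Fiedler vector is monotone along the path, so the recovered ordering is the path itself (up to reversal); on noisy range-dependent networks the same procedure concentrates nonzeros near the diagonal rather than producing an exact band.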
Abstract:
The elucidation of spatial variation in the landscape can indicate potential wildlife habitats or breeding sites for vectors, such as ticks or mosquitoes, which cause a range of diseases. Information from remotely sensed data could aid the delineation of vegetation distribution on the ground in areas where local knowledge is limited. The data from digital images are often difficult to interpret because of pixel-to-pixel variation, that is, noise, and complex variation at more than one spatial scale. Landsat Enhanced Thematic Mapper Plus (ETM+) and Satellite Pour l'Observation de la Terre (SPOT) image data were analyzed for an area close to Douna in Mali, West Africa. The variograms of the normalized difference vegetation index (NDVI) from both types of image data were nested. The parameters of the nested variogram function from the Landsat ETM+ data were used to design the sampling for a ground survey of soil and vegetation data. Variograms of the soil and vegetation data showed that their variation was anisotropic and their scales of variation were similar to those of NDVI from the SPOT data. The short- and long-range components of variation in the SPOT data were filtered out separately by factorial kriging. The map of the short-range component appears to represent the patterns of vegetation and associated shallow slopes and drainage channels of the tiger bush system. The map of the long-range component also appeared to relate to broader patterns in the tiger bush and to gentle undulations in the topography. The results suggest that the types of image data analyzed in this study could be used to identify areas with more moisture in semiarid regions that could support wildlife and also be potential vector breeding sites.
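For readers unfamiliar with the index, the NDVI analysed above is computed per pixel from the near-infrared and red reflectance bands. A minimal sketch, using hypothetical reflectance values rather than the Douna imagery:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense green vegetation; values near zero
    or below indicate bare soil, rock or water.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red)

# Hypothetical reflectances for two pixels: vegetated vs bare soil.
nir = np.array([0.50, 0.30])
red = np.array([0.08, 0.25])
print(np.round(ndvi(nir, red), 3))   # prints [0.724 0.091]
```

Healthy vegetation reflects strongly in the near-infrared and absorbs red light, which is why the vegetated pixel scores much higher than the soil pixel.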
Abstract:
A partial phase diagram is constructed for diblock copolymer melts using lattice-based Monte Carlo simulations. This is done by locating the order-disorder transition (ODT) with the aid of a recently proposed order parameter and identifying the ordered phase over a wide range of copolymer compositions (0.2 ≤ f ≤ 0.8). Consistent with experiments, the disordered phase is found to exhibit direct first-order transitions to each of the ordered morphologies. This includes the spontaneous formation of a perforated-lamellar phase, which presumably forms in place of the gyroid morphology due to finite-size and/or nonequilibrium effects. Also included in our study is a detailed examination of disordered cylinder-forming (f = 0.3) diblock copolymers, revealing a substantial degree of pretransitional chain stretching and short-range order that set in well before the ODT, as observed previously in analogous studies on lamellar-forming (f = 0.5) molecules. © 2006 American Institute of Physics.
Abstract:
The ECMWF full-physics and dry singular vector (SV) packages, using a dry energy norm and a 1-day optimization time, are applied to four high impact European cyclones of recent years that were almost universally badly forecast in the short range. It is shown that these full-physics SVs are much more relevant to severe cyclonic development than those based on dry dynamics plus boundary layer alone. The crucial extra ingredient is the representation of large-scale latent heat release. The severe winter storms all have a long, nearly straight region of high baroclinicity stretching across the Atlantic towards Europe, with a tongue of very high moisture content on its equatorward flank. In each case some of the final-time top SV structures pick out the region of the actual storm. The initial structures were generally located in the mid- to low troposphere. Forecasts based on initial conditions perturbed by moist SVs with opposite signs and various amplitudes show the range of possible 1-day outcomes for reasonable magnitudes of forecast error. In each case one of the perturbation structures gave a forecast very much closer to the actual storm than the control forecast. Deductions are made about the predictability of high-impact extratropical cyclone events. Implications are drawn for the short-range forecast problem and suggestions made for one practicable way to approach short-range ensemble forecasting. Copyright © 2005 Royal Meteorological Society.
Abstract:
The influence matrix is used in ordinary least-squares applications for monitoring statistical multiple-regression analyses. Concepts related to the influence matrix provide diagnostics on the influence of individual data on the analysis - the analysis change that would occur by leaving one observation out, and the effective information content (degrees of freedom for signal) in any sub-set of the analysed data. In this paper, the corresponding concepts have been derived in the context of linear statistical data assimilation in numerical weather prediction. An approximate method to compute the diagonal elements of the influence matrix (the self-sensitivities) has been developed for a large-dimension variational data assimilation system (the four-dimensional variational system of the European Centre for Medium-Range Weather Forecasts). Results show that, in the boreal spring 2003 operational system, 15% of the global influence is due to the assimilated observations in any one analysis, and the complementary 85% is the influence of the prior (background) information, a short-range forecast containing information from earlier assimilated observations. About 25% of the observational information is currently provided by surface-based observing systems, and 75% by satellite systems. Low-influence data points usually occur in data-rich areas, while high-influence data points are in data-sparse areas or in dynamically active regions. Background-error correlations also play an important role: high correlation diminishes the observation influence and amplifies the importance of the surrounding real and pseudo observations (prior information in observation space). Incorrect specifications of background and observation-error covariance matrices can be identified, interpreted and better understood by the use of influence-matrix diagnostics for the variety of observation types and observed variables used in the data assimilation system. 
Copyright © 2004 Royal Meteorological Society
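In the ordinary least-squares setting that motivates these diagnostics, the influence matrix is the hat matrix H = X(XᵀX)⁻¹Xᵀ, whose diagonal elements are the self-sensitivities and whose trace gives the degrees of freedom for signal. A minimal sketch with a hypothetical design matrix (not the ECMWF variational system):

```python
import numpy as np

def self_sensitivities(X):
    """Diagonal of the OLS influence (hat) matrix H = X (X^T X)^{-1} X^T.

    Each diagonal element measures how strongly the corresponding
    observation influences its own fitted value; the trace of H equals
    the number of fitted parameters (degrees of freedom for signal).
    """
    H = X @ np.linalg.solve(X.T @ X, X.T)
    return np.diag(H)

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])  # 3 parameters
h = self_sensitivities(X)
print(round(float(h.sum()), 6))   # prints 3.0: trace of H = number of parameters
```

Because H is an orthogonal projection, each self-sensitivity lies between 0 and 1, mirroring the partition of total influence between observations and prior information described in the abstract.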
Abstract:
A novel type of tweezer molecule containing electron-rich 2-pyrenyloxy arms has been designed to exploit intramolecular hydrogen bonding in stabilising a preferred conformation for supramolecular complexation to complementary sequences in aromatic copolyimides. This tweezer-conformation is demonstrated by single-crystal X-ray analyses of the tweezer molecule itself and of its complex with an aromatic diimide model-compound. In terms of its ability to bind selectively to polyimide chains, the new tweezer molecule shows very high sensitivity to sequence effects. Thus, even low concentrations of tweezer relative to diimide units (<2.5 mol%) are sufficient to produce dramatic, sequence-related splittings of the pyromellitimide proton NMR resonances. These induced resonance-shifts arise from ring-current shielding of pyromellitimide protons by the pyrenyloxy arms of the tweezer-molecule, and the magnitude of such shielding is a function of the tweezer-binding constant for any particular monomer sequence. Recognition of both short-range and long-range sequences is observed, the latter arising from cumulative ring-current shielding of diimide protons by tweezer molecules binding at multiple adjacent sites on the copolymer chain.
Abstract:
Enantio-specific interactions on intrinsically chiral or chirally modified surfaces can be identified experimentally via comparison of the adsorption geometries of similar nonchiral and chiral molecules. Information about the effects of substrate-related and intermolecular interactions on the adsorption geometry of glycine, the only natural nonchiral amino acid, is therefore important for identifying enantio-specific interactions of larger chiral amino acids. We have studied the long- and short-range adsorption geometry and bonding properties of glycine on the intrinsically chiral Cu{531} surface with low-energy electron diffraction, near-edge X-ray absorption fine structure spectroscopy, X-ray photoelectron spectroscopy, and temperature-programmed desorption. For coverages between 0.15 and 0.33 ML (saturated chemisorbed layer) and temperatures between 300 and 430 K, glycine molecules adsorb in two different azimuthal orientations, which are associated with adsorption sites on the {110} and {311} microfacets of Cu{531}. Both types of adsorption sites allow a triangular footprint with surface bonds through the two oxygen atoms and the nitrogen atom. The occupation of the two adsorption sites is equal for all coverages, which can be explained by pair formation due to similar site-specific adsorption energies and the possibility of forming hydrogen bonds between molecules on adjacent {110} and {311} sites. This is not the case for alanine and points toward higher site specificity in the case of alanine, which is eventually responsible for the enantiomeric differences observed for the alanine system.
Abstract:
A 24-member ensemble of 1-h high-resolution forecasts over the Southern United Kingdom is used to study short-range forecast error statistics. The initial conditions are found from perturbations from an ensemble transform Kalman filter. Forecasts from this system are assumed to lie within the bounds of forecast error of an operational forecast system. Although noisy, this system is capable of producing physically reasonable statistics which are analysed and compared to statistics implied from a variational assimilation system. The variances for temperature errors for instance show structures that reflect convective activity. Some variables, notably potential temperature and specific humidity perturbations, have autocorrelation functions that deviate from 3-D isotropy at the convective-scale (horizontal scales less than 10 km). Other variables, notably the velocity potential for horizontal divergence perturbations, maintain 3-D isotropy at all scales. Geostrophic and hydrostatic balances are studied by examining correlations between terms in the divergence and vertical momentum equations respectively. Both balances are found to decay as the horizontal scale decreases. It is estimated that geostrophic balance becomes less important at scales smaller than 75 km, and hydrostatic balance becomes less important at scales smaller than 35 km, although more work is required to validate these findings. The implications of these results for high-resolution data assimilation are discussed.
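The basic computation behind such ensemble-derived error statistics, perturbations about the ensemble mean and the variances and correlations implied by them, can be sketched as follows. The data here are synthetic, not the 24-member convective-scale system described in the abstract:

```python
import numpy as np

def ensemble_stats(forecasts):
    """Error statistics from an ensemble of forecasts.

    forecasts: array of shape (n_members, n_points), e.g. one model
    variable sampled at several grid points. Perturbations are the
    deviations from the ensemble mean; the correlation matrix is the
    normalised sample covariance of those perturbations.
    """
    perturbations = forecasts - forecasts.mean(axis=0)
    cov = perturbations.T @ perturbations / (forecasts.shape[0] - 1)
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)
    return np.diag(cov), corr

rng = np.random.default_rng(1)
members = rng.normal(size=(24, 5))       # 24 members, 5 grid points
var, corr = ensemble_stats(members)
print(bool(np.allclose(np.diag(corr), 1.0)))   # prints True
```

With real perturbations, examining how `corr` decays with separation between grid points gives the autocorrelation structure discussed above, and comparing variances across variables reveals features such as the convectively driven temperature-error structures.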
Abstract:
The dispersion of a point-source release of a passive scalar in a regular array of cubical, urban-like obstacles is investigated by means of direct numerical simulations. The simulations are conducted under conditions of neutral stability and fully rough turbulent flow, at a roughness Reynolds number of Reτ = 500. The Navier–Stokes and scalar equations are integrated assuming a constant rate release from a point source close to the ground within the array. We focus on short-range dispersion, when most of the material is still within the building canopy. Mean and fluctuating concentrations are computed for three different pressure gradient directions (0°, 30°, 45°). The results agree well with available experimental data measured in a water channel for a flow angle of 0°. Profiles of mean concentration and the three-dimensional structure of the dispersion pattern are compared for the different forcing angles. A number of processes affecting the plume structure are identified and discussed, including: (i) advection or channelling of scalar down 'streets', (ii) lateral dispersion by turbulent fluctuations and topological dispersion induced by dividing streamlines around buildings, (iii) skewing of the plume due to flow turning with height, (iv) detrainment by turbulent dispersion or mean recirculation, (v) entrainment and release of scalar in building wakes, giving rise to 'secondary sources', (vi) plume meandering due to unsteady turbulent fluctuations. Finally, results on relative concentration fluctuations are presented and compared with the literature for point source dispersion over flat terrain and urban arrays. Keywords: Direct numerical simulation · Dispersion modelling · Urban array