56 results for Shared component model

in CentAUR: Central Archive University of Reading - UK


Relevance: 80.00%

Abstract:

In the Eady model, where the meridional potential vorticity (PV) gradient is zero, perturbation energy growth can be partitioned cleanly into three mechanisms: (i) shear instability, (ii) resonance, and (iii) the Orr mechanism. Shear instability involves two-way interaction between Rossby edge waves on the ground and lid, resonance occurs as interior PV anomalies excite the edge waves, and the Orr mechanism involves only interior PV anomalies. These mechanisms have distinct implications for the structural and temporal linear evolution of perturbations. Here, a new framework is developed in which the same mechanisms can be distinguished for growth on basic states with nonzero interior PV gradients. It is further shown that the evolution from quite general initial conditions can be accurately described (peak error in perturbation total energy typically less than 10%) by a reduced system that involves only three Rossby wave components. Two of these are counterpropagating Rossby waves—that is, generalizations of the Rossby edge waves when the interior PV gradient is nonzero—whereas the other component depends on the structure of the initial condition and its PV is advected passively with the shear flow. In the cases considered, the three-component model outperforms approximate solutions based on truncating a modal or singular vector basis.
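The phase-locking growth of the two counterpropagating Rossby waves can be caricatured as a two-component linear system; this is a minimal sketch, and the wavenumber, phase speeds, and coupling strength below are illustrative values, not parameters from the paper.

```python
import numpy as np

# Toy model of two counterpropagating Rossby waves (CRWs): each wave
# propagates against the local shear flow and amplifies the other
# through its meridional wind.  All parameter values are illustrative.
k = 1.0                 # zonal wavenumber
c1, c2 = -0.4, 0.4      # intrinsic phase speeds of the two edge waves
s = 0.5                 # mutual interaction (coupling) strength

# Linear evolution d/dt [b1, b2]^T = A [b1, b2]^T of complex amplitudes
A = np.array([[-1j * k * c1, s],
              [s, -1j * k * c2]])

# Normal-mode growth rate: positive when the coupling s exceeds k*|c1|,
# i.e. when the waves can phase-lock despite their counter-propagation
growth_rate = np.linalg.eigvals(A).real.max()
print(f"growth rate: {growth_rate:.3f}")
```

With these values the waves phase-lock and grow; reducing s below k|c1| leaves two neutral propagating modes, mirroring the distinction between shear instability and neutral wave propagation.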

Relevance: 80.00%

Abstract:

What happens when digital coordination practices are introduced into the institutionalized setting of an engineering project? This question is addressed through an interpretive study that examines how a shared digital model becomes used in the late design stages of a major station refurbishment project. The paper contributes by mobilizing the idea of ‘hybrid practices’ to understand the diverse patterns of activity that emerge to manage digital coordination of design. It articulates how the engineering and architecture professions develop different relationships with the shared model; how the design team negotiates paper-based practices across organizational boundaries; and how diverse practitioners probe the potential and limitations of the digital infrastructure. While different software packages and tools have become linked together into an integrated digital infrastructure, these emerging hybrid practices contrast with the interactions anticipated in practice and policy guidance, presenting new opportunities and challenges for managing project delivery. The study has implications for researchers working in the growing field of empirical work on engineering project organizations, as it shows the importance of considering, and suggests new ways to theorise, the introduction of digital coordination practices into these institutionalized settings.

Relevance: 80.00%

Abstract:

The concept of a slowest invariant manifold is investigated for the five-component model of Lorenz under conservative dynamics. It is shown that Lorenz's model is a two-degree-of-freedom canonical Hamiltonian system, consisting of a nonlinear vorticity-triad oscillator coupled to a linear gravity wave oscillator, whose solutions consist of regular and chaotic orbits. When either the Rossby number or the rotational Froude number is small, there is a formal separation of timescales, and one can speak of fast and slow motion. In the same regime, the coupling is weak, and the Kolmogorov-Arnold-Moser theorem is shown to apply. The chaotic orbits are inherently unbalanced and are confined to regions sandwiched between invariant tori consisting of quasi-periodic regular orbits. The regular orbits generally contain free fast motion, but a slowest invariant manifold may be geometrically defined as the set of all slow cores of invariant tori (defined by zero fast action) that are smoothly related to such cores in the uncoupled system. This slowest invariant manifold is not global (indeed, its structure is fractal), but it is of nearly full measure in the limit of weak coupling. It is also nonlinearly stable. As the coupling increases, the slowest invariant manifold shrinks until it disappears altogether. The results clarify previous definitions of a slowest invariant manifold and highlight the ambiguity in the definition of “slowness.” An asymptotic procedure, analogous to standard initialization techniques, is found to yield nonzero free fast motion even when the core solutions contain none. A hierarchy of Hamiltonian balanced models preserving the symmetries in the original low-order model is formulated; these models are compared with classic balanced models, asymptotically initialized solutions of the full system, and the slowest invariant manifold defined by the core solutions. The analysis suggests that for sufficiently small Rossby or rotational Froude numbers, a stable slowest invariant manifold can be defined for this system, which has zero free gravity wave activity, but it cannot be defined everywhere. The implications of the results for more complex systems are discussed.

Relevance: 40.00%

Abstract:

The extent and thickness of the Arctic sea ice cover have decreased dramatically in the past few decades, with minima in sea ice extent in September 2005 and 2007. These minima were not predicted in the IPCC AR4 report, suggesting that the sea ice component of climate models should more realistically represent the processes controlling the sea ice mass balance. One of the processes poorly represented in sea ice models is the formation and evolution of melt ponds. Melt ponds accumulate on the surface of sea ice from snow and sea ice melt, and their presence reduces the albedo of the ice cover, leading to further melt. Toward the end of the melt season, melt ponds cover up to 50% of the sea ice surface. We have developed a melt pond evolution theory. Here, we incorporate this melt pond theory into the Los Alamos CICE sea ice model, which has required us to include the refreezing of melt ponds. We present results showing that the presence, or otherwise, of a representation of melt ponds has a significant effect on the predicted sea ice thickness and extent. We also present a sensitivity study to uncertainty in the sea ice permeability, the number of thickness categories in the model representation, the meltwater redistribution scheme, and the pond albedo. We conclude with a recommendation that our melt pond scheme be included in sea ice models, and that the number of thickness categories be increased and concentrated at lower thicknesses.
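The pond-albedo feedback described above can be caricatured in a few lines. The albedo values, solar flux, pond growth rate, and melt-energy conversion below are invented for illustration; this is not the CICE melt pond scheme.

```python
# Toy pond-albedo feedback: ponds lower the surface albedo, absorbed
# shortwave melts ice, and melting expands the ponds.  All numbers
# are illustrative assumptions, not the CICE parameterization.
a_ice, a_pond = 0.6, 0.25      # bare-ice and melt-pond albedos (assumed)
solar = 300.0                  # incident shortwave, W m^-2 (assumed)
joules_per_metre = 3.0e8       # energy to melt 1 m of ice per m^2 (approx.)
thickness = 2.0                # initial ice thickness, m
pond_frac = 0.0

for day in range(30):
    albedo = (1 - pond_frac) * a_ice + pond_frac * a_pond
    absorbed = (1 - albedo) * solar * 86400          # J m^-2 over one day
    thickness -= absorbed / joules_per_metre
    # ponds expand as melt proceeds, capped near the ~50% peak coverage
    pond_frac = min(0.5, pond_frac + 0.01)

print(f"ice thickness after 30 days: {thickness:.2f} m")
```

Holding pond_frac at zero in the same loop leaves noticeably thicker ice at the end of the melt season, which is the qualitative effect of including or omitting ponds that the abstract reports.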

Relevance: 40.00%

Abstract:

This article explores the problematic nature of the label “home ownership” through a case study of the English model of shared ownership, one of the methods used by the UK government to make home ownership affordable. Adopting a legal and socio-legal analysis, the article considers whether shared ownership is capable of fulfilling the aspirations households have for home ownership. To do so, the article considers the financial and nonfinancial meanings attached to home ownership and suggests that the core expectation lies in ownership of the value. The article demonstrates that the rights and responsibilities of shared owners are different in many respects from those of traditional home owners, including their rights as regards ownership of the value. By examining home ownership through the lens of shared ownership the article draws out lessons of broader significance to housing studies. In particular, it is argued that shared ownership shows the limitations of two dichotomies commonly used in housing discourse: that between private and social housing; and the classification of tenure between owner-occupiers and renters. The article concludes that a much more nuanced way of referring to home ownership is required, and that there is a need for a change of expectations amongst consumers as to what sharing ownership means.

Relevance: 30.00%

Abstract:

There are at least three distinct time scales that are relevant for the evolution of atmospheric convection. These are the time scale of the forcing mechanism, the time scale governing the response to a steady forcing, and the time scale of the response to variations in the forcing. The last of these, tmem, is associated with convective life cycles, which provide an element of memory in the system. A highly simplified model of convection is introduced, which allows for investigation of the character of convection as a function of the three time scales. For short tmem, the convective response is strongly tied to the forcing as in conventional equilibrium parameterization. For long tmem, the convection responds only to the slowly evolving component of forcing, and any fluctuations in the forcing are essentially suppressed. At intermediate tmem, convection becomes less predictable: conventional equilibrium closure breaks down and current levels of convection modify the subsequent response.
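The role of the memory time scale can be sketched with a one-line relaxation model: convection C relaxes toward an oscillating forcing F(t) with time constant tau_mem. This is a caricature of the behaviour described above, not the authors' model, and all numbers are illustrative.

```python
import math

# Relaxation sketch of convective memory: convection C relaxes toward a
# forcing F(t) with memory time tau_mem.  Illustrative values throughout.
def fluctuation_response(tau_mem, dt=0.01, t_end=50.0):
    """Peak departure of C from the mean forcing, measured after spin-up."""
    c, t, amp = 1.0, 0.0, 0.0
    while t < t_end:
        f = 1.0 + 0.5 * math.sin(2 * math.pi * t / 5.0)  # mean + fluctuation
        c += dt * (f - c) / tau_mem                      # forward Euler step
        if t > t_end / 2:
            amp = max(amp, abs(c - 1.0))
        t += dt
    return amp

# Short memory: convection tracks the forcing fluctuations almost fully.
# Long memory: fluctuations are filtered out; only the mean forcing is felt.
print(fluctuation_response(0.1), fluctuation_response(50.0))
```

Intermediate tau_mem gives a partial, lagged response, which is the regime where the abstract notes that equilibrium closure breaks down.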

Relevance: 30.00%

Abstract:

Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared difference, and their correlation. However, when the model predictions are spatially distributed across a landscape, the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may be different at different spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation that occurred at a fine spatial scale; (ii) correlated random variation that occurred at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed depending on the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were correlated poorly with the observations. At the scale of the trend, the predictions and observations shared a common surface. The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent with a non-spatial validation.
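The scale-dependence argument can be illustrated by comparing predictions with observations at a fine scale and again after block-averaging to a coarse scale. The synthetic data below stand in for the field measurements, and simple block means stand in for the REML linear mixed model; it is a sketch of the idea, not the study's method.

```python
import numpy as np

# Synthetic transect: observations and predictions share a coarse-scale
# trend, but the model misses the fine-scale variance (illustrative).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 400)
trend = 50 + 30 * x                        # shared coarse-scale trend
obs = trend + rng.normal(0, 5, x.size)     # fine-scale noise in observations
pred = trend + rng.normal(0, 2, x.size)    # model underestimates that noise

def block_mean(v, n):
    """Average consecutive blocks of n samples (coarse-graining)."""
    return v[: v.size // n * n].reshape(-1, n).mean(axis=1)

r_fine = np.corrcoef(obs, pred)[0, 1]
r_coarse = np.corrcoef(block_mean(obs, 40), block_mean(pred, 40))[0, 1]
print(f"fine-scale r = {r_fine:.2f}, coarse-scale r = {r_coarse:.2f}")
```

The correlation improves sharply at the coarse scale, the same qualitative picture as the abstract: poor agreement at the scales of random variation, a common surface at the scale of the trend.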

Relevance: 30.00%

Abstract:

Water quality models generally require a relatively large number of parameters to define their functional relationships, and since prior information on parameter values is limited, these are commonly defined by fitting the model to observed data. In this paper, the identifiability of water quality parameters and the associated uncertainty in model simulations are investigated. A modification to the water quality model 'Quality Simulation Along River Systems' is presented, in which an improved flow component is used within the existing water quality model framework. The performance of the model is evaluated in an application to the Bedford Ouse river, UK, using a Monte Carlo analysis toolbox. The essential framework of the model proved to be sound, and calibration and validation performance was generally good. However, some supposedly important water quality parameters associated with algal activity were found to be completely insensitive, and hence non-identifiable, within the model structure, while others (nitrification and sedimentation) had optimum values at or close to zero, indicating that those processes were not detectable from the data set examined.
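The Monte Carlo identifiability test can be sketched generically: sample parameter sets, score each simulation against observations, and ask whether the error surface actually responds to each parameter. The two-parameter decay model below is a stand-in for illustration, not QUASAR itself.

```python
import numpy as np

# Monte Carlo identifiability sketch: an inert parameter produces a flat
# error surface, i.e. it is non-identifiable from the data.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 50)
obs = 8.0 * np.exp(-0.3 * t)          # synthetic "observed" concentrations

def simulate(k_decay, k_algae):
    # k_algae is deliberately inert, mimicking a parameter whose process
    # is not detectable from the data set examined
    return 8.0 * np.exp(-k_decay * t) + 0.0 * k_algae

n = 2000
k_decay = rng.uniform(0.05, 0.6, n)
k_algae = rng.uniform(0.0, 1.0, n)
sse = np.array([np.sum((simulate(a, b) - obs) ** 2)
                for a, b in zip(k_decay, k_algae)])

corr_decay = abs(np.corrcoef(sse, k_decay)[0, 1])   # responsive: identifiable
corr_algae = abs(np.corrcoef(sse, k_algae)[0, 1])   # flat: non-identifiable
print(corr_decay, corr_algae)
```

A scatter of error against each sampled parameter (a "dotty plot") makes the same point visually: the decay rate shows a clear optimum, while the inert parameter shows none.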

Relevance: 30.00%

Abstract:

The performance of the atmospheric component of the new Hadley Centre Global Environmental Model (HadGEM1) is assessed in terms of its ability to represent a selection of key aspects of variability in the Tropics and extratropics. These include midlatitude storm tracks and blocking activity, synoptic variability over Europe, and the North Atlantic Oscillation together with tropical convection, the Madden-Julian oscillation, and the Asian summer monsoon. Comparisons with the previous model, the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3), demonstrate that there has been a considerable increase in the transient eddy kinetic energy (EKE), bringing HadGEM1 into closer agreement with current reanalyses. This increase in EKE results from the increased horizontal resolution and, in combination with the improved physical parameterizations, leads to improvements in the representation of Northern Hemisphere storm tracks and blocking. The simulation of synoptic weather regimes over Europe is also greatly improved compared to HadCM3, again due to both increased resolution and other model developments. The variability of convection in the equatorial region is generally stronger and closer to observations than in HadCM3. There is, however, still limited convective variance coincident with several of the observed equatorial wave modes. Simulation of the Madden-Julian oscillation is improved in HadGEM1: both the activity and interannual variability are increased and the eastward propagation, although slower than observed, is much better simulated. While some aspects of the climatology of the Asian summer monsoon are improved in HadGEM1, the upper-level winds are too weak and the simulation of precipitation deteriorates. The dominant modes of monsoon interannual variability are similar in the two models, although in HadCM3 this is linked to SST forcing, while in HadGEM1 internal variability dominates. 
Overall, analysis of the phenomena considered here indicates that HadGEM1 performs well and, in many important respects, improves upon HadCM3. Together with the improved representation of the mean climate, this improvement in the simulation of atmospheric variability suggests that HadGEM1 provides a sound basis for future studies of climate and climate change.

Relevance: 30.00%

Abstract:

Ice clouds are an important yet largely unvalidated component of weather forecasting and climate models, but radar offers the potential to provide the necessary data to evaluate them. First, coordinated aircraft in situ measurements and scans by a 3-GHz radar are presented, demonstrating that, for stratiform midlatitude ice clouds, radar reflectivity in the Rayleigh-scattering regime may be reliably calculated from aircraft size spectra if the "Brown and Francis" mass-size relationship is used. The comparisons spanned radar reflectivity values from -15 to +20 dBZ, ice water contents (IWCs) from 0.01 to 0.4 g m^-3, and median volumetric diameters between 0.2 and 3 mm. In mixed-phase conditions the agreement is much poorer because of the higher-density ice particles present. A large midlatitude aircraft dataset is then used to derive expressions that relate radar reflectivity and temperature to ice water content and visible extinction coefficient. The analysis is an advance over previous work in several ways: the retrievals vary smoothly with both input parameters, different relationships are derived for the common radar frequencies of 3, 35, and 94 GHz, and the problem of retrieving the long-term mean and the horizontal variance of ice cloud parameters is considered separately. It is shown that the dependence on temperature arises because of the temperature dependence of the number concentration "intercept parameter" rather than mean particle size. A comparison is presented of ice water content derived from scanning 3-GHz radar with the values held in the Met Office mesoscale forecast model, for eight precipitating cases spanning 39 h over southern England. It is found that the model predicted mean IWC to within 10% of the observations at temperatures between -30 and -10 degrees C, but tended to underestimate it by around a factor of 2 at colder temperatures.
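A reflectivity-temperature retrieval of the kind described above can be sketched with a relation of the general form IWC = a * Z^b * exp(c * T). The coefficients below are invented placeholders, not the fitted values from the paper, which derives separate relationships for 3, 35, and 94 GHz.

```python
import math

# Hedged sketch of a Z-T ice water content retrieval.  Coefficients
# a, b, c are illustrative assumptions, not the paper's fitted values.
def iwc_from_radar(dbz, temp_c, a=0.1, b=0.6, c=-0.02):
    z_linear = 10.0 ** (dbz / 10.0)          # dBZ -> linear reflectivity
    return a * z_linear ** b * math.exp(c * temp_c)   # g m^-3 (illustrative)

# At fixed reflectivity, a colder cloud yields a higher IWC, reflecting
# the temperature dependence of the size-distribution intercept parameter.
print(iwc_from_radar(5.0, -10.0), iwc_from_radar(5.0, -30.0))
```

Note the conversion from logarithmic dBZ to linear reflectivity before applying the power law; fitting in log space is what lets such retrievals vary smoothly with both inputs.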

Relevance: 30.00%

Abstract:

A rapid increase in the variety, quality, and quantity of observations in polar regions is leading to a significant improvement in the understanding of sea ice dynamic and thermodynamic processes and their representation in global climate models. We assess the simulation of sea ice in the new Hadley Centre Global Environmental Model (HadGEM1) against the latest available observations. The HadGEM1 sea ice component uses elastic-viscous-plastic dynamics, multiple ice thickness categories, and zero-layer thermodynamics. The model evaluation is focused on the mean state of the key variables of ice concentration, thickness, velocity, and albedo. The model shows good agreement with observational data sets. The variability of the ice forced by the North Atlantic Oscillation is also found to agree with observations.

Relevance: 30.00%

Abstract:

The distribution of tracers in the ocean is often taken as an indication of the ventilation pathways for oceanic water masses. It has been suggested that under anthropogenic forcing heat will be taken up into the interior of the ocean along isopycnal ventilation pathways. This notion is investigated by examining distributions of potential temperature and a passive anomaly temperature tracer in a coupled climate experiment where CO2 is increased at a rate of 2% per year. We show that interior temperature changes cannot be explained solely by passive tracer transport along isopycnals. Heat uptake is strongly affected by changes in circulation and has a substantial diapycnal component.

Relevance: 30.00%

Abstract:

A systematic modular approach to investigate the respective roles of the ocean and atmosphere in setting El Niño characteristics in coupled general circulation models is presented. Several state-of-the-art coupled models sharing either the same atmosphere or the same ocean are compared. Major results include 1) the dominant role of the atmosphere model in setting El Niño characteristics (periodicity and base amplitude) and errors (regularity) and 2) the considerable improvement, toward lower frequency, of simulated El Niño power spectra when the atmosphere resolution is significantly increased. Likely reasons for such behavior are briefly discussed. It is argued that this new modular strategy represents a generic approach to identifying the source of both coupled mechanisms and model error and will provide a methodology for guiding model improvement.

Relevance: 30.00%

Abstract:

Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there is little published comparison between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a 1-D space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2-D network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
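The Hilbert-curve step, mapping a quantized landmark latency vector to a scalar peer index, can be sketched with the classic iterative algorithm; the grid size and example latencies below are illustrative assumptions.

```python
def xy_to_hilbert(n, x, y):
    """Index of grid cell (x, y) along the Hilbert curve filling an
    n x n grid, with n a power of two (classic iterative algorithm)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:              # rotate the quadrant so the curve connects
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Example: a peer's latencies (ms) to two landmarks, quantized to a
# 256 x 256 grid; nearby latency vectors tend to receive nearby indices.
latencies = (120.0, 40.0)                     # illustrative measurements
gx, gy = min(int(latencies[0]), 255), min(int(latencies[1]), 255)
peer_id = xy_to_hilbert(256, gx, gy)
print(peer_id)
```

Sammon's mapping and Principal Component Analysis would replace xy_to_hilbert with a learned projection of the same feature vector; the comparison in the paper asks which mapping best preserves neighbourhoods under this reduction to one dimension.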