48 results for unified projection model
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the transactional contexts of the data captured by the respective OLTP systems. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk semantic gaps between the information captured by OLTP systems and the information recalled through OLAP systems. Literature on modelling business transaction information as facts with context, as part of information systems modelling, was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the design quality of OLTP and OLAP systems depends critically on capturing facts with their associated context; encoding facts and context into data with business rules; storing and sourcing data together with those business rules; decoding data with business rules back into facts with context; and recalling facts with their associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed model enables the implementation and use of multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would allow OLTP systems to record and store data together with the executions of business rules, so that both OLTP and OLAP systems can query data alongside the business rules used to capture them, ensuring that information recalled via OLAP systems preserves the transactional contexts of the data captured by the respective OLTP system.
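As an illustration of the co-design idea, a minimal sketch of storing facts alongside the business-rule version that captured them is given below. The schema is hypothetical; the paper does not specify table structures, and all table, column and rule names here are invented for illustration.

```python
# Hypothetical sketch of the UBIRQ idea: persist each transactional fact
# together with a reference to the exact business-rule version that produced
# it, so OLAP queries can later decode the fact in its original context.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE business_rule (
    rule_id      INTEGER PRIMARY KEY,
    rule_version TEXT NOT NULL,
    rule_text    TEXT NOT NULL          -- declarative/executable rule body
);
CREATE TABLE fact (
    fact_id      INTEGER PRIMARY KEY,
    payload      TEXT NOT NULL,         -- the captured transactional data
    rule_id      INTEGER NOT NULL REFERENCES business_rule(rule_id)
);
""")
conn.execute("INSERT INTO business_rule VALUES (1, 'v2.3', 'price = qty * unit_cost')")
conn.execute("""INSERT INTO fact VALUES (1, '{"qty": 4, "unit_cost": 2.5}', 1)""")

# An OLAP-side query can then recall each fact with the rules that captured it:
rows = conn.execute("""
    SELECT f.payload, r.rule_version, r.rule_text
    FROM fact f JOIN business_rule r ON r.rule_id = f.rule_id
""").fetchall()
print(rows)
```

The design choice illustrated is simply that the rule store is queryable by both OLTP and OLAP consumers, so the semantics travel with the data.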
Abstract:
The latest coupled configuration of the Met Office Unified Model (Global Coupled configuration 2, GC2) is presented. This paper documents the model components which make up the configuration (although the scientific description of these components is detailed elsewhere) and provides a description of the coupling between the components. The performance of GC2 in terms of its systematic errors is assessed using a variety of diagnostic techniques. The configuration is intended to be used by the Met Office and collaborating institutes across a range of timescales, with the seasonal forecast system (GloSea5) and climate projection system (HadGEM) being the initial users. In this paper GC2 is compared against the model currently used operationally in those two systems. Overall GC2 is shown to be an improvement on the configurations used currently, particularly in terms of modes of variability (e.g. mid-latitude and tropical cyclone intensities, the Madden–Julian Oscillation and the El Niño–Southern Oscillation). A number of outstanding errors are identified, the most significant being a considerable warm bias over the Southern Ocean and a dry precipitation bias in the Indian and West African summer monsoons. Research to address these is ongoing.
Abstract:
The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed against the standard convection scheme, whose only stochastic element comes from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
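The abstract does not enumerate the probabilistic measures used; a common choice for precipitation threshold exceedance is the Brier score, sketched below with synthetic ensemble data. All numbers and distributions are illustrative, not from the study.

```python
import numpy as np

def brier_score(prob_forecasts, occurred):
    """Brier score for exceedance of a precipitation threshold: mean squared
    difference between forecast probability and the binary outcome."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(occurred, dtype=float)   # 1 if threshold exceeded, else 0
    return np.mean((p - o) ** 2)

# Ensemble probabilities: fraction of the 24 members exceeding the threshold.
members = np.random.default_rng(0).gamma(0.4, 4.0, size=(34, 24))  # toy rain, mm
threshold = 1.0
p = (members > threshold).mean(axis=1)
obs = np.random.default_rng(1).gamma(0.4, 4.0, size=34) > threshold
print(brier_score(p, obs))   # lower is better; 0 would be a perfect forecast
```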
Abstract:
The characteristics of convectively generated gravity waves during an episode of deep convection near the coast of Wales are examined in both high-resolution mesoscale simulations [with the (UK) Met Office Unified Model] and in observations from a Mesosphere-Stratosphere-Troposphere (MST) wind-profiling Doppler radar. Deep convection reached the tropopause and generated vertically propagating, high-frequency waves in the lower stratosphere that produced vertical velocity perturbations O(1 m/s). Wavelet analysis is applied in order to determine the characteristic periods and wavelengths of the waves. In both the simulations and observations, the wavelet spectra contain several distinct preferred scales indicated by multiple spectral peaks. The peaks are most pronounced in the horizontal spectra at several wavelengths less than 50 km. Although these peaks are most clear and of largest amplitude in the highest-resolution simulations (with 1 km horizontal grid length), they are also evident in coarser simulations (with 4 km horizontal grid length). Peaks also exist in the vertical and temporal spectra (between approximately 2.5 and 4.5 km, and 10 to 30 minutes, respectively) with good agreement between simulation and observation. Two-dimensional (wavenumber-frequency) spectra demonstrate that each of the selected horizontal scales contains peaks at each of the preferred temporal scales revealed by the one-dimensional spectra alone.
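As a sketch of the kind of wavelet analysis described, the following uses PyWavelets to pick out preferred temporal scales from a synthetic vertical-velocity series. The wavelet, scale range and sampling interval are assumptions for illustration, not choices taken from the study.

```python
import numpy as np
import pywt

# Toy vertical-velocity time series sampled every minute; the real analysis
# uses radar/model perturbation profiles.
dt = 60.0                                        # seconds
t = np.arange(0, 6 * 3600, dt)
w = (np.sin(2 * np.pi * t / (15 * 60))           # 15-minute wave
     + 0.5 * np.sin(2 * np.pi * t / (25 * 60)))  # 25-minute wave

scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(w, scales, "morl", sampling_period=dt)
power = np.abs(coeffs) ** 2
mean_power = power.mean(axis=1)                  # time-averaged wavelet spectrum
periods_min = 1.0 / freqs / 60.0
for s in np.argsort(mean_power)[-3:]:            # three highest-power scales
    print(f"high spectral power near {periods_min[s]:.1f} min")
```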
Abstract:
Previous assessments of the impacts of climate change on heat-related mortality use the "delta method" to create temperature projection time series that are applied to temperature-mortality models to estimate future mortality impacts. The delta method means that climate model bias in the modelled present does not influence the temperature projection time series and impacts. However, the delta method assumes that climate change will result only in a change in the mean temperature, whereas there is evidence that the variability of temperature will also change with climate change. The aim of this paper is to demonstrate the importance of considering changes in temperature variability with climate change in impacts assessments of future heat-related mortality. We investigate future heat-related mortality impacts in six cities (Boston, Budapest, Dallas, Lisbon, London and Sydney) by applying temperature projections from the UK Meteorological Office HadCM3 climate model to the temperature-mortality models constructed and validated in Part 1. We investigate the impacts for four cases based on various combinations of mean and variability changes in temperature with climate change. The results demonstrate that higher mortality is attributed to increases in both the mean and variability of temperature with climate change than to the change in mean temperature alone. This has implications for interpreting existing impacts estimates that have used the delta method. We present a novel method for the creation of temperature projection time series that includes changes in both the mean and variability of temperature with climate change and is not influenced by climate model bias in the modelled present. The method should be useful for future impacts assessments. Few studies consider the implications that the limitations of the climate model may have on the heat-related mortality impacts. Here, we demonstrate the importance of considering this by conducting an evaluation of the daily and extreme temperatures from HadCM3, which demonstrates that the estimates of future heat-related mortality for Dallas and Lisbon may be overestimated due to positive climate model bias. Likewise, estimates for Boston and London may be underestimated due to negative climate model bias. Finally, we briefly consider uncertainties in the impacts associated with greenhouse gas emissions and acclimatisation. The uncertainties in the mortality impacts due to different emissions scenarios of greenhouse gases in the future varied considerably by location. Allowing for acclimatisation to an extra 2°C in mean temperatures reduced future heat-related mortality to approximately half that with no acclimatisation in each city.
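A minimal sketch of the contrast between the classical delta method (mean shift only) and a projection that also perturbs variability follows. This is a paraphrase of the general approach, with synthetic data; the paper's actual construction is more sophisticated.

```python
import numpy as np

def project_temperatures(obs, delta_mean, variability_ratio):
    """Build a projected temperature series from observations: shift the mean
    by the modelled climate-change signal and scale the anomalies to change
    the variability. With variability_ratio=1 this reduces to the classical
    delta method. Model bias in the present-day run never enters, because
    only *changes* are applied to the observed series."""
    obs = np.asarray(obs, dtype=float)
    anomalies = obs - obs.mean()
    return obs.mean() + delta_mean + variability_ratio * anomalies

# Toy example: +3 degC mean warming with a 20% increase in daily variability.
summer_obs = np.random.default_rng(42).normal(24.0, 4.0, size=90)
future = project_temperatures(summer_obs, delta_mean=3.0, variability_ratio=1.2)
print(future.mean(), future.std())   # mean shifted, spread widened
```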
Abstract:
The entropy budget of the coupled atmosphere–ocean general circulation model HadCM3 is calculated. Estimates of the different entropy sources and sinks of the climate system are obtained directly from the diabatic heating terms, and an approximate estimate of the planetary entropy production is also provided. The rate of material entropy production of the climate system is found to be ∼50 mW m⁻² K⁻¹, a value intermediate in the range 30–70 mW m⁻² K⁻¹ previously reported from different models. The largest part of this is due to sensible and latent heat transport (∼38 mW m⁻² K⁻¹). Another 13 mW m⁻² K⁻¹ is due to dissipation of kinetic energy in the atmosphere by friction and Reynolds stresses. Numerical entropy production in the atmosphere dynamical core is found to be about 0.7 mW m⁻² K⁻¹. The material entropy production within the ocean due to turbulent mixing is ∼1 mW m⁻² K⁻¹, a very small contribution to the material entropy production of the climate system. The rate of change of entropy of the model climate system is about 1 mW m⁻² K⁻¹ or less, which is comparable with the typical size of the fluctuations of the entropy sources due to interannual variability, and represents a more accurate closure of the budget than achieved by previous analyses. Results are similar for FAMOUS, which has a lower spatial resolution but a similar formulation to HadCM3, while more substantial differences are found with respect to other models, suggesting that the formulation of the model has an important influence on the climate entropy budget. Since this is the first diagnosis of the entropy budget in a climate model of the type and complexity used for projection of twenty-first century climate change, it would be valuable if similar analyses were carried out for other such models.
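The dominant term above comes from heat carried irreversibly from warm to cold regions. The back-of-envelope sketch below, with assumed order-of-magnitude numbers rather than model diagnostics, shows how the source and sink temperatures set the size of such a term.

```python
def transport_entropy_production(Q, T_source, T_sink):
    """Entropy produced when a heat flux Q (W m-2) is removed from a reservoir
    at T_source and deposited at the colder T_sink: Q * (1/T_sink - 1/T_source).
    Positive whenever heat flows down the temperature gradient."""
    return Q * (1.0 / T_sink - 1.0 / T_source)

# Toy numbers of the right order: ~100 W m-2 of surface turbulent heat flux
# carried from a 288 K surface to an atmosphere radiating at ~255 K.
s = transport_entropy_production(100.0, 288.0, 255.0)
print(f"{s * 1000:.0f} mW m-2 K-1")   # ~45, comparable to the budget terms above
```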
Abstract:
Targeted observations are generally taken in regions of high baroclinicity, but often show little impact. One plausible explanation is that important dynamical information, such as upshear tilt, is not extracted from the targeted observations by the data assimilation scheme and used to correct initial condition error. This is investigated by generating pseudo targeted observations which contain a singular vector (SV) structure that is not present in the background field or routine observations, i.e. assuming that the background has an initial condition error with tilted growing structure. Experiments were performed for a single case-study with varying numbers of pseudo targeted observations. These were assimilated by the Met Office four-dimensional variational (4D-Var) data assimilation scheme, which uses a 6 h window for observations and background-error covariances calculated using the National Meteorological Centre (NMC) method. The forecasts were run using the operational Met Office Unified Model on a 24 km grid. The results presented clearly demonstrate that a 6 h window 4D-Var system is capable of extracting baroclinic information from a limited set of observations and using it to correct initial condition error. To capture the SV structure well (projection of 0.72 in total energy), 50 sondes over an area of 1×10⁶ km² were required. When the SV was represented by only eight sondes along an example targeting flight track covering a smaller area, the projection onto the SV structure was lower; the resulting forecast perturbations showed an SV structure with increased tilt and reduced initial energy. The total energy contained in the perturbations decreased as the SV structure was less well described by the set of observations (i.e. as fewer pseudo observations were assimilated). The assimilated perturbation had lower energy than the SV unless the pseudo observations were assimilated with the dropsonde observation errors halved from operational values.
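The "projection in total energy" quoted above can be read as a normalised inner product under an energy norm. A generic sketch follows; the operational state vector and norm are far more elaborate than this toy version.

```python
import numpy as np

def energy_projection(x, sv, E):
    """Projection of a perturbation x onto a singular vector sv under a
    total-energy inner product <a, b>_E = a^T E b, with E symmetric positive
    definite. A value of 1 means x is a scaled copy of sv."""
    num = x @ E @ sv
    den = np.sqrt((x @ E @ x) * (sv @ E @ sv))
    return num / den

# Toy 3-variable example with a diagonal energy weighting.
E = np.diag([1.0, 1.0, 0.5])
sv = np.array([1.0, 0.5, -0.2])
x = 0.8 * sv + np.array([0.05, -0.03, 0.02])   # perturbation close to the SV
print(energy_projection(x, sv, E))              # close to 1
```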
Abstract:
We introduce a technique for assessing the diurnal development of convective storm systems based on outgoing longwave radiation fields. Using the size distribution of the storms measured from a series of images, we generate an array in the lengthscale-time domain based on the standard score statistic. It demonstrates succinctly the size evolution of storms as well as the dissipation kinematics. It also provides evidence related to the temperature evolution of the cloud tops. We apply this approach to a test case comparing observations made by the Geostationary Earth Radiation Budget instrument to output from the Met Office Unified Model run at two resolutions. The 12 km resolution model produces peak convective activity on all lengthscales significantly earlier in the day than shown by the observations and no evidence for storms growing in size. The 4 km resolution model shows realistic timing and growth evolution although the dissipation mechanism still differs from the observed data.
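One way to realise the described lengthscale-time array of standard scores is sketched below. The binning and data are invented for illustration and do not reproduce the authors' exact procedure.

```python
import numpy as np

def standard_score_array(counts):
    """Given counts[l, t] of storms at lengthscale bin l and time-of-day t,
    standardise each lengthscale row so the diurnal cycle at every scale is
    expressed as a z-score: z = (x - mean) / std along the time axis."""
    counts = np.asarray(counts, dtype=float)
    mean = counts.mean(axis=1, keepdims=True)
    std = counts.std(axis=1, keepdims=True)
    return (counts - mean) / np.where(std == 0, 1, std)

# Toy example: 5 lengthscale bins x 24 hourly images. Positive z-scores mark
# times of the day when storms of that size are anomalously common.
rng = np.random.default_rng(0)
counts = rng.poisson(5, size=(5, 24))
z = standard_score_array(counts)
print(z.shape)
```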
Abstract:
To explore the projection efficiency of a design, Tsai et al. [2000. Projective three-level main effects designs robust to model uncertainty. Biometrika 87, 467–475] introduced the Q criterion to compare three-level main-effects designs for quantitative factors that allow the consideration of interactions in addition to main effects. In this paper, we extend their method and focus on the case in which experimenters have some prior knowledge, in advance of running the experiment, about the probabilities of effects being non-negligible. A criterion which incorporates experimenters' prior beliefs about the importance of each effect is introduced to compare orthogonal, or nearly orthogonal, main-effects designs with robustness to interactions as a secondary consideration. We show that this criterion, exploiting prior information about model uncertainty, can lead to more appropriate designs reflecting experimenters' prior beliefs.
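A generic sketch of a prior-weighted design criterion in the spirit described follows. This is not the exact Q criterion or its extension; the candidate model set, priors and the efficiency measure are all illustrative assumptions.

```python
import numpy as np

def prior_weighted_score(X, models, priors):
    """Average a D-efficiency-like score over candidate models, weighting
    each model by the experimenter's prior probability that its effects are
    non-negligible. Each model is a list of column indices into the full
    model matrix X."""
    total = 0.0
    for cols, p in zip(models, priors):
        M = X[:, cols]
        info = M.T @ M / len(M)                  # scaled information matrix
        total += p * np.linalg.det(info) ** (1.0 / len(cols))
    return total

# Tiny illustration: 4-run design; intercept + two main effects + interaction.
X = np.array([[1, -1, -1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1,  1,  1,  1]], dtype=float)
models = [[0, 1, 2], [0, 1, 2, 3]]   # main effects only; plus interaction
priors = [0.7, 0.3]                   # prior beliefs about the two models
print(prior_weighted_score(X, models, priors))
```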
Abstract:
In this paper, we list some new orthogonal main-effects plans for three-level designs for 4, 5 and 6 factors in 18 runs and compare them with designs obtained from the existing L-18 orthogonal array. We show that these new designs have better projection properties and can provide better parameter estimates for a range of possible models. Additionally, we study designs in other, smaller run sizes for cases where there are insufficient resources to perform an 18-run experiment. Plans for three-level designs for 4, 5 and 6 factors in 13 to 17 runs are given. We show that the best designs here are efficient and deserve strong consideration in many practical situations.
Abstract:
Large scientific applications are usually developed, tested and used by groups of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled by using collaborative working environments. There are various tools and software packages for creating collaborative working environments, and some currently available software frameworks use them to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications, further effort is needed to prepare a framework which offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application under continuous development and experimentation by different institutes in Europe. This paper sets out to design a collaborative distributed computing environment for UNI-DEM in particular, but the proposed framework may fit many other large scientific applications as well.
Abstract:
In this paper the meteorological processes responsible for transporting tracer during the second ETEX (European Tracer EXperiment) release are determined using the UK Met Office Unified Model (UM). The UM-predicted distribution of tracer is also compared with observations from the ETEX campaign. The dominant meteorological process is a warm conveyor belt which transports large amounts of tracer away from the surface up to a height of 4 km over a 36 h period. Convection is also an important process, transporting tracer to heights of up to 8 km. Potential sources of error when using an operational numerical weather prediction model to forecast air quality are also investigated. These potential sources of error include model dynamics, model resolution and model physics. In the UM a semi-Lagrangian monotonic advection scheme is used with cubic polynomial interpolation. This can predict unrealistic negative values of tracer which are subsequently set to zero, and hence results in an overprediction of tracer concentrations. In order to conserve mass in the UM tracer simulations it was necessary to include a flux-corrected transport method. Model resolution can also affect the accuracy of predicted tracer distributions. Low-resolution simulations (50 km grid length) were unable to resolve a change in wind direction observed during ETEX 2; this led to an error in the transport direction and hence an error in tracer distribution. High-resolution simulations (12 km grid length) captured the change in wind direction and hence produced a tracer distribution that compared better with the observations. The representation of convective mixing was found to have a large effect on the vertical transport of tracer. Turning off the convective mixing parameterisation in the UM significantly reduced the vertical transport of tracer. Finally, air quality forecasts were found to be sensitive to the timing of synoptic-scale features. Errors in the position of the cold front relative to the tracer release location of only 1 h resulted in changes in the predicted tracer concentrations that were of the same order of magnitude as the absolute tracer concentrations.
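The mass problem caused by clipping negative tracer values can be illustrated with a toy field. Note that the UM fix described above is a flux-corrected transport method; the sketch below uses a much cruder global renormalisation, purely to show the bookkeeping.

```python
import numpy as np

def clip_and_renormalise(tracer):
    """Zeroing the spurious negative values produced by the advection scheme
    adds mass; rescaling the clipped field restores the original total."""
    total_before = tracer.sum()
    clipped = np.where(tracer < 0, 0.0, tracer)
    if clipped.sum() > 0:
        clipped *= total_before / clipped.sum()
    return clipped

# Toy 1-D field with interpolation overshoots below zero.
field = np.array([0.0, -0.2, 1.5, 2.0, -0.1, 0.8])
fixed = clip_and_renormalise(field)
print(field.sum(), fixed.sum())   # totals match: mass is conserved
```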
Abstract:
Several previous studies have attempted to assess the sublimation depth-scales of ice particles falling from clouds into clear air. Upon examining the sublimation depth-scales in the Met Office Unified Model (MetUM), it was found that the MetUM has evaporation depth-scales 2–3 times larger than those seen in radar observations. Similar results can be seen in the European Centre for Medium-Range Weather Forecasts (ECMWF), Regional Atmospheric Climate Model (RACMO) and Météo-France models. In this study, we use radar simulation (converting model variables into radar observations) and one-dimensional explicit microphysics numerical modelling to test and diagnose the cause of the deep sublimation depth-scales in the forecast model. The MetUM data and parametrization scheme are used to predict terminal velocity, which can be compared with the observed Doppler velocity. This can then be used to test hypotheses as to why the sublimation depth-scale is too large within the MetUM: turbulence could lead to dry-air entrainment and higher evaporation rates; particle density may be wrong; particle capacitance may be too high, leading to incorrect evaporation rates; or the humidity within the sublimating layer may be incorrectly represented. We show that the most likely cause of deep sublimation zones is an incorrect representation of model humidity in the layer. This is tested further by using a one-dimensional explicit microphysics model, which tests the sensitivity of ice sublimation to key atmospheric variables and is capable of including sonde and radar measurements to simulate real cases. Results suggest that the MetUM grid resolution at ice cloud altitudes is not sufficient to maintain the sharp drop in humidity that is observed in the sublimation zone.
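The humidity-gradient hypothesis can be illustrated with a toy explicit calculation: a falling ice particle whose mass decays at a rate proportional to subsaturation survives to a much greater depth when the humidity drop is smoothed, as it is on a coarse model grid. All parameter values below are assumptions for illustration, not the paper's microphysics.

```python
import numpy as np

def sublimation_depth(rh_ice, dz=25.0, v_fall=1.0, timescale=60.0):
    """Depth (m) below cloud base at which a falling ice particle has lost
    99% of its mass in a toy explicit model: mass decays at a rate
    proportional to subsaturation (1 - RH_ice)."""
    mass = 1.0
    for k, rh in enumerate(rh_ice):
        dt = dz / v_fall                          # time spent in each layer
        mass -= dt * mass * max(0.0, 1.0 - rh) / timescale
        if mass < 0.01:                           # particle effectively gone
            return (k + 1) * dz
    return len(rh_ice) * dz

z = np.arange(0, 2000, 25.0)                      # m below cloud base
sharp = np.where(z < 200, 0.99, 0.30)             # sharp humidity drop (observed)
smooth = 0.99 - (0.99 - 0.30) * z / 2000          # smoothed drop (coarse grid)
print(sublimation_depth(sharp), sublimation_depth(smooth))   # shallow vs deep
```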
Abstract:
A unified view of the interfacial instability in a model of aluminium reduction cells in the presence of a uniform, vertical, background magnetic field is presented. The classification of instability modes is based on the asymptotic theory for high values of the parameter β, which characterises the ratio of the Lorentz force, based on the disturbance current, to gravity. It is shown that the spectrum of the travelling waves consists of two parts independent of the horizontal cross-section of the cell: highly unstable wall modes and stable or weakly unstable centre, or Sele's, modes. The wall modes, with the disturbance of the interface localised at the sidewalls of the cell, dominate the dynamics of the instability. Sele's modes are characterised by a disturbance distributed over the whole horizontal extent of the cell. As β increases these modes are stabilised by the field.
Abstract:
The assimilation of Doppler radar radial winds for high-resolution NWP may improve short-term forecasts of convective weather. Using insects as the radar target, it is possible to provide wind observations during convective development. This study aims to explore the potential of these new observations, with three case studies. Radial winds from insects detected by 4 operational weather radars were assimilated using 3D-Var into a 1.5 km resolution version of the Met Office Unified Model, using a southern UK domain and no convective parameterization. The effect on the analysis wind was small, with changes in direction and speed of up to 45° and 2 m s⁻¹ respectively. The forecast precipitation was perturbed in space and time but not substantially modified. Radial wind observations from insects show the potential to provide small corrections to the location and timing of showers, but not to completely relocate convergence lines. Overall, quantitative analysis indicated that the observation impact in the three case studies was small and neutral. However, the small sample size and possible ground-clutter contamination issues preclude unequivocal impact estimation. The study shows the potential positive impact of insect winds; future operational systems using dual-polarization radars, which are better able to discriminate between insect and clutter returns, should provide a much greater impact on forecasts.
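At the heart of assimilating radial winds is a forward operator projecting the model wind onto the radar beam; the standard geometric relation is sketched below. The operational 3D-Var operator includes refinements (e.g. beam bending and pulse-volume averaging) that are omitted here.

```python
import numpy as np

def radial_wind(u, v, w, azimuth_deg, elevation_deg):
    """Map model wind components (u east, v north, w up; m/s) to the Doppler
    radial velocity seen along a beam at the given azimuth (from north) and
    elevation angle: the component of the wind vector along the beam."""
    az = np.deg2rad(azimuth_deg)
    el = np.deg2rad(elevation_deg)
    return (u * np.sin(az) * np.cos(el)
            + v * np.cos(az) * np.cos(el)
            + w * np.sin(el))

# Example: a 10 m/s westerly viewed by a beam pointing due east at 1 degree
# elevation projects almost entirely onto the beam.
print(radial_wind(10.0, 0.0, 0.0, azimuth_deg=90.0, elevation_deg=1.0))  # ~10
```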