935 results for Implementation Model


Relevance: 30.00%

Abstract:

In a recent study, Williams introduced a simple modification to the widely used Robert–Asselin (RA) filter for numerical integration. The main purpose of the Robert–Asselin–Williams (RAW) filter is to avoid the undesired numerical damping of the RA filter and to increase the accuracy. In the present paper, the effects of the modification are comprehensively evaluated in the Simplified Parameterizations, Primitive Equation Dynamics (SPEEDY) atmospheric general circulation model. First, the authors search for significant changes in the monthly climatology due to the introduction of the new filter. After testing both at the local level and at the field level, no significant changes are found, which is advantageous in the sense that the new scheme does not require a retuning of the parameterized model physics. Second, the authors examine whether the new filter improves the skill of short- and medium-term forecasts. January 1982 data from the NCEP–NCAR reanalysis are used to evaluate the forecast skill. Improvements are found in all the model variables (except the relative humidity, which is hardly changed). The improvements increase with lead time and are especially evident in medium-range forecasts (96–144 h). For example, in tropical surface pressure predictions, 5-day forecasts made using the RAW filter have approximately the same skill as 4-day forecasts made using the RA filter. The results of this work are encouraging for the implementation of the RAW filter in other models currently using the RA filter.
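The RA and RAW filters differ only in how a single filter displacement is shared between two time levels. The sketch below is a minimal illustration, not SPEEDY's implementation: it applies both variants during leapfrog integration of the oscillator dx/dt = iωx, with illustrative parameter values (ν = 0.2, α = 0.53, typical choices in the literature); α = 1 recovers the classical RA filter.

```python
import cmath

def leapfrog_filtered(omega=1.0, dt=0.2, nsteps=500, nu=0.2, alpha=1.0):
    """Leapfrog integration of dx/dt = i*omega*x with a time filter.

    alpha = 1.0 reproduces the classical Robert-Asselin (RA) filter;
    alpha ~ 0.53 gives the Robert-Asselin-Williams (RAW) filter.
    """
    rhs = lambda x: 1j * omega * x
    x_prev = 1.0 + 0j                    # state at time level n-1
    x_curr = cmath.exp(1j * omega * dt)  # exact value at level n
    for _ in range(nsteps):
        x_next = x_prev + 2.0 * dt * rhs(x_curr)         # leapfrog step
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)  # filter displacement
        x_curr += alpha * d          # damps the computational mode (RA part)
        x_next += (alpha - 1.0) * d  # Williams correction restores amplitude
        x_prev, x_curr = x_curr, x_next
    return abs(x_curr)

amp_ra = leapfrog_filtered(alpha=1.0)    # classical RA filter
amp_raw = leapfrog_filtered(alpha=0.53)  # RAW filter
```

For a neutral oscillation the exact amplitude is 1; the RA run damps it noticeably over many steps, while the RAW run stays much closer to 1.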

Relevance: 30.00%

Abstract:

The misuse of Personal Protective Equipment (PPE) results in health risks among smallholders in developing countries, and education is often proposed to promote safer practices. However, evidence points to limited effects of education. This paper presents a System Dynamics model which allows the identification of risk-minimizing policies for behavioural change. The model is based on the IAC framework and survey data. It represents farmers' decision-making from an agent-oriented standpoint. The most successful intervention strategy was one which intervened in the long term, targeted key stocks in the system, and was diversified. However, the results suggest that, under these conditions, no policy is able to trigger a self-sustaining behavioural change. Two implementation approaches were suggested by experts. One, based on constant social control, corresponds to a change of the current model's parameters. The other, based on participation, would lead farmers to new thinking, i.e. changes in their decision-making structure.
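As a purely hypothetical illustration of the behaviour described, the stock-and-flow sketch below integrates a single stock (the fraction of farmers using personal protective equipment correctly) under an education-driven uptake flow that stops partway through the run. All parameter values and the model structure are invented for illustration, not taken from the paper's model.

```python
def simulate_adoption(months=96, stop_month=48, dt=1.0,
                      uptake=0.08, decay=0.04):
    """Stock-and-flow sketch: the stock A is the fraction of farmers
    adopting safe PPE practices. An education programme drives an uptake
    flow until stop_month; afterwards only the decay flow remains, so the
    behavioural change is not self-sustaining."""
    A, history = 0.0, []
    for m in range(int(months / dt)):
        inflow = uptake * (1.0 - A) if m * dt < stop_month else 0.0
        A += dt * (inflow - decay * A)   # Euler integration of the stock
        history.append(A)
    return history

traj = simulate_adoption()
```

In this toy run, adoption rises while the intervention is active and erodes back toward zero once it ends, mirroring the paper's finding that no tested policy triggers a self-sustaining change.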

Relevance: 30.00%

Abstract:

Using a literature review, we argue that new models of peatland development are needed. Many existing models do not account for potentially important ecohydrological feedbacks, and/or ignore spatial structure and heterogeneity. Existing models, including those that simulate a near total loss of the northern peatland carbon store under a warming climate, may produce misleading results because they rely upon oversimplified representations of ecological and hydrological processes. In this, the first of a pair of papers, we present the conceptual framework for a model of peatland development, DigiBog, which considers peatlands as complex adaptive systems. DigiBog accounts for the interactions between the processes which govern litter production and peat decay, peat soil hydraulic properties, and peatland water-table behaviour, in a novel and genuinely ecohydrological manner. DigiBog consists of a number of interacting submodels, each representing a different aspect of peatland ecohydrology. Here we present in detail the mathematical and computational basis, as well as the implementation and testing, of the hydrological submodel. Remaining submodels are described and analysed in the accompanying paper. Tests of the hydrological submodel against analytical solutions for simple aquifers were highly successful: the greatest deviation between DigiBog and the analytical solutions was 2.83%. We also applied the hydrological submodel to irregularly shaped aquifers with heterogeneous hydraulic properties (situations for which no analytical solutions exist) and found the model's outputs to be plausible.
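Hydrological submodels of this kind are typically built around Boussinesq-type groundwater flow. The sketch below is a generic explicit finite-difference discretisation of the 1-D Boussinesq equation with fixed-head boundaries, offered as an illustration of the class of model rather than DigiBog's actual numerics; all parameter values are illustrative.

```python
import numpy as np

def boussinesq_step(h, K=1.0, s=0.3, dx=1.0, dt=0.01):
    """One explicit step of the 1-D Boussinesq equation
    s * dh/dt = d/dx ( K * h * dh/dx )
    with fixed-head (Dirichlet) boundary cells."""
    hm = 0.5 * (h[1:] + h[:-1])             # head at cell faces
    flux = K * hm * (h[1:] - h[:-1]) / dx   # lateral Darcy flux at faces
    h_new = h.copy()
    h_new[1:-1] += dt / (s * dx) * (flux[1:] - flux[:-1])  # divergence
    return h_new

h = np.ones(21)
h[10] = 2.0              # water-table mound in the middle of the aquifer
for _ in range(500):
    h = boussinesq_step(h)
```

With the time step inside the explicit stability limit, the mound diffuses toward the fixed-head boundaries, and the water table stays between the boundary value and the initial peak, as the maximum principle requires.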

Relevance: 30.00%

Abstract:

This paper describes the formulation of a new urban scheme, MORUSES (Met Office–Reading Urban Surface Exchange Scheme), for use in the Met Office Unified Model. The implementation of the new scheme ensures that (1) it offers more flexibility in the parametrization of building properties, and hence provides a more realistic representation of the fluxes; (2) the bulk outputs are in satisfactory agreement with previous observational studies; and (3) its impact on the energy balance fluxes is similar to that of the current urban scheme when set up to mimic it. As well as having a better physical basis, MORUSES also gains flexibility in applications and adaptations to different urban materials as well as urban planning. The new scheme represents the urban area as a composition of two tiles, a canyon and a roof, using a simple 2D geometry. A sensitivity analysis of canyon geometry and roof-canopy thickness emphasizes the gain in flexibility captured by the new scheme. Copyright © 2010 Royal Meteorological Society and Crown Copyright
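One basic ingredient of simple 2D canyon geometries is the sky-view factor of the canyon floor, which for an infinitely long street canyon has a standard closed form depending only on the height-to-width ratio. This is the textbook urban-climate expression, not necessarily the exact formulation used in MORUSES.

```python
import math

def canyon_sky_view_factor(h_over_w):
    """Sky-view factor of the floor of an infinitely long 2-D street
    canyon with building-height-to-street-width ratio h_over_w
    (standard urban-climate geometry result)."""
    return math.sqrt(h_over_w**2 + 1.0) - h_over_w
```

A flat surface (h_over_w = 0) sees the full sky hemisphere (factor 1), and the factor falls monotonically as the canyon deepens, reducing the long-wave loss from the canyon floor.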

Relevance: 30.00%

Abstract:

This paper describes the implementation of a 3D variational (3D-Var) data assimilation scheme for a morphodynamic model applied to Morecambe Bay, UK. A simple decoupled hydrodynamic and sediment transport model is combined with a data assimilation scheme to investigate the ability of such methods to improve the accuracy of the predicted bathymetry. The inverse forecast error covariance matrix is modelled using a Laplacian approximation which is calibrated for the length scale parameter required. Calibration is also performed for the Soulsby-van Rijn sediment transport equations. The data used for assimilation purposes comprises waterlines derived from SAR imagery covering the entire period of the model run, and swath bathymetry data collected by a ship-borne survey for one date towards the end of the model run. A LiDAR survey of the entire bay carried out in November 2005 is used for validation purposes. The comparison of the predictive ability of the model alone with the model-forecast-assimilation system demonstrates that using data assimilation significantly improves the forecast skill. An investigation of the assimilation of the swath bathymetry as well as the waterlines demonstrates that the overall improvement is initially large, but decreases over time as the bathymetry evolves away from that observed by the survey. The result of combining the calibration runs into a pseudo-ensemble provides a higher skill score than for a single optimized model run. A brief comparison of the Optimal Interpolation assimilation method with the 3D-Var method shows that the two schemes give similar results.
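For a linear observation operator H, the 3D-Var cost function J(x) = ½(x−xb)ᵀB⁻¹(x−xb) + ½(y−Hx)ᵀR⁻¹(y−Hx) has a closed-form minimiser, sketched below in its equivalent gain form. This is the generic textbook update, with small dense matrices standing in for the Laplacian-modelled covariances used in the paper.

```python
import numpy as np

def three_dvar_analysis(xb, y, H, B, R):
    """Analysis state minimising the 3D-Var cost function
    J(x) = 1/2 (x-xb)^T B^-1 (x-xb) + 1/2 (y-Hx)^T R^-1 (y-Hx)
    for a linear observation operator H (gain form of the minimiser)."""
    S = H @ B @ H.T + R              # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)   # gain matrix
    return xb + K @ (y - H @ xb)     # background plus weighted innovation

xb = np.zeros(3)                               # background state
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])                # observe first two components
y = np.ones(2)                                 # observations
xa = three_dvar_analysis(xb, y, H, np.eye(3), np.eye(2))
```

With equal background and observation error variances, the analysis of each observed component sits halfway between background and observation, and unobserved components are left at the background value (since B is diagonal here, no increment spreads to them).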

Relevance: 30.00%

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur additional computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square-root filters (EnSRFs) in highly nonlinear forecast models; namely, an M-member ensemble detaches into an outlier and a cluster of M−1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion of the mean value of the function. Using statistical significance tests both at the local and at the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
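The analysis step common to ensemble Kalman filters can be sketched in its stochastic (perturbed-observation) form, shown below for contrast with the deterministic square-root filters discussed in the second part. This is a generic textbook update, not the EnKBF or any scheme specific to the dissertation; the ensemble and observation values are synthetic.

```python
import numpy as np

def enkf_analysis(E, y, H, R, rng):
    """Stochastic (perturbed-observation) EnKF analysis step.
    E is the n x N forecast ensemble matrix, y the observation vector.
    Deterministic square-root (EnSRF) variants update the anomalies
    without this observation perturbation."""
    N = E.shape[1]
    X = E - E.mean(axis=1, keepdims=True)            # ensemble anomalies
    Pf = X @ X.T / (N - 1)                           # forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    Yp = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=N).T               # perturbed observations
    return E + K @ (Yp - H @ E)

rng = np.random.default_rng(0)
E = rng.normal(0.0, 2.0, size=(1, 200))   # scalar state, 200 members
Ea = enkf_analysis(E, np.array([1.0]), np.array([[1.0]]),
                   np.array([[1.0]]), rng)
```

Assimilating the observation contracts the ensemble: the analysis spread is smaller than the forecast spread, as the Kalman update requires.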

Relevance: 30.00%

Abstract:

We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing timeseries and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
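A WMS client retrieves imagery through standard key-value requests. The snippet below builds a WMS 1.3.0 GetMap URL including the optional TIME and ELEVATION dimension parameters that matter for multidimensional data; the endpoint and layer name are hypothetical, while the query keys follow the WMS specification (note that EPSG:4326 in WMS 1.3.0 uses latitude,longitude axis order in BBOX).

```python
from urllib.parse import urlencode

base = "https://example.org/ncWMS/wms"        # hypothetical endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "ocean/sea_water_temperature",  # hypothetical layer id
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",                # lat,lon order in WMS 1.3.0
    "WIDTH": "512",
    "HEIGHT": "256",
    "FORMAT": "image/png",
    "TIME": "2010-01-01T00:00:00Z",           # dimension parameters for
    "ELEVATION": "0",                         # multidimensional data
}
getmap_url = base + "?" + urlencode(params)
```

Fetching this URL from a real server would return a rendered PNG map tile; a client such as a web portal typically issues many such requests as the user pans and zooms.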

Relevance: 30.00%

Abstract:

Radiometric data in the visible domain acquired by satellite remote sensing have proven to be powerful for monitoring the states of the ocean, both physical and biological. With the help of these data it is possible to understand certain variations in biological responses of marine phytoplankton on ecological time scales. Here, we implement a sequential data-assimilation technique to estimate from a conventional nutrient–phytoplankton–zooplankton (NPZ) model the time variations of observed and unobserved variables. In addition, we estimate the time evolution of two biological parameters, namely, the specific growth rate and specific mortality of phytoplankton. Our study demonstrates that: (i) the series of time-varying estimates of specific growth rate obtained by sequential data assimilation improves the fitting of the NPZ model to the satellite-derived time series: the model trajectories are closer to the observations than those obtained by implementing static values of the parameter; (ii) the estimates of unobserved variables, i.e., nutrient and zooplankton, obtained from an NPZ model by implementation of a pre-defined parameter evolution can be different from those obtained on applying the sequences of parameters estimated by assimilation; and (iii) the maximum estimated specific growth rate of phytoplankton in the study area is more sensitive to the sea-surface temperature than would be predicted by temperature-dependent functions reported previously. The overall results of the study are potentially useful for enhancing our understanding of the biological response of phytoplankton in a changing environment.
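A minimal NPZ model of the kind referred to can be written so that total nitrogen is conserved by construction, which is a useful sanity check when state or parameter estimates are being updated by assimilation. The sketch below is a generic NPZ formulation with hypothetical parameter values, not the model configuration used in the study.

```python
def npz_rhs(N, P, Z, mu=1.0, kN=0.5, g=0.4, gamma=0.6, mP=0.05, mZ=0.08):
    """Right-hand side of a minimal NPZ model (hypothetical parameters).
    Uptake, grazing, and mortality terms are arranged so that total
    nitrogen N + P + Z is exactly conserved."""
    uptake = mu * N / (kN + N) * P          # nutrient-limited growth
    grazing = g * P * Z                     # zooplankton grazing
    dP = uptake - grazing - mP * P
    dZ = gamma * grazing - mZ * Z           # gamma = assimilation efficiency
    dN = -uptake + mP * P + mZ * Z + (1.0 - gamma) * grazing  # recycling
    return dN, dP, dZ

# Forward-Euler integration; the nitrogen total is invariant by construction.
N, P, Z, dt = 1.0, 0.2, 0.1, 0.01
total0 = N + P + Z
for _ in range(1000):
    dN, dP, dZ = npz_rhs(N, P, Z)
    N, P, Z = N + dt * dN, P + dt * dP, Z + dt * dZ
```

Because every loss term reappears as a gain elsewhere, the derivatives sum to zero and the Euler scheme preserves N + P + Z to rounding error, whatever the (possibly time-varying) growth and mortality parameters.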

Relevance: 30.00%

Abstract:

The scientific understanding of the Earth's climate system, including the central question of how the climate system is likely to respond to human-induced perturbations, is comprehensively captured in general circulation models (GCMs) and Earth System Models (ESMs). Diagnosing the simulated climate response, and comparing responses across different models, depends crucially on transparent assumptions about how the GCM/ESM has been driven – especially because the implementation can involve subjective decisions and may differ between modelling groups performing the same experiment. This paper outlines the climate forcings and setup of the Met Office Hadley Centre ESM, HadGEM2-ES, for the CMIP5 set of centennial experiments. We document the prescribed greenhouse gas concentrations, aerosol precursors, and stratospheric and tropospheric ozone assumptions, as well as the implementation of land-use change and natural forcings for the HadGEM2-ES historical and future experiments following the Representative Concentration Pathways. In addition, we provide details of how the HadGEM2-ES ensemble members were initialised from the control run and how the palaeoclimate and AMIP experiments, as well as the "emission driven" RCP experiments, were performed.

Relevance: 30.00%

Abstract:

Aerosol indirect effects continue to constitute one of the most important uncertainties for anthropogenic climate perturbations. Within the international AEROCOM initiative, the representation of aerosol-cloud-radiation interactions in ten different general circulation models (GCMs) is evaluated using three satellite datasets. The focus is on stratiform liquid water clouds, since most GCMs do not include ice nucleation effects and none of the models explicitly parameterises aerosol effects on convective clouds. We compute statistical relationships between aerosol optical depth (τa) and various cloud and radiation quantities in a manner that is consistent between the models and the satellite data. It is found that the model-simulated influence of aerosols on cloud droplet number concentration (Nd) compares relatively well to the satellite data, at least over the ocean. The relationship between τa and liquid water path is simulated much too strongly by the models. This suggests that the implementation of the second aerosol indirect effect, mainly in terms of an autoconversion parameterisation, has to be revisited in the GCMs. A positive relationship between total cloud fraction (fcld) and τa, as found in the satellite data, is simulated by the majority of the models, albeit less strongly than in the satellite data in most of them. In a discussion of the hypotheses proposed in the literature to explain the satellite-derived strong fcld–τa relationship, our results indicate that none can be identified as a unique explanation. Relationships similar to the ones found in the satellite data between τa and cloud top temperature or outgoing long-wave radiation (OLR) are simulated by only a few GCMs. The GCMs that simulate a negative OLR–τa relationship show a strong positive correlation between τa and fcld.
The short-wave total aerosol radiative forcing as simulated by the GCMs is strongly influenced by the simulated anthropogenic fraction of τa, and by parameterisation assumptions such as a lower bound on Nd. Nevertheless, the strengths of the statistical relationships are good predictors for the aerosol forcings in the models. An estimate of the total short-wave aerosol forcing inferred from the combination of these predictors for the modelled forcings with the satellite-derived statistical relationships yields a global annual mean value of −1.5±0.5 W m−2. In an alternative approach, the radiative flux perturbation due to anthropogenic aerosols can be broken down into a component over the cloud-free portion of the globe (approximately the aerosol direct effect) and a component over the cloudy portion of the globe (approximately the aerosol indirect effect). An estimate obtained by scaling these simulated clear- and cloudy-sky forcings with estimates of anthropogenic τa and satellite-retrieved Nd–τa regression slopes, respectively, yields a global, annual-mean aerosol direct effect estimate of −0.4±0.2 W m−2 and a cloudy-sky (aerosol indirect effect) estimate of −0.7±0.5 W m−2, with a total estimate of −1.2±0.4 W m−2.
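Statistical relationships of this kind are commonly summarised as regression slopes in log-log space, e.g. d ln Nd / d ln τa. The snippet below recovers such a slope from synthetic data that follow an exact, purely hypothetical power law (Nd = 120 τa^0.8), so the fitted slope equals the exponent by construction; it is not derived from the study's data.

```python
import numpy as np

tau_a = np.linspace(0.05, 0.5, 50)   # aerosol optical depth samples
Nd = 120.0 * tau_a**0.8              # hypothetical power-law droplet number

# Slope of ln(Nd) vs ln(tau_a): the log-log regression used to summarise
# aerosol-cloud statistical relationships.
slope, intercept = np.polyfit(np.log(tau_a), np.log(Nd), 1)
```

For a pure power law Nd = C τa^b, the log-log fit returns slope b and intercept ln C, which is why such slopes are convenient, unit-free summaries for comparing models against satellite retrievals.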

Relevance: 30.00%

Abstract:

The Plaut, McClelland, Seidenberg and Patterson (1996) connectionist model of reading was evaluated at two points early in its training against reading data collected from British children on two occasions during their first year of literacy instruction. First, the network’s non-word reading was poor relative to word reading when compared with the children. Second, the network made more non-lexical than lexical errors, the opposite pattern to the children. Three adaptations were made to the training of the network to bring it closer to the learning environment of a child: an incremental training regime was adopted; the network was trained on grapheme–phoneme correspondences; and a training corpus based on words found in children’s early reading materials was used. The modifications caused a sharp improvement in non-word reading, relative to word reading, resulting in a near perfect match to the children’s data on this measure. The modified network, however, continued to make predominantly non-lexical errors, although evidence from a small-scale implementation of the full triangle framework suggests that this limitation stems from the lack of a semantic pathway. Taken together, these results suggest that, when properly trained, connectionist models of word reading can offer insights into key aspects of reading development in children.
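An incremental training regime of the kind described can be sketched with a toy delta-rule network mapping one-hot graphemes to one-hot phonemes, where the training corpus grows in stages, loosely mimicking staged literacy instruction. Everything here (network, corpus, correspondences, stage sizes) is synthetic and far simpler than the Plaut et al. architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_graphemes, n_phonemes = 10, 10
mapping = rng.permutation(n_graphemes)   # hypothetical g->p correspondences
W = np.zeros((n_phonemes, n_graphemes))  # single-layer weight matrix
lr = 0.5

for stage_size in (3, 6, 10):            # incremental regime: corpus grows
    for _ in range(200):
        g = rng.integers(stage_size)     # sample from current stage's corpus
        x = np.eye(n_graphemes)[g]       # one-hot grapheme input
        t = np.eye(n_phonemes)[mapping[g]]   # one-hot phoneme target
        W += lr * np.outer(t - W @ x, x)     # delta-rule update

acc = np.mean([(W @ np.eye(n_graphemes)[g]).argmax() == mapping[g]
               for g in range(n_graphemes)])
```

With one-hot inputs the delta rule converges geometrically per item, so by the end of the final stage the toy network reads essentially the whole (synthetic) corpus correctly; the point of the staging is only to illustrate how early items receive far more cumulative training than late ones.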

Relevance: 30.00%

Abstract:

Interest in Enterprise Architecture (EA) has been increasing during the last few years. EA has been found to be crucial to business survival, which makes the success of EA implementation similarly important. The current literature lacks a tool for measuring the success of EA implementation. In this paper, a tentative model for measuring success is presented and empirically validated in the EA context. Results show that the success of EA implementation can be measured indirectly, by measuring the achievement of the objectives set for the implementation. Results also imply that achieving individuals' objectives does not necessarily mean that the organisation's objectives are achieved. The presented Success Measurement Model can be used as a basis for developing measurement metrics.

Relevance: 30.00%

Abstract:

The Ultra Weak Variational Formulation (UWVF) is a powerful numerical method for the approximation of acoustic, elastic and electromagnetic waves in the time-harmonic regime. The use of Trefftz-type basis functions incorporates the known wave-like behaviour of the solution in the discrete space, allowing large reductions in the required number of degrees of freedom for a given accuracy, when compared to standard finite element methods. However, the UWVF is not well disposed to the accurate approximation of singular sources in the interior of the computational domain. We propose an adjustment to the UWVF for seismic imaging applications, which we call the Source Extraction UWVF. Differing fields are solved for in subdomains around the source, and matched on the inter-domain boundaries. Numerical results are presented for a domain of constant wavenumber and for a domain of varying sound speed in a model used for seismic imaging.
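The Trefftz property underlying the UWVF is that each basis function satisfies the governing equation exactly. For the homogeneous Helmholtz equation Δu + k²u = 0, plane waves e^{ik d·x} with a unit direction d are such basis functions; the snippet below verifies this numerically with a second-order finite-difference Laplacian (the residual is then only discretisation and rounding error, not a property of the function).

```python
import numpy as np

k = 5.0                                   # wavenumber
d = np.array([np.cos(0.3), np.sin(0.3)])  # unit propagation direction

def u(x, y):
    """Plane-wave Trefftz basis function exp(i k d.x)."""
    return np.exp(1j * k * (d[0] * x + d[1] * y))

# Five-point finite-difference Laplacian at an arbitrary test point.
h, x0, y0 = 1e-4, 0.2, 0.7
lap = (u(x0 + h, y0) + u(x0 - h, y0) +
       u(x0, y0 + h) + u(x0, y0 - h) - 4.0 * u(x0, y0)) / h**2

residual = abs(lap + k**2 * u(x0, y0))    # |Helmholtz residual|
```

Because the wave-like behaviour is built into the basis, a Trefftz discretisation needs far fewer degrees of freedom per wavelength than a polynomial finite element basis, which is the efficiency the abstract refers to.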

Relevance: 30.00%

Abstract:

The recommendation to reduce saturated fatty acid (SFA) consumption to ≤10% of total energy (%TE) is a key public health target aimed at lowering cardiovascular disease (CVD) risk. Replacement of SFA with unsaturated fats may provide greater benefit than replacement with carbohydrates, yet the optimal type of fat is unclear. The aim was to develop a flexible food-exchange model to investigate the effects of substituting SFAs with monounsaturated fatty acids (MUFAs) or n-6 (ω-6) polyunsaturated fatty acids (PUFAs) on CVD risk factors. In this parallel study, UK adults aged 21-60 y with moderate CVD risk (50% greater than the population mean) were identified using a risk assessment tool (n = 195; 56% females). Three 16-wk isoenergetic diets of specific fatty acid (FA) composition (%TE SFA:%TE MUFA:%TE n-6 PUFA) were designed using spreads, oils, dairy products, and snacks as follows: 1) SFA-rich diet (17:11:4; n = 65); 2) MUFA-rich diet (9:19:4; n = 64); and 3) n-6 PUFA-rich diet (9:13:10; n = 66). Each diet provided 36%TE total fat. Dietary targets were broadly met for all intervention groups, reaching 17.6 ± 0.4%TE SFA, 18.5 ± 0.3%TE MUFA, and 10.4 ± 0.3%TE n-6 PUFA in the respective diets, with significant overall diet effects for the changes in SFA, MUFA, and n-6 PUFA between groups (P < 0.001). There were no differences in the changes of total fat, protein, carbohydrate, and alcohol intake or anthropometric measures between groups. Plasma phospholipid FA composition showed changes from baseline in the proportions of total SFA, MUFA, and n-6 PUFA for each diet group, with significant overall diet effects for total SFA and MUFA between groups (P < 0.001). In conclusion, successful implementation of the food-exchange model broadly achieved the dietary target intakes for the exchange of SFA with MUFA or n-6 PUFA with minimal disruption to the overall diet in a free-living population. This trial was registered at clinicaltrials.gov as NCT01478958.
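The %TE (percent of total energy) targets translate into gram amounts through the energy density of fat, about 9 kcal/g. The helper below does this arithmetic for illustration only; it is not the study's food-exchange model, and the 2000-kcal diet in the example is an assumed value.

```python
def fat_grams_for_te(percent_te, total_kcal, kcal_per_g=9.0):
    """Grams of fat supplying a given percentage of total energy,
    using the standard energy density of fat (9 kcal/g)."""
    return total_kcal * percent_te / 100.0 / kcal_per_g

# The <=10 %TE SFA target on an assumed 2000-kcal diet: roughly 22 g/day.
sfa_limit_g = fat_grams_for_te(10.0, 2000.0)
# The 36 %TE total-fat level used in the intervention diets: 80 g/day.
total_fat_g = fat_grams_for_te(36.0, 2000.0)
```

The same conversion underlies diet design with exchanges: swapping a gram of SFA-rich spread for a gram of MUFA- or PUFA-rich spread shifts the fatty-acid %TE split while leaving total fat %TE unchanged.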

Relevance: 30.00%

Abstract:

This article describes a case study involving information technology managers and their new programmer recruitment policy, but the primary interest is methodological. The processes of issue generation and selection and of model conceptualization are described. Early use of “magnetic hexagons” allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With the selection of a specific issue, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches should be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.