945 results for "Runs"


Relevance: 10.00%

Abstract:

The initial condition effect on climate prediction skill over a 2-year hindcast time-scale has been assessed from ensemble HadCM3 climate model runs using anomaly initialization over the period 1990–2001, making comparisons with runs without initialization (equivalent to climatological conditions) and with anomaly persistence. It is shown that the assimilation improves the prediction skill in the first year globally, and in a number of limited areas out into the second year. Skill in hindcasting surface air temperature anomalies is most marked over ocean areas, and is coincident with areas of high sea surface temperature and ocean heat content skill. Skill improvement over land areas is much more limited but is still detectable in some cases. We found little difference in the skill of hindcasts using three different sets of ocean initial conditions, and we obtained the best results by combining these to form a grand ensemble hindcast set. Results are also compared with the idealized predictability studies of Collins (Clim. Dynam. 2002; 19: 671–692), which used the same model. The maximum lead time for which initialization gives enhanced skill over runs without initialization varies between regions but is very similar to the lead times found in the idealized studies, strongly supporting the process representation in the model as well as its use for operational predictions. The limited 12-year period of the study, however, means that the regional details of model skill should probably be further assessed under a wider range of observational conditions.
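The hindcast skill discussed above is conventionally measured with anomaly correlations between the forecast and verifying anomalies relative to climatology. A minimal sketch, with entirely synthetic data standing in for the model and verification fields (all names and numbers here are hypothetical, not from the study):

```python
import numpy as np

def anomaly_correlation(forecast, verification, climatology):
    """Centred anomaly correlation between a forecast and its verification."""
    f = forecast - climatology       # forecast anomalies
    o = verification - climatology   # observed anomalies
    f = f - f.mean()
    o = o - o.mean()
    return np.sum(f * o) / np.sqrt(np.sum(f**2) * np.sum(o**2))

# Toy example: a forecast that partially tracks the observed anomalies.
rng = np.random.default_rng(0)
clim = np.zeros(100)                       # flat climatology for simplicity
obs = rng.normal(size=100)
fct = obs + 0.5 * rng.normal(size=100)     # skilful but noisy forecast
score = anomaly_correlation(fct, obs, clim)
print(score)  # well above zero => the initialized run adds skill over climatology
```

A persistence baseline, as used in the abstract, is obtained by passing the previous period's anomalies as `forecast`.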

Relevance: 10.00%

Abstract:

New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We explore the choice of proposal density in a Particle Filter and show how the 'curse of dimensionality' might be beaten. In the standard Particle Filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In large-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further ensuring almost equal weights, we avoid performing model runs that turn out to be useless. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
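For contrast with the steered approach the abstract proposes, the standard ("bootstrap") Particle Filter it describes can be sketched on a toy one-dimensional random-walk state observed directly; the model, noise levels, and particle count below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_particle_filter(y_obs, n_particles=500, sigma_model=1.0, sigma_obs=1.0):
    """Standard SIR particle filter for a 1-D random-walk state observed directly."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for y in y_obs:
        # Propagate every model run forward until the observation is encountered.
        particles = particles + rng.normal(0.0, sigma_model, n_particles)
        # Weight each run by its observation likelihood.
        w = np.exp(-0.5 * ((y - particles) / sigma_obs) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))
        # Resample to avoid weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return np.array(estimates)

# Twin experiment: track a hidden random walk from noisy observations.
truth = np.cumsum(rng.normal(size=50))
obs = truth + rng.normal(size=50)
est = bootstrap_particle_filter(obs)
print(np.mean((est - truth) ** 2))  # filter error, to compare with raw-observation error
```

In high dimensions the likelihood weights `w` collapse onto very few particles, which is exactly the inefficiency the proposed steering and equal-weight construction address.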

Relevance: 10.00%

Abstract:

The aim of the work was to study the survival of Lactobacillus plantarum NCIMB 8826 in model solutions and to develop a mathematical model describing its dependence on pH, citric acid and ascorbic acid. A Central Composite Design (CCD) was developed, studying each of the three factors at five levels within the following ranges: pH (3.0–4.2), citric acid (6–40 g/L), and ascorbic acid (100–1000 mg/L). In total, 17 experimental runs were carried out. The initial cell concentration in the model solutions was approximately 1 × 10^8 CFU/mL; the solutions were stored at 4°C for 6 weeks. Analysis of variance (ANOVA) of the stepwise regression demonstrated that a second-order polynomial model fits the data well. The results demonstrated that high pH and citric acid concentration enhanced cell survival; on the other hand, ascorbic acid did not have an effect. Cell survival during storage was also investigated in various types of juices, including orange, grapefruit, blackcurrant, pineapple, pomegranate, cranberry and lemon juice. The model predicted cell survival well in orange, blackcurrant and pineapple; however, it failed to predict cell survival in grapefruit and pomegranate, indicating the influence of additional factors, besides pH and citric acid, on cell survival. Very good cell survival (less than a 0.4 log decrease) was observed after 6 weeks of storage in orange, blackcurrant and pineapple juice, all of which had a pH of about 3.8. Cell survival in cranberry and pomegranate decreased very quickly, whereas in lemon juice the cell concentration decreased by approximately 1.1 logs after 6 weeks of storage, despite the fact that lemon juice had the lowest pH (~2.5) among all the juices tested.
Taking into account the results from the compositional analysis of the juices and the model, it was deduced that in certain juices other compounds, likely proteins and dietary fibre, seemed to protect the cells during storage. In contrast, in certain juices, such as pomegranate, cell survival was much lower than expected; this could be due to the presence of antimicrobial compounds, such as phenolic compounds.
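The second-order polynomial model behind a CCD analysis like the one above is an ordinary least-squares fit with intercept, linear, squared and interaction terms. A sketch on synthetic "survival" data — the design ranges mirror the study, but the response coefficients are invented for illustration, not the fitted values:

```python
import numpy as np
from itertools import combinations

def quadratic_design(X):
    """Design matrix for a full second-order polynomial:
    intercept, linear, squared, and two-way interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    return np.column_stack(cols)

# Synthetic log-decrease data: driven mainly by pH and citric acid,
# with ascorbic acid having no effect (as the study found).
rng = np.random.default_rng(2)
X = rng.uniform([3.0, 6.0, 0.1], [4.2, 40.0, 1.0], size=(17, 3))  # pH, citric g/L, ascorbic g/L
y = 5.0 - 1.0 * X[:, 0] + 0.05 * X[:, 1] + 0.1 * X[:, 0] ** 2 + rng.normal(0, 0.05, 17)

D = quadratic_design(X)               # 17 runs x 10 terms
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
resid = D @ beta - y
print(np.max(np.abs(resid)))          # small residuals: the quadratic surface fits well
```

The stepwise-regression step of the study would then drop terms whose coefficients are not statistically significant.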

Relevance: 10.00%

Abstract:

The survival of Bifidobacterium longum NCIMB 8809 was studied during refrigerated storage for 6 weeks in model solutions, based on which a mathematical model was constructed describing cell survival as a function of pH, citric acid, protein and dietary fibre. A Central Composite Design (CCD) was developed, studying the influence of four factors at three levels: pH (3.2–4), citric acid (2–15 g/l), protein (0–10 g/l), and dietary fibre (0–8 g/l). In total, 31 experimental runs were carried out. Analysis of variance (ANOVA) demonstrated that the regression model fitted the data well. From the regression coefficients it was deduced that all four factors had a statistically significant (P < 0.05) negative effect on the log decrease [log10 N(week 0) − log10 N(week 6)], with pH and citric acid being the most influential. Cell survival during storage was also investigated in various types of juices, including orange, grapefruit, blackcurrant, pineapple, pomegranate and strawberry. The highest cell survival (less than a 0.4 log decrease) after 6 weeks of storage was observed in orange and pineapple, both of which had a pH of about 3.8. Although the pH of grapefruit and blackcurrant was similar (pH ∼3.2), the log decrease of the former was ∼0.5 log whereas that of the latter was ∼0.7 log. One reason for this could be that grapefruit contained a high amount of citric acid (15.3 g/l). The log decrease in pomegranate and strawberry juices was extremely high (∼8 logs). The mathematical model was able to adequately predict cell survival in orange, grapefruit, blackcurrant, and pineapple juices. However, it failed to predict cell survival in pomegranate and strawberry, most likely due to the very high levels of phenolic compounds in these two juices.

Relevance: 10.00%

Abstract:

A number of transient climate runs simulating the last 120 kyr have been carried out using FAMOUS, a fast atmosphere-ocean general circulation model (AOGCM). This is the first time such experiments have been done with a full AOGCM, providing a three-dimensional simulation of both atmosphere and ocean over this period. Our simulation thus includes internally generated temporal variability over periods from days to millennia, and detailed physical representations of important processes such as clouds and precipitation. Although the model is fast, computational restrictions mean that the rate of change of the forcings has been increased by a factor of 10, making each experiment 12 kyr long. Atmospheric greenhouse gases (GHGs), northern hemisphere ice sheets and variations in solar radiation arising from changes in the Earth's orbit are treated as forcing factors, and are applied either separately or combined in different experiments. The long-term temperature changes over Antarctica match well with reconstructions derived from ice-core data, as does variability on timescales longer than 10 kyr. Last Glacial Maximum (LGM) cooling over Greenland is reasonably well simulated, although our simulations, which lack ice-sheet meltwater forcing, do not reproduce the abrupt, millennial-scale climate shifts seen in northern hemisphere climate proxies or their slower southern hemisphere counterparts. The spatial pattern of sea surface cooling at the LGM matches proxy reconstructions reasonably well. There is significant anti-correlated variability in the strengths of the Atlantic Meridional Overturning Circulation (AMOC) and the Antarctic Circumpolar Current (ACC) on timescales greater than 10 kyr in our experiments. We find that GHG forcing weakens the AMOC and strengthens the ACC, whilst the presence of northern hemisphere ice sheets strengthens the AMOC and weakens the ACC.
The structure of the AMOC at the LGM is found to be sensitive to the details of the ice-sheet reconstruction used. The precessional component of the orbital forcing induces ~20 kyr oscillations in the AMOC and ACC, whose amplitude is mediated by changes in the eccentricity of the Earth's orbit. These forcing influences combine, to first order, in a linear fashion to produce the mean climate and ocean variability seen in the run with all forcings.

Relevance: 10.00%

Abstract:

Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretation. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and an increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada and terminating in the Labrador Sea. This result coincides with other model results in also showing a significant reduction in Greenland wintertime precipitation, a response supported by ice core evidence. Higher temporal resolution puts into sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain 'early' glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness, supporting observational inferences that upper-level winds associated with Asian-Pacific storms transported Asian dust to Greenland during the LGM.

Relevance: 10.00%

Abstract:

We present an intercomparison and verification analysis of 20 GCMs (Global Circulation Models) included in the 4th IPCC assessment report regarding their representation of the hydrological cycle over the Danube river basin for 1961–2000 and for the 2161–2200 SRES A1B scenario runs. The basin-scale properties of the hydrological cycle are computed by spatially integrating the precipitation, evaporation, and runoff fields using the Voronoi-Thiessen tessellation formalism. The span of the model-simulated mean annual water balances is of the same order of magnitude as the observed discharge of the Danube at its delta, and the true value lies within the range simulated by the models. Some land components seem to have deficiencies, since there are cases where water conservation is violated when annual means are considered. The overall performance and the degree of agreement of the GCMs are comparable to those of the RCMs (Regional Climate Models) analyzed in a previous work, in spite of the much higher resolution and common nesting of the RCMs. The reanalyses are shown to feature several inconsistencies and cannot be used as a verification benchmark for the hydrological cycle in the Danubian region. In the scenario runs, the water balance decreases for essentially all models, whereas its interannual variability increases. Changes in the strength of the hydrological cycle are not consistent among models: it is confirmed that capturing the impact of climate change on the hydrological cycle is not an easy task over land areas. Moreover, in several cases qualitatively different behaviors emerge among the models: the ensemble mean does not represent any sort of average model, and often falls between the models' clusters.
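The Voronoi-Thiessen integration used above amounts to weighting each field value (or gauge) by the basin area nearest to it. A crude sketch that approximates the Thiessen weights by counting grid points; the basin geometry, station positions and precipitation values are all made up for illustration:

```python
import numpy as np

def thiessen_weights(stations, grid_points):
    """Approximate Voronoi/Thiessen area weights: the fraction of basin
    grid points lying nearest to each station."""
    d = np.linalg.norm(grid_points[:, None, :] - stations[None, :, :], axis=2)
    nearest = d.argmin(axis=1)                      # index of closest station per point
    return np.bincount(nearest, minlength=len(stations)) / len(grid_points)

# Hypothetical basin: the unit square sampled on a fine grid, three gauges.
gx, gy = np.meshgrid(np.linspace(0, 1, 101), np.linspace(0, 1, 101))
grid = np.column_stack([gx.ravel(), gy.ravel()])
stations = np.array([[0.2, 0.5], [0.8, 0.2], [0.8, 0.8]])
w = thiessen_weights(stations, grid)

precip = np.array([900.0, 600.0, 750.0])  # mm/yr at each gauge (invented)
basin_mean = np.sum(w * precip)           # area-weighted basin-mean precipitation
print(w, basin_mean)
```

The same weights applied to precipitation, evaporation and runoff fields give the basin-integrated water balance terms compared across models in the study.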

Relevance: 10.00%

Abstract:

The technique of relaxing the tropical atmosphere towards an analysis in a month-to-season forecast model has previously been successfully exploited in a number of contexts. Here it is shown that when tropical relaxation is used to investigate the possible origin of the observed anomalies in June–July 2007, a simple dynamical model is able to reproduce the observed component of the pattern of anomalies given by an ensemble of ECMWF forecast runs. Following this result, the simple model is used for a range of experiments on the time-scales of relaxation and the variables and regions relaxed, based on a control model run with equatorial heating in a zonal flow. A theory based on scale analysis for the large-scale tropics is used to interpret the results. Typical relationships between scales are determined from the basic equations, and for a specified diabatic heating a chain of deductions for determining the dependent variables is derived. Different critical time-scales are found for tropical relaxation of different dependent variables to be effective. Vorticity has the longest critical time-scale, typically 1.2 days; for temperature and divergence, the time-scales are 10 hours and 3 hours, respectively. However, not all the tropical fields, in particular the vertical motion, are reproduced correctly by the model unless divergence is heavily damped. To obtain the correct extra-tropical fields, it is crucial to have the correct rotational flow in the subtropics to initiate Rossby wave propagation from there. It is sufficient to relax vorticity or temperature on a time-scale comparable to or less than their critical time-scales to obtain this. However, if the divergent advection of vorticity is important in the Rossby wave source, then strong relaxation of divergence is required to accurately represent the tropical forcing of Rossby waves.
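Relaxation (nudging) of this kind adds a term −(x − x_ref)/τ to the model tendency, with τ the relaxation time-scale. A toy scalar illustration — the "free" dynamics and all numbers are made up — showing why a short τ pins the state to the reference while a long τ leaves the free dynamics in control:

```python
import numpy as np

def nudged_run(f, x0, x_ref, tau, dt=0.01, n_steps=2000):
    """Integrate dx/dt = f(x) - (x - x_ref)/tau by forward Euler:
    free dynamics f plus relaxation towards x_ref on time-scale tau."""
    x = x0
    for _ in range(n_steps):
        x = x + dt * (f(x) - (x - x_ref) / tau)
    return x

decay = lambda x: -0.1 * x   # toy free dynamics: slow decay towards zero
x_ref = 5.0                  # 'analysis' value we relax towards

strong = nudged_run(decay, 0.0, x_ref, tau=0.5)   # tau well below the dynamical scale
weak = nudged_run(decay, 0.0, x_ref, tau=50.0)    # tau well above it
print(strong, weak)  # strong nudging ends near 5; weak nudging stays near 0
```

This mirrors the abstract's finding of variable-dependent critical time-scales: relaxation only constrains a field once τ drops to or below that field's critical value.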

Relevance: 10.00%

Abstract:

This paper describes the implementation of a 3D variational (3D-Var) data assimilation scheme for a morphodynamic model applied to Morecambe Bay, UK. A simple decoupled hydrodynamic and sediment transport model is combined with a data assimilation scheme to investigate the ability of such methods to improve the accuracy of the predicted bathymetry. The inverse forecast error covariance matrix is modelled using a Laplacian approximation, which is calibrated for the required length-scale parameter. Calibration is also performed for the Soulsby-van Rijn sediment transport equations. The data used for assimilation comprise waterlines derived from SAR imagery covering the entire period of the model run, and swath bathymetry collected by a ship-borne survey for one date towards the end of the model run. A LiDAR survey of the entire bay carried out in November 2005 is used for validation. Comparing the predictive ability of the model alone with the model-forecast-assimilation system demonstrates that data assimilation significantly improves the forecast skill. An investigation of assimilating the swath bathymetry as well as the waterlines shows that the overall improvement is initially large but decreases over time as the bathymetry evolves away from that observed by the survey. Combining the calibration runs into a pseudo-ensemble provides a higher skill score than a single optimized model run. A brief comparison of the Optimal Interpolation assimilation method with the 3D-Var method shows that the two schemes give similar results.
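The analysis step shared by Optimal Interpolation and (at a single time) 3D-Var can be written as xa = xb + K(y − Hxb), with gain K = BHᵀ(HBHᵀ + R)⁻¹. A two-point sketch with invented covariances — not the paper's Laplacian-modelled B — showing how the background error covariance spreads an observation's influence to unobserved bathymetry points:

```python
import numpy as np

def oi_update(xb, B, y, H, R):
    """Optimal Interpolation / single-time 3D-Var analysis:
    xa = xb + K (y - H xb),  K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Two-point 'bathymetry', with only the first point observed (values invented).
xb = np.array([10.0, 12.0])               # background depths (m)
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])                # background error covariance
H = np.array([[1.0, 0.0]])                # observation operator: observe point 1 only
R = np.array([[0.25]])                    # observation error variance
y = np.array([11.0])                      # observed depth at point 1

xa = oi_update(xb, B, y, H, R)
print(xa)  # the observed point is pulled towards 11; the unobserved one follows via B
```

With these numbers the gain is K = [0.8, 0.64]ᵀ, so the analysis is [10.8, 12.64]: the off-diagonal covariance carries 80% of the correction to the unobserved point.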

Relevance: 10.00%

Abstract:

An ensemble forecast is a collection of runs of a numerical dynamical model, initialized with perturbed initial conditions. In modern weather prediction, for example, ensembles are used to retrieve probabilistic information about future weather conditions. In this contribution, we are concerned with ensemble forecasts of a scalar quantity (say, the temperature at a specific location). We consider the event that the verification is smaller than the smallest, or larger than the largest, ensemble member; we call these events outliers. If a K-member ensemble accurately reflected the variability of the verification, outliers would occur with a base rate of 2/(K + 1). In operational forecast ensembles, though, this frequency is often found to be higher. We study the predictability of outliers and find that, by exploiting information available from the ensemble, forecast probabilities for outlier events can be calculated which are more skilful than the unconditional base rate. We prove this analytically for statistically consistent forecast ensembles. Further, the analytical results are compared to the predictability of outliers in an operational forecast ensemble by means of model output statistics. We find the analytical and empirical results to agree both qualitatively and quantitatively.
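The 2/(K + 1) base rate follows because, for a statistically consistent ensemble, the verification is equally likely to occupy any of the K + 1 rank positions, two of which (below the minimum, above the maximum) are outliers. A quick Monte Carlo check on toy Gaussian draws (not operational data):

```python
import numpy as np

rng = np.random.default_rng(3)
K, n_trials = 10, 200_000

# Statistically consistent ensemble: members and verification are i.i.d. draws,
# so the verification lands in each of the K+1 rank positions with equal chance.
samples = rng.normal(size=(n_trials, K + 1))
ens, verif = samples[:, :K], samples[:, K]

outlier = (verif < ens.min(axis=1)) | (verif > ens.max(axis=1))
print(outlier.mean(), 2 / (K + 1))  # empirical rate vs. theoretical base rate
```

An operational ensemble that is underdispersive will show an empirical rate above 2/(K + 1), which is the elevated outlier frequency the abstract refers to.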

Relevance: 10.00%

Abstract:

Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is ever more important given the development of increasingly complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce these data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described in separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team's development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change's (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.

Relevance: 10.00%

Abstract:

In 'Avalanche', an object is lowered, with players staying in contact with it throughout. Normally the task is easily accomplished; however, with larger groups, counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The behaviours are seen to arise only when the geometric effects apply (number of players greater than the degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations, Prisoners' Dilemma and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.

Relevance: 10.00%

Abstract:

During winter the ocean surface in polar regions freezes over to form sea ice. In the summer, the upper layers of the sea ice and snow melt, producing meltwater that accumulates in Arctic melt ponds on the surface of the sea ice. An accurate estimate of the fraction of the sea ice surface covered in melt ponds is essential for realistic albedo estimates in global climate models. We present a melt-pond-sea-ice model that simulates the three-dimensional evolution of melt ponds on an Arctic sea ice surface. The advances of this model over previous models are the inclusion of snow topography, the calculation of meltwater transport rates from hydraulic gradients and ice permeability, and the incorporation of a detailed, one-dimensional thermodynamic radiative balance. Results of model runs simulating first-year and multiyear sea ice are presented. Model results show good agreement with observations, with the duration of pond coverage, pond area, and ice ablation comparing well for both the first-year and multiyear ice cases. We investigate the sensitivity of the melt pond cover to changes in ice topography, snow topography, and vertical ice permeability. Snow was found to have an important impact mainly at the start of the melt season, whereas initial ice topography strongly controlled pond size and pond fraction throughout the melt season. A reduction in ice permeability allowed surface flooding of relatively flat first-year ice but had little impact on the pond coverage of rougher multiyear ice. We discuss our results, including model shortcomings and areas of experimental uncertainty.
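The hydraulic-gradient-driven meltwater transport mentioned above is essentially Darcy flow through the porous ice. A sketch with made-up permeabilities and head gradient, not the model's actual parameter values:

```python
import numpy as np

def darcy_flux(permeability, head_gradient, viscosity=1.79e-3, rho=1000.0, g=9.81):
    """Darcy flux of meltwater through sea ice (m/s):
    q = -(k / mu) * rho * g * dh/dz, with k the vertical ice permeability."""
    return -(permeability / viscosity) * rho * g * head_gradient

# Hypothetical pond standing 0.1 m above sea level across 1 m of ice,
# i.e. a downward head gradient of 0.1 (sign convention: head decreases downward).
q_permeable = darcy_flux(1e-10, -0.1)   # relatively permeable ice
q_tight = darcy_flux(1e-12, -0.1)       # two orders of magnitude less permeable
print(q_permeable, q_tight)  # drainage rate scales linearly with permeability
```

This linear dependence on k is why reducing vertical permeability in the model can flip flat first-year ice from draining to flooding.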

Relevance: 10.00%

Abstract:

This paper takes as its starting point the assertion that current rangeland management in the central Eastern Cape Province (former Ciskei) of South Africa is characterised primarily by an 'open access' approach. Empirical material drawn from three case-study communities in the region is used to examine the main barriers to managing rangeland as a 'commons'. The general inability to define and enforce rights to particular grazing resources in the face of competing claims from 'outsiders', as well as inadequate local institutions responsible for rangeland management, are highlighted as being of key importance. These problems are often exacerbated by a lack of available grazing land, diffuse user groups, and local political and ethnic divisions. Many of them have a strong legacy in historical apartheid policies such as forced resettlement and betterment planning. On this basis it is argued that policy should focus on facilitating the emergence of effective local institutions for rangeland management. Given the limited grazing available to many communities in the region, a critical aspect of this will be finding ways to legitimise current patterns of extensive resource use, which traverse existing 'community' boundaries. However, this runs counter to recent legislation, which strongly links community management with legal ownership of land within strict boundaries, often defined through fencing. Finding ways to overcome this apparent disjuncture between theory and policy will be vital for the effective management of common-pool grazing resources in the region.

Relevance: 10.00%

Abstract:

The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption that holds for few, if any, real-world ensembles. An alternative is to consider an ensemble merely as a source of information rather than as a set of possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called 'affine kernel dressing' (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification, typically acting not on individual ensemble members but on the entire ensemble as a whole; the parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness under operational constraints and statistical significance given a large sample.
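Standard Gaussian kernel dressing, the baseline that AKD amends, averages a kernel of width h centred on each ensemble member; the affine step then maps the ensemble as a whole before dressing. A sketch with invented parameter values (the ensemble, r1, a and h are illustrative, not fitted as in the paper):

```python
import numpy as np

def kernel_dressed_density(ensemble, y, h):
    """Predictive density from standard Gaussian kernel dressing:
    an equal-weight mixture of N(member, h^2) kernels."""
    z = (y[:, None] - ensemble[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(ensemble) * h * np.sqrt(2 * np.pi))

# Affine kernel dressing (sketch): map the ensemble as a whole, x -> r1 + a*x,
# before dressing; in AKD, r1, a and h are estimated together from training data.
ens = np.array([19.5, 20.1, 20.4, 21.0])   # hypothetical temperature ensemble (degC)
r1, a, h = 0.3, 1.0, 0.6                   # invented affine and kernel parameters
dressed = r1 + a * ens

y_grid = np.linspace(15.0, 26.0, 1101)
pdf = kernel_dressed_density(dressed, y_grid, h)
mass = pdf.sum() * (y_grid[1] - y_grid[0])
print(mass)  # the mixture integrates to ~1 over the grid
```

Because a single h is shared and the affine map acts on the whole ensemble, the predictive spread can be widened or narrowed relative to the raw ensemble, which is what lets AKD handle both over- and underdispersive cases.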