997 results for Ocean Models
Abstract:
A perceived limitation of z-coordinate models, associated with spurious diapycnal mixing in eddying, frontal flow, can be readily addressed through appropriate attention to the tracer advection schemes employed. It is demonstrated that tracer advection schemes developed by Prather and collaborators for application in the stratosphere greatly improve the fidelity of eddying flows, reducing levels of spurious diapycnal mixing below those directly measured in field experiments, ∼1 × 10⁻⁵ m² s⁻¹. This approach yields a model in which geostrophic eddies are quasi-adiabatic in the ocean interior, so that the residual-mean overturning circulation aligns almost perfectly with density contours. A reentrant channel configuration of the MIT General Circulation Model that approximates the Antarctic Circumpolar Current is used to examine these issues. Virtual analogues of deliberate ocean tracer-release field experiments reinforce our conclusion, producing passive tracer solutions that parallel the field experiments remarkably well.
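The scale of the problem can be illustrated with a back-of-envelope estimate: a first-order upwind advection scheme carries an implicit numerical diffusivity of roughly u·Δx·(1 − c)/2, where c is the Courant number. A minimal sketch, with entirely hypothetical values chosen only to show the gap between numerical and observed diffusivities:

```python
# Illustrative estimate of the implicit numerical diffusivity of
# first-order upwind advection, kappa_num ~ u * dx * (1 - c) / 2,
# compared with the observed interior diapycnal diffusivity ~1e-5 m^2/s.
# All parameter values below are hypothetical, not from the study.

def upwind_numerical_diffusivity(u, dx, dt):
    """Leading-order diffusivity implied by first-order upwind advection."""
    c = u * dt / dx          # Courant number
    return 0.5 * u * dx * (1.0 - c)

u = 0.1          # eddy velocity scale (m/s), hypothetical
dx = 10_000.0    # eddy-permitting grid spacing (m), hypothetical
dt = 3600.0      # time step (s), hypothetical

kappa_num = upwind_numerical_diffusivity(u, dx, dt)
kappa_obs = 1e-5                 # observed interior diapycnal value (m^2/s)
ratio = kappa_num / kappa_obs    # orders of magnitude of spurious mixing
```

With these (hypothetical) values the implied numerical diffusivity is many orders of magnitude above the observed ∼10⁻⁵ m² s⁻¹, which is why low-diffusion schemes such as Prather's matter. This estimate concerns the raw scheme diffusivity, not the (smaller) diapycnal component measured in the study.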
Abstract:
Sea-ice concentrations in the Laptev Sea simulated by the coupled North Atlantic-Arctic Ocean-Sea-Ice Model and the Finite Element Sea-Ice Ocean Model are evaluated using sea-ice concentrations from Advanced Microwave Scanning Radiometer-Earth Observing System satellite data and a polynya classification method for winter 2007/08. While developed to simulate large-scale sea-ice conditions, both models are analysed here in terms of polynya simulation. The main modification to both models in this study is the implementation of a landfast-ice mask. Simulated sea-ice fields from different model runs are compared, with emphasis placed on the impact of this prescribed landfast-ice mask. We demonstrate that sea-ice models are not able to simulate flaw polynyas realistically when used without a fast-ice description. Our investigations indicate that, without landfast ice and with coarse horizontal resolution, the models overestimate the fraction of open water in the polynya. This is not because a realistic polynya appears, but because of a larger-scale reduction in ice concentration and smoothed ice-concentration fields. After implementation of a landfast-ice mask, the polynya location is simulated realistically, but the total open-water area is still overestimated in most cases. The study shows that the fast-ice parameterization is essential for model improvement. However, further improvements are necessary in order to progress from the simulation of large-scale features in the Arctic towards a more detailed simulation of smaller-scale features (here polynyas) in an Arctic shelf sea.
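The effect described above, a coarse model smearing a narrow flaw polynya into a broad band of reduced ice concentration and thereby overestimating open water, can be illustrated with a toy one-dimensional concentration field (all values hypothetical):

```python
# Toy illustration: open-water area from a sharp flaw polynya versus a
# smoothed ice-concentration field of the kind a coarse model without
# landfast ice produces. All concentrations and areas are hypothetical.

def open_water_area(conc, cell_area_km2):
    """Total open-water area implied by fractional ice concentrations."""
    return sum((1.0 - c) * cell_area_km2 for c in conc)

cell_area = 100.0  # km^2 per grid cell, hypothetical

# "Reality": a two-cell-wide flaw polynya (open water) next to compact ice.
sharp = [1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]

# Coarse model without a fast-ice mask: no localised polynya, but a broad
# reduction of concentration spread across the whole domain.
smooth = [0.8, 0.7, 0.6, 0.6, 0.6, 0.6, 0.7, 0.8]

area_sharp = open_water_area(sharp, cell_area)
area_smooth = open_water_area(smooth, cell_area)
```

In this toy setup the smoothed field yields more total open water than the sharp polynya, mirroring the overestimation reported in the abstract.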
Abstract:
A parameterization of mesoscale eddy fluxes in the ocean should be consistent with the fact that the ocean interior is nearly adiabatic. Gent and McWilliams have described a framework in which this can be approximated in z-coordinate primitive equation models by incorporating the effects of eddies on the buoyancy field through an eddy-induced velocity. It is also natural to base a parameterization on the simple picture of the mixing of potential vorticity in the interior and the mixing of buoyancy at the surface. The authors discuss the various constraints imposed by these two requirements and attempt to clarify the appropriate boundary conditions on the eddy-induced velocities at the surface. Quasigeostrophic theory is used as a guide to the simplest way of satisfying these constraints.
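In the Gent-McWilliams framework the eddy-induced velocities derive from a streamfunction proportional to the isopycnal slope, ψ* = κS, and the surface boundary condition requires ψ* to vanish there so that no eddy-induced flow crosses the boundary. A minimal sketch (profile, diffusivity, and the linear taper are all hypothetical choices, not the authors' scheme):

```python
# Sketch of a Gent-McWilliams-style eddy-induced streamfunction psi* = kappa*S,
# where S is the isopycnal slope, tapered to zero at the surface so the
# eddy-induced velocity has no flow through the boundary.
# The slope profile, kappa, and taper depth are all hypothetical.

kappa = 1000.0                                # eddy diffusivity (m^2/s)
depths = [0.0, 100.0, 200.0, 300.0, 400.0]    # depth below surface (m)
slopes = [1e-3, 8e-4, 6e-4, 4e-4, 2e-4]       # isopycnal slope, hypothetical

def eddy_streamfunction(kappa, slopes, depths, taper_depth=150.0):
    """psi* = kappa*S in the interior, linearly tapered to 0 at the surface."""
    psi = []
    for z, s in zip(depths, slopes):
        taper = min(z / taper_depth, 1.0)   # enforces psi*(surface) = 0
        psi.append(kappa * s * taper)
    return psi

psi_star = eddy_streamfunction(kappa, slopes, depths)
```

The linear taper here is only a placeholder for the surface boundary treatment whose proper form is exactly what the paper discusses.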
Abstract:
G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command-line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) Output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) The client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front end to larger-scale grid resources such as the UK National Grid Service.
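The client pattern described above, submitting a run to a RESTful service and streaming output back while the job is in progress, can be sketched as follows. The endpoint layout and JSON field names here are hypothetical illustrations of the pattern, not the actual G-Rex API:

```python
# Minimal sketch of a RESTful job-control client: POST to create a run,
# then pull output back during execution so data never accumulates on the
# remote cluster. Endpoints and JSON fields are hypothetical, NOT G-Rex's.

import json
import urllib.request

def instance_endpoint(base_url, service):
    """Build the (hypothetical) URL used to create a new run of `service`."""
    return f"{base_url}/{service}/instances"

def submit_job(base_url, service, input_files):
    """POST a new job instance; returns the URL of the created instance."""
    body = json.dumps({"inputs": input_files}).encode()
    req = urllib.request.Request(
        instance_endpoint(base_url, service), data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["instanceUrl"]

def stream_outputs(instance_url, handler):
    """Read the instance's output resource chunk by chunk, handing each
    chunk to `handler` as it arrives (e.g. to write it to a local file)."""
    with urllib.request.urlopen(f"{instance_url}/outputs") as resp:
        for chunk in resp:
            handler(chunk)
```

A workflow script would call `submit_job` where it previously invoked mpirun, then `stream_outputs` to monitor the run locally; this mirrors the "GRexRun" substitution described in step (3).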
Abstract:
Flow in the world's oceans occurs at a wide range of spatial scales, from a fraction of a metre up to many thousands of kilometres. In particular, regions of intense flow are often highly localised: for example, western boundary currents, equatorial jets, overflows and convective plumes. Conventional numerical ocean models generally use static meshes. The use of dynamically adaptive meshes has many potential advantages, but needs to be guided by an error measure reflecting the underlying physics. A method of defining an error measure to guide an adaptive meshing algorithm for unstructured tetrahedral finite elements, utilizing an adjoint or goal-based method, is described here. This method is based upon a functional encompassing important features of the flow structure. The sensitivity of this functional with respect to the solution variables is used as the basis from which an error measure is derived. This error measure acts to predict those areas of the domain where resolution should be changed. A barotropic wind-driven gyre problem is used to demonstrate the capabilities of the method. The overall objective of this work is to develop robust error measures for use in an oceanographic context which will ensure areas of fine mesh resolution are used only where and when they are required.
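The core idea of a goal-based error measure is that the local contribution of each cell to the error in a functional J can be estimated as the adjoint sensitivity of J to the solution times the local residual, and cells are refined where this product is large. A minimal sketch, with hypothetical per-cell sensitivities and residuals:

```python
# Sketch of a goal-based (adjoint-weighted) refinement indicator: the local
# error contribution to a functional J is estimated as |dJ/du_i| * |r_i|,
# the sensitivity of J to the solution times the local residual.
# The sensitivity and residual values below are hypothetical.

def refinement_indicator(sensitivity, residual):
    """Element-wise goal-based error measure |dJ/du| * |r|."""
    return [abs(s) * abs(r) for s, r in zip(sensitivity, residual)]

def cells_to_refine(indicator, tol):
    """Indices of cells whose indicated error exceeds the tolerance."""
    return [i for i, e in enumerate(indicator) if e > tol]

sens = [0.1, 2.0, 0.5, 0.05]    # dJ/du per cell, hypothetical
resid = [0.2, 0.3, 0.01, 0.4]   # discretisation residual per cell, hypothetical

eta = refinement_indicator(sens, resid)
marked = cells_to_refine(eta, tol=0.05)
```

Note how a cell with a large residual but negligible influence on J (the last one) is left alone, which is exactly how such measures confine fine resolution to where the goal functional needs it.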
Abstract:
In response to increasing atmospheric concentrations of greenhouse gases, the rate of time-dependent climate change is determined jointly by the strength of climate feedbacks and the efficiency of processes which remove heat from the surface into the deep ocean. This work examines the vertical heat transport processes in the ocean of the HADCM2 atmosphere-ocean general circulation model (AOGCM) in experiments with CO2 held constant (control) and increasing at 1% per year (anomaly). The control experiment shows that global average heat exchanges between the upper and lower ocean are dominated by the Southern Ocean, where heat is pumped downwards by the wind-driven circulation and diffuses upwards along sloping isopycnals. This is the reverse of the low-latitude balance used in upwelling-diffusion ocean models, the global average upward diffusive transport being against the temperature gradient. In the anomaly experiment, weakened convection at high latitudes leads to reduced diffusive and convective heat loss from the deep ocean, and hence to net heat uptake, since the advective heat input is less affected. Reduction of deep water production at high latitudes results in reduced upwelling of cold water at low latitudes, giving a further contribution to net heat uptake. On the global average, high-latitude processes thus have a controlling influence. The important role of diffusion highlights the need to ensure that the schemes employed in AOGCMs give an accurate representation of the relevant sub-grid-scale processes.
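The classical low-latitude upwelling-diffusion balance referred to above, w ∂T/∂z = κ ∂²T/∂z², has the steady solution T(z) = T_deep + ΔT·exp(z/h) with e-folding scale h = κ/w. A short numerical check of that balance (parameter values are illustrative textbook numbers, not taken from HADCM2):

```python
# The steady upwelling-diffusion balance w*dT/dz = kappa*d2T/dz2 is solved by
# T(z) = T_deep + dT * exp(z/h), h = kappa/w (z negative downward, 0 at the
# surface). We verify the balance with finite differences at z = -500 m.
# Parameter values are illustrative, not from the HADCM2 study.

import math

kappa = 1e-4   # vertical diffusivity (m^2/s), illustrative
w = 1e-7       # low-latitude upwelling velocity (m/s), illustrative
h = kappa / w  # e-folding depth scale (m)

def T(z, T_deep=1.0, dT=19.0):
    """Steady temperature profile of the upwelling-diffusion balance."""
    return T_deep + dT * math.exp(z / h)

z0, dz = -500.0, 1.0
dTdz = (T(z0 + dz) - T(z0 - dz)) / (2 * dz)             # first derivative
d2Tdz2 = (T(z0 + dz) - 2 * T(z0) + T(z0 - dz)) / dz**2  # second derivative
balance_residual = w * dTdz - kappa * d2Tdz2            # should be ~0
```

In this balance heat diffuses downward (with the temperature gradient) against upwelling cold water; the abstract's point is that the HADCM2 global mean works the other way round, with wind-driven downward pumping balanced by upward diffusion in the Southern Ocean.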
Abstract:
This paper seeks to elucidate the fundamental differences between the nonconservation of potential temperature and that of Conservative Temperature, in order to better understand the relative merits of each quantity for use as the heat variable in numerical ocean models. The main result is that potential temperature is found to behave similarly to entropy, in the sense that its nonconservation primarily reflects production/destruction by surface heat and freshwater fluxes; in contrast, the nonconservation of Conservative Temperature is found to reflect primarily the overall compressible work of expansion/contraction. This paper then shows how this can be exploited to constrain the nonconservation of potential temperature and entropy from observed surface heat fluxes, and the nonconservation of Conservative Temperature from published estimates of the mechanical energy budgets of ocean numerical models. Finally, the paper shows how to modify the evolution equation for potential temperature so that it is exactly equivalent to using an exactly conservative evolution equation for Conservative Temperature, as was recently recommended by IOC et al. (2010). This result should in principle allow ocean modellers to test the equivalence between the two formulations, and to indirectly investigate to what extent the budget of derived nonconservative quantities such as buoyancy and entropy can be expected to be accurately represented in ocean models.
Abstract:
Operational forecasting centres are currently developing data assimilation systems for coupled atmosphere-ocean models. Strongly coupled assimilation, in which a single assimilation system is applied to a coupled model, presents significant technical and scientific challenges. Hence weakly coupled assimilation systems are being developed as a first step, in which the coupled model is used to compare the current state estimate with observations, but corrections to the atmosphere and ocean initial conditions are then calculated independently. In this paper we provide a comprehensive description of the different coupled assimilation methodologies in the context of four-dimensional variational assimilation (4D-Var) and use an idealised framework to assess the expected benefits of moving towards coupled data assimilation. We implement an incremental 4D-Var system within an idealised single-column atmosphere-ocean model. The system has the capability to run both strongly and weakly coupled assimilations as well as uncoupled atmosphere-only or ocean-only assimilations, thus allowing a systematic comparison of the different strategies for treating the coupled data assimilation problem. We present results from a series of identical twin experiments devised to investigate the behaviour and sensitivities of the different approaches. Overall, our study demonstrates the potential benefits that may be expected from coupled data assimilation. When compared to uncoupled initialisation, coupled assimilation is able to produce more balanced initial analysis fields, thus reducing initialisation shock and its impact on the subsequent forecast. Single-observation experiments demonstrate how coupled assimilation systems are able to pass information between the atmosphere and ocean and therefore use near-surface data to greater effect.
We show that much of this benefit may also be gained from a weakly coupled assimilation system, but that this can be sensitive to the parameters used in the assimilation.
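The variational update at the heart of such a system can be illustrated in its simplest scalar, incremental form: minimising J(δx) = δx²/(2B) + (d − Hδx)²/(2R) over the increment δx gives δx = BH(H²B + R)⁻¹d, the gain times the innovation. A toy sketch with a direct observation (all numbers hypothetical, nothing here is from the paper's single-column system):

```python
# Scalar illustration of the incremental variational analysis update:
# minimising J(dx) = dx^2/(2B) + (d - H*dx)^2/(2R) yields
# dx = B*H / (H*H*B + R) * d, i.e. gain times innovation.
# All numbers are hypothetical.

def analysis_increment(d, H, B, R):
    """Increment minimising the scalar variational cost function."""
    gain = B * H / (H * H * B + R)
    return gain * d

xb = 15.0        # background SST (deg C), hypothetical
y = 16.0         # observed SST (deg C), hypothetical
H = 1.0          # direct observation operator
B, R = 0.5, 0.5  # background / observation error variances, hypothetical

d = y - H * xb                       # innovation
xa = xb + analysis_increment(d, H, B, R)   # analysis
```

With equal error variances the analysis lands halfway between background and observation; in a coupled system the corresponding cross-covariances in B are what let a near-surface atmospheric observation correct the ocean state, and vice versa.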
Abstract:
Ocean prediction systems are now able to analyse and predict temperature, salinity and velocity structures within the ocean by assimilating measurements of the ocean's temperature and salinity into physically based ocean models. Data assimilation combines current estimates of state variables, such as temperature and salinity, from a computational model with measurements of the ocean and atmosphere in order to improve forecasts and reduce uncertainty in the forecast accuracy. Data assimilation generally works well with ocean models away from the equator, but has been found to induce vigorous and unrealistic overturning circulations near the equator. A pressure correction method was developed at the University of Reading and the Met Office to control these circulations, using ideas from control theory and an understanding of equatorial dynamics. The method has been used for the last 10 years in seasonal forecasting and ocean prediction systems at the Met Office and the European Centre for Medium-Range Weather Forecasts (ECMWF). It has been an important element in recent reanalyses of the ocean heat uptake that mitigates climate change.
Abstract:
A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project have been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea-ice parameters. The eddy-permitting nature of the global reanalyses also allows the eddy kinetic energy to be estimated. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level and the eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as one single system regarding its reliability in reproducing the climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods.
This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
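The ensemble diagnostics used in such intercomparisons are simple: the ensemble mean serves as the best estimate, the spread (standard deviation across systems) as an uncertainty measure, and their ratio as a signal-to-noise measure. A minimal sketch, with four hypothetical "reanalysis" values standing in for a real ensemble:

```python
# Sketch of multi-system ensemble diagnostics: mean, spread (std. dev.
# across members) and signal-to-noise ratio. The four anomaly values are
# hypothetical, not from the MyOcean reanalyses.

import math

def ensemble_stats(members):
    """Return (mean, spread, signal-to-noise ratio) across ensemble members."""
    n = len(members)
    mean = sum(members) / n
    spread = math.sqrt(sum((m - mean) ** 2 for m in members) / n)
    snr = abs(mean) / spread if spread > 0 else float("inf")
    return mean, spread, snr

# Hypothetical regional sea-level trend (mm/yr) from four reanalyses.
members = [3.1, 2.9, 3.3, 2.7]
mean, spread, snr = ensemble_stats(members)
```

A signal-to-noise ratio above 1 indicates a climate signal that is robust to the choice of assimilation method; below 1, the spread dominates and the ensemble gives no reliable statement about that parameter.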
Abstract:
Many institutions worldwide have developed ocean reanalysis systems (ORAs) utilizing a variety of ocean models and assimilation techniques. However, the quality of salinity reanalyses arising from the various ORAs has not yet been comprehensively assessed. In this study, we assess the upper-ocean salinity content (depth-averaged over 0–700 m) from 14 ORAs and 3 objective ocean analysis systems (OOAs) as part of the Ocean Reanalyses Intercomparison Project. Our results show that the best agreement between estimates of salinity from different ORAs is obtained in the tropical Pacific, likely due to relatively abundant atmospheric and oceanic observations in this region. The largest disagreement in salinity reanalyses is in the Southern Ocean along the Antarctic Circumpolar Current, as a consequence of the sparseness of both atmospheric and oceanic observations in this region. The West Pacific warm pool is the largest region where the signal-to-noise ratio of reanalysed salinity anomalies is >1. Therefore, the current salinity reanalyses in the tropical Pacific Ocean may be more reliable than those in the Southern Ocean and in regions along the western boundary currents. Moreover, we found that the assimilation of salinity in ocean regions with relatively strong ocean fronts is still a common problem in most ORAs. The impact of the Argo data on the salinity reanalyses is visible, especially within the upper 500 m, where the interannual variability is large. The increasing trend in global-averaged salinity anomalies can only be found within the top 0–300 m layer, but with quite large diversity among different ORAs. Below 300 m depth, the global-averaged salinity anomalies from most ORAs switch from a slightly increasing trend before 2002 to a decreasing trend after 2002. The rapid switch in the trend is most likely an artefact of the dramatic change in the observing system due to the implementation of Argo.
Abstract:
We compare modeled oceanic carbon uptake in response to pulse CO2 emissions using a suite of global ocean models and Earth system models. In response to a CO2 pulse emission of 590 Pg C (corresponding to an instantaneous doubling of atmospheric CO2 from 278 to 556 ppm), the fraction of emitted CO2 that is absorbed by the ocean is 37±8%, 56±10%, and 81±4% (model mean ±2σ) in years 30, 100, and 1000 after the emission pulse, respectively. Modeled oceanic uptake of pulse CO2 on timescales from decades to about a century is strongly correlated with simulated present-day uptake of chlorofluorocarbons (CFCs) and CO2 across all models, while the amount of pulse CO2 absorbed by the ocean from a century to a millennium is strongly correlated with modeled radiocarbon in the deep Southern and Pacific Ocean. However, when the analysis is restricted to models that are capable of reproducing observations within uncertainty, the correlation is generally much weaker. The rates of surface-to-deep ocean transport are determined for individual models from the instantaneous CO2-doubling simulations, and they are used to calculate oceanic CO2 uptake in response to pulse CO2 emissions of different sizes (1000 and 5000 Pg C). These results are compared with oceanic uptake of CO2 simulated by a number of models with and without climate-carbon cycle coupling. This comparison demonstrates that the impact of different ocean transport rates across models on oceanic uptake of anthropogenic CO2 is of similar magnitude to that of climate-carbon cycle feedbacks in a single model, emphasizing the important role of ocean transport in the uptake of anthropogenic CO2.
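Pulse-response behaviour of this kind is often summarised with an impulse response function: the airborne fraction after a pulse is fitted as a sum of exponentials, and the absorbed fraction follows from it. A toy sketch (the coefficients and timescales below are hypothetical illustrations, not fitted to the models in the study, and the single-sink attribution ignores the land):

```python
# Sketch of a sum-of-exponentials impulse response for a CO2 pulse:
# airborne fraction AF(t) = sum_i a_i * exp(-t/tau_i), with the absorbed
# fraction 1 - AF(t). Coefficients and timescales are hypothetical.

import math

def airborne_fraction(t, terms=((0.25, float("inf")),   # never-removed part
                                (0.45, 300.0),          # slow ocean uptake
                                (0.30, 30.0))):         # fast ocean uptake
    """Sum-of-exponentials impulse response; coefficients sum to 1 at t=0."""
    return sum(a * (math.exp(-t / tau) if math.isfinite(tau) else 1.0)
               for a, tau in terms)

def ocean_uptake_fraction(t):
    """Fraction of the pulse no longer airborne (attributed entirely to
    the ocean in this single-sink toy; real budgets include the land)."""
    return 1.0 - airborne_fraction(t)

f30 = ocean_uptake_fraction(30.0)
f100 = ocean_uptake_fraction(100.0)
f1000 = ocean_uptake_fraction(1000.0)
```

The monotonic rise of the absorbed fraction with time is the qualitative behaviour behind the study's 30-, 100- and 1000-year uptake figures; the actual percentages depend on each model's surface-to-deep transport rates.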
Abstract:
The global ocean is a significant sink for anthropogenic carbon (Cant), absorbing roughly a third of human CO2 emitted over the industrial period. Robust estimates of the magnitude and variability of the storage and distribution of Cant in the ocean are therefore important for understanding the human impact on climate. In this synthesis we review observational and model-based estimates of the storage and transport of Cant in the ocean. We pay particular attention to the uncertainties and potential biases inherent in different inference schemes. On a global scale, three data-based estimates of the distribution and inventory of Cant are now available. While the inventories are found to agree within their uncertainty, there are considerable differences in the spatial distribution. We also present a review of the progress made in the application of inverse and data assimilation techniques which combine ocean interior estimates of Cant with numerical ocean circulation models. Such methods are especially useful for estimating the air–sea flux and interior transport of Cant, quantities that are otherwise difficult to observe directly. However, the results are found to be highly dependent on modeled circulation, with the spread due to different ocean models at least as large as that from the different observational methods used to estimate Cant. Our review also highlights the importance of repeat measurements of hydrographic and biogeochemical parameters to estimate the storage of Cant on decadal timescales in the presence of the variability in circulation that is neglected by other approaches. Data-based Cant estimates provide important constraints on forward ocean models, which exhibit both broad similarities and regional errors relative to the observational fields. A compilation of inventories of Cant gives us a "best" estimate of the global ocean inventory of anthropogenic carbon in 2010 of 155 ± 31 PgC (±20% uncertainty). 
This estimate includes a broad range of values, suggesting that a combination of approaches is necessary in order to achieve a robust quantification of the ocean sink of anthropogenic CO2.
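One standard way to arrive at a combined "best" estimate with an uncertainty from several independent inventory estimates is inverse-variance weighting. A minimal sketch (the three estimates below are hypothetical, not the actual data-based inventories reviewed in the study):

```python
# Sketch of combining independent inventory estimates by inverse-variance
# weighting. The three (value, sigma) pairs are hypothetical, not the
# actual Cant inventories from the synthesis.

import math

def combine(estimates):
    """Inverse-variance weighted mean and its 1-sigma uncertainty.
    `estimates` is a list of (value, sigma) pairs assumed independent."""
    weights = [1.0 / s ** 2 for _, s in estimates]
    mean = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))
    return mean, sigma

# Hypothetical global Cant inventories (PgC) with 1-sigma uncertainties.
estimates = [(150.0, 25.0), (160.0, 30.0), (155.0, 20.0)]
best, sigma = combine(estimates)
```

The combined uncertainty is smaller than that of any single estimate only if the errors are genuinely independent; correlated biases between inference schemes, a caveat the synthesis stresses, would make such a combination overconfident.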