295 results for COSMO KENDA LETKF ensemble assimilation
Abstract:
One of the prerequisites for achieving skill in decadal climate prediction is to initialize and predict the circulation in the Atlantic Ocean successfully. The RAPID array measures the Atlantic Meridional Overturning Circulation (MOC) at 26°N. Here we develop a method to include these observations in the Met Office Decadal Prediction System (DePreSys). The proposed method uses covariances of overturning transport anomalies at 26°N with ocean temperature and salinity anomalies throughout the ocean to create the density structure necessary to reproduce the observed transport anomaly. Assimilating transport alone in this way effectively reproduces the observed transport anomalies at 26°N and is better than using basin-wide temperature and salinity observations alone. However, when the transport observations are combined with in situ temperature and salinity observations in the analysis, the transport is not currently reproduced so well. The reasons for this are investigated using pseudo-observations in a twin experiment framework. Sensitivity experiments show that the MOC on monthly time-scales, at least in the HadCM3 model, is modulated by a mechanism where non-local density anomalies appear to be more important for transport variability at 26°N than local density gradients.
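The covariance-based projection described above can be sketched as a toy single-observation, Kalman-style update. Everything below (array sizes, the linear transport relationship, the observation values) is a synthetic illustration, not the DePreSys implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: each column is a state of temperature/salinity
# anomalies with an associated overturning transport anomaly at 26N.
n_state, n_ens = 50, 200
states = rng.standard_normal((n_state, n_ens))
# Transport depends linearly on part of the state plus noise (toy relationship).
weights = np.zeros(n_state)
weights[:10] = 1.0
transport = weights @ states + 0.1 * rng.standard_normal(n_ens)

# Covariance of transport with each state variable, and transport variance.
cov_xq = (states - states.mean(1, keepdims=True)) @ (transport - transport.mean()) / (n_ens - 1)
var_q = transport.var(ddof=1)

# Single-observation update: project the observed transport anomaly onto
# temperature/salinity increments via the covariances.
obs_transport, obs_err_var = 2.0, 0.05
gain = cov_xq / (var_q + obs_err_var)
increment = gain * (obs_transport - transport.mean())

# The transport implied by the updated mean moves toward the observation.
updated_transport = weights @ (states.mean(1) + increment)
```

Assimilating the transport alone in this way creates a state increment whose implied transport closely matches the observed anomaly, which is the essence of the method described above.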
Abstract:
Simulation models are widely employed to make probability forecasts of future conditions on seasonal to annual lead times. Added value in such forecasts is reflected in the information they add, either to purely empirical statistical models or to simpler simulation models. An evaluation of seasonal probability forecasts from the Development of a European Multimodel Ensemble system for seasonal to inTERannual prediction (DEMETER) and ENSEMBLES multi-model ensemble experiments is presented. Two particular regions are considered: Nino3.4 in the Pacific and the Main Development Region in the Atlantic; these regions were chosen before any spatial distribution of skill was examined. The ENSEMBLES models are found to have skill against the climatological distribution on seasonal time-scales. For models in ENSEMBLES that have a clearly defined predecessor model in DEMETER, the improvement from DEMETER to ENSEMBLES is discussed. Due to the long lead times of the forecasts and the evolution of observation technology, the forecast-outcome archive for seasonal forecast evaluation is small; arguably, evaluation data for seasonal forecasting will always be precious. Issues of information contamination from in-sample evaluation are discussed and impacts (both positive and negative) of variations in cross-validation protocol are demonstrated. Other difficulties due to the small forecast-outcome archive are identified. The claim that the multi-model ensemble provides a ‘better’ probability forecast than the best single model is examined and challenged. Significant forecast information beyond the climatological distribution is also demonstrated in a persistence probability forecast. The ENSEMBLES probability forecasts add significantly more information to empirical probability forecasts on seasonal time-scales than on decadal scales. Current operational forecasts might be enhanced by melding information from both simulation models and empirical models. 
Simulation models based on physical principles are sometimes expected, in principle, to outperform empirical models; direct comparison of their forecast skill provides information on progress toward that goal.
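Skill against the climatological distribution can be illustrated with a Brier skill score on synthetic data. The base rate, the forecast construction and the archive size below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forecast-outcome archive: probability forecasts of a binary event
# compared against a fixed climatological base rate.
n = 40
outcomes = rng.random(n) < 0.3                   # binary event occurrences
clim_prob = 0.3                                  # climatological probability
# A skilful (synthetic) forecast: high probability when the event occurs.
fcst_prob = np.clip(0.7 * outcomes + 0.15 * rng.random(n), 0.01, 0.99)

def brier(p, o):
    """Mean squared error of probability forecasts against binary outcomes."""
    return np.mean((p - o) ** 2)

# Brier skill score relative to climatology: > 0 means the forecast adds
# information beyond the climatological distribution.
bss = 1.0 - brier(fcst_prob, outcomes) / brier(np.full(n, clim_prob), outcomes)
```

With an archive this small, the score itself carries substantial sampling uncertainty, which is exactly the evaluation difficulty the abstract highlights.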
Abstract:
Climate models are potentially useful tools for addressing human dispersals and demographic change. The Arabian Peninsula is becoming increasingly significant in the story of human dispersals out of Africa during the Late Pleistocene. Although characterised largely by arid environments today, emerging climate records indicate that the peninsula was wetter many times in the past, suggesting that the region may have been inhabited considerably more than hitherto thought. Explaining the origins and spatial distribution of increased rainfall is challenging because palaeoenvironmental research in the region is in an early developmental stage. We address environmental oscillations by assembling and analysing an ensemble of five global climate models (CCSM3, COSMOS, HadCM3, KCM, and NorESM). We focus on precipitation, as the variable is key for the development of lakes, rivers and savannas. The climate model simulations generated here were compared with published palaeoenvironmental data such as palaeolakes, speleothems and alluvial fan records as a means of validation. All five models showed, to varying degrees, that the Arabian Peninsula was significantly wetter than today during the Last Interglacial (130 ka and 126/125 ka timeslices), and that the main source of increased rainfall was the North African summer monsoon rather than the Indian Ocean monsoon or Mediterranean climate patterns. Where available, 104 ka (MIS 5c), 56 ka (early MIS 3) and 21 ka (LGM) timeslices showed rainfall was present but not as extensive as during the Last Interglacial. The results favour the hypothesis that humans potentially moved out of Africa and into Arabia on multiple occasions during pluvial phases of the Late Pleistocene.
Abstract:
The induction of classification rules that generalise to previously unseen examples is one of the most important data mining tasks in science as well as in commercial applications. In order to reduce the influence of noise in the data, ensemble learners are often applied. However, most ensemble learners are based on decision tree classifiers, which are affected by noise. The Random Prism classifier has recently been proposed as an alternative to the popular Random Forests classifier, which is based on decision trees. Random Prism is based on the Prism family of algorithms, which is more robust to noise. However, like most ensemble classification approaches, Random Prism also does not scale well to large training data. This paper presents a thorough discussion of Random Prism and a recently proposed parallel version of it called Parallel Random Prism. Parallel Random Prism is based on the MapReduce programming paradigm. The paper provides, for the first time, a novel theoretical analysis of the proposed technique and an in-depth experimental study showing that Parallel Random Prism scales well with a large number of training examples, a large number of data features and a large number of processors. The expressiveness of the decision rules that our technique produces makes it a natural choice for Big Data applications where informed decision making increases the user's trust in the system.
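The MapReduce pattern behind such ensembles can be sketched with a one-rule base learner standing in for Prism (the data, the rule learner and the voting step below are simplified illustrations, not the published algorithm):

```python
from collections import Counter
import random

# Toy data: each example is (features, label). A 'rule' here is a single
# (feature, value) -> label test, a simplified stand-in for Prism's rules.
data = [({"colour": c, "size": s}, "big" if s == "L" else "small")
        for c in ("r", "g", "b") for s in ("S", "M", "L")] * 5

def learn_rule(sample):
    # Pick the (feature, value, label) triple with the best coverage accuracy.
    best, best_acc = None, -1.0
    for feat in ("colour", "size"):
        for val in sorted({x[feat] for x, _ in sample}):
            covered = [y for x, y in sample if x[feat] == val]
            label, count = Counter(covered).most_common(1)[0]
            acc = count / len(covered)
            if acc > best_acc:
                best, best_acc = (feat, val, label), acc
    return best

def map_step(seed):
    # Each mapper trains one base learner on an independent bootstrap sample.
    rng = random.Random(seed)
    return learn_rule([rng.choice(data) for _ in data])

def reduce_step(rules, x):
    # Majority vote over the rules that fire; fall back to the global majority.
    votes = [lab for feat, val, lab in rules if x.get(feat) == val]
    fallback = Counter(y for _, y in data).most_common(1)[0][0]
    return Counter(votes).most_common(1)[0][0] if votes else fallback

rules = [map_step(s) for s in range(10)]   # mappers run independently
prediction = reduce_step(rules, {"colour": "g", "size": "L"})
```

Because the mappers are independent, the training scales with the number of processors, which is the property the paper establishes for Parallel Random Prism.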
Abstract:
The East China Sea is a region where typhoon waves frequently occur. A wave spectra assimilation model has been developed to predict typhoon waves more accurately and operationally. This is the first time that wave data from Taiwan have been used to predict typhoon waves along the mainland China coast. Two-dimensional spectra observed off the northeast coast of Taiwan modify the wave field output by the SWAN model through an optimal interpolation (OI) scheme. Wind field correction is not included, as it contributes less than a quarter of the correction achieved by assimilating the waves. The initialization issue for assimilation is discussed: a linear evolution law for noise in the wave field is derived from the SWAN governing equations, and a two-dimensional digital low-pass filter is used to obtain the initialized wave fields. The data assimilation model is tuned during typhoon Sinlaku. During typhoons Krosa and Morakot, data assimilation significantly improves the low-frequency wave energy and wave propagation direction along the Taiwan coast. For the far-field region, the assimilation model also improves the typhoon wave forecast, as data assimilation enhances the low-frequency wave energy. The proportion of positive assimilation indexes is over 81% for all the comparison periods. The paper also finds that the impact of data assimilation on the far-field region depends on the typhoon's stage of development and the swell propagation direction.
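An optimal interpolation update can be illustrated on a 1-D field with a single observation. The Gaussian background correlation and all variances below are assumed for illustration; the paper assimilates 2-D spectra:

```python
import numpy as np

# Minimal optimal-interpolation (OI) sketch: spread the innovation at one
# observation point over the model grid using background-error covariances.
x = np.linspace(0.0, 100.0, 101)           # grid coordinate (km)
background = np.full_like(x, 1.0)          # first-guess wave energy (arbitrary units)
obs_loc, obs_val, obs_var = 40.0, 1.6, 0.05

L = 15.0                                   # background correlation length (km)
sigma_b2 = 0.2                             # background error variance
# Covariance of each grid point with the observation location (Gaussian model).
cov = sigma_b2 * np.exp(-0.5 * ((x - obs_loc) / L) ** 2)
cov_oo = sigma_b2 + obs_var                # innovation variance at the obs point

innovation = obs_val - np.interp(obs_loc, x, background)
# OI analysis: background plus covariance-weighted innovation.
analysis = background + cov / cov_oo * innovation
```

The analysis is pulled toward the observation near its location and relaxes back to the background in the far field, which is how a localized wave observation corrects the surrounding model field.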
Abstract:
The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. Weather forecasts using multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km²) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. Current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates a need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show that the TIGGE database is a promising tool for forecasting flood inundation, comparable with forecasts driven by raingauge observations.
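Propagation of ensemble precipitation uncertainty through a hydrologic cascade can be sketched with a toy linear-reservoir rainfall-runoff model. The member count, rainfall statistics and reservoir constants below are illustrative assumptions, not the study's models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each ensemble precipitation forecast is routed through the same
# rainfall-runoff model, yielding a probabilistic discharge forecast.
n_members, n_steps = 20, 48
precip = rng.gamma(shape=1.5, scale=2.0, size=(n_members, n_steps))  # mm/h

k = 0.1            # reservoir outflow coefficient (1/h)
area_factor = 1.0  # lumped conversion from rainfall depth to discharge

def linear_reservoir(p):
    """Route one precipitation series through a single linear reservoir."""
    storage, q = 0.0, np.empty_like(p)
    for t, rain in enumerate(p):
        storage += rain          # rainfall input
        out = k * storage        # outflow proportional to storage
        storage -= out
        q[t] = area_factor * out
    return q

discharge = np.array([linear_reservoir(m) for m in precip])
# Ensemble spread and median of the discharge forecast at each lead time.
spread = discharge.std(axis=0)
median = np.median(discharge, axis=0)
```

The spread of the routed discharge directly reflects the spread of the driving precipitation members, mirroring how input uncertainty dominates and propagates through the cascade in the study.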
Abstract:
This paper investigates the effect on balance of a number of Schur product-type localization schemes which have been designed with the primary function of reducing spurious far-field correlations in forecast error statistics. The localization schemes studied comprise a non-adaptive scheme (where the moderation matrix is decomposed in a spectral basis), and two adaptive schemes, namely a simplified version of SENCORP (Smoothed ENsemble COrrelations Raised to a Power) and ECO-RAP (Ensemble COrrelations Raised to A Power). The paper shows, we believe for the first time, how the degree of balance (geostrophic and hydrostatic) implied by the error covariance matrices localized by these schemes can be diagnosed. Here it is considered that an effective localization scheme is one that reduces spurious correlations adequately but also minimizes disruption of balance (where the 'correct' degree of balance or imbalance is assumed to be possessed by the unlocalized ensemble). By varying free parameters that describe each scheme (e.g. the degree of truncation in the schemes that use the spectral basis, the 'order' of each scheme, and the degree of ensemble smoothing), it is found that a particular configuration of the ECO-RAP scheme is best suited to the convective-scale system studied. According to our diagnostics this ECO-RAP configuration still weakens geostrophic and hydrostatic balance, but overall this is less so than for other schemes.
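A Schur-product localization is simply an elementwise product of a noisy ensemble covariance with a moderation matrix. In the sketch below a Gaussian moderation function stands in for the schemes studied, and all scales are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Small ensemble sampled from a smooth random field -> noisy sample covariance
# with spurious far-field correlations.
n, n_ens = 40, 10
grid = np.arange(n)
true_cov = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 5.0) ** 2)
ens = rng.multivariate_normal(np.zeros(n), true_cov, size=n_ens).T

P = np.cov(ens)                                   # raw ensemble covariance
L = 8.0                                           # localization length-scale
# Moderation matrix: damps correlations with separation (Gaussian stand-in
# for Gaspari-Cohn-type functions).
C = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / L) ** 2)
P_loc = C * P                                     # Schur (elementwise) product

# Entries well beyond the localization scale are strongly suppressed.
far = np.abs(grid[:, None] - grid[None, :]) > 3 * L
```

Because the Schur product of two positive semidefinite matrices is positive semidefinite, the localized matrix remains a valid covariance; the balance question studied in the paper is whether this damping also disrupts the geostrophic and hydrostatic relationships encoded in the unlocalized covariances.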
Abstract:
Instrumental observations, palaeo-proxies, and climate models suggest significant decadal variability within the North Atlantic subpolar gyre (NASPG). However, a poorly sampled observational record and a diversity of model behaviours mean that the precise nature and mechanisms of this variability are unclear. Here, we analyse an exceptionally large multi-model ensemble of 42 present-generation climate models to test whether NASPG mean state biases systematically affect the representation of decadal variability. Temperature and salinity biases in the Labrador Sea co-vary and influence whether density variability is controlled by temperature or salinity variations. Ocean horizontal resolution is a good predictor of the biases and the location of the dominant dynamical feedbacks within the NASPG. However, we find no link to the spectral characteristics of the variability. Our results suggest that the mean state and mechanisms of variability within the NASPG are not independent. This represents an important caveat for decadal predictions using anomaly-assimilation methods.
Abstract:
Operational forecasting centres are currently developing data assimilation systems for coupled atmosphere-ocean models. Strongly coupled assimilation, in which a single assimilation system is applied to a coupled model, presents significant technical and scientific challenges. Hence weakly coupled assimilation systems are being developed as a first step, in which the coupled model is used to compare the current state estimate with observations, but corrections to the atmosphere and ocean initial conditions are then calculated independently. In this paper we provide a comprehensive description of the different coupled assimilation methodologies in the context of four dimensional variational assimilation (4D-Var) and use an idealised framework to assess the expected benefits of moving towards coupled data assimilation. We implement an incremental 4D-Var system within an idealised single column atmosphere-ocean model. The system has the capability to run both strongly and weakly coupled assimilations as well as uncoupled atmosphere or ocean only assimilations, thus allowing a systematic comparison of the different strategies for treating the coupled data assimilation problem. We present results from a series of identical twin experiments devised to investigate the behaviour and sensitivities of the different approaches. Overall, our study demonstrates the potential benefits that may be expected from coupled data assimilation. When compared to uncoupled initialisation, coupled assimilation is able to produce more balanced initial analysis fields, thus reducing initialisation shock and its impact on the subsequent forecast. Single observation experiments demonstrate how coupled assimilation systems are able to pass information between the atmosphere and ocean and therefore use near-surface data to greater effect. 
We show that much of this benefit may also be gained from a weakly coupled assimilation system, but that this can be sensitive to the parameters used in the assimilation.
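The benefit seen in the single-observation experiments can be illustrated on a two-variable linear "atmosphere-ocean" toy. The coupling matrix, weights and observations below are invented; because the 4D-Var cost is quadratic here, its minimiser is obtained exactly as a weighted least-squares solution:

```python
import numpy as np

# Idealised strongly coupled assimilation: a two-variable linear model
# x_{t+1} = M x_t, with observations of the atmosphere only.
M = np.array([[0.90, 0.10],   # atmosphere, nudged by the ocean
              [0.05, 0.98]])  # ocean, nudged by the atmosphere

truth0 = np.array([1.0, -0.5])           # true initial state
obs_times = [1, 2, 3]
H = np.array([[1.0, 0.0]])               # observation operator: atmosphere only
obs = [float(H @ np.linalg.matrix_power(M, t) @ truth0) for t in obs_times]

xb = np.array([0.0, 0.0])                # background (first guess)
w_obs = np.sqrt(1000.0)                  # sqrt of observation weight (1/R)

# 4D-Var cost J(x0) = |x0 - xb|^2 + (1/R) sum_t (H M^t x0 - y_t)^2 is
# quadratic, so its minimiser solves a weighted least-squares problem.
A = np.vstack([np.eye(2)] +
              [w_obs * (H @ np.linalg.matrix_power(M, t)) for t in obs_times])
b = np.concatenate([xb, [w_obs * y for y in obs]])
analysis, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Even though only the atmosphere is observed, the coupled dynamics let the analysis also correct the ocean component toward its true value; an uncoupled ocean update with this observation operator would leave the ocean at its background value, which is the benefit of coupled assimilation described above.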
Abstract:
This study has investigated serial (temporal) clustering of extra-tropical cyclones simulated by 17 climate models that participated in CMIP5. Clustering was estimated by calculating the dispersion (ratio of variance to mean) of 30 December-February counts of Atlantic storm tracks passing near each grid point. Results from single historical simulations of 1975-2005 were compared to those from ERA40 reanalyses for 1958-2001 and single future model projections of 2069-2099 under the RCP4.5 climate change scenario. Models were generally able to capture the broad features found previously in reanalyses: underdispersion/regularity (i.e. variance less than mean) in the western core of the Atlantic storm track surrounded by overdispersion/clustering (i.e. variance greater than mean) to the north and south and over western Europe. Regression of counts onto North Atlantic Oscillation (NAO) indices revealed that much of the overdispersion in the historical reanalyses and model simulations can be accounted for by NAO variability. Future changes in dispersion were generally found to be small and not consistent across models. The overdispersion statistic, for any 30-year sample, is prone to large sampling uncertainty that obscures the climate change signal. For example, the projected increase in dispersion for storm counts near London in the CNRM-CM5 model is 0.1, compared to a standard deviation of 0.25. Projected changes in the mean and variance of the NAO are insufficient to create changes in overdispersion that are discernible above natural sampling variations.
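The dispersion statistic is simple to compute; the sketch below also illustrates how strongly a 30-season sample inflates its sampling uncertainty. The count distributions are synthetic stand-ins for storm-track counts, not reanalysis data:

```python
import numpy as np

rng = np.random.default_rng(4)

def dispersion(counts):
    """Variance-to-mean ratio: <1 regular, =1 Poisson, >1 clustered."""
    return counts.var(ddof=1) / counts.mean()

n_seasons = 30
regular = rng.binomial(n=20, p=0.25, size=n_seasons)           # var/mean ~ 0.75
clustered = rng.negative_binomial(n=5, p=0.5, size=n_seasons)  # var/mean ~ 2

d_reg, d_clu = dispersion(regular), dispersion(clustered)

# With only 30 seasons the statistic is noisy; a much longer (hypothetical)
# record pins it down near its expected value of 1/p = 2.
d_clu_long = dispersion(rng.negative_binomial(n=5, p=0.5, size=3000))
```

Re-running the 30-season estimates with different seeds shows scatter comparable to the difference between regular and clustered regimes, which is why 30-year samples obscure modest projected changes in dispersion.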
Abstract:
We utilized an ecosystem process model (SIPNET, simplified photosynthesis and evapotranspiration model) to estimate carbon fluxes of gross primary productivity and total ecosystem respiration of a high-elevation coniferous forest. The data assimilation routine incorporated aggregated twice-daily measurements of the net ecosystem exchange of CO2 (NEE) and satellite-based reflectance measurements of the fraction of absorbed photosynthetically active radiation (fAPAR) on an eight-day timescale. From these data we conducted a data assimilation experiment with fifteen different combinations of available data using twice-daily NEE, aggregated annual NEE, eight-day fAPAR, and average annual fAPAR. Model parameters were conditioned on three years of NEE and fAPAR data and results were evaluated to determine the information content of the different combinations of data streams. Across the data assimilation experiments conducted, model selection metrics such as the Bayesian Information Criterion and the Deviance Information Criterion attained their minimum values when assimilating average annual fAPAR and twice-daily NEE data. Wavelet coherence analyses showed higher correlations between measured and modeled fAPAR on longer timescales, ranging from 9 to 12 months. There were strong correlations between measured and modeled NEE (coefficient of determination, R2 = 0.86), but correlations between measured and modeled eight-day fAPAR were quite poor (R2 = −0.94). We conclude that this inability to reproduce fAPAR on the eight-day timescale would improve with consideration of radiative transfer through the plant canopy. Modeled fluxes when assimilating average annual fAPAR and annual NEE were comparable to the corresponding results when assimilating twice-daily NEE, albeit with greater uncertainty. Our results support the conclusion that, for this coniferous forest, twice-daily NEE data are a critical measurement stream for the data assimilation.
The results from this modeling exercise indicate that for this coniferous forest, annual averages of satellite-based fAPAR measurements paired with annual NEE estimates may provide spatial detail to components of ecosystem carbon fluxes in the vicinity of eddy covariance towers. Inclusion of other independent data streams in the assimilation would also reduce uncertainty in the modeled values.
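The model-selection step can be illustrated with the Bayesian Information Criterion on a synthetic fit. The Gaussian-residual BIC formula below is standard, but the data and the two competing models are invented, not SIPNET outputs:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data with a linear trend; two competing models of different
# complexity are fitted and compared via BIC (lower is preferred).
t = np.linspace(0.0, 1.0, 60)
y = 2.0 * t + 0.1 * rng.standard_normal(60)

def bic(y, y_hat, k):
    """BIC under Gaussian residuals: n*ln(RSS/n) + k*ln(n)."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

# A 1-parameter mean-only model versus a 2-parameter linear model.
mean_fit = np.full_like(y, y.mean())
lin_fit = np.polyval(np.polyfit(t, y, 1), t)

bic_mean = bic(y, mean_fit, k=1)
bic_lin = bic(y, lin_fit, k=2)
```

BIC penalises the extra parameter but rewards the far smaller residual, so the linear model wins here; the same trade-off underlies selecting which combination of NEE and fAPAR data streams to assimilate.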
Abstract:
This paper investigates the use of a particle filter for data assimilation with a full scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent weights filter in such a high-dimensional system. Artificial 2-dimensional sea surface temperature fields are used as observational data every day. Results are presented for different values of the free parameters in the method. Measures of the performance of the filter are root mean square errors, trajectories of individual variables in the model and rank histograms. Filter degeneracy is not observed and the performance of the filter is shown to depend on the ability to keep maximum spread in the ensemble.
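A plain bootstrap (sequential importance resampling) particle filter conveys the mechanics on a scalar toy model; the equivalent-weights proposal used in the study is not implemented here, and all constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)

n_particles, n_steps = 500, 30
obs_var, model_var = 0.1, 0.05

def model(x):
    return 0.95 * x + np.sin(x)      # nonlinear toy dynamics

truth = 0.5
particles = rng.normal(0.0, 1.0, n_particles)
for _ in range(n_steps):
    truth = model(truth) + rng.normal(0.0, np.sqrt(model_var))
    y = truth + rng.normal(0.0, np.sqrt(obs_var))        # noisy observation
    # Propagate particles with model noise.
    particles = model(particles) + rng.normal(0.0, np.sqrt(model_var), n_particles)
    # Importance weights from the Gaussian observation likelihood.
    w = np.exp(-0.5 * (y - particles) ** 2 / obs_var)
    w /= w.sum()
    # Multinomial resampling keeps the ensemble from degenerating.
    particles = rng.choice(particles, size=n_particles, p=w)

estimate = particles.mean()
```

In high-dimensional systems this plain filter degenerates rapidly, which is precisely what the equivalent-weights construction is designed to avoid and what the paper assesses with RMSE and rank histograms.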
Abstract:
4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis: the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally minimised using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised 'inner-loop' objective function which, upon convergence, updates the solution of the non-linear 'outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, and indicate the difficulty one may encounter while iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data; it can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that the sensitivities of both formulations are related to the balance of the error variances, the assimilation window length and the correlation length-scales.
We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
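The role of the Hessian condition number and of preconditioning can be illustrated on a small quadratic assimilation cost. The covariance model, observation pattern and B^{1/2} preconditioner below are standard textbook choices, not the thesis's wc4DVAR formulations:

```python
import numpy as np

# Hessian of J(x) = 0.5 x^T B^-1 x + 0.5 (Hx - y)^T R^-1 (Hx - y) is
# S = B^-1 + H^T R^-1 H; its condition number governs the convergence of
# gradient-based minimisation.
n = 20
grid = np.arange(n)
B = 0.9 ** np.abs(grid[:, None] - grid[None, :])   # correlated background errors
H = np.eye(n)[::4]                                  # observe every 4th grid point
R_inv = np.eye(H.shape[0])                          # unit observation error variance

S = np.linalg.inv(B) + H.T @ R_inv @ H
kappa = np.linalg.cond(S)

# First-level preconditioning with B^{1/2}: the transformed Hessian
# S_pre = B^{1/2} S B^{1/2} = I + B^{1/2} H^T R^-1 H B^{1/2} has eigenvalues >= 1.
w, V = np.linalg.eigh(B)
B_half = V @ np.diag(np.sqrt(w)) @ V.T
S_pre = B_half @ S @ B_half
kappa_pre = np.linalg.cond(S_pre)
```

With strongly correlated background errors, the raw Hessian inherits the ill-conditioning of B^{-1}, while the B^{1/2}-preconditioned Hessian is bounded below by the identity and is much better conditioned, mirroring the improved convergence reported for the preconditioned model error formulation.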