77 results for Geo-statistical model


Relevance: 80.00%

Abstract:

A strong relationship between dissolved organic carbon (DOC) and sulphate (SO₄²⁻) dynamics under drought conditions has been revealed by analysis of a 10-year time series (1993–2002). Soil solution from a blanket peat at 10 cm depth and stream water were collected at biweekly and weekly intervals, respectively, by the Environmental Change Network at Moor House-Upper Teesdale National Nature Reserve in the North Pennine uplands of Britain. DOC concentrations in soil solution and stream water were closely coupled, displaying a strong seasonal cycle with lowest concentrations in early spring and highest in late summer/early autumn. Soil solution DOC correlated strongly with seasonal variations in soil temperature at the same depth 4 weeks prior to sampling. Deviation from this relationship was seen, however, in years with significant water table drawdown (>−25 cm), such that DOC concentrations were up to 60% lower than expected. Periods of drought also resulted in the release of SO₄²⁻, because of the oxidation of inorganic/organic sulphur stored in the peat, which was accompanied by a decrease in pH and an increase in ionic strength. As both pH and ionic strength are known to control the solubility of DOC, inclusion of a function accounting for DOC suppression by drought-induced acidification explained more of the variability of DOC in soil solution (R² = 0.81) than temperature alone (R² = 0.58). This statistical model of peat soil solution DOC at 10 cm depth was extended to reproduce 74% of the variation in stream DOC over the period. Analysis of annual budgets showed that the soil was the main source of SO₄²⁻ during droughts, while atmospheric deposition was the main source in other years. Mass balance calculations also showed that most of the DOC originated from the peat. The DOC flux was also lower in the drought years of 1994 and 1995, reflecting low DOC concentrations in soil and stream water. The analysis presented in this paper suggests that lower concentrations of DOC in both soil and stream waters during drought years can be explained in terms of drought-induced acidification. As future climate change scenarios suggest an increase in the magnitude and frequency of drought events, these results imply potential for a related increase in DOC suppression by episodic acidification.
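As an illustration of the kind of two-term regression this abstract describes, the sketch below fits DOC against lagged soil temperature with and without a drought-suppression term. All data, coefficients and the functional form of the suppression term are invented for the example; the study's actual model is not reproduced here.

```python
import numpy as np

# Hypothetical illustration of the two-stage model described above:
# soil-solution DOC regressed on soil temperature lagged 4 weeks, then
# augmented with a drought-acidification suppression term. All variable
# names and data here are invented for illustration.

rng = np.random.default_rng(0)
n = 260  # ~10 years of biweekly samples
temp_lag4 = 5 + 8 * np.sin(2 * np.pi * np.arange(n) / 26)  # seasonal soil temp (degC)
suppression = np.where(rng.random(n) < 0.1, rng.uniform(0.2, 0.6, n), 0.0)  # drought terms
doc = 20 + 2.5 * temp_lag4 * (1 - suppression) + rng.normal(0, 3, n)

# Temperature-only model
X1 = np.column_stack([np.ones(n), temp_lag4])
beta1, *_ = np.linalg.lstsq(X1, doc, rcond=None)
r2_temp = 1 - np.sum((doc - X1 @ beta1) ** 2) / np.sum((doc - doc.mean()) ** 2)

# Temperature plus drought-suppression model
X2 = np.column_stack([np.ones(n), temp_lag4, temp_lag4 * suppression])
beta2, *_ = np.linalg.lstsq(X2, doc, rcond=None)
r2_full = 1 - np.sum((doc - X2 @ beta2) ** 2) / np.sum((doc - doc.mean()) ** 2)

print(f"R^2 temperature only: {r2_temp:.2f}, with suppression term: {r2_full:.2f}")
```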

Relevance: 80.00%

Abstract:

Logistic models are studied as a tool for converting dynamical forecast information (deterministic and ensemble) into probability forecasts. A logistic model is obtained by setting the logarithmic odds ratio equal to a linear combination of the inputs. As with any statistical model, logistic models will overfit if the number of inputs is comparable to the number of forecast instances. Computational approaches to avoiding overfitting by regularization are discussed, and efficient techniques for model assessment and selection are presented. A logit version of the lasso (originally a linear regression technique) is discussed. In lasso models, less important inputs are identified and their coefficients set to zero, providing an efficient and automatic model reduction procedure. For this reason, lasso models are particularly appealing for diagnostic purposes.
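A minimal sketch of an L1-penalized (lasso) logistic model of the sort described above, using scikit-learn on placeholder inputs; the regularization grid and cross-validated selection are illustrative choices, not the paper's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Illustrative only: a logit model with an L1 (lasso) penalty. The inputs
# could be ensemble-derived predictors (e.g. ensemble mean and spread);
# here they are random placeholders.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))          # 10 candidate forecast inputs
logit = 1.5 * X[:, 0] - 2.0 * X[:, 1]   # only two inputs actually matter
y = (rng.random(500) < 1 / (1 + np.exp(-logit))).astype(int)

# L1 regularization drives unimportant coefficients exactly to zero;
# C (inverse penalty strength) is chosen by cross-validation to avoid overfitting.
search = GridSearchCV(
    LogisticRegression(penalty="l1", solver="liblinear"),
    {"C": [0.01, 0.1, 1.0, 10.0]},
    cv=5, scoring="neg_log_loss",
)
search.fit(X, y)
coef = search.best_estimator_.coef_.ravel()
print("selected inputs:", np.flatnonzero(coef != 0))  # automatic model reduction
```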

Relevance: 80.00%

Abstract:

Simulations from 15 coupled chemistry-climate models, covering the period 1960–2100, are presented. The models include a detailed stratosphere as well as a realistic representation of the tropospheric climate. The simulations assume a consistent set of changing greenhouse gas concentrations, together with temporally varying chlorofluorocarbon concentrations in accordance with observations for the past and expectations for the future. The ozone results are analyzed using a nonparametric additive statistical model. Comparisons are made with observations for the recent past, and the recovery of ozone, indicated by a return to 1960 and 1980 values, is investigated as a function of latitude. Although chlorine amounts are simulated to return to 1980 values by about 2050, with only weak latitudinal variations, column ozone amounts recover at different rates owing to the influence of greenhouse gas changes. In the tropics, simulated ozone peaks by about 2050 and the total ozone column declines thereafter; consequently, simulated ozone does not recover to the values that existed prior to the early 1980s. The results also show a distinct hemispheric asymmetry, with recovery to 1980 values in the Northern Hemisphere extratropics leading the chlorine return by about 20 years. In Southern Hemisphere midlatitudes, ozone is simulated to return to 1980 levels only 10 years ahead of chlorine. In the Antarctic, annually averaged ozone recovers at about the same rate as chlorine in high latitudes and hence does not return to 1960s values until the last decade of the simulations.
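The abstract's nonparametric additive model is not specified in detail here; the toy backfitting loop below shows the general idea of fitting an additive model y = f1(x1) + f2(x2) with generic smoothers, on synthetic data.

```python
import numpy as np

# A toy backfitting loop for a nonparametric additive model,
# y = f1(x1) + f2(x2) + noise, of the general kind used to analyse the
# ozone simulations above (the study's actual model terms are not shown here).

def smooth(x, r, window=31):
    """Running-mean smoother of residuals r, ordered by covariate x."""
    order = np.argsort(x)
    pad = window // 2
    rs = np.convolve(np.pad(r[order], pad, mode="edge"),
                     np.ones(window) / window, mode="valid")
    out = np.empty_like(rs)
    out[order] = rs  # scatter smoothed values back to original positions
    return out

rng = np.random.default_rng(2)
x1, x2 = rng.uniform(-2, 2, (2, 1000))
y = np.sin(np.pi * x1) + 0.5 * x2**2 + rng.normal(0, 0.2, 1000)

f1 = np.zeros_like(y)
f2 = np.zeros_like(y)
alpha = y.mean()
for _ in range(20):  # backfit until the component functions stabilise
    f1 = smooth(x1, y - alpha - f2)
    f1 -= f1.mean()
    f2 = smooth(x2, y - alpha - f1)
    f2 -= f2.mean()

print("residual std:", np.std(y - alpha - f1 - f2).round(3))
```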

Relevance: 80.00%

Abstract:

The search for ever deeper relationships among the world's languages is bedeviled by the fact that most words evolve too rapidly to preserve evidence of their ancestry beyond 5,000 to 9,000 y. On the other hand, quantitative modeling indicates that some “ultraconserved” words exist that might be used to find evidence for deep linguistic relationships beyond that time barrier. Here we use a statistical model, which takes into account the frequency with which words are used in common everyday speech, to predict the existence of a set of such highly conserved words among seven language families of Eurasia postulated to form a linguistic superfamily that evolved from a common ancestor around 15,000 y ago. We derive a dated phylogenetic tree of this proposed superfamily with a time-depth of ∼14,450 y, implying that some frequently used words have been retained in related forms since the end of the last ice age. Words used more than once per 1,000 words in everyday speech were 7 to 10 times more likely to show deep ancestry on this tree. Our results suggest a remarkable fidelity in the transmission of some words and give theoretical justification to the search for features of language that might be preserved across wide spans of time and geography.
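A back-of-envelope version of the underlying idea: word replacement modelled as a Poisson process whose rate decreases with usage frequency. The rate constants below are invented purely to illustrate why only very frequent words can retain signal over ~15,000 y.

```python
import numpy as np

# Back-of-envelope version of the frequency/retention relationship described
# above: words replaced as a Poisson process whose rate falls with usage
# frequency. The rate constants are invented for illustration only.

def retention_prob(rate_per_1000y, years):
    """Probability a word form survives unreplaced over a time span."""
    return np.exp(-rate_per_1000y * years / 1000.0)

t = 15000  # years, roughly the proposed depth of the Eurasiatic superfamily
for label, rate in [("high-frequency word", 0.05), ("typical word", 0.5)]:
    print(f"{label}: P(retained over {t} y) = {retention_prob(rate, t):.4f}")
# With these illustrative rates, a high-frequency word survives about half
# the time, a typical one essentially never: consistent with the claim that
# only a small 'ultraconserved' set carries signal past 5,000-9,000 y.
```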

Relevance: 80.00%

Abstract:

The propagation of 7.335 MHz continuous-wave (c.w.) signals over a 5212 km sub-auroral, west-east path is studied. Measurements and semi-empirical predictions are made of the amplitude distributions and Doppler shifts of the received signals. The observed amplitude distribution is fitted with one produced by a numerical fading model, yielding the power losses suffered by the signals during propagation via the predominating modes. The signals are found to suffer exceptionally low losses at certain local times under geomagnetically quiet conditions. The mid-latitude trough in the F2 peak ionization density is predicted by a statistical model to be at the latitudes of this path at these times and at low Kp values. A sharp cut-off in low-power losses at a mean Kp of 2.75 strongly implicates the trough in the propagation of these signals. The Doppler shifts observed at these times cannot be explained by a simple ray-tracing model. It is shown, however, that a simple extension of this model to allow for the trough can reproduce the form of the observed diurnal variation.
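The paper's numerical fading model is not given here; as a stand-in, the sketch below fits a Rayleigh amplitude distribution to synthetic received amplitudes, one conventional choice for fading statistics.

```python
import numpy as np
from scipy import stats

# Sketch of fitting a received-amplitude distribution to a fading model.
# The paper's numerical fading model is not specified here; a Rayleigh
# distribution is used as a common stand-in, with synthetic amplitudes.

rng = np.random.default_rng(3)
amplitudes = stats.rayleigh.rvs(scale=2.0, size=2000, random_state=rng)

loc, scale = stats.rayleigh.fit(amplitudes, floc=0)  # fit the scale parameter
ks = stats.kstest(amplitudes, "rayleigh", args=(loc, scale))
print(f"fitted scale: {scale:.2f}, KS p-value: {ks.pvalue:.2f}")
# The fitted parameters map onto mode power losses in the semi-empirical
# prediction; a poor fit would argue for a different fading model.
```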

Relevance: 80.00%

Abstract:

Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages, where a specific phoneme changes to the same other phoneme in many words in the lexicon, a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family.

Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations.

Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements, such as genes, words, cultural trends, technologies, or morphological traits, can change in parallel within an organism or other evolving group.
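A drastically simplified illustration of what detecting a regular sound change means operationally: counting recurrent phoneme correspondences across aligned cognates. The study's actual model is a statistical phylogenetic one; the word lists below are invented.

```python
from collections import Counter

# Minimal illustration of flagging a candidate regular sound change from
# aligned cognate pairs: count positionwise phoneme correspondences between
# two related languages and report those recurring across many words.
# The word lists are invented; the study's model is far richer and infers
# changes jointly with a dated phylogeny.

aligned_cognates = [
    ("tas", "das"), ("tel", "del"), ("tok", "dok"),  # t -> d recurs
    ("kas", "kas"), ("mel", "mel"),
]

correspondences = Counter()
for w1, w2 in aligned_cognates:
    for a, b in zip(w1, w2):
        if a != b:
            correspondences[(a, b)] += 1

for (a, b), n in correspondences.items():
    if n >= 3:  # recurrence across the lexicon suggests a regular change
        print(f"candidate regular sound change: {a} -> {b} ({n} words)")
```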

Relevance: 80.00%

Abstract:

In this paper we assess opinion polls, prediction markets, expert opinion and statistical modelling over a large number of US elections to determine which performs best at forecasting outcomes. In line with the existing literature, we bias-correct the opinion polls. We consider accuracy, bias and precision over different time horizons before an election, and we conclude that prediction markets appear to provide the most precise forecasts while being similar to opinion polls in terms of bias. We find that our statistical model struggles to provide competitive forecasts, whereas expert opinion appears to be of value. Finally, we note that the forecast horizon matters: prediction market forecasts tend to improve as an election approaches, opinion polls appear to perform worse, and expert opinion performs consistently throughout. We thus contribute to the growing literature comparing election forecasts from polls and prediction markets.
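The accuracy/bias/precision comparison can be made concrete in a few lines; the sketch below scores three invented forecast sources against simulated outcomes, using mean absolute error, mean signed error and error spread as stand-ins for the paper's metrics.

```python
import numpy as np

# Sketch of the accuracy/bias/precision comparison described above, using
# invented forecasts of a two-party vote share. 'Accuracy' here is mean
# absolute error, 'bias' the mean signed error, 'precision' the error spread.

rng = np.random.default_rng(4)
outcome = rng.uniform(0.45, 0.55, 100)            # true vote shares, 100 races
sources = {
    "polls (bias-corrected)": outcome + rng.normal(0.00, 0.030, 100),
    "prediction markets":     outcome + rng.normal(0.00, 0.015, 100),
    "statistical model":      outcome + rng.normal(0.01, 0.040, 100),
}

for name, forecast in sources.items():
    err = forecast - outcome
    print(f"{name:24s} MAE={np.abs(err).mean():.3f} "
          f"bias={err.mean():+.3f} sd={err.std():.3f}")
```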

Relevance: 40.00%

Abstract:

The paper discusses the observed and projected warming in the Caucasus region and its implications for glacier melt and runoff. A strong positive trend in summer air temperatures of 0.05 °C a⁻¹ is observed in the high-altitude areas, driving strong glacier melt and a continuous decline in glacier mass balance. A warming of 4–7 °C and 3–5 °C is projected for the summer months in 2071–2100 under the A2 and B2 emission scenarios respectively, suggesting that enhanced glacier melt can be expected. The expected changes in winter precipitation will not compensate for the summer melt, and glacier retreat is likely to continue. However, a projected small increase in both winter and summer precipitation, combined with the enhanced glacier melt, will increase summer runoff in the currently glaciated region of the Caucasus by more than 50% compared with the baseline period, independent of whether the region is still glaciated at the end of the twenty-first century.

Relevance: 40.00%

Abstract:

The proportional odds model provides a powerful tool for analysing ordered categorical data and setting sample size, although for many clinical trials its validity is questionable. The purpose of this paper is to present a new class of constrained odds models which includes the proportional odds model. The efficient score and Fisher's information are derived from the profile likelihood for the constrained odds model. These results are new even for the special case of proportional odds, where the resulting statistics define the Mann-Whitney test. A strategy is described that involves selecting one of these models in advance; it requires assumptions as strong as those underlying proportional odds, but allows a choice among such models. The accuracy of the new procedure and its power are evaluated.
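For readers unfamiliar with the special case, the sketch below fits a proportional odds model, logit P(Y <= j | x) = theta_j - beta*x, by maximum likelihood on simulated ordinal data; the constrained odds class introduced in the paper is not implemented here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Minimal maximum-likelihood fit of a proportional odds model,
# logit P(Y <= j | x) = theta_j - beta * x, for a 3-category ordinal outcome
# and a single covariate (e.g. a treatment indicator). Data are simulated.

rng = np.random.default_rng(5)
x = rng.integers(0, 2, 400).astype(float)        # treatment indicator
theta_true, beta_true = np.array([-0.5, 1.0]), 0.8
u = expit(theta_true[:, None] - beta_true * x)   # cumulative probabilities
y = (rng.random(400)[None, :] > u).sum(axis=0)   # categories 0, 1, 2

def negloglik(params):
    t1, dt, beta = params                        # exp(dt) keeps thresholds ordered
    theta = np.array([t1, t1 + np.exp(dt)])
    cum = expit(theta[:, None] - beta * x)       # P(Y<=0), P(Y<=1)
    p = np.vstack([cum[0], cum[1] - cum[0], 1 - cum[1]])
    return -np.log(p[y, np.arange(len(y))] + 1e-12).sum()

fit = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="BFGS")
t1, dt, beta = fit.x
print(f"theta = ({t1:.2f}, {t1 + np.exp(dt):.2f}), beta = {beta:.2f}")
# Under proportional odds, the score test of beta = 0 corresponds to the
# Mann-Whitney test mentioned above.
```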

Relevance: 40.00%

Abstract:

We compared output from 3 dynamic process-based models (DMs: ECOSSE, MILLENNIA and the Durham Carbon Model) and 9 bioclimatic envelope models (BCEMs, including the BBOG ensemble and PEATSTASH), ranging from simple threshold to semi-process-based models. Model simulations were run at 4 British peatland sites using historical climate data and climate projections under a medium (A1B) emissions scenario from the 11-member regional climate model (RCM) ensemble underpinning UKCP09. The models showed that blanket peatlands are vulnerable to projected climate change; however, predictions varied between models as well as between sites. All BCEMs predicted a shift from presence to absence of a climate associated with blanket peat, with the sites receiving the lowest total annual precipitation lying closest to the presence/absence threshold. The DMs showed a more variable response. ECOSSE predicted a decline in the net C sink and a shift to a net C source by the end of this century. The Durham Carbon Model predicted a smaller decline in net C sink strength, but no shift to a net C source. MILLENNIA predicted a slight overall increase in the net C sink. In contrast to the BCEM projections, the DMs predicted the largest changes in carbon sinks at the sites with the coolest temperatures and greatest total annual precipitation. In this model intercomparison, the greatest variation in model output in response to climate change projections was not between the BCEMs and the DMs but among the DMs themselves, because of differing approaches to modelling soil organic matter pools and decomposition, among other processes. The difference in the sign of the response has major implications for future climate feedbacks, climate policy and peatland management. Enhanced data collection, in particular monitoring of peatland response to current change, would significantly improve model development and projections of future change.

Relevance: 40.00%

Abstract:

We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, linear inverse modelling and constructed analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but the most successful statistical method depends on the region considered, the GCM data used and the prediction lead time. However, the constructed analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different from the regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skilful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
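Of the methods named above, the constructed analogue is the easiest to sketch: write the current anomaly state as a least-squares combination of past library states, then apply the same weights to the states observed at the forecast lead. The red-noise field below is synthetic, and real applications regularize the weights (e.g. in a truncated EOF space).

```python
import numpy as np

# Toy constructed-analogue forecast, one of the statistical methods named
# above. AR(1) persistence in the synthetic field gives the method some skill.

rng = np.random.default_rng(6)
n_years, n_grid, lead = 200, 50, 5
field = np.empty((n_years, n_grid))
field[0] = rng.normal(size=n_grid)
for t in range(1, n_years):
    field[t] = 0.9 * field[t - 1] + 0.3 * rng.normal(size=n_grid)

n_lib = n_years - lead - 1
library = field[:n_lib]                    # past states x_t
successors = field[lead : lead + n_lib]    # matching states x_{t+lead}
target = field[n_lib]                      # 'current' state to project forward

# Minimum-norm least-squares weights; real applications regularise this,
# e.g. by truncating to leading EOFs.
weights, *_ = np.linalg.lstsq(library.T, target, rcond=None)
forecast = successors.T @ weights

verification = field[n_lib + lead]
print("anomaly correlation:", round(np.corrcoef(forecast, verification)[0, 1], 2))
```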

Relevance: 40.00%

Abstract:

We investigate the initialization of Northern-hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates significantly reduces assimilation error both in identical-twin experiments and when assimilating sea-ice observations, reducing the concentration error by a factor of four to six, and the thickness error by a factor of two. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that the strong dependence of thermodynamic ice growth on ice concentration necessitates an adjustment of mean ice thickness in the analysis update. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that proportional mean-thickness updates are superior to the other two methods considered and enable us to assimilate sea ice in a global climate model using simple Newtonian relaxation.
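A schematic of the analysis update described above, assuming Newtonian relaxation of concentration with a constant proportionality r coupling the mean-thickness update to the concentration update; the paper motivates its proportional updates from physical and background-error-covariance arguments rather than a fixed constant, so treat r here as an illustrative tuning parameter.

```python
import numpy as np

# Schematic sea-ice analysis update: nudge ice concentration towards
# observations, and take the mean-thickness update proportional to the
# concentration update. The constant r is an illustrative assumption.

def analysis_update(conc, thick, conc_obs, gain=0.5, r=1.0):
    """One assimilation step on gridded concentration and mean thickness (m).

    gain: Newtonian relaxation coefficient (0..1).
    r:    metres of mean-thickness change per unit concentration change.
    """
    d_conc = gain * (conc_obs - conc)    # Newtonian relaxation towards obs
    d_thick = r * d_conc                 # proportional mean-thickness update
    return (np.clip(conc + d_conc, 0.0, 1.0),
            np.maximum(thick + d_thick, 0.0))

conc = np.array([0.9, 0.6, 0.2])        # model ice concentration
thick = np.array([2.7, 1.2, 0.1])       # model mean thickness (m)
conc_obs = np.array([0.7, 0.8, 0.0])    # observed concentration
print(analysis_update(conc, thick, conc_obs))
```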

Relevance: 40.00%

Abstract:

We investigate the initialisation of Northern Hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates leads to good assimilation performance for sea-ice concentration and thickness, both in identical-twin experiments and when assimilating sea-ice observations. The simulation of other Arctic surface fields in the coupled model is, however, not significantly improved by the assimilation. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that an adjustment of mean ice thickness in the analysis update is essential to arrive at plausible state estimates. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that assimilation with proportional mean-thickness updates outperforms the other two methods considered. The method described here is very simple to implement, and gives results that are sufficiently good to be used for initialising sea ice in a global climate model for seasonal to decadal predictions.

Relevance: 40.00%

Abstract:

Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
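The stress-testing idea can be illustrated with a toy scenario generator: scale a daily precipitation series by candidate intensity uplifts and check a flood-proxy threshold. The series, factors and threshold below are invented; SDSM-DC's own generator is calibrated to predictor-predictand relationships rather than applied as simple scaling.

```python
import numpy as np

# Toy decision-centric stress test in the spirit described above: apply
# candidate intensity uplifts to a daily winter precipitation series and
# check whether a flood-proxy threshold (an assumed precautionary allowance)
# is exceeded. All numbers here are invented for illustration.

rng = np.random.default_rng(7)
wet = rng.random(90) < 0.5                             # synthetic winter wet days
precip = np.where(wet, rng.gamma(2.0, 4.0, 90), 0.0)   # daily totals (mm)

def stress_test(series, int_factor, flood_mm=60.0):
    """Scale daily intensities; return worst 3-day total and exceedance flag."""
    scaled = series * int_factor
    worst = max(scaled[i:i + 3].sum() for i in range(scaled.size - 2))
    return worst, worst > flood_mm

for f in (1.0, 1.1, 1.2, 1.3):  # candidate climate-change uplifts
    worst, exceeds = stress_test(precip, f)
    print(f"intensity x{f}: max 3-day total {worst:.0f} mm -> exceeds: {exceeds}")
```

The sensitivity analysis in the paper works the other way round from a scenario-led study: it searches for the scaling at which the existing allowance fails, rather than propagating a fixed climate projection.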