111 results for Projections onto convex sets
Abstract:
Future changes in the stratospheric circulation could have an important impact on Northern winter tropospheric climate change, given that sea level pressure (SLP) responds not only to tropospheric circulation variations but also to vertically coherent variations in troposphere-stratosphere circulation. Here we assess Northern winter stratospheric change and its potential to influence surface climate change in the Coupled Model Intercomparison Project – phase 5 (CMIP5) multi-model ensemble. In the stratosphere at high latitudes, an easterly change in zonally averaged zonal wind is found for the majority of the CMIP5 models, under the Representative Concentration Pathway 8.5 scenario. Comparable results are also found in the 1% CO2 increase per year projections, indicating that the stratospheric easterly change is a common feature of future climate projections. This stratospheric wind change, however, shows a significant spread among the models. By using linear regression, we quantify the impact of tropical upper troposphere warming, polar amplification and the stratospheric wind change on SLP. We find that the inter-model spread in stratospheric wind change contributes substantially to the inter-model spread in Arctic SLP change. The role of the stratosphere in determining part of the spread in SLP change is supported by the fact that the SLP change lags the stratospheric zonally averaged wind change. Taken together, these findings provide further support for the importance of simulating the coupling between the stratosphere and the troposphere, to narrow the uncertainty in future projections of tropospheric circulation changes.
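The regression step described in this abstract can be illustrated with a short sketch: across an ensemble of models, Arctic SLP change is regressed on tropical upper-tropospheric warming, polar amplification and stratospheric zonal-mean wind change. This is not the authors' code; the ensemble size and all input arrays below are placeholders used only to show the structure of such a multi-model regression.

```python
# Sketch of a multi-model regression of Arctic SLP change on three predictors.
# All numbers are hypothetical placeholders, not CMIP5 diagnostics.
import numpy as np

n_models = 30                                          # hypothetical ensemble size
rng = np.random.default_rng(0)

# One value per model: change between future and baseline periods.
tropical_warming = rng.normal(4.0, 0.5, n_models)      # K, placeholder
polar_amplification = rng.normal(8.0, 1.5, n_models)   # K, placeholder
strat_wind_change = rng.normal(-2.0, 1.0, n_models)    # m/s, placeholder
arctic_slp_change = rng.normal(0.0, 1.0, n_models)     # hPa, placeholder

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n_models), tropical_warming,
                     polar_amplification, strat_wind_change])
coeffs, *_ = np.linalg.lstsq(X, arctic_slp_change, rcond=None)

# One simple way to gauge each predictor's contribution to the SLP spread:
# |regression coefficient| times the inter-model standard deviation.
names = ["tropical warming", "polar amplification", "stratospheric wind"]
for name, beta, series in zip(names, coeffs[1:],
                              [tropical_warming, polar_amplification,
                               strat_wind_change]):
    print(f"{name}: contribution ~ {abs(beta) * series.std():.2f} hPa")
```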
Abstract:
Concern that European forest biodiversity is depleted and declining has provoked widespread efforts to improve management practices. To gauge the success of these actions, appropriate monitoring of forest ecosystems is paramount. Multi-species indicators are frequently used to assess the state of biodiversity and its response to implemented management, but generally applicable and objective methodologies for species' selection are lacking. Here we use a niche-based approach, underpinned by coarse quantification of species' resource use, to objectively select species for inclusion in a pan-European forest bird indicator. We identify both the minimum number of species required to deliver full resource coverage and the most sensitive species' combination, and explore the trade-off between two key characteristics, sensitivity and redundancy, associated with indicators comprising different numbers of species. We compare our indicator to an existing forest bird indicator selected on the basis of expert opinion and show it is more representative of the wider community. We also present alternative indicators for regional and forest type specific monitoring and show that species' choice can have a significant impact on the indicator and consequent projections about the state of the biodiversity it represents. Furthermore, by comparing indicator sets drawn from currently monitored species and the full forest bird community, we identify gaps in the coverage of the current monitoring scheme. We believe that adopting this niche-based framework for species' selection supports the objective development of multi-species indicators and that it has good potential to be extended to a range of habitats and taxa.
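The species-selection idea lends itself to a small illustration. The abstract describes the resource-use quantification only coarsely, so the sketch below uses a purely hypothetical resource table and a plain greedy set-cover rule to find a small species set with full resource coverage; it illustrates the principle, not the study's selection algorithm or its sensitivity/redundancy trade-off.

```python
# Greedy selection of a small species set whose combined resource use covers
# all resource categories. Species names and the resource table are illustrative.
resource_use = {
    "great spotted woodpecker": {"deadwood", "canopy"},
    "willow tit":               {"deadwood", "shrub layer"},
    "wood warbler":             {"canopy", "ground layer"},
    "tree pipit":               {"ground layer", "open areas"},
    "hazel grouse":             {"shrub layer", "ground layer"},
}
all_resources = set().union(*resource_use.values())

selected, covered = [], set()
while covered != all_resources:
    # Pick the species adding the most not-yet-covered resources.
    best = max(resource_use, key=lambda sp: len(resource_use[sp] - covered))
    if not resource_use[best] - covered:
        break  # remaining resources cannot be covered by any species
    selected.append(best)
    covered |= resource_use[best]

print("indicator set:", selected)
```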
Abstract:
Future changes in runoff can have important implications for water resources and flooding. In this study, runoff projections from ISI-MIP (Inter-sectoral Impact Model Inter-comparison Project) simulations forced with HadGEM2-ES bias-corrected climate data under the Representative Concentration Pathway 8.5 have been analysed for differences between impact models. Projections of change from a baseline period (1981-2010) to the future (2070-2099) from 12 impact models which contributed to the hydrological and biomes sectors of ISI-MIP were studied. The biome models differed from the hydrological models by the inclusion of CO2 impacts, and most also included a dynamic vegetation distribution. The biome and hydrological models agreed on the sign of runoff change for most regions of the world. However, in West Africa, the hydrological models projected drying, and the biome models a moistening. The biome models tended to produce larger increases and smaller decreases in regionally averaged runoff than the hydrological models, although there was a large inter-model spread. The timing of runoff change was similar, but there were differences in magnitude, particularly at peak runoff. The impact of vegetation distribution change was much smaller than the projected change over time, while elevated CO2 had an effect as large as the magnitude of change over time projected by some models in some regions. The effect of CO2 on runoff was not consistent across the models, with two models showing increases and two decreases. There was also more spread in projections from the runs with elevated CO2 than with constant CO2. The biome models which gave increased runoff from elevated CO2 were also those which differed most from the hydrological models. Spatially, the regions with the largest differences between model types also tended to show the largest effects of elevated CO2, and the seasonal differences followed a similar pattern, so elevated CO2 can partly explain the differences between the hydrological and biome model runoff change projections. This shows that a range of impact models should be considered in order to capture the full range of uncertainty in impact studies.
Abstract:
This paper presents an assessment of the effects of climate change on river flow regimes in representative English catchments, using the UKCP09 climate projections. These comprise a set of 10,000 coherent climate scenarios, used here (i) to evaluate the distribution of potential changes in hydrological behaviour and (ii) to construct relationships between indicators of climate change and hydrological change. The study uses six catchments, and focuses on change in average flow, high flow (Q5) and low flow (Q95). There is a large range in hydrological change in each catchment across the plausible UKCP09 climate projections, with differences between catchments largely due to differences in catchment geology and baseline water balance. In most catchments, the range of change across the UKCP09 projections is smaller than the range across scenarios based on the CMIP3 ensemble of climate models, and earlier UK scenarios produce changes that tend towards the lower (drier) end of the UKCP09 range. The difference between emissions scenarios is small compared to the range across the 10,000 scenarios. Changes in high flows are largely driven by changes in winter precipitation, whilst changes in low flows are determined by changes in summer precipitation and temperature.
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
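A rough illustration of the classifier's core recipe may help. The sketch below is a generic complex-valued extreme learning machine (random complex hidden weights, a fully complex activation, output weights from a pseudo-inverse) run on random placeholder "spectra" and labels; it is not the paper's implementation, tuned kernels or data.

```python
# Generic complex-valued extreme learning machine (ELM) sketch with placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features, n_hidden, n_classes = 200, 64, 100, 6

# Complex feature vectors standing in for amplitude and phase of THz spectra.
X = rng.normal(size=(n_samples, n_features)) + 1j * rng.normal(size=(n_samples, n_features))
y = rng.integers(0, n_classes, n_samples)           # placeholder class labels
T = np.eye(n_classes)[y]                            # one-hot targets (real-valued)

# Random complex input weights and biases; these are never trained in an ELM.
W = rng.normal(size=(n_features, n_hidden)) + 1j * rng.normal(size=(n_features, n_hidden))
b = rng.normal(size=n_hidden) + 1j * rng.normal(size=n_hidden)

H = np.tanh(X @ W + b)                              # complex hidden-layer activations
beta = np.linalg.pinv(H) @ T                        # analytic output weights

pred = np.argmax(np.abs(H @ beta), axis=1)          # classify by output magnitude
print("training accuracy on placeholder data:", (pred == y).mean())
```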
Abstract:
We investigate variants of the dominating set problem in social networks. While randomised algorithms for solving the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problems on simple graphs are already present in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network using a measure of individuals’ influence (klout) as weights. We argue that the weights of vertices can be interpreted as the costs of getting those individuals on board for a campaign or a behaviour change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost while ensuring that each individual in the network has at least an alpha fraction of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on generated graphs show that the implementation is reasonably efficient and can thus be used for real-life applications when creating social network based interventions, designing social media campaigns and potentially improving users’ social media experience.
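The randomised-rounding idea can be illustrated on a toy graph: solve the LP relaxation of the weighted alpha-rate constraint, round the fractional solution, then greedily repair any violated constraint. The graph, weights and rounding scale below are hypothetical, and the paper's actual algorithm and approximation analysis may differ in detail.

```python
# LP relaxation + randomised rounding for a toy weighted alpha-rate dominating set.
# Constraint: each vertex must have at least an alpha fraction of its neighbours
# in the chosen set. Graph and klout-like weights are hypothetical.
import math
import numpy as np
from scipy.optimize import linprog

alpha = 0.5
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}
weights = np.array([3.0, 1.0, 2.0, 4.0, 1.5])        # cost of recruiting each user
n = len(adj)

# LP relaxation: minimise sum w_v x_v  s.t.  sum_{u in N(v)} x_u >= alpha * deg(v).
A_ub, b_ub = [], []
for v, nbrs in adj.items():
    row = np.zeros(n)
    row[nbrs] = -1.0                                  # ">=" constraints, negated for linprog
    A_ub.append(row)
    b_ub.append(-alpha * len(nbrs))
res = linprog(weights, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=[(0, 1)] * n)

# Randomised rounding, then a greedy repair pass for any still-violated constraint.
rng = np.random.default_rng(7)
x = (rng.random(n) < np.minimum(1.0, res.x * math.log(n + 1))).astype(int)
for v, nbrs in adj.items():
    while x[nbrs].sum() < alpha * len(nbrs):
        cheapest = min((u for u in nbrs if not x[u]), key=lambda u: weights[u])
        x[cheapest] = 1

print("dominating set:", [v for v in range(n) if x[v]], "cost:", float(weights @ x))
```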
Abstract:
We consider a generic basic semi-algebraic subset S of the space of generalized functions, that is, a set given by (not necessarily countably many) polynomial constraints. We derive necessary and sufficient conditions for an infinite sequence of generalized functions to be realizable on S, namely to be the moment sequence of a finite measure concentrated on S. Our approach combines the classical results about the moment problem on nuclear spaces with the techniques recently developed to treat the moment problem on basic semi-algebraic sets of R^d. In this way, we determine realizability conditions that can be more easily verified than the well-known Haviland-type conditions. Our result completely characterizes the support of the realizing measure in terms of its moments. As concrete examples of semi-algebraic sets of generalized functions, we consider the set of all Radon measures and the set of all measures having bounded Radon–Nikodym density w.r.t. the Lebesgue measure.
Abstract:
We apply a new parameterisation of the Greenland ice sheet (GrIS) feedback between surface mass balance (SMB: the sum of surface accumulation and surface ablation) and surface elevation in the MAR regional climate model (Edwards et al., 2014) to projections of future climate change using five ice sheet models (ISMs). The MAR (Modèle Atmosphérique Régional: Fettweis, 2007) climate projections are for 2000–2199, forced by the ECHAM5 and HadCM3 global climate models (GCMs) under the SRES A1B emissions scenario. The additional sea level contribution due to the SMB–elevation feedback averaged over five ISM projections for ECHAM5 and three for HadCM3 is 4.3% (best estimate; 95% credibility interval 1.8–6.9%) at 2100, and 9.6% (best estimate; 95% credibility interval 3.6–16.0%) at 2200. In all results the elevation feedback is significantly positive, amplifying the GrIS sea level contribution relative to the MAR projections in which the ice sheet topography is fixed: the lower bounds of our 95% credibility intervals (CIs) for sea level contributions are larger than the “no feedback” case for all ISMs and GCMs. Our method is novel in sea level projections because we propagate three types of modelling uncertainty – GCM and ISM structural uncertainties, and elevation feedback parameterisation uncertainty – along the causal chain, from SRES scenario to sea level, within a coherent experimental design and statistical framework. The relative contributions to uncertainty depend on the timescale of interest. At 2100, the GCM uncertainty is largest, but by 2200 both the ISM and parameterisation uncertainties are larger. We also perform a perturbed parameter ensemble with one ISM to estimate the shape of the projected sea level probability distribution; our results indicate that the probability density is slightly skewed towards higher sea level contributions.
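The propagation of the three uncertainty sources named above (GCM choice, ISM choice, feedback-parameterisation uncertainty) can be sketched as a simple Monte Carlo sampling exercise summarised by percentile-based credibility intervals. All numbers below are illustrative placeholders, not the study's ensemble or its statistical framework.

```python
# Toy Monte Carlo propagation of three uncertainty sources to a feedback
# amplification factor, summarised by a 95% credibility interval.
import numpy as np

rng = np.random.default_rng(2)
n_draws = 10_000

gcm_offsets = np.array([0.000, 0.005])               # hypothetical GCM structural offsets
ism_offsets = rng.normal(0.0, 0.01, 5)               # hypothetical ISM structural offsets

gcm = rng.integers(0, len(gcm_offsets), n_draws)     # sample a GCM per draw
ism = rng.integers(0, len(ism_offsets), n_draws)     # sample an ISM per draw
param = rng.normal(0.04, 0.013, n_draws)             # illustrative parameterisation draws

amplification = param + gcm_offsets[gcm] + ism_offsets[ism]
lo, med, hi = np.percentile(amplification, [2.5, 50, 97.5])
print(f"feedback contribution: {med:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```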
Abstract:
A fast simple climate modelling approach is developed for predicting and helping to understand general circulation model (GCM) simulations. We show that the simple model reproduces the GCM results accurately, for global mean surface air temperature change and global-mean heat uptake projections from 9 GCMs in the fifth Coupled Model Intercomparison Project (CMIP5). This implies that understanding gained from idealised CO2 step experiments is applicable to policy-relevant scenario projections. Our approach is conceptually simple. It works by using the climate response to a CO2 step change taken directly from a GCM experiment. With radiative forcing from non-CO2 constituents obtained by adapting the Forster and Taylor method, we use our method to estimate results for CMIP5 representative concentration pathway (RCP) experiments for cases not run by the GCMs. We estimate differences between pairs of RCPs rather than RCP anomalies relative to the pre-industrial state. This gives better results because it makes greater use of available GCM projections. The GCMs exhibit differences in radiative forcing, which we incorporate in the simple model. We analyse the thus-completed ensemble of RCP projections. The ensemble mean changes between 1986–2005 and 2080–2099 for global temperature (heat uptake) are, for RCP8.5: 3.8 K (2.3 × 10^24 J); for RCP6.0: 2.3 K (1.6 × 10^24 J); for RCP4.5: 2.0 K (1.6 × 10^24 J); for RCP2.6: 1.1 K (1.3 × 10^24 J). The relative spread (standard deviation/ensemble mean) for these scenarios is around 0.2 and 0.15 for temperature and heat uptake respectively. We quantify the relative effect of mitigation action, through reduced emissions, via the time-dependent ratios (change in RCPx)/(change in RCP8.5), using changes with respect to pre-industrial conditions. We find that the effects of mitigation on global-mean temperature change and heat uptake are very similar across these different GCMs.
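The step-response idea can be sketched in a few lines: yearly forcing increments are convolved with a GCM's temperature response to an abrupt CO2 step. The exponential step response and the forcing series below are illustrative placeholders (not CMIP5 output or the paper's calibration), chosen only to show the structure of the calculation.

```python
# Step-response (linear impulse-response) sketch of a simple climate model.
import numpy as np

years = np.arange(2006, 2100)
n = len(years)

# Placeholder step response R(t): warming (K) after a CO2 step whose forcing is F_step.
F_step = 3.7                                   # W m-2, nominal forcing of the step
t = np.arange(n)
R = 3.0 * (1.0 - 0.6 * np.exp(-t / 4.0) - 0.4 * np.exp(-t / 250.0))

# Placeholder scenario forcing F(t) in W m-2 (an RCP-like ramp, for illustration).
F = np.linspace(2.0, 8.5, n)

# Linear-response temperature change: convolve forcing increments with R / F_step,
# treating the initial forcing as a step applied in the first year.
dF = np.diff(F, prepend=0.0)
T = np.convolve(dF / F_step, R)[:n]

print(f"warming by {years[-1]}: {T[-1]:.2f} K (placeholder inputs)")
```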
Abstract:
In September 2013, the 5th Assessment Report (5AR) of the Intergovernmental Panel on Climate Change (IPCC) was released. Taking the 5AR climate change scenarios into account, the World Bank published an earlier report on climate change and its impacts on selected hot spot regions, including Southeast Asia. Currently, dynamical and statistical-dynamical downscaling efforts are underway to obtain higher resolution and more robust regional climate change projections for tropical Southeast Asia, including Vietnam. Such initiatives are formalized under the World Meteorological Organization (WMO) Coordinated Regional Dynamic Downscaling Experiment (CORDEX) East Asia and Southeast Asia and also take place in climate change impact projects such as the joint Vietnamese-German project “Environmental and Water Protection Technologies of Coastal Zones in Vietnam (EWATEC-COAST)”. In this contribution, the latest assessments for changes in temperature, precipitation, sea level, and tropical cyclones (TCs) under the 5AR Representative Concentration Pathway (RCP) scenarios 4.5 and 8.5 are reviewed. Special emphasis is put on changes in extreme events like heat waves and/or heavy precipitation. A regional focus is Vietnam south of 16°N. A continued increase in mean near-surface temperature is projected, reaching up to 5°C at the end of this century in northern Vietnam under the high greenhouse-gas forcing scenario RCP8.5. Overall, projected changes in annual precipitation are small, but there is a tendency towards more rainfall in the boreal winter dry season. Unprecedented heat waves and an increase in extreme precipitation events are projected by both global and regional climate models. Globally, TCs are projected to decrease in number, but an increase in the intensity of peak winds and rainfall in the inner core region is estimated. Though an assessment of changes in land-falling frequency in Vietnam is uncertain due to difficulties in assessing changes in TC tracks, some work indicates a reduction in the number of land-falling TCs in Vietnam. Sea level may rise by 75-100 cm by the end of the century, with the Vietnamese coastline experiencing a 10-15% higher rise than the global average. Given the large rice and aquaculture production in the Mekong and Red River Deltas, which are both prone to TC-related storm surges and flooding, this poses a challenge to food security and the protection of coastal populations and assets.
Abstract:
Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems on covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising solution to reduce the model’s dimensionality to a manageable level, thus leading to efficient estimation. Most of the existing tensor-based methods estimate each individual regression problem independently, based on a tensor decomposition which allows simultaneous projections of an input tensor onto more than one direction along each mode. In practice, multi-dimensional data are often collected under the same or very similar conditions, so that the data share some common latent components but can also have their own independent parameters for each regression task. Therefore, it is beneficial to analyse regression parameters across all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the common components of parameters across all the regression tasks and the independent factors contributing to each particular regression task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than their tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
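A much-simplified sketch of the underlying idea follows: a single regression task with a matrix-valued covariate and a coefficient matrix constrained to low rank (the two-mode analogue of a Tucker-structured coefficient tensor), fitted by alternating least squares on synthetic data. The paper's linked multi-task model with a sparsity-preserving regulariser is more elaborate; this only illustrates how the low-rank structure keeps the parameter count manageable.

```python
# Low-rank bilinear regression y_i = <B, X_i> with B = U V^T, fitted by
# alternating least squares on synthetic data (illustration only).
import numpy as np

rng = np.random.default_rng(3)
n, p1, p2, R = 300, 10, 8, 2

# Synthetic data: y_i = <B_true, X_i> + noise, with B_true of rank R.
B_true = rng.normal(size=(p1, R)) @ rng.normal(size=(R, p2))
X = rng.normal(size=(n, p1, p2))
y = np.einsum("ijk,jk->i", X, B_true) + 0.01 * rng.normal(size=n)

U = rng.normal(size=(p1, R))
V = rng.normal(size=(p2, R))
for _ in range(50):
    # Fix V: y_i = vec(U)^T vec(X_i V), a linear least-squares problem in U.
    Z = np.einsum("ijk,kr->ijr", X, V).reshape(n, p1 * R)
    U = np.linalg.lstsq(Z, y, rcond=None)[0].reshape(p1, R)
    # Fix U: y_i = vec(V)^T vec(X_i^T U), a linear least-squares problem in V.
    W = np.einsum("ikj,kr->ijr", X, U).reshape(n, p2 * R)
    V = np.linalg.lstsq(W, y, rcond=None)[0].reshape(p2, R)

print("relative error:", np.linalg.norm(U @ V.T - B_true) / np.linalg.norm(B_true))
```

The low-rank factorisation stores p1*R + p2*R numbers instead of p1*p2, which is the same parameter-saving principle the Tucker-based model exploits in higher-order settings.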
Abstract:
Given a dataset of two-dimensional points in the plane with integer coordinates, the proposed method reduces a set of n points to a set of s points, s ≤ n, such that the convex hull of the set of s points is the same as the convex hull of the original set of n points. The method runs in O(n) time and helps any convex hull algorithm run faster. The empirical analysis of a practical case shows a percentage reduction in points of over 98%, which is reflected in faster computation with a speedup factor of at least 4.
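The abstract does not spell out its exact reduction rule; the sketch below shows a well-known O(n) pre-filter in the same spirit, the Akl-Toussaint heuristic: points strictly inside the quadrilateral spanned by the extreme points in x and y cannot lie on the convex hull and can be discarded before the hull algorithm runs.

```python
# Akl-Toussaint style pre-filter: drop points strictly inside the quadrilateral
# of extreme points; the convex hull of the remaining points is unchanged.
def cross(o, a, b):
    """Signed area of the parallelogram spanned by (a - o) and (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def reduce_points(points):
    """Return a subset of the input points with the same convex hull."""
    left = min(points)                                   # smallest x, then smallest y
    right = max(points)                                  # largest x, then largest y
    bottom = min(points, key=lambda p: (p[1], -p[0]))    # smallest y, then largest x
    top = max(points, key=lambda p: (p[1], -p[0]))       # largest y, then smallest x

    quad = []                                            # counter-clockwise, deduplicated
    for p in (left, bottom, right, top):
        if p not in quad:
            quad.append(p)
    k = len(quad)

    def strictly_inside(p):
        # Strictly to the left of every edge of the counter-clockwise polygon.
        return all(cross(quad[i], quad[(i + 1) % k], p) > 0 for i in range(k))

    return [p for p in points if not strictly_inside(p)]

# Usage: pts = [(0, 0), (10, 0), (10, 10), (0, 10), (5, 5), (3, 7)]
# reduce_points(pts) keeps the four corners and drops the two interior points.
```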
Abstract:
This study has investigated serial (temporal) clustering of extra-tropical cyclones simulated by 17 climate models that participated in CMIP5. Clustering was estimated by calculating the dispersion (ratio of variance to mean) of 30 December-February counts of Atlantic storm tracks passing near each grid point. Results from single historical simulations for 1975-2005 were compared to those from ERA40 reanalyses for 1958-2001 and to single future model projections for 2069-2099 under the RCP4.5 climate change scenario. Models were generally able to capture the broad features in the reanalyses reported previously: underdispersion/regularity (i.e. variance less than mean) in the western core of the Atlantic storm track surrounded by overdispersion/clustering (i.e. variance greater than mean) to the north and south and over western Europe. Regression of counts onto North Atlantic Oscillation (NAO) indices revealed that much of the overdispersion in the historical reanalyses and model simulations can be accounted for by NAO variability. Future changes in dispersion were generally found to be small and not consistent across models. The overdispersion statistic, for any 30-year sample, is prone to large amounts of sampling uncertainty that obscures the climate change signal. For example, the projected increase in dispersion for storm counts near London in the CNRM-CM5 model is 0.1 compared to a standard deviation of 0.25. Projected changes in the mean and variance of the NAO are insufficient to create changes in overdispersion that are discernible above natural sampling variations.
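The dispersion statistic and the NAO regression described above are straightforward to compute. The sketch below uses random placeholder storm counts and a placeholder NAO index purely to show the calculation, not the study's data.

```python
# Dispersion (variance-to-mean ratio) of 30 winter storm counts at one grid point,
# and the reduction in dispersion after removing the NAO-regressed component.
import numpy as np

rng = np.random.default_rng(4)
counts = rng.poisson(6.0, 30)                  # 30 winters of storm-track counts (placeholder)
nao = rng.normal(0.0, 1.0, 30)                 # placeholder NAO index per winter

mean, var = counts.mean(), counts.var(ddof=1)
dispersion = var / mean                         # > 1: clustering, < 1: regularity
print(f"dispersion = {dispersion:.2f}")

# How much of the dispersion is accounted for by NAO variability: regress counts
# onto the NAO index and recompute the variance-to-mean ratio of the residuals.
slope, intercept = np.polyfit(nao, counts, 1)
residual = counts - (intercept + slope * nao)
print(f"dispersion after removing NAO fit = {residual.var(ddof=1) / mean:.2f}")
```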