882 results for Ensemble of classifiers


Relevance: 90.00%

Abstract:

Climate change is projected to cause substantial alterations in vegetation distribution, but these have received little attention compared to land use in the Representative Concentration Pathway (RCP) scenarios. Here we assess the climate-induced land cover changes (CILCC) in the RCPs and compare them to land-use land cover change (LULCC). To do this, we use an ensemble of simulations with and without LULCC in the earth system model HadGEM2-ES for RCP2.6, RCP4.5 and RCP8.5. We find that climate change causes a poleward expansion of vegetation that affects more land area than LULCC in all of the RCPs considered here. The terrestrial carbon changes from CILCC are also larger than those from LULCC. When considering only forest, LULCC is larger, but CILCC varies strongly with the overall radiative forcing of the scenario. The CILCC forest increase compensates for 90% of the global anthropogenic deforestation by 2100 in RCP8.5, but for just 3% in RCP2.6. Overall, larger land cover changes tend to originate from LULCC in the shorter term or under lower radiative forcing, and from CILCC in the longer term and under higher radiative forcing. The extent to which CILCC could compensate for LULCC raises difficult questions regarding global forest and biodiversity offsetting, especially at different timescales. This research shows the importance of considering the size of CILCC relative to LULCC, especially with regard to the ecological effects of the different RCPs.

Relevance: 90.00%

Abstract:

Instrumental observations, palaeo-proxies, and climate models suggest significant decadal variability within the North Atlantic subpolar gyre (NASPG). However, a poorly sampled observational record and a diversity of model behaviours mean that the precise nature and mechanisms of this variability are unclear. Here, we analyse an exceptionally large multi-model ensemble of 42 present-generation climate models to test whether NASPG mean state biases systematically affect the representation of decadal variability. Temperature and salinity biases in the Labrador Sea co-vary and influence whether density variability is controlled by temperature or salinity variations. Ocean horizontal resolution is a good predictor of the biases and the location of the dominant dynamical feedbacks within the NASPG. However, we find no link to the spectral characteristics of the variability. Our results suggest that the mean state and mechanisms of variability within the NASPG are not independent. This represents an important caveat for decadal predictions using anomaly-assimilation methods.

Relevance: 90.00%

Abstract:

In this paper an equation is derived for the mean backscatter cross section of an ensemble of snowflakes at centimeter and millimeter wavelengths. It uses the Rayleigh–Gans approximation, which has previously been found to be applicable at these wavelengths owing to the low density of snow aggregates. Although the internal structure of an individual snowflake is random and unpredictable, the authors find from simulations of the aggregation process that snowflake structure is "self-similar" and can be described by a power law. This enables an analytic expression to be derived for the backscatter cross section of an ensemble of particles as a function of their maximum dimension in the direction of propagation of the radiation, the volume of ice they contain, a variable describing their mean shape, and two variables describing the shape of the power spectrum. The exponent of the power law is found to be −5/3. In the case of 1-cm snowflakes observed by a 3.2-mm-wavelength radar, the backscatter is 40–100 times larger than that of a homogeneous ice–air spheroid with the same mass, size, and aspect ratio.
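
The abstract does not reproduce the analytic expression itself, but the mechanism can be illustrated numerically. The sketch below (all numbers illustrative, not taken from the paper) evaluates the Rayleigh–Gans backscatter form factor |Σ dV exp(2ikz)|², to which the cross section is proportional with the k⁴|K|² prefactors omitted, for a smooth spheroid-like mass profile and for the same mass with lumpy internal structure; the fluctuations break the destructive interference and enhance the backscatter, consistent with the enhancement the paper reports for aggregates.

```python
import numpy as np

def rga_form_factor(z, dv, k):
    """Backscatter form factor |sum_j dV_j exp(2ikz_j)|^2; in the
    Rayleigh-Gans approximation the backscatter cross section is
    proportional to this, with k**4 |K|**2 prefactors omitted."""
    return np.abs(np.sum(dv * np.exp(2j * k * z))) ** 2

k = 2.0 * np.pi / 3.2e-3                 # wavenumber of a 3.2-mm radar
a = 5e-3                                 # 1-cm particle: half-dimension 5 mm
z = np.linspace(-a, a, 1000)             # positions along the propagation axis
dz = z[1] - z[0]

# Smooth spheroid: volume per slab proportional to 1 - (z/a)**2
homogeneous = np.clip(1.0 - (z / a) ** 2, 0.0, None) * dz

# "Aggregate": same mean profile with lumpy internal structure, rescaled
# so both particles contain the same total ice volume
rng = np.random.default_rng(2)
aggregate = homogeneous * np.clip(1.0 + 0.8 * rng.standard_normal(z.size), 0.0, None)
aggregate *= homogeneous.sum() / aggregate.sum()

print(rga_form_factor(z, homogeneous, k))   # strong destructive interference
print(rga_form_factor(z, aggregate, k))     # fluctuations enhance backscatter
```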

Relevance: 90.00%

Abstract:

The predictability of high-impact weather events on multiple time scales is a crucial issue in both scientific and socio-economic terms. In this study, a statistical-dynamical downscaling (SDD) approach is applied to an ensemble of decadal hindcasts obtained with the Max Planck Institute Earth System Model (MPI-ESM) to estimate the decadal predictability of peak wind speeds (as a proxy for gusts) over Europe. Yearly initialized decadal ensemble simulations with ten members are investigated for the period 1979–2005. The SDD approach is trained with COSMO-CLM regional climate model simulations and ERA-Interim reanalysis data and applied to the MPI-ESM hindcasts. The simulations for the period 1990–1993, which was characterized by several windstorm clusters, are analyzed in detail. The anomalies of the 95% peak wind quantile of the MPI-ESM hindcasts are in line with the positive anomalies in reanalysis data for this period. To evaluate both the skill of the decadal prediction system and the added value of the downscaling approach, quantile verification skill scores are calculated for both the MPI-ESM large-scale wind speeds and the SDD-simulated regional peak winds. Skill scores are predominantly positive for the decadal prediction system, with the highest values for short lead times and for (peak) wind speeds equal to or above the 75% quantile. This provides evidence that the analyzed hindcasts and the downscaling technique are suitable for estimating wind and peak wind speeds over Central Europe on decadal time scales. The skill scores for SDD-simulated peak winds are slightly lower than those for large-scale wind speeds, which can largely be attributed to the fact that peak winds are a proxy for gusts and thus have higher variability than wind speeds. The introduced cost-efficient downscaling technique has the advantage of estimating not only wind speeds but also peak winds (a proxy for gusts), and can easily be applied to large ensemble datasets such as operational decadal prediction systems.
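
The abstract does not spell out the exact skill score used; a common choice for quantile verification is a skill score based on the quantile (pinball) loss with climatology as the reference. The sketch below is a minimal illustration under that assumption, with synthetic data standing in for hindcasts and observations.

```python
import numpy as np

def pinball_loss(obs, pred, q):
    """Mean quantile (pinball) loss of a predicted q-quantile."""
    d = obs - pred
    return np.mean(np.where(d >= 0, q * d, (q - 1.0) * d))

def quantile_skill_score(obs, pred, ref, q):
    """1 = perfect, 0 = no better than the reference, < 0 = worse."""
    return 1.0 - pinball_loss(obs, pred, q) / pinball_loss(obs, ref, q)

rng = np.random.default_rng(0)
obs = 10.0 * rng.weibull(2.0, size=1000)            # synthetic peak winds
pred = obs + rng.normal(0.0, 1.0, size=1000)        # skilful but noisy hindcast
ref = np.full_like(obs, np.quantile(obs, 0.75))     # climatological 75% quantile

print(quantile_skill_score(obs, pred, ref, q=0.75))  # > 0: added skill
```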

Relevance: 90.00%

Abstract:

This paper discusses an important issue related to the implementation and interpretation of the analysis scheme in the ensemble Kalman filter. It is shown that the observations must be treated as random variables at the analysis steps. That is, one should add random perturbations with the correct statistics to the observations and generate an ensemble of observations that then is used in updating the ensemble of model states. Traditionally, this has not been done in previous applications of the ensemble Kalman filter and, as will be shown, this has resulted in an updated ensemble with a variance that is too low. This simple modification of the analysis scheme results in a completely consistent approach if the covariance of the ensemble of model states is interpreted as the prediction error covariance, and there are no further requirements on the ensemble Kalman filter method, except for the use of an ensemble of sufficient size. Thus, there is a unique correspondence between the error statistics from the ensemble Kalman filter and the standard Kalman filter approach.
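
A minimal sketch of the analysis step described here, in which each ensemble member is updated against its own perturbed observation so that the analysis ensemble retains the correct variance (the linear toy setup and array shapes are our own, not the paper's):

```python
import numpy as np

def enkf_analysis(E, y, H, R, rng):
    """Perturbed-observation EnKF analysis step.

    E: (n, N) forecast ensemble of model states
    y: (m,) observation vector
    H: (m, n) linear observation operator
    R: (m, m) observation-error covariance
    """
    n, N = E.shape
    # Perturb the observations: one realization per ensemble member, so the
    # updated ensemble keeps the correct (not too low) variance
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    # Ensemble covariance, interpreted as the prediction error covariance
    A = E - E.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (N - 1)
    # Standard Kalman gain, then update each member with its own observation
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    return E + K @ (Y - H @ E)

rng = np.random.default_rng(1)
E = rng.normal(size=(3, 200))          # 200 members, 3 state variables
H = np.array([[1.0, 0.0, 0.0]])        # observe the first state variable only
R = np.array([[0.25]])
Ea = enkf_analysis(E, np.array([0.5]), H, R, rng)
print(E.var(axis=1), Ea.var(axis=1))   # analysis variance shrinks consistently
```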

Relevance: 90.00%

Abstract:

Sixteen monthly air–sea heat flux products from global ocean/coupled reanalyses are compared over 1993–2009 as part of the Ocean Reanalysis Intercomparison Project (ORA-IP). Objectives include assessing the global heat closure, the consistency of temporal variability, comparison with other flux products, and documenting errors against in situ flux measurements at a number of OceanSITES moorings. The ensemble of 16 ORA-IP flux estimates has a global positive bias over 1993–2009 of 4.2 ± 1.1 W m−2. Residual heat gain (i.e., surface flux + assimilation increments) is reduced to a small positive imbalance (typically, +1–2 W m−2). This compensation between surface fluxes and assimilation increments is concentrated in the upper 100 m. Implied steady meridional heat transports also improve by including assimilation sources, except near the equator. The ensemble spread in surface heat fluxes is dominated by turbulent fluxes (>40 W m−2 over the western boundary currents). The mean seasonal cycle is highly consistent, with variability between products mostly <10 W m−2. The interannual variability has consistent signal-to-noise ratio (~2) throughout the equatorial Pacific, reflecting ENSO variability. Comparisons at tropical buoy sites (10°S–15°N) over 2007–2009 showed too little ocean heat gain (i.e., flux into the ocean) in ORA-IP (up to 1/3 smaller than buoy measurements) primarily due to latent heat flux errors in ORA-IP. Comparisons with the Stratus buoy (20°S, 85°W) over a longer period, 2001–2009, also show the ORA-IP ensemble has 16 W m−2 smaller net heat gain, nearly all of which is due to too much latent cooling caused by differences in surface winds imposed in ORA-IP.

Relevance: 90.00%

Abstract:

Ocean–sea ice reanalyses are crucial for assessing variability and recent trends in the Arctic sea ice cover. This is especially true for sea ice volume, as long-term, large-scale sea ice thickness observations are nonexistent. Results from the Ocean ReAnalyses Intercomparison Project (ORA-IP) are presented, with a focus on Arctic sea ice fields reconstructed by state-of-the-art global ocean reanalyses. Differences between the various reanalyses are explored in terms of the effects of data assimilation, model physics and atmospheric forcing on properties of the sea ice cover, including concentration, thickness, velocity and snow. Amongst the 14 reanalyses studied here, 9 assimilate sea ice concentration, and none assimilate sea ice thickness data. The comparison reveals an overall agreement in the reconstructed concentration fields, mainly because of the constraints on surface temperature imposed by direct assimilation of ocean observations, prescribed or assimilated atmospheric forcing, and assimilation of sea ice concentration. However, some spread still exists amongst the reanalyses, due to a variety of factors. In particular, a large spread in sea ice thickness is found within the ensemble of reanalyses, partially caused by biases inherited from their sea ice model components. Biases are also affected by the assimilation of sea ice concentration and by the treatment of sea ice thickness in the data assimilation process. An important outcome of this study is that the spatial distribution of ice volume varies widely between products, with no reanalysis standing out as clearly superior when compared to altimetry estimates. The ice thickness from systems without assimilation of sea ice concentration is no worse than that from systems constrained with sea ice observations. An evaluation of the sea ice velocity fields reveals that ice drifts too fast in most systems. As an ensemble, the ORA-IP reanalyses capture trends in Arctic sea ice area and extent relatively well. However, the ensemble cannot be used to obtain a robust estimate of recent trends in Arctic sea ice volume. Biases in the reanalyses certainly affect the simulated air–sea fluxes in the polar regions, and call into question the suitability of current sea ice reanalyses for initializing seasonal forecasts.

Relevance: 90.00%

Abstract:

Spatial and temporal fluctuations in the concentration field from an ensemble of continuous point-source releases in a regular building array are analyzed from data generated by direct numerical simulations. The release is of a passive scalar under conditions of neutral stability. Results are related to the underlying flow structure by contrasting data for imposed wind directions of 0 deg and 45 deg relative to the buildings. Furthermore, the effects of distance from the source and of proximity to the plume centreline on the spatial and temporal variability are documented. The general picture that emerges is that this particular geometry splits the flow domain into segments (e.g. "streets" and "intersections"), in each of which the air is, to a first approximation, well mixed. Notable exceptions to this general rule include regions close to the source, near the plume edge, and in unobstructed channels when the flow is aligned. In the oblique (45 deg) case, the strongly three-dimensional nature of the flow enhances mixing of the scalar within the canopy, leading to reduced temporal and spatial concentration fluctuations within the plume core. These fluctuations are in general larger for the parallel-flow (0 deg) case, especially in the long unobstructed channels. Due to the more complex flow structure in the canyon-type streets behind buildings, fluctuations there are lower than in the open channels, though still substantially larger than for oblique flow. These results are relevant to the formulation of simple models for dispersion in urban areas and to the quantification of the uncertainties in their predictions.

Relevance: 90.00%

Abstract:

The Madden–Julian oscillation (MJO) is the most prominent form of tropical intraseasonal variability. This study investigated the following questions. Do interannual-to-decadal variations in tropical sea surface temperature (SST) lead to substantial changes in MJO activity? Was there a change in the MJO in the 1970s? Can this change be associated with SST anomalies? What was the level of MJO activity in the pre-reanalysis era? These questions were investigated with a stochastic model of the MJO. Reanalysis data (1948–2008) were used to develop a nine-state first-order Markov model capable of simulating the non-stationarity of the MJO. The model is driven by observed SST anomalies, and a large ensemble of simulations was performed to infer the activity of the MJO in the instrumental period (1880–2008). The model reproduces the activity of the MJO during the reanalysis period. The simulations indicate that the MJO exhibited a regime of near-normal activity in 1948–1972 (3.4 events per year) and two regimes of high activity in 1973–1989 (3.9 events) and 1990–2008 (4.6 events). Stochastic simulations indicate decadal shifts with near-normal levels in 1880–1895 (3.4 events), low activity in 1896–1917 (2.6 events) and a return to near-normal levels during 1918–1947 (3.3 events). The results also point to significant decadal changes in the probability of very active years (5 or more MJO events): 0.214 (1880–1895), 0.076 (1896–1917), 0.197 (1918–1947) and 0.193 (1948–1972). After the change in behavior in the 1970s, this probability increased to 0.329 (1973–1989) and 0.510 (1990–2008). The observational and stochastic simulations presented here call attention to the need to further understand the variability of the MJO on a wide range of time scales.
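
A minimal sketch of the mechanics of such a stochastic model: a first-order Markov chain whose next state depends only on the current one. The transition matrix below is hypothetical and three-state for brevity; the paper's model uses nine states, with transition probabilities conditioned on observed SST anomalies.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3-state transition matrix (rows sum to 1); the paper's model
# is analogous but nine-state, with probabilities modulated by SST anomalies.
P = np.array([[0.80, 0.15, 0.05],    # quiescent
              [0.30, 0.50, 0.20],    # weak MJO activity
              [0.10, 0.30, 0.60]])   # active MJO

def simulate(P, n_steps, state=0):
    """First-order Markov chain: the next state depends only on the current one."""
    path = np.empty(n_steps, dtype=int)
    for t in range(n_steps):
        state = rng.choice(len(P), p=P[state])
        path[t] = state
    return path

traj = simulate(P, 5000)
print(np.bincount(traj, minlength=3) / traj.size)  # empirical state frequencies
```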

Relevance: 90.00%

Abstract:

The relationship between Islamic Law and other legal systems (essentially western-type domestic legal orders and international law) is often thought of in terms of compatibility or incompatibility. For certain chosen subject matters, the compatibility of Islamic (legal) principles with the values embedded in legal systems regarded as characteristic of the Modern Age is tested by sets of questions: is democracy possible in Islam? Does Islam recognize human rights, and are those rights equivalent to a more universal conception? Does Islam recognize or condone more extreme acts of violence, and does it justify violence differently? Such questions and many more presuppose the existence of an ensemble of rules or principles which, like any other set of rules and principles, purport to regulate social behavior. This ensemble is generically referred to as Islamic Law. However, one set of questions is usually left unanswered: is Islamic Law a legal system? If it is a legal system, what are its specific characteristics? How does it work? Where does it apply? This paper argues that the relationship between Islamic Law and domestic and international law can only be understood if looked upon as a relationship between distinct legal systems or legal orders.

Relevance: 90.00%

Abstract:

Multi-classifier systems, also known as ensembles, have been widely used to solve a variety of problems because they often perform better than the individual classifiers that compose them. For this to hold, the base classifiers must be both accurate and diverse among themselves, a requirement known as the diversity/accuracy dilemma. Given its importance, several works have investigated ensemble behavior in the context of this dilemma. However, the majority of them address homogeneous ensembles, i.e., ensembles composed of only one type of classifier. Motivated by this limitation, this thesis uses genetic algorithms to perform a detailed study of the diversity/accuracy dilemma for heterogeneous ensembles.
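
As an illustration of the kind of search involved, the sketch below evolves membership bitmasks over a pool of heterogeneous scikit-learn classifiers with a simple genetic algorithm. The pool composition, GA settings and accuracy-only fitness are illustrative assumptions; the thesis's actual setup, including how diversity enters the fitness, is not specified in the abstract.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
Xtr, Xval, ytr, yval = train_test_split(X, y, test_size=0.4, random_state=0)

# Heterogeneous pool: several types of base classifiers, each trained once
pool = [DecisionTreeClassifier(max_depth=d, random_state=0) for d in (2, 5, None)]
pool += [KNeighborsClassifier(n_neighbors=k) for k in (1, 5, 15)]
pool += [GaussianNB(), LogisticRegression(max_iter=1000)]
preds = np.array([clf.fit(Xtr, ytr).predict(Xval) for clf in pool])

def fitness(mask):
    """Validation accuracy of the majority vote of the selected members
    (a diversity term could be added here to address the dilemma directly)."""
    if mask.sum() == 0:
        return 0.0
    vote = (preds[mask.astype(bool)].mean(axis=0) > 0.5).astype(int)
    return (vote == yval).mean()

# Minimal generational GA over membership bitmasks
popsize, n_gen = 20, 30
population = rng.integers(0, 2, size=(popsize, len(pool)))
for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in population])
    parents = population[np.argsort(scores)[-popsize // 2:]]   # truncation selection
    cut = rng.integers(1, len(pool), size=popsize // 2)
    kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                     parents[(i + 1) % len(parents)][c:]])
                     for i, c in enumerate(cut)])               # one-point crossover
    kids ^= (rng.random(kids.shape) < 0.05).astype(int)         # bit-flip mutation
    population = np.vstack([parents, kids])

best = population[np.argmax([fitness(ind) for ind in population])]
print("selected members:", np.flatnonzero(best))
```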

Relevance: 90.00%

Abstract:

Classifier ensembles are systems composed of a set of individual classifiers and a combination module, which is responsible for providing the final output of the system. In the design of these systems, diversity is considered one of the main aspects to be taken into account, since there is no gain in combining identical classification methods. The ideal situation is a set of individual classifiers with uncorrelated errors; in other words, the individual classifiers should be diverse among themselves. One way of increasing diversity is to provide different datasets (patterns and/or attributes) to the individual classifiers. Diversity is increased because the individual classifiers perform the same task (classification of the same input patterns) but are built using different subsets of patterns and/or attributes. The majority of papers using feature selection for ensembles address homogeneous ensemble structures, i.e., ensembles composed of only one type of classifier. In this investigation, two genetic algorithm approaches (single- and multi-objective) are used to guide the distribution of features among the classifiers in the context of homogeneous and heterogeneous ensembles. The experiments are divided into two phases that use a filter approach to feature selection guided by a genetic algorithm.
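
A minimal sketch of the underlying mechanism: each ensemble member is trained on its own feature subset, which is what the genetic algorithm would optimize. Random masks stand in for evolved ones here, and the classifier choice and subset sizes are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=500, n_features=30, n_informative=10,
                           random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=1)

# Each member sees a different feature subset; a GA would evolve these masks
# (single-objective: accuracy; multi-objective: accuracy and diversity).
masks = [rng.choice(30, size=10, replace=False) for _ in range(7)]
members = [GaussianNB().fit(Xtr[:, m], ytr) for m in masks]

votes = np.array([clf.predict(Xte[:, m]) for clf, m in zip(members, masks)])
majority = (votes.mean(axis=0) > 0.5).astype(int)   # majority vote combiner
print("ensemble accuracy:", (majority == yte).mean())
```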

Relevance: 90.00%

Abstract:

This work discusses the application of ensemble techniques to the development of multimodal recognition systems based on revocable biometrics. Biometric systems are the identification and access-control techniques of the future, as the steady spread of such systems in today's society attests. However, much remains to be improved, particularly regarding the accuracy, security and processing time of such systems. In the search for more efficient techniques, multimodal systems and revocable biometrics are promising, and can address many of the problems of traditional biometric recognition. A multimodal system combines different biometric security techniques and thereby overcomes many limitations, such as failures in extracting or processing the data. Among the many ways of building a multimodal system, the use of ensembles is especially promising, motivated by the performance and flexibility they have demonstrated over the years across many applications. Regarding security, one of the biggest problems is that a biometric trait is permanently tied to the user and cannot be changed if compromised. This problem is addressed by techniques known as revocable biometrics, which apply a transformation to the biometric data so as to protect the unique characteristics, making cancellation and replacement possible. To contribute to this important subject, this work compares the performance of individual classifiers, as well as ensembles of classifiers, both on the original data and in biometric spaces transformed by different functions. Another highlighted factor is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency. One motivation of this work is to evaluate the gain that ensemble systems maximized by different GAs can bring to data in the transformed space. Another is to build even more efficient revocable systems by combining two or more transformation functions, demonstrating that information of a similar standard can be extracted by applying different transformation functions. All of this makes clear the importance of revocable biometrics, ensembles and GAs in the development of more efficient biometric systems, something increasingly important in the present day.
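
The thesis's specific transformation functions are not given in the abstract; as one well-known example of a revocable transformation, the sketch below uses a key-seeded random projection. With the same key, transformed samples of one user still match; with a new key, a fresh, unlinkable template is issued. All parameters and the toy feature vectors are illustrative.

```python
import numpy as np

def cancelable_template(features, key, out_dim=64):
    """Project a biometric feature vector with a key-seeded random matrix.

    If the stored template is compromised, issuing a new key yields a new,
    unlinkable template from the same underlying biometric trait."""
    rng = np.random.default_rng(key)
    R = rng.normal(size=(out_dim, features.size)) / np.sqrt(out_dim)
    return R @ features

rng = np.random.default_rng(7)
sample_a = rng.normal(size=256)                    # two noisy captures of one user
sample_b = sample_a + rng.normal(0.0, 0.1, 256)

t1a = cancelable_template(sample_a, key=101)
t1b = cancelable_template(sample_b, key=101)
t2a = cancelable_template(sample_a, key=202)       # template re-issued after revocation

cos = lambda u, v: u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
print(cos(t1a, t1b))   # high: same user, same key -> still matchable
print(cos(t1a, t2a))   # near zero: same user, new key -> unlinkable templates
```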

Relevance: 90.00%

Abstract:

The goal of this work is to assess the efficacy of texture measures for estimating levels of crowd density in images. This estimation is crucial for the problem of crowd monitoring and control. The assessment is carried out on a set of nearly 300 real images captured at Liverpool Street Train Station, London, UK, using texture measures extracted from the images through four different methods: gray-level dependence matrices, straight line segments, Fourier analysis, and fractal dimensions. The estimates of crowd density are given in terms of the classification of the input images into five density classes (very low, low, moderate, high and very high). Three types of classifiers are used: neural (implemented according to the Kohonen model), Bayesian, and an approach based on fitting functions. The results obtained by these three classifiers, using the four texture measures, support the conclusion that texture analysis is very effective for the problem of crowd density estimation.
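
As an illustration of this kind of texture pipeline, the sketch below computes gray-level co-occurrence features (the first of the four methods named) with scikit-image, assuming a recent version that provides graycomatrix/graycoprops. The two synthetic images merely stand in for frames with different gray-level statistics; real inputs would be station images labelled by density class.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(img, levels=16):
    """Gray-level co-occurrence texture descriptors for one image in [0, 1)."""
    q = (img * (levels - 1)).astype(np.uint8)        # quantise to `levels` grays
    glcm = graycomatrix(q, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

# Two synthetic fields standing in for frames with different gray-level
# statistics; the feature vectors would feed any of the three classifiers.
rng = np.random.default_rng(3)
sparse = rng.random((64, 64)) ** 4      # mostly dark frame (sparse scene)
dense = rng.random((64, 64))            # busy, high-contrast frame
print(glcm_features(sparse)[:4])        # descriptors differ between classes
print(glcm_features(dense)[:4])
```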

Relevance: 90.00%

Abstract:

Some dynamical properties of a problem concerning the acceleration of particles in a wave packet are studied. The dynamics of the model is described in terms of a two-dimensional area-preserving map. We show that the phase space is mixed, in the sense that regular and chaotic regions coexist. We use a connection with the standard map to find the position of the first invariant spanning curve, which borders the chaotic sea, and find that this position increases as a power of the control parameter with exponent 2/3. The standard deviation of the kinetic energy of an ensemble of initial conditions obeys a power law as a function of time and saturates after some crossover. Scaling formalism is used to characterise the chaotic region close to the transition from integrability to non-integrability, and a relationship between the power-law exponents is derived. The formalism can be applied to many different systems with mixed phase space. Dissipation is then introduced into the model, breaking the area-preservation property, so that attractors are observed. We show that after a small change in the dissipation, the chaotic attractor as well as its basin of attraction are destroyed, leading the system to experience a boundary crisis. The transient after the crisis follows a power law with exponent −2.
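
The model's exact map is not given in the abstract, but the measurement it describes can be sketched on the closely related Chirikov standard map: iterate an ensemble of initial conditions launched near zero action and track the standard deviation of the kinetic-energy proxy I²/2 over time, which grows and then saturates once the ensemble fills the chaotic sea below the first invariant spanning curve. Parameter values are illustrative.

```python
import numpy as np

K = 0.9                    # control parameter (illustrative)
N, T = 2000, 10_000        # ensemble size, number of map iterations

rng = np.random.default_rng(5)
theta = rng.uniform(0.0, 2.0 * np.pi, N)
I = 1e-3 * rng.uniform(-1.0, 1.0, N)   # launch the ensemble near I = 0

sigma_E = np.empty(T)                  # standard deviation of E = I**2 / 2
for t in range(T):
    I = I + K * np.sin(theta)              # Chirikov standard map: kick...
    theta = (theta + I) % (2.0 * np.pi)    # ...then rotation
    sigma_E[t] = (I**2 / 2.0).std()

# Power-law growth at early times, saturation after the crossover once the
# ensemble fills the chaotic sea bounded by the first invariant spanning curve
print(sigma_E[10], sigma_E[100], sigma_E[-1])
```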