111 results for Empirical Algorithm Analysis


Relevance:

40.00%

Abstract:

We employ a large dataset of physical inventories for 21 different commodities over the period 1993–2011 to empirically analyze the behavior of commodity prices and their volatility as predicted by the theory of storage. We examine two main issues. First, we analyze the relationship between inventory and the shape of the forward curve. Low (high) inventory is associated with forward curves in backwardation (contango), as the theory of storage predicts. Second, we show that price volatility is a decreasing function of inventory for the majority of commodities in our sample. This effect is more pronounced in backwardated markets. Our findings are robust with respect to alternative inventory measures and over the recent commodity price boom.
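
A minimal sketch of the paper's second test, volatility as a decreasing function of inventory, is given below. The data file, column names, and the interaction specification are illustrative assumptions, not the authors' actual design.

```python
# Hypothetical sketch: regress price volatility on inventory, with an
# interaction term to check whether the effect strengthens in backwardated
# markets. "commodity_panel.csv" and all column names are assumptions.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("commodity_panel.csv")          # assumed: volatility, inventory, backwardated
df["inv_x_back"] = df["inventory"] * df["backwardated"]

X = sm.add_constant(df[["inventory", "backwardated", "inv_x_back"]])
model = sm.OLS(df["volatility"], X).fit()
print(model.summary())  # a negative inventory coefficient matches the theory of storage
```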

Relevance:

40.00%

Abstract:

Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on these data, captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient c was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and c. For single-peak waveforms the scatterplot of c versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return c values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the c versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and a combined class of grass and ground, reached 95%. Finally, the backscattering coefficient c of single-peak waveforms was also used to derive reflectance values for the three classes. The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the wavelength of the Riegl scanner (1550 nm). The grass class reflectance (0.46) falls between the other two classes, as might be expected since this class mixes the reflectance properties of vegetation and ground.
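
As a concrete illustration of the decomposition step, the sketch below fits a sum of Gaussians to a synthetic two-return waveform with SciPy's trust-region-reflective solver (method="trf"), which matches the algorithm named in the abstract; the waveform, initial guesses, and bounds are assumptions.

```python
# Minimal sketch of Gaussian decomposition of a return waveform using a
# trust-region-reflective least-squares solver. The synthetic waveform and
# the initial peak guesses are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

def gauss_sum(t, params):
    """Sum of Gaussians; params is a flat array of (amplitude, center, width) triples."""
    out = np.zeros_like(t)
    for a, mu, sigma in params.reshape(-1, 3):
        out += a * np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    return out

t = np.linspace(0, 100, 200)                        # sample times (ns), assumed
waveform = (gauss_sum(t, np.array([1.0, 30, 3, 0.6, 55, 4]))
            + 0.02 * np.random.randn(t.size))       # synthetic two-return waveform

x0 = np.array([0.9, 28, 2.5, 0.5, 57, 3.5])         # initial guesses from a peak detector
fit = least_squares(lambda p: gauss_sum(t, p) - waveform, x0,
                    bounds=(0, np.inf), method="trf")   # trust-region-reflective
amps, centers, widths = fit.x.reshape(-1, 3).T
print(centers, widths)   # pulse position and width per detected return
```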

Relevance:

40.00%

Abstract:

The purpose of this paper is to explore how companies that hold carbon trading accounts under the European Union Emissions Trading Scheme (EU ETS) respond to climate change by using disclosures on carbon emissions as a means to generate legitimacy, compared to others. The study is based on disclosures made in annual reports and stand-alone sustainability reports of UK listed companies from 2001 to 2012. The study uses content analysis to capture both the quality and volume of the carbon disclosures. The results show a significant increase in both the quality and volume of the carbon disclosures after the launch of the EU ETS. Companies with carbon trading accounts provide more detailed disclosures than those without an account. We also find that company size is positively correlated with the disclosures, while the association with industry is inconclusive.

Relevance:

40.00%

Abstract:

We examine whether and under what circumstances World Bank and International Monetary Fund (IMF) programs affect the likelihood of major government crises. We find that crises are, on average, more likely as a consequence of World Bank programs. We also find that governments face an increasing risk of entering a crisis when they remain under an IMF or World Bank arrangement once the economy's performance improves. The international financial institution's (IFI) scapegoat function thus seems to lose its value when the need for financial support is less urgent. While the probability of a crisis increases when a government turns to the IFIs, programs inherited from preceding governments do not affect the probability of a crisis. This is in line with two interpretations: first, the conclusion of IFI programs can signal the government's incompetence; second, governments that inherit programs might be less likely to implement program conditions agreed to by their predecessors.
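
The abstract does not name the estimator, but studies of this kind typically fit a binary-response model of crisis onset; a hedged logit sketch on assumed variable names is shown below.

```python
# Hedged sketch of a binary-response model such studies commonly estimate:
# probability of a government crisis as a function of program participation.
# The data file and all variable names are assumptions for illustration.
import pandas as pd
import statsmodels.api as sm

panel = pd.read_csv("country_year_panel.csv")  # assumed: crisis, wb_program, imf_program, growth
X = sm.add_constant(panel[["wb_program", "imf_program", "growth"]])
logit = sm.Logit(panel["crisis"], X).fit()
print(logit.summary())  # a positive wb_program coefficient would mirror the paper's finding
```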

Relevance:

40.00%

Abstract:

Background: The validity of ensemble averaging of event-related potential (ERP) data has been questioned, due to its assumption that the ERP is identical across trials. Thus, there is a need for preliminary testing for cluster structure in the data. New method: We propose a complete pipeline for the cluster analysis of ERP data. To increase the signal-to-noise ratio (SNR) of the raw single trials, we used a denoising method based on Empirical Mode Decomposition (EMD). Next, we used a bootstrap-based method to determine the number of clusters, through a measure called the Stability Index (SI). We then used a clustering algorithm based on a Genetic Algorithm (GA) to define initial cluster centroids for subsequent k-means clustering. Finally, we visualised the clustering results through a scheme based on Principal Component Analysis (PCA). Results: After validating the pipeline on simulated data, we tested it on data from two experiments – a P300 speller paradigm on a single subject and a language processing study on 25 subjects. Results revealed evidence for the existence of 6 clusters in one experimental condition from the language processing study. Further, a two-way chi-square test revealed an influence of subject on cluster membership.
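
A compressed sketch of the pipeline's stages is given below using PyEMD and scikit-learn. The trial matrix is simulated, the number of clusters is fixed rather than chosen by the bootstrap Stability Index, and plain k-means++ seeding stands in for the paper's GA-based centroid initialisation.

```python
# Sketch of the pipeline's main stages on single-trial ERP data. All data
# and parameter choices here are illustrative assumptions.
import numpy as np
from PyEMD import EMD
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
trials = rng.standard_normal((100, 256))          # 100 single trials x 256 samples

# 1) EMD-based denoising: drop the first IMF, which carries the
#    highest-frequency noise.
emd = EMD()
denoised = np.array([emd(trial)[1:].sum(axis=0) for trial in trials])

# 2) Cluster the denoised trials (k fixed here; the paper selects it with a
#    bootstrap Stability Index, and seeds centroids with a GA).
km = KMeans(n_clusters=6, init="k-means++", n_init=10, random_state=0).fit(denoised)

# 3) Visualise cluster structure in the first two principal components.
coords = PCA(n_components=2).fit_transform(denoised)
print(km.labels_[:10], coords.shape)
```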

Relevance:

40.00%

Abstract:

Empirical Mode Decomposition is presented as an alternative to traditional analysis methods for decomposing geomagnetic time series into spectral components. Important comments on the algorithm and its variations are given. Using this technique, planetary wave modes of 5-, 10-, and 16-day mean periods can be extracted from magnetic field components of three different stations in Germany. In a second step, the amplitude modulation functions of these wave modes are shown to contain a significant contribution from solar cycle variation, through correlation with smoothed sunspot numbers. Additionally, the data indicate connections with geomagnetic jerk occurrences, supported by a second dataset providing the reconstructed near-Earth magnetic field for 150 years. Geomagnetic jerks are usually attributed to internal dynamo processes within the Earth's outer core, so the question of which phenomenon is influencing which is briefly discussed here.
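
A minimal sketch of the approach on a synthetic series: decompose with EMD, take the Hilbert envelope of a planetary-wave-band IMF as its amplitude modulation, and correlate it with a sunspot proxy. The series, the IMF index, and the proxy are assumptions.

```python
# Illustrative sketch: EMD of a geomagnetic-like series, then correlation of
# one IMF's amplitude envelope with a stand-in for smoothed sunspot numbers.
import numpy as np
from PyEMD import EMD
from scipy.signal import hilbert

rng = np.random.default_rng(1)
days = np.arange(3650)                                   # ten years of daily values
field = (np.sin(2 * np.pi * days / 10)                   # a 10-day wave mode
         * (1 + 0.5 * np.sin(2 * np.pi * days / 3652))   # slow amplitude modulation
         + 0.3 * rng.standard_normal(days.size))

imfs = EMD()(field)
envelope = np.abs(hilbert(imfs[1]))          # amplitude modulation of one IMF (index assumed)
sunspots = 1 + 0.5 * np.sin(2 * np.pi * days / 3652)     # sunspot-number stand-in
print(np.corrcoef(envelope, sunspots)[0, 1])
```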

Relevance:

30.00%

Abstract:

An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project, level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degree by 0.5 degree latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels in favour of nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion-compensated interpolation based on the estimation of motion vectors with a hierarchical block matching scheme. Examples are shown of the various stages in the process, along with examples of the usefulness of this type of data in GCM validation.
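
A toy sketch of the kernel-based gridding idea follows: scattered brightness temperatures are averaged onto a 0.5-degree grid with Gaussian weights in great-circle distance. The data, the kernel width, and the omission of the limb, zenith-angle, and temporal corrections are all simplifying assumptions.

```python
# Toy sketch of kernel-weighted gridding of scattered brightness temperatures
# onto a 0.5-degree grid, loosely in the spirit of spherical kernel estimators.
import numpy as np

def great_circle(lat1, lon1, lat2, lon2):
    """Angular separation in radians between points given in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlon = np.radians(lon2 - lon1)
    return np.arccos(np.clip(np.sin(p1) * np.sin(p2)
                             + np.cos(p1) * np.cos(p2) * np.cos(dlon), -1, 1))

rng = np.random.default_rng(2)
obs_lat, obs_lon = rng.uniform(-10, 10, 500), rng.uniform(0, 20, 500)
obs_tb = 280 + 5 * rng.standard_normal(500)          # synthetic brightness temperatures (K)

grid_lat, grid_lon = np.meshgrid(np.arange(-10, 10, 0.5), np.arange(0, 20, 0.5),
                                 indexing="ij")
sigma = np.radians(0.5)                              # kernel width, assumed
grid_tb = np.empty(grid_lat.shape)
for i in np.ndindex(grid_lat.shape):
    d = great_circle(grid_lat[i], grid_lon[i], obs_lat, obs_lon)
    w = np.exp(-0.5 * (d / sigma) ** 2)              # Gaussian kernel on the sphere
    grid_tb[i] = np.sum(w * obs_tb) / np.sum(w)
print(grid_tb.shape)
```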

Relevance:

30.00%

Abstract:

The complexity inherent in climate data makes it necessary to introduce more than one statistical tool to the researcher to gain insight into the climate system. Empirical orthogonal function (EOF) analysis is one of the most widely used methods to analyze weather/climate modes of variability and to reduce the dimensionality of the system. Simple structure rotation of EOFs can enhance the interpretability of the obtained patterns, but cannot provide anything more than temporal uncorrelatedness. In this paper, an alternative rotation method based on independent component analysis (ICA) is considered. The ICA is viewed here as a method of EOF rotation. Starting from an initial EOF solution, rather than rotating the loadings toward simplicity, ICA seeks a rotation matrix that maximizes the independence between the components in the time domain. If the underlying climate signals have independent forcings, one can expect to find loadings with interpretable patterns whose time coefficients have properties that go beyond the simple noncorrelation observed in EOFs. The methodology is presented and an application to the monthly mean sea level pressure (SLP) field is discussed. Among the rotated (to independence) EOFs, the North Atlantic Oscillation (NAO) pattern, an Arctic Oscillation–like pattern, and a Scandinavian-like pattern have been identified. There is the suggestion that the NAO is an intrinsic mode of variability independent of the Pacific.
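
A minimal sketch of ICA-as-EOF-rotation with scikit-learn: PCA supplies the EOFs, FastICA rotates the retained time coefficients toward independence, and the rotated spatial patterns are recovered by back-projection. The simulated field and the number of retained modes are assumptions.

```python
# Sketch: EOFs via PCA, then rotation of the retained components toward
# temporal independence with FastICA. The SLP-like field is simulated.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(3)
field = rng.standard_normal((600, 1000))        # 600 monthly maps x 1000 grid points

pca = PCA(n_components=10)
pcs = pca.fit_transform(field)                  # EOF time coefficients
ica = FastICA(n_components=10, max_iter=1000, random_state=0)
ics = ica.fit_transform(pcs)                    # rotated, temporally independent components

# Rotated spatial patterns: back-project the ICA mixing through the EOFs.
patterns = ica.mixing_.T @ pca.components_
print(ics.shape, patterns.shape)
```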

Relevance:

30.00%

Abstract:

Annual total phosphorus (TP) export data from 108 European micro-catchments were analyzed against descriptive catchment data on climate (runoff), soil types, catchment size, and land use. The best empirical model developed included runoff, proportion of agricultural land and catchment size as explanatory variables, but explained only a modest share of the variance in the dataset (R² = 0.37). Improved country-specific empirical models could be developed in some cases. The best example was from Norway, where an analysis of TP-export data from 12 predominantly agricultural micro-catchments revealed a relationship explaining 96% of the variance in TP export. The explanatory variables were in this case soil P status (P-AL), proportion of organic soil, and the export of suspended sediment. Another example is from Denmark, where an empirical model was established for the basic annual average TP export from 24 catchments with percentage sandy soils, percentage organic soils, runoff, and application of phosphorus in fertilizer and animal manure as explanatory variables (R² = 0.97).
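
The kind of multiple regression behind such models can be sketched as below; the file name, column names, and the log transform of catchment size are assumptions.

```python
# Hedged sketch of an empirical TP-export regression: export against runoff,
# agricultural share, and catchment size. All names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

catchments = pd.read_csv("tp_export_catchments.csv")   # assumed: tp_export, runoff, agri_share, area_km2
fit = smf.ols("tp_export ~ runoff + agri_share + np.log(area_km2)",
              data=catchments).fit()
print(fit.rsquared)   # the paper reports R² = 0.37 for its pan-European model of this kind
```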

Relevance:

30.00%

Abstract:

Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in the southern and central UK by increasing pressures on water resources, which may be further enhanced by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish the relative contributions of multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land use data, as a function of surface/near-surface runoff generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near-surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second model, which is simpler, simulates P concentrations as a function of surface/near-surface runoff, thus emphasising the influence of non-point source loads during flow peaks and the mixing of baseflow and point sources during low flows. The temporal consistency of parameter estimates, and thus the suitability of each approach, is assessed dynamically following a new approach based on Monte Carlo analysis.
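
A toy sketch of the stochastic evaluation idea: a two-source concentration model (surface/near-surface runoff versus baseflow plus point sources) run under Monte Carlo-sampled parameters, ranking parameter sets by fit. All series and parameter ranges are invented for illustration.

```python
# Toy Monte Carlo evaluation of a simple two-source in-stream P model.
# Observed and driving series are synthetic; parameter ranges are assumed.
import numpy as np

rng = np.random.default_rng(4)
runoff = np.abs(rng.standard_normal(365))            # surface/near-surface runoff driver
baseflow = 0.5 + 0.1 * rng.standard_normal(365)      # baseflow driver
observed_tp = 0.08 * runoff + 0.02 * baseflow + 0.01 * rng.standard_normal(365)

best = []
for _ in range(10_000):
    c_runoff = rng.uniform(0.0, 0.2)                 # P concentration in runoff (assumed range)
    c_base = rng.uniform(0.0, 0.1)                   # baseflow + point-source concentration
    simulated = c_runoff * runoff + c_base * baseflow
    rmse = np.sqrt(np.mean((simulated - observed_tp) ** 2))
    best.append((rmse, c_runoff, c_base))
best.sort()
print(best[:3])                                      # best-fitting parameter sets
```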

Relevance:

30.00%

Abstract:

The aim of the study was to establish and verify a predictive vegetation model for plant community distribution in the alti-Mediterranean zone of the Lefka Ori massif, western Crete. Based on previous work, three variables were identified as significant determinants of plant community distribution, namely altitude, slope angle and geomorphic landform. The response of four community types to these variables was tested using classification tree analysis in order to model community type occurrence. V-fold cross-validation plots were used to determine the size of the best-fitting tree. The final nine-node tree selected correctly classified 92.5% of the samples. The results were used to provide decision rules for the construction of a spatial model for each community type. The model was implemented within a Geographical Information System (GIS) to predict the distribution of each community type in the study site. Evaluation of the model in the field using an error matrix gave an overall accuracy of 71%. The user's accuracy was higher for the Crepis-Cirsium (100%) and Telephium-Herniaria community types (66.7%) and relatively lower for the Peucedanum-Alyssum and Dianthus-Lomelosia community types (63.2% and 62.5%, respectively). Misclassification and field validation point to the need for improved geomorphological mapping and suggest the presence of transitional communities between existing community types.
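
The classification-tree step can be sketched as follows with scikit-learn; cross-validation stands in for the paper's V-fold tree-size selection, max_leaf_nodes=9 loosely mirrors the nine-node tree, and the samples are synthetic.

```python
# Sketch of the classification-tree step: predict community type from
# altitude, slope angle, and landform. All samples here are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
X = np.column_stack([rng.uniform(1800, 2400, 300),   # altitude (m)
                     rng.uniform(0, 40, 300),        # slope angle (degrees)
                     rng.integers(0, 5, 300)])       # geomorphic landform class
y = rng.integers(0, 4, 300)                          # four community types

tree = DecisionTreeClassifier(max_leaf_nodes=9, random_state=0)  # ~ nine-node tree
print(cross_val_score(tree, X, y, cv=10).mean())     # cross-validated accuracy
```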

Relevance:

30.00%

Abstract:

Empirical orthogonal function (EOF) analysis is a powerful tool for data compression and dimensionality reduction used broadly in meteorology and oceanography. Often in the literature, EOF modes are interpreted individually, independent of other modes. In fact, it can be shown that no such attribution can generally be made. This review demonstrates that in general individual EOF modes (i) will not correspond to individual dynamical modes, (ii) will not correspond to individual kinematic degrees of freedom, (iii) will not be statistically independent of other EOF modes, and (iv) will be strongly influenced by the nonlocal requirement that modes maximize variance over the entire domain. The goal of this review is not to argue against the use of EOF analysis in meteorology and oceanography; rather, it is to demonstrate the care that must be taken in the interpretation of individual modes in order to distinguish the medium from the message.
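
To make the objects under discussion concrete, the sketch below computes EOFs via the singular value decomposition of a synthetic anomaly matrix; the ordering by explained variance over the whole domain is exactly the nonlocal variance-maximizing property the review cautions about.

```python
# Minimal EOF computation via SVD of a time-by-space anomaly matrix.
# The data matrix is synthetic; only the mechanics are illustrated.
import numpy as np

rng = np.random.default_rng(6)
data = rng.standard_normal((240, 500))          # time x space anomalies (synthetic)
data -= data.mean(axis=0)                       # remove the time mean

U, s, Vt = np.linalg.svd(data, full_matrices=False)
eofs = Vt                                       # spatial patterns (one per row)
pcs = U * s                                     # corresponding time coefficients
explained = s**2 / np.sum(s**2)
print(explained[:5])                            # variance fraction per mode
```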

Relevance:

30.00%

Abstract:

Space weather effects on technological systems originate with energy carried from the Sun to the terrestrial environment by the solar wind. In this study, we present results of modeling of solar corona-heliosphere processes to predict solar wind conditions at the L1 Lagrangian point upstream of Earth. In particular, we calculate performance metrics for (1) empirical, (2) hybrid empirical/physics-based, and (3) full physics-based coupled corona-heliosphere models over an 8-year period (1995–2002). L1 measurements of the radial solar wind speed are the primary basis for validation of the coronal and heliosphere models studied, though other solar wind parameters are also considered. The models are from the Center for Integrated Space-Weather Modeling (CISM), which has developed a coupled model of the whole Sun-to-Earth system, from the solar photosphere to the terrestrial thermosphere. Simple point-by-point analysis techniques, such as mean-square error and correlation coefficients, indicate that the empirical coronal-heliosphere model currently gives the best forecast of solar wind speed at 1 AU. A more detailed analysis shows that errors in the physics-based models are predominantly the result of small timing offsets to solar wind structures and that the large-scale features of the solar wind are actually well modeled. We suggest that additional “tuning” of the coupling between the coronal and heliosphere models could lead to a significant improvement of their accuracy. Furthermore, we note that the physics-based models accurately capture dynamic effects at solar wind stream interaction regions, such as magnetic field compression, flow deflection, and density buildup, which the empirical scheme cannot.
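
The point-by-point metrics used in this comparison are straightforward to reproduce; the sketch below computes mean-square error and the correlation coefficient for two synthetic speed series, with a small timing offset built in to echo the error source the paper identifies.

```python
# Sketch of point-by-point skill metrics for solar wind speed validation.
# Both series are synthetic stand-ins for modeled and observed speed at L1.
import numpy as np

rng = np.random.default_rng(7)
observed = 400 + 100 * np.abs(np.sin(np.arange(1000) / 50)) \
           + 20 * rng.standard_normal(1000)
modeled = np.roll(observed, 5) + 15 * rng.standard_normal(1000)  # small timing offset

mse = np.mean((modeled - observed) ** 2)
corr = np.corrcoef(modeled, observed)[0, 1]
print(f"MSE = {mse:.1f} (km/s)^2, r = {corr:.2f}")
```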

Relevance:

30.00%

Abstract:

Nitrogen oxide biogenic emissions from soils are driven by soil and environmental parameters, and the relationship between these parameters and NO fluxes is highly non-linear. A new algorithm, based on a neural network calculation, is used to reproduce the NO biogenic emissions linked to precipitation in the Sahel on 6 August 2006 during the AMMA campaign. This algorithm has been coupled into the surface scheme of a coupled chemistry-dynamics model (MesoNH Chemistry) to estimate the impact of the NO emissions on NOx and O3 formation in the lower troposphere for this particular episode. Four different simulations on the same domain and at the same period are compared: one with anthropogenic emissions only, one with soil NO emissions from a static inventory at low temporal and spatial resolution, one with NO emissions from the neural network, and one with NO from the neural network plus lightning NOx. The influence of NOx from lightning is limited to the upper troposphere. The NO emission from soils calculated with the neural network responds to changes in soil moisture, giving enhanced emissions over the wetted soil, as observed by aircraft measurements after the passing of a convective system. The subsequent enhancement of NOx and ozone is limited to the lowest layers of the atmosphere in the model, whereas measurements show higher concentrations above 1000 m. The neural network algorithm, applied in the Sahel region for one particular day of the wet season, allows an immediate response of fluxes to environmental parameters, unlike static emission inventories. Stewart et al. (2008) is a companion paper to this one: it looks at NOx and ozone concentrations in the boundary layer as measured on a research aircraft, examines how they vary with respect to soil moisture, as indicated by surface temperature anomalies, and deduces NOx fluxes. In the current paper the model-derived results are compared to the observations and calculated fluxes presented by Stewart et al. (2008).
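
A hedged sketch of a neural-network emission algorithm of this general kind is shown below: a small multilayer perceptron mapping assumed soil and environmental drivers to an NO flux with a pulse-like response to wetting. The architecture, drivers, and training data are all illustrative assumptions, not the authors' network.

```python
# Illustrative neural-network emission sketch: MLP regression from assumed
# soil/environment drivers to an NO flux. Training data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
# assumed drivers: soil moisture, soil temperature, sand fraction, pH
X = rng.uniform(0, 1, (2000, 4))
no_flux = np.exp(-((X[:, 0] - 0.3) ** 2) / 0.02) * (1 + X[:, 1])  # pulse response to wetting
no_flux += 0.05 * rng.standard_normal(2000)

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(X, no_flux)
print(net.predict(X[:5]))               # NO flux estimates for given conditions
```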

Relevance:

30.00%

Abstract:

Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated with its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there exists little comparison between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a one-dimensional space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2-D network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results for the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
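
A self-contained sketch of the mapping under study: quantize a node's 2-D landmark latency vector onto a power-of-two grid and convert the cell to its Hilbert-curve distance with the classic xy2d algorithm, so nearby latency vectors tend to receive nearby scalar identifiers. The grid order and the latency range are assumptions.

```python
# Hilbert-curve index from a 2-D landmark latency vector (illustrative).
def hilbert_index(side, x, y):
    """Map integer cell (x, y) on a side x side grid (side a power of two)
    to its distance along the Hilbert curve (classic xy2d algorithm)."""
    d = 0
    s = side // 2
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = side - 1 - x, side - 1 - y
            x, y = y, x
        s //= 2
    return d

def peer_id(latencies_ms, order=8, max_ms=500.0):
    """Scalar identifier from a 2-landmark latency vector (parameters assumed)."""
    side = 2 ** order
    cells = [min(side - 1, int(side * v / max_ms)) for v in latencies_ms]
    return hilbert_index(side, *cells)

print(peer_id([12.0, 48.0]), peer_id([13.0, 50.0]))  # nearby vectors -> typically nearby indices
```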