960 results for stock mixture analysis
Abstract:
The transition to a low-carbon economy urgently demands better information on the drivers of energy consumption. UK government policy has prioritized energy efficiency in the built stock as a means of carbon reduction, but the sector is historically information poor, particularly the non-domestic building stock. This paper presents the results of a pilot study that investigated whether and how property and energy consumption data might be combined for non-domestic energy analysis. These data were combined in a ‘Non-Domestic Energy Efficiency Database’ to describe the location and physical attributes of each property and its energy consumption. The aim was to support the generation of a range of energy-efficiency statistics for the industrial, commercial and institutional sectors of the non-domestic building stock, and to provide robust evidence for national energy-efficiency and carbon-reduction policy development and monitoring. The work has brought together non-domestic energy data, property data and mapping in a ‘data framework’ for the first time. The results show what is possible when these data are integrated and the associated difficulties. A data framework offers the potential to inform energy-efficiency policy formation and to support its monitoring at a level of detail not previously possible.
Abstract:
The rapid growth of non-listed real estate funds over the last several years has contributed towards establishing this sector as a major investment vehicle for gaining exposure to commercial real estate. Academic research has not kept up with this development, however, as there are still only a few published studies on non-listed real estate funds. This paper aims to identify the factors driving the total return over a seven-year period. Influential factors tested in our analysis include the weighted underlying direct property returns in each country and sector as well as fund size, investment style, gearing and the distribution yield. Furthermore, we analyze the interaction of non-listed real estate funds with the performance of the overall economy and that of competing asset classes, and find that lagged GDP growth and stock market returns as well as contemporaneous government bond rates are significant and positive predictors of annual fund performance.
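The predictive regression described in this abstract can be sketched as follows. All series, coefficients and sample sizes below are simulated placeholders, not the paper's data; the point is only the structure of the model (lagged GDP growth, lagged stock returns, contemporaneous bond rates):

```python
# Sketch of a predictive regression for annual fund returns, assuming
# simulated series; the true coefficients here are invented for illustration.
import numpy as np

rng = np.random.default_rng(6)
T = 200
gdp = rng.normal(0.02, 0.01, T)      # annual GDP growth
stocks = rng.normal(0.06, 0.15, T)   # annual stock market returns
bonds = rng.normal(0.04, 0.01, T)    # government bond rates

# Fund return depends on last year's GDP growth and stock returns and on
# this year's bond rate (the pattern the abstract reports), plus noise.
ret = 1.5 * gdp[:-1] + 0.3 * stocks[:-1] + 0.8 * bonds[1:] \
      + rng.normal(0, 0.02, T - 1)

# OLS with intercept: regressors are lagged GDP, lagged stocks, current bonds.
X = np.column_stack([np.ones(T - 1), gdp[:-1], stocks[:-1], bonds[1:]])
coef = np.linalg.lstsq(X, ret, rcond=None)[0]
print("estimated coefficients:", np.round(coef[1:], 2))
```

With enough observations the estimated coefficients recover the simulated ones, illustrating how lagged macro variables enter the specification.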
Abstract:
By eliminating the short range negative divergence of the Debye–Hückel pair distribution function, but retaining the exponential charge screening known to operate at large interparticle separation, the thermodynamic properties of one-component plasmas of point ions or charged hard spheres can be well represented even in the strong coupling regime. Predicted electrostatic free energies agree within 5% of simulation data for typical Coulomb interactions up to a factor of 10 times the average kinetic energy. Here, this idea is extended to the general case of a uniform ionic mixture, comprising an arbitrary number of components, embedded in a rigid neutralizing background. The new theory is implemented in two ways: (i) by an unambiguous iterative algorithm that requires numerical methods and breaks the symmetry of cross correlation functions; and (ii) by invoking generalized matrix inverses that maintain symmetry and yield completely analytic solutions, but which are not uniquely determined. The extreme computational simplicity of the theory is attractive when considering applications to complex inhomogeneous fluids of charged particles.
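The abstract's remark that generalized matrix inverses "yield completely analytic solutions, but which are not uniquely determined" can be illustrated generically: for a singular symmetric system, the Moore-Penrose pseudoinverse picks out one (minimum-norm) solution, but adding any null-space vector gives another valid solution. The matrix below is an arbitrary example, unrelated to the plasma model itself:

```python
# Non-uniqueness of generalized-inverse solutions for a singular system.
import numpy as np

# Singular symmetric matrix (rows sum to zero, so (1,1,1) is a null vector).
A = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
b = A @ np.array([1.0, 0.0, -1.0])   # right-hand side guaranteed in range(A)

# Moore-Penrose pseudoinverse: the minimum-norm solution of A x = b.
x_mp = np.linalg.pinv(A) @ b
assert np.allclose(A @ x_mp, b)

# Adding any multiple of the null vector (1,1,1) also solves A x = b,
# so the generalized-inverse solution is not uniquely determined.
x_alt = x_mp + 0.7 * np.ones(3)
assert np.allclose(A @ x_alt, b)
print("pseudoinverse solution:", np.round(x_mp, 3))
```

Here the pseudoinverse solution is the member of the solution family orthogonal to the null space, which is one natural way to restore a symmetric, analytic answer.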
Abstract:
Background: Microarray based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpretation of bacterial transcriptomics data and CGH microarray data for looking at genetic stability in oncogenes, there are none specifically designed to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process that may be automated in the future to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both the established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
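The kernel-density-estimator approach to choosing a presence/absence cut-off can be sketched as follows. The simulated log-ratios, the bandwidth choice and the search window are illustrative assumptions, not values from the study; the idea is simply to place the cut-off at the valley between the "present" and "absent/divergent" modes of the estimated density:

```python
# Kernel-density-based cut-off between gene presence and absence, sketched
# on simulated hybridisation log-ratios (assumed bimodal).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Simulated log-ratios: "present" genes near 0, "absent/divergent" near -2.
log_ratios = np.concatenate([rng.normal(0.0, 0.3, 800),
                             rng.normal(-2.0, 0.4, 200)])

kde = gaussian_kde(log_ratios)
grid = np.linspace(log_ratios.min(), log_ratios.max(), 512)
density = kde(grid)

# Cut-off: the deepest valley of the density between the two simulated modes.
interior = (grid > -2.0) & (grid < 0.0)
cutoff = grid[interior][np.argmin(density[interior])]

present = log_ratios >= cutoff
print(f"cut-off at log-ratio {cutoff:.2f}, "
      f"{present.sum()} genes called present")
```

Because the cut-off adapts to the shape of each array's density, the same code handles arrays with different mode positions, which is the "dynamic" property the abstract highlights.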
Abstract:
‘Sustainable’ or ‘green’ commercial buildings are frequently seen as a growth sector in the property investment market. This research examines the emergence of sustainable commercial buildings in both the UK and overseas. The empirical part of the paper is based on a telephone survey of 50 UK corporate (private sector) occupiers taking leased and owner-occupied office space, which was carried out during the period of April to November 2008. The survey focused on actual moves made within the previous two years, or moves that were imminent during 2006-2008. The research suggests that although there is an emerging and increasing demand for sustainable offices in the UK, other factors such as location and availability of stock continue to remain more important than sustainability in determining occupiers’ final choice of office. Occupiers who moved to a Building Research Establishment Environmental Assessment Method (BREEAM)-rated building, and were in business sectors with strong environmental and corporate responsibility policies, placed more emphasis on sustainability than other groups in the final choice of office, but location and availability remained paramount.
Abstract:
We discuss the modeling of dielectric responses of electromagnetically excited networks which are composed of a mixture of capacitors and resistors. Such networks can be employed as lumped-parameter circuits to model the response of composite materials containing conductive and insulating grains. The dynamics of the excited network systems are studied using a state space model derived from a randomized incidence matrix. Time and frequency domain responses from synthetic data sets generated from state space models are analyzed for the purpose of estimating the fraction of capacitors in the network. Good results were obtained by using either the time-domain response to a pulse excitation or impedance data at selected frequencies. A chemometric framework based on a Successive Projections Algorithm (SPA) enables the construction of multiple linear regression (MLR) models which can efficiently determine the ratio of conductive to insulating components in composite material samples. The proposed method avoids restrictions commonly associated with Archie’s law, the application of percolation theory or Kohlrausch-Williams-Watts models and is applicable to experimental results generated by either time domain transient spectrometers or continuous-wave instruments. Furthermore, it is quite generic and applicable to tomography, acoustics as well as other spectroscopies such as nuclear magnetic resonance, electron paramagnetic resonance and, therefore, should be of general interest across the dielectrics community.
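The regression idea in this abstract, predicting the capacitor fraction from responses at a few selected frequencies, can be sketched as below. The "network response" here is a toy single-pole model, not the paper's randomized incidence-matrix state-space simulation, and the chosen frequencies and noise level are assumptions:

```python
# Toy sketch: multiple linear regression from multi-frequency responses to
# the capacitor fraction of a capacitor/resistor network.
import numpy as np

rng = np.random.default_rng(1)
freqs = np.array([1e2, 1e3, 1e4, 1e5])  # selected frequencies (Hz), assumed

def toy_response(frac_c, f):
    # Toy lumped model: the corner frequency shifts with capacitor fraction.
    tau = 1e-4 * (0.1 + frac_c)
    return 1.0 / np.sqrt(1.0 + (2 * np.pi * f * tau) ** 2)

# Training set: known capacitor fractions with noisy multi-frequency responses.
fractions = rng.uniform(0.1, 0.9, 200)
X = np.array([toy_response(fc, freqs) for fc in fractions])
X += rng.normal(0, 0.005, X.shape)

# Multiple linear regression (least squares with intercept).
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, fractions, rcond=None)

# Predict the fraction for a fresh, noiseless sample.
true_frac = 0.4
pred = np.append(toy_response(true_frac, freqs), 1.0) @ coef
print(f"true fraction 0.40, predicted {pred:.2f}")
```

In the paper a variable-selection step (SPA) would first choose which frequencies enter the regression; here the four frequencies are simply fixed by hand.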
Abstract:
Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on these data, captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient c was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and c. For single-peak waveforms the scatterplot of c versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return c values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the c versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and a combined class of grass and ground, reached 95%. Finally, the backscattering coefficient c of single-peak waveforms was also used to derive reflectance values of the three classes.
The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the wavelength of the Riegl scanner (1550 nm). The grass class reflectance (0.46) falls in between the other two classes as might be expected, as this class has a mixture of the contributions of both vegetation and ground reflectance properties.
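The Gaussian-decomposition step described in this abstract can be sketched with SciPy, whose least-squares fitter exposes a trust-region-reflective solver (`method='trf'`). The synthetic waveform, sampling grid and initial guesses below are illustrative, not Riegl data:

```python
# Fit a Gaussian to one simulated waveform peak with a
# trust-region-reflective least-squares solver.
import numpy as np
from scipy.optimize import curve_fit

def gauss(t, a, mu, sigma):
    return a * np.exp(-0.5 * ((t - mu) / sigma) ** 2)

t = np.arange(0, 60, 1.0)  # sample times (ns), assumed
rng = np.random.default_rng(2)
waveform = gauss(t, 120.0, 25.0, 3.0) + rng.normal(0, 1.0, t.size)

# Initial guess from the raw peak; 'trf' = trust-region-reflective.
i0 = int(np.argmax(waveform))
p0 = [waveform[i0], t[i0], 2.0]
popt, _ = curve_fit(gauss, t, waveform, p0=p0, method='trf',
                    bounds=([0, 0, 0.1], [np.inf, 60, 20]))

amp, mu, sigma = popt
pulse_width = 2.3548 * sigma  # FWHM: one of the classification parameters
print(f"amplitude {amp:.1f}, position {mu:.2f}, FWHM {pulse_width:.2f}")
```

The fitted amplitude and pulse width are the per-peak parameters that, after calibration, feed the decision-tree classification the abstract describes.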
Abstract:
We generalize the popular ensemble Kalman filter to an ensemble transform filter, in which the prior distribution can take the form of a Gaussian mixture or a Gaussian kernel density estimator. The design of the filter is based on a continuous formulation of the Bayesian filter analysis step. We call the new filter algorithm the ensemble Gaussian-mixture filter (EGMF). The EGMF is implemented for three simple test problems (Brownian dynamics in one dimension, Langevin dynamics in two dimensions and the three-dimensional Lorenz-63 model). It is demonstrated that the EGMF is capable of tracking systems with non-Gaussian uni- and multimodal ensemble distributions. Copyright © 2011 Royal Meteorological Society
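The building block behind a Gaussian-mixture filter can be shown in one dimension: with a Gaussian-mixture prior and a Gaussian observation likelihood, the posterior is again a Gaussian mixture, with each component Kalman-updated and the weights rescaled by each component's predictive likelihood. The numbers below are illustrative, not from the paper, and this is the plain mixture analysis step rather than the EGMF's continuous transform formulation:

```python
# 1-D Gaussian-mixture analysis step (the idea a Gaussian-mixture filter
# generalizes): per-component Kalman update plus weight reweighting.
import numpy as np

# Prior mixture: weights, means, variances (bimodal, as in the test problems).
w = np.array([0.5, 0.5])
m = np.array([-1.0, 1.0])
v = np.array([0.25, 0.25])

y, r = 0.8, 0.5  # observation and observation-error variance

# Per-component Kalman update.
k = v / (v + r)                    # Kalman gains
m_post = m + k * (y - m)           # posterior means
v_post = (1 - k) * v               # posterior variances

# Weight update: likelihood of y under each component's predictive density.
pred_var = v + r
lik = np.exp(-0.5 * (y - m) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
w_post = w * lik / np.sum(w * lik)

print("posterior weights:", np.round(w_post, 3))
print("posterior mean:", np.round(np.sum(w_post * m_post), 3))
```

The observation near the right-hand mode shifts nearly all the weight onto that component, which is how a mixture filter tracks multimodal ensemble distributions.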
Abstract:
This paper examines the impact of changes in the composition of real estate stock indices, considering companies both joining and leaving the indices. Stocks that are newly included not only see a short-term increase in their share price, but trading volumes increase in a permanent fashion following the event. This highlights the importance of indices in not only a benchmarking context but also in enhancing investor awareness and aiding liquidity. By contrast, as anticipated, the share prices of firms removed from indices fall around the time of the index change. The fact that the changes in share prices, either upwards for index inclusions or downwards for deletions, are generally not reversed, would indicate that the movements are not purely due to price pressure, but rather are more consistent with the information content hypothesis. There is no evidence, however, that index changes significantly affect the volatility of price changes or their operating performances as measured by their earnings per share.
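The event-study logic behind these findings can be sketched as follows: estimate a market model on a pre-event window, then cumulate abnormal returns around the index change to see whether the price move reverses. All returns below are simulated, with an assumed permanent 2% jump at inclusion:

```python
# Market-model event study around a simulated index-inclusion event.
import numpy as np

rng = np.random.default_rng(3)
n_est, n_event = 120, 21          # estimation window, event window (days)
market = rng.normal(0.0004, 0.01, n_est + n_event)

# Simulated stock: beta 1.1 plus a permanent +2% jump at inclusion (day 0).
beta_true = 1.1
stock = beta_true * market + rng.normal(0, 0.002, market.size)
event_day = n_est + n_event // 2
stock[event_day] += 0.02          # inclusion effect, not reversed afterwards

# Market-model estimation on the pre-event window only.
X = np.column_stack([np.ones(n_est), market[:n_est]])
alpha, beta = np.linalg.lstsq(X, stock[:n_est], rcond=None)[0]

# Abnormal returns and cumulative abnormal return over the event window.
ar = stock[n_est:] - (alpha + beta * market[n_est:])
car = np.cumsum(ar)
print(f"beta {beta:.2f}, CAR at window end {car[-1]:.3f}")
```

A CAR that stays elevated after the event day, as here by construction, is the pattern the paper interprets as information content rather than temporary price pressure.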
Abstract:
The present paper explores, both theoretically and empirically, whether compliance with the International Code of Marketing of Breast-milk Substitutes affects financial performance as measured by stock markets. The empirical analysis, which considers a 20-year period, shows that stock markets are indifferent to the level of compliance by manufacturers with the International Code. Two important issues emerge from this result. The first, based on our finding that financial performance as measured by stock markets cannot explain the level of compliance, concerns what alternative mechanisms drive the manufacturers who comply least with voluntary codes such as the International Code. The second, following from our finding that stock markets do not reward the most compliant, is an inherent weakness of stock markets in fully incorporating social and environmental values.
Abstract:
We assess the robustness of previous findings on the determinants of terrorism. Using extreme bound analysis, the three most comprehensive terrorism datasets, and focusing on the three most commonly analyzed aspects of terrorist activity, i.e., location, victim, and perpetrator, we re-assess the effect of 65 proposed correlates. Evaluating around 13.4 million regressions, we find 18 variables to be robustly associated with the number of incidents occurring in a given country-year, 15 variables with attacks against citizens from a particular country in a given year, and six variables with attacks perpetrated by citizens of a particular country in a given year.
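The extreme bounds analysis (EBA) used above can be sketched in miniature: regress the outcome on a focus variable plus every combination of control variables, record the extreme coefficient bounds across specifications, and call the focus variable robust if the bounds share one sign. Everything below is simulated; a real EBA over 65 correlates runs millions of such regressions:

```python
# Toy extreme bounds analysis over all subsets of four control variables.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
n = 500
focus = rng.normal(size=n)
controls = rng.normal(size=(n, 4))
# Outcome truly depends on the focus variable and two of the controls.
y = 0.5 * focus + controls[:, 0] - 0.3 * controls[:, 2] + rng.normal(size=n)

coefs = []
for k in range(controls.shape[1] + 1):
    for subset in combinations(range(controls.shape[1]), k):
        X = np.column_stack([np.ones(n), focus, controls[:, list(subset)]])
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        coefs.append(b[1])               # coefficient on the focus variable

lo_b, hi_b = min(coefs), max(coefs)
robust = (lo_b > 0) or (hi_b < 0)        # extreme bounds share one sign
print(f"bounds [{lo_b:.3f}, {hi_b:.3f}], robust: {robust}")
```

Because the focus variable genuinely drives the simulated outcome, its coefficient keeps the same sign no matter which controls are included, so EBA labels it robust.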
Abstract:
This paper introduces a new agent-based model, which incorporates the actions of individual homeowners in a long-term domestic stock model, and details how it was applied in energy policy analysis. The results indicate that current policies are likely to fall significantly short of the 80% target and suggest that current subsidy levels need re-examining. In the model, current subsidy levels appear to offer too much support to some technologies, which in turn leads to the suppression of other technologies that have a greater energy saving potential. The model can be used by policy makers to develop further scenarios to find alternative, more effective, sets of policy measures. The model is currently limited to the owner-occupied stock in England, although it can be expanded, subject to the availability of data.
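The agent-based mechanism this abstract describes can be caricatured in a few lines: each homeowner agent adopts an efficiency measure only if the subsidy brings the payback period below that agent's individual tolerance. All parameters below (costs, savings, threshold distribution) are invented for illustration, not taken from the model:

```python
# Toy agent-based sketch: heterogeneous homeowners respond to a subsidy.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
cost, annual_saving = 3000.0, 300.0          # per-dwelling measure, assumed
payback_threshold = rng.uniform(3, 15, n)    # years each owner will tolerate

def adoption_rate(subsidy):
    # An agent adopts if the subsidised payback is within their tolerance.
    payback = (cost - subsidy) / annual_saving
    return float(np.mean(payback <= payback_threshold))

for s in (0.0, 1000.0, 2000.0):
    print(f"subsidy {s:5.0f}: {adoption_rate(s):.1%} adopt")
```

Even this caricature shows the policy-relevant behaviour: the aggregate uptake curve is nonlinear in the subsidy level because agents' thresholds are heterogeneous, which is why mis-set subsidies can over-support one technology while suppressing another.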
Abstract:
The analysis step of the (ensemble) Kalman filter is optimal when (1) the distribution of the background is Gaussian, (2) state variables and observations are related via a linear operator, and (3) the observational error is of additive nature and has Gaussian distribution. When these conditions are largely violated, a pre-processing step known as Gaussian anamorphosis (GA) can be applied. The objective of this procedure is to obtain state variables and observations that better fulfil the Gaussianity conditions in some sense. In this work we analyse GA from a joint perspective, paying attention to the effects of transformations in the joint state-variable/observation space. First, we study transformations for state variables and observations that are independent of each other. Then, we introduce a targeted joint transformation with the objective of obtaining joint Gaussianity in the transformed space. We focus primarily on the univariate case, and comment briefly on the multivariate one. A key point of this paper is that, when (1)-(3) are violated, the analysis step of the EnKF will not recover the exact posterior density regardless of the transformations one may perform. These transformations, however, provide approximations of different quality to the Bayesian solution of the problem. Using an example in which the Bayesian posterior can be analytically computed, we assess the quality of the analysis distributions generated after applying the EnKF analysis step in conjunction with different GA options. The value of the targeted joint transformation is particularly clear for the case when the prior is Gaussian, the marginal density for the observations is close to Gaussian, and the likelihood is a Gaussian mixture.
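A minimal univariate Gaussian anamorphosis can be sketched with a rank (empirical-CDF) transform: map each sample through its empirical CDF and then through the standard-normal quantile function, so the transformed variable better satisfies the EnKF's Gaussianity assumption. The exponential sample below is an illustrative stand-in for a non-Gaussian prior:

```python
# Univariate Gaussian anamorphosis via the empirical CDF (rank transform).
import numpy as np
from scipy.stats import norm, skew

rng = np.random.default_rng(5)
x = rng.exponential(1.0, 2000)        # strongly non-Gaussian prior sample

# Rank transform to uniform scores in (0, 1), then the standard-normal
# quantile function: the anamorphosed sample is approximately N(0, 1).
ranks = np.argsort(np.argsort(x))
u = (ranks + 0.5) / x.size
z = norm.ppf(u)

print(f"skewness before {skew(x):.2f}, after {skew(z):.2f}")
```

This is the "independent" (marginal) transformation of the abstract's first case; the paper's targeted joint transformation additionally accounts for the coupling between state variables and observations.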
Abstract:
Evidence suggests that rational, periodically collapsing speculative bubbles may be pervasive in stock markets globally, but there is no research that considers them at the individual stock level. In this study we develop and test an empirical asset pricing model that allows for speculative bubbles to affect stock returns. We show that stocks incorporating larger bubbles yield higher returns. The bubble deviation, at the stock level as opposed to the industry or market level, is a priced source of risk that is separate from the standard market risk, size and value factors. We demonstrate that much of the common variation in stock returns that can be attributable to market risk is due to the co-movement of bubbles rather than being driven by fundamentals.
Abstract:
Cyber warfare is an increasingly important emerging phenomenon in international relations. The focus of this edited volume is on this notion of cyber warfare, meaning interstate cyber aggression, as distinct from cyber-terrorism or cyber-crime. Waging warfare in cyberspace has the capacity to be as devastating as any conventional means of conducting armed conflict. However, while there is a growing amount of literature on the subject within disciplines, there has been very little work done on cyber warfare across disciplines, which necessarily limits our understanding of it. This book is a major multidisciplinary analysis of cyber warfare, featuring contributions by world-leading experts from a mixture of academic and professional backgrounds.