960 results for Real Electricity Markets Data
Abstract:
The content of this paper is a snapshot of a current project aimed at producing a real-time sensor-based building assessment tool, and a system that personalises work-spaces using multi-agent technology. Both systems derive physical-environment information from a wireless sensor network that allows clients to subscribe to real-time sensed data. The principal motivations behind this project are energy efficiency and the well-being of occupants, in the context of leveraging the current state-of-the-art in agent technology, wireless sensor networks and building assessment systems to enable the optimisation and assessment of buildings. Participants in this project are drawn from both industry (construction and research) and academia.
Abstract:
Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about ‘who phoned who’ or ‘who came into contact with who’ arises naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time’s arrow is captured naturally through the non-commutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
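The walk-based construction described in this abstract can be sketched as a product of matrix resolvents, one per time step; the attenuation parameter a and the toy morning/afternoon networks below are illustrative assumptions, not data from the paper.

```python
import numpy as np

def dynamic_communicability(adj_seq, a=0.1):
    """Q = (I - a*A1)^-1 (I - a*A2)^-1 ...  (valid when a < 1/rho(Ak));
    ordering matters, which is how time's arrow enters."""
    n = adj_seq[0].shape[0]
    Q = np.eye(n)
    for A in adj_seq:
        Q = Q @ np.linalg.inv(np.eye(n) - a * A)
    return Q

# Morning: A-B interact; afternoon: B-C interact (nodes 0, 1, 2).
A1 = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
A2 = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], dtype=float)
Q = dynamic_communicability([A1, A2])

broadcast = Q.sum(axis=1)  # row sums: ability to send information onward
receive   = Q.sum(axis=0)  # column sums: ability to receive information

# Time's arrow: information can flow A -> C (Q[0,2] > 0) but not C -> A.
```

Swapping the order of the two factors would reverse the asymmetry, which is exactly the non-commutativity the abstract refers to.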
Abstract:
Despite continuing developments in information technology and the growing economic significance of the emerging Eastern European, South American and Asian economies, international financial activity remains strongly concentrated in a relatively small number of international financial centres. That concentration of financial activity requires a critical mass of office occupation and creates demand for high specification, high cost space. The demand for that space is increasingly linked to the fortunes of global capital markets. That linkage has been emphasised by developments in real estate markets, notably the development of global real estate investment, innovation in property investment vehicles and the growth of debt securitisation. The resultant interlinking of occupier, asset, debt and development markets within and across global financial centres is a source of potential volatility and risk. The paper sets out a broad conceptual model of the linkages and their implications for systemic market risk and presents preliminary empirical results that provide support for the model proposed.
Abstract:
One of the most vexing issues for analysts and managers of property companies across Europe has been the existence and persistence of deviations of the Net Asset Values of property companies from their market capitalisation. The issue has clear links to similar discounts and premiums in closed-end funds. The closed-end fund puzzle is regarded as an important unsolved problem in financial economics, undermining theories of market efficiency and the Law of One Price; consequently, it has generated a huge body of research. Although it can be tempting to focus on the particular inefficiencies of real estate markets when attempting to explain deviations from NAV, the closed-end fund discount puzzle indicates that divergences between underlying asset values and market capitalisation are not a ‘pure’ real estate phenomenon. When examining potential explanations, two recurring factors stand out in the closed-end fund literature as often undermining the economic rationale for a discount: the existence of premiums, and cross-sectional and periodic fluctuations in the level of the discount/premium. These need to be borne in mind when considering potential explanations for real estate markets. There are two approaches to investigating the discount to net asset value in closed-end funds: the ‘rational’ approach and the ‘noise trader’ or ‘sentiment’ approach. The ‘rational’ approach hypothesizes that the discount to net asset value results from company-specific factors such as management quality, tax liability and the type of stocks held by the fund. Despite the intuitive appeal of the ‘rational’ approach to closed-end fund discounts, the studies have not successfully explained the variance in closed-end fund discounts, or why the discount to net asset value in closed-end funds varies so much over time. The variation over time in the average sector discount is a feature not only of closed-end funds but also of property companies.
This paper analyses changes in the deviations from NAV for UK property companies between 2000 and 2003. The paper presents a new way to study the phenomenon, ‘cleaning’ out the gearing effect by introducing a new way of calculating the discount itself, which we call the “ungeared discount”. It is calculated by assuming that a firm issues new equity to repurchase its outstanding debt, with no change on the asset side. In this way the discount does not depend on an accounting effect, and the analysis should better explain the effect of the other independent variables.
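As we read the abstract, the ungearing assumption (equity issued to retire all debt, assets unchanged) implies comparing gross asset value with market capitalisation plus debt. A minimal sketch, with invented balance-sheet figures:

```python
def discount_to_nav(nav, market_cap):
    """Conventional discount: positive when shares trade below NAV."""
    return (nav - market_cap) / nav

def ungeared_discount(nav, market_cap, debt):
    """Sketch of the 'ungeared discount' as described in the abstract:
    new equity retires all debt, assets unchanged, so gross assets are
    nav + debt and the ungeared equity value is market_cap + debt."""
    gross_assets = nav + debt
    return (gross_assets - (market_cap + debt)) / gross_assets

# Hypothetical property company: NAV 100, market cap 80, debt 50.
d  = discount_to_nav(100.0, 80.0)           # conventional discount: 20%
du = ungeared_discount(100.0, 80.0, 50.0)   # ungeared: 20/150, ~13.3%
```

On these numbers the ungeared discount is smaller than the conventional one, consistent with gearing mechanically amplifying the headline discount.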
Abstract:
A new generation of advanced surveillance systems is being conceived as a collection of multi-sensor components such as video, audio and mobile robots interacting in a cooperative manner to enhance situation awareness capabilities and assist surveillance personnel. The prominent issues that these systems face are: the improvement of existing intelligent video surveillance systems, the inclusion of wireless networks, the use of low-power sensors, the design architecture, the communication between different components, the fusion of data emerging from different types of sensors, the location of personnel (providers and consumers) and the scalability of the system. This paper focuses on the aspects pertaining to real-time distributed architecture and scalability. For example, to meet real-time requirements, these systems need to process data streams in concurrent environments, designed by taking into account scheduling and synchronisation. The paper proposes a framework for the design of visual surveillance systems based on components derived from the principles of Real Time Networks/Data Oriented Requirements Implementation Scheme (RTN/DORIS). It also proposes the implementation of these components using the well-known middleware technology Common Object Request Broker Architecture (CORBA). Results using this architecture for video surveillance are presented through an implemented prototype.
Abstract:
The plethora, and mass take-up, of digital communication technologies has resulted in a wealth of interest in social network data collection and analysis in recent years. Within many such networks the interactions are transient: thus those networks evolve over time. In this paper we introduce a class of models for such networks using evolving graphs with memory-dependent edges, which may appear and disappear according to their recent history. We consider time-discrete and time-continuous variants of the model, and we consider the long-term asymptotic behaviour as a function of the parameters controlling the memory dependence. In particular we show that such networks may continue evolving forever, or else may quench and become static (containing immortal and/or extinct edges). This depends on the existence or otherwise of certain infinite products and series involving age-dependent model parameters. To test these ideas we show how model parameters may be calibrated based on limited samples of time-dependent data, and we apply these concepts to three real networks: summary data on mobile phone use from a developing region; online social-business network data from China; and disaggregated mobile phone communications data from a reality mining experiment in the US. In each case we show that there is evidence for memory-dependent dynamics, such as that embodied within the class of models proposed here.
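The age-dependence idea above can be illustrated on a single edge: if the per-step death probability decays fast enough in the edge's age, the survival probabilities form a convergent infinite product and the edge may become effectively immortal. The hazard function and rebirth probability below are illustrative choices, not the paper's calibrated parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_edge(p_death, n_steps=200):
    """Simulate one edge whose per-step death probability depends on its
    current age (the memory effect described in the abstract)."""
    age, alive, history = 0, True, []
    for _ in range(n_steps):
        if alive:
            age += 1
            if rng.random() < p_death(age):
                alive, age = False, 0
        elif rng.random() < 0.3:      # fixed rebirth probability (assumed)
            alive, age = True, 1
        history.append(int(alive))
    return history

# Hazard 0.5/age^2: sum of death probabilities converges, so an edge has
# positive probability of surviving forever (a "quenched", immortal edge).
h = simulate_edge(lambda age: 0.5 / age**2)
```

With a slowly decaying hazard such as 0.5/age the corresponding series diverges and every edge dies eventually, which is the dichotomy the abstract attributes to the convergence of those products and series.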
Abstract:
We propose and analyse a class of evolving network models suitable for describing a dynamic topological structure. Applications include telecommunication, on-line social behaviour and information processing in neuroscience. We model the evolving network as a discrete time Markov chain, and study a very general framework where, conditioned on the current state, edges appear or disappear independently at the next timestep. We show how to exploit symmetries in the microscopic, localized rules in order to obtain conjugate classes of random graphs that simplify analysis and calibration of a model. Further, we develop a mean field theory for describing network evolution. For a simple but realistic scenario incorporating the triadic closure effect that has been empirically observed by social scientists (friends of friends tend to become friends), the mean field theory predicts bistable dynamics, and computational results confirm this prediction. We also discuss the calibration issue for a set of real cell phone data, and find support for a stratified model, where individuals are assigned to one of two distinct groups having different within-group and across-group dynamics.
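The edge-independent Markov chain with a triadic closure term can be sketched directly: conditioned on the current graph, each absent edge appears with a probability that grows with the number of common neighbours, and each present edge dies at a fixed rate. The parameter values and graph size are illustrative, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(A, alpha=0.001, beta=0.1, omega=0.05):
    """One step of the chain: edge (i,j) appears with prob alpha + beta*t_ij
    (t_ij = common-neighbour count, the triadic closure term) and an
    existing edge disappears with prob omega. Values are assumed."""
    n = A.shape[0]
    T = (A @ A) * (1 - np.eye(n))            # common-neighbour counts
    birth = np.clip(alpha + beta * T, 0.0, 1.0)
    U = rng.random((n, n))
    U = np.triu(U, 1); U = U + U.T           # one symmetric coin per pair
    new = np.where(A == 1, (U > omega).astype(int), (U < birth).astype(int))
    np.fill_diagonal(new, 0)
    return new

A = (rng.random((30, 30)) < 0.2).astype(int)
A = np.triu(A, 1); A = A + A.T               # symmetric initial graph
for _ in range(50):
    A = step(A)
density = A.sum() / (30 * 29)
```

Sweeping alpha, beta and omega and tracking the long-run density from sparse versus dense initial graphs is one way to observe the bistability that the mean field theory predicts.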
Abstract:
The situation considered is that of a zonally symmetric model of the middle atmosphere subject to a given quasi-steady zonal force F̄, conceived to be the result of irreversible angular momentum transfer due to the upward propagation and breaking of Rossby and gravity waves together with any other dissipative eddy effects that may be relevant. The model's diabatic heating is assumed to have the qualitative character of a relaxation toward some radiatively determined temperature field. To the extent that the force F̄ may be regarded as given, and the extratropical angular momentum distribution is realistic, the extratropical diabatic mass flow across a given isentropic surface may be regarded as controlled exclusively by the F̄ distribution above that surface (implying control by the eddy dissipation above that surface and not, for instance, by the frequency of tropopause folding below). This “downward control” principle expresses a critical part of the dynamical chain of cause and effect governing the average rate at which photochemical products like ozone become available for folding into, or otherwise descending into, the extratropical troposphere. The dynamical facts expressed by the principle are also relevant, for instance, to understanding the seasonal-mean rate of upwelling of water vapor to the summer mesopause, and the interhemispheric differences in stratospheric tracer transport. The robustness of the principle is examined when F̄ is time-dependent. For a global-scale, zonally symmetric diabatic circulation with a Brewer-Dobson-like horizontal structure given by the second zonally symmetric Hough mode, with Rossby height H_R = 13 km in an isothermal atmosphere with density scale height H = 7 km, the vertical partitioning of the unsteady part of the mass circulation caused by fluctuations in F̄ confined to a shallow layer L_F̄ is always at least 84% downward.
It is 90% downward when the force fluctuates sinusoidally on twice the radiative relaxation timescale and 95% if five times slower. The time-dependent adjustment when F̄ is changed suddenly is elucidated, extending the work of Dickinson (1968), when the atmosphere is unbounded above and below. Above the forcing, the adjustment is characterized by decay of the meridional mass circulation cell at a rate proportional to the radiative relaxation rate τ_r⁻¹ divided by {1 + (4H²/H_R²)}. This decay is related to the boundedness of the angular momentum that can be taken up by the finite mass of air above L_F̄ without causing an ever-increasing departure from thermal wind balance. Below the forcing, the meridional mass circulation cell penetrates downward at a speed τ_r⁻¹H_R²/H. For the second Hough mode, the time for downward penetration through one density scale height is about 6 days if the radiative relaxation time is 20 days, the latter being representative of the lower stratosphere. At any given altitude, a steady state is approached. The effect of a rigid lower boundary on the time-dependent adjustment is also considered. If a frictional planetary boundary layer is present then a steady state is ultimately approached everywhere, with the mass circulation extending downward from L_F̄ and closing via the boundary layer. Satellite observations of temperature and ozone are used in conjunction with a radiative transfer scheme to estimate the altitudes from which the lower stratospheric diabatic vertical velocity is controlled by the effective F̄ in the real atmosphere. The data appear to indicate that about 80% of the effective control is usually exerted from below 40 km but with significant exceptions up to 70 km (in the high latitude southern hemispheric winter). The implications for numerical modelling of chemical transport are noted.
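The quoted "about 6 days" follows directly from the stated penetration speed: at speed τ_r⁻¹H_R²/H, the time to descend one scale height H is τ_r·H²/H_R². A quick check with the abstract's own numbers:

```python
# Check of the abstract's figures: downward penetration speed is
# w = H_R^2 / (tau_r * H), so descending one scale height H takes
# t = H / w = tau_r * H^2 / H_R^2.
H_R   = 13.0   # Rossby height, km
H     = 7.0    # density scale height, km
tau_r = 20.0   # radiative relaxation time, days

t = tau_r * H**2 / H_R**2   # ~5.8 days, i.e. "about 6 days"
```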
Abstract:
We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time-shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
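The effect of forgiving small time-shifts can be illustrated with a brute-force stand-in (not the paper's fast graph-based algorithm): match each reading to the closest reading in the other profile within a small window.

```python
import numpy as np

def shift_tolerant_distance(x, y, w=1):
    """Brute-force sketch of a shift-tolerant distance: each reading in x
    is scored against the closest reading in y within +/- w time slots.
    This is an illustrative stand-in, not the algorithm from the paper."""
    n = len(x)
    total = 0.0
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        total += min(abs(x[i] - y[j]) for j in range(lo, hi))
    return total

a = np.array([0, 5, 0, 0], dtype=float)
b = np.array([0, 0, 5, 0], dtype=float)   # same peak, shifted one slot

d_pointwise = np.abs(a - b).sum()              # penalises the shift twice
d_shifted   = shift_tolerant_distance(a, b)    # forgives the one-slot shift
```

Two households that boil the kettle half an hour apart are "similar" under the second measure but far apart under the first, which is the motivation for this kind of similarity in peaky household-level data.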
Abstract:
Two different TAMSAT (Tropical Applications of Meteorological Satellites) methods of rainfall estimation were developed for northern and southern Africa, based on Meteosat images. These two methods were used to make rainfall estimates for the southern rainy season from October 1995 to April 1996. Estimates produced by both TAMSAT methods and estimates produced by the CPC (Climate Prediction Center) method were then compared with kriged data from over 800 raingauges in southern Africa. This shows that operational TAMSAT estimates are better over plateau regions, with 59% of estimates within one standard error (s.e.) of the kriged rainfall. Over mountainous regions the CPC approach is generally better, although all methods underestimate and give only 40% of estimates within 1 s.e. The two TAMSAT methods show little difference across a whole season, but when examined in detail the northern method gives unsatisfactory calibrations. The CPC method does achieve significant overall improvements by incorporating real-time raingauge data, but only where sufficient raingauges are available.
Abstract:
We evaluate the predictive power of leading indicators for output growth at horizons up to 1 year. We use the MIDAS regression approach as this allows us to combine multiple individual leading indicators in a parsimonious way and to directly exploit the information content of the monthly series to predict quarterly output growth. When we use real-time vintage data, the indicators are found to have significant predictive ability, and this is further enhanced by the use of monthly data on the quarter at the time the forecast is made.
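The parsimony of MIDAS comes from replacing unrestricted lag coefficients with a low-dimensional weighting function; a common choice is the exponential Almon polynomial. A minimal sketch of that weighting step, with illustrative theta values:

```python
import numpy as np

def exp_almon_weights(n_lags, theta1=0.1, theta2=-0.05):
    """Exponential Almon lag weights often used in MIDAS regressions:
    w_j proportional to exp(theta1*j + theta2*j^2), normalised to sum
    to one. The theta values here are illustrative assumptions."""
    j = np.arange(1, n_lags + 1, dtype=float)
    w = np.exp(theta1 * j + theta2 * j**2)
    return w / w.sum()

# Collapse 12 monthly indicator readings (most recent first) into a
# single quarterly regressor with just two shape parameters.
w = exp_almon_weights(12)
monthly = np.random.default_rng(1).normal(size=12)
quarterly_regressor = w @ monthly
```

Only the two theta parameters are estimated, however many high-frequency lags enter, which is what makes combining several monthly indicators parsimonious.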
Abstract:
This paper considers the effect of short- and long-term interest rates, and interest rate spreads, upon real estate index returns in the UK. Using Johansen's vector autoregressive framework, it is found that the real estate index cointegrates with the term spread, but not with the short or long rates themselves. Granger causality tests indicate that movements in short-term interest rates and the spread cause movements in the returns series. However, decomposition of the forecast error variances from VAR models indicates that changes in these variables can only explain a small proportion of the overall variability of the returns, and that the effect has fully worked through after two months. The results suggest that these financial variables could potentially be used as leading indicators for real estate markets, with corresponding implications for return predictability.
Abstract:
The role of Distribution Network Operators (DNOs) is becoming more difficult as electric vehicles and electric heating penetrate the network, increasing the demand. As a result it becomes harder for the distribution network infrastructure to remain within its operating constraints. Energy storage is a potential alternative to conventional network reinforcement such as upgrading cables and transformers. The research presented in this paper shows that, due to the volatile nature of the LV network, the control approach used for energy storage has a significant impact on performance. This paper presents and compares control methodologies for energy storage where the objective is to achieve the greatest possible peak demand reduction across the day from a pre-specified storage device. The results presented show the benefits and detriments of specific types of control on a storage device connected to a single phase of an LV network, using aggregated demand profiles based on real smart meter data from individual homes. The research demonstrates an important relationship between how predictable an aggregation is and the control methodology best suited to achieving the objective.
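One simple member of the family of control methodologies compared above is threshold-based peak shaving: discharge when demand exceeds a target level, recharge below it, within the device's power and energy limits. The profile and device sizes below are invented, and a real controller would need a forecast of the aggregation, which is exactly why predictability matters in the abstract's conclusion.

```python
import numpy as np

def peak_shave(demand, energy_capacity, power_limit):
    """Greedy threshold sketch: shave demand toward its mean, subject to
    the storage device's power rating and state of charge. This is an
    illustrative baseline, not a method from the paper."""
    target = demand.mean()
    soc = energy_capacity / 2          # start half full (assumed)
    net = []
    for d in demand:
        if d > target:                 # discharge to shave the peak
            p = min(d - target, power_limit, soc)
        else:                          # recharge in the trough
            p = -min(target - d, power_limit, energy_capacity - soc)
        soc -= p
        net.append(d - p)
    return np.array(net)

demand = np.array([2, 2, 3, 8, 9, 4, 2, 2], dtype=float)  # evening peak
shaved = peak_shave(demand, energy_capacity=6.0, power_limit=3.0)
# Peak reduced from 9 to 6 units in this toy example.
```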
Abstract:
This paper presents the mathematical development of a body-centric nonlinear dynamic model of a quadrotor UAV that is suitable for the development of biologically inspired navigation strategies. Analytical approximations are used to find an initial guess of the parameters of the nonlinear model, then parameter estimation methods are used to refine the model parameters using the data obtained from onboard sensors during flight. Due to the unstable nature of the quadrotor model, the identification process is performed with the system in closed-loop control of attitude angles. The obtained model parameters are validated using real unseen experimental data. Based on the identified model, a Linear-Quadratic (LQ) optimal tracker is designed to stabilize the quadrotor and facilitate its translational control by tracking body accelerations. The LQ tracker is tested on an experimental quadrotor UAV and the obtained results are a further means to validate the quality of the estimated model. The unique formulation of the control problem in the body frame makes the controller better suited for bio-inspired navigation and guidance strategies than conventional attitude or position based control systems that can be found in the existing literature.
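The LQ design step can be illustrated on a toy double integrator standing in for one body-frame acceleration channel; the weighting matrices are assumptions for illustration, not the paper's quadrotor model or tuning.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy double integrator: state x = [position, velocity], input u = accel.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weights (illustrative)
R = np.array([[1.0]])      # control weight (illustrative)

# Solve the continuous algebraic Riccati equation and form the LQ gain.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)    # optimal state feedback u = -K x

closed_loop = A - B @ K            # stabilised closed-loop dynamics
```

For any controllable (A, B) with positive-definite weights, the resulting closed-loop matrix is Hurwitz, which is the stabilisation property the tracker relies on.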
Abstract:
Noncompetitive bids have recently become a major concern in both public and private sector construction contract auctions. Consequently, several models have been developed to help identify bidders potentially involved in collusive practices. However, most of these models require complex calculations and extensive information that is difficult to obtain. The aim of this paper is to utilize recent developments for detecting abnormal bids in capped auctions (auctions with an upper bid limit set by the auctioneer) and extend them to the more conventional uncapped auctions (where no such limits are set). To accomplish this, a new method is developed for estimating the values of bid distribution supports by using the solution to what has become known as the German Tank problem. The model is then demonstrated and tested on a sample of real construction bid data, and shown to detect cover bids with high accuracy. This paper contributes to an improved understanding of abnormal bid behavior as an aid to detecting and monitoring potential collusive bid practices.
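The German Tank estimator referred to above gives a minimum-variance unbiased estimate of the upper support of a uniform distribution from its sample maximum. A minimal sketch using the continuous form, with invented bid figures and the simplifying assumption that the lower support is known (taken as zero here):

```python
def german_tank_upper(bids):
    """Estimate the upper support of the bid distribution from the sample
    maximum: m*(k+1)/k for k continuous draws from Uniform(0, theta)
    (the discrete serial-number form is m + m/k - 1). The application to
    bid supports follows the abstract; the data below are invented."""
    k, m = len(bids), max(bids)
    return m * (k + 1) / k

bids = [940_000, 955_000, 961_000, 978_000, 990_000]
upper_hat = german_tank_upper(bids)
```

A bid far above the estimated support for its auction is then a candidate abnormal (cover) bid; estimating both supports from the bid sample is the step the paper develops.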