103 results for attori, concorrenza, COOP, Akka, benchmark
Abstract:
A new Bayesian algorithm for retrieving surface rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes’s theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the “true” conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect of the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of estimator and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
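To illustrate the core retrieval step described above, the following is a minimal sketch of a Bayesian rain-rate retrieval on a discretised grid. The forward model, channel noise level, prior and observation values are hypothetical placeholders chosen for illustration; they are not the TMI algorithm or its databases. The saturating forward model shows why the observational constraint weakens, and a point estimator such as the posterior mean can become biased, at high rain rates.

```python
import numpy as np

# Candidate surface rain rates (mm/h) and an assumed exponential prior.
rain_grid = np.linspace(0.0, 30.0, 301)
prior = np.exp(-rain_grid / 5.0)
prior /= prior.sum()

def forward_model(rain_rate):
    """Hypothetical mapping from rain rate to two brightness temperatures (K).
    The response saturates at high rain rates, weakening the constraint."""
    return np.array([220.0 + 40.0 * (1.0 - np.exp(-rain_rate / 8.0)),
                     250.0 + 20.0 * (1.0 - np.exp(-rain_rate / 8.0))])

def posterior(observed_tb, sigma=2.0):
    """Discretised posterior p(rain | Tb) ∝ p(Tb | rain) p(rain), Gaussian channel noise."""
    log_like = np.array([-0.5 * np.sum((observed_tb - forward_model(r)) ** 2) / sigma ** 2
                         for r in rain_grid])
    post = prior * np.exp(log_like - log_like.max())
    return post / post.sum()

post = posterior(np.array([252.0, 266.0]))          # hypothetical observation
print("posterior mean (mm/h):", round(float(np.sum(rain_grid * post)), 2))
```

The full posterior is retained here, so any estimator (posterior mean, mode, quantiles) can be read off, which is the advantage the abstract emphasises over single-value retrievals.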
Abstract:
In this paper, the stability of one-step-ahead predictive controllers based on non-linear models is established. It is shown that, under conditions which can be fulfilled by most industrial plants, the closed-loop system is robustly stable in the presence of plant uncertainties and input–output constraints. There is no requirement that the plant should be open-loop stable, and the analysis is valid for general forms of non-linear system representation, including the case when the problem is constraint-free. The effectiveness of controllers designed according to the algorithm analyzed in this paper is demonstrated on a recognized benchmark problem and on a simulation of a continuous stirred-tank reactor (CSTR). In both examples a radial basis function neural network is employed as the non-linear system model.
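As a rough illustration of the control structure analysed (not the paper's proofs or its specific plants), the sketch below fits a radial basis function model to a hypothetical single-input single-output plant and then applies a one-step-ahead predictive law: at each step, the constrained input that drives the model's one-step prediction closest to the setpoint is selected. All functions, dimensions and constraints here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def plant(y, u):
    """Hypothetical nonlinear plant, used only to generate data for this sketch."""
    return 0.8 * y + 0.5 * np.tanh(u) + 0.1 * y * u

# --- fit an RBF one-step-ahead model: y_{k+1} ≈ w · phi(y_k, u_k) ---
def phi(y, u, centres, width=0.7):
    d2 = (y - centres[:, 0]) ** 2 + (u - centres[:, 1]) ** 2
    return np.exp(-d2 / (2 * width ** 2))

centres = np.array([(yc, uc) for yc in np.linspace(-2, 2, 5)
                             for uc in np.linspace(-1, 1, 5)])
Y = rng.uniform(-2, 2, 400)
U = rng.uniform(-1, 1, 400)
Phi = np.array([phi(y, u, centres) for y, u in zip(Y, U)])
w, *_ = np.linalg.lstsq(Phi, plant(Y, U), rcond=None)

# --- one-step-ahead controller: choose the constrained input whose predicted
#     output is closest to the setpoint ---
def control(y, setpoint, u_grid=np.linspace(-1.0, 1.0, 201)):
    preds = np.array([w @ phi(y, u, centres) for u in u_grid])
    return u_grid[np.argmin((preds - setpoint) ** 2)]

y, setpoint = 0.0, 1.0
for k in range(20):
    u = control(y, setpoint)
    y = plant(y, u)
print("output after 20 steps:", round(y, 3))
```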
Abstract:
Although in several EU Member States many public interventions have been running for the prevention and/or management of obesity and other nutrition-related health conditions, few have yet been formally evaluated. The multidisciplinary team of the EATWELL project will gather benchmark data on healthy eating interventions in EU Member States and review existing information on the effectiveness of interventions using a three-stage procedure: (i) assessment of the intervention's impact on consumer attitudes, consumer behaviour and diets; (ii) the impact of the change in diets on obesity and health; and (iii) the value attached by society to these changes, measured in life years gained, cost savings and quality-adjusted life years. Where evaluations have been inadequate, EATWELL will gather secondary data and analyse them with a multidisciplinary approach incorporating models from the psychology and economics disciplines. Particular attention will be paid to lessons learned from the private sector that are transferable to healthy eating campaigns in the public sector. Through consumer surveys and workshops with other stakeholders, EATWELL will assess the acceptability of the range of potential interventions. Armed with scientific quantitative evaluations of policy interventions and their acceptability to stakeholders, EATWELL expects to recommend more appropriate interventions for Member States and the EU, provide a one-stop guide to methods and measures in intervention evaluation, and outline data collection priorities for the future.
Abstract:
Associative memory networks such as Radial Basis Function, Neurofuzzy and Fuzzy Logic networks used for modelling nonlinear processes suffer from the curse of dimensionality (COD): as the input dimension increases, the parameterization, computation cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for constructing optimal piecewise locally linear models over a Delaunay partition of the input space, which overcomes the COD and generates locally linear models directly amenable to linear control and estimation algorithms. The training of the model is configured as a new mixture-of-experts network with a new fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is utilized to search for a globally optimal Delaunay input space partition. A benchmark non-linear time series is used to demonstrate the new approach.
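The sketch below builds a piecewise locally linear model on a fixed Delaunay partition of a two-dimensional input space. It illustrates only the partition-and-fit idea: the partition vertices are drawn at random rather than optimised with very fast simulated reannealing, and the target function and data are hypothetical.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)

def target(x):
    """Hypothetical nonlinear function standing in for the process to be modelled."""
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

vertices = rng.uniform(0, 1, size=(12, 2))        # fixed (not optimised) partition vertices
tri = Delaunay(vertices)

X = rng.uniform(0, 1, size=(500, 2))
y = target(X)

# Fit one affine (locally linear) model per simplex using the samples inside it.
models = {}
simplex_id = tri.find_simplex(X)
for s in range(tri.nsimplex):
    mask = simplex_id == s
    if mask.sum() >= 3:
        A = np.hstack([X[mask], np.ones((mask.sum(), 1))])
        models[s] = np.linalg.lstsq(A, y[mask], rcond=None)[0]

def predict(x):
    s = int(tri.find_simplex(x[None, :])[0])
    if s in models:
        return np.array([*x, 1.0]) @ models[s]
    return y.mean()                               # fallback outside the partition

x_test = np.array([0.4, 0.6])
print("prediction:", predict(x_test), "truth:", target(x_test[None, :])[0])
```

Because each local model is affine, standard linear control and estimation tools can be applied within each simplex, which is the property the abstract highlights.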
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.
Abstract:
Based on integrated system optimisation and parameter estimation, a method is described for on-line steady-state optimisation which compensates for model-plant mismatch and solves a non-linear optimisation problem by iterating on a linear-quadratic representation. The method requires real process derivatives, which are estimated using a dynamic identification technique. The utility of the method is demonstrated using a simulation of the Tennessee Eastman benchmark chemical process.
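The abstract does not give the algorithmic details, but the general idea (iterating a model-based optimisation that is corrected with estimated real-process derivatives so that model-plant mismatch is compensated) can be sketched as a simple modifier-adaptation-style loop on scalar toy functions. The plant cost, model cost, filter gain and search grid below are illustrative assumptions, not the Tennessee Eastman application.

```python
import numpy as np

def plant_cost(u):
    """True (unknown) operating cost of the process."""
    return (u - 3.0) ** 2 + 0.5 * np.sin(u)

def model_cost(u):
    """Mismatched model of that cost."""
    return (u - 2.2) ** 2

def num_grad(f, u, h=1e-4):
    """Stands in for the real process derivatives obtained by identification."""
    return (f(u + h) - f(u - h)) / (2 * h)

u, step_filter = 0.0, 0.6
for _ in range(20):
    # gradient mismatch between plant and model at the current operating point
    lam = num_grad(plant_cost, u) - num_grad(model_cost, u)
    # model-based optimisation with a first-order correction term
    u_grid = np.linspace(-1.0, 6.0, 2001)
    corrected = model_cost(u_grid) + lam * (u_grid - u)
    u_new = u_grid[np.argmin(corrected)]
    # filtered (damped) update towards the corrected model optimum
    u = u + step_filter * (u_new - u)

print("converged operating point:", round(u, 3))   # settles near the plant optimum (~3.25)
```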
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (Lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but the most successful statistical method depends on the region considered, the GCM data used and the prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to the regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
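Of the methods listed, the Constructed Analogue approach lends itself to a compact illustration: the current anomaly pattern is expressed as a least-squares combination of past states, and the same weights are then applied to the subsequent evolution of those states to form the forecast. The sketch below uses synthetic random fields rather than HadCM3/HadGEM1 output, and the library size, grid size and noise levels are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_points, n_library = 200, 40                 # grid points, past states in the library

# Past SST anomaly patterns and their states some years later (synthetic).
library_now = rng.standard_normal((n_points, n_library))
library_future = 0.7 * library_now + 0.3 * rng.standard_normal((n_points, n_library))

# Today's anomaly pattern (synthetic), to be expressed in terms of the library.
current_state = library_now @ rng.standard_normal(n_library) * 0.1

# Weights beta solve library_now @ beta ≈ current_state in the least-squares sense.
beta, *_ = np.linalg.lstsq(library_now, current_state, rcond=None)

# Constructed-analogue prediction: apply the same weights to the later states.
forecast = library_future @ beta
print("forecast field shape:", forecast.shape)
```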
Abstract:
Van der Heijden’s ENDGAME STUDY DATABASE IV, HhdbIV, is the definitive collection of 76,132 chess studies. In each one, White is to achieve the stipulated goal, win or draw: study solutions should be essentially unique with minor alternatives at most. In this second note on the mining of the database, we use the definitive Nalimov endgame tables to benchmark White’s moves in sub-7-man chess against this standard of uniqueness. Amongst goal-compatible mainline positions and goal-achieving moves, we identify the occurrence of absolutely unique moves and analyse the frequency and lengths of absolutely-unique-move sequences, AUMSs. We identify the occurrence of equi-optimal moves and suboptimal moves and refer to a defined method for classifying their significance.
Abstract:
Depreciation is a key element of understanding the returns from and price of commercial real estate. Understanding its impact is important for asset allocation models and asset management decisions. It is a key input into well-constructed pricing models and its impact on indices of commercial real estate prices needs to be recognised. There have been a number of previous studies of the impact of depreciation on real estate, particularly in the UK. Law (2004) analysed all of these studies and found that the seemingly consistent results were an illusion as they all used a variety of measurement methods and data. In addition, none of these studies examined impact on total returns; they examined either rental value depreciation alone or rental and capital value depreciation. This study seeks to rectify this omission, adopting the best practice measurement framework set out by Law (2004). Using individual property data from the UK Investment Property Databank for the 10-year period between 1994 and 2003, rental and capital depreciation, capital expenditure rates, and total return series for the data sample and for a benchmark are calculated for 10 market segments. The results are complicated by the period of analysis which started in the aftermath of the major UK real estate recession of the early 1990s, but they give important insights into the impact of depreciation in different segments of the UK real estate investment market.
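As a simple numerical illustration of the headline quantity involved (not the full Law (2004) measurement framework or the IPD data), rental or capital value depreciation can be expressed as the annualised growth of a benchmark of modern equivalents minus the annualised growth of the held assets; the figures below are hypothetical.

```python
def annualised_growth(start_value, end_value, years):
    """Compound annual growth rate between two index values."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

def depreciation_rate(asset_start, asset_end, bench_start, bench_end, years=10):
    """Annual shortfall of asset growth relative to the benchmark (positive = depreciating)."""
    g_asset = annualised_growth(asset_start, asset_end, years)
    g_bench = annualised_growth(bench_start, bench_end, years)
    return g_bench - g_asset

# Hypothetical 10-year rental value indices for a held asset and a benchmark.
print(f"rental depreciation: {depreciation_rate(100, 118, 100, 135):.2%} per annum")
```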
Abstract:
The question as to whether active management adds any value above that of the fund's investment policy is one of continual interest to investors. In order to investigate this issue in the UK real estate market we examine a number of related questions. First, how much return variability is explained by investment policy? Second, how similar are the policies across funds? Third, how much of a fund's return is determined by investment policy? Finally, how was this added value achieved? Using data for 19 real estate funds, we find that investment policy explains less than half of the variability in returns over time, none of the variation across funds, and that more than 100% of the level of return is attributed to investment policy. The results also show that UK real estate funds focus exclusively on trying to pick winners to add value and that, in pursuit of active return, fund managers incur high tracking error risk; consequently, successful active management is very difficult to achieve. In addition, the results are dependent on the benchmark used to represent the investment policy of the fund. Nonetheless, active management can indeed add value to a real estate fund's performance. This is the good news. The bad news is that adding value is much more difficult to achieve than is generally accepted.
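The quantitative questions above can be illustrated with a short calculation on synthetic fund and benchmark return series: the R² of fund returns against benchmark (policy) returns measures the share of return variability explained by policy; the ratio of mean benchmark return to mean fund return gives the share of the return level attributed to policy (which exceeds 100% when active return is negative); and the standard deviation of active returns gives the tracking error. The return series and their parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
benchmark = rng.normal(0.02, 0.04, 40)            # quarterly policy/benchmark returns
active = rng.normal(0.001, 0.02, 40)              # manager's active component
fund = benchmark + active

# Share of return variability over time explained by policy: R^2 of fund on benchmark.
corr = np.corrcoef(fund, benchmark)[0, 1]
r_squared = corr ** 2

# Share of the return level attributed to policy (can exceed 100%).
level_share = benchmark.mean() / fund.mean()

# Tracking error incurred in pursuit of active return.
tracking_error = (fund - benchmark).std(ddof=1)

print(f"variability explained: {r_squared:.1%}, "
      f"level from policy: {level_share:.1%}, tracking error: {tracking_error:.2%}")
```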
Abstract:
Built environment programmes in West African universities, and research contributions from West Africa in six leading international journals and in the proceedings of the WABER conference, are explored. At least 20 universities in the region offer degree programmes in Architecture (86% out of 23 universities); Building (57%); Civil Engineering (67%); Estate Management (52%); Quantity Surveying (52%); Surveying and Geoinformatics (55%); and Urban and Regional Planning (67%). The lecturer-student ratio on these programmes is around 1:25, compared to the 1:10 benchmark for excellence. Academics who teach on the programmes are clearly research active, with some having published papers in leading international journals. There is, however, plenty of scope for improvement, particularly at the highest international level. Out of more than 5,000 papers published in six leading international peer-reviewed journals since each of them was established, only 23 have come from West Africa. These 23 papers were published by 28 academics based in 13 universities. Although some academics may publish their work in the plethora of journals that have proliferated in recent years, new-generation researchers are encouraged to publish in more established journals. The analysis of 187 publications in the WABER conference proceedings revealed 18 research-active universities. Factors such as quality of teaching, research and lecturer-student ratio count in the ranking of universities. The findings lay bare some of the areas that should be addressed to improve the landscape of higher education in West Africa.
Abstract:
The reduction of portfolio risk is important to all investors, but is particularly important to real estate investors as most property portfolios are generally small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of underperformance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increases in portfolio size offer the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk is diminished with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of restricting the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
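A minimal version of this Monte Carlo experiment is sketched below: equally weighted portfolios of n properties are drawn repeatedly from a pool of individual property returns, and the probabilities of under- and out-performing the benchmark by a margin are tabulated against portfolio size. The property returns, margin and simulation counts are synthetic assumptions rather than the IPD sample used in the study.

```python
import numpy as np

rng = np.random.default_rng(4)
property_returns = rng.normal(0.10, 0.12, 1728)    # synthetic annual returns of a property pool
benchmark = property_returns.mean()                # market / benchmark return
margin = 0.02                                      # "significant" out/under-performance

for n in (1, 5, 20, 100, 500):
    # draw 5,000 equally weighted portfolios of n distinct properties
    sims = np.array([rng.choice(property_returns, n, replace=False).mean()
                     for _ in range(5000)])
    downside = (sims < benchmark - margin).mean()
    upside = (sims > benchmark + margin).mean()
    print(f"n={n:4d}  P(underperform by 2%)={downside:.2f}  P(outperform by 2%)={upside:.2f}")
```

Both probabilities shrink as n grows, which mirrors the trade-off the abstract describes: less down-side risk, but also less up-side potential.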
Abstract:
Electricity consumption in Ghana is estimated to be increasing by 10% per annum due to the demand from the growing population. However, current sources of production (hydro and thermal facilities) generate only 66% of the current demand. Considering current trends, it is difficult to substantiate these basic facts because of the lack of information. As a result, research into the existing sources of electricity generation, electricity consumption and prospective projects has been performed. This was achieved using three key techniques: a review of the literature, empirical studies and modelling. The results presented suggest that the current annual installed generation capacity (i.e. 1,960 MW) must be increased to 9,405.59 MW, assuming 85% plant availability. This would be capable of coping with the growing demand, would give access to the entire population, and would support commercial and industrial activities for the growth of the economy. This research is expected to present an academic research agenda for further exploration of the subject area, without which the growth of the country would be stagnant.
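The two calculations implied by the abstract, compounding demand growth at 10% per annum and grossing up a firm-capacity requirement to installed capacity at 85% plant availability, can be written down directly; the base demand and planning horizon used below are illustrative assumptions and are not taken from the study.

```python
annual_growth = 0.10       # demand growth per annum (from the abstract)
availability = 0.85        # assumed plant availability (from the abstract)

def projected_demand(current_demand_mw, years):
    """Demand after compounding growth of 10% per annum over the given horizon."""
    return current_demand_mw * (1 + annual_growth) ** years

def required_installed_capacity(firm_demand_mw):
    """Installed capacity needed so that 85%-available plant can meet firm demand."""
    return firm_demand_mw / availability

# Hypothetical 2,000 MW base demand projected 15 years ahead.
demand_future = projected_demand(2000.0, 15)
print(f"projected demand: {demand_future:,.0f} MW")
print(f"installed capacity needed: {required_installed_capacity(demand_future):,.0f} MW")
```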
Abstract:
In the last ten years, Regulatory Impact Analysis has become the instrument providing the groundwork for evidence-based regulatory decisions in most developed countries. However, the increase in quantity has not been matched by an increase in quality. In Italy, Regulatory Impact Analysis has been in place for ten years on paper, but in practice it has not been performed consistently. Of particular interest is the case of independent regulatory authorities, which have been required to apply Regulatory Impact Analysis since 2003. This paper explores how Regulatory Impact Analysis is carried out by examining in depth how an individual case, on the Regulation for Quality of Service, was executed by the Autorità per l’energia elettrica e il gas. The aim is to provide a picture of the process leading to the final Regulatory Impact Analysis report, rather than just a study of its content. The case illustrates how Regulatory Impact Analysis, when properly employed, can be an important aid to regulatory decisions, not only by assessing ex ante the economic impacts of regulatory proposals in terms of costs, benefits and risks, but also by opening up the spectrum of policy alternatives and systematically considering stakeholder opinions as part of the decision-making process. The case also highlights several difficulties, analytical and process-related, that emerge in practical applications. Finally, it shows that the experience and expertise built up by the regulatory authority over the years had a significant impact on the quality of the analysis.