95 results for Simulator of Performance in Error
Abstract:
European economic and political integration has been recognised as having implications for patterns of performance in national real estate and capital markets, and has generated a wide body of research and commentary. In 1999, progress towards monetary integration within the European Union culminated in the introduction of a common currency and monetary policy. This paper investigates the effects of this ‘event’ on the behaviour of stock returns in European real estate companies. A range of statistical tests is applied to the performance of European property companies to test for changes in segmentation, co-movement and causality. The results suggest that, relative to the wider equity markets, the dispersion of performance is higher, correlations are lower and a common contemporaneous factor has much lower explanatory power, whilst lead-lag relationships are stronger. Consequently, the evidence of transmission of monetary integration to real estate securities is less noticeable than for general securities. The slower, more limited integration is attributed to the relatively small size of the real estate securities market and to the local and national character of most of the companies’ portfolios.
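A minimal sketch of the kinds of tests described above: pairwise correlations, the explanatory power of a common contemporaneous factor, and a Granger-style lead-lag test. The CSV file and the column names ("uk", "de") are hypothetical stand-ins, not the paper's actual data or procedure.

```python
# Sketch of co-movement and lead-lag tests on national property-company
# return series. Input data and column names are assumed, not the paper's.
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests

returns = pd.read_csv("property_returns.csv", index_col=0, parse_dates=True)

# 1) Pairwise correlations across national return series.
corr = returns.corr()

# 2) Explanatory power of a common contemporaneous factor: regress each
#    national series on the cross-sectional mean and record R-squared.
common = returns.mean(axis=1)
r2 = {c: sm.OLS(returns[c], sm.add_constant(common)).fit().rsquared
      for c in returns.columns}

# 3) Lead-lag: does the second column ("de") Granger-cause the first ("uk")?
res = grangercausalitytests(returns[["uk", "de"]].dropna(), maxlag=4)
```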
Abstract:
The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin are calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data, it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
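A hedged sketch of the subsampling procedure described above: draw random subsamples of the observed water-surface elevations, keep only those with no significant Moran's I (here via a permutation test with binary k-nearest-neighbour weights), score the model against each, and average. The arrays obs_wse, obs_xy and model_wse, and the subsample size and count, are illustrative assumptions, not the paper's configuration.

```python
# Subsample-and-test evaluation of modelled water-surface elevations (WSE).
# obs_wse / model_wse: observed and modelled WSE at the flood margin points;
# obs_xy: (n, 2) point coordinates. All three are assumed to be given.
import numpy as np

def morans_i(values, coords, k=8):
    """Moran's I with binary k-nearest-neighbour weights."""
    n = len(values)
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = np.zeros((n, n))
    for i in range(n):
        w[i, np.argsort(d[i])[1:k + 1]] = 1.0   # nearest neighbours, skip self
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

def significant(values, coords, n_perm=999, alpha=0.05, rng=None):
    """Permutation test: is the observed Moran's I significantly non-random?"""
    rng = rng or np.random.default_rng(0)
    obs = morans_i(values, coords)
    perm = np.array([morans_i(rng.permutation(values), coords)
                     for _ in range(n_perm)])
    p = (np.sum(np.abs(perm) >= abs(obs)) + 1) / (n_perm + 1)
    return p < alpha

# Average model skill over subsamples free of spatial dependency.
scores = []
rng = np.random.default_rng(42)
while len(scores) < 100:
    idx = rng.choice(len(obs_wse), size=30, replace=False)
    if significant(obs_wse[idx], obs_xy[idx]):
        continue                                  # reject dependent subsample
    scores.append(np.sqrt(np.mean((model_wse[idx] - obs_wse[idx]) ** 2)))
mean_rmse = float(np.mean(scores))
```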
Abstract:
The prediction of Northern Hemisphere (NH) extratropical cyclones by nine different ensemble prediction systems (EPSs), archived as part of The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE), has recently been explored using a cyclone tracking approach. This paper provides a continuation of this work, extending the analysis to the Southern Hemisphere (SH). While the EPSs have larger errors in all cyclone properties in the SH, the relative performance of the different EPSs remains broadly consistent between the two hemispheres. Some interesting differences are also shown. The China Meteorological Administration (CMA) EPS has a significantly lower level of performance in the SH compared to the NH. Previous NH results showed that the Centro de Previsao de Tempo e Estudos Climaticos (CPTEC) EPS underpredicts cyclone intensity. The results of this current study show that this bias is significantly larger in the SH. The CPTEC EPS also has very little spread in both hemispheres. As with the NH results, cyclone propagation speed is underpredicted by all the EPSs in the SH. To investigate this further, the bias was also computed for the ECMWF high-resolution deterministic forecast; it was significantly smaller than that of the lower-resolution ECMWF EPS.
Abstract:
The Asian region has become a focus of attention for investors in recent years, due to the strong economic performance of the region, the higher expected returns in the area compared with Europe and the USA, and the additional diversification benefits that investment in the region would offer. Nonetheless, many investors have doubts about the prudence of investing in such areas. In particular, it may be felt that the expected returns offered in the countries of the Asian region are not sufficient to compensate investors for the increased risks of investing in such markets. These risks can be categorised under four headings: investment risk, currency risk, political risk and institutional risk. This paper analyses each of these risks in turn to see if they are sufficiently large to deter real estate investment in the region in general or in a particular country.
Abstract:
In this paper, we seek to achieve four objectives. First, we provide some contextual material concerning the performance of the UK real estate market relative to stocks and bonds over a long period. Second, we provide UK (and some non-UK European) evidence of the tendency for property demand, supply, prices and returns to fluctuate around their long-term trends or averages. Third, we briefly examine some hypotheses which suggest institutional contributions to property cycles in European markets. Fourth, we suggest some reasons why the future may not be as cyclical as the past.
Abstract:
Near-isogenic lines (NILs) of winter wheat varying for alleles for reduced height (Rht), gibberellin (GA) response and photoperiod insensitivity (Ppd-D1a) in cv. Mercia (rht (tall), Rht-B1b, Rht-D1b, Rht-B1c, Rht8c+Ppd-D1a, Rht-D1c, Rht12) and cv. Maris Widgeon (rht (tall), Rht-D1b, Rht-B1c) backgrounds were compared to investigate main effects and interactions with tillage (plough-based, minimum- and zero-tillage) over two years. Both minimum- and zero-tillage were associated with reduced grain yields, allied to reduced harvest index, biomass accumulation, interception of photosynthetically active radiation (PAR) and plant populations. Grain yields were optimized at mature crop heights of around 740 mm, because this height provided the best compromise between harvest index, which declined with height, and above-ground biomass, which increased with height. The increase in biomass with height was due to improvements in both PAR interception and radiation-use efficiency. The optimum height for grain yield was unaffected by tillage system or GA sensitivity. After accounting for the effects of height, GA insensitivity was associated with increased grain yields due to more grains per spike, which was more than enough to compensate for poorer plant establishment and lower mean grain weights compared with the GA-sensitive lines. Although better establishment was possible with GA-sensitive lines, there was no evidence that this effect interacted with tillage method. We find, therefore, little evidence to question the current adoption of wheats with reduced sensitivity to GA in the UK, even as tillage intensity lessens.
Abstract:
Placing a child in out-of-home care is one of the most important decisions made by professionals in the child care system, with substantial social, psychological, educational, medical and economic consequences. This paper considers the challenges and difficulties of building statistical models of this decision by reviewing the available international evidence. Despite the large number of empirical investigations over a 50-year period, a consensus on the variables associated with this decision is hard to identify. In addition, the individual models have low explanatory and predictive power and should not be relied on to make placement decisions. A number of reasons for this poor performance are offered, and some ways forward are suggested. This paper also aims to facilitate the emergence of a coherent and integrated international literature from the disconnected and fragmented empirical studies. Rather than one placement problem, there are many slightly different problems; it is therefore expected that a number of related sub-literatures will emerge, each concentrating on a particular definition of the placement problem.
Abstract:
We examine to what degree we can expect to obtain accurate temperature trends for the last two decades near the surface and in the lower troposphere. We compare temperatures obtained from surface observations and radiosondes, as well as satellite-based measurements from the Microwave Sounding Units (MSU), which have been adjusted for orbital decay and non-linear instrument-body effects, and reanalyses from the European Centre for Medium-Range Weather Forecasts (ERA) and the National Centers for Environmental Prediction (NCEP). In regions with abundant conventional data coverage, where the MSU has no major influence on the reanalysis, temperature anomalies obtained from microwave sounders, radiosondes and both reanalyses agree reasonably well. Where coverage is insufficient, in particular over the tropical oceans, large differences are found between the MSU and either reanalysis. These differences apparently relate to changes in satellite data availability and to differing satellite retrieval methodologies, to which both reanalyses are quite sensitive over the oceans. For NCEP, this results from the use of raw radiances directly incorporated into the analysis, which makes the reanalysis sensitive to changes in the underlying algorithms, e.g. those introduced in August 1992. For ERA, the bias correction of the one-dimensional variational analysis may introduce an error when the satellite relative to which the correction is calculated is itself biased, or when radiances change on a time scale longer than a couple of months, e.g. due to orbital decay. ERA inhomogeneities are apparent in April 1985, October/November 1986 and April 1989; these dates can be identified with the replacement of satellites. It is possible that a negative bias in the sea surface temperatures (SSTs) used in the reanalyses may have been introduced over the period of the satellite record. This could have resulted from a decrease in the number of ship measurements, a concomitant increase in the importance of satellite-derived SSTs, and a likely cold bias in the latter. Alternatively, a warm bias in SSTs could have been caused by an increase in the percentage of buoy measurements (relative to deeper ship intake measurements) in the tropical Pacific. No indications of uncorrected inhomogeneities in land surface temperatures could be found. Near-surface temperatures have biases in the boundary layer in both reanalyses, presumably due to the incorrect treatment of snow cover. The increase of near-surface relative to lower-tropospheric temperatures in the last two decades may be due to a combination of several factors, including high-latitude near-surface winter warming due to an enhanced NAO and upper-tropospheric cooling due to stratospheric ozone decrease.
Abstract:
A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model’s computations as they become available. In the first experiment, updating is performed every 24, 12 and 6 hours with a given network. The stations are distributed at random in four groups, in order to simulate four areas with different station densities. Optimum interpolation is performed on the difference between the forecast and the valid observations. The RMS error of the analyses is reduced over time, and the error is smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error smaller than the RMS error of the observations. In a second experiment, the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the analysis error is reduced to a value below the RMS error of the observations after only 24 hours, and this yields on the whole a better result than updating from a fixed network. If the satellite data are introduced without any modification, the analysis error is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one in which the data have been analysed.
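A minimal sketch of an optimum-interpolation update of the kind described above: the forecast is corrected by observation-minus-forecast increments, weighted through assumed background (B) and observation (R) error covariances. The toy field, station placement and covariance shapes are illustrative stand-ins, not the paper's actual setup.

```python
# One optimum-interpolation analysis step on a toy 1-D height field.
import numpy as np

def oi_update(xb, y, H, B, R):
    """xa = xb + K (y - H xb), with gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Toy example: 100-point field observed at 10 randomly placed stations.
n, p = 100, 10
rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, 2 * np.pi, n))
xb = truth + rng.normal(0, 0.3, n)                 # background (forecast)
H = np.zeros((p, n))
H[np.arange(p), rng.choice(n, p, replace=False)] = 1.0
y = H @ truth + rng.normal(0, 0.1, p)              # observations
dist = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
B = 0.3 ** 2 * np.exp(-dist / 10.0)                # correlated background errors
R = 0.1 ** 2 * np.eye(p)                           # uncorrelated obs errors
xa = oi_update(xb, y, H, B, R)
print("RMS error: background %.3f -> analysis %.3f"
      % (np.sqrt(np.mean((xb - truth) ** 2)),
         np.sqrt(np.mean((xa - truth) ** 2))))
```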
Abstract:
An analysis of the attribution of past and future changes in stratospheric ozone and temperature to anthropogenic forcings is presented. The analysis is an extension of the study of Shepherd and Jonsson (2008), who analyzed chemistry-climate simulations from the Canadian Middle Atmosphere Model (CMAM) and attributed both past and future changes to changes in the external forcings, i.e. the abundances of ozone-depleting substances (ODS) and well-mixed greenhouse gases. The current study is based on a new CMAM dataset and includes two important changes. First, we account for the nonlinear radiative response to changes in CO2. It is shown that over centennial time scales the radiative response in the upper stratosphere to CO2 changes is significantly nonlinear, and that failure to account for this effect leads to a significant error in the attribution. To our knowledge, this nonlinearity has not been considered before in attribution analysis, including multiple linear regression studies. For the regression analysis presented here, the nonlinearity was taken into account by using the CO2 heating rate, rather than the CO2 abundance, as the explanatory variable. This approach yields considerable corrections to the results of the previous study and can be recommended to other researchers. Second, an error in the way the CO2 forcing changes are implemented in the CMAM was corrected, which significantly affects the results for the recent past. As the radiation scheme, based on Fomichev et al. (1998), is used in several other models, we provide some description of the problem and how it was fixed.
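A hedged sketch of the regression choice described above: using a heating-rate term rather than the raw CO2 abundance as the explanatory variable. All series below are synthetic stand-ins, and the logarithmic heating-rate shape is only an assumed proxy for the nonlinear radiative response, not the CMAM calculation.

```python
# Attribution-style multiple linear regression on synthetic series.
import numpy as np
import statsmodels.api as sm

years = np.arange(1960, 2100)
co2 = 320.0 * np.exp(0.004 * (years - 1960))       # synthetic abundance (ppm)
heating = np.log(co2 / 320.0)                      # assumed ~log radiative response
ods = np.where(years < 2000, (years - 1960) / 40.0,        # rise to 2000,
               np.maximum(1.0 - (years - 2000) / 60.0, 0)) # then decline

# Synthetic temperature anomaly driven by both forcings, plus noise.
rng = np.random.default_rng(0)
temp = -2.0 * heating - 0.5 * ods + rng.normal(0, 0.1, len(years))

# Regress on heating rate (not abundance) and the ODS proxy.
X = sm.add_constant(np.column_stack([heating, ods]))
fit = sm.OLS(temp, X).fit()
print(fit.params)   # recovers the assumed coefficients on heating rate and ODS
```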
Abstract:
Considerable progress has taken place in numerical weather prediction over the last decade. It has been possible to extend predictive skill in the extra-tropics of the Northern Hemisphere during the winter from less than five days to seven days. Similar improvements, albeit at a lower level, have taken place in the Southern Hemisphere. Another example of improvement in the forecasts is the prediction of intense synoptic phenomena such as cyclogenesis, which on the whole is quite successful with the most advanced operational models (Bengtsson, 1989; Gadd and Kruze, 1988). A careful examination shows that there is no single cause for the improvements in predictive skill; instead, they are due to several different factors encompassing the forecasting system as a whole (Bengtsson, 1985). In this paper we focus our attention on the role of data assimilation and the effect it may have on reducing the initial error and hence improving the forecast. The first part of the paper contains a theoretical discussion of error growth in simple data assimilation systems, following Leith (1983). In the second part, we apply the results to actual forecast data from ECMWF. The potential for further forecast improvements within the framework of the present observing system in the two hemispheres is discussed.
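An illustrative sketch of a simple forecast-error-growth model of the kind discussed above: a logistic form in which error grows exponentially at first and saturates at the climatological level, with an optional constant source term standing in for errors reintroduced each assimilation cycle. The parameter values are invented for illustration; this is not Leith's (1983) exact formulation.

```python
# Logistic error growth: dE/dt = alpha * E * (1 - E/E_inf) + source.
import numpy as np

def error_growth(e0, alpha=0.35, e_inf=1.0, source=0.0, days=10, dt=0.05):
    """Integrate the error-growth ODE with forward Euler; E normalised to E_inf."""
    steps = int(days / dt)
    e = np.empty(steps + 1)
    e[0] = e0
    for k in range(steps):
        e[k + 1] = e[k] + dt * (alpha * e[k] * (1 - e[k] / e_inf) + source)
    return e

# Halving the initial (analysis) error extends the useful forecast range:
for e0 in (0.2, 0.1):
    e = error_growth(e0)
    day = np.argmax(e >= 0.6) * 0.05   # first day the error exceeds 0.6
    print(f"initial error {e0}: skill horizon ~ {day:.1f} days")
```

With these invented parameters, halving the initial error pushes the crossing of the 0.6 threshold out by roughly two days, which is the qualitative mechanism by which better data assimilation lengthens the skilful forecast range.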
Abstract:
Within generative L2 acquisition research there is a longstanding debate as to what underlies observable differences in L1/L2 knowledge/performance. On the one hand, Full Accessibility approaches maintain that target L2 syntactic representations (new functional categories and features) are acquirable (e.g., Schwartz & Sprouse, 1996). Conversely, Partial Accessibility approaches claim that L2 variability and/or optionality, even at advanced levels, obtains as a result of inevitable deficits in L2 narrow syntax and is conditioned on a maturational failure in adulthood to acquire (some) new functional features (e.g., Beck, 1998; Hawkins & Chan, 1997; Hawkins & Hattori, 2006; Tsimpli & Dimitrakopoulou, 2007). The present study tests the predictions of these two sets of approaches with advanced English learners of L2 Brazilian Portuguese (n = 21) in the domain of inflected infinitives. These advanced L2 learners reliably differentiate syntactically between finite verbs, uninflected infinitives and inflected infinitives, which, we argue, supports only the Full Accessibility approaches. Moreover, we discuss how testing the domain of inflected infinitives is especially interesting in light of recent proposals that colloquial dialects of Brazilian Portuguese no longer actively instantiate them (Lightfoot, 1991; Pires, 2002, 2006; Pires & Rothman, 2009; Rothman, 2007).