136 results for inaccuracy
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to the production of inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back- and out-of-sample tests. Given that internal risk management models are currently in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
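The persistence the abstract refers to is governed by the sum alpha + beta in a GARCH(1,1) variance recursion; when that sum sits near 1, volatility shocks decay very slowly and feed through into risk-quantile estimates. A minimal simulation sketch, with illustrative parameter values rather than the paper's estimates:

```python
import math

import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.09, beta=0.90, seed=0):
    """Simulate returns from a GARCH(1,1):
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    # start the recursion at the unconditional variance omega / (1 - alpha - beta)
    sigma2 = np.full(n, omega / (1.0 - alpha - beta))
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = math.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

r, sigma2 = simulate_garch11(5000)
# Persistence is alpha + beta; the closer to 1, the slower volatility
# shocks die out, and capital-requirement quantiles inherit that bias.
persistence = 0.09 + 0.90
half_life = math.log(0.5) / math.log(persistence)  # roughly 69 periods here
```

An overstated alpha + beta lengthens this half-life, which is exactly how the inaccuracy described above propagates into the MCRR.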
Abstract:
Existing empirical evidence has frequently observed that professional forecasters are conservative and display herding behaviour. Whilst a large number of papers have considered equities as well as macroeconomic series, few have considered the accuracy of forecasts in alternative asset classes such as real estate. We consider the accuracy of forecasts for the UK commercial real estate market over the period 1999-2011. The results illustrate that forecasters display a tendency to under-estimate growth rates during strong market conditions and over-estimate when the market is performing poorly. This conservatism not only results in smoothed estimates but also implies that forecasters display herding behaviour. There is also a marked difference in the relative accuracy of capital and total returns versus rental figures. Whilst rental growth forecasts are relatively accurate, considerable inaccuracy is observed with respect to capital value and total returns.
Abstract:
Cool materials are characterized by high solar reflectance and high thermal emittance; when applied to the external surface of a roof, they make it possible to limit the amount of solar irradiance absorbed by the roof and to increase the rate of heat flux emitted by radiation to the environment, especially during nighttime. However, a roof also releases heat by convection at its external surface; this mechanism is not negligible, and an incorrect evaluation of its magnitude might introduce significant inaccuracy in the assessment of the thermal performance of a cool roof, in terms of surface temperature and of the rate of heat flux transferred to the indoors. This issue is particularly relevant in numerical simulations, which are essential in the design stage; it therefore deserves adequate attention. In the present paper, a review of the most common algorithms used for the calculation of the convective heat transfer coefficient due to wind on horizontal building surfaces is presented. Then, with reference to a case study in Italy, the simulated results are compared to the outcomes of a measurement campaign. Hence, the most appropriate algorithms for the convective coefficient are identified, and the errors deriving from an incorrect selection of this coefficient are discussed.
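Wind-driven convective-coefficient algorithms of the kind reviewed typically take either a linear or a power-law form in wind speed. A hedged sketch of how the choice of correlation propagates into the computed flux; the coefficient values are illustrative defaults, not the specific algorithms evaluated in the paper:

```python
def h_linear(v, a=5.7, b=3.8):
    """Linear (Jurges/McAdams-type) wind correlation, W/(m^2 K).
    Coefficients a, b are illustrative, not the paper's."""
    return a + b * v

def h_power(v, c=7.2, n=0.78):
    """Power-law wind correlation h = c * v**n; coefficients illustrative."""
    return c * v ** n

v = 3.0    # wind speed, m/s
dT = 10.0  # surface-to-air temperature difference, K
# Divergence between correlations propagates directly into the
# convective flux q = h * dT that a simulation tool would use.
q_spread = abs(h_linear(v) - h_power(v)) * dT
```

Comparing such q_spread values against measured fluxes is one simple way to see why an incorrect correlation choice distorts the simulated roof performance.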
Abstract:
This paper integrates research on child simultaneous bilingual (2L1) acquisition more directly into the heritage language (HL) acquisition literature. The 2L1 literature mostly focuses on development in childhood, whereas heritage speakers (HSs) are often tested at an endstate in adulthood. However, insights from child 2L1 acquisition must be considered in HL acquisition theorizing precisely because many HSs are the adult outcomes of child 2L1 acquisition. Data from 2L1 acquisition raises serious questions for the construct of incomplete acquisition, a term broadly used in HL acquisition studies to describe almost any difference HSs display from baseline controls (usually monolinguals). We offer an epistemological discussion related to incomplete acquisition, highlighting the descriptive and theoretical inaccuracy of the term. We focus our discussion on two of several possible causal factors that contribute to variable competence outcomes in adult HSs, input (e.g., Sorace, 2004; Rothman, 2007; Pascual y Cabo & Rothman, 2012) and formal instruction (e.g., Kupisch, 2013; Kupisch et al., 2014) in the HL. We conclude by offering alternative terminology for HS outcomes.
Abstract:
GPS technology is nowadays embedded in portable, low-cost electronic devices to track the movements of mobile objects. This development has greatly impacted the transportation field by creating a novel and rich source of traffic data on the road network. Although the promise offered by GPS devices to overcome problems like underreporting, respondent fatigue, inaccuracies and other human errors in data collection is significant, the technology is still relatively new and raises many issues for potential users. These issues tend to revolve around the following areas: reliability, data processing and the related applications. This thesis aims to study GPS tracking from the methodological, technical and practical aspects. It first evaluates the reliability of GPS-based traffic data, drawing on an experiment involving three different traffic modes (car, bike and bus) traveling along the road network. It then outlines the general procedure for processing GPS tracking data and discusses related issues that are uncovered by using real-world GPS tracking data of 316 cars. Thirdly, it investigates the influence of road network density on finding optimal locations for enhancing travel efficiency and decreasing travel cost. The results show that the geographical positioning is reliable. Velocity is slightly underestimated, whereas altitude measurements are unreliable. Post-processing techniques with auxiliary information are found necessary and important for resolving the inaccuracy of GPS data. The density of the road network influences the finding of optimal locations; the influence stabilizes at a certain level and does not deteriorate when the node density is higher.
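A basic post-processing step of the kind described is deriving per-segment speeds from raw fixes via great-circle distances, since implausible speed spikes flag fixes that need correction. A minimal sketch; the function names and the sample track are ours, not the thesis's:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 fixes."""
    R = 6371000.0  # mean Earth radius, m
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def speeds_from_track(track):
    """track: list of (t_seconds, lat, lon) fixes; returns m/s per segment.
    Spikes here (e.g. from multipath) mark fixes needing auxiliary correction."""
    return [haversine_m(la0, lo0, la1, lo1) / (t1 - t0)
            for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:])]

track = [(0.0, 59.3293, 18.0686), (10.0, 59.3293, 18.0706)]  # two fixes, 10 s apart
speeds = speeds_from_track(track)
```

Segments whose speed exceeds a mode-dependent threshold (e.g. 50 m/s for a car) can then be repaired with auxiliary information such as map matching.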
Abstract:
The digital elevation model (DEM) plays a substantial role in hydrological studies, from understanding catchment characteristics and setting up a hydrological model to mapping flood risk in a region. Depending on the nature of the study and its objectives, a high-resolution, reliable DEM is often desired to set up a sound hydrological model. However, such a good DEM is not always available, and it is generally high-priced. Obtained through radar-based remote sensing, the Shuttle Radar Topography Mission (SRTM) is a publicly available DEM with a resolution of 92 m outside the US. It is a great source of elevation data where no surveyed DEM is available. However, apart from its coarse resolution, SRTM suffers from inaccuracy, especially in areas with dense vegetation coverage, because radar signals do not penetrate the canopy. This can lead to improper model setup as well as erroneous mapping of flood risk. This paper attempts to improve the SRTM dataset using the Normalised Difference Vegetation Index (NDVI), derived from the Visible Red and Near-Infrared bands obtained from Landsat at a resolution of 30 m, and Artificial Neural Networks (ANN). The assessment of the improvement and the applicability of this method in hydrology are highlighted and discussed.
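NDVI is computed per pixel as (NIR - Red) / (NIR + Red), so dense canopy, which inflates SRTM elevations, shows up as high NDVI. A minimal sketch of the index and of a linear vegetation-bias correction standing in for the trained ANN; the coefficient value is purely illustrative, not from the paper:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); eps avoids division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def corrected_elevation(srtm, ndvi_map, bias_per_ndvi=15.0):
    """Subtract a vegetation-height bias proportional to NDVI.
    This linear term stands in for the paper's trained ANN, and the
    value 15 m per unit NDVI is purely illustrative."""
    bias = bias_per_ndvi * np.clip(np.asarray(ndvi_map, float), 0.0, 1.0)
    return np.asarray(srtm, float) - bias

veg = ndvi([0.50], [0.10])             # dense canopy: high NDVI
dem = corrected_elevation([100.0], veg)
```

The ANN in the paper would learn the NDVI-to-bias mapping from reference elevations instead of assuming it linear.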
Abstract:
This study aims to examine the contemporary viewer and also to propose a review of the literature surrounding the Spanish researcher Jesús Martín-Barbero. The proposed review, based on Martín-Barbero's observations and on the analysis of Latin American Cultural Studies, regards the individual as part of the processes of communication and draws on research on the concept of cultural mediation, with the viewer as component/agent/object of the production of meaning. The study perceives the viewer as an element of inaccuracy/inadequacy and understands him or her as a subject of these complex reactive processes involving communication and culture, contrasting the impressions that resulted from a discussion between a television professional and a communication researcher with their findings about the same viewer. The work also aims to position the viewer as a component of the mapping desired by Martín-Barbero for the interpretation of contemporary mass-communication processes, and of his concept of the "night map".
Abstract:
The idea of considering imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 wanted to reconcile classical logic, which allows the modelling of complete ignorance, with probabilities. In 1921, John Maynard Keynes in his book made explicit use of intervals to represent imprecision in probabilities. But it was only with the work of Walley in 1991 that principles were established that should be respected by a probability theory dealing with imprecision. With the emergence of the theory of fuzzy sets by Lotfi Zadeh in 1965, another way of dealing with uncertainty and the imprecision of concepts appeared. Several ways of incorporating Zadeh's ideas into probability were quickly proposed, to deal with imprecision either in the events associated with the probabilities or in the values of the probabilities themselves. In particular, from 2003 James Buckley began to develop a probability theory in which the values of the probabilities are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of using very precise values to deal with uncertainty (can one distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that include some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model hesitation at the moment of assigning the membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree.
Thus, in this approach the non-membership degree is somewhat independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory. It is worth noting that the term "intuitionistic" here has no relation to the term as used in intuitionistic logic. In this work, two proposals for interval probability are developed: the restricted interval probability and the unrestricted interval probability. Two notions of fuzzy probability are also introduced: the constrained fuzzy probability and the unconstrained fuzzy probability. Finally, two notions of intuitionistic fuzzy probability are introduced: the restricted intuitionistic fuzzy probability and the unrestricted intuitionistic fuzzy probability.
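An Atanassov intuitionistic fuzzy value can be encoded directly as a pair (mu, nu) under the constraint mu + nu <= 1, with the hesitation degree pi = 1 - mu - nu. A minimal sketch; the class name is ours:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFSValue:
    """Atanassov intuitionistic fuzzy value: membership mu and
    non-membership nu, with 0 <= mu, nu and mu + nu <= 1."""
    mu: float
    nu: float

    def __post_init__(self):
        if not (0.0 <= self.mu <= 1.0 and 0.0 <= self.nu <= 1.0
                and self.mu + self.nu <= 1.0 + 1e-12):
            raise ValueError("need 0 <= mu, nu and mu + nu <= 1")

    @property
    def hesitation(self):
        # pi = 1 - mu - nu; pi == 0 recovers an ordinary (Zadeh) fuzzy
        # value, where nu is exactly the complement of mu.
        return 1.0 - self.mu - self.nu

v = IFSValue(mu=0.6, nu=0.3)  # 0.1 of the unit left unassigned: hesitation
z = IFSValue(mu=0.7, nu=0.3)  # no hesitation: an ordinary fuzzy value
```

The validity constraint is what separates the "restricted" style of construction from the unrestricted one discussed in the abstract, where mu and nu need not sum below 1.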
Abstract:
Reactive-optimisation procedures are responsible for the minimisation of online power losses in interconnected systems. These procedures are performed separately at each control centre and involve external network representations. If total losses can be minimised by the implementation of calculated local control actions, the entire system benefits economically, but such control actions generally result in a certain degree of inaccuracy, owing to errors in the modelling of the external system. Since these errors are inevitable, they must at least be maintained within tolerable limits by external-modelling approaches. Care must be taken to avoid unrealistic loss minimisation, as the local-control actions adopted can lead the system to points of operation which will be less economical for the interconnected system as a whole. The evaluation of the economic impact of the external modelling during reactive-optimisation procedures in interconnected systems, in terms of both the amount of losses and constraint violations, becomes important in this context. In the paper, an analytical approach is proposed for such an evaluation. Case studies using data from the Brazilian South-Southeast system (810 buses) have been carried out to compare two different external-modelling approaches, both derived from the equivalent-optimal-power-flow (EOPF) model. Results obtained show that, depending on the external-model representation adopted, the loss representation can be flawed. Results also suggest some modelling features that should be adopted in the EOPF model to enhance the economy of the overall system.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Pós-graduação em Ciências Cartográficas - FCT