994 results for Accounting errors


Relevance:

60.00%

Publisher:

Abstract:

Dissertation presented to the Instituto Superior de Contabilidade e Administração do Porto for the degree of Master in Auditing. Supervisor: Professor Doutor José da Silva Fernandes.

Relevance:

30.00%

Publisher:

Abstract:

There have been several studies on the performance of TCP controlled transfers over an infrastructure IEEE 802.11 WLAN, assuming perfect channel conditions. In this paper, we develop an analytical model for the throughput of TCP controlled file transfers over the IEEE 802.11 DCF with different packet error probabilities for the stations, accounting for the effect of packet drops on the TCP window. Our analysis proceeds by combining two models: one is an extension of the usual TCP-over-DCF model for an infrastructure WLAN, where the throughput of a station depends on the probability that the head-of-the-line packet at the Access Point belongs to that station; the second is a model for the TCP window process for connections with different drop probabilities. Iterative calculation between these models yields the head-of-the-line probabilities, from which performance measures such as the throughputs and packet failure probabilities can be derived. We find that, due to MAC layer retransmissions, packet losses are rare even with high channel error probabilities, and the stations obtain fair throughputs even when some of them have packet error probabilities as high as 0.1 or 0.2. For some restricted settings we are also able to model tail-drop loss at the AP. Although involving many approximations, the model captures the system behavior quite accurately, as compared with simulations.
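The iterative coupling the abstract describes can be sketched as a simple fixed-point computation. The sketch below is purely illustrative, not the paper's equations: it assumes a retry limit of 7, a sqrt(3/(2p)) mean-TCP-window rule capped by the receiver window, and a head-of-line probability proportional to each station's window-weighted delivery rate.

```python
import math

def residual_loss(p_err, retries=7):
    # A packet is lost only if the initial attempt and all MAC-layer
    # retransmissions fail.
    return p_err ** (retries + 1)

def throughput_shares(error_probs, retries=7, wmax=64, iters=200):
    loss = [residual_loss(p, retries) for p in error_probs]
    # Illustrative mean-TCP-window rule under random loss, capped by
    # the receiver window wmax.
    win = [min(wmax, math.sqrt(3.0 / (2.0 * max(l, 1e-12)))) for l in loss]
    hol = [1.0 / len(error_probs)] * len(error_probs)
    for _ in range(iters):
        # Head-of-line probability at the AP taken proportional to each
        # station's window-weighted delivery rate (illustrative coupling),
        # with damping for a stable iteration.
        weight = [w * (1.0 - l) for w, l in zip(win, loss)]
        total = sum(weight)
        hol = [0.5 * h + 0.5 * w / total for h, w in zip(hol, weight)]
    return hol, loss

shares, loss = throughput_shares([0.01, 0.1, 0.2])
```

Even the station with channel error probability 0.2 retains a near-equal share here: after seven retransmissions its residual loss is tiny, so every window sits at the cap, mirroring the fairness result reported above.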

Relevance:

30.00%

Publisher:

Abstract:

We develop general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit recent nonparametric asymptotic distributional results, are both easy to implement and highly accurate in empirically realistic situations. We also illustrate that properly accounting for the measurement errors in the volatility forecast evaluations reported in the existing literature can result in markedly higher estimates for the true degree of return volatility predictability.

Relevance:

30.00%

Publisher:

Abstract:

This note develops general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit the recent asymptotic distributional results in Barndorff-Nielsen and Shephard (2002a), are both easy to implement and highly accurate in empirically realistic situations. On properly accounting for the measurement errors in the volatility forecast evaluations reported in Andersen, Bollerslev, Diebold and Labys (2003), the adjustments result in markedly higher estimates for the true degree of return-volatility predictability.
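The adjustment logic can be illustrated numerically. The simulation below is a hypothetical sketch, not the papers' data or exact procedure: it treats realized variance as integrated variance plus sampling noise, approximates the noise variance as 2·IV²/m (a constant-volatility simplification of the asymptotic theory), and subtracts the average estimated noise variance from the naive loss.

```python
import numpy as np

rng = np.random.default_rng(0)
T, m = 500, 78                   # evaluation days, intraday returns per day
iv = 1.0 + 0.5 * np.abs(rng.standard_normal(T))  # "true" integrated variance
noise_var = 2.0 * iv**2 / m      # approximate RV measurement-error variance
rv = iv + rng.standard_normal(T) * np.sqrt(noise_var)  # feasible RV benchmark
forecast = iv + 0.2 * rng.standard_normal(T)           # an imperfect forecast

naive_mse = np.mean((rv - forecast) ** 2)
# Adjusted (approximately unbiased) loss: subtract the average estimated
# measurement-error variance of the realized-volatility proxy.
adjusted_mse = naive_mse - np.mean(noise_var)
true_mse = np.mean((iv - forecast) ** 2)     # infeasible target loss
```

The naive loss overstates forecast error, and hence understates predictability; the adjusted loss tracks the infeasible true loss much more closely.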

Relevance:

30.00%

Publisher:

Abstract:

Inspired by the commercial desires of global brands and retailers to access the lucrative green consumer market, carbon is increasingly being counted and made knowable at the mundane sites of everyday production and consumption, from the carbon footprint of a plastic kitchen fork to that of an online bank account. Despite the challenges of counting and making commensurable the global warming impact of a myriad of biophysical and societal activities, this desire to communicate a product or service's carbon footprint has sparked complicated carbon calculative practices and enrolled actors at literally every node of multi-scaled and vastly complex global supply chains. Against this landscape, this paper critically analyzes the counting practices that create the ‘e’ in ‘CO2e’. It is shown that central to these practices is a series of tools, models and databases which, building upon previous work (Eden, 2012; Star and Griesemer, 1989), we conceptualize here as ‘boundary objects’. By enrolling everyday actors from farmers to consumers, these objects abstract and stabilize greenhouse gas emissions from their messy material and social contexts into units of CO2e which can then be translated along a product's supply chain, thereby establishing a new currency of ‘everyday supply chain carbon’. However, in making all greenhouse gas-related practices commensurable and in enrolling and stabilizing the transfer of information between multiple actors, these objects oversee a process of simplification reliant upon, and subject to, a multiplicity of approximations, assumptions, errors, discrepancies and/or omissions. Further, the outcomes of these tools are subject to the politicized and commercial agendas of the worlds they attempt to link, with each boundary actor ascribing different meanings to a product's carbon footprint in accordance with their specific subjectivities, commercial desires and epistemic framings.
It is therefore shown that how a boundary object transforms greenhouse gas emissions into units of CO2e is the outcome of distinct ideologies regarding ‘what’ a product's carbon footprint is and how it should be made legible. These politicized decisions, in turn, inform specific reduction activities and ultimately advance distinct, specific and increasingly durable transition pathways to a low carbon society.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to put forward an innovative approach for reducing the variation between Type I and Type II errors in the context of ratio-based modeling of corporate collapse, without compromising the accuracy of the predictive model. Its contribution to the literature lies in resolving the problematic trade-off between predictive accuracy and variations between the two types of errors.

Design/methodology/approach – The methodological approach in this paper – called MCCCRA – utilizes a novel multi-classification matrix based on a combination of correlation and regression analysis, with the former being subject to optimisation criteria. In order to ascertain its accuracy in signaling collapse, MCCCRA is empirically tested against multiple discriminant analysis (MDA).

Findings – Based on a data sample of 899 US publicly listed companies, the empirical results indicate that, in addition to a high level of accuracy in signaling collapse, MCCCRA generates lower variability between Type I and Type II errors when compared to MDA.

Originality/value – Although correlation and regression analysis are long-standing statistical tools, the optimisation constraints applied to the correlations are unique. Moreover, the multi-classification matrix is a first in signaling collapse. By providing economic insight into more stable financial modeling, these innovations make an original contribution to the literature.
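MCCCRA itself is not specified in enough detail here to reproduce, but the quantities it is evaluated on are standard. A minimal sketch, using the bankruptcy-prediction convention that a Type I error misses a collapse and a Type II error raises a false alarm; the function name and sample labels below are hypothetical.

```python
def error_rates(actual, predicted):
    """actual/predicted: 1 = collapsed, 0 = healthy.
    Type I: a collapsed firm signalled as healthy (missed collapse).
    Type II: a healthy firm signalled as collapsed (false alarm)."""
    collapsed = [p for a, p in zip(actual, predicted) if a == 1]
    healthy = [p for a, p in zip(actual, predicted) if a == 0]
    type1 = collapsed.count(0) / len(collapsed)
    type2 = healthy.count(1) / len(healthy)
    return type1, type2, abs(type1 - type2)  # final value: the variation

# Hypothetical hold-out sample: 4 collapsed firms, then 4 healthy firms.
t1, t2, variation = error_rates([1, 1, 1, 1, 0, 0, 0, 0],
                                [1, 1, 1, 0, 0, 0, 1, 0])
# t1 = 0.25, t2 = 0.25, variation = 0.0
```

Minimizing the final value while holding overall accuracy is precisely the trade-off the paper targets: a model can be accurate on average yet concentrate its mistakes on one error type.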

Relevance:

30.00%

Publisher:

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
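As a toy illustration of the hierarchy described above (parameters, then a latent process, then noisy data), the sketch below simulates the two stages and applies a simple partial-pooling estimator. The numbers and the assumption of known variances are purely illustrative; a full hierarchical analysis would also propagate uncertainty in the hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, n_obs = 50, 4
mu, tau, sigma = 10.0, 2.0, 5.0  # hyper-mean, process sd, measurement sd

# Process stage: each site's true state drawn from the population model.
site_truth = rng.normal(mu, tau, n_sites)
# Data stage: noisy measurements of each site's state.
y = rng.normal(site_truth[:, None], sigma, (n_sites, n_obs))

# Partial pooling: shrink each site mean toward the grand mean, weighting
# process variance against measurement-error variance (assumed known here).
site_mean = y.mean(axis=1)
w = tau**2 / (tau**2 + sigma**2 / n_obs)
shrunk = w * site_mean + (1 - w) * site_mean.mean()
```

Because the weight w is below one, the pooled estimates are strictly less variable than the raw site means: noisy sites borrow strength from the population, which is the practical payoff of modeling the uncertainty at both stages rather than only the measurement error.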

Relevance:

30.00%

Publisher:

Abstract:

We empirically compare the reliability of the dividend (DIV) model, the residual income valuation (CT, GLS) model, and the abnormal earnings growth (OJ) model. We find that valuation estimates from the OJ model are generally more reliable than those from the other three models, because the residual income valuation model anchored by book value gets off to a poor start when compared with the OJ model led by capitalized next-year earnings. We adopt a 34-year sample covering 1985 to 2013 to compare the reliability of valuation estimates via their mean absolute pricing errors (MAPE) and corresponding t statistics. We further use the switching regression of Barrios and Blanco to show that the OJ model's valuation estimates have, on average, a higher probability of explaining stock prices than those of the DIV, CT, and GLS models. In addition, our finding that the OJ model yields more reliable estimates is robust to both analyst-based and model-based earnings measures.
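The comparison metric is straightforward to state. A minimal sketch of MAPE with hypothetical numbers; the prices and value estimates below are made up for illustration, not the study's data.

```python
def mape(estimates, prices):
    """Mean absolute pricing error: average of |V - P| / P."""
    return sum(abs(v - p) / p for v, p in zip(estimates, prices)) / len(prices)

prices  = [20.0, 50.0, 10.0]   # observed stock prices
oj_est  = [19.0, 52.0, 10.5]   # hypothetical OJ value estimates
div_est = [15.0, 60.0, 13.0]   # hypothetical DIV value estimates

# mape(oj_est, prices)  -> (0.05 + 0.04 + 0.05) / 3, about 0.047
# mape(div_est, prices) -> (0.25 + 0.20 + 0.30) / 3, i.e. 0.25
```

A lower MAPE means the model's intrinsic-value estimates sit closer to observed prices in proportional terms, which is the sense in which one model is called "more reliable" above.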

Relevance:

20.00%

Publisher:

Relevance:

20.00%

Publisher:

Abstract:

Within the context of debate about the state of accounting education in general, introductory accounting subjects have been the target of considerable criticism, particularly in terms of narrow content, technical focus, use of transmissive models of teaching, and inattention to the development of students' generic skills. This paper reports on the results of an exploratory study of these issues in introductory accounting, which involved a review of subject outlines and prescribed textbooks and a cross-sectional survey of the introductory accounting teaching coordinators in Australian universities (n=21). The primary aims of the study were to establish and apply benchmarks in evaluating existing curricula with respect to subject orientation, learning objectives, topics, teaching delivery, learning strategies, and assessment. The results of our study suggest that traditional approaches to subject content and delivery continue to dominate, with limited indicators of innovations to enhance the diversity and quality of learning experiences and learning outcomes.