916 results for Insider econometrics


Relevance: 10.00%

Abstract:

Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations respectively.
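The Ornstein–Uhlenbeck equation mentioned in this abstract is one of the rare cases where the transitional density is available in closed form: under exact discretisation the process is a Gaussian AR(1), so conditional maximum likelihood reduces to least squares on the lagged process. The sketch below is not from the article; the parameter values are made up and the estimator shown is the standard closed-form conditional MLE, given purely as an illustration.

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n, rng):
    # Exact discretisation of dX = theta*(mu - X) dt + sigma dW:
    # X[t+dt] | X[t] is Gaussian with AR(1) mean and known variance.
    b = np.exp(-theta * dt)
    sd = sigma * np.sqrt((1.0 - b**2) / (2.0 * theta))
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = mu + (x[i] - mu) * b + sd * rng.standard_normal()
    return x

def fit_ou_mle(x, dt):
    # Conditional ML for a Gaussian AR(1) is ordinary least squares of
    # x[t+1] on x[t]; the OU parameters are then recovered algebraically.
    y, z = x[1:], x[:-1]
    b = np.cov(z, y, bias=True)[0, 1] / np.var(z)
    a = y.mean() - b * z.mean()
    s2 = np.var(y - (a + b * z))          # residual variance
    theta = -np.log(b) / dt
    mu = a / (1.0 - b)
    sigma = np.sqrt(s2 * 2.0 * theta / (1.0 - b**2))
    return theta, mu, sigma

rng = np.random.default_rng(0)
x = simulate_ou(theta=1.0, mu=0.5, sigma=0.3, x0=0.0, dt=0.1, n=20000, rng=rng)
th, m, s = fit_ou_mle(x, dt=0.1)
```

Because the discretisation is exact, this estimator is consistent here; for the Cox–Ingersoll–Ross equation the transition density is noncentral chi-squared and the same least-squares shortcut no longer gives exact ML.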

Relevance: 10.00%

Abstract:

A pervasive and puzzling feature of banks’ Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks’ proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
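The diversification effect discussed here can be made concrete with a toy calculation. The category VaRs and the correlation matrix below are entirely hypothetical, and the normal/correlation aggregation rule is just one common convention (the paper works with banks' actually reported figures, not this rule):

```python
import numpy as np

# Hypothetical standalone 99% VaRs by broad risk category (USD millions).
var_by_category = {"equity": 30.0, "interest_rate": 25.0,
                   "fx": 15.0, "commodity": 10.0}
v = np.array(list(var_by_category.values()))

# Hypothetical correlation matrix between the category P&Ls.
corr = np.array([
    [1.0, 0.3, 0.2, 0.1],
    [0.3, 1.0, 0.2, 0.1],
    [0.2, 0.2, 1.0, 0.3],
    [0.1, 0.1, 0.3, 1.0],
])

sum_var = v.sum()                 # undiversified: simple sum of category VaRs
agg_var = np.sqrt(v @ corr @ v)   # aggregated under the normal/correlation rule
diversification_effect = 1.0 - agg_var / sum_var
```

Understating the off-diagonal correlations' benefit (equivalently, using correlations closer to one than the data warrant) shrinks `diversification_effect` and inflates the aggregate VaR, which is the mechanism the hypothesis in the abstract appeals to.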

Relevance: 10.00%

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
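Historical Simulation, the method the abstract finds most popular, is simply an empirical quantile of past P&L. The sketch below uses synthetic Gaussian P&L (not any bank's data) to show the VaR calculation and the exceedance count used to assess accuracy:

```python
import numpy as np

def historical_var(pnl, level=0.99):
    """1-day VaR by Historical Simulation: the empirical loss quantile.

    pnl: array of historical daily P&L (positive = profit).
    Returns VaR as a positive number.
    """
    losses = -np.asarray(pnl)
    return np.quantile(losses, level)

def count_exceedances(pnl, var_series):
    # A VaR exceedance occurs when the realised loss is larger than
    # the VaR reported for that day.
    return int(np.sum(-np.asarray(pnl) > np.asarray(var_series)))

rng = np.random.default_rng(1)
pnl = rng.normal(0.0, 1.0, 500)          # synthetic daily P&L, sd = 1
var99 = historical_var(pnl, level=0.99)  # close to 2.33 for unit-normal P&L
```

A well-calibrated 99% VaR should be exceeded on roughly 1% of days; the abstract's point is that because Historical Simulation reacts slowly to changing volatility, the resulting VaR series carries little information about the volatility of subsequent trading revenues.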

Relevance: 10.00%

Abstract:

Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
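The surrogate-loss setup can be illustrated numerically: each of the standard convex surrogates (hinge for the support vector machine, exponential for boosting, logistic scaled to base 2 so its value at zero is 1) is a pointwise upper bound on the 0–1 loss as a function of the margin m = y·f(x). This check is an illustration only, not the quantitative ψ-transform relationship the paper develops:

```python
import numpy as np

def zero_one(m):      # 0-1 loss as a function of the margin m = y*f(x)
    return (m <= 0).astype(float)

def hinge(m):         # SVM surrogate
    return np.maximum(0.0, 1.0 - m)

def logistic(m):      # logistic surrogate, base 2 so logistic(0) = 1
    return np.log2(1.0 + np.exp(-m))

def exponential(m):   # boosting (AdaBoost) surrogate
    return np.exp(-m)

m = np.linspace(-2.0, 2.0, 401)
# Each convex surrogate dominates the 0-1 loss pointwise, which is what
# makes bounds on surrogate risk translate into bounds on 0-1 risk.
bounds_hold = all(np.all(phi(m) >= zero_one(m))
                  for phi in (hinge, logistic, exponential))
```

All three surrogates are convex and differentiable (or subdifferentiable) at 0 with negative slope, which is a sufficient condition for the pointwise Fisher consistency the abstract refers to.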

Relevance: 10.00%

Abstract:

One of the surprising recurring phenomena observed in experiments with boosting is that the test error of the generated classifier usually does not increase as its size becomes very large, and often is observed to decrease even after the training error reaches zero. In this paper, we show that this phenomenon is related to the distribution of margins of the training examples with respect to the generated voting classification rule, where the margin of an example is simply the difference between the number of correct votes and the maximum number of votes received by any incorrect label. We show that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error. We also show theoretically and experimentally that boosting is especially effective at increasing the margins of the training examples. Finally, we compare our explanation to those based on the bias-variance decomposition.
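The margin defined in this abstract is easy to compute directly. The sketch below (with made-up vote counts) normalises by the total number of votes, so a margin of 0.5 means the correct label beat the best wrong label by half the ensemble:

```python
import numpy as np

def voting_margins(votes, y):
    """Margins of a voting classifier.

    votes: (n_examples, n_labels) array of vote counts per label.
    y:     true label index for each example.
    margin = (votes for true label - max votes among wrong labels),
    normalised by the total number of votes.
    """
    votes = np.asarray(votes, dtype=float)
    idx = np.arange(len(y))
    correct = votes[idx, y]
    wrong = votes.copy()
    wrong[idx, y] = -np.inf          # exclude the true label
    best_wrong = wrong.max(axis=1)
    return (correct - best_wrong) / votes.sum(axis=1)

votes = np.array([[7, 2, 1],   # confidently correct
                  [4, 5, 1],   # misclassified: negative margin
                  [5, 5, 0]])  # tie: margin zero
y = np.array([0, 0, 0])
margins = voting_margins(votes, y)
```

A negative margin is exactly a training error, so "boosting increases the margins" is a strictly stronger statement than "boosting drives the training error to zero", which is how the paper explains test error continuing to fall afterwards.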

Relevance: 10.00%

Abstract:

We consider complexity penalization methods for model selection. These methods aim to choose a model to optimally trade off estimation and approximation errors by minimizing the sum of an empirical risk term and a complexity penalty. It is well known that if we use a bound on the maximal deviation between empirical and true risks as a complexity penalty, then the risk of our choice is no more than the approximation error plus twice the complexity penalty. There are many cases, however, where complexity penalties like this give loose upper bounds on the estimation error. In particular, if we choose a function from a suitably simple convex function class with a strictly convex loss function, then the estimation error (the difference between the risk of the empirical risk minimizer and the minimal risk in the class) approaches zero at a faster rate than the maximal deviation between empirical and true risks. In this paper, we address the question of whether it is possible to design a complexity penalized model selection method for these situations. We show that, provided the sequence of models is ordered by inclusion, in these cases we can use tight upper bounds on estimation error as a complexity penalty. Surprisingly, this is the case even in situations when the difference between the empirical risk and true risk (and indeed the error of any estimate of the approximation error) decreases much more slowly than the complexity penalty. We give an oracle inequality showing that the resulting model selection method chooses a function with risk no more than the approximation error plus a constant times the complexity penalty.
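The selection rule itself is mechanically simple: over a nested sequence of models, pick the index minimising empirical risk plus penalty. The numbers below are invented solely to show the typical trade-off (empirical risk falls with model size while the penalty grows); the paper's contribution is in what penalty sequence makes this rule come with an oracle inequality, not in the rule itself:

```python
import numpy as np

def select_model(emp_risks, penalties):
    # Complexity-penalised model selection: minimise the sum of the
    # empirical risk term and the complexity penalty.
    scores = np.asarray(emp_risks) + np.asarray(penalties)
    return int(np.argmin(scores))

# Hypothetical nested models F_1 c F_2 c ... (ordered by inclusion).
emp_risks = [0.400, 0.250, 0.200, 0.190, 0.185]
penalties = [0.010, 0.020, 0.050, 0.120, 0.250]
k = select_model(emp_risks, penalties)
```

Here the penalised score bottoms out at the third model even though empirical risk keeps (slightly) improving, which is exactly the over-fitting guard the penalty is there to provide.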

Relevance: 10.00%

Abstract:

We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
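An empirical Rademacher average can be estimated by Monte Carlo directly from the data, since it involves no unknown distribution. The sketch below computes the global version over a finite set of classifiers with random data; the paper's *local* version additionally restricts the supremum to functions with small empirical error, which is what yields the faster rates:

```python
import numpy as np

def empirical_rademacher(preds, n_rounds, rng):
    """Monte Carlo estimate of the empirical Rademacher average.

    preds: (n_functions, n_samples) values of each function on the sample.
    Estimates E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ],
    with sigma_i independent +/-1 signs.
    """
    _, n = preds.shape
    total = 0.0
    for _ in range(n_rounds):
        sigma = rng.choice([-1.0, 1.0], size=n)
        total += np.max(preds @ sigma) / n   # sup over the function class
    return total / n_rounds

rng = np.random.default_rng(0)
preds = rng.choice([-1.0, 1.0], size=(20, 100))  # 20 random +/-1 classifiers
r_hat = empirical_rademacher(preds, n_rounds=200, rng=rng)
```

For 20 unrelated ±1 classifiers on 100 points the estimate lands around 0.1–0.3: each signed average has standard deviation n^(-1/2) = 0.1, and the supremum over 20 of them inflates that by roughly the expected maximum of 20 standard normals.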

Relevance: 10.00%

Abstract:

In dynamic and uncertain environments such as healthcare, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. The uncertainty stems from the unpredictability of users’ operational needs as well as their private incentives to misuse permissions. In Role Based Access Control (RBAC), a user’s legitimate access request may be denied because its need has not been anticipated by the security administrator. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse the granted permission. This paper introduces a novel approach to access control under uncertainty and presents it in the context of RBAC. By taking insights from the field of economics, in particular the insurance literature, we propose a formal model where the value of resources is explicitly defined and an RBAC policy (entailing those predictable access needs) is only used as a reference point to determine the price each user has to pay for access, as opposed to representing hard and fast rules that are always rigidly applied.
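The "policy as a reference point for pricing" idea can be caricatured in a few lines. This is emphatically not the paper's formal model; the rates and the linear pricing rule below are invented to show the shape of the mechanism, in which no request is flatly denied but off-policy access carries a risk premium tied to the resource's value:

```python
def access_price(resource_value, authorised, base_rate=0.01, risk_load=0.10):
    """Toy pricing rule for access under uncertainty (illustrative only).

    Requests covered by the RBAC policy pay a small base rate; requests
    outside the policy are still possible but pay an extra risk load
    proportional to the value of the resource.
    """
    rate = base_rate if authorised else base_rate + risk_load
    return resource_value * rate

p_in = access_price(1000.0, authorised=True)    # on-policy access
p_out = access_price(1000.0, authorised=False)  # off-policy exception
```

The point of such a scheme is that the legitimate-but-unanticipated request of the abstract is merely expensive rather than denied, while the price internalises the expected cost of misuse.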

Relevance: 10.00%

Abstract:

In dynamic and uncertain environments, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. Risk-based approaches to access control attempt to address this problem by allocating a limited budget to users, through which they pay for the exceptions deemed necessary. So far the primary focus has been on how to incorporate the notion of budget into access control, rather than on whether there is an optimal amount of budget to allocate to users and, if so, what it is. In this paper we discuss the problems that arise from a sub-optimal allocation of budget and introduce a generalised characterisation of an optimal budget allocation function that maximises an organisation's expected benefit in the presence of self-interested employees and costly audit.

Relevance: 10.00%

Abstract:

Women are substantially under-represented in the professoriate in Australia with a ratio of one female professor to every three male professors. This gender imbalance has been an ongoing concern with various affirmative action programs implemented in universities but to limited effect. Hence, there is a need to investigate the catalysts for and inhibitors to women’s ascent to the professoriate. This investigation focussed on women appointed to the professoriate between 2005, when a research quality assessment was first proposed, and 2008. Henceforth, these women are referred to as “New Women Professors”. The catalysts and inhibitors in these women’s careers were investigated through an electronic survey and focus group interviews. The survey was administered to new women professors (n=255) and new men professors (n=240) to enable a comparison of responses. However, only women participated in focus group discussions (n=21). An analysis of the survey and interview data revealed that the most critical catalysts for women’s advancement to the professoriate were equal employment opportunities and mentoring. Equal opportunity initiatives provided women with access to traditionally male-dominated forums. Mentoring gave women an insider perspective on the complexity of academia and the politics of the academy. The key inhibitors to women’s career advancement were negative discrimination, the culture of the boys’ club, the tension between personal and professional life, and isolation. Negative discrimination and the boys’ club are problematic because they favour men and marginalise women. The tension between personal and professional life is a particular concern for women who bear children and typically assume the major role in a family for child rearing. Isolation was a concern for both women and men with isolation appearing to increase after ascent to the professoriate. 
Knowledge of the significant catalysts and inhibitors provides a pragmatic way to orient universities towards redressing the gender balance in the professoriate.

Relevance: 10.00%

Abstract:

The majority of Australians will work, sleep and die in the garments of the mass market. Yet, as Ian Griffiths (2000) has termed it, the designers of these garments are ‘invisible’. To the general public, the values, opinions and individual design processes of these designers are as unknown as their names. However, the designer’s role is crucial in making decisions which will have impacts throughout the life of the garment. The high product volume within the mass market ensures that even a small decision in the design process to source a particular fabric, or to use a certain trim or textile finish, can have a profound environmental or social effect. While big companies in Australia have implemented some visible strategies for sustainability, it is uncertain how these may have flowed through to design practices. To explore this question, this presentation will discuss preliminary findings from in-depth semi-structured interviews with Australian mass market fashion designers and product developers. The aim of the interviews was to hear the voice of the insider – to listen to mass market designers describe their design process, discuss the Australian fashion industry and its future challenges and opportunities, and comment on what ‘sustainability’ for their industry could look like. These interviews will be discussed within the framework of design philosopher Tony Fry’s writing on design redirection for sustainability.

Relevance: 10.00%

Abstract:

Two-party key exchange (2PKE) protocols have been rigorously analyzed under various models considering different adversarial actions. However, the analysis of group key exchange (GKE) protocols has not been as extensive as that of 2PKE protocols. Particularly, an important security attribute called key compromise impersonation (KCI) resilience has been completely ignored for the case of GKE protocols. Informally, a protocol is said to provide KCI resilience if the compromise of the long-term secret key of a protocol participant A does not allow the adversary to impersonate an honest participant B to A. In this paper, we argue that KCI resilience for GKE protocols is at least as important as it is for 2PKE protocols. Our first contribution is revised definitions of security for GKE protocols considering KCI attacks by both outsider and insider adversaries. We also give a new proof of security for an existing two-round GKE protocol under the revised security definitions, assuming random oracles. We then show how to achieve insider KCI resilience in a generic way using a known compiler in the literature. As one may expect, this additional security assurance comes at the cost of an extra round of communication. Finally, we show that a few existing protocols are not secure against outsider KCI attacks. The attacks on these protocols illustrate the necessity of considering KCI resilience for GKE protocols.

Relevance: 10.00%

Abstract:

A considerable body of knowledge has been constructed perpetuating the notion that single parenthood is a significant problem for society, and while this is supported by specific research designs and sampling practices, it is also maintained by two key discourses. The first constitutes single parenthood as a deficit, while the second identifies it as a risk. In both cases, these discourses are operationalised by the philosophy of neo-liberalism, which envisions good citizenship as economic autonomy. Historically, it has been the convergence of the risk and deficit discourses that has constituted single parenthood as a social problem. More recently, however, risk discourses have come to dominate thinking about single parenthood. As a result, this thesis terms risk discourses "dominant discourses". As dominant discourses, they sideline or discount other ways of thinking about single parenthood. While a few exceptions are notable, including some feminist, poststructural and family resilience scholars, most researchers appear unable to see past the positioning of these discourses and envision another way of being for parents who are single. This means that alternative subjectivities are obscured and have limited influence in this field of research. Because this thesis aimed to problematise dominant subjectivities of single parenthood, a poststructural Foucauldian framework has been utilized in order to document the discursive constructions of single parenthood through literature, insider discourses, and outsider discourses. For the purposes of this thesis, outsider discourses are constituted as those outside the subjectivities of single parenthood, such as media and research discourses. An examination of the Australian media has been undertaken over a one-year period, the results of which form the basis for the analysis of media discourses of single parenthood.
Parents who are single were also targeted for self-selection into this project to provide insider discourses about single parenthood. This analysis explored how respondents negotiated the discourses of single parenthood and how they themselves used or rejected the subjectivities constructed for them via these discourses to constitute their own subjectivities. This thesis aimed to explore the role of discourses in the construction of individuals' subjectivities. Specifically, it draws attention to the way in which knowledge and power work through discourses to emphasize what is allowable, both publicly and privately, in relation to single parenthood. Most importantly, this thesis offers alternative subjectivities for single parenthood to facilitate new ways of thinking about parents who are single.