999 results for Copula theory


Relevance:

70.00%

Publisher:

Abstract:

Overland rain retrieval using spaceborne microwave radiometers poses a myriad of complications, as land presents itself as a radiometrically warm and highly variable background. Hence, land rainfall algorithms of the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) have traditionally incorporated empirical relations between microwave brightness temperature (Tb) and rain rate, rather than relying on physically based radiative transfer modeling of rainfall (as implemented in the TMI ocean algorithm). In this paper, a sensitivity analysis is conducted, using the Spearman rank correlation coefficient as the benchmark, to identify the combination of TMI low-frequency channels that is most sensitive to the near-surface rainfall rate from the TRMM Precipitation Radar (PR). Results indicate that the TMI channel combinations not only contain information about rainfall where liquid water drops are the dominant hydrometeors but also aid in surface noise reduction over a predominantly vegetated land surface background. Furthermore, the variations of the rainfall signature in these channel combinations are not properly understood, owing to their inherent uncertainties and highly nonlinear relationship with rainfall. Copula theory is a powerful tool for characterizing the dependence between complex hydrological variables and for uncertainty modeling through ensemble generation. Hence, this paper proposes a regional model using Archimedean copulas to study the dependence of TMI channel combinations on precipitation over the land regions of the Mahanadi basin, India, using version 7 orbital data from the passive and active sensors on board TRMM, namely TMI and PR. Studies conducted for different rainfall regimes over the study area show the suitability of the Clayton and Gumbel copulas for modeling convective and stratiform rainfall types for the majority of the intraseasonal months. Furthermore, large ensembles of TMI Tb (from the most sensitive TMI channel combination) were generated conditional on various quantiles (25th, 50th, 75th, and 95th) of the convective and the stratiform rainfall. Comparatively greater ambiguity was observed when modeling extreme values of the convective rain type. Finally, the efficiency of the proposed model was tested by comparing its results with those of traditionally employed linear and quadratic models. Results reveal the superior performance of the proposed copula-based technique.
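As a rough illustration of the copula machinery this abstract describes, the sketch below fits a Clayton copula to synthetic, collocated rain-rate/brightness-temperature pairs by inverting Kendall's tau and then draws a Tb ensemble conditional on a chosen rainfall quantile. All variable names and the synthetic data are assumptions for illustration; they do not reproduce the paper's TMI/PR processing chain.

```python
# Illustrative sketch (not the paper's algorithm): Clayton-copula dependence
# between a TMI channel combination (brightness temperature, Tb) and PR
# near-surface rain rate, with conditional ensemble generation.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)

# Synthetic stand-ins for collocated TMI/PR observations (hypothetical units).
rain_rate = rng.gamma(shape=2.0, scale=3.0, size=5000)        # mm/h
tb_combo = 280.0 - 2.5 * rain_rate + rng.normal(0, 4, 5000)   # K

# Clayton models positive dependence; flip the Tb margin if its rank
# correlation with rain rate is negative (a simple rotation for this sketch).
tau, _ = kendalltau(rain_rate, tb_combo)
sign = 1.0 if tau >= 0 else -1.0

# Fit the Clayton parameter by inverting Kendall's tau: theta = 2*tau / (1 - tau).
theta = 2.0 * abs(tau) / (1.0 - abs(tau))

def clayton_conditional_sample(u0, theta, size, rng):
    """Draw V | U = u0 from the Clayton copula via its conditional inverse."""
    w = rng.uniform(size=size)
    return (u0 ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)

# Ensemble of Tb conditional on the 75th rainfall percentile (one of the
# quantiles mentioned in the abstract), mapped back through the empirical margin.
v_cond = clayton_conditional_sample(u0=0.75, theta=theta, size=1000, rng=rng)
tb_ensemble = sign * np.quantile(sign * tb_combo, v_cond)
print(f"tau = {tau:.2f}, theta = {theta:.2f}, ensemble mean Tb = {tb_ensemble.mean():.1f} K")
```

A Gumbel copula could be substituted by swapping in its own tau-to-parameter relation (theta = 1 / (1 - tau)) and conditional distribution.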

Relevance:

70.00%

Publisher:

Abstract:

In this review paper, we collect several results on copula-based models, especially regression models, focusing on insurance applications.

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Applied and Computational Mathematics - FCT

Relevance:

30.00%

Publisher:

Abstract:

Optimal design for generalized linear models has primarily focused on univariate data. Experiments are often performed, however, with multiple dependent responses described by regression-type models, and it is of interest and of value to design the experiment for all of these responses. This requires a multivariate distribution underlying a pre-chosen model for the data. Here, we consider the design of experiments for bivariate binary data which are dependent. We explore copula functions, which provide a rich and flexible class of structures for deriving joint distributions for bivariate binary data. We present methods for deriving optimal experimental designs for dependent bivariate binary data using copulas, and demonstrate that, by including the dependence between responses in the design process, more efficient parameter estimates are obtained than by the usual practice of designing for a single variable only. Further, we investigate the robustness of designs with respect to initial parameter estimates and the choice of copula function, and also examine the performance of compound criteria within this bivariate binary setting.
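As a rough sketch of how a copula can supply the joint distribution needed for such designs, the code below builds the four cell probabilities of two dependent binary responses from logistic marginals joined by a Clayton copula, and scores a candidate design by the D-criterion (log-determinant of the Fisher information). The parameter values, the choice of Clayton copula, and the candidate design are illustrative assumptions, not the designs or criterion values from the paper.

```python
# Illustrative sketch: copula-based joint distribution for dependent bivariate
# binary responses and a D-criterion evaluation for a candidate design.
import numpy as np
from scipy.special import expit

def clayton_cdf(u, v, theta):
    """Clayton copula C(u, v) for theta > 0."""
    return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

def cell_probs(x, beta, theta):
    """Joint probabilities of (Y1, Y2) in {0,1}^2 at design point x."""
    a1, b1, a2, b2 = beta
    p1, p2 = expit(a1 + b1 * x), expit(a2 + b2 * x)   # marginal success probabilities
    c = clayton_cdf(1.0 - p1, 1.0 - p2, theta)        # P(Y1 = 0, Y2 = 0)
    pi00 = c
    pi01 = (1.0 - p1) - c
    pi10 = (1.0 - p2) - c
    pi11 = 1.0 - pi00 - pi01 - pi10
    return np.array([pi00, pi01, pi10, pi11])

def fisher_info(design, weights, beta, theta, eps=1e-6):
    """Multinomial Fisher information, with finite-difference gradients."""
    k = len(beta)
    M = np.zeros((k, k))
    for x, w in zip(design, weights):
        pi = cell_probs(x, beta, theta)
        grad = np.zeros((4, k))
        for j in range(k):
            bp, bm = np.array(beta, float), np.array(beta, float)
            bp[j] += eps
            bm[j] -= eps
            grad[:, j] = (cell_probs(x, bp, theta) - cell_probs(x, bm, theta)) / (2 * eps)
        M += w * sum(np.outer(grad[c], grad[c]) / pi[c] for c in range(4))
    return M

beta = [0.5, 1.0, -0.5, 1.5]               # assumed prior guesses for the marginal logits
theta = 2.0                                # assumed Clayton dependence (Kendall tau = 0.5)
design = np.array([-1.0, 0.0, 1.0])        # candidate support points
weights = np.array([1 / 3, 1 / 3, 1 / 3])  # equal design weights

M = fisher_info(design, weights, beta, theta)
print("D-criterion (log det M):", np.linalg.slogdet(M)[1])
```

Comparing this criterion value against one computed under near-independence (theta close to zero) is one simple way to see the efficiency effect of modeling the dependence at the design stage.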

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, likelihood depths, introduced by Mizera and Müller (2004), are used to develop (outlier-)robust estimators and tests for the unknown parameter of a continuous density function. The developed procedures are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in the data set is computed as the minimum of the proportion of observations for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the proportion for which this derivative is non-positive. The parameter for which both proportions are equal therefore has the greatest depth; it is initially chosen as the estimator, since the likelihood depth is meant to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood function with respect to the parameter is non-negative for an observation equals one half. If this does not hold for the underlying parameter, the estimator based on the likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To construct tests for the parameter, the simplex likelihood depth developed by Müller (2005), which is a U-statistic, is used. It turns out that, for the same distributions for which the likelihood depth yields biased estimators, the simplex likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is therefore known, and tests for various hypotheses can be formulated. For some hypotheses, however, the shift in depth leads to poor power of the corresponding test; corrected tests are therefore introduced, together with conditions under which they are consistent. The thesis consists of two parts. The first part presents the general theory of the estimators and tests and proves their consistency. The second part applies the theory to three distributions: the Weibull distribution and the Gaussian and Gumbel copulas. This demonstrates how the procedures of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of each distribution. Overall, likelihood depths yield robust estimators and tests for all three distributions. On uncontaminated data, existing standard methods are sometimes superior, but the advantage of the new methods becomes apparent on contaminated data and data containing outliers.
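A minimal sketch of the likelihood-depth definition quoted in this abstract, assuming an exponential density as a simple one-parameter stand-in (the thesis itself treats the Weibull distribution and the Gaussian and Gumbel copulas): the depth of a parameter value is the minimum of the shares of observations with non-negative and non-positive score, and the maximum-depth value over a grid serves as the raw estimator.

```python
# Illustrative sketch of the one-dimensional likelihood depth:
# depth(theta) = min( share of data with score >= 0, share with score <= 0 ).
import numpy as np

def likelihood_depth(theta, data, score):
    """Likelihood depth of a parameter value in the data set."""
    s = score(theta, data)          # derivative of the log-likelihood per observation
    return min(np.mean(s >= 0), np.mean(s <= 0))

# Score of the exponential density f(x; lam) = lam * exp(-lam * x):
# d/dlam log f = 1/lam - x.
def exp_score(lam, x):
    return 1.0 / lam - x

rng = np.random.default_rng(1)
data = rng.exponential(scale=1.0 / 2.0, size=500)   # true lam = 2

grid = np.linspace(0.5, 5.0, 1000)
depths = np.array([likelihood_depth(lam, data, exp_score) for lam in grid])
lam_depth = grid[np.argmax(depths)]                 # maximum-depth estimator

# For the exponential, the maximum-depth estimator solves median(x) = 1/lam and
# therefore targets lam / ln(2) rather than lam -- the kind of bias the abstract
# says must be corrected to obtain a consistent estimator.
print(f"max-depth estimate: {lam_depth:.2f}, bias-corrected: {lam_depth * np.log(2):.2f}")
```

The printed correction (multiplying by ln 2) is specific to this exponential toy example; the thesis derives the appropriate corrections for its own distributions.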

Relevance:

20.00%

Publisher:

Abstract:

Women with a disability continue to experience social oppression and domestic violence as a consequence of both gender and disability dimensions. Current explanations of domestic violence and disability inadequately account for several features that lead women who have a disability to experience violent situations. This article incorporates both disability theory and material feminist theory as an alternative explanation to the dominant approaches (the psychological and sociological traditions) to conceptualising domestic violence. The paper is informed by a study that examined the nature and perceptions of violence against women with a physical impairment. The emerging analytical framework, integrating material feminist interpretations and disability theory, provided a basis for exploring gender and disability dimensions. Insight was also provided by the women in the study who identified as having a disability and who explained domestic violence in terms of a gendered and disabling experience. The article argues that material feminist interpretations and disability theory, with their emphasis on gender relations, disablism and poverty, should be used as an alternative tool for exploring the nature and consequences of violence against women with a disability.

Relevance:

20.00%

Publisher:

Abstract:

This study develops a life-cycle model in which investors make investment decisions in a realistic environment. Model results show that personal illiquid projects (housing and children), fixed costs (once-off and per-period participation costs plus variable and fixed transaction costs) and endogenous risky human capital (with permanent, transitory and disastrous shocks) together are able to address both the non-participation puzzle and the age-effects puzzle. Empirical implications of the model are examined using Heckman’s two-step method with the latest five Surveys of Consumer Finances (SCF). Regression results show that liquidity, informational costs and human capital are indeed the major determinants of participation and asset-allocation decisions at different stages of an investor’s life.
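As a rough illustration of the estimation strategy named here, the sketch below runs a textbook Heckman two-step on synthetic data: a probit participation equation, an inverse Mills ratio, and a second-stage regression of the risky asset share for participants only. The covariates (age, log wealth) and all coefficients are hypothetical stand-ins, not the study's SCF specification.

```python
# Illustrative sketch of Heckman's two-step estimator on synthetic data.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 2000
age = rng.uniform(25, 75, n)
log_wealth = rng.normal(11, 1.5, n)

# Synthetic participation (selection) and risky-share (outcome) equations
# with correlated errors, so that selection actually matters.
u, e = rng.multivariate_normal([0, 0], [[1.0, 0.1], [0.1, 0.04]], n).T
participate = (0.4 * log_wealth - 0.02 * age - 3.5 + u > 0).astype(float)
risky_share = 0.05 * log_wealth - 0.004 * age + 0.1 + e

# Step 1: probit for the participation decision, then the inverse Mills ratio.
X_sel = sm.add_constant(np.column_stack([log_wealth, age]))
probit_res = sm.Probit(participate, X_sel).fit(disp=0)
xb = X_sel @ probit_res.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS of the risky share on covariates plus the Mills ratio,
# restricted to households that participate.
mask = participate == 1
X_out = sm.add_constant(np.column_stack([log_wealth[mask], age[mask], imr[mask]]))
ols_res = sm.OLS(risky_share[mask], X_out).fit()
print(ols_res.params)   # last coefficient: selection-correction term
```

The Mills-ratio term absorbs the correlation between the participation and allocation errors, which is what makes the second-stage coefficients consistent under selection.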

Relevance:

20.00%

Publisher:

Abstract:

The issue of ‘rigour vs. relevance’ in IS research has generated an intense, heated debate for over a decade. It is possible to identify, however, only a limited number of contributions on how to increase the relevance of IS research without compromising its rigour. Based on a lifecycle view of IS research, we propose the notion of ‘reality checks’ to review IS research outcomes in the light of actual industry demands. We assume that five barriers impede the efficient transfer of IS research outcomes: lack of awareness, lack of understandability, lack of relevance, lack of timeliness, and lack of applicability. In seeking to understand the effect of these barriers on the transfer of mature IS research into practice, we used focus groups. We chose DeLone and McLean’s IS success model as our stimulus because it is one of the more widely researched areas of IS.