959 results for Unconditional and Conditional Grants


Relevance:

100.00%

Publisher:

Abstract:

The Dirichlet family owes its privileged status within simplex distributions to its ease of interpretation and good mathematical properties. In particular, we recall fundamental properties for the analysis of compositional data, such as closure under amalgamation and subcomposition. From a probabilistic point of view, it is characterised (uniquely) by a variety of independence relationships, which makes it indisputably the reference model for expressing the non-trivial idea of substantial independence for compositions. Indeed, its well-known inadequacy as a general model for compositional data stems from this independence structure together with the poverty of its parametrisation. In this paper a new class of distributions (called the Flexible Dirichlet), capable of handling various dependence structures and containing the Dirichlet as a special case, is presented. The new model exhibits a considerably richer parametrisation which, for example, allows the means and (part of) the variance-covariance matrix to be modelled separately. Moreover, the model preserves some good mathematical properties of the Dirichlet, i.e. closure under amalgamation and subcomposition, with new parameters simply related to the parent composition parameters. Furthermore, the joint and conditional distributions of subcompositions and relative totals can be expressed as simple mixtures of two Flexible Dirichlet distributions. The basis generating the Flexible Dirichlet, though retaining compositional invariance, shows a dependence structure which allows various forms of partitional dependence to be contemplated by the model (e.g. non-neutrality, subcompositional dependence and subcompositional non-invariance), independence cases being identified by suitable parameter configurations. In particular, within this model substantial independence among subsets of components of the composition naturally occurs when the subsets have a Dirichlet distribution.
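
As a side note for readers who want to experiment with the model, the sketch below simulates from a Flexible Dirichlet assuming its representation as a finite mixture of Dirichlet distributions in which the i-th mixture component shifts the i-th Dirichlet parameter by an extra amount tau. The names alpha, p and tau are illustrative rather than the paper's notation, and setting tau = 0 recovers an ordinary Dirichlet.

```python
# Sketch only: simulate a Flexible Dirichlet as a finite mixture of
# Dirichlet distributions, Dir(alpha + tau * e_i) chosen with probability p_i.
import numpy as np

def rflexdir(n, alpha, p, tau, seed=None):
    """Draw n compositions from a Flexible Dirichlet FD(alpha, p, tau)."""
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    p = np.asarray(p, dtype=float)
    D = alpha.size
    # Choose, for each draw, which component receives the extra mass tau.
    comp = rng.choice(D, size=n, p=p)
    shifted = alpha + tau * np.eye(D)[comp]        # n x D parameter matrix
    # A Dirichlet draw is a normalised vector of independent Gamma variates.
    g = rng.gamma(shape=shifted)
    return g / g.sum(axis=1, keepdims=True)

if __name__ == "__main__":
    x = rflexdir(5, alpha=[2.0, 3.0, 4.0], p=[0.2, 0.3, 0.5], tau=2.5, seed=0)
    print(x)                 # each row is a composition summing to 1
    print(x.sum(axis=1))
```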

Relevance:

100.00%

Publisher:

Abstract:

Lecture notes in PDF

Relevance:

100.00%

Publisher:

Abstract:

Exam questions and solutions in PDF

Relevance:

100.00%

Publisher:

Abstract:

Lecture notes in LaTeX

Relevance:

100.00%

Publisher:

Abstract:

Exercises and solutions in LaTeX

Relevance:

100.00%

Publisher:

Abstract:

Exam questions and solutions in LaTeX

Relevance:

100.00%

Publisher:

Abstract:

Introduction. Major depression is a common and complex disease of polygenic origin. Given its importance in the pathophysiology and treatment of the disease, the gene encoding the serotonin transporter (5-HTT) has been shown to be associated with the development of the illness. A study was carried out to evaluate the association between 5-HTT gene polymorphisms and major depressive disorder. Methods. A 1:1 matched case-control study. Participants were classified using the DSM-IV-TR structured interview. Results were analysed with McNemar's odds ratio, chi-squared and matched exact tests, and conditional logistic regression was used. Hardy-Weinberg equilibrium was assessed with Pearson's chi-squared test. Results. Sixty-nine cases and 69 controls were evaluated, whose socio-demographic and clinical characteristics were similar to those previously reported in the literature. The sample was in Hardy-Weinberg equilibrium. No statistically significant association was found between major depressive disorder and 5-HTT gene polymorphisms overall, although an association was found in subjects aged 37 years and younger. Conclusion. The results are similar to those previously reported by other studies in Colombian patients with bipolar disorder, which suggests that in this population there is no association between affective disorders and 5-HTT gene polymorphisms. Further studies in this area are needed.
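
A minimal sketch of the Hardy-Weinberg check mentioned in the Methods, assuming a biallelic locus and Pearson's chi-squared statistic with one degree of freedom; the genotype counts in the example are hypothetical, not the study's data.

```python
# Sketch only: Pearson chi-square test for Hardy-Weinberg equilibrium
# at a biallelic locus, with illustrative genotype counts.
import numpy as np
from scipy.stats import chi2

def hardy_weinberg_chi2(n_AA, n_Aa, n_aa):
    """Return (chi2 statistic, p-value) for a biallelic locus (1 df)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)                 # frequency of allele A
    expected = np.array([p**2, 2 * p * (1 - p), (1 - p)**2]) * n
    observed = np.array([n_AA, n_Aa, n_aa], dtype=float)
    stat = ((observed - expected) ** 2 / expected).sum()
    return stat, chi2.sf(stat, df=1)

if __name__ == "__main__":
    stat, pval = hardy_weinberg_chi2(20, 35, 14)    # hypothetical counts
    print(f"chi2 = {stat:.3f}, p = {pval:.3f}")
```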

Relevance:

100.00%

Publisher:

Abstract:

In this work a methodology for including higher-order moments in portfolio selection is implemented, using the Generalized Hyperbolic Distribution, and a comparative analysis against the Markowitz model is then carried out.
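
For context, the sketch below sets up the Markowitz mean-variance benchmark that the higher-moment / Generalized Hyperbolic allocation is compared against; the risk-aversion value, the long-only constraint and the simulated returns are illustrative assumptions rather than details taken from the paper.

```python
# Sketch only: long-only Markowitz mean-variance benchmark portfolio.
import numpy as np
from scipy.optimize import minimize

def markowitz_weights(returns, risk_aversion=3.0):
    """Weights maximising mean - 0.5 * risk_aversion * variance, long-only."""
    mu = returns.mean(axis=0)
    cov = np.cov(returns, rowvar=False)
    n = mu.size

    def neg_utility(w):
        return -(w @ mu - 0.5 * risk_aversion * w @ cov @ w)

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * n
    res = minimize(neg_utility, np.full(n, 1.0 / n),
                   bounds=bounds, constraints=cons, method="SLSQP")
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    simulated = rng.normal(0.001, 0.02, size=(500, 4))   # 500 days, 4 assets
    print(markowitz_weights(simulated).round(3))
```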

Relevance:

100.00%

Publisher:

Abstract:

The structure of turbulent flow over large roughness consisting of regular arrays of cubical obstacles is investigated numerically under constant pressure gradient conditions. Results are analysed in terms of first- and second-order statistics, by visualization of instantaneous flow fields and by conditional averaging. The accuracy of the simulations is established by detailed comparisons of first- and second-order statistics with wind-tunnel measurements. Coherent structures in the log region are investigated. Structure angles are computed from two-point correlations, and quadrant analysis is performed to determine the relative importance of Q2 and Q4 events (ejections and sweeps) as a function of height above the roughness. Flow visualization shows the existence of low-momentum regions (LMRs) as well as vortical structures throughout the log layer. Filtering techniques are used to reveal instantaneous examples of the association of the vortices with the LMRs, and linear stochastic estimation and conditional averaging are employed to deduce their statistical properties. The conditional averaging results reveal the presence of LMRs and regions of Q2 and Q4 events that appear to be associated with hairpin-like vortices, but a quantitative correspondence between the sizes of the vortices and those of the LMRs is difficult to establish; a simple estimate of the ratio of the vortex width to the LMR width gives a value that is several times larger than the corresponding ratio over smooth walls. The shape and inclination of the vortices and their spatial organization are compared to recent findings over smooth walls. Characteristic length scales are shown to scale linearly with height in the log region. Whilst there are striking qualitative similarities with smooth walls, there are also important differences in detail regarding: (i) structure angles and sizes and their dependence on distance from the rough surface; (ii) the flow structure close to the roughness; (iii) the roles of inflows into and outflows from cavities within the roughness; (iv) larger vortices on the rough wall compared to the smooth wall; (v) the effect of the different generation mechanism at the wall in setting the scales of structures.
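
A minimal sketch of the quadrant analysis referred to above: fluctuating streamwise and wall-normal velocities are sorted into quadrants and each quadrant's contribution to the Reynolds shear stress is computed, with Q2 (u' < 0, w' > 0) identified with ejections and Q4 (u' > 0, w' < 0) with sweeps. The synthetic, negatively correlated signal is illustrative only.

```python
# Sketch only: quadrant decomposition of the u'w' covariance.
import numpy as np

def quadrant_fractions(u_fluc, w_fluc):
    """Fraction of the total u'w' covariance carried by quadrants Q1..Q4."""
    uw = u_fluc * w_fluc
    masks = {
        "Q1": (u_fluc > 0) & (w_fluc > 0),
        "Q2": (u_fluc < 0) & (w_fluc > 0),   # ejections
        "Q3": (u_fluc < 0) & (w_fluc < 0),
        "Q4": (u_fluc > 0) & (w_fluc < 0),   # sweeps
    }
    return {q: uw[m].sum() / uw.sum() for q, m in masks.items()}

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 100_000
    u = rng.normal(size=n)
    w = -0.4 * u + rng.normal(scale=0.9, size=n)   # negatively correlated signal
    print(quadrant_fractions(u, w))
```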

Relevance:

100.00%

Publisher:

Abstract:

In previous empirical and modelling studies of rare species and weeds, evidence of fractal behaviour has been found. We propose that weeds in modern agricultural systems may be managed close to critical population dynamic thresholds, below which their rates of increase will be negative and where scale-invariance may be expected as a consequence. We collected detailed spatial data on five contrasting species over a period of three years in a primarily arable field. Counts in 20×20 cm contiguous quadrats, 225,000 in 1998 and 84,375 thereafter, could be re-structured into a wide range of larger quadrat sizes. These were analysed using three methods based on correlation sum, incidence and conditional incidence. We found non-trivial scale invariance for species occurring at low mean densities and where they were strongly aggregated. The fact that the scale-invariance was not found for widespread species occurring at higher densities suggests that the scaling in agricultural weed populations may, indeed, be related to critical phenomena.
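
A minimal sketch of the incidence-based scaling check described above: counts in the smallest quadrats are block-aggregated into successively larger quadrats and the fraction of occupied quadrats is recorded at each scale, so that approximate power-law behaviour would show up as a straight line in log-log coordinates. The clustered random field is an illustrative stand-in for the survey data.

```python
# Sketch only: occupancy (incidence) as a function of quadrat size.
import numpy as np

def incidence_by_scale(counts, factors):
    """Occupancy fraction after block-aggregating a 2-D count grid."""
    out = {}
    for f in factors:
        ny, nx = (counts.shape[0] // f) * f, (counts.shape[1] // f) * f
        blocks = counts[:ny, :nx].reshape(ny // f, f, nx // f, f).sum(axis=(1, 3))
        out[f] = (blocks > 0).mean()
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Sparse, aggregated "weed" field: a few dense patches on an empty grid.
    field = np.zeros((600, 600), dtype=int)
    for _ in range(40):
        y, x = rng.integers(0, 580, size=2)
        field[y:y + 20, x:x + 20] += rng.poisson(0.3, size=(20, 20))
    for scale, occ in incidence_by_scale(field, [1, 2, 4, 8, 16, 32]).items():
        print(f"quadrat side = {scale:>2} cells  incidence = {occ:.4f}")
```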

Relevance:

100.00%

Publisher:

Abstract:

A new Bayesian algorithm for retrieving surface rain rate from Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes’s theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the “true” conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect of the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of the estimators and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
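
A minimal sketch of the Bayesian retrieval idea, assuming a discretised rain-rate grid on which the posterior is computed as likelihood times prior and then normalised; the lognormal-shaped prior and the saturating brightness-temperature forward relation are hypothetical stand-ins for the algorithm's actual databases, but they reproduce the weak high-rain-rate constraint discussed in the abstract.

```python
# Sketch only: posterior distribution of rain rate R given an observed
# brightness temperature Tb, on a discrete grid.
import numpy as np

rain = np.linspace(0.0, 50.0, 501)                    # rain-rate grid (mm/h)

# Hypothetical prior: most scenes are dry or lightly raining.
prior = np.exp(-0.5 * ((np.log(rain + 0.1) - 0.5) / 1.0) ** 2)
prior /= prior.sum()

def forward_tb(r):
    """Hypothetical forward model: Tb saturates at high rain rates."""
    return 180.0 + 80.0 * (1.0 - np.exp(-r / 10.0))

def posterior(tb_obs, tb_sigma=3.0):
    """Posterior probability of each grid rain rate given an observed Tb."""
    like = np.exp(-0.5 * ((tb_obs - forward_tb(rain)) / tb_sigma) ** 2)
    post = like * prior
    return post / post.sum()

if __name__ == "__main__":
    post = posterior(tb_obs=235.0)
    mean = (rain * post).sum()
    print(f"posterior mean rain rate: {mean:.2f} mm/h")
    # Saturation of Tb at high R is what weakens the observational constraint
    # there and produces the high-rain-rate biases discussed in the abstract.
```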

Relevance:

100.00%

Publisher:

Abstract:

This article examines the characteristics of key measures of volatility for different types of futures contracts to provide a better foundation for modeling volatility behavior and derivative values. Particular attention is focused on analyzing how different measures of volatility affect volatility persistence relationships. Intraday realized measures of volatility are found to be more persistent than daily measures, the type of GARCH procedure used for conditional volatility analysis is critical, and realized volatility persistence is not coherent with conditional volatility persistence. Specifically, although there is a good fit between the realized and conditional volatilities, no coherence exists between their degrees of persistence, a counterintuitive finding that shows realized and conditional volatility measures are not substitutes for one another.
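
A minimal sketch of the persistence comparison, assuming the third-party arch package for the GARCH(1,1) fit; realized-variance persistence is summarised here by the lag-one autocorrelation of log realized variance, which is a simplification of the paper's analysis, and the simulated five-minute returns are illustrative only.

```python
# Sketch only: compare realized-volatility persistence with GARCH(1,1)
# conditional-volatility persistence on simulated intraday returns.
import numpy as np
from arch import arch_model   # assumed third-party dependency

rng = np.random.default_rng(3)
n_days, intraday = 1000, 78                        # ~5-minute bars per day

# Simulate returns with slowly varying volatility.
log_vol = np.cumsum(rng.normal(scale=0.05, size=n_days))
sigma = 0.01 * np.exp(log_vol - log_vol.mean())
r_intraday = rng.normal(scale=sigma[:, None] / np.sqrt(intraday),
                        size=(n_days, intraday))

# Realized variance and its persistence (lag-1 autocorrelation of log RV).
rv = (r_intraday ** 2).sum(axis=1)
log_rv = np.log(rv)
rv_persistence = np.corrcoef(log_rv[:-1], log_rv[1:])[0, 1]

# GARCH(1,1) on daily percent returns; persistence = alpha + beta.
daily = 100 * r_intraday.sum(axis=1)
res = arch_model(daily, vol="GARCH", p=1, q=1).fit(disp="off")
garch_persistence = res.params["alpha[1]"] + res.params["beta[1]"]

print(f"realized-volatility persistence (lag-1 AC of log RV): {rv_persistence:.3f}")
print(f"GARCH(1,1) persistence (alpha + beta):                {garch_persistence:.3f}")
```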

Relevance:

100.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill. Nor is there any agreed protocol for estimating their skill. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts. These questions are: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to un-initialized climate change projections? and (2) Is the prediction model’s ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts and comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on skill of CMIP5 decadal hindcasts is not the aim of this paper. The results presented are only illustrative of the framework, which would enable such studies. However, broad conclusions that are beginning to emerge from the CMIP5 results include (1) Most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems. The framework can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how to best approach forecast uncertainty.
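
A minimal sketch of two verification ideas of the kind advocated above: a mean-squared-error skill score comparing initialized hindcasts with an uninitialized reference, and a spread/error ratio checking whether ensemble spread matches the ensemble-mean error on average. The synthetic data and the specific metric choices are illustrative, not the paper's exact framework.

```python
# Sketch only: MSE skill score (initialized vs uninitialized) and
# spread/error ratio for a small synthetic hindcast ensemble.
import numpy as np

def msss(fcst_init, fcst_uninit, obs):
    """MSE skill score of initialized vs uninitialized hindcasts (1 = perfect)."""
    mse_init = np.mean((fcst_init - obs) ** 2)
    mse_uninit = np.mean((fcst_uninit - obs) ** 2)
    return 1.0 - mse_init / mse_uninit

def spread_error_ratio(ensemble, obs):
    """Mean ensemble spread over RMSE of the ensemble mean (~1 is well calibrated)."""
    ens_mean = ensemble.mean(axis=0)
    rmse = np.sqrt(np.mean((ens_mean - obs) ** 2))
    spread = np.sqrt(np.mean(ensemble.var(axis=0, ddof=1)))
    return spread / rmse

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    truth = rng.normal(size=50)                          # 50 hindcast start dates
    ens = truth + rng.normal(scale=0.8, size=(10, 50))   # 10-member ensemble
    uninit = rng.normal(scale=1.0, size=50)              # no initial-condition info
    print(f"MSSS:               {msss(ens.mean(axis=0), uninit, truth):.3f}")
    print(f"spread/error ratio: {spread_error_ratio(ens, truth):.3f}")
```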

Relevance:

100.00%

Publisher:

Abstract:

An idealised modelling study of sting-jet cyclones is presented. Sting jets are descending mesoscale jets that occur in some extratropical cyclones and produce localised regions of strong low-level winds in the frontal fracture region. Moist baroclinic lifecycle (LC1) simulations are performed with modifications to produce cyclones resembling observed sting-jet cyclones. A sting jet exists in the idealised control cyclone with similar characteristics to the sting jet in a simulation of windstorm Gudrun (a confirmed sting-jet case). Unlike in windstorm Gudrun, a low-level layer of strong moist static stability prohibits the descent of the strong winds from above the boundary layer to the surface in the idealised case. Conditional symmetric instability (CSI) exists in the cloud head and dissipates as the sting jet leaves the cloud head and descends. The descending, initially moist, sting-jet trajectories consistently have negative or near-zero saturated moist potential vorticity but moist static stability and inertial stability, consistent with CSI release; the moist static stability becomes negative during the period of most rapid descent, by which time the air is relatively dry implying conditional instability release is unlikely. Sensitivity experiments show that the existence of the sting jet is robust to changes in the initial state, and that the initial tropospheric static stability significantly impacts the descent rate of the sting jet. Inertial and conditional instability are probably being released in the experiment with the weakest initial static stability. This suggests that sting jets can arise through the release of all three instabilities associated with negative saturated moist potential vorticity. While evaporative cooling occurs along the sting-jet trajectories, a sensitivity experiment with evaporation effects turned off shows no significant change to the wind strength or descent rate of the sting jet implying that instability release is the dominant sting-jet driving mechanism.
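
A minimal sketch of the instability bookkeeping implied above, under the usual convention that a Northern Hemisphere point with negative saturated moist potential vorticity is attributed to conditional symmetric instability only if it is both statically stable to saturated ascent and inertially stable; the function and its inputs are illustrative, not the study's diagnostic code.

```python
# Sketch only: classify which instability a grid point is consistent with,
# given the saturated moist potential vorticity (MPV*), the vertical gradient
# of saturated equivalent potential temperature, and the absolute vorticity.
def classify_instability(mpv_star, dthetaes_dz, abs_vorticity):
    """Return the candidate instability for a Northern Hemisphere point."""
    if dthetaes_dz < 0:
        return "conditional instability"
    if abs_vorticity < 0:
        return "inertial instability"
    if mpv_star < 0:
        return "conditional symmetric instability (CSI)"
    return "stable"

if __name__ == "__main__":
    # Example: stable to upright saturated ascent, inertially stable,
    # but MPV* < 0, so CSI is the candidate instability (values illustrative).
    print(classify_instability(mpv_star=-0.1e-6, dthetaes_dz=1.0e-3,
                               abs_vorticity=8.0e-5))
```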

Relevance:

100.00%

Publisher:

Abstract:

Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
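
A minimal sketch of the probability integral transform check mentioned above: each outcome is passed through its forecast cumulative distribution function and the resulting PIT values are tested for uniformity. Gaussian forecast densities and a Kolmogorov-Smirnov test stand in for the SPF histograms and the paper's specific efficiency tests.

```python
# Sketch only: PIT-based calibration check for density forecasts.
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(5)
n = 200

# Hypothetical density forecasts: roughly correct mean, slightly overdispersed.
true_mean, true_sd = 2.0, 1.0
fcst_mean = true_mean + rng.normal(scale=0.2, size=n)
fcst_sd = 1.3

outcomes = rng.normal(true_mean, true_sd, size=n)
pit = norm.cdf(outcomes, loc=fcst_mean, scale=fcst_sd)   # forecast CDF at outcome

stat, pval = kstest(pit, "uniform")
print(f"KS statistic = {stat:.3f}, p-value = {pval:.3f}")
# PIT values clustered near 0.5 (a hump-shaped histogram) indicate overdispersed
# forecast densities; uniform PITs are consistent with well-calibrated forecasts.
```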