885 results for SLASHED HALF-NORMAL DISTRIBUTION


Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with inflation theory, focusing on the model of Jarrow & Yildirim, which is nowadays the standard when pricing inflation derivatives. After recalling the main results about short and forward interest rate models, the dynamics of the main components of the market are derived. The most important inflation-indexed derivatives are then explained (zero-coupon swap, year-on-year swap, cap and floor), and their pricing is shown step by step. Calibration is explained and performed both with a common method and with a heuristic, non-standard one. The model is then enriched with credit risk, which allows the possibility of bankruptcy of the counterparty of a contract to be taken into account. In this context, the general pricing method is derived with the introduction of defaultable zero-coupon bonds, and the Monte Carlo method is treated in detail and used to price a concrete example of a contract. Appendices: A: martingale measures, Girsanov's theorem and the change of numeraire. B: some aspects of the theory of stochastic differential equations (SDEs); in particular, the solution of linear SDEs and the Feynman-Kac theorem, which shows the connection between SDEs and partial differential equations. C: some useful results about the normal distribution.
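The pricing of a defaultable zero-coupon bond mentioned above can be sketched with a Monte Carlo simulation. The sketch below assumes a flat risk-free rate, a constant default intensity and a fixed recovery rate; all numerical values are illustrative assumptions, not figures from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the thesis).
r = 0.02          # flat risk-free short rate
hazard = 0.03     # constant default intensity of the counterparty
recovery = 0.4    # recovery rate paid at maturity on default
T = 5.0           # maturity in years
n_paths = 200_000

# Simulate exponential default times and discount the payoff:
# 1 at maturity if no default before T, `recovery` otherwise.
tau = rng.exponential(1.0 / hazard, n_paths)
payoff = np.where(tau > T, 1.0, recovery)
price_mc = np.exp(-r * T) * payoff.mean()

# Closed form for comparison: e^{-rT} (e^{-hT} + recovery (1 - e^{-hT})).
surv = np.exp(-hazard * T)
price_cf = np.exp(-r * T) * (surv + recovery * (1.0 - surv))
print(price_mc, price_cf)
```

With 200,000 paths the Monte Carlo estimate agrees with the closed form to a few decimal places, which is the kind of sanity check the Monte Carlo chapter performs on a concrete contract.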

Relevance:

100.00%

Publisher:

Abstract:

We propose an extension of the approach of Kluppelberg and Kuhn (2009) for inference on second-order structure moments. As in Kluppelberg and Kuhn (2009), we adopt a copula-based approach instead of assuming a normal distribution for the variables, thus relaxing the equality-in-distribution assumption. A new copula-based estimator for structure moments is investigated. The methodology of Kluppelberg and Kuhn (2009) is also extended to the copulas associated with the family of Eyraud-Farlie-Gumbel-Morgenstern distribution functions (Kotz, Balakrishnan, and Johnson, 2000, Equation 44.73). Finally, a comprehensive simulation study and an application to real financial data are performed in order to compare the different approaches.
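As a concrete illustration of the EFGM family referred to above, the sketch below samples from the one-parameter FGM copula by conditional inversion and checks the textbook relation Spearman's rho = theta/3; the parameter value and sample size are assumptions for the demo, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.5      # hypothetical FGM dependence parameter, |theta| <= 1
n = 100_000

# Conditional-inversion sampler for the FGM (EFGM) copula
# C(u, v) = u v [1 + theta (1 - u)(1 - v)].
u = rng.uniform(size=n)
w = rng.uniform(size=n)
a = theta * (1.0 - 2.0 * u)
# Solve the conditional CDF  F(v | u) = v (1 + a) - a v^2 = w  for v,
# using the numerically stable root of the quadratic.
v = 2.0 * w / (1.0 + a + np.sqrt((1.0 + a) ** 2 - 4.0 * a * w))

# Spearman's rho via ranks; for the FGM copula it equals theta / 3.
ru = u.argsort().argsort()
rv = v.argsort().argsort()
rho = np.corrcoef(ru, rv)[0, 1]
print(rho, theta / 3.0)
```

The weak maximal dependence of the FGM family (|rho| <= 1/3) is one reason extensions such as the one proposed here are of interest.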

Relevance:

100.00%

Publisher:

Abstract:

This thesis develops a function approximator and applies it to methods for learning discrete and continuous actions: 1. A general function approximator, Locally Weighted Interpolating Growing Neural Gas (LWIGNG), is developed on the basis of a Growing Neural Gas (GNG). The topological neighbourhood in the neuron structure is used to interpolate between neighbouring neurons and to compute the approximation through local weighting. The capability of the approach, in particular with respect to changing target functions and changing input distributions, is demonstrated in several experiments. 2. For learning discrete actions, LWIGNG is combined with Q-learning into the Q-LWIGNG method. The underlying GNG algorithm has to be modified for this, because the input data in action learning arrive in a particular order. Q-LWIGNG achieves very good results on the pole-balancing and mountain-car problems and good results on the acrobot problem. 3. For learning continuous actions, a REINFORCE algorithm is combined with LWIGNG into the ReinforceGNG method. An actor-critic architecture is used in order to learn from delayed rewards. LWIGNG approximates both the state-value function and the policy, which is represented by the situation-dependent parameters of a normal distribution. ReinforceGNG is applied successfully to learn movements for a simulated two-wheeled robot that has to intercept a rolling ball under certain conditions.
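The REINFORCE idea used in ReinforceGNG, a policy represented as a normal distribution whose mean is adapted along the policy gradient, can be sketched on a one-dimensional toy task. Everything below is illustrative: the thesis learns situation-dependent parameters via LWIGNG, whereas this sketch uses a single state-independent mean and a fixed standard deviation.

```python
import numpy as np

rng = np.random.default_rng(2)

target = 1.5            # hypothetical optimal continuous action
mu, sigma = 0.0, 0.5    # Gaussian policy parameters (sigma kept fixed here)
lr = 0.02

for _ in range(5000):
    a = rng.normal(mu, sigma)                     # sample action from policy
    reward = -(a - target) ** 2                   # toy reward, peaked at target
    baseline = -(mu - target) ** 2 - sigma ** 2   # analytic expected reward
    # REINFORCE update: d/d mu of log N(a; mu, sigma^2) is (a - mu) / sigma^2.
    mu += lr * (reward - baseline) * (a - mu) / sigma ** 2

print(mu)   # drifts toward `target`
```

Subtracting a baseline from the reward, as the critic does in the actor-critic architecture, keeps the gradient estimate unbiased while reducing its variance.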

Relevance:

100.00%

Publisher:

Abstract:

In this work we studied the efficiency of the benchmarks used in the asset management industry. In chapter 2 we analyzed the efficiency of the benchmarks used for government bond markets. We found that for emerging-market bonds an index with equal country weights is probably the most suitable, because it guarantees maximum diversification of country risk, whereas for the Eurozone government bond market a GDP-weighted index is better, because the most important concern is to avoid giving a higher weight to highly indebted countries. In chapter 3 we analyzed the efficiency of investing in the European corporate bond market through a derivatives index instead of a cash index. The two indexes are similar in terms of returns, but the derivatives index is less risky: it has a lower volatility, its skewness and kurtosis are closer to those of a normal distribution, and it is a more liquid instrument, as its autocorrelation is not significant. Chapter 4 analyzes the impact of fallen angels on corporate bond portfolios. Our analysis investigated the impact of the month-end rebalancing of the ML Emu Non Financial Corporate Index when downgraded bonds exit the index (the event). We conclude that a flexible approach to month-end rebalancing is better, in order to avoid a loss of value due to the benchmark construction rules. In chapter 5 we compared the equally weighted and the capitalization-weighted methods for the European equity market. The benefit that results from reweighting the portfolio into equal weights can be attributed to the fact that EW portfolios implicitly follow a contrarian investment strategy, because they mechanically rebalance away from stocks that increase in price.
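The contrarian mechanics of equal weighting noted in chapter 5 can be seen in a two-stock toy example with hypothetical prices: after prices move, restoring equal weights sells the winner and buys the loser.

```python
import numpy as np

# Hypothetical prices: stock A rallies, stock B falls.
prices_t0 = np.array([100.0, 100.0])
prices_t1 = np.array([120.0, 90.0])

value = 1000.0
shares = (value * np.array([0.5, 0.5])) / prices_t0   # start equal-weight

held_value = shares * prices_t1
drifted_w = held_value / held_value.sum()             # winner is now overweight

target_value = held_value.sum() * np.array([0.5, 0.5])
trade = target_value - held_value                     # rebalancing trades

print(drifted_w)   # A has drifted above 50%
print(trade)       # negative for A (sell the winner), positive for B (buy the loser)
```

A capitalization-weighted portfolio, by contrast, requires no trade at all here: its weights drift with prices, which is exactly why it never rebalances away from the winners.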

Relevance:

100.00%

Publisher:

Abstract:

Deep vein thrombosis (DVT) and its complication, pulmonary embolism, are frequent causes of disability and mortality. Although blood flow disturbance is considered an important triggering factor, the mechanism of DVT initiation remains elusive. Here we show that 48-hour flow restriction in the inferior vena cava (IVC) results in the development of thrombi structurally similar to human deep vein thrombi. von Willebrand factor (VWF)-deficient mice were protected from thrombosis induced by complete (stasis) or partial (stenosis) flow restriction in the IVC. Mice with half normal VWF levels were also protected in the stenosis model. Besides promoting platelet adhesion, VWF carries Factor VIII. Repeated infusions of recombinant Factor VIII did not rescue thrombosis in VWF(-/-) mice, indicating that impaired coagulation was not the primary reason for the absence of DVT in VWF(-/-) mice. Infusion of GPG-290, a mutant glycoprotein Ibα-immunoglobulin chimera that specifically inhibits interaction of the VWF A1 domain with platelets, prevented thrombosis in wild-type mice. Intravital microscopy showed that platelet and leukocyte recruitment in the early stages of DVT was dramatically higher in wild-type than in VWF(-/-) IVC. Our results demonstrate a pathogenetic role for VWF-platelet interaction in flow disturbance-induced venous thrombosis.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To evaluate diffusion-weighted magnetic resonance (MR) imaging of the human placenta in fetuses with and without intrauterine growth restriction (IUGR) who were suspected of having placental insufficiency. MATERIALS AND METHODS: The study was approved by the local ethics committee, and written informed consent was obtained. The authors retrospectively evaluated 1.5-T fetal MR images from 102 singleton pregnancies (mean gestation ± standard deviation, 29 weeks ± 5; range, 21-41 weeks). Morphologic and diffusion-weighted MR imaging were performed. A region-of-interest analysis of the apparent diffusion coefficient (ADC) of the placenta was independently performed by two observers who were blinded to clinical data and outcome. Placental insufficiency was diagnosed if flattening of the growth curve was detected at obstetric ultrasonography (US), if the birth weight was in the 10th percentile or less, or if fetal weight estimated with US was below the 10th percentile. Abnormal findings at Doppler US of the umbilical artery and histopathologic examination of specimens from the placenta were recorded. The ADCs in fetuses with placental insufficiency were compared with those in fetuses of the same gestational age without placental insufficiency and tested for normal distribution. The t tests and Pearson correlation coefficients were used to compare these results at the 5% level of significance. RESULTS: Thirty-three of the 102 pregnancies were ultimately categorized as having an insufficient placenta. MR imaging depicted morphologic changes (eg, infarction or bleeding) in 27 fetuses. Placental dysfunction was suspected in 33 fetuses at diffusion-weighted imaging (mean ADC, 146.4 × 10(-5) mm(2)/sec ± 10.63 for fetuses with placental insufficiency vs 177.1 × 10(-5) mm(2)/sec ± 18.90 for fetuses without placental insufficiency; P < .01, with one false-positive case). The use of diffusion-weighted imaging in addition to US increased sensitivity for the detection of placental insufficiency from 73% to 100%, increased accuracy from 91% to 99%, and preserved specificity at 99%. CONCLUSION: Placental dysfunction associated with growth restriction manifests as restricted diffusion and a reduced ADC. A decreased ADC used as an early marker of placental damage might be indicative of pregnancy complications such as IUGR.
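The headline sensitivity, specificity and accuracy figures follow from a standard 2×2 confusion matrix. The sketch below recomputes them from the cell counts implied by the abstract (33 insufficient placentas out of 102, all detected, with one false positive); the exact counts are an inference from the reported numbers, not tabulated study data.

```python
# Diagnostic metrics from a generic 2x2 confusion matrix.
def diagnostics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Counts implied by the abstract: 33 true positives, 0 false negatives,
# 1 false positive, and the remaining 68 true negatives (33 + 1 + 68 = 102).
sens, spec, acc = diagnostics(tp=33, fp=1, fn=0, tn=68)
print(sens, spec, acc)   # 1.0, ~0.99, ~0.99
```

These reproduce the quoted 100% sensitivity, 99% specificity and 99% accuracy for diffusion-weighted imaging combined with US.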

Relevance:

100.00%

Publisher:

Abstract:

Locally affine (polyaffine) image registration methods capture intersubject non-linear deformations with a low number of parameters, while providing an intuitive interpretation for clinicians. Considering the mandible bone, anatomical shape differences can be found at different scales, e.g. left or right side, teeth, etc. Classically, sequential coarse-to-fine registrations are used to handle multiscale deformations; instead, we propose a simultaneous optimization of all scales. To avoid local minima we incorporate a prior on the polyaffine transformations. This kind of groupwise registration approach is natural in a polyaffine context if we assume one configuration of regions that describes an entire group of images, with varying transformations for each region. In this paper, we reformulate polyaffine deformations in a generative statistical model, which enables us to incorporate deformation statistics as a prior in a Bayesian setting. We find optimal transformations by maximizing the a posteriori probability. We assume that the polyaffine transformations follow a normal distribution with a mean and a concentration matrix. The parameters of the prior are estimated from an initial coarse-to-fine registration. Knowing the region structure, we develop a blockwise pseudoinverse to obtain the concentration matrix. To our knowledge, we are the first to introduce simultaneous multiscale optimization through groupwise polyaffine registration. We show results on 42 mandible CT images.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Social cognition is an important aspect of social behavior in humans, and social cognitive deficits are associated with neurodevelopmental and neuropsychiatric disorders. In this study we examined the neural substrates of social cognition and face processing in a group of healthy young adults. METHODS: Fifty-seven undergraduates completed a battery of social cognition tasks and were assessed with electroencephalography (EEG) during a face-perception task. A subset (N=22) was administered a face-perception task during functional magnetic resonance imaging. RESULTS: Variance in the N170 EEG component was predicted by social attribution performance and by a quantitative measure of empathy. Neurally, face processing was more bilateral in females than in males. Variance in fMRI voxel count in the face-sensitive fusiform gyrus was predicted by quantitative measures of social behavior, including the Social Responsiveness Scale (SRS) and the Empathizing Quotient. CONCLUSIONS: When measured as a quantitative trait, social behaviors in typical and pathological populations share common neural pathways. The results highlight the importance of viewing neurodevelopmental and neuropsychiatric disorders as spectrum phenomena that may be informed by studies of the normal distribution of relevant traits in the general population.

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a numerically simple routine for locally adaptive smoothing. The locally heterogeneous regression function is modelled as a penalized spline with a smoothly varying smoothing parameter, itself modelled as another penalized spline. This is formulated as a hierarchical mixed model in which the spline coefficients follow a normal distribution whose variances in turn have a smooth structure. The modelling exercise is in line with Baladandayuthapani, Mallick & Carroll (2005) or Crainiceanu, Ruppert & Carroll (2006). In contrast to these papers, however, Laplace's method is used for estimation based on the marginal likelihood. This is numerically simple and fast and quickly provides satisfactory results. We also extend the idea to spatial smoothing and to smoothing in the presence of non-normal responses.

Relevance:

100.00%

Publisher:

Abstract:

Markov chain Monte Carlo (MCMC) is a method of producing a correlated sample in order to estimate features of a complicated target distribution via simple ergodic averages. A fundamental question in MCMC applications is: when should the sampling stop? That is, when are the ergodic averages good estimates of the desired quantities? We consider a method that stops the MCMC sampling the first time the width of a confidence interval based on the ergodic averages is less than a user-specified value. Hence calculating Monte Carlo standard errors is a critical step in assessing the output of the simulation. In particular, we consider the regenerative simulation and batch means methods of estimating the variance of the asymptotic normal distribution. We describe sufficient conditions for the strong consistency and asymptotic normality of both methods and investigate their finite-sample properties in a variety of examples.
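The batch means estimator and the fixed-width stopping quantity can be sketched on a toy AR(1) "chain"; the batch-size rule, the AR(1) model and all constants below are assumptions for the demo, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(3)

def batch_means_se(chain):
    """Batch means estimate of the Monte Carlo standard error of the mean."""
    n = len(chain)
    b = int(np.floor(np.sqrt(n)))       # batch size ~ sqrt(n) (a common choice)
    a = n // b                          # number of batches
    batches = chain[: a * b].reshape(a, b).mean(axis=1)
    var_hat = b * batches.var(ddof=1)   # estimates the asymptotic variance
    return np.sqrt(var_hat / n)

# Toy correlated output: an AR(1) chain with autocorrelation 0.7.
phi, n = 0.7, 50_000
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(size=n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + noise[t]

se = batch_means_se(x)
halfwidth = 1.96 * se   # stop sampling once the interval width 2*halfwidth
print(x.mean(), se, halfwidth)   # is below the user-specified tolerance
```

Because the chain is correlated, the naive i.i.d. standard error would understate the uncertainty; the batch means estimate accounts for the autocorrelation, which is exactly why it can drive a valid stopping rule.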

Relevance:

100.00%

Publisher:

Abstract:

To test the role of telomere biology in T-cell prolymphocytic leukemia (T-PLL), a rare aggressive disease characterized by the expansion of a T-cell clone derived from immunocompetent post-thymic T lymphocytes, we analyzed telomere length and telomerase activity in subsets of peripheral blood leukocytes from 11 newly diagnosed or relapsed patients with sporadic T-PLL. Telomere length values of the leukemic T cells (mean ± s.d.: 1.53 ± 0.65 kb) were all below the 1st percentile of telomere length values observed in T cells from healthy age-matched controls, whereas the telomere lengths of normal T and B cells fell between the 1st and 99th percentiles of the normal distribution. Leukemic T cells exhibited high levels of telomerase activity and were sensitive to the telomerase inhibitor BIBR1532 at doses that showed no effect on normal, unstimulated T cells. Targeting the short telomeres and telomerase activity in T-PLL seems an attractive strategy for the future treatment of this devastating disease.

Relevance:

100.00%

Publisher:

Abstract:

Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ, hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
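The generative side of the random-effects structure described above can be sketched in a few lines; the hyperparameter values below are illustrative choices for one hypothetical design point, not the study's settings, and the stated priors are only recorded in the comments.

```python
import numpy as np

rng = np.random.default_rng(4)

def expit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Random-effects CJS structure: logit(S) ~ Normal(mu, sigma^2).
# Priors in the study were Normal(0, 1.75^2) on logit(p),
# Normal(0, 100^2) on mu, and Gamma(0.001, 0.001) on tau^2 = 1/sigma^2.
mu_true, sigma_true = 0.8, 0.5   # illustrative hyperparameter values
t = 7                            # occasions (one hypothetical design point)

S = expit(rng.normal(mu_true, sigma_true, size=t - 1))  # occasion survival
p = expit(rng.normal(0.0, 1.75))                        # capture probability

print(S, p)   # all probabilities lie in (0, 1)
```

The logit link guarantees that every generated survival and capture probability is a valid probability, which is what lets the normal random effect live on an unbounded scale.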

Relevance:

100.00%

Publisher:

Abstract:

Environmental data sets of pollutant concentrations in air, water, and soil frequently include unquantified sample values reported only as being below the analytical method detection limit. These values, referred to as censored values, should be considered in the estimation of distribution parameters, as each represents some pollutant concentration between zero and the detection limit. Most of the currently accepted methods for estimating the population parameters of environmental data sets containing censored values rely upon the assumption of an underlying normal (or transformed normal) distribution. This assumption can result in unacceptable levels of error in parameter estimation because of the unbounded left tail of the normal distribution. With the beta distribution, which is bounded over the same range as a distribution of concentrations, [0 ≤ x ≤ 1], parameter estimation errors resulting from improper distribution bounds are avoided. This work developed a method that uses the beta distribution to estimate population parameters from censored environmental data sets and evaluated its performance in comparison to currently accepted methods that rely upon an underlying normal (or transformed normal) distribution. Data sets were generated assuming typical values encountered in environmental pollutant evaluation for the mean, standard deviation, and number of variates. For each set of model values, data sets were generated assuming that the data were distributed normally, lognormally, or according to a beta distribution. For varying levels of censoring, two established methods of parameter estimation, regression on normal ordered statistics and regression on lognormal ordered statistics, were used to estimate the known mean and standard deviation of each data set. The method developed for this study, employing a beta distribution assumption, was also used to estimate the parameters, and the relative accuracy of all three methods was compared. For data sets of all three distribution types, and for censoring levels up to 50%, the performance of the new method equaled, if not exceeded, that of the two established methods. Because of its robustness in parameter estimation regardless of distribution type or censoring level, the method employing the beta distribution should be considered for full development in estimating parameters for censored environmental data sets.
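One of the established baselines named above, regression on normal ordered statistics, can be sketched compactly for a left-censored sample: regress the detected values on the normal scores of their plotting positions, then read the mean and standard deviation off the fitted line. The simulated detection limit, sample size and plotting-position formula below are assumptions for the demo.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(5)

def ros_normal(detected, n_censored):
    """Regression on normal ordered statistics for a left-censored sample."""
    n = len(detected) + n_censored
    ordered = np.sort(detected)
    ranks = np.arange(n_censored + 1, n + 1)   # detected values occupy the upper ranks
    pp = (ranks - 0.375) / (n + 0.25)          # Blom plotting positions
    z = np.array([NormalDist().inv_cdf(p) for p in pp])
    slope, intercept = np.polyfit(z, ordered, 1)
    return intercept, slope                    # (mean, sd) estimates

# Simulate N(10, 2^2) concentrations censored at a detection limit of 9.
sample = rng.normal(10.0, 2.0, size=500)
dl = 9.0
detected = sample[sample >= dl]
mean_hat, sd_hat = ros_normal(detected, n_censored=int((sample < dl).sum()))
print(mean_hat, sd_hat)   # close to 10 and 2
```

Because the normal distribution extends below zero, the same fit applied to concentration data can imply impossible negative concentrations in its lower tail, which is the error source the beta-distribution method is designed to avoid.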

Relevance:

100.00%

Publisher:

Abstract:

The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. Here we adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears more sensitive to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) and signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).
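The contrast between amplitude-based and rank-based assessment of nonlinear prediction can be sketched with a zero-order nearest-neighbour forecast on a noisy deterministic map; a logistic map stands in for the Lorenz flow used in the paper, the rank statistic is a simplified stand-in for the published score, and all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Noisy deterministic signal: a chaotic logistic map plus observation noise.
n = 1000
x = np.empty(n)
x[0] = 0.4
for t in range(1, n):
    x[t] = 3.9 * x[t - 1] * (1.0 - x[t - 1])
x_noisy = x + rng.normal(0.0, 0.01, n)

half = n // 2
train, test = x_noisy[:half], x_noisy[half:]

errors, ranks = [], []
for t in range(len(test) - 1):
    nn = int(np.argmin(np.abs(train[:-1] - test[t])))  # nearest neighbour in the past
    pred = train[nn + 1]                               # its successor is the forecast
    err = abs(pred - test[t + 1])
    errors.append(err)                                 # amplitude-based error
    # Simplified rank statistic: fraction of alternative candidate successors
    # that would have predicted the true next value better (0.5 = no skill).
    all_err = np.abs(train[1:] - test[t + 1])
    ranks.append((all_err < err).mean())

print(np.mean(errors), np.mean(ranks))
```

For a deterministic signal the mean rank falls well below the no-skill level of 0.5; because ranks are invariant to monotone amplitude distortions, this kind of statistic is less affected by heavy-tailed noise than the raw amplitude error.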

Relevance:

100.00%

Publisher:

Abstract:

The role of soil organic carbon (SOC) in mitigating climate change and in indicating soil quality and ecosystem function has created research interest in the nature of SOC at the landscape level. The objective of this study was to examine the variation and distribution of SOC under long-term land management at the watershed and plot levels. The study was based on a meta-analysis of three case studies and 128 surface soil samples from Ethiopia. Three sites (Gununo, Anjeni and Maybar) were compared after considering two land management categories (LMC) and three land-use types (LUT) in a quasi-experimental design. Shapiro-Wilk tests showed a non-normal distribution (p = 0.002, α = 0.05) of the data. The SOC median value showed the effect of long-term land management, with values of 2.29 and 2.38 g kg-1 for the less and the better-managed watersheds, respectively. SOC values were 1.7, 2.8 and 2.6 g kg-1 for crop (CLU), grass (GLU) and forest land use (FLU), respectively. The rank order for SOC variability was FLU > GLU > CLU. Mann-Whitney U and Kruskal-Wallis tests showed a significant difference in the medians and distribution of SOC among the LUT and between soil profiles (p < 0.05, 95% confidence interval, α = 0.05), whereas the difference was not significant (p > 0.05) for the LMC. The mean and sum of ranks of the Mann-Whitney U and Kruskal-Wallis tests also showed the difference at the watershed and plot levels. Using SOC as a predictor, cross-validated classification with discriminant analysis was correct in 46 and 49% of cases for LUT and LMC, respectively. The study showed how to categorize landscapes using SOC with respect to land management for decision-makers.
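The testing pipeline described above (check normality first, then fall back to nonparametric comparisons) can be sketched on synthetic SOC-like data; the lognormal parameters and sample sizes below are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic SOC concentrations (g kg-1) for two land-use types; lognormal
# samples mimic the skewed, non-normal data reported in the study.
cropland = rng.lognormal(mean=np.log(1.7), sigma=0.4, size=60)
grassland = rng.lognormal(mean=np.log(2.8), sigma=0.4, size=60)

# Shapiro-Wilk tests the normality assumption first ...
_, p_norm = stats.shapiro(cropland)

# ... and since the data are skewed, medians are compared with the
# nonparametric Mann-Whitney U test rather than a t test.
_, p_mw = stats.mannwhitneyu(cropland, grassland, alternative="two-sided")
print(p_norm, p_mw)
```

With a clear median difference between the land uses, the Mann-Whitney p-value is far below 0.05, mirroring the significant LUT contrast reported above.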