933 results for Stochastic Dominance
Abstract:
This article compares two recent analyses of continuity and change in the American power structure since 1900, with a main focus on the years after World War II. The first analysis asserts that the “corporate elite” has fractured and fragmented in recent decades and no longer has the unity to have a collective impact on public policy. The second analysis claims that corporate leaders remain united, albeit with moderate-conservative and ultra-conservative differences on several issues, and continue to have a dominant collective impact on public policies that involve their major goals. After comparing the two perspectives on key issues from 1900 to 1945, the article analyzes the fractured-elite theory’s three claims about the postwar era: that an activist government constrained the corporate elite, that the union movement negotiated a capital-labor accord, and that bank boards created policy cohesion among corporations. Finally, it compares the two perspectives on tax issues, health-care policies, and trade expansion between 1990 and 2010.
Abstract:
The population of naive T cells in the periphery is best described by determining both its T cell receptor diversity, or number of clonotypes, and the sizes of its clonal subsets. In this paper, we make use of a previously introduced mathematical model of naive T cell homeostasis to study the fate and potential of naive T cell clonotypes in the periphery. This is achieved by introducing several new stochastic descriptors for a given naive T cell clonotype, such as its maximum clonal size, the time to reach this maximum, the number of proliferation events required to reach this maximum, the rate of contraction of the clonotype on its way to extinction, and the time to a given number of proliferation events. Our results show that two fates can be identified for the dynamics of the clonotype: extinction in the short term if the clonotype experiences too hostile a peripheral environment, or establishment in the periphery in the long term. In this second case the probability mass function for the maximum clonal size is bimodal, with one mode near one and the other far away from it. Our model also indicates that the fate of a recent thymic emigrant (RTE) during its journey in the periphery has a clear stochastic component: the probability of extinction cannot be neglected, even in a friendly but competitive environment. On the other hand, more deterministic behaviour can be expected in the long-term potential size of the clonotype seeded by the RTE, once it escapes extinction.
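A minimal sketch of the kind of clonotype dynamics described above: a competitive birth-death process simulated with the Gillespie algorithm. The rates and the competitive ceiling are illustrative assumptions, not the paper's fitted homeostasis model, but they reproduce the two qualitative fates (early extinction versus peripheral establishment).

```python
import random

def simulate_clonotype(n0=1, birth=2.0, death=1.0, ceiling=50,
                       t_max=50.0, rng=None):
    """Gillespie simulation of one naive T cell clonotype.

    Illustrative rates, not the paper's fitted model: per-cell
    proliferation slows as the clone approaches a competitive
    ceiling, while the death rate per cell is constant.
    Returns (went_extinct, maximum clonal size reached).
    """
    rng = rng or random.Random()
    n, t, n_max = n0, 0.0, n0
    while n > 0 and t < t_max:
        b = birth * n * max(0.0, 1.0 - n / ceiling)   # competition
        d = death * n
        t += rng.expovariate(b + d)                   # time to next event
        n += 1 if rng.random() < b / (b + d) else -1  # division or death
        n_max = max(n_max, n)
    return n == 0, n_max

runs = [simulate_clonotype(rng=random.Random(s)) for s in range(200)]
survivors = [m for extinct, m in runs if not extinct]
print("extinct clones:", len(runs) - len(survivors), "of", len(runs))
if survivors:
    # Established clones cluster near the quasi-stationary size,
    # giving the bimodal maximum-clonal-size distribution.
    print("typical max size once established:",
          sum(survivors) / len(survivors))
```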
Abstract:
The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann’s seminal work on the estimation of highly nonlinear model specifications (“Causality tests and observationally equivalent representations of econometric models”, Journal of Econometrics, 1988, 39(1-2), 69–104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of the RMESV-ALM model, and the finite-sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high-frequency data for three US financial assets, the new model is estimated and evaluated, and its forecasting performance is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
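The positive-definiteness guarantee is easy to see in isolation: for any symmetric matrix A, exp(A) is symmetric with strictly positive eigenvalues, so covariance dynamics can be specified on A without any constraints. A small numerical check with a generic symmetric matrix (not the RMESV-ALM dynamics themselves):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Any symmetric matrix works; no positivity constraints are needed.
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2          # symmetric "log-covariance" matrix
Sigma = expm(A)            # matrix exponential

# Sigma is symmetric positive definite by construction:
# its eigenvalues are exp(eigenvalues of A) > 0.
print(np.allclose(Sigma, Sigma.T))          # True
print(np.linalg.eigvalsh(Sigma).min() > 0)  # True
```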
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
A landfill is a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both the thermodynamic (equilibrium) and time-varying (non-steady-state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses, combined with constraining initial and boundary conditions arising from the associated surroundings. While multiple attempts have been made to model landfill statistics, either by incorporating spatially dependent parameters (data-based approach) or through continuum dynamical mass-balance equations (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastic fluctuations affecting the overall process. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion approach, focusing on the coupled interactions of four key variables (solid mass density, hydrolysed mass density, acetogenic mass density, and methanogenic mass density) that are themselves subject to stochastic fluctuations and to diffusive relaxation of the individual densities in the ambient surroundings. Our results indicate that, close to the linearly stable limit, the large-time steady-state properties arising from the complex coupled interactions between the stochastically driven variables are scarcely affected by the biochemical growth-decay statistics. They clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities alone, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit. The other major implication of incorporating stochasticity in the landfill evolution dynamics is the greatly reduced production time of the plants, now approximately 20-30 years instead of the 50 years and above predicted by earlier deterministic models. The predictions of this stochastic model are in conformity with available experimental observations.
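As a rough illustration of the modelling scheme, the sketch below applies an Euler-Maruyama update to a single density field on a one-dimensional grid, combining diffusive relaxation with a placeholder reaction term and additive noise. The paper's actual model couples four such fields in three dimensions with biochemical growth-decay kinetics; all parameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# One density field on a 1-D periodic grid; the paper couples four.
nx, dx, dt, steps = 100, 1.0, 0.01, 10_000
D, decay, sigma = 0.5, 0.05, 0.02       # illustrative parameters

rho = np.ones(nx)                        # e.g. hydrolysed mass density
for _ in range(steps):
    # Discrete Laplacian with periodic boundaries (diffusive relaxation).
    lap = (np.roll(rho, 1) - 2 * rho + np.roll(rho, -1)) / dx**2
    drift = D * lap - decay * rho        # placeholder reaction term
    noise = sigma * np.sqrt(dt) * rng.standard_normal(nx)
    rho = np.clip(rho + dt * drift + noise, 0.0, None)  # densities >= 0

print(rho.mean())                        # large-time steady-state level
```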
Abstract:
In this dissertation, we develop a novel methodology for characterizing and simulating nonstationary, full-field, stochastic turbulent wind fields.
In this new method, nonstationarity is characterized and modeled via temporal coherence, which is quantified in the discrete frequency domain by probability distributions of the differences in phase between adjacent Fourier components.
The empirical distributions of the phase differences can also be extracted from measured data, and the resulting temporal coherence parameters can quantify the occurrence of nonstationarity in empirical wind data.
This dissertation (1) implements temporal coherence in a desktop turbulence simulator, (2) calibrates empirical temporal coherence models for four wind datasets, and (3) quantifies the increase in lifetime wind turbine loads caused by temporal coherence.
The four wind datasets were intentionally chosen from locations around the world so that they had significantly different ambient atmospheric conditions.
The prevalence of temporal coherence and its relationship to other standard wind parameters were modeled through empirical joint distributions (EJDs), which involved fitting marginal distributions and calculating correlations.
EJDs have the added benefit of being able to generate samples of wind parameters that reflect the characteristics of a particular site.
Lastly, to characterize the effect of temporal coherence on design loads, we created four models in the open-source wind turbine simulator FAST based on the WindPACT turbines, fit response surfaces to them, and used the response surfaces to calculate lifetime turbine responses to wind fields simulated with and without temporal coherence.
The training data for the response surfaces was generated from exhaustive FAST simulations that were run on the high-performance computing (HPC) facilities at the National Renewable Energy Laboratory.
This process was repeated for wind field parameters drawn from the empirical distributions and for wind samples drawn using the recommended procedure in the IEC wind turbine design standard.
The effect of temporal coherence was calculated as a percent increase in the lifetime load over the base value with no temporal coherence.
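To make the temporal-coherence idea concrete, the sketch below synthesizes a signal from Fourier components whose adjacent phase differences are drawn from a wrapped distribution: independent uniform phases give a stationary record, while concentrated phase differences produce coherent, burst-like structure. The von Mises choice and its concentration parameter are illustrative assumptions, not the dissertation's calibrated temporal-coherence model.

```python
import numpy as np

rng = np.random.default_rng(2)

def synth(n=1024, kappa=0.0):
    """Fourier synthesis with correlated adjacent phase differences.

    kappa=0 gives independent uniform phases (stationary record);
    larger kappa concentrates the phase-difference distribution,
    an illustrative stand-in for a temporal-coherence parameter.
    """
    m = n // 2
    dphi = rng.vonmises(0.0, kappa, size=m)   # phase differences
    phases = np.cumsum(dphi)
    spec = np.zeros(m + 1, dtype=complex)     # one-sided spectrum
    spec[1:] = np.exp(1j * phases)            # flat amplitude, for clarity
    return np.fft.irfft(spec, n=n)

u_stationary = synth(kappa=0.0)
u_coherent = synth(kappa=5.0)    # exhibits burst-like coherent structure
print(u_stationary.std(), u_coherent.std())
```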
Abstract:
While molecular and cellular processes are often modeled as stochastic processes, such as Brownian motion, chemical reaction networks, and gene regulatory networks, there have been few attempts to program a molecular-scale process to physically implement stochastic processes. DNA has been used as a substrate for programming molecular interactions, but its applications are restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents, and difficult readout limit them to proof-of-concept purposes. To date, whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications remains unknown.
In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
By using RET networks between chromophores to implement fluorescent taggants with temporally coded signatures, the taggant design is freed from the constraint of resolvable dyes and gains a significantly larger coding capacity than spectrally or lifetime-coded fluorescent taggants. The taggant detection process also becomes highly efficient, and Maximum Likelihood Estimation (MLE)-based taggant identification guarantees high accuracy even with only a few hundred detected photons.
Meanwhile, RET-based sampling units (RSUs) can be constructed to accelerate probabilistic algorithms for wide applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware traditional computers use, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor or GPU as a specialized functional unit or organized as a discrete accelerator to bring substantial speedups and power savings.
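A software analogue of the sampling primitive described above: simulate a continuous-time Markov chain from an initial excited state until absorption (photon emission), so that the emission time follows a phase-type distribution determined by the generator. The three-state rate table below is invented for illustration, not the rates of any fabricated RET network.

```python
import random

# Transient states 0..2 with made-up transition/emission rates;
# absorption ("photon emitted") is modelled as virtual state -1.
rates = {
    0: [(1, 2.0), (-1, 0.5)],
    1: [(0, 1.0), (2, 3.0), (-1, 0.2)],
    2: [(1, 0.5), (-1, 4.0)],
}

def sample_emission_time(rng, start=0):
    """One phase-type sample: time until the CTMC is absorbed."""
    state, t = start, 0.0
    while state != -1:
        targets, rs = zip(*rates[state])
        t += rng.expovariate(sum(rs))                 # holding time
        state = rng.choices(targets, weights=rs)[0]   # next jump
    return t

rng = random.Random(3)
samples = [sample_emission_time(rng) for _ in range(10_000)]
print(sum(samples) / len(samples))   # mean emission (absorption) time
```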
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: a) increase the efficiency of the portfolio optimization process, b) implement large-scale optimizations, and c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach to the construction of a time-varying covariance matrix: a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model is applied to account for the dynamics of the conditional covariance matrices that are employed. The stochastic aspects of the expected returns of the securities are integrated into the technique through Monte Carlo simulation. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time series of the securities are fitted to probability distributions that match their characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results: employing the S&P 500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation, and the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an improvement in risk-return performance. Choosing Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the benchmark, the new greedy technique clearly outperforms the alternatives on samples of the S&P 500 and Russell 1000 securities. The resulting improvements in performance are consistent across five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional correlation GARCH).
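A toy rendering of the greedy allocation idea: weight is added in small increments, each increment going to the asset that most improves the objective under the current covariance estimate. The mean-variance score and the sample covariance below are deliberate simplifications; the dissertation's criterion is VaR and its covariance comes from the IDCC-GARCH model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy inputs: simulated return history for 5 assets.
returns = rng.normal(0.0005, 0.01, size=(500, 5))
mu, cov = returns.mean(axis=0), np.cov(returns.T)

def greedy_weights(mu, cov, step=0.01, risk_aversion=5.0):
    """Greedy allocation: repeatedly give `step` of weight to the
    asset that best improves a mean-variance score (a stand-in
    for the dissertation's VaR objective)."""
    n = len(mu)
    w = np.zeros(n)
    def score(w):
        return w @ mu - risk_aversion * (w @ cov @ w)
    for _ in range(int(round(1 / step))):
        trials = [score(w + step * np.eye(n)[i]) for i in range(n)]
        w[int(np.argmax(trials))] += step   # best marginal addition
    return w

print(greedy_weights(mu, cov).round(2))     # weights sum to 1
```

Each iteration costs n objective evaluations, so the greedy pass scales linearly in the number of assets per increment, which is what makes large-scale runs tractable relative to a full joint optimization.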
Abstract:
This data set contains aboveground community plant biomass (Sown plant community, Weed plant community, Dead plant material, and Unidentified plant material; all measured as dry weight) and species-specific biomass from the sown species of the dominance experiment plots of a large grassland biodiversity experiment (the Jena Experiment; see further details below). In the dominance experiment, 206 grassland plots of 3.5 x 3.5 m were established from a pool of 9 plant species that can be dominant in semi-natural grassland communities of the study region. In May 2002, varying numbers of plant species from this species pool were sown into the plots to create a gradient of plant species richness (1, 2, 3, 4, 6, and 9 species). Plots were maintained by twice-yearly weeding and mowing. Aboveground community biomass was harvested twice, in May and August 2008, on all experimental plots of the dominance experiment. This was done by clipping the vegetation at 3 cm above ground in two rectangles of 0.2 x 0.5 m per experimental plot. The location of these rectangles was assigned by random selection of coordinates within the central area of the plots (excluding an outer edge of 50 cm), and the positions of the rectangles were identical for all plots. The harvested biomass was sorted into categories: individual sown plant species, weed plant species (species not sown at the particular plot), detached dead plant material, and remaining plant material that could not be assigned to any category. All biomass was dried to constant weight (70°C, >= 48 h) and weighed. Sown plant community biomass was calculated as the sum of the biomass of the individual sown species. Both the mean of the two samples per plot and the individual measurements are provided in the data file. Overall, analyses of the community biomass data have identified species richness and the presence of particular species as important drivers of a positive biodiversity-productivity relationship.
Abstract:
This paper reports the findings from a study of the learning of English intonation by Spanish speakers within the discourse mode of L2 oral presentation. The purpose of this experiment is, firstly, to compare four prosodic parameters before and after an L2 discourse intonation training programme and, secondly, to confirm whether subjects, after the aforementioned training, are able to match the form of these four prosodic parameters to the discourse-pragmatic function of dominance and control. The study designed the instructions and tasks to create the oral and written corpora, and Brazil’s Pronunciation for Advanced Learners of English was adapted for the pedagogical aims of the present study. The learners’ pre- and post-tasks were acoustically analysed, and a pre-/post-questionnaire design was applied to interpret the acoustic analysis. Results indicate that most of the subjects acquired a wider choice of the four prosodic parameters, partly due to the prosodically annotated transcripts that were developed throughout the L2 discourse intonation course. Conversely, qualitative and quantitative data reveal that most subjects failed to match the forms to their appropriate pragmatic functions to express dominance and control in an L2 oral presentation.
Abstract:
Energy efficiency improvement has been a key objective of China’s long-term energy policy. In this paper, we derive single-factor technical energy efficiency (abbreviated as energy efficiency) in China from multi-factor efficiency estimated by means of a translog production function and a stochastic frontier model, on the basis of panel data for 29 Chinese provinces over the period 2003–2011. We find that average energy efficiency increased over the research period and that the provinces with the highest energy efficiency lie on the east coast and those with the lowest in the west, with an intermediate corridor in between. In the analysis of the determinants of energy efficiency by means of a spatial Durbin error model, factors both in the province itself and in first-order neighboring provinces are considered. Per capita income in the province itself has a positive effect. Furthermore, foreign direct investment and population density in the province itself and in neighboring provinces have positive effects, whereas the share of state-owned enterprises in Gross Provincial Product in the province itself and in neighboring provinces has negative effects. From the analysis it follows that inflow of foreign direct investment and reform of state-owned enterprises are important policy handles.
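In outline, the estimation combines a translog production frontier with a composed error term, and single-factor energy efficiency is then derived by comparing minimum feasible to actual energy use. The generic form below is a standard specification from the stochastic frontier literature, not necessarily the paper's exact one:

```latex
% Translog stochastic frontier with inputs x_j (e.g. capital, labour, energy):
\ln y_{it} = \beta_0 + \sum_{j}\beta_j \ln x_{jit}
  + \tfrac{1}{2}\sum_{j}\sum_{k}\beta_{jk}\,\ln x_{jit}\,\ln x_{kit}
  + v_{it} - u_{it},
\qquad v_{it}\sim N(0,\sigma_v^2),\quad u_{it}\ge 0.

% Multi-factor technical efficiency TE, and single-factor (energy)
% efficiency EE as the ratio of minimum feasible to actual energy
% input at given output and other inputs:
\mathit{TE}_{it} = \exp(-u_{it}),
\qquad
\mathit{EE}_{it} = \frac{E_{it}^{\min}}{E_{it}}.
```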