930 results for nested Archimedean copulas
Abstract:
The standard variance components method for mapping quantitative trait loci is derived under the assumption of normality. Unsurprisingly, statistical tests based on this method do not perform well if this assumption is not satisfied. We use the statistical concept of copulas to relax the assumption of normality and derive a test that can perform well under any distribution of the continuous trait. In particular, we discuss bivariate normal copulas in the context of sib-pair studies. Our approach is illustrated by a linkage analysis of lipoprotein(a) levels, whose distribution is highly skewed. We demonstrate that the asymptotic critical levels of the test can still be calculated using the interval mapping approach. The new method can be extended to more general pedigrees and multivariate phenotypes in the same way as the original variance components method.
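The normal-copula idea above rests on transforming each skewed marginal to normality before the variance components machinery is applied. A minimal sketch of that marginal step, the rank-based inverse-normal transform, assuming a hypothetical `trait` vector and using only the Python standard library:

```python
from statistics import NormalDist

def normal_scores(trait):
    """Rank-based inverse-normal transform: push each observation through the
    empirical CDF, then through the standard normal quantile function. This is
    the marginal step of a normal-copula model for a skewed continuous trait."""
    n = len(trait)
    nd = NormalDist()
    order = sorted(range(n), key=lambda i: trait[i])
    scores = [0.0] * n
    for rank, i in enumerate(order, start=1):
        u = rank / (n + 1)            # empirical CDF value in (0, 1)
        scores[i] = nd.inv_cdf(u)     # standard normal quantile
    return scores

# Hypothetical, highly skewed trait values (e.g. lipoprotein(a)-like levels):
print(normal_scores([0.2, 0.5, 1.1, 2.3, 9.7]))
```

After this transform, the usual bivariate-normal variance components model can be applied to the scores rather than to the raw, skewed trait values.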
Abstract:
In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling them. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design. Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study was centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended to include the notions of dynamic structure, agent communication, and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models into concrete models that can be verified with the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets and a comprehensive methodology to guide the construction and analysis of formal MAS models.
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Bayesian methods offer a flexible and convenient probabilistic learning framework to extract interpretable knowledge from complex and structured data. Such methods can characterize dependencies among multiple levels of hidden variables and share statistical strength across heterogeneous sources. In the first part of this dissertation, we develop two dependent variational inference methods for full posterior approximation in non-conjugate Bayesian models through hierarchical mixture- and copula-based variational proposals, respectively. The proposed methods move beyond the widely used factorized approximation to the posterior and provide generic applicability to a broad class of probabilistic models with minimal model-specific derivations. In the second part of this dissertation, we design probabilistic graphical models to accommodate multimodal data, describe dynamical behaviors and account for task heterogeneity. In particular, the sparse latent factor model is able to reveal common low-dimensional structures from high-dimensional data. We demonstrate the effectiveness of the proposed statistical learning methods on both synthetic and real-world data.
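Copula-based variational proposals couple otherwise independent marginal approximations through a copula, so the proposal can capture posterior dependence. A minimal sketch of the idea for two variables, assuming a bivariate Gaussian copula and hypothetical marginal quantile functions `inv_cdf_x`/`inv_cdf_y` (an illustration of the general mechanism, not the dissertation's actual method):

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_pair(rho, inv_cdf_x, inv_cdf_y):
    """Draw one (x, y) pair whose dependence comes from a bivariate normal
    copula with correlation rho, while the marginals are set by the
    (hypothetical) quantile functions inv_cdf_x and inv_cdf_y."""
    nd = NormalDist()
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    u, v = nd.cdf(z1), nd.cdf(z2)   # correlated uniforms on (0, 1)
    return inv_cdf_x(u), inv_cdf_y(v)

# Example: exponential(1) and uniform(0, 1) marginals coupled with rho = 0.8.
random.seed(1)
x, y = gaussian_copula_pair(0.8, lambda u: -math.log(1.0 - u), lambda v: v)
```

The factorized (mean-field) alternative would draw `u` and `v` independently; the copula layer is exactly what restores dependence between the two coordinates.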
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Field-programmable gate arrays are ideal hosts to custom accelerators for signal, image, and data processing but demand manual register transfer level design if high performance and low cost are desired. High-level synthesis reduces this design burden but requires manual design of complex on-chip and off-chip memory architectures, a major limitation in applications such as video processing. This paper presents an approach to resolve this shortcoming. A constructive process is described that can derive such accelerators, including on- and off-chip memory storage, from a C description such that a user-defined throughput constraint is met. By employing a novel statement-oriented approach, dataflow intermediate models are derived and used to support simple approaches for on-/off-chip buffer partitioning, derivation of custom on-chip memory hierarchies, and architecture transformation to ensure user-defined throughput constraints are met with minimum cost. When applied to accelerators for full search motion estimation, matrix multiplication, Sobel edge detection, and fast Fourier transform, it is shown how real-time performance up to an order of magnitude in advance of existing commercial HLS tools is enabled whilst including all requisite memory infrastructure. Further, optimizations are presented that reduce the on-chip buffer capacity and physical resource cost by up to 96% and 75%, respectively, whilst maintaining real-time performance.
Abstract:
Heterogeneous computing systems have become common in modern processor architectures. These systems, such as those released by AMD, Intel, and Nvidia, include both CPU and GPU cores on a single die, offering reduced communication overhead compared to their discrete predecessors. Currently, discrete CPU/GPU systems are limited to larger, regular, highly parallel workloads that can overcome the communication costs of the system. Without the traditional communication delay assumed between GPUs and CPUs, we believe non-traditional workloads could be targeted for GPU execution. Specifically, this thesis focuses on the execution model of nested parallel workloads on heterogeneous systems. We have designed a simulation flow that utilizes widely used CPU and GPU simulators to model heterogeneous computing architectures. We then applied this simulator to non-traditional GPU workloads using different execution models. We have also proposed a new execution model for nested parallelism, allowing users to exploit these heterogeneous systems to reduce execution time.
Abstract:
The goal of Vehicle Routing Problems (VRP) and their variations is to transport a set of orders with the minimum number of vehicles at the lowest cost. Most approaches are designed to solve specific problem variations independently, whereas in real-world applications different constraints must be handled concurrently. This research extends solutions obtained for the traveling salesman problem with time windows to a much wider class of route planning problems in logistics. The work describes a novel approach that: supports a heterogeneous fleet of vehicles; dynamically reduces the number of vehicles; respects individual capacity restrictions; satisfies pickup and delivery constraints; and takes Hamiltonian paths (rather than cycles). The proposed approach uses Monte-Carlo Tree Search, in particular Nested Rollout Policy Adaptation. For the evaluation of the work, real data from industry was obtained and tested, and the results are reported.
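Nested Rollout Policy Adaptation, the search method named above, nests levels of search: each level repeatedly runs the level below and adapts a softmax rollout policy toward the best sequence found so far (Rosin's update). A minimal sketch on a hypothetical 5-city Hamiltonian-path instance (toy data, not the thesis's industrial benchmark):

```python
import math
import random

# Toy symmetric distances between 5 cities; tours are Hamiltonian paths
# starting at city 0 (hypothetical data for illustration only).
DIST = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
N = len(DIST)

def rollout(policy):
    """Sample one path, choosing moves with softmax weights from the policy."""
    path, current, remaining = [0], 0, set(range(1, N))
    while remaining:
        cand = sorted(remaining)
        weights = [math.exp(policy.get((current, c), 0.0)) for c in cand]
        current = random.choices(cand, weights=weights)[0]
        path.append(current)
        remaining.remove(current)
    cost = sum(DIST[a][b] for a, b in zip(path, path[1:]))
    return cost, path

def adapt(policy, path, alpha=1.0):
    """Shift policy weight toward the best path found so far."""
    policy = dict(policy)
    remaining = set(range(1, N))
    for current, chosen in zip(path, path[1:]):
        cand = sorted(remaining)
        z = sum(math.exp(policy.get((current, c), 0.0)) for c in cand)
        for c in cand:
            p = math.exp(policy.get((current, c), 0.0)) / z
            policy[(current, c)] = policy.get((current, c), 0.0) - alpha * p
        policy[(current, chosen)] += alpha
        remaining.remove(chosen)
    return policy

def nrpa(level, policy, iters=20):
    """Each level runs `iters` searches of the level below and adapts its
    own policy copy toward the best result seen."""
    if level == 0:
        return rollout(policy)
    best_cost, best_path = math.inf, None
    for _ in range(iters):
        cost, path = nrpa(level - 1, policy, iters)
        if cost < best_cost:
            best_cost, best_path = cost, path
        policy = adapt(policy, best_path)
    return best_cost, best_path

random.seed(0)
cost, path = nrpa(level=2, policy={})
print(cost, path)
```

Because rollouts build paths rather than closing a cycle, this sketch naturally handles the Hamiltonian-path variant mentioned in the abstract; capacity, fleet, and pickup/delivery constraints would be enforced inside `rollout` by pruning candidate moves.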
Abstract:
Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate, and yet allow perfect reconstruction of the spectra of wide sense stationary signals. However, theoretical guarantees for these samplers assume ideal conditions such as synchronous sampling and the ability to perfectly compute statistical expectations. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher Information matrix for perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variable and reduce the bi-affine problem to a linear underdetermined (sparse) problem in source powers. This thesis also studies the robustness of coprime sampling to a finite number of samples and sampling jitter, by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of WSS signals, sharp bounds on the estimation error are established which indicate that the error decays exponentially with the number of samples.
The theoretical claims are supported by extensive numerical experiments.
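The $O(M^2)$ resolution claim comes from the difference co-array: a two-level nested array with $M$ physical sensors produces $O(M^2)$ distinct lags. A minimal sketch, assuming the standard two-level nested geometry:

```python
def nested_array(n1, n2):
    """Sensor positions of a two-level nested array: n1 dense sensors at
    unit spacing followed by n2 sparse sensors at spacing n1 + 1."""
    inner = list(range(1, n1 + 1))
    outer = [(n1 + 1) * k for k in range(1, n2 + 1)]
    return inner + outer

def difference_coarray(positions):
    """Distinct pairwise differences; their count sets the usable degrees
    of freedom for resolving sources."""
    return sorted({a - b for a in positions for b in positions})

sensors = nested_array(3, 3)
lags = difference_coarray(sensors)
print(sensors)      # [1, 2, 3, 4, 8, 12]
print(len(lags))    # 23: contiguous lags -11 .. 11
```

Here 6 physical sensors yield the contiguous lag set $-11, \ldots, 11$, i.e. 23 virtual sensors, which is the $O(M^2)$ degrees-of-freedom gain the abstract refers to.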
Abstract:
Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of Wide-Sense Stationary (WSS) signals, as though it were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography, and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler, called the Generalized Nested Sampler (GNS), that can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value.
The idea of nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
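The difference-set property that drives nested covariance compression can be illustrated directly: if the covariance is Toeplitz, keeping only the rows and columns indexed by a nested set still exposes every lag, so the full matrix is determined by a small sketch. A minimal sketch assuming a noiseless Toeplitz covariance and a hypothetical GNS-style index set (not the thesis's exact sampler or its noisy-recovery algorithm):

```python
import random

def nested_indices(n1, n2):
    """0-based two-level nested index set: a dense block of n1 indices plus
    n2 sparse indices at spacing n1 + 1."""
    return list(range(n1)) + [(n1 + 1) * k - 1 for k in range(1, n2 + 1)]

def compress(T, idx):
    """Quadratic sketch: keep only covariance entries on the sampled rows/cols."""
    return {(i, j): T[i][j] for i in idx for j in idx}

def recover_first_column(sketch, n):
    """Every lag k appears as some sampled pair with |i - j| = k, so the whole
    Toeplitz matrix is determined by the sketch."""
    col = [None] * n
    for (i, j), v in sketch.items():
        col[abs(i - j)] = v
    return col

n1 = n2 = 3
n = (n1 + 1) * n2                        # 12 x 12 covariance
random.seed(0)
t = [random.random() for _ in range(n)]  # true autocorrelation (lags 0..n-1)
T = [[t[abs(i - j)] for j in range(n)] for i in range(n)]
idx = nested_indices(n1, n2)             # [0, 1, 2, 3, 7, 11]: 6 of 12 rows
recovered = recover_first_column(compress(T, idx), n)
assert recovered == t                    # all 12 lags recovered from 6 rows
```

The 6-element index set covers all 12 lags of the 12 x 12 matrix, which is the noiseless version of the compression the GNS achieves.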
Abstract:
Introduction: Preeclampsia is the main complication of pregnancy in developing countries. Calcium starting at 14 weeks of pregnancy is indicated to prevent the disease. Recent advances in the prevention of preeclampsia endorse the addition of conjugated linoleic acid. Objective: To estimate the protective effect of calcium alone, compared to calcium plus conjugated linoleic acid, in nulliparous women at risk of preeclampsia. Methods: A case-control design nested in a cohort of nulliparous women attending antenatal care from 2010 to 2014. The clinical histories of 387 cases of preeclampsia were compared with 1,054 normotensive controls. The exposure was prescription of calcium alone (first period) or calcium plus conjugated linoleic acid (second period), from 12 to 16 weeks of gestational age until labor. Confounding variables were controlled by allowing only nulliparous women into the study and stratifying by age, education, and ethnic group. Results: The average age was 26.4 years (range: 13-45); 85% of the women were from mixed ethnic backgrounds and had high school education. There were no differences between women who received calcium carbonate and those who did not (OR= 0.96; 95% CI= 0.73–1.27). The group of adolescents (13 to 18 yrs old) receiving calcium plus conjugated linoleic acid was protected against preeclampsia (OR= 0.00; 95% CI= 0.00–0.44), independent of the confounder variables. Conclusions: 1. Calcium supplementation during pregnancy did not have a preventive effect on preeclampsia. 2. Calcium plus conjugated linoleic acid provided to adolescents was observed to have a preventive effect on preeclampsia.
Abstract:
This communication describes the procedures for collecting and processing cerebrospinal fluid, for extracting viral genomic RNA, and for detecting the virus through the RT-nested PCR technique, a potential molecular diagnostic method for CAE.