8 results for unified framework
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Systematic reviews of well-designed trials constitute a high level of scientific evidence and are important for medical decision making. Meta-analysis facilitates integration of the evidence using a transparent and systematic approach, leading to a broader interpretation of treatment effectiveness and safety than can be attained from individual studies. Traditional meta-analyses are limited to comparing just 2 interventions concurrently and cannot combine evidence concerning multiple treatments. A relatively recent extension of the traditional meta-analytical approach is network meta-analysis, which allows, under certain assumptions, the quantitative synthesis of all evidence under a unified framework and across a network of all eligible trials. Network meta-analysis combines evidence from direct and indirect information via common comparators; interventions can therefore be ranked in terms of the analyzed outcome. In this article, the network meta-analysis approach is introduced in a nontechnical manner using a worked example on the treatment effectiveness of conventional and self-ligating appliances.
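To make the notion of indirect evidence concrete: when treatments A and C have each been compared with a common comparator B, the standard (Bucher-type) adjusted indirect comparison combines the two direct estimates as below. This is a textbook relation given for illustration, not an equation reproduced from the article.

```latex
% Adjusted indirect comparison of A vs. C via the common comparator B
% (illustrative textbook relation, not taken from the article)
\[
  \hat{d}_{AC}^{\,\mathrm{ind}} = \hat{d}_{AB} - \hat{d}_{CB},
  \qquad
  \operatorname{Var}\!\bigl(\hat{d}_{AC}^{\,\mathrm{ind}}\bigr)
  = \operatorname{Var}\!\bigl(\hat{d}_{AB}\bigr)
  + \operatorname{Var}\!\bigl(\hat{d}_{CB}\bigr).
\]
```

A network meta-analysis then pools this indirect estimate with any direct A-versus-C evidence, typically by inverse-variance weighting, which is what allows all interventions in the network to be ranked on the analyzed outcome.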
Abstract:
In this thesis, we develop an adaptive framework for Monte Carlo rendering, and more specifically for Monte Carlo Path Tracing (MCPT) and its derivatives. MCPT is attractive because it can handle a wide variety of light transport effects, such as depth of field, motion blur, indirect illumination, participating media, and others, in an elegant and unified framework. However, MCPT is a sampling-based approach and is only guaranteed to converge in the limit, as the sampling rate grows to infinity. At finite sampling rates, MCPT renderings are often plagued by noise artifacts that can be visually distracting. The adaptive framework developed in this thesis leverages two core strategies to address noise artifacts in renderings: adaptive sampling and adaptive reconstruction. Adaptive sampling consists of increasing the sampling rate on a per-pixel basis to ensure that each pixel's error falls below a predefined threshold. Adaptive reconstruction leverages the available samples on a per-pixel basis, aiming for an optimal trade-off between minimizing the residual noise artifacts and preserving the edges in the image. In our framework, we greedily minimize the relative Mean Squared Error (rMSE) of the rendering by iterating over sampling and reconstruction steps. Given an initial set of samples, the reconstruction step aims at producing the rendering with the lowest rMSE on a per-pixel basis, and the next sampling step then further reduces the rMSE by distributing additional samples according to the magnitude of the residual rMSE of the reconstruction. This iterative approach tightly couples the adaptive sampling and adaptive reconstruction strategies by ensuring that we densely sample only those regions of the image where adaptive reconstruction cannot properly resolve the noise. In a first implementation of our framework, we demonstrate the usefulness of our greedy error minimization using a simple reconstruction scheme leveraging a filterbank of isotropic Gaussian filters. In a second implementation, we integrate a powerful edge-aware filter that can adapt to the anisotropy of the image. Finally, in a third implementation, we leverage auxiliary feature buffers that encode scene information (such as surface normals, position, or texture) to improve the robustness of the reconstruction in the presence of strong noise.
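The iterative loop described above can be summarized with a minimal Python sketch, assuming a hypothetical renderer interface (`render_samples`) and the first implementation's filterbank of isotropic Gaussian filters; the function and parameter names are placeholders rather than the thesis's actual code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical sketch of the greedy rMSE-minimization loop: alternate between
# adaptive reconstruction (pick the best Gaussian filter per pixel) and
# adaptive sampling (send more samples where the estimated rMSE is largest).
# `render_samples(counts)` is a placeholder for the Monte Carlo path tracer;
# it should return the per-pixel mean and variance of all samples drawn so far.

def reconstruct(mean_img, var_img, scales=(1.0, 2.0, 4.0)):
    """Per-pixel filter selection: keep, for each pixel, the Gaussian scale
    whose estimated error (crude bias proxy + filtered variance) is smallest."""
    best, best_err = mean_img.copy(), var_img.copy()   # scale 0 = no filtering
    for s in scales:
        filtered = gaussian_filter(mean_img, sigma=s)
        bias2 = (filtered - mean_img) ** 2             # crude bias estimate
        # Approximate variance left after filtering with a Gaussian of scale s.
        var_f = gaussian_filter(var_img, sigma=s) / (4.0 * np.pi * s * s)
        err = bias2 + var_f
        mask = err < best_err
        best[mask], best_err[mask] = filtered[mask], err[mask]
    return best, best_err

def adaptive_render(render_samples, shape=(256, 256), iterations=4, spp=4):
    counts = np.full(shape, spp)                       # uniform initial sampling
    mean_img, var_img = render_samples(counts)
    for _ in range(iterations):
        recon, err = reconstruct(mean_img, var_img)
        rmse = err / (recon ** 2 + 1e-2)               # relative MSE estimate
        # Adaptive sampling: distribute the next batch proportionally to rMSE.
        counts = np.maximum(1, (spp * rmse.size * rmse / rmse.sum()).astype(int))
        mean_img, var_img = render_samples(counts)     # accumulate new samples
    return reconstruct(mean_img, var_img)[0]
```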
Abstract:
With millions of users worldwide, microblogging has developed into a powerful tool for interaction and information dissemination. While both men and women readily use this technology, there are significant differences in how they embrace it. Understanding these differences is important to ensure gender parity, to provide advertisers with actionable insights on the marketing potential of both groups, and to inform current theories on how microblogging affordances shape gender roles. So far, existing research has not provided a unified framework for such analysis, with gender insights scattered across multiple studies. To fill this gap, our study conducts a comprehensive meta-review of existing research. We find that the current discourse offers a solid body of knowledge on gender differences in adoption, shared content, and stylistic presentation, but a rather convoluted picture of female and male interaction. Together, our structured findings offer deeper insight into the underlying dynamics of gender differences in microblogging.
Abstract:
Kriging-based optimization relying on noisy evaluations of complex systems has recently motivated contributions from various research communities. Five strategies have been implemented in the DiceOptim package. The corresponding functions constitute a user-friendly tool for solving expensive noisy optimization problems in a sequential framework, while offering some flexibility for advanced users. Moreover, the implementation is done in a unified environment, making this package a useful device for studying the relative performance of existing approaches depending on the experimental setup. An overview of the package structure and interface is provided, as well as a description of the strategies and some insight into the implementation challenges and the proposed solutions. The strategies are compared to some existing optimization packages on analytical test functions and show promising performance.
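DiceOptim itself is an R package, so the following Python sketch is only an illustration of the underlying idea of kriging-based optimization with noisy evaluations (a Gaussian-process surrogate driving a sequential, expected-improvement-style infill loop); it does not reproduce the package's functions or interface.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

# Illustrative sketch (not the DiceOptim R API): sequential kriging-based
# minimization of a noisy expensive function using an expected-improvement-style
# criterion on the Gaussian-process posterior, with a plug-in best value.

def noisy_objective(x):                  # placeholder "expensive" simulator
    return np.sin(3 * x) + x ** 2 + 0.1 * np.random.randn(*x.shape)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(6, 1))      # initial design
y = noisy_objective(X).ravel()

kernel = Matern(nu=2.5) + WhiteKernel()  # WhiteKernel models evaluation noise
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)

candidates = np.linspace(-2, 2, 401).reshape(-1, 1)
for _ in range(20):                      # sequential infill loop
    gp.fit(X, y)
    mu, sd = gp.predict(candidates, return_std=True)
    imp = mu.min() - mu                  # improvement over plug-in best mean
    ei = imp * norm.cdf(imp / sd) + sd * norm.pdf(imp / sd)
    x_next = candidates[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, noisy_objective(x_next).ravel())

gp.fit(X, y)
print("estimated minimizer:", candidates[np.argmin(gp.predict(candidates))])
```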
Abstract:
Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework for unified detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this, we integrate the Active Testing (AT) paradigm with Bayesian filtering, and this results in a framework capable of both detecting and tracking robustly in situations where the target object enters and leaves the field of view regularly. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
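As a rough illustration of the sequential Bayesian estimation view, the sketch below maintains a particle approximation of the posterior over the target position and uses its entropy to switch between detection and tracking behaviour. This is a generic hypothetical example, not the paper's Active Testing formulation, and `likelihood` stands in for an assumed appearance model.

```python
import numpy as np

# Illustrative particle-filter sketch of "detection and tracking as sequential
# Bayesian estimation": maintain a posterior over the target position, update
# it with each frame's likelihood, and use the posterior entropy to decide
# whether the target is localized (track) or effectively lost (detect again).
# `likelihood(frame, particles)` must return one non-negative score per
# hypothesis; it is a placeholder, not the paper's appearance model.

def entropy(weights):
    w = weights / weights.sum()
    return -np.sum(w * np.log(w + 1e-12))

def track(frames, likelihood, n=500, motion_std=5.0, h_lost=5.0, extent=256):
    particles = np.random.uniform(0, extent, size=(n, 2))  # (x, y) hypotheses
    weights = np.full(n, 1.0 / n)
    for frame in frames:
        # Predict: diffuse hypotheses with a simple random-walk motion model.
        particles += np.random.normal(0.0, motion_std, particles.shape)
        # Update: reweight each hypothesis by the frame likelihood.
        weights *= likelihood(frame, particles)
        weights /= weights.sum()
        if entropy(weights) > h_lost:
            # Posterior too diffuse: target lost or not yet present. Fall back
            # to detection by re-spreading hypotheses over the field of view.
            particles = np.random.uniform(0, extent, size=(n, 2))
            weights = np.full(n, 1.0 / n)
            yield None                       # no confident estimate this frame
        else:
            estimate = weights @ particles   # posterior-mean target position
            idx = np.random.choice(n, n, p=weights)          # resample
            particles, weights = particles[idx], np.full(n, 1.0 / n)
            yield estimate
```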
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic, and strong forces. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects, such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
Architectural decisions are often encoded in the form of constraints and guidelines. Non-functional requirements can be ensured by checking the conformance of the implementation against this kind of invariant. Conformance checking is often a costly and error-prone process that involves the use of multiple tools differing in effectiveness, complexity, and scope of applicability. To reduce the overall effort entailed by this activity, we propose a novel approach that supports verification of human-readable declarative rules through the use of adapted off-the-shelf tools. Our approach consists of a rule specification DSL, called Dicto, and a tool coordination framework, called Probo. The approach has been implemented in a prototype that is soon to be evaluated.
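As a purely hypothetical illustration of the idea (not the actual Dicto syntax or the Probo implementation), a human-readable declarative rule can be reduced to a subject/predicate/argument triple and dispatched to whichever off-the-shelf tool adapter is able to verify that kind of predicate:

```python
from dataclasses import dataclass

# Hypothetical sketch of rule dispatch: each adapter wraps an off-the-shelf
# tool and declares which rule predicates it can check.  All names below are
# invented for illustration.

@dataclass
class Rule:
    subject: str        # e.g. a package, class, or endpoint pattern
    predicate: str      # e.g. "cannot_depend_on", "must_respond_within"
    argument: str

class DependencyToolAdapter:
    handles = {"cannot_depend_on"}
    def check(self, rule: Rule) -> bool:
        # Placeholder: would invoke a static-analysis tool and parse its report.
        return True

class PerformanceToolAdapter:
    handles = {"must_respond_within"}
    def check(self, rule: Rule) -> bool:
        # Placeholder: would invoke a load-testing tool against the endpoint.
        return True

ADAPTERS = [DependencyToolAdapter(), PerformanceToolAdapter()]

def verify(rules):
    for rule in rules:
        adapter = next(a for a in ADAPTERS if rule.predicate in a.handles)
        status = "OK" if adapter.check(rule) else "VIOLATED"
        print(f"{rule.subject} {rule.predicate} {rule.argument}: {status}")

verify([
    Rule("gui.*", "cannot_depend_on", "persistence.*"),
    Rule("/search", "must_respond_within", "2s"),
])
```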