925 results for Convex Duality


Relevance:

20.00%

Publisher:

Abstract:

We prove large deviation results for sums of heavy-tailed random elements in rather general convex cones, that is, semigroups equipped with a rescaling operation by positive real numbers. In contrast to previous results for the cone of convex sets, our technique does not use the embedding of cones into linear spaces. Examples include the cone of convex sets with Minkowski addition, the positive half-line with the maximum operation, and the family of square integrable functions with arithmetic addition and argument rescaling.
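
For readers unfamiliar with the setting, the sketch below (an illustration, not part of the paper) shows two of the cones listed above as semigroups with a rescaling by positive reals: the positive half-line under the maximum operation, and one-dimensional convex sets (intervals) under Minkowski addition.

    # Illustration only: a "cone" here is a commutative semigroup with a
    # rescaling by positive real numbers.

    def maxplus_add(x, y):
        """Semigroup operation on the positive half-line: addition is the maximum."""
        return max(x, y)

    def maxplus_scale(c, x):
        """Rescaling by a positive real number c."""
        return c * x

    def minkowski_add(A, B):
        """Minkowski addition of 1-D convex sets, represented as intervals (a, b)."""
        return (A[0] + B[0], A[1] + B[1])

    def interval_scale(c, A):
        """Rescaling of an interval by a positive real number c."""
        return (c * A[0], c * A[1])

    if __name__ == "__main__":
        # Both operations are associative, commutative and compatible with rescaling,
        # which is all the setting above asks of a convex cone.
        print(maxplus_add(2.0, 5.0))            # 5.0
        print(minkowski_add((0, 1), (2, 4)))    # (2, 5)
        print(interval_scale(0.5, (2, 4)))      # (1.0, 2.0)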

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present local stereological estimators of Minkowski tensors defined on convex bodies in ℝ^d. Special cases cover a number of well-known local stereological estimators of volume and surface area in ℝ^3, but the general set-up also provides new local stereological estimators of various types of centres of gravity and tensors of rank two. Rank-two tensors can be represented as ellipsoids and contain information about shape and orientation. The performance of some of the estimators of centres of gravity and volume tensors of rank two is investigated by simulation.
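
As a rough illustration of how a rank-two tensor encodes shape and orientation, the sketch below (an assumption for illustration, not one of the paper's local stereological estimators) computes the centred second-moment tensor of a point sample from a box in ℝ^3 and reads off an ellipsoid from its eigendecomposition.

    # Illustration only: the centred second moments of position over a body act as a
    # crude stand-in for a rank-two volume tensor; its eigendecomposition gives an
    # ellipsoid summarising shape and orientation.
    import numpy as np

    rng = np.random.default_rng(0)

    # Monte Carlo stand-in for integration over a convex body K in R^3
    # (here K is an axis-aligned box, sampled uniformly).
    pts = rng.uniform(low=[-2, -1, -0.5], high=[2, 1, 0.5], size=(100_000, 3))

    centre = pts.mean(axis=0)                   # rank-one information: centre of gravity
    second_moments = np.cov(pts, rowvar=False)  # rank-two information: centred second moments

    eigvals, eigvecs = np.linalg.eigh(second_moments)
    print("semi-axis directions:\n", eigvecs)       # orientation of the ellipsoid
    print("semi-axis lengths ~", np.sqrt(eigvals))  # shape information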

Relevance:

20.00%

Publisher:

Abstract:

We study representations of MV-algebras -- equivalently, unital lattice-ordered abelian groups -- through the lens of Stone-Priestley duality, using canonical extensions as an essential tool. Specifically, the theory of canonical extensions implies that the (Stone-Priestley) dual spaces of MV-algebras carry the structure of topological partial commutative ordered semigroups. We use this structure to obtain two different decompositions of such spaces, one indexed over the prime MV-spectrum, the other over the maximal MV-spectrum. These decompositions yield sheaf representations of MV-algebras, using a new and purely duality-theoretic result that relates certain sheaf representations of distributive lattices to decompositions of their dual spaces. Importantly, the proofs of the MV-algebraic representation theorems that we obtain in this way are distinguished from the existing work on this topic by the following features: (1) we use only basic algebraic facts about MV-algebras; (2) we show that the two aforementioned sheaf representations are special cases of a common result, with potential for generalizations; and (3) we show that these results are strongly related to the structure of the Stone-Priestley duals of MV-algebras. In addition, using our analysis of these decompositions, we prove that MV-algebras with isomorphic underlying lattices have homeomorphic maximal MV-spectra. This result is an MV-algebraic generalization of a classical theorem by Kaplansky stating that two compact Hausdorff spaces are homeomorphic if, and only if, the lattices of continuous [0, 1]-valued functions on the spaces are isomorphic.
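
For concreteness, the sketch below shows the standard MV-algebra, the unit interval [0, 1] with the Łukasiewicz operations; it is only background illustrating the structures named above, not a construction from the paper.

    # The standard MV-algebra: the unit interval [0, 1] with Lukasiewicz operations.
    # Its underlying lattice order is the usual order on [0, 1].

    def mv_plus(x, y):
        """Truncated addition: x (+) y = min(1, x + y)."""
        return min(1.0, x + y)

    def mv_neg(x):
        """Involutive negation: ~x = 1 - x."""
        return 1.0 - x

    def mv_join(x, y):
        """Lattice join, definable from (+) and ~: x v y = ~(~x (+) y) (+) y."""
        return mv_plus(mv_neg(mv_plus(mv_neg(x), y)), y)

    if __name__ == "__main__":
        print(mv_plus(0.6, 0.7))   # 1.0
        print(mv_neg(0.25))        # 0.75
        print(mv_join(0.3, 0.8))   # 0.8  (the join coincides with max on [0, 1])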

Relevance:

20.00%

Publisher:

Abstract:

In this paper we introduce a class of descriptors for regular languages arising from an application of the Stone duality between finite Boolean algebras and finite sets. These descriptors, called classical fortresses, are objects specified in classical propositional logic and capable of accepting exactly the regular languages. To prove this, we show that the languages accepted by classical fortresses and by deterministic finite automata coincide. Classical fortresses, besides being propositional descriptors for regular languages, also turn out to be an efficient tool for providing alternative and intuitive proofs of the closure properties of regular languages.
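
The fortress side of the equivalence is defined in the paper itself; as background, the sketch below illustrates only the automaton side, a deterministic finite automaton accepting a simple regular language.

    # A deterministic finite automaton accepting the regular language of binary
    # strings with an even number of 1s (illustration of the DFA side only).

    DFA = {
        "states": {"even", "odd"},
        "start": "even",
        "accepting": {"even"},
        "delta": {
            ("even", "0"): "even", ("even", "1"): "odd",
            ("odd", "0"): "odd",   ("odd", "1"): "even",
        },
    }

    def accepts(dfa, word):
        """Run the DFA on the input word and report acceptance."""
        state = dfa["start"]
        for symbol in word:
            state = dfa["delta"][(state, symbol)]
        return state in dfa["accepting"]

    print(accepts(DFA, "1011"))  # False (three 1s)
    print(accepts(DFA, "1001"))  # True  (two 1s)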

Relevance:

20.00%

Publisher:

Abstract:

We present a novel approach to the reconstruction of depth from light field data. Our method uses dictionary representations and group sparsity constraints to derive a convex formulation. Although our solution results in an increase of the problem dimensionality, we keep the numerical complexity at bay by restricting the space of solutions and by exploiting an efficient primal-dual formulation. Comparisons with state-of-the-art techniques, on both synthetic and real data, show promising performance.
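
As background on the kind of solver referred to above, the sketch below runs a generic primal-dual (Chambolle-Pock) iteration on a toy one-dimensional total-variation model; the model, operators and step sizes are assumptions for illustration and are not the paper's light-field depth formulation.

    # Toy model: min_x 0.5*||x - b||^2 + lam*||Dx||_1, solved by a primal-dual iteration.
    import numpy as np

    def forward_diff(x):
        return np.diff(x, append=x[-1])          # K: finite-difference operator

    def forward_diff_T(y):
        return -np.diff(y, prepend=0.0)          # adjoint of the forward difference

    def chambolle_pock(b, lam=0.5, tau=0.45, sigma=0.45, theta=1.0, iters=300):
        x = b.copy()
        x_bar = x.copy()
        y = np.zeros_like(b)
        for _ in range(iters):
            # dual step + projection onto {|y| <= lam} (prox of the conjugate of lam*|.|_1)
            y = np.clip(y + sigma * forward_diff(x_bar), -lam, lam)
            # primal step + prox of the quadratic data term
            x_new = (x - tau * forward_diff_T(y) + tau * b) / (1.0 + tau)
            # over-relaxation
            x_bar = x_new + theta * (x_new - x)
            x = x_new
        return x

    b = np.concatenate([np.zeros(50), np.ones(50)]) \
        + 0.1 * np.random.default_rng(0).normal(size=100)
    x_hat = chambolle_pock(b)
    print(np.round(x_hat[:5], 2), np.round(x_hat[-5:], 2))  # roughly piecewise-constant output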

Relevance:

20.00%

Publisher:

Abstract:

One of the current issues of debate in the study of mild cognitive impairment (MCI) is the deviation of oscillatory brain responses from normal brain states and their dynamics. This work aims to characterize the differences in the power of brain oscillations during the execution of a recognition memory task in MCI subjects in comparison with elderly controls. Magnetoencephalographic (MEG) signals were recorded during performance of a continuous recognition memory task. Oscillatory brain activity during the recognition phase of the task was analyzed by wavelet transform in source space by means of a minimum norm algorithm. Both groups obtained a 77% hit ratio. In comparison with healthy controls, MCI subjects showed increased theta power (p < 0.001), a smaller beta reduction (p < 0.001) and decreased alpha and gamma power (p < 0.002 and p < 0.001 respectively) in frontal, temporal and parietal areas during early and late latencies. Our results point towards a dual pattern of activity (increase and decrease) which is indicative of MCI and specific to certain time windows, frequency bands and brain regions. These results could represent two neurophysiological sides of MCI. Characterizing these opposing processes may contribute to the understanding of the disorder.
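
As a rough illustration of the wavelet power analysis described above (not the study's MEG pipeline), the sketch below convolves a toy signal with complex Morlet wavelets and averages power over a theta-band set of frequencies; the sampling rate, frequencies and signal are assumptions.

    # Illustration only: time-frequency power via complex Morlet wavelet convolution.
    import numpy as np

    fs = 250.0                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)
    # toy "recognition epoch": a 6 Hz (theta) burst on top of noise
    signal = np.sin(2 * np.pi * 6 * t) * (t > 0.5) \
        + 0.3 * np.random.default_rng(1).normal(size=t.size)

    def morlet_power(x, freqs, fs, n_cycles=5):
        """Return an array (len(freqs), len(x)) of wavelet power."""
        power = np.empty((len(freqs), x.size))
        for i, f in enumerate(freqs):
            sigma_t = n_cycles / (2 * np.pi * f)            # temporal width of the wavelet
            wt = np.arange(-3 * sigma_t, 3 * sigma_t, 1.0 / fs)
            wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-wt**2 / (2 * sigma_t**2))
            wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit-energy normalisation
            power[i] = np.abs(np.convolve(x, wavelet, mode="same")) ** 2
        return power

    theta = morlet_power(signal, freqs=[4, 5, 6, 7], fs=fs)
    print(theta.mean(axis=1))   # mean power per theta frequency; largest near 6 Hz here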

Relevance:

20.00%

Publisher:

Abstract:

The design of a nuclear power plant has to follow a number of regulations aimed at limiting the risks inherent in this type of installation. The goal is to prevent, and to limit the consequences of, any possible incident that might threaten the public or the environment. To verify that the safety requirements are met, a safety assessment process is followed. Safety analysis is a key component of a safety assessment, and it incorporates both probabilistic and deterministic approaches. The deterministic approach attempts to ensure that the various situations, and in particular the accidents, considered to be plausible have been taken into account, and that the monitoring systems and the engineered safety and safeguard systems will be capable of ensuring the safety goals. The probabilistic approach, on the other hand, tries to demonstrate that the safety requirements are met for potential accidents both within and beyond the design basis, thus identifying vulnerabilities not necessarily accessible through deterministic safety analysis alone. Probabilistic safety assessment (PSA) methodology is widely used in the nuclear industry and is especially effective for a comprehensive assessment of the measures needed to prevent accidents with small probability but severe consequences. Still, the trend towards risk-informed regulation (RIR) has demanded a more extensive use of risk assessment techniques, with a significant need to further extend the scope and quality of PSA. This is where the theory of stimulated dynamics (TSD) intervenes, as it is the mathematical foundation of the integrated safety assessment (ISA) methodology developed by the Modelling and Simulation (MOSI) branch of the CSN (Consejo de Seguridad Nuclear). This methodology attempts to extend classical PSA by including accident dynamics analysis, an assessment of the damage associated with the transients, and a computation of the damage frequency. The application of the ISA methodology requires a computational framework called SCAIS (Simulation Code System for Integrated Safety Assessment). SCAIS supports accident dynamics analysis through simulation of nuclear accident sequences and operating procedures. Furthermore, it includes probabilistic quantification of fault trees and sequences, as well as integration and statistical treatment of risk metrics. SCAIS makes intensive use of code coupling techniques to join thermal-hydraulic analysis, severe accident and probability calculation codes. The integration of accident simulation into the risk assessment process, which requires the use of complex nuclear plant models, is what makes the methodology so powerful, yet it comes at the cost of a large increase in complexity. As that complexity is concentrated in the accident simulation codes, the question arises of whether it is possible to reduce the number of required simulations, which is the focus of the present work. This document presents the work done on the investigation of more efficient techniques applied to the risk assessment process within the ISA methodology. Such techniques therefore have the primary goal of decreasing the number of simulations needed for an adequate estimation of the damage probability. As the methodology and tools are relatively recent, little work has been done along this line of investigation, making it a difficult but necessary task, and because of time limitations the scope of the work had to be reduced.
Therefore, some assumptions were made in order to work in simplified scenarios best suited for an initial approximation to the problem. The following section explains in detail the process followed to design and test the developed techniques. The next section then introduces the general concepts and formulae of the TSD theory, which are at the core of the risk assessment process. Afterwards, a description of the simulation framework requirements and design is given, followed by an introduction to the developed techniques, with full detail of their mathematical background and procedures. Later, the test case used is described and the results from applying the techniques are shown. Finally, the conclusions are presented and future lines of work are outlined.
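
As a toy illustration of the stated goal of reducing the number of simulations needed to estimate a small damage probability (and not of the ISA/TSD techniques developed in this work), the sketch below compares plain Monte Carlo with importance sampling on a one-variable surrogate "simulator".

    # Illustration only: estimating a small exceedance probability with fewer samples
    # by shifting the sampling density towards the damage region and reweighting.
    import numpy as np

    rng = np.random.default_rng(0)

    def damage(x):
        """Toy stand-in for an expensive accident simulation: damage occurs if x exceeds 3."""
        return x > 3.0

    # Plain Monte Carlo: sample the nominal uncertainty N(0, 1) directly.
    n = 2_000
    x_mc = rng.normal(0.0, 1.0, size=n)
    p_mc = damage(x_mc).mean()

    # Importance sampling: sample from a density centred in the damage region,
    # and reweight by the likelihood ratio so the estimator stays unbiased.
    x_is = rng.normal(3.0, 1.0, size=n)
    weights = np.exp(-0.5 * x_is**2) / np.exp(-0.5 * (x_is - 3.0) ** 2)
    p_is = (damage(x_is) * weights).mean()

    print(f"plain Monte Carlo estimate:   {p_mc:.5f}")
    print(f"importance-sampling estimate: {p_is:.5f}   (true value is about 0.00135)")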

Relevance:

20.00%

Publisher:

Abstract:

We show the existence of sets of n points (n ≥ 4) for which every convex decomposition contains more than (35/32)n − 3/2 polygons, which refutes the conjecture that for every set of n points there is a convex decomposition with at most n + C polygons. For sets having exactly three extreme points we show that more than n + √(2(n − 3)) − 4 polygons may be necessary to form a convex decomposition.
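
For a sense of scale, the snippet below simply evaluates the two lower bounds stated above for a few values of n; it is illustrative only.

    # Evaluate the two lower bounds on the number of polygons in a convex decomposition.
    import math

    def bound_general(n):
        """More than (35/32)n - 3/2 polygons are needed for some n-point sets."""
        return (35.0 / 32.0) * n - 1.5

    def bound_three_extreme(n):
        """More than n + sqrt(2(n - 3)) - 4 polygons for some sets with three extreme points."""
        return n + math.sqrt(2 * (n - 3)) - 4

    for n in (10, 100, 1000):
        print(n, round(bound_general(n), 1), round(bound_three_extreme(n), 1))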