29 results for probabilistic refinement calculus


Relevance: 20.00%

Abstract:

Verified security is an emerging methodology for building high-assurance proofs of security properties of computer systems. Computer systems are modeled as probabilistic programs, and rigorous program-semantics techniques are used to prove that they comply with a given security goal. In particular, the methodology advocates the use of interactive or automated theorem provers to build fully formal, machine-checked versions of these security proofs. Verified security has proved successful in modeling and reasoning about several standard security notions in the area of cryptography. However, it has fallen short of covering an important class of approximate, quantitative security notions. The distinguishing characteristic of this class of security notions is that they are stated as a "similarity" condition between the output distributions of two probabilistic programs, and this similarity is quantified using some notion of distance between probability distributions. The class comprises prominent security notions from multiple areas such as private data analysis, information-flow analysis and cryptography. These include, for instance, indifferentiability, which enables securely replacing an idealized component of a system with a concrete implementation, and differential privacy, a notion of privacy-preserving data mining that has received a great deal of attention in the last few years. The lack of rigorous techniques for verifying these properties is thus an important open problem that needs to be addressed. In this dissertation we introduce several quantitative program logics to reason about this class of security notions. Our main theoretical contribution is, in particular, a quantitative variant of a full-fledged relational Hoare logic for probabilistic programs. The soundness of these logics is fully formalized in the Coq proof assistant, and tool support is also available through an extension of CertiCrypt, a framework to verify cryptographic proofs in Coq. We validate the applicability of our approach by building fully machine-checked proofs for several systems that were out of the reach of the verified security methodology. These comprise, among others, a construction to build "safe" hash functions into elliptic curves and differentially private algorithms for several combinatorial optimization problems from the recent literature.

Relevance: 20.00%

Abstract:

Rising water demands are difficult to meet in many regions of the world. As a consequence, under adverse meteorological conditions, large economic losses can occur in agriculture. This paper aims to analyze the variability of water shortage in an irrigation district and its effect on farmers' income. A probabilistic analysis of water availability for agriculture in the irrigation district is performed through a supply-system simulation approach, considering stochastically generated series of stream flows. Net margins associated with crop production are also estimated as a function of the final water allocations. Net margins are calculated considering either single-crop farming or a polyculture system. In the polyculture system, crop distribution and water redistribution are calculated through an optimization approach using the General Algebraic Modeling System (GAMS) for several scenarios of irrigation water availability. Expected net margins are obtained by crop and for the optimal crop and water distribution. The maximum expected margins are obtained for the optimal crop combination, followed by the alfalfa monoculture, maize, rice, wheat and finally barley. Water is distributed as follows, from largest to smallest allocation: rice, alfalfa, maize, wheat and barley.
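
The crop and water allocation step described above is a constrained optimization problem; the sketch below illustrates the idea with a small linear program in Python (scipy) rather than the paper's GAMS model. The crop names, margins, water requirements, land area and seasonal water allocation are hypothetical placeholders for a single scenario, far simpler than the study's setup.

```python
# Minimal sketch (not the paper's GAMS model): allocate land among crops to
# maximize net margin under a total-land and a seasonal-water constraint.
# All coefficients below are illustrative placeholders.
import numpy as np
from scipy.optimize import linprog

crops = ["rice", "alfalfa", "maize", "wheat", "barley"]
margin = np.array([1800.0, 1500.0, 1200.0, 700.0, 500.0])    # EUR per ha (hypothetical)
water = np.array([12000.0, 9000.0, 6500.0, 2500.0, 2000.0])  # m3 per ha (hypothetical)
total_land = 1000.0        # ha available in the district (hypothetical)
water_allocation = 5.0e6   # m3 available this season (one stochastic scenario)

# linprog minimizes, so negate the margins to maximize total net margin.
res = linprog(
    c=-margin,
    A_ub=np.vstack([np.ones_like(margin), water]),
    b_ub=[total_land, water_allocation],
    bounds=[(0, None)] * len(crops),
)
for crop, area in zip(crops, res.x):
    print(f"{crop:8s}: {area:8.1f} ha")
print(f"expected net margin: {-res.fun:,.0f} EUR")
```

Repeating the solve over stochastically generated water allocations would yield the distribution of expected net margins discussed in the abstract.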

Relevance: 20.00%

Abstract:

We present a biomolecular probabilistic model driven by the action of a DNA toolbox made of a set of DNA templates and enzymes that is able to perform Bayesian inference. The model will take single-stranded DNA as input data, representing the presence or absence of a specific molecular signal (the evidence). The program logic uses different DNA templates and their relative concentration ratios to encode the prior probability of a disease and the conditional probability of a signal given the disease. When the input and program molecules interact, an enzyme-driven cascade of reactions (DNA polymerase extension, nicking and degradation) is triggered, producing a different pair of single-stranded DNA species. Once the system reaches equilibrium, the ratio between the output species will represent the application of Bayes? law: the conditional probability of the disease given the signal. In other words, a qualitative diagnosis plus a quantitative degree of belief in that diagno- sis. Thanks to the inherent amplification capability of this DNA toolbox, the resulting system will be able to to scale up (with longer cascades and thus more input signals) a Bayesian biosensor that we designed previously.
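
For reference, the computation the molecular circuit approximates is plain Bayes' rule for a binary disease and a binary signal; the sketch below uses placeholder probabilities, not values from the paper.

```python
# Minimal sketch of the inference the DNA circuit approximates: Bayes' rule for a
# binary disease D and a binary molecular signal S. All numbers are illustrative.
def posterior_disease_given_signal(prior_d: float, p_s_given_d: float, p_s_given_not_d: float) -> float:
    """Return P(D=1 | S=1) by Bayes' rule."""
    joint_d = prior_d * p_s_given_d                   # P(D=1, S=1)
    joint_not_d = (1.0 - prior_d) * p_s_given_not_d   # P(D=0, S=1)
    return joint_d / (joint_d + joint_not_d)

# In the molecular implementation the two joint terms correspond to the two output
# strands, so their concentration ratio encodes the same posterior odds.
print(posterior_disease_given_signal(prior_d=0.01, p_s_given_d=0.95, p_s_given_not_d=0.05))
```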

Relevance: 20.00%

Abstract:

Colombia is one of the largest per capita mercury polluters in the world as a consequence of its artisanal gold mining activities. The severity of this problem in terms of potential health effects was evaluated by means of a probabilistic risk assessment carried out in the twelve departments (or provinces) of Colombia with the largest gold production. The two exposure pathways included in the risk assessment were inhalation of elemental Hg vapors and ingestion of fish contaminated with methylmercury. Exposure parameters for the adult population (especially rates of fish consumption) were obtained from nationwide surveys, and concentrations of Hg in air and of methylmercury in fish were gathered from previous scientific studies. Fish consumption varied between departments and ranged from 0 to 0.3 kg/day. Average concentrations of total mercury in fish (70 data points) ranged from 0.026 to 3.3 µg/g. A total of 550 individual measurements of Hg in workshop air (ranging from below the detection limit to 1 mg/m³) and 261 measurements of Hg in outdoor air (ranging from below the detection limit to 0.652 mg/m³) were used to generate the probability distributions used as concentration terms in the calculation of risk. All but two of the distributions of Hazard Quotients (HQ) associated with ingestion of Hg-contaminated fish for the twelve regions evaluated presented median values higher than the threshold value of 1, and the 95th percentiles ranged from 4 to 90. In the case of exposure to Hg vapors, minimum values of HQ for the general population exceeded 1 in all the towns included in this study, and the HQs for miner-smelters burning the amalgam are two orders of magnitude higher, reaching values of 200 at the 95th percentile. Even acknowledging the conservative assumptions included in the risk assessment and the uncertainties associated with it, its results clearly reveal the exorbitant levels of risk endured not only by miner-smelters but also by the general population of artisanal gold mining communities in Colombia.
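
As a hedged illustration of the kind of probabilistic hazard-quotient calculation the study performs, the sketch below runs a Monte Carlo over a standard HQ formulation (dose divided by reference dose). The distributions, sample sizes and parameter values are placeholders, not the paper's inputs; only the EPA reference values for methylmercury (RfD) and elemental mercury (RfC) are standard figures.

```python
# Hedged sketch of a probabilistic hazard-quotient (HQ) calculation: Monte Carlo
# sampling of exposure parameters, HQ = dose / reference dose. Input distributions
# below are illustrative, not the paper's fitted distributions.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Ingestion pathway: methylmercury in fish.
c_fish = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)   # mg/kg in fish (hypothetical)
intake = rng.triangular(0.0, 0.1, 0.3, size=n)                # kg fish per day (hypothetical)
body_w = rng.normal(70.0, 10.0, size=n).clip(40, 120)         # body weight, kg
rfd_mehg = 1e-4                                               # mg/kg-day, US EPA RfD for MeHg
hq_ingestion = (c_fish * intake) / (body_w * rfd_mehg)

# Inhalation pathway: elemental Hg vapor in outdoor or workshop air.
c_air = rng.lognormal(mean=np.log(0.005), sigma=1.5, size=n)  # mg/m3 (hypothetical)
rfc_hg = 3e-4                                                 # mg/m3, US EPA RfC for Hg vapor
hq_inhalation = c_air / rfc_hg

for name, hq in [("ingestion", hq_ingestion), ("inhalation", hq_inhalation)]:
    print(f"{name}: median HQ = {np.median(hq):.1f}, 95th pct = {np.percentile(hq, 95):.1f}")
```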

Relevance: 20.00%

Abstract:

The main objective of ventilation systems in tunnels is to reach the highest possible safety level both in service and in fire situations, the fire situation being the most relevant when designing the system. When designing a longitudinal ventilation system, the methodology used to evaluate the capacity of the system is similar in the service and fire situations, with the exception of the chimney effect and the heat-transfer phenomena responsible for the changes in air density. When facing the dimensioning task for longitudinally ventilated tunnels, although similar methodologies are used in different countries, specific hypotheses (aerodynamics, thermal properties, traffic), even if discussed in the literature or in current practice, are not usually detailed in the regulations or recommendations. The aim of this paper is to propose a probabilistic approach to the problem which would allow the designer, and the tunnel owner, to understand the uncertainty and sensitivity of the results and, eventually, identify possible ways of optimizing the ventilation solution to be adopted.
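
To illustrate the probabilistic treatment the paper argues for, the sketch below runs a Monte Carlo over a deliberately simplified longitudinal-ventilation pressure balance (Darcy-Weisbach losses versus jet-fan thrust spread over the tunnel cross-section). The simplified model and every number in it are assumptions for illustration only; the paper's actual capacity model is not reproduced here.

```python
# Hedged Monte Carlo sketch of a probabilistic design check for longitudinal
# ventilation: sample uncertain inputs, compute the number of jet fans needed to
# sustain a target air velocity, and inspect the resulting distribution.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

L, Dh, A = 1500.0, 6.0, 55.0      # tunnel length [m], hydraulic diameter [m], cross-section [m2]
thrust, eta = 1200.0, 0.8         # thrust per jet fan [N], installation efficiency (hypothetical)

v_req = rng.normal(3.0, 0.3, size=n)     # required (critical) air velocity [m/s]
lam = rng.uniform(0.02, 0.03, size=n)    # friction factor
zeta = rng.uniform(0.6, 1.0, size=n)     # inlet/outlet loss coefficient
rho = rng.normal(1.2, 0.05, size=n)      # air density [kg/m3]

dp_req = (zeta + lam * L / Dh) * 0.5 * rho * v_req**2   # pressure rise to be supplied [Pa]
fans_needed = np.ceil(dp_req * A / (eta * thrust))

print("median fans:", np.median(fans_needed))
print("95th percentile:", np.percentile(fans_needed, 95))
print("P(more than 10 fans needed):", (fans_needed > 10).mean())
```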

Relevance: 20.00%

Abstract:

The problem of interdependence between housing and commuting in a city has been analysed within the framework of welfare economics. Uncertain changes over time in the working population have been considered by means of a dynamic, probabilistic model. The characteristics of irreversibility and durability in city building have been explicitly dealt with. The ultimate objective is that the model, after further development, will serve as an auxiliary tool in city planning.

Relevance: 20.00%

Abstract:

A toolbox is a set of procedures that takes advantage of the computing power and graphical capabilities of a CAS. With these procedures the students can solve math problems, apply mathematics to engineering, or simply reinforce the learning of certain mathematical concepts. From the point of view of their construction, we can consider two types of toolboxes: (i) the closed box, built by the teacher, in which the utility files are provided to the students together with the respective tutorials and several worksheets with proposed exercises and problems,

Relevance: 20.00%

Abstract:

Colombia is one of the largest per capita mercury polluters as a consequence of its artisanal gold mining operations, which are steadily increasing following the rising price of this metal. Compared to gravimetric separation methods and cyanidation, the concentration of gold using Hg amalgams presents several advantages: the process is less time-consuming and minimizes gold losses, and Hg is easily transported and inexpensive relative to the selling price of gold. Very often, mercury amalgamation is carried out on site by unprotected workers. During this operation large amounts of mercury are discharged to the environment and eventually reach the freshwater bodies in the vicinity, where they undergo methylation. Additionally, as gold is released from the amalgam by heating in open charcoal furnaces in small workshops, mercury vapors are emitted and inhaled by the artisanal smelters and the general population.

Relevance: 20.00%

Abstract:

The main objective of this thesis is to develop and test a code capable of solving Maxwell's equations in the time domain with Adaptive Mesh Refinement (AMR). AMR is a computational technique based on dividing the physical domain of the problem into a set of rectangular meshes aligned with the Cartesian directions. Each mesh has a different resolution, and those with the highest resolution are placed where the electromagnetic waves propagate or interact with materials, i.e., where the greatest accuracy is required. Since the waves travel across the whole domain, the meshes must follow them. The main difficulty of this methodology arises at the internal boundaries where the different meshes meet. Because the most widely used method for solving Maxwell's equations is the finite-difference time-domain (FDTD) method, the work began by trying to adapt AMR to FDTD. After discovering that this combination led to instability problems at the aforementioned internal boundaries, we switched to a finite-volume time-domain (FVTD) method. This approach is based on writing Maxwell's equations in conservation-law form and solving them with a Godunov scheme. The choice of a flux limiter that protects the wave extrema from the numerical dissipation typical of this kind of method proved to be key to the correct behavior of the code. Another classical problem when solving Maxwell's equations is the treatment of the physical boundary conditions when simulating unbounded domains, i.e., where the waves must leave the system without producing any reflection. The usual solution is to place an absorbing layer at the physical boundaries. In AMREM a new method based on the characteristic fields has been developed which, with a lower CPU cost, performs sufficiently well even in the most unfavourable cases. The code has been validated against analytical solutions of different problems, and its speed has been compared with that of Meep, one of the best-known programs in the field. Several applications have also been simulated in order to demonstrate the wide range of fields in which AMREM can serve as a useful tool.
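
As a hedged illustration of the Godunov/upwind idea behind FVTD, the sketch below solves the 1-D normalized (c = 1) form of Maxwell's equations by advecting the characteristic variables E ± H. It is a first-order scheme on a single uniform periodic grid, so it has none of the AMR, flux-limiting or boundary treatment of AMREM; it only shows the characteristic-based update the thesis builds on.

```python
# Minimal 1-D sketch of the Godunov/upwind idea behind FVTD (not the AMREM code):
# in normalized units the 1-D Maxwell system is
#   dE/dt + dH/dx = 0,   dH/dt + dE/dx = 0,
# whose characteristic variables w± = E ± H are advected with speed ±1.
import numpy as np

nx, cfl, steps = 400, 0.9, 300
dx = 1.0 / nx
dt = cfl * dx
x = (np.arange(nx) + 0.5) * dx

E = np.exp(-((x - 0.5) / 0.05) ** 2)   # Gaussian pulse, splits into two traveling waves
H = np.zeros(nx)

for _ in range(steps):
    wp = E + H           # right-moving characteristic
    wm = E - H           # left-moving characteristic
    # First-order upwind (Godunov) update on periodic cells.
    wp = wp - dt / dx * (wp - np.roll(wp, 1))
    wm = wm + dt / dx * (np.roll(wm, -1) - wm)
    E, H = 0.5 * (wp + wm), 0.5 * (wp - wm)

print("max |E| after propagation:", E.max())
```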

Relevance: 20.00%

Abstract:

IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods that use tightly coupled probabilistic and deterministic approaches to address the respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that the safety justification must be based on the coupling of deterministic (consequence) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. equipment failures, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses.

Relevance: 20.00%

Abstract:

Hock and Mumby (2015) describe an approach to quantify dispersal probabilities along paths in networks of habitat patches. This approach basically consists of determining the most probable (most reliable) path for movement between habitat patches by calculating the product of the dispersal probabilities of each link (step) along the paths in the network. Although the paper by Hock and Mumby (2015) has value and includes interesting analyses (see comments in section 7 below), the approach they describe is not new.
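
For concreteness, the computation described above reduces to a shortest-path problem once each link probability p is replaced by the weight -log(p), since maximizing a product of probabilities is equivalent to minimizing the sum of their negative logarithms. The sketch below shows this with the networkx library on a made-up patch network; the graph and probabilities are illustrative only.

```python
# Sketch of the path-reliability computation: the most probable dispersal path
# maximizes the product of per-link probabilities, i.e. the shortest path under
# -log(p) link weights. Patches and probabilities below are made up.
import math
import networkx as nx

links = [  # (patch_a, patch_b, dispersal probability per step)
    ("A", "B", 0.8), ("B", "C", 0.5), ("A", "C", 0.3), ("C", "D", 0.9), ("B", "D", 0.2),
]

G = nx.Graph()
for u, v, p in links:
    G.add_edge(u, v, prob=p, cost=-math.log(p))

path = nx.dijkstra_path(G, "A", "D", weight="cost")
reliability = math.prod(G[u][v]["prob"] for u, v in zip(path, path[1:]))
print(path, f"reliability = {reliability:.3f}")
```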

Relevance: 20.00%

Abstract:

In recent years, the computer vision community has shown great interest in depth-based applications thanks to the performance and flexibility of the new generation of RGB-D imagery. In this paper, we present an efficient background subtraction algorithm based on the fusion of multiple region-based classifiers that processes the depth and color data provided by RGB-D cameras. Foreground objects are detected by combining a region-based foreground prediction (based on depth data) with different background models (based on a Mixture of Gaussians algorithm) that provide color and depth descriptions of the scene at pixel and region level. The information given by these modules is fused in a mixture-of-experts fashion to improve the foreground detection accuracy. The main contributions of the paper are the region-based models of both background and foreground, built from the depth and color data. The results obtained on different database sequences demonstrate that the proposed approach leads to higher detection accuracy with respect to existing state-of-the-art techniques.
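
As a minimal pixel-level illustration of fusing color and depth background models, the sketch below runs one OpenCV Mixture-of-Gaussians subtractor on each stream and combines the two masks with a simple rule. It does not implement the paper's region-based classifiers or mixture-of-experts fusion; the video file names are placeholders.

```python
# Minimal pixel-level sketch of fusing color and depth background models with
# OpenCV's MOG2 subtractor (the paper's method additionally works at region level
# and fuses the modules in a mixture-of-experts fashion). Paths are placeholders.
import cv2
import numpy as np

bg_color = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)
bg_depth = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=False)

cap_color = cv2.VideoCapture("color.avi")   # placeholder RGB stream
cap_depth = cv2.VideoCapture("depth.avi")   # placeholder depth stream (8-bit)

while True:
    ok1, frame = cap_color.read()
    ok2, depth = cap_depth.read()
    if not (ok1 and ok2):
        break
    mask_c = bg_color.apply(frame)
    mask_d = bg_depth.apply(depth)
    # Simple fusion rule: foreground if the depth model flags it, or the color model
    # flags it confidently (value 255; MOG2 marks shadows with 127).
    fused = np.where((mask_d == 255) | (mask_c == 255), 255, 0).astype(np.uint8)
    cv2.imshow("foreground", fused)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap_color.release()
cap_depth.release()
```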

Relevance: 20.00%

Abstract:

Validating modern oceanographic theories using models produced through stereo computer vision principles has recently emerged. Space-time (4-D) models of the ocean surface may be generated by stacking a series of 3-D reconstructions independently generated for each time instant or, in a more robust manner, by simultaneously processing several snapshots coherently in a true "4-D reconstruction." However, the accuracy of these computer-vision-generated models depends on the estimates of the camera parameters, which may be corrupted under the influence of natural factors such as wind and vibrations. Therefore, removing the unpredictable errors in the camera parameters is necessary for an accurate reconstruction. In this paper, we propose a novel algorithm that can jointly perform a 4-D reconstruction and correct the camera parameter errors introduced by external factors. The technique is founded upon variational optimization methods in order to benefit from their numerous advantages: continuity of the estimated surface in space and time, robustness, and accuracy. The performance of the proposed algorithm is tested using synthetic data produced through computer graphics techniques, based on which the errors of the camera parameters arising from natural factors can be simulated.

Relevance: 20.00%

Abstract:

This paper presents a solution to the problem of recognizing the gender of a human face from an image. We adopt a holistic approach by using the cropped and normalized texture of the face as input to a Naïve Bayes classifier. We first introduce the Class-Conditional Probabilistic Principal Component Analysis (CC-PPCA) technique to reduce the dimensionality of the classification feature vectors and enforce the independence assumption of the classifier. This new approach has the desirable property of providing a simple parametric model for the marginals. Moreover, this model can be estimated with very few data. In the experiments conducted we show that CC-PPCA achieves 90% classification accuracy, which is similar to the best results reported in the literature. The proposed method is very simple to train and implement.
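
As a hedged sketch of the class-conditional PPCA idea, the code below fits one probabilistic PCA model per class and classifies by maximum posterior (class log-likelihood plus log prior), using scikit-learn's PCA, whose score_samples method evaluates the probabilistic PCA likelihood. The demo data is synthetic, not face textures, and the details of the paper's CC-PPCA and Naïve Bayes pipeline are not reproduced.

```python
# Hedged sketch in the spirit of CC-PPCA: one probabilistic PCA model per class,
# classification by class log-likelihood plus log prior. Synthetic data only.
import numpy as np
from sklearn.decomposition import PCA

def fit_cc_ppca(X, y, n_components=10):
    models = {}
    for c in np.unique(y):
        Xc = X[y == c]
        pca = PCA(n_components=n_components)  # sklearn PCA exposes the PPCA likelihood
        pca.fit(Xc)
        models[c] = (pca, np.log(len(Xc) / len(X)))  # (per-class model, log prior)
    return models

def predict(models, X):
    scores = np.column_stack(
        [pca.score_samples(X) + log_prior for pca, log_prior in models.values()]
    )
    classes = np.array(list(models.keys()))
    return classes[np.argmax(scores, axis=1)]

# Tiny synthetic demo: two Gaussian "classes" in 50 dimensions.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(200, 50))
X1 = rng.normal(0.8, 1.0, size=(200, 50))
X, y = np.vstack([X0, X1]), np.array([0] * 200 + [1] * 200)
models = fit_cc_ppca(X, y)
print("training accuracy:", (predict(models, X) == y).mean())
```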