897 results for Quantitative information


Relevance: 100.00%

Abstract:

The paper introduces vulnerability and quantitative privacy.

Relevance: 100.00%

Abstract:

This paper uses Shannon's information theory to give a quantitative definition of information flow in systems that transform inputs to outputs. For deterministic systems, the definition is shown to specialise to a simpler form when the information source and the known inputs jointly determine the outputs. For this special case, the definition is related to the classical security condition of non-interference, and an equivalence is established between non-interference and independence of random variables. Quantitative information flow for deterministic systems is then presented in relational form. With this presentation, it is shown how relational parametricity can be used to derive upper and lower bounds on information flows through families of functions defined in the second-order lambda calculus.
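For deterministic systems, the non-interference/independence equivalence can be checked concretely: leakage reduces to the entropy of the output. The following sketch is our own illustration under a uniform prior, not code from the paper; the function name and toy programs are ours.

```python
import math
from collections import Counter

def shannon_leakage(secrets, program):
    """Shannon leakage I(S; O) for a deterministic program O = program(S)
    under a uniform prior on the secret S; for deterministic systems
    this reduces to the output entropy H(O)."""
    n = len(secrets)
    outputs = Counter(program(s) for s in secrets)
    return -sum((c / n) * math.log2(c / n) for c in outputs.values())

# Non-interference <=> zero leakage: a program whose output ignores the
# secret leaks 0 bits; a program that copies it leaks all log2(8) = 3.
constant = shannon_leakage(range(8), lambda s: 0)    # 0 bits
copy = shannon_leakage(range(8), lambda s: s)        # 3 bits
parity = shannon_leakage(range(8), lambda s: s % 2)  # 1 bit
```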

Relevance: 100.00%

Abstract:

Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output, given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel.

This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system. We explore this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second. In addition, we show how to decompose a channel into a cascade of channels.

We also establish fundamental new results about the recently proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone. These results are strengthened by the proof that composition refinement is the only way for one channel to never leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when one channel is always at least as secure as another from a leakage point of view.
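The cascade bound is easy to check numerically on small channel matrices. A minimal sketch of ours (assuming a uniform prior, under which min-entropy leakage is the log of the summed column maxima):

```python
import math

def min_entropy_leakage(C):
    """Min-entropy leakage of channel matrix C[s][o] under a uniform
    prior: log2 of the sum of the column maxima."""
    return math.log2(sum(max(row[o] for row in C)
                         for o in range(len(C[0]))))

def cascade(B, R):
    """Channel cascade, B then R: the matrix product B.R, feeding the
    output of B into the input of R."""
    return [[sum(B[s][m] * R[m][o] for m in range(len(R)))
             for o in range(len(R[0]))] for s in range(len(B))]

B = [[0.9, 0.1], [0.2, 0.8]]
R = [[0.7, 0.3], [0.4, 0.6]]
A = cascade(B, R)            # A is composition refined by B
# The cascade's leakage never exceeds that of its first stage.
leak_A, leak_B = min_entropy_leakage(A), min_entropy_leakage(B)
```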

Relevance: 100.00%

Abstract:

Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally-adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system.

This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce.

The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs, and hence on the maximum leakage. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns. It efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, in terms of both efficiency and accuracy, is shown through a number of case studies found in recent literature.
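The reduction from maximum leakage to output counting, and the two-bit-pattern over-approximation, can be sketched as follows. This is a toy illustration of ours: a real implementation derives the patterns with an automated prover rather than by enumerating all inputs.

```python
import math
from itertools import product

def low_bits(x):
    """Toy program under analysis: reveals only the low two bits."""
    return x & 0b0011

def exact_capacity(program, inputs):
    """For a deterministic program, maximum (capacity) leakage equals
    log2 of the number of distinct outputs."""
    return math.log2(len({program(x) for x in inputs}))

def two_bit_pattern_bound(program, inputs, bits):
    """Upper-bound the output count using only pairwise bit relations:
    record which (bit_i, bit_j) value pairs actually occur, then count
    the bit vectors consistent with every recorded pattern."""
    seen = [tuple((program(x) >> b) & 1 for b in range(bits))
            for x in inputs]
    patterns = {(i, j): {(v[i], v[j]) for v in seen}
                for i in range(bits) for j in range(i + 1, bits)}
    feasible = sum(all((v[i], v[j]) in allowed
                       for (i, j), allowed in patterns.items())
                   for v in product((0, 1), repeat=bits))
    return math.log2(feasible)

xs = range(16)
exact = exact_capacity(low_bits, xs)            # 2.0 bits
bound = two_bit_pattern_bound(low_bits, xs, 4)  # never below the exact value
```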

Relevance: 70.00%

Abstract:

Hedonic pricing techniques can be used to generate quantitative information useful to the project appraiser at various stages of the project cycle, most notably project formulation and investment appraisal. To illustrate, a hedonic pricing model is applied to marina berthing charges in England and Wales. The technique identifies the marina facilities that are reflected in the marina rental price, and the contribution of each key facility is expressed in monetary terms as its contribution to the overall rental price per foot.
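With a single binary facility attribute, the hedonic regression coefficient reduces to a difference of group means. A toy sketch with hypothetical data (the facility, prices, and figures are invented for illustration, not taken from the study):

```python
def implicit_price(prices, has_facility):
    """Hedonic (implicit) price of one binary facility attribute: the
    OLS dummy coefficient, i.e. mean price with the facility minus
    mean price without it."""
    with_f = [p for p, f in zip(prices, has_facility) if f]
    without = [p for p, f in zip(prices, has_facility) if not f]
    return sum(with_f) / len(with_f) - sum(without) / len(without)

# Hypothetical berthing charges (per foot) at eight marinas, four of
# which offer a boat-lift facility.
prices = [28, 31, 30, 29, 22, 24, 23, 21]
has_lift = [1, 1, 1, 1, 0, 0, 0, 0]
premium = implicit_price(prices, has_lift)   # 7.0 per foot
```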

Relevance: 70.00%

Abstract:

A method for obtaining quantitative information about electric field and charge distributions from proton imaging measurements of laser-induced plasmas is presented. A parameterised charge distribution is used as the target plasma. The deflection of a proton beam by the electric field of such a plasma is simulated numerically, together with the resulting proton density that would be recorded on a screen behind the plasma according to the proton imaging technique. The parameters of the charge distribution are obtained by a combination of linear regression and nonlinear fitting of the calculated proton density distribution to the measured optical density of a radiochromic film changed by proton exposure. It is shown that superpositions of spherical Gaussian charge distributions as the target plasma are sufficient to simulate various structures in proton imaging measurements, which makes this method very flexible.
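The linear-regression / nonlinear-fitting combination can be sketched in one dimension: the amplitude of a Gaussian profile is a linear parameter with a closed-form least-squares solution, while the width must be searched for. This is a toy stand-in of ours for the paper's multi-Gaussian fit; the data are synthetic.

```python
import math

def gaussian(x, sigma):
    return math.exp(-x * x / (2.0 * sigma * sigma))

def fit_profile(xs, ys, trial_sigmas):
    """Fit y ~ A * gaussian(x, sigma). For each trial width sigma the
    amplitude A is linear, so it has a closed-form least-squares
    solution; the best width is the one with the smallest residual."""
    best = None
    for sigma in trial_sigmas:
        g = [gaussian(x, sigma) for x in xs]
        amp = (sum(gi * yi for gi, yi in zip(g, ys))
               / sum(gi * gi for gi in g))
        resid = sum((yi - amp * gi) ** 2 for gi, yi in zip(g, ys))
        if best is None or resid < best[0]:
            best = (resid, amp, sigma)
    return best[1], best[2]

xs = [i / 10.0 for i in range(-20, 21)]
ys = [3.0 * gaussian(x, 0.5) for x in xs]    # noise-free synthetic profile
amp, sigma = fit_profile(xs, ys, [0.3, 0.4, 0.5, 0.6])
```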

Relevance: 70.00%

Abstract:

The dentate gyrus is one of only two regions of the mammalian brain where substantial neurogenesis occurs postnatally. However, detailed quantitative information about the postnatal structural maturation of the primate dentate gyrus is meager. We performed design-based, stereological studies of neuron number and size, and volume of the dentate gyrus layers in rhesus macaque monkeys (Macaca mulatta) of different postnatal ages. We found that about 40% of the total number of granule cells observed in mature 5-10-year-old macaque monkeys are added to the granule cell layer postnatally; 25% of these neurons are added within the first three postnatal months. Accordingly, cell proliferation and neurogenesis within the dentate gyrus peak within the first 3 months after birth and remain at an intermediate level between 3 months and at least 1 year of age. Although granule cell bodies undergo their largest increase in size during the first year of life, cell size and the volume of the three layers of the dentate gyrus (i.e. the molecular, granule cell and polymorphic layers) continue to increase beyond 1 year of age. Moreover, the different layers of the dentate gyrus exhibit distinct volumetric changes during postnatal development. Finally, we observe significant levels of cell proliferation, neurogenesis and cell death in the context of an overall stable number of granule cells in mature 5-10-year-old monkeys. These data identify an extended developmental period during which neurogenesis might be modulated to significantly impact the structure and function of the dentate gyrus in adulthood.

Relevance: 70.00%

Abstract:

The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. The determination of the local structure in terms of the configuration and conformation of the polymer chain and of the packing characteristics of the chain in the bulk material represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret due to the very large amount of information that they carry, that is, the large number of correlations present in the diffraction patterns. We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. The coupling of different computer modelling strategies with neutron scattering data over a wide Q range allows the extraction of detailed quantitative information on the structural arrangements of the materials of interest. Proceeding from modelling routes as diverse as force field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, which illustrate the need to attack the data analysis problem simultaneously from several fronts.
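A reverse Monte Carlo refinement can be sketched in miniature: propose random particle displacements and accept them with the Metropolis rule on the misfit to target data. Here sorted pair distances stand in for a measured structure factor; all numbers are illustrative, not from the paper.

```python
import math
import random

def chi_squared(positions, target):
    """Misfit between the model's sorted pair distances and target data."""
    dists = sorted(abs(a - b) for i, a in enumerate(positions)
                   for b in positions[i + 1:])
    return sum((d - t) ** 2 for d, t in zip(dists, target))

def rmc_step(positions, target, temperature):
    """One reverse Monte Carlo move: displace a random particle and
    accept the move with the Metropolis rule on the change in chi^2."""
    trial = positions[:]
    trial[random.randrange(len(trial))] += random.uniform(-0.1, 0.1)
    old = chi_squared(positions, target)
    new = chi_squared(trial, target)
    if new <= old or random.random() < math.exp((old - new) / temperature):
        return trial, new
    return positions, old

random.seed(1)
model = [0.0, 0.9, 2.1]      # toy 1-D arrangement of three particles
target = [1.0, 1.0, 2.0]     # target sorted pair distances
cost = chi_squared(model, target)
for _ in range(2000):
    model, cost = rmc_step(model, target, temperature=1e-6)
# After refinement the model reproduces the target data closely.
```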

Relevance: 70.00%

Abstract:

We report on the assembly of tumor necrosis factor receptor 1 (TNF-R1) prior to ligand activation and its ligand-induced reorganization at the cell membrane. We apply single-molecule localization microscopy to obtain quantitative information on receptor cluster sizes and copy numbers. Our data suggest a dimeric pre-assembly of TNF-R1, as well as receptor reorganization toward higher oligomeric states with stable populations comprising three to six TNF-R1. Our experimental results serve directly as input parameters for computational modeling of the ligand-receptor interaction, and the simulations corroborate the experimental finding of higher-order oligomeric states. This work is a first demonstration of how quantitative, super-resolution and advanced microscopy can be used for systems biology approaches at the single-molecule and single-cell level.
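Extracting cluster sizes and copy numbers from localization data can be sketched with a simple one-dimensional gap-threshold grouping. This is a toy stand-in of ours for the clustering actually used in single-molecule localization analysis; the coordinates are invented.

```python
def cluster_copy_numbers(positions, radius):
    """Group sorted 1-D localization coordinates into clusters: a new
    cluster starts whenever the gap to the previous localization
    exceeds `radius`; returns the copy number (size) of each cluster."""
    sizes, last = [], None
    for p in sorted(positions):
        if last is not None and p - last <= radius:
            sizes[-1] += 1
        else:
            sizes.append(1)
        last = p
    return sizes

# Invented localization coordinates (nm) from three receptor clusters:
# a dimer, a trimer and a hexamer, separated by gaps above the radius.
locs = [0, 8, 100, 105, 112, 300, 304, 309, 313, 318, 322]
sizes = cluster_copy_numbers(locs, radius=20)   # [2, 3, 6]
```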

Relevance: 70.00%

Abstract:

The vertical distribution of cloud cover has a significant impact on a large number of meteorological and climatic processes, so cloud top altitude and cloud geometrical thickness are essential parameters. Previous studies established the possibility of retrieving these parameters from multi-angular oxygen A-band measurements. Here we study and compare the performance of future instruments. The 3MI (Multi-angle, Multi-channel and Multi-polarization Imager) instrument developed by EUMETSAT, an extension of the POLDER/PARASOL instrument, and MSPI (Multi-angle Spectro-Polarimetric Imager), developed by NASA's Jet Propulsion Laboratory, will measure total and polarized light reflected by the Earth's atmosphere–surface system in several spectral bands (from UV to SWIR) and several viewing geometries. These instruments should provide opportunities to observe the links between cloud structures and the anisotropy of the solar radiation reflected into space. Specific algorithms will need to be developed in order to take advantage of the new capabilities of these instruments. However, prior to this effort, we need to understand, through a theoretical Shannon information content analysis, the limits and advantages of these new instruments for retrieving liquid and ice cloud properties and especially, in this study, the amount of information coming from the A-band channel on the cloud top altitude (CTOP) and geometrical thickness (CGT). We compare the information content of the 3MI A-band in two configurations with that of MSPI. Quantitative information content estimates show that the retrieval of CTOP with high accuracy is possible in almost all cases investigated. The retrieval of CGT is less easy but remains possible for optically thick clouds above a black surface, at least when CGT > 1–2 km.
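The flavour of a Shannon information content analysis can be shown in the scalar Gaussian case, where the posterior variance follows from the standard Bayesian update. This is a toy sketch of ours; the numbers are illustrative, not instrument values.

```python
import math

def shannon_information_content(prior_var, noise_var, jacobian):
    """H = 0.5 * log2(prior_var / posterior_var) for a scalar Gaussian
    retrieval with measurement y = jacobian * x + noise, using the
    standard Bayesian variance update."""
    posterior_var = 1.0 / (1.0 / prior_var + jacobian ** 2 / noise_var)
    return 0.5 * math.log2(prior_var / posterior_var)

# A channel ten times more sensitive to the retrieved quantity (larger
# Jacobian) carries more information about it.
weak = shannon_information_content(prior_var=4.0, noise_var=1.0, jacobian=0.1)
strong = shannon_information_content(prior_var=4.0, noise_var=1.0, jacobian=1.0)
```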

Relevance: 70.00%

Abstract:

The Aerodyne Time-of-Flight Aerosol Mass Spectrometer (ToF-AMS) is a further development of the Aerodyne aerosol mass spectrometer (Q-AMS), which is well characterised and in use worldwide. Both instruments use an aerodynamic lens, aerodynamic particle sizing, thermal vaporisation, and electron impact ionisation. In contrast to the Q-AMS, which analyses the ions with a quadrupole mass spectrometer, the ToF-AMS employs a time-of-flight mass spectrometer. In this work, laboratory experiments and field campaigns demonstrate that the ToF-AMS is suitable for quantitative measurement of the chemical composition of aerosol particles with high time and size resolution. In addition, a complete scheme for ToF-AMS data analysis is presented, developed to obtain quantitative and meaningful results from the recorded raw data of both field campaigns and laboratory experiments. This scheme is based on the characterisation experiments carried out in the course of this work. It covers the corrections that must be applied and the calibrations that must be performed in order to extract reliable results from the raw data. Considerable effort was also invested in the development of a reliable and user-friendly data analysis program, which can be used for automatic and systematic ToF-AMS data analysis and correction.

Relevance: 70.00%

Abstract:

A confocal imaging and image processing scheme is introduced to visualize and evaluate the spatial distribution of spectral information in tissue. The image data are recorded using a confocal laser-scanning microscope equipped with a detection unit that provides high spectral resolution. The processing scheme is based on spectral data, is less error-prone than intensity-based visualization and evaluation methods, and provides quantitative information on the composition of the sample. The method is tested and validated in the context of the development of dermal drug delivery systems, and a quantitative uptake indicator is introduced to compare the performance of different delivery systems. A drug penetration study was performed in vitro. The results show that the method is able to detect, visualize and measure spectral information in tissue. In the penetration study, the uptake efficiencies of different experimental setups could be discriminated and quantitatively described. The developed uptake indicator is a step towards quantitative assessment and, more generally, beyond pharmaceutical research, provides valuable information on tissue composition. It can potentially be used for clinical in vitro and in vivo applications.

Relevance: 60.00%

Abstract:

Expert panels have been used extensively in the development of the "Highway Safety Manual" to extract research information from highway safety experts. While the panels have been used to recommend agendas for new and continuing research, their primary role has been to develop accident modification factors—quantitative relationships between highway safety and various highway safety treatments. Because the expert panels derive quantitative information in a “qualitative” environment and because their findings can have significant impacts on highway safety investment decisions, the expert panel process should be described and critiqued. This paper is the first known written description and critique of the expert panel process and is intended to serve professionals wishing to conduct such panels.

Relevance: 60.00%

Abstract:

The interaction between climate change and building performance is cyclic and dynamic: each is, in essence, both the cause and the effect of the other. On one hand, buildings contribute significantly to the global warming process. On the other hand, climate change is also expected to affect many aspects of building performance. In this paper, the status of current research on the implications of climate change for the built environment is reviewed. It is found that although present research covers broad areas, it is generally limited to qualitative analyses. It is also highlighted that although reducing greenhouse gas emissions from the building sector is widely recognised as important, a complementary adaptation strategy that prepares buildings for a range of climate change scenarios is also necessary. Owing to the lack of a holistic approach to generating future hourly weather data, various ad hoc approaches have been used to generate individual key weather variables. This situation has seriously hindered the application of building simulation techniques to climate change impact studies, in particular their ability to provide quantitative information for policy and design development.