911 results for information flow properties


Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

Network building and the exchange of information by people within networks are crucial to the innovation process. Contrary to older models, in social networks the flow of information is noncontinuous and nonlinear, and critical barriers to information flow operate in a problematic manner. New models and new analytic tools are needed for these systems. This paper introduces the concept of virtual circuits and draws on recent concepts of network modelling and design to introduce a probabilistic switch theory that can be described using matrices. It can be used to model multi-step information flow between people within organisational networks, to provide formal definitions of efficient and balanced networks, and to describe distortion of information as it passes along human communication channels. The concept of multi-dimensional information space arises naturally from the use of matrices. The theory and the use of serial diagonal matrices have applications to organisational design and to the modelling of other systems. It is hypothesised that opinion leaders or creative individuals are more likely to emerge at information-rich nodes in networks. A mathematical definition of such nodes is developed; it does not invariably correspond to centrality as defined in early work on networks.
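
As a rough illustration of the matrix view (not the paper's specific switch formulation), a minimal sketch in Python, with all names and probabilities invented for the example:

    import numpy as np

    # One transmission step in a 3-person network: entry [i, j] is the assumed
    # probability that information held by person i ends up with person j.
    step = np.array([
        [0.1, 0.6, 0.3],
        [0.2, 0.2, 0.6],
        [0.5, 0.3, 0.2],
    ])
    assert np.allclose(step.sum(axis=1), 1.0)  # row-stochastic "switch" matrix

    # Multi-step flow: chaining k identical steps is a matrix power (a serial
    # model would multiply different per-step matrices instead).
    k = 3
    flow_k = np.linalg.matrix_power(step, k)

    # Crude "information richness" per node: probability mass arriving at each
    # node after k steps from a uniform start.
    richness = np.full(3, 1 / 3) @ flow_k
    print(flow_k)
    print(richness)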

Relevance: 100.00%

Abstract:

In disorders such as sleep apnea, sleep is fragmented by frequent EEG arousals (EEGA), as determined via changes in the sleep electroencephalogram. EEGA is a poorly understood, complicated phenomenon that is critically important in studying the mysteries of sleep. In this paper we study the information flow between the left and right hemispheres of the brain during EEGA as manifested through inter-hemispheric asynchrony (IHA) of the surface EEG. EEG data (using electrodes A1/C4 and A2/C3 of the international 10-20 system) were collected from 5 subjects undergoing routine polysomnography (PSG). The spectral correlation coefficient (R) was computed between EEG data from the two hemispheres for the delta-delta (0.5-4 Hz), theta-theta (4.1-8 Hz), alpha-alpha (8.1-12 Hz) and beta-beta (12.1-25 Hz) frequency bands during EEGA events. EEGA events were graded into 3 levels: (i) micro arousals (3-6 s), (ii) short arousals (6.1-10 s), and (iii) long arousals (10.1-15 s). Our results revealed that in the beta band, IHA increases above the baseline after the onset of EEGA and returns to the baseline after the conclusion of the event. The results also indicated that the duration of EEGA events has a direct influence on the onset of IHA. The latency (L) between the onset of arousals and IHA was found to be L = 2±0.5 s (micro arousals), 4±2.2 s (short arousals) and 6.5±3.6 s (long arousals).
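
A minimal sketch of how a band-wise spectral correlation between two EEG channels might be computed, assuming SciPy's Welch estimator and an invented sampling rate; the paper's exact estimator is not specified here:

    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed sampling rate, Hz
    BANDS = {"delta": (0.5, 4), "theta": (4.1, 8), "alpha": (8.1, 12), "beta": (12.1, 25)}

    def band_spectral_correlation(left, right, fs=FS):
        """Pearson correlation of the two channels' Welch spectra within each band."""
        f, p_left = welch(left, fs=fs, nperseg=fs * 2)
        _, p_right = welch(right, fs=fs, nperseg=fs * 2)
        out = {}
        for name, (lo, hi) in BANDS.items():
            sel = (f >= lo) & (f <= hi)
            out[name] = np.corrcoef(p_left[sel], p_right[sel])[0, 1]
        return out

    # Synthetic signals standing in for the A1/C4 and A2/C3 recordings.
    t = np.arange(0, 30, 1 / FS)
    left = np.sin(2 * np.pi * 18 * t) + 0.5 * np.random.randn(t.size)
    right = np.sin(2 * np.pi * 18 * t + 0.8) + 0.5 * np.random.randn(t.size)
    print(band_spectral_correlation(left, right))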

Relevance: 100.00%

Abstract:

The feasibility of using a small-scale avalanche tester to measure the flow properties of pharmaceutical lactose powders was examined. The tester reproduced the modes of behaviour observed in larger systems and showed a clear distinction between angular, free-flowing particles and more spherical particles with similar flow characteristics. Angular Lactohale LH100 particles showed slumping behaviour at a rotational frequency of 0.33 Hz, which disappeared at higher frequencies. A spherical lactose powder with a flow function similar to that of LH100 showed only rolling behaviour under the same conditions, as did the more cohesive powders LH200 and LH300. Further investigation of the LH100 data using fast Fourier transform analysis showed that the slumping frequency was one tenth of the rotational frequency.
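
A small illustrative sketch of the FFT step on a synthetic avalanche signal (the sampling rate and signal are assumptions, not the study's data):

    import numpy as np

    FS = 50.0             # assumed sampling rate of the tester output, Hz
    F_ROT = 0.33          # drum rotational frequency, Hz
    F_SLUMP = F_ROT / 10  # the reported 1/10th relationship

    t = np.arange(0, 300, 1 / FS)
    signal = np.sin(2 * np.pi * F_SLUMP * t) + 0.3 * np.random.randn(t.size)

    # Locate the dominant low-frequency component in the spectrum.
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(t.size, d=1 / FS)
    peak = freqs[np.argmax(spectrum)]
    print(f"dominant frequency ~ {peak:.3f} Hz (expected ~ {F_SLUMP:.3f} Hz)")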

Relevance: 100.00%

Abstract:

If we classify the variables in a program into various security levels, then a secure information flow analysis aims to verify statically that information in the program can flow only in ways consistent with the specified security levels. One well-studied approach is to formulate the rules of the secure information flow analysis as a type system. A major trend of recent research focuses on how to accommodate various sophisticated modern language features. However, this approach often leads to overly complicated and restrictive type systems, making them unfit for practical use. Also, problems essential to practical use, such as type inference and error reporting, have received little attention. This dissertation identified and solved major theoretical and practical hurdles to the application of secure information flow.

We adopted a minimalist approach to designing our language to ensure a simple, lenient type system. We started out with a small, simple imperative language and only added the features that we deemed most important for practical use. One language feature we addressed is arrays. Due to the various leaking channels associated with array operations, arrays have received complicated and restrictive typing rules in other secure languages. We presented a novel approach to lenient array operations, which leads to simple and lenient typing of arrays.

Type inference is necessary because a user is usually concerned only with the security types of the input/output variables of a program and would like all types for auxiliary variables to be inferred automatically. We presented a type inference algorithm B and proved its soundness and completeness. Moreover, algorithm B stays close to the program and the type system and therefore facilitates informative error reporting that is generated in a cascading fashion. Algorithm B and the error reporting have been implemented and tested.

Lastly, we presented a novel framework for developing applications that ensure user information privacy. In this framework, core computations are defined as code modules that involve input/output data from multiple parties. Secure flow policies are refined incrementally based on feedback from type checking and inference. Core computations interact with code modules from the involved parties only through well-defined interfaces. All code modules are digitally signed to ensure their authenticity and integrity.
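
A generic, textbook-style sketch of type-based secure information flow checking for a tiny imperative language (this is not algorithm B, and the command encoding is invented for illustration):

    LOW, HIGH = "L", "H"

    def expr_level(expr, env):
        """Level of an expression: the join (max) of its variables' levels."""
        if isinstance(expr, str):
            return env[expr]
        op, left, right = expr
        return HIGH if HIGH in (expr_level(left, env), expr_level(right, env)) else LOW

    def check(stmt, env, pc=LOW):
        """Reject assignments that move H data (or an H program counter) into L variables."""
        kind = stmt[0]
        if kind == "assign":
            _, var, expr = stmt
            if env[var] == LOW and (expr_level(expr, env) == HIGH or pc == HIGH):
                raise TypeError(f"illegal flow into low variable {var!r}")
        elif kind == "if":
            _, cond, then_branch, else_branch = stmt
            branch_pc = HIGH if expr_level(cond, env) == HIGH or pc == HIGH else LOW
            for s in then_branch + else_branch:
                check(s, env, branch_pc)
        elif kind == "seq":
            for s in stmt[1]:
                check(s, env, pc)

    env = {"h": HIGH, "l": LOW, "l2": LOW}
    check(("assign", "l", ("+", "l2", "l2")), env)            # fine: L := L
    try:
        check(("if", "h", [("assign", "l", "l2")], []), env)  # rejected: implicit flow
    except TypeError as e:
        print(e)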

Relevance: 100.00%

Abstract:

Type systems for secure information flow aim to prevent a program from leaking information from H (high) to L (low) variables. Traditionally, bisimulation has been the prevalent technique for proving the soundness of such systems. This work introduces a new proof technique based on stripping and fast simulation, and shows that it can be applied in a number of cases where bisimulation fails. We present a progressive development of this technique over a representative sample of languages, including a simple imperative language (the core theory), a multiprocessing nondeterministic language, a probabilistic language, and a language with cryptographic primitives. In the core theory we illustrate the key concepts of the technique in a basic setting. A fast low simulation in the context of transition systems is a binary relation in which simulating states can match the moves of simulated states while maintaining the equivalence of low variables; stripping is a function that removes high commands from programs. We show that secure information flow can be proved by arguing that the stripping relation is a fast low simulation. We then extend the core theory to an abstract distributed language under a nondeterministic scheduler. Next, we extend it to a probabilistic language with a random assignment command; we generalize fast simulation to the setting of discrete-time Markov chains, and prove approximate probabilistic noninterference. Finally, we introduce cryptographic primitives into the probabilistic language and prove computational noninterference, provided that the underlying encryption scheme is secure.
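
A simplified sketch of the stripping idea on a toy command syntax, with an assumed set of high variables; the paper's exact definition may differ:

    HIGH_VARS = {"h", "k"}  # assumed high-security variables

    def strip(cmd):
        """Return cmd with high commands removed; None means the command is dropped."""
        kind = cmd[0]
        if kind == "assign":
            _, var, _ = cmd
            return None if var in HIGH_VARS else cmd
        if kind == "seq":
            body = [c for c in (strip(c) for c in cmd[1]) if c is not None]
            return ("seq", body) if body else None
        if kind == "while":
            _, guard, body = cmd
            stripped_body = strip(body)
            return ("while", guard, stripped_body) if stripped_body else None
        return cmd

    prog = ("seq", [("assign", "h", 42), ("assign", "l", 1),
                    ("while", ("<", "l", 10), ("assign", "h", 0))])
    print(strip(prog))  # only the low assignment survives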

Relevance: 100.00%

Abstract:

Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel.

This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system, exploring this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second. In addition, we show how to decompose a channel into a cascade of channels.

We also establish fundamental new results about the recently proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone. These results are strengthened by the proof that composition refinement is the only way for one channel never to leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when one channel is always at least as secure as another from a leakage point of view.
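
A small sketch of the standard vulnerability-based definition of min-entropy leakage and of channel cascading as matrix multiplication, with illustrative numbers only:

    import numpy as np

    def min_entropy_leakage(prior, channel):
        """log2(posterior vulnerability / prior vulnerability)."""
        prior_v = prior.max()
        post_v = (prior[:, None] * channel).max(axis=0).sum()
        return np.log2(post_v / prior_v)

    prior = np.full(4, 0.25)                 # uniform prior on 4 secrets
    B = np.array([[0.8, 0.2, 0.0],           # channel B: rows = inputs, columns = outputs
                  [0.1, 0.8, 0.1],
                  [0.0, 0.2, 0.8],
                  [0.1, 0.1, 0.8]])
    R = np.array([[0.9, 0.1],                # post-processing channel R
                  [0.5, 0.5],
                  [0.1, 0.9]])
    A = B @ R                                # cascade: run B, feed its output into R

    # The cascade never leaks more than its first stage.
    print(min_entropy_leakage(prior, A) <= min_entropy_leakage(prior, B) + 1e-12)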

Relevance: 100.00%

Abstract:

Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system.

This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce.

The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs, and hence a bound on the maximum leakage. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns; it efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
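
For deterministic programs, the reduction to output counting can be illustrated directly by brute force (the dissertation instead bounds the count symbolically via two-bit patterns):

    import math

    def program(secret: int) -> int:
        """Toy deterministic program that reveals the top 2 bits of an 8-bit secret."""
        return secret >> 6

    outputs = {program(s) for s in range(256)}    # enumerate all 8-bit inputs
    print(len(outputs), math.log2(len(outputs)))  # 4 outputs -> at most 2.0 bits leaked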

Relevance: 100.00%

Abstract:

The goal was to understand, document and model how information currently flows internally in the largest dairy organization in Finland. The organization has undergone radical changes in recent years due to the economic sanctions between the European Union and Russia; its ultimate goal is therefore to continue its growth by managing its sales process more efficiently. The thesis consists of a literature review and an empirical part. The literature review covers knowledge management and process modeling theories. First, the knowledge management part discusses how data, information and knowledge are exchanged in the process; knowledge management models and processes describe how knowledge is created, exchanged and managed in an organization. Secondly, the process modeling part addresses the visualization of information flow by discussing modeling approaches and presenting different methods and techniques. Finally, the procedure for documenting the process is presented. A constructive research approach was then used to identify problems and bottlenecks in the process, and possible solutions were proposed on that basis. The empirical part of the study is based on 37 interviews, the organization's internal data sources and the theoretical framework. The acquired data and information were used to document and to model the sales process in question with a flowchart diagram. Results were obtained through the construction of the flowchart diagram and analysis of the documentation, and the answers to the research questions are derived from both the empirical and theoretical parts. In total, 14 problems and two bottlenecks were identified in the process. The most important problems relate to the lack of a standardized approach to information sharing, insufficient utilization of information technology tools, and the lack of systematic documentation. The bottlenecks are caused by the alarming number of changes made to files after their deadlines.

Relevance: 100.00%

Abstract:

The subiculum, considered to be the output structure of the hippocampus, modulates information flow from the hippocampus to various cortical and subcortical areas such as the nucleus accumbens, lateral septal region, thalamus, nucleus gelatinosus, medial nucleus and mammillary nuclei. Tonic inhibitory current plays an important role in neuronal physiology and pathophysiology by modulating the electrophysiological properties of neurons. While the alterations of various electrical properties due to tonic inhibition have been studied in neurons from different regions, its influence on intrinsic subthreshold resonance in pyramidal excitatory neurons expressing hyperpolarization-activated cyclic nucleotide-gated (HCN) channels is not known. Using pharmacological agents, we show the involvement of α5βγ GABA(A) receptors in the picrotoxin-sensitive tonic current in subicular pyramidal neurons. We further investigated the contribution of the tonic conductance to the regulation of subthreshold electrophysiological properties using current-clamp and dynamic-clamp experiments. We demonstrate that tonic GABAergic inhibition can actively modulate subthreshold properties, including the resonance due to HCN channels, which can potentially alter the response dynamics of subicular pyramidal neurons in an oscillating neuronal network.
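
As a heuristic illustration only (not the paper's model), subthreshold resonance from an HCN-like current can be sketched with a linearized membrane containing a phenomenological "inductive" branch, with tonic inhibition approximated as added leak conductance; all parameter values are assumptions:

    import numpy as np

    def impedance(freq_hz, g_leak=10e-9, c_m=150e-12, r_h=80e6, l_h=1.0e7):
        """|Z(f)| of a leak conductance, membrane capacitance, and an HCN-like
        branch (resistance r_h in series with phenomenological inductance l_h)."""
        w = 2 * np.pi * freq_hz
        y = g_leak + 1j * w * c_m + 1.0 / (r_h + 1j * w * l_h)
        return np.abs(1.0 / y)

    freqs = np.linspace(0.5, 20, 400)
    control = impedance(freqs)
    with_tonic = impedance(freqs, g_leak=10e-9 + 5e-9)  # extra tonic leak conductance

    print("resonance (control):     %.1f Hz" % freqs[np.argmax(control)])
    print("resonance (tonic inhib): %.1f Hz" % freqs[np.argmax(with_tonic)])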