10 results for chaotic encryption
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The variables involved in the equations that describe realistic synaptic dynamics always vary in a limited range. Their boundedness makes the synapses forgetful, not through the mere passage of time, but because new experiences overwrite old memories. The forgetting rate depends on how many synapses are modified by each new experience: many changes mean fast learning and fast forgetting, whereas few changes mean slow learning and long memory retention. Reducing the average number of modified synapses can extend the memory span, at the price of a reduced amount of information stored when a new experience is memorized. Every trick that slows down the learning process in a smart way can improve memory performance. We review some of the tricks that make it possible to elude fast forgetting (oblivion). They are based on the stochastic selection of the synapses whose modifications are actually consolidated following each new experience. In practice, only a randomly selected, small fraction of the synapses eligible for an update are actually modified. This makes it possible to acquire the amount of information necessary to retrieve the memory without compromising the retention of old experiences. The fraction of modified synapses can be further reduced in a smart way by changing synapses only when it is really necessary, i.e. when the post-synaptic neuron does not respond as desired. Finally, we show that such a stochastic selection emerges naturally from spike-driven synaptic dynamics that read noisy pre- and post-synaptic neural activities. These activities can actually be generated by a chaotic system.
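A minimal sketch of the stochastic-selection idea described above (a toy simulation under assumed parameters, not the authors' model): a pool of binary synapses is driven by a sequence of random patterns, and at each presentation only a random fraction q of the synapses that would have to change is actually consolidated. The run illustrates the trade-off stated in the abstract: q = 1 learns instantly but is erased by the next experience, a moderately small q extends the lifetime of the tracked memory, and a q that is too small never stores enough signal to be retrieved.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100_000                     # number of binary synapses (assumed)
T = 300                         # unrelated experiences stored afterwards
noise = 0.5 / np.sqrt(N)        # chance-level fluctuation of the overlap

# q = fraction of eligible synapses that is actually consolidated per experience.
for q in [1.0, 0.1, 0.02, 0.002]:
    synapses = rng.integers(0, 2, size=N)
    tracked = rng.integers(0, 2, size=N)        # the memory whose fate we follow

    lifetime = 0
    for t in range(T + 1):
        pattern = tracked if t == 0 else rng.integers(0, 2, size=N)
        eligible = synapses != pattern                 # synapses that would flip
        flip = eligible & (rng.random(N) < q)          # stochastic selection
        synapses[flip] = pattern[flip]

        # Memory trace = overlap with the tracked pattern above chance (0.5).
        trace = np.mean(synapses == tracked) - 0.5
        if t > 0 and trace < 2 * noise:                # trace no longer detectable
            break
        lifetime = t

    print(f"q={q:6.3f}  trace after storage ~ {q / 2:.4f}  lifetime ~ {lifetime} patterns")
```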
Abstract:
Our approaches to the use of EEG studies for understanding the pathogenesis of schizophrenic symptoms are presented. The basic assumptions of a heuristic, multifactorial model of the psychobiological brain mechanisms underlying the organization of normal behavior are described and used to formulate and test hypotheses about the pathogenesis of schizophrenic behavior using EEG measures. Results from our studies on EEG activity and EEG reactivity (= EEG components of a memory-driven, adaptive, non-unitary orienting response), analyzed with spectral parameters and "chaotic" dimensionality (correlation dimension), are summarized. Both analysis procedures showed a deviant brain functional organization in never-treated first-episode schizophrenia which, within the framework of the model, suggests as a common denominator for the pathogenesis of the symptoms a deviation of working memory that is functional rather than structural in nature.
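A minimal sketch of how the correlation dimension mentioned above is commonly estimated from a scalar (EEG-like) time series, using the Grassberger-Procaccia procedure; the embedding dimension, delay, and radii below are illustrative choices, not the study's settings.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(x, dim=3, tau=25, n_radii=10):
    """Grassberger-Procaccia estimate: delay-embed the series, compute the
    correlation sum C(r), and fit the slope of log C(r) against log r."""
    # Delay embedding: each row is one point in the reconstructed phase space.
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    # Pairwise distances between embedded points.
    d = pdist(emb)

    # Radii spanning part of the scaling region of the distance distribution.
    radii = np.logspace(np.log10(np.percentile(d, 1)),
                        np.log10(np.percentile(d, 50)), n_radii)

    # Correlation sum C(r): fraction of point pairs closer than r.
    C = np.array([np.mean(d < r) for r in radii])

    # The correlation dimension is the slope in the log-log plot.
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# Toy check: points sampled from a sine wave lie on a closed curve,
# so the estimated dimension should come out close to 1.
t = np.linspace(0, 40 * np.pi, 2000)
print(f"estimated correlation dimension: {correlation_dimension(np.sin(t)):.2f}")
```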
Abstract:
Although vascular endothelial growth factor (VEGF) has been described as a potent angiogenic stimulus, its application in therapy remains difficult: blood vessels formed by exposure to VEGF tend to be malformed and leaky. In nature, the principal form of VEGF possesses a binding site for ECM components that maintain it in the immobilized state until released by local cellular enzymatic activity. In this study, we present an engineered variant form of VEGF, alpha2PI1-8-VEGF121, that mimics this concept of matrix-binding and cell-mediated release by local cell-associated enzymatic activity, working in the surgically-relevant biological matrix fibrin. We show that matrix-conjugated alpha2PI1-8-VEGF121 is protected from clearance, contrary to native VEGF121 mixed into fibrin, which was completely released as a passive diffusive burst. Grafting studies on the embryonic chicken chorioallantoic membrane (CAM) and in adult mice were performed to assess and compare the quantity and quality of neovasculature induced in response to fibrin implants formulated with matrix-bound alpha2PI1-8-VEGF121 or native diffusible VEGF121. Our CAM measurements demonstrated that cell-demanded release of alpha2PI1-8-VEGF121 increases the formation of new arterial and venous branches, whereas exposure to passively released wild-type VEGF121 primarily induced chaotic changes within the capillary plexus. Specifically, our analyses at several levels, from endothelial cell morphology and endothelial interactions with periendothelial cells, to vessel branching and network organization, revealed that alpha2PI1-8-VEGF121 induces vessel formation more potently than native VEGF121 and that those vessels possess more normal morphologies at the light microscopic and ultrastructural level. Permeability studies in mice validated that vessels induced by alpha2PI1-8-VEGF121 do not leak. In conclusion, cell-demanded release of engineered VEGF121 from fibrin implants may present a therapeutically safe and practical modality to induce local angiogenesis.
Abstract:
The Earth’s climate system is driven by a complex interplay of internal chaotic dynamics and natural and anthropogenic external forcing. Recent instrumental data have shown a remarkable degree of asynchronicity between Northern Hemisphere and Southern Hemisphere temperature fluctuations, thereby questioning the relative importance of internal versus external drivers of past as well as future climate variability [1, 2, 3]. However, large-scale temperature reconstructions for the past millennium have focused on the Northern Hemisphere [4, 5], limiting empirical assessments of inter-hemispheric variability on multi-decadal to centennial timescales. Here, we introduce a new millennial ensemble reconstruction of annually resolved temperature variations for the Southern Hemisphere based on an unprecedented network of terrestrial and oceanic palaeoclimate proxy records. In conjunction with an independent Northern Hemisphere temperature reconstruction ensemble [5], this record reveals an extended cold period (1594–1677) in both hemispheres but no globally coherent warm phase during the pre-industrial (1000–1850) era. The current (post-1974) warm phase is the only period of the past millennium where both hemispheres are likely to have experienced contemporaneous warm extremes. Our analysis of inter-hemispheric temperature variability in an ensemble of climate model simulations for the past millennium suggests that models tend to overemphasize Northern Hemisphere–Southern Hemisphere synchronicity by underestimating the role of internal ocean–atmosphere dynamics, particularly in the ocean-dominated Southern Hemisphere. Our results imply that climate system predictability on decadal to century timescales may be lower than expected based on assessments of external climate forcing and Northern Hemisphere temperature variations [5, 6] alone.
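As a rough illustration of how inter-hemispheric synchronicity on multi-decadal timescales can be quantified from ensembles: smooth each hemispheric series, then average the NH-SH correlation over all pairs of ensemble members. The sketch below runs on synthetic toy series (not the reconstruction or model data of the study) and shows that stronger independent internal variability lowers the synchronicity, as argued above.

```python
import numpy as np

rng = np.random.default_rng(0)

def decadal_smooth(x, window=30):
    """30-yr running mean to focus on multi-decadal timescales."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

def hemispheric_synchronicity(nh_ens, sh_ens, window=30):
    """Mean correlation between smoothed NH and SH series over all
    pairs of ensemble members (a simple synchronicity metric)."""
    cors = [np.corrcoef(decadal_smooth(nh, window), decadal_smooth(sh, window))[0, 1]
            for nh in nh_ens for sh in sh_ens]
    return float(np.mean(cors))

years = 1000
common_forcing = np.cumsum(rng.normal(0, 0.02, years))   # shared external signal

def make_ensemble(n_members, internal_sd):
    # Each member = common external forcing + independent internal variability.
    return [common_forcing + np.cumsum(rng.normal(0, internal_sd, years))
            for _ in range(n_members)]

# Weak internal variability -> high NH-SH synchronicity; strong internal
# (ocean-atmosphere) variability -> low synchronicity.
print(hemispheric_synchronicity(make_ensemble(5, 0.01), make_ensemble(5, 0.01)))
print(hemispheric_synchronicity(make_ensemble(5, 0.05), make_ensemble(5, 0.05)))
```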
Abstract:
The traditional Newton method for solving nonlinear operator equations in Banach spaces is discussed within the context of the continuous Newton method. This setting makes it possible to interpret the Newton method as a discrete dynamical system and thereby to cast it in the framework of an adaptive step size control procedure. In so doing, our goal is to reduce the chaotic behavior of the original method without losing its quadratic convergence property close to the roots. The performance of the modified scheme is illustrated with various examples from algebraic and differential equations.
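An illustrative sketch of the idea summarized above; the specific residual-based step control below is an assumption, not the authors' scheme. The Newton iteration is viewed as an Euler step of size h on the continuous Newton flow x'(t) = -J(x)^{-1} F(x); h is halved whenever the residual would grow and allowed to return to 1 near a root, so quadratic convergence is preserved.

```python
import numpy as np

def damped_newton(F, J, x0, h0=1.0, tol=1e-12, max_iter=100):
    """Newton's method as explicit Euler on the Newton flow x' = -J(x)^{-1} F(x),
    with a simple residual-based step-size control; h = 1 reproduces the
    classical, quadratically convergent iteration."""
    x, h = np.atleast_1d(np.asarray(x0, dtype=float)), h0
    for _ in range(max_iter):
        r = np.atleast_1d(F(x))
        if np.linalg.norm(r) < tol:
            break
        step = np.linalg.solve(np.atleast_2d(J(x)), r)
        while h > 1e-8:
            x_trial = x - h * step
            if np.linalg.norm(F(x_trial)) < np.linalg.norm(r):
                x, h = x_trial, min(1.0, 2 * h)   # accept and cautiously enlarge h
                break
            h *= 0.5                              # reject and damp the step
    return x

# Classical Newton for f(x) = arctan(x) diverges from x0 = 2;
# the damped variant converges to the root at 0.
f = lambda x: np.arctan(x)
df = lambda x: 1.0 / (1.0 + x**2)
print(damped_newton(f, df, x0=2.0))
```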
Abstract:
BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, such as a social security number, is not available, or non-unique person-identifiable information, such as names, is privacy protected and cannot be accessed. A solution to protect privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely if the plain names differ by only a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS In this Privacy Preserving Probabilistic Record Linkage method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed to create these templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person-identifiable information and plain non-sensitive variables. RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but for any setting with similar challenges.
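A minimal sketch of the Bloom-filter encryption step described above (illustrative only; the secret key, filter length, number of hash functions, and bigram scheme below are assumptions, not the P3RL parameters): each name is split into character bigrams, every bigram is hashed with several keyed hash functions into a fixed-length set of bit positions, and the linkage center can then compute a Dice similarity between two encodings without ever seeing the plain names.

```python
import hashlib
import hmac

FILTER_BITS = 1000                   # length of the Bloom filter (illustrative)
NUM_HASHES = 15                      # hash functions per bigram (illustrative)
SECRET_KEY = b"shared-site-secret"   # agreed between the data-holding sites

def bigrams(name: str):
    """Split a name into padded character 2-grams, e.g. 'ann' -> _a, an, nn, n_."""
    padded = f"_{name.lower().strip()}_"
    return [padded[i:i + 2] for i in range(len(padded) - 1)]

def bloom_encode(name: str):
    """Map every bigram to NUM_HASHES bit positions using keyed HMAC hashes."""
    bits = set()
    for gram in bigrams(name):
        for k in range(NUM_HASHES):
            digest = hmac.new(SECRET_KEY, f"{k}:{gram}".encode(), hashlib.sha256)
            bits.add(int(digest.hexdigest(), 16) % FILTER_BITS)
    return bits

def dice_similarity(a, b):
    """Dice coefficient on the set bits: 2|A ∩ B| / (|A| + |B|)."""
    return 2 * len(a & b) / (len(a) + len(b))

# Similar names (e.g. a spelling variant) keep a high similarity; unrelated names do not.
print(dice_similarity(bloom_encode("Meier"), bloom_encode("Meyer")))    # high
print(dice_similarity(bloom_encode("Meier"), bloom_encode("Schmidt")))  # low
```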
Abstract:
PURPOSE The implementation of genomic-based medicine is hindered by unresolved questions regarding data privacy and delivery of interpreted results to health-care practitioners. We used DNA-based prediction of HIV-related outcomes as a model to explore critical issues in clinical genomics. METHODS We genotyped 4,149 markers in HIV-positive individuals. Variants allowed for prediction of 17 traits relevant to HIV medical care, inference of patient ancestry, and imputation of human leukocyte antigen (HLA) types. Genetic data were processed under a privacy-preserving framework using homomorphic encryption, and clinical reports describing potentially actionable results were delivered to health-care providers. RESULTS A total of 230 patients were included in the study. We demonstrated the feasibility of encrypting a large number of genetic markers, inferring patient ancestry, computing monogenic and polygenic trait risks, and reporting results under privacy-preserving conditions. The average execution time of a multimarker test on encrypted data was 865 ms on a standard computer. The proportion of tests returning potentially actionable genetic results ranged from 0 to 54%. CONCLUSIONS The model of implementation presented herein informs on strategies to deliver genomic test results for clinical care. Data encryption to ensure privacy helps to build patient trust, a key requirement on the road to genomic-based medicine. Genetics in Medicine advance online publication, 14 January 2016; doi:10.1038/gim.2015.167.
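The study evaluated multimarker tests directly on homomorphically encrypted genotypes. The sketch below illustrates that general pattern only; the open-source `phe` Paillier package, the client/server split, and the toy marker weights are assumptions, not the study's actual cryptosystem or pipeline. The client encrypts allele counts, the server computes a weighted score on ciphertexts using only homomorphic addition and scalar multiplication, and only the client can decrypt the result.

```python
# pip install phe
from phe import paillier

# Client side: generate a key pair and encrypt genotype dosages (0, 1 or 2
# copies of the risk allele at each marker). Only the client holds the
# private key.
public_key, private_key = paillier.generate_paillier_keypair()
genotypes = [0, 2, 1, 1, 0, 2]                       # toy dosages for 6 markers
encrypted = [public_key.encrypt(g) for g in genotypes]

# Server side: per-marker effect weights are public; the weighted sum
# (a toy polygenic score) is computed entirely on ciphertexts.
weights = [0.12, 0.40, -0.05, 0.22, 0.08, 0.31]
weighted = [e * w for e, w in zip(encrypted, weights)]   # ciphertext * public scalar
encrypted_score = weighted[0]
for term in weighted[1:]:
    encrypted_score = encrypted_score + term             # ciphertext + ciphertext

# Client side: decrypt the aggregate score; the server never saw the genotypes.
score = private_key.decrypt(encrypted_score)
print(f"polygenic risk score: {score:.3f}")              # matches the plaintext sum (1.59)
```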
Abstract:
Throughout the last millennium, mankind was affected by prolonged deviations from the climate mean state. While periods like the Maunder Minimum in the 17th century have been assessed in greater detail, earlier cold periods such as the 15th century have received much less attention owing to the sparse information available. Based on new evidence from different sources ranging from proxy archives to model simulations, it is now possible to provide an end-to-end assessment of the climate state during an exceptionally cold period in the 15th century, the role of internal, unforced climate variability and external forcing in shaping these extreme climatic conditions, and the impacts on and responses of medieval society in Central Europe. Climate reconstructions from a multitude of natural and human archives indicate that, during winter, the period of the early Spörer Minimum (1431–1440 CE) was the coldest decade in Central Europe in the 15th century. The particularly cold winters and normal but wet summers resulted in a strong seasonal cycle that challenged food production and led to increasing food prices, a subsistence crisis, and a famine in parts of Europe. As a consequence, authorities implemented adaptation measures, such as the installation of grain storage capacities, in order to be prepared for future events. The 15th century is characterised by a grand solar minimum and enhanced volcanic activity, both of which imply a reduction of seasonality. Climate model simulations show that periods with cold winters and strong seasonality are associated with internal climate variability rather than external forcing. Accordingly, it is hypothesised that the reconstructed extreme climatic conditions during this decade occurred by chance, reflecting the partly chaotic internal variability within the climate system.
Abstract:
Digital Rights Management Systems (DRMS) are seen by content providers as the appropriate tool to fight piracy on the one hand and to monetize their assets on the other. Although these systems claim to be very powerful and include multiple protection technologies, there is a lack of understanding about how such systems are currently being implemented and used by content providers. The aim of this paper is twofold. First, it provides a theoretical basis by briefly presenting the seven core protection technologies of a DRMS. Second, it provides empirical evidence that these seven protection technologies are the most commonly used, and it evaluates to what extent they are being used within the music and print industries. It concludes that the three main technologies are encryption, passwords, and payment systems. However, there are industry differences in the number of protection technologies used, the requirements for a DRMS, the required investment, and the perceived success of DRMS in fighting piracy.
Abstract:
Technology advances in hardware, software and IP networks such as the Internet or peer-to-peer file-sharing systems are threatening the music business. The result has been an increasing number of illegal copies available online as well as offline. With the emergence of digital rights management systems (DRMS), the music industry seems to have found the appropriate tool to simultaneously fight piracy and monetize its assets. Although these systems are very powerful and include multiple technologies to prevent piracy, it is as yet unknown to what extent such systems are currently being used by content providers. We provide empirical analyses, results, and conclusions related to digital rights management systems and the protection of digital content in the music industry. Our analysis shows that most content providers protect their digital content through a variety of technologies such as passwords or encryption. However, each protection technology has its own specific goal, and not all prevent piracy. The majority of the respondents are satisfied with their current protection but want to reinforce it in the future for fear of increasing piracy. Surprisingly, although encryption is seen as the core DRM technology, only a few companies are currently using it. Finally, half of the respondents do not believe in the success of DRMS and their ability to reduce piracy.