972 results for pseudo-random number generator
Abstract:
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography, so the CA were explored in search of this "chaos" property. Accordingly, the manuscript concentrates on tests such as the Lyapunov exponent, entropy and the Hamming distance to measure chaos in CA, as well as on statistical analyses with the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. The "chaos" property of CA is therefore a good reason to employ them in cryptography, along with their simplicity, low implementation cost and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
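The idea of using a Life-like CA as a PRNG can be sketched as follows. The snippet uses the standard Game of Life rule B3/S23 on a toroidal grid and a parity-based bit extractor; both are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def step(grid):
    """One step of Conway's Game of Life (rule B3/S23), a Life-like CA,
    on a toroidal grid of 0/1 cells."""
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    alive = grid.astype(bool)
    return ((n == 3) | (alive & (n == 2))).astype(np.uint8)

def ca_prng_bits(seed, shape=(32, 32), warmup=16, nbits=128):
    """Iterate the CA from a seeded random soup and harvest one bit per step
    (here: the parity of the live-cell count, an illustrative extractor)."""
    rng = np.random.default_rng(seed)
    grid = rng.integers(0, 2, size=shape, dtype=np.uint8)
    for _ in range(warmup):
        grid = step(grid)
    bits = []
    for _ in range(nbits):
        grid = step(grid)
        bits.append(int(grid.sum()) & 1)
    return bits
```

Statistical quality of the resulting stream would then be assessed with batteries such as DIEHARD or ENT, as in the paper.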
Abstract:
Active head turns to the left and right have recently been shown to influence numerical cognition by shifting attention along the mental number line. In the present study, we found that passive whole-body motion also influences numerical cognition. In a random-number generation task (Experiment 1), leftward and downward displacement of participants facilitated the generation of small numbers, whereas rightward and upward displacement facilitated the generation of large numbers. Influences of leftward and rightward motion were also found for the processing of auditorily presented numbers in a magnitude-judgment task (Experiment 2). Additionally, we investigated the reverse effect of the number-space association (Experiment 3): participants were displaced leftward or rightward and asked to detect the motion direction as fast as possible while small or large numbers were presented auditorily. When motion detection was difficult, leftward motion was detected faster when small numbers were heard, and rightward motion when large numbers were heard. We provide new evidence that bottom-up vestibular activation is sufficient to interact with the higher-order spatial representation underlying numerical cognition. The results show that action planning or motor activity is not necessary to influence spatial attention. Moreover, our results suggest that self-motion perception and numerical cognition can mutually influence each other.
Abstract:
The objective of this final-year project (PFC) is the design and implementation of an application that works as an oscilloscope, spectrum analyzer and virtual signal generator, all within the same application. Through a data acquisition card, the user can take samples of real-world signals (analog domain) to generate data that can be manipulated by a computer (digital domain). The same card can also generate basic signals such as sine, square and sawtooth waves, and further functionality has been added to generate frequency-modulated signals, chirp signals (commonly used in sonar and radar applications as well as in optical transmission) and PRN signals (pseudo-random noise consisting of a deterministic pulse sequence that repeats every period, commonly used in GPS receivers), as well as widely known signals such as Gaussian white noise and uniform white noise. The application displays the acquired signals in detail and analyzes them in ways selected by the user: windowing with the most common window types, frequency response and the Fourier transform are examples of the available analyses. The configuration is chosen by the user through friendly and attractive displays and panels.
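The signal types listed above can be generated in a few lines; the NumPy sketch below uses arbitrary values for the sample rate, chip count and seed rather than the project's settings.

```python
import numpy as np

fs = 10_000                      # sample rate in Hz (illustrative value)
t = np.arange(0, 1.0, 1 / fs)    # one second of samples (T = 1 s)

# Linear chirp: instantaneous frequency sweeps from f0 to f1 over the second.
f0, f1 = 100.0, 2000.0
chirp = np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / 2))

# PRN: a deterministic pseudo-random +/-1 chip sequence that repeats each period.
rng = np.random.default_rng(42)               # fixed seed -> deterministic sequence
period = rng.choice([-1.0, 1.0], size=1023)   # one period of 1023 chips
prn = np.tile(period, 10)                     # repeat the period 10 times

# White noise: Gaussian and uniform variants.
gaussian_noise = rng.normal(0.0, 1.0, size=t.size)
uniform_noise = rng.uniform(-1.0, 1.0, size=t.size)
```

The windowing and Fourier-transform analyses mentioned in the abstract would then operate on arrays like these.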
Abstract:
The characteristics of the power-line communication (PLC) channel are difficult to model due to the heterogeneity of the networks and the lack of common wiring practices. To obtain the full variability of the PLC channel, random channel generators are of great importance for the design and testing of communication algorithms. In this respect, we propose a random channel generator that is based on the top-down approach. Basically, we describe the multipath propagation and the coupling effects with an analytical model. We introduce the variability into a restricted set of parameters and, finally, we fit the model to a set of measured channels. The proposed model enables a closed-form description of both the mean path-loss profile and the statistical correlation function of the channel frequency response. As an example of application, we apply the procedure to a set of in-home measured channels in the band 2-100 MHz whose statistics are available in the literature. The measured channels are divided into nine classes according to their channel capacity. We provide the parameters for the random generation of channels for all nine classes, and we show that the results are consistent with the experimental ones. Finally, we merge the classes to capture the entire heterogeneity of in-home PLC channels. In detail, we introduce the class occurrence probability, and we present a random channel generator that targets the ensemble of all nine classes. The statistics of the composite set of channels are also studied, and they are compared to the results of experimental measurement campaigns in the literature.
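The top-down procedure can be sketched generically: sample a channel class by its occurrence probability, then draw a multipath frequency response with random path gains and lengths and frequency-dependent attenuation. The model form below is the generic multipath PLC response; all parameter values and class probabilities are illustrative placeholders, not the fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Class occurrence probabilities for nine classes (illustrative, not fitted).
class_probs = np.array([0.05, 0.10, 0.10, 0.15, 0.15, 0.15, 0.15, 0.10, 0.05])

def draw_channel(f, n_paths=10, a0=1e-3, a1=1e-10, k=1.0, v=1.5e8):
    """One random multipath frequency response: random path gains/lengths,
    frequency-dependent cable attenuation and propagation-delay phase."""
    d = rng.uniform(5, 300, n_paths)                  # path lengths in metres
    g = rng.uniform(-1, 1, n_paths)                   # real path gains
    att = np.exp(-(a0 + a1 * f[:, None] ** k) * d)    # attenuation per path
    phase = np.exp(-2j * np.pi * f[:, None] * d / v)  # delay term per path
    return (g * att * phase).sum(axis=1)

f = np.linspace(2e6, 100e6, 500)                   # the 2-100 MHz band
cls = rng.choice(len(class_probs), p=class_probs)  # sample a channel class
H = draw_channel(f)                                # one channel realisation
```

In the paper's procedure, the parameters of `draw_channel` would differ per class and be fitted to the measured statistics.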
Abstract:
The noise properties of supercontinuum generation continue to be a subject of wide interest within both pure and applied physics. Aside from immediate applications in supercontinuum source development, detailed studies of supercontinuum noise mechanisms have attracted interdisciplinary attention because of links with extreme instabilities in other physical systems, especially the infamous and destructive oceanic rogue waves. But the instabilities inherent in supercontinuum generation can also be interpreted in terms of natural links with the general field of random processes, and this raises new possibilities for applications in areas such as random number generation. In this contribution we will describe recent work where we interpret supercontinuum intensity and phase fluctuations in this way.
Abstract:
We report a numerical study showing how the random intensity and phase fluctuations across the bandwidth of a broadband optical supercontinuum can be interpreted in terms of the random processes of random walks and Lévy flights. We also describe how the intensity fluctuations can be applied to physical random number generation. We conclude that the optical supercontinuum provides a highly versatile means of studying and generating a wide class of random processes at optical wavelengths. © 2012 Optical Society of America.
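The random-number-generation step can be illustrated without a full propagation simulation: below, a heavy-tailed log-normal sample stands in for measured shot-to-shot supercontinuum intensities, and bits are extracted by median thresholding followed by von Neumann debiasing (a generic extraction recipe, not necessarily the one used in the study).

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for measured shot-to-shot intensities: a log-normal sample plays
# the role of the heavy-tailed spectral fluctuations.
intensity = rng.lognormal(mean=0.0, sigma=1.0, size=4096)

# Each shot yields one bit depending on whether it exceeds the sample median.
bits = (intensity > np.median(intensity)).astype(np.uint8)

# Von Neumann debiasing: map bit pairs 01 -> 0, 10 -> 1, drop 00 and 11,
# removing residual bias at the cost of throughput.
pairs = bits.reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
unbiased = pairs[keep, 0]
```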
Abstract:
We examined the possibility of using noise or pseudo-random variations of the refractive index in the design of fiber Bragg gratings (FBGs). We demonstrated theoretically and experimentally that top-hat FBGs may be designed and fabricated using this approach. The reflectivity of the fabricated top-hat FBG matches quite well with that of the designed one. © 2015 Optical Society of America.
Abstract:
Aim. To evaluate the effectiveness of three approaches to assisting the female partners of male problem drinkers with the stress imposed by the male's drinking. Design. Participants were assigned randomly via random number tables to one of three treatment conditions: supportive counselling, stress management or alcohol-focused couples therapy. Setting. The intervention took place at the Behaviour Research and Therapy Centre (BRTC), The University of Queensland. This research and training centre offers outpatient psychology services to the community. Participants. Sixty-one married women whose husbands drank heavily. Participants reported protracted alcohol problems, severe impact of alcohol on social functioning and severe marital distress. Measurement. The women's stress, alcohol consumption by the male, and relationship functioning were assessed at pre- and post-treatment and at 6-month follow-up. Interventions. All three treatments involved 15 1-hour sessions with the woman. In the alcohol-focused couple therapy, attempts were made to engage the man in these sessions. Results. Contrary to our predictions, there were few differences between the treatments. All three treatments were associated with reductions in the women's reported stress, with trends for somewhat greater reduction in the women's stress in the stress management and alcohol-focused couples therapy conditions than for supportive counselling. None of the treatments produced clinically significant reductions in men's drinking or relationship distress. Conclusion. The treatments ease stresses and burden but do not improve drinking or relationships. Limited power in the design restricted the capacity to detect differential treatment effects.
Abstract:
Botnets are groups of computers infected with a specific subset of a malware family and controlled by one individual, called the botmaster. These networks are used for, among other things, virtual extortion, spam campaigns and identity theft. They implement different types of evasion techniques that make it harder to group and detect botnet traffic. This thesis introduces a methodology, called CONDENSER, that outputs clusters through a self-organizing map and identifies domain names generated with an unknown pseudo-random seed known only to the botnet herder(s). Additionally, DNS Crawler is proposed: this system saves historic DNS data for fast-flux and double fast-flux detection and is used to identify live C&C IPs used by real botnets. A program called CHEWER was developed to automate the calculation of the SVM parameters and features that perform best against the available domain names associated with DGAs. CONDENSER and DNS Crawler were developed with scalability in mind, so that the detection of fast-flux and double fast-flux networks becomes faster. We used an SVM for the DGA classifier, selecting a total of 11 attributes and achieving a precision of 77.9% and an F-measure of 83.2%. The feature-selection method identified the 3 most significant attributes of the full set. For clustering, a self-organizing map was used on a total of 81 attributes. The conclusions of this thesis were accepted at Botconf through a submitted article. Botconf is a well-known conference on the research, mitigation and discovery of botnets, tailored to the industry, where current work and research are presented; it is attended by security and anti-virus companies, law enforcement agencies and researchers.
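Lexical features of the kind fed to such a DGA classifier are easy to compute; the sketch below shows a few common ones (an illustrative set, not the thesis's 11 attributes).

```python
import math
from collections import Counter

def dga_features(domain: str) -> dict:
    """A few lexical features used to separate algorithmically generated
    domain names from human-registered ones (illustrative, not the thesis's set)."""
    name = domain.split(".")[0].lower()
    counts = Counter(name)
    # Character-level Shannon entropy: DGA names tend to score high.
    entropy = -sum(c / len(name) * math.log2(c / len(name))
                   for c in counts.values())
    vowels = sum(name.count(v) for v in "aeiou")
    digits = sum(ch.isdigit() for ch in name)
    return {
        "length": len(name),
        "entropy": entropy,
        "vowel_ratio": vowels / len(name),  # DGA names tend to be vowel-poor
        "digit_ratio": digits / len(name),
    }
```

A dictionary word like `"google.com"` scores lower entropy and digit ratio than a DGA-looking string like `"xjw3k9qzt2.com"`; vectors of such features are what the SVM separates.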
Abstract:
1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. 
To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
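The error treatment described above amounts to one line of NumPy: each coordinate is shifted by a draw from a normal distribution with mean zero and a standard deviation of 5 km (the toy coordinates below are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade(coords_km, sd_km=5.0):
    """Simulate locational error: shift each occurrence coordinate by a draw
    from N(0, sd_km), as in the study's error treatment."""
    return coords_km + rng.normal(0.0, sd_km, size=coords_km.shape)

occurrences = np.array([[120.0, 450.0], [121.5, 447.2]])  # x, y in km (toy data)
noisy = degrade(occurrences)
```

Models calibrated on `occurrences` (the control treatment) are then compared against models calibrated on `noisy` (the error treatment).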
Abstract:
The aim of this final-year project is to define the Max-SAT problem with a many-valued encoding, to implement exact algorithms for solving the problem, and to build a random problem generator with which these algorithms can be evaluated.
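The random-generator component can be sketched for the classical Boolean case (the project itself targets a many-valued encoding, which this simplified sketch does not cover): each clause draws k distinct variables and negates each with probability 1/2.

```python
import random

def random_maxsat(n_vars, n_clauses, k=3, seed=0):
    """Uniform random k-clause generator for Max-SAT benchmarking."""
    rng = random.Random(seed)
    clauses = []
    for _ in range(n_clauses):
        vars_ = rng.sample(range(1, n_vars + 1), k)       # k distinct variables
        clauses.append([v if rng.random() < 0.5 else -v   # random polarity
                        for v in vars_])
    return clauses

def violated(clauses, assignment):
    """Number of unsatisfied clauses (the quantity Max-SAT minimises)."""
    return sum(not any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in clauses)
```

Exact solvers are then evaluated on instances such as `random_maxsat(50, 300)` at varying clause/variable ratios.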
Abstract:
This final-year project proposes a CMS protocol applied to a game of Bingo, in which the inputs are numbers generated by the players and the sought result is a random number for the game.
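One standard way to combine player-generated numbers into a jointly random outcome is commit-then-reveal; the sketch below is a simplified illustration of that idea, not necessarily the project's actual protocol.

```python
import hashlib
import secrets

def commit(value: int) -> tuple[bytes, bytes]:
    """Player commits to a contribution: publish H(nonce || value), keep the nonce
    secret until everyone has committed."""
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(nonce + value.to_bytes(8, "big")).digest()
    return digest, nonce

def combine(values, modulus=90):
    """After all commitments are opened and verified, XOR the contributions and
    reduce modulo the ball range, so no single player controls the outcome."""
    acc = 0
    for v in values:
        acc ^= v
    return acc % modulus
```

Because each player commits before seeing the others' numbers, a cheating player cannot choose a contribution that steers the final ball.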
Abstract:
We propose and validate a multivariate classification algorithm for characterizing changes in human intracranial electroencephalographic (iEEG) data after learning motor sequences. The algorithm is based on a Hidden Markov Model (HMM) that captures spatio-temporal properties of the iEEG at the level of single trials. Continuous iEEG was acquired during two sessions (one before and one after a night of sleep) in two patients with depth electrodes implanted in several brain areas. They performed a visuomotor sequence (serial reaction time task, SRTT) using the fingers of their non-dominant hand. Our results show that the decoding algorithm correctly classified single iEEG trials from the trained sequence as belonging to either the initial training phase (day 1, before sleep) or a later consolidated phase (day 2, after sleep), whereas it failed to do so for trials belonging to a control condition (pseudo-random sequence). Accurate single-trial classification was achieved by taking advantage of the distributed pattern of neural activity. Across all contacts, the hippocampus contributed most significantly to the classification accuracy in both patients, as did one fronto-striatal contact in one patient. Together, these human intracranial findings demonstrate that a multivariate decoding approach can detect learning-related changes at the level of single-trial iEEG. Because it allows an unbiased identification of brain sites contributing to a behavioral effect (or experimental condition) at the single-subject level, this approach could be usefully applied to assess the neural correlates of other complex cognitive functions in patients implanted with multiple electrodes.
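The decoding principle can be illustrated with a minimal discrete HMM and maximum-likelihood classification: a trial's observation sequence is assigned to whichever condition's HMM gives it the higher likelihood. All numbers below are toy parameters, not the study's fitted iEEG models.

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) with start probabilities pi,
    transition matrix A and discrete emission matrix B[state, symbol]."""
    alpha = pi * B[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_p += np.log(alpha.sum())  # accumulate the scaling factors
        alpha = alpha / alpha.sum()
    return log_p

# Two toy 2-state HMMs standing in for the two trial conditions.
pi = np.array([0.5, 0.5])
A_sticky = np.array([[0.9, 0.1], [0.1, 0.9]])  # slow state switching
A_fast = np.array([[0.5, 0.5], [0.5, 0.5]])    # memoryless switching
B = np.array([[0.8, 0.2], [0.2, 0.8]])

obs = np.array([0, 0, 0, 0, 1, 1, 1, 1])       # a discretised single-trial sequence
label = int(log_likelihood(obs, pi, A_fast, B) >
            log_likelihood(obs, pi, A_sticky, B))
```

A sequence with long same-symbol runs, such as `obs`, is more likely under the sticky model, so it is classified accordingly (`label = 0`).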
Abstract:
This paper presents a new framework for studying irreversible (dis)investment when a market follows a random number of random-length cycles (such as a high-tech product market). It is assumed that a firm facing such market evolution is always unsure about whether the current cycle is the last one, although it can update its beliefs about the probability of facing a permanent decline by observing that no further growth phase arrives. We show that the existence of regime shifts in fluctuating markets suffices for an option value of waiting to (dis)invest to arise, and we provide a marginal interpretation of the optimal (dis)investment policies, absent in the real options literature. The paper also shows that, although the stochastic process of the underlying variable has a continuous sample path, the discreteness of the regime changes implies that the sample path of the firm's value experiences jumps whenever the regime switches all of a sudden, irrespective of whether the firm is active or not.
Abstract:
The purpose of this master's thesis was to perform simulations involving the use of random numbers when testing hypotheses, especially on two sample populations compared by their means, variances or Sharpe ratios. Specifically, we simulated some well-known distributions in Matlab and checked the accuracy of the hypothesis tests. Furthermore, we went deeper and examined what happens once the bootstrapping method described by Efron is applied to the simulated data. In addition, the robust Sharpe-ratio hypothesis test stated in the paper by Ledoit and Wolf was applied to measure the statistical significance of the performance difference between two investment funds, based on testing whether there is a statistically significant difference between their Sharpe ratios. We collected many sources on the topic and generated in Matlab as many simulated random numbers as possible to carry out our purpose. As a result, we came to a good understanding that tests are not always accurate; for instance, when testing whether two normally distributed random vectors come from the same normal distribution, the Jarque-Bera test for normality showed that for the normal random vectors r1 and r2, only 94.7% and 95.7% of cases, respectively, were identified as coming from a normal distribution, while 5.3% and 4.3% failed to confirm the truth already known; but when we introduced Efron's bootstrapping method to estimate the p-values on which the hypothesis decision is based, the test was 100% accurate. These results show that bootstrapping methods should always be considered when testing or estimating statistics, because in most cases the outcomes are accurate and computational errors are minimized.
Also, the robust Sharpe test, which is known to use one of the bootstrapping methods (the studentised one), was applied first to different simulated data, including distributions of many kinds and shapes, and second to real data from hedge and mutual funds. The test performed quite well and agreed with the existence of a statistically significant difference between their Sharpe ratios, as described in the paper by Ledoit and Wolf.
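A bootstrap test of a Sharpe-ratio difference can be sketched as follows. This is a plain Efron-style percentile bootstrap on toy simulated returns; the Ledoit-Wolf test is a studentised variant of this idea, not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def sharpe(r):
    """Sample Sharpe ratio (zero risk-free rate assumed)."""
    return r.mean() / r.std(ddof=1)

def bootstrap_sharpe_pvalue(r1, r2, n_boot=2000):
    """Resample each return series with replacement, centre the bootstrap
    distribution of the Sharpe-ratio difference to approximate the null of
    equal Sharpe ratios, and locate the observed difference in it."""
    obs = sharpe(r1) - sharpe(r2)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        s1 = rng.choice(r1, size=r1.size, replace=True)
        s2 = rng.choice(r2, size=r2.size, replace=True)
        diffs[b] = sharpe(s1) - sharpe(s2)
    null = diffs - diffs.mean()          # centred bootstrap distribution
    return (np.abs(null) >= abs(obs)).mean()

r1 = rng.normal(0.05, 1.0, 250)          # toy daily returns for two funds
r2 = rng.normal(0.05, 1.0, 250)
p = bootstrap_sharpe_pvalue(r1, r2)
```

With identically distributed toy returns the test should usually not reject; feeding in two series with very different Sharpe ratios drives the p-value towards zero.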