915 results for Pseudo-random
Abstract:
Graduate program in Agronomy (Energy in Agriculture) - FCA
Abstract:
The contributions of contrast-detection mechanisms to the visual evoked cortical potential (VECP) have been investigated through the study of contrast-response and spatial-frequency-response functions. Previously, the use of m-sequences to control stimulation was restricted to multifocal electrophysiological stimulation, which in some respects differs substantially from the conventional VECP. Single stimulation with spatial contrast controlled by m-sequences has not been extensively studied or compared with the responses obtained by multifocal techniques. The aim of this work was to evaluate the influence of the spatial frequency and contrast of sinusoidal gratings on the VECP generated by pseudorandom stimulation. Nine normal subjects were stimulated by achromatic sinusoidal gratings controlled by a binary pseudorandom m-sequence at 7 spatial frequencies (0.4 to 10 cpd) in 3 different sizes (4°, 8°, and 16° of visual angle). At 8°, six contrast levels (3.12% to 99%) were additionally tested. The first-order kernel did not provide consistent responses with measurable signals across the spatial frequencies and contrasts tested (the signal was very small or absent), whereas the first and second slices of the second-order kernel exhibited very reliable responses over the stimulus ranges tested. The main differences between the results obtained with the first and second slices of the second-order kernel were the profiles of the amplitude-versus-contrast and amplitude-versus-spatial-frequency functions. The results indicated that the first slice of the second-order kernel was dominated by the M pathway, although under some stimulus conditions a contribution from the P pathway could be discerned, whereas the second slice of the second-order kernel reflected a contribution from the P pathway only.
The present work extends previous findings on the contribution of the visual pathways to the VECP generated by pseudorandom stimulation to a wide range of spatial frequencies.
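The binary m-sequences that control this kind of stimulation are conventionally produced by a maximal-length linear-feedback shift register (LFSR). The sketch below is illustrative only; the register length and taps are generic textbook choices, not the parameters used in the study.

```python
def m_sequence(length=7, taps=(7, 6)):
    """Binary m-sequence from a maximal-length Fibonacci LFSR.

    The default taps (7, 6) realize the feedback polynomial
    x^7 + x^6 + 1, giving the maximal period 2^7 - 1 = 127.
    """
    mask = (1 << length) - 1
    state = mask                      # any nonzero seed works
    seq = []
    for _ in range((1 << length) - 1):
        bit = 0
        for t in taps:                # XOR the tapped register bits
            bit ^= (state >> (t - 1)) & 1
        state = ((state << 1) | bit) & mask
        seq.append(bit)
    return seq
```

A defining property of an m-sequence is its near-perfect balance: one full period of a 7-bit sequence contains exactly 64 ones and 63 zeros, which is what makes it attractive for unbiased pseudorandom stimulus control.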
Abstract:
Spatial selectivity for color has been investigated using invasive and noninvasive electrophysiological methods as well as psychophysical methods. In noninvasive visual cortical electrophysiology, this topic has been studied using conventional methods of periodic stimulation and response extraction by simple averaging. New methods of stimulation (pseudorandom presentation) and of noninvasive cortical response extraction (cross-correlation) have been developed but have not yet been used to investigate the chromatic spatial selectivity of cortical responses. This work aimed to introduce this new pseudorandom electrophysiological method to study chromatic spatial selectivity. Fourteen trichromats and 16 color-vision-deficient subjects with normal or corrected visual acuity were evaluated. The volunteers were assessed with the HMC anomaloscope and the Ishihara plate test to characterize their color vision for the presence of trichromacy. Red-green sinusoidal gratings subtending 8° of visual angle were used at 8 spatial frequencies between 0.2 and 10 cpd. The stimulus was temporally modulated by a binary m-sequence in a pattern-reversal presentation mode. The VERIS system was used to extract the first and second slices of the second-order kernel (K2.1 and K2.2, respectively). After modeling the response as a function of spatial frequency with a difference-of-Gaussians function, the optimal spatial frequency and the band of frequencies with amplitudes above ¾ of the function's maximum amplitude were extracted to serve as indicators of the spatial selectivity of the function. Chromatic visual acuity was also estimated by fitting a linear function to the amplitude data from the spatial frequency of the amplitude peak up to the highest spatial frequency tested. In trichromats, chromatic responses with different spatial selectivities were found in K2.1 and K2.2.
The negative components of K2.1 and K2.2 showed band-pass tuning, while the positive component of K2.1 showed low-pass tuning. The visual acuity estimated from all the components studied was close to the values found by Mullen (1985) and Kelly (1983). Different cellular components may be contributing to the generation of the pseudorandom VECP. This new method is a candidate to become an important tool for the noninvasive assessment of human color vision.
Abstract:
We investigate the nonequilibrium roughening transition of a one-dimensional restricted solid-on-solid model by directly sampling the stationary probability density of a suitable order parameter as the surface adsorption rate varies. The shapes of the probability density histograms suggest a typical Ginzburg-Landau scenario for the phase transition of the model, and estimates of the "magnetic" exponent seem to confirm its mean-field critical behavior. We also found that the flipping times between the metastable phases of the model scale exponentially with the system size, signaling the breaking of ergodicity in the thermodynamic limit. Incidentally, we discovered that a closely related model not considered before also displays a phase transition with the same critical behavior as the original model. Our results support the usefulness of off-critical histogram techniques in the investigation of nonequilibrium phase transitions. We also briefly discuss in the appendix a good and simple pseudo-random number generator used in our simulations.
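The abstract does not specify which "good and simple" generator the appendix describes. Purely as an illustration of that class of generator, here is a sketch of xorshift64* (Vigna's variant of Marsaglia's xorshift), a well-known PRNG that is both very simple and of respectable statistical quality; it is an assumption for illustration, not the paper's generator.

```python
MASK64 = (1 << 64) - 1

def xorshift64star(seed):
    """Yield 64-bit pseudo-random words from a xorshift64* generator.

    The state evolves by three xor-shift steps; the output is the
    state scrambled by multiplication with a fixed odd constant.
    """
    state = seed & MASK64
    if state == 0:
        raise ValueError("seed must be nonzero")
    while True:
        state ^= state >> 12
        state ^= (state << 25) & MASK64
        state ^= state >> 27
        yield (state * 0x2545F4914F6CDD1D) & MASK64
```

Because the multiplier is odd, distinct internal states map to distinct outputs, so the generator inherits the full 2^64 - 1 period of the underlying xorshift recurrence.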
Abstract:
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography; accordingly, the CA rule space was explored in search of this "chaos" property. The manuscript therefore concentrates on tests such as the Lyapunov exponent, entropy, and Hamming distance to measure chaos in CA, as well as on statistical analyses with the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. The "chaos" property of CA is thus a good reason to employ them in cryptography, along with their simplicity, low implementation cost, and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
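The specific Life-like rules explored by the authors are not given in the abstract. As a hedged illustration of the general idea of harvesting pseudo-random bits from a chaotic cellular automaton, the sketch below samples the center column of the one-dimensional elementary Rule 30 automaton, a classic chaotic rule, not the two-dimensional Life-like CA of the paper.

```python
def rule30_bits(width=64, steps=256):
    """Sample pseudo-random bits from the center column of Rule 30
    running on a ring of `width` cells seeded with a single live cell.

    Rule 30 update: new_center = left XOR (center OR right).
    """
    state = [0] * width
    state[width // 2] = 1
    bits = []
    for _ in range(steps):
        bits.append(state[width // 2])      # emit one bit per generation
        state = [state[(i - 1) % width] ^ (state[i] | state[(i + 1) % width])
                 for i in range(width)]
    return bits
```

The same harvesting scheme (run the CA, sample a fixed cell or column each generation) carries over directly to two-dimensional Life-like rules; only the neighborhood and update function change.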
Abstract:
The purpose of this study was to investigate the role of the fronto–striatal system for implicit task sequence learning. We tested performance of patients with compromised functioning of the fronto–striatal loops, that is, patients with Parkinson's disease and patients with lesions in the ventromedial or dorsolateral prefrontal cortex. We also tested amnesic patients with lesions either to the basal forebrain/orbitofrontal cortex or to thalamic/medio-temporal regions. We used a task sequence learning paradigm involving the presentation of a sequence of categorical binary-choice decision tasks. After several blocks of training, the sequence, hidden in the order of tasks, was replaced by a pseudo-random sequence. Learning (i.e., sensitivity to the ordering) was assessed by measuring whether this change disrupted performance. Although all the patients were able to perform the decision tasks quite easily, those with lesions to the fronto–striatal loops (i.e., patients with Parkinson's disease, with lesions in the ventromedial or dorsolateral prefrontal cortex and those amnesic patients with lesions to the basal forebrain/orbitofrontal cortex) did not show any evidence of implicit task sequence learning. In contrast, those amnesic patients with lesions to thalamic/medio-temporal regions showed intact sequence learning. Together, these results indicate that the integrity of the fronto–striatal system is a prerequisite for implicit task sequence learning.
Abstract:
The objective of this PFC is the design and implementation of an application that works as an oscilloscope, a spectrum analyzer, and a virtual function generator, all within the same application. Through a data acquisition card, the user can take samples of real-world signals (analog system) to generate data that can be manipulated by a computer (digital system). The same card can also generate basic signals, such as sine, square, and sawtooth waves, and further functionality has been added to generate frequency-modulated signals: chirp signals (commonly used in sonar and radar applications as well as in optical transmission) and PRN signals (pseudo-random noise consisting of a deterministic pulse sequence that repeats every period, commonly used in GPS receivers), along with widely known signals such as Gaussian white noise and uniform white noise. The application displays the acquired signals in detail and analyzes them in different ways selected by the user: windowing with the most common window types, frequency response, and the Fourier transform are examples of the available analyses. The configuration is chosen by the user through friendly and attractive displays and panels.
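Two of the generated signal types mentioned above can be sketched compactly. The following is an illustrative sketch only (sample rate, sweep range, LFSR taps, and seed are all assumptions, not the project's actual parameters): a linear chirp computed from its quadratic phase, and a bipolar PRN sequence from a 16-bit LFSR.

```python
import math

def linear_chirp(f0, f1, duration, fs):
    """Sampled linear chirp sweeping from f0 to f1 Hz over `duration` s."""
    n = int(duration * fs)
    k = (f1 - f0) / duration                     # sweep rate, Hz/s
    return [math.cos(2 * math.pi * (f0 * t + 0.5 * k * t * t))
            for t in (i / fs for i in range(n))]

def prn_bipolar(n, taps=(16, 14, 13, 11), seed=0xACE1):
    """Bipolar (+1/-1) pseudo-random noise from a 16-bit Fibonacci LFSR."""
    state, out = seed, []
    for _ in range(n):
        bit = 0
        for t in taps:                           # XOR the tapped bits
            bit ^= (state >> (t - 1)) & 1
        state = ((state << 1) | bit) & 0xFFFF
        out.append(1.0 if state & 1 else -1.0)
    return out
```

Because the LFSR is deterministic, the PRN sequence repeats identically every period, which is exactly the property GPS receivers exploit to correlate against a locally generated replica.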
Abstract:
This research is an example of symbiosis between cryptanalysis and the decipherment of languages. It is the search for the meaning of an inscription, a set of almost two hundred Latin letters, on a carving of the Virgin Mary that stood on the island of Tenerife, in the present-day town of Candelaria, in the Canary Islands. The image disappeared in a storm in 1826. Nevertheless, it is possible to establish with great certainty which letters it bore by turning to textual and artistic documentary sources. The meaning of the Marian inscription, if there is one, cannot be grasped, we believe, without an adequate understanding of the context. This means researching the history of the carving itself, which dates back to the fourteenth or fifteenth century, and studying the indigenous Canarian population as well as the peoples who arrived there at different historical moments. Furthermore, it is necessary to know about the rediscovery of the Canary archipelago and its processes of conquest and evangelization. All these data offer a new and surprising outlook for understanding not only the letters but the wooden sculpture itself. From this point, the inquiry moves to whether the letters correspond to any possible language, which led us to analyze a very large set of texts as close as possible to the period under study, belonging to around a hundred languages. After the linguistic examination, a study was carried out of the cryptographic methods that might have been used to generate the text of the inscription. A detailed and thorough list of techniques that could have been adopted is offered, and the letters of the Marian carving are exhaustively cryptanalyzed. At the same time, a new cryptological framework of methods and systems is provided, more orderly and complete than the one considered until now, especially since the emergence of asymmetric-key cryptography.
We continue the investigation by weighing the possible pseudo-random generation of the text, a text that might have no meaning at all. At this point, having worked through all the possibilities and hypotheses, and having ruled them all out, we return to reconsider our body of conjectures and assumptions. From there we analyze in depth the Berber ethnographic and linguistic sphere as the most plausible and probable hypothesis. After deepening our study of this language and correcting the errors that led us not to detect it in our earlier analysis, we conclude that we are dealing with an archaic Berber language: a set of letters belonging to a language and family that has not disappeared today, although it has been heavily shaped and blurred by other languages, especially Arabic. This leads us to recover lexical, morphological, syntactic, and phonetic aspects of this archaic speech. With all these data we carry out a broad semantic study of the carving from both the indigenous and the Christian perspectives. Finally, from the lexical items and their roots in the Berber and insular Amazigh languages, we offer the meaning of the letters inscribed on the Marian carving of Candelaria.
Abstract:
In the Monte Carlo simulation of both lattice field theories and of models of statistical mechanics, identities verified by exact mean values, such as Schwinger-Dyson equations, Guerra relations, Callen identities, etc., provide well-known and sensitive tests of thermalization bias as well as checks of pseudo-random-number generators. We point out that they can be further exploited as control variates to reduce statistical errors. The strategy is general, very simple, and almost costless in CPU time. The method is demonstrated in the two-dimensional Ising model at criticality, where the CPU gain factor lies between 2 and 4.
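The control-variate idea can be shown on a toy problem rather than the Ising model: subtract from the observable a quantity with exactly known mean, scaled by an estimated optimal coefficient. The example below (estimating E[exp(U)] for uniform U, an assumption chosen for clarity, not the paper's setting) is a minimal sketch of the technique.

```python
import math, random

def control_variate_demo(n=100_000, seed=1):
    """Estimate E[exp(U)], U ~ Uniform(0,1), plainly and with the
    exactly-known mean E[U] = 1/2 used as a control variate."""
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n)]
    f = [math.exp(x) for x in u]
    mu_f = sum(f) / n
    mu_u = sum(u) / n
    # optimal coefficient c* = Cov(f, U) / Var(U), estimated in-sample
    cov = sum((fi - mu_f) * (ui - mu_u) for fi, ui in zip(f, u)) / n
    var = sum((ui - mu_u) ** 2 for ui in u) / n
    c = cov / var
    # corrected estimator: subtract c * (U - E[U]); unbiased since E[U] is exact
    cv = sum(fi - c * (ui - 0.5) for fi, ui in zip(f, u)) / n
    return mu_f, cv
```

Both estimators target e - 1 ≈ 1.7183, but the control-variate version has a far smaller variance; in the paper the role of E[U] is played by exact identities such as Schwinger-Dyson or Callen relations, and the extra cost is similarly negligible.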
Abstract:
We describe a modification to a previously published pseudorandom number generator improving security while maintaining high performance. The proposed generator is based on the powers of a word-packed block upper triangular matrix and it is designed to be fast and easy to implement in software since it mainly involves bitwise operations between machine registers and, in our tests, it presents excellent security and statistical characteristics. The modifications include a new, key-derived s-box based nonlinear output filter and improved seeding and extraction mechanisms. This output filter can also be applied to other generators.
Abstract:
An experimental testing system for the study of the dynamic behavior of fluid-loaded rectangular micromachined silicon plates is designed and presented in this paper. In this experimental system, the base-excitation technique combined with pseudo-random signals and cross-correlation analysis is applied to test fluid-loaded microstructures. A theoretical model is also derived to reveal the mechanism of this experimental system when applied to testing fluid-loaded microstructures. The dynamic experiments cover a series of tests of various microplates with different boundary conditions and dimensions, both in air and immersed in water. This paper is the first to demonstrate the ability and performance of base excitation in the dynamic testing of microstructures in a natural fluid environment. Traditional modal analysis approaches are used to evaluate natural frequencies, modal damping, and mode shapes from the experimental data. The experimental results are discussed and compared with theoretical predictions. This research experimentally determines the dynamic characteristics of fluid-loaded silicon microplates, which can contribute to the design of plate-based microsystems. The experimental system and testing approaches presented in this paper can be widely applied to the investigation of the dynamics of microstructures and nanostructures.
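The pseudo-random-excitation plus cross-correlation idea rests on a standard identity: for a white, unit-variance input, the input-output cross-correlation of a linear system equals its impulse response. The sketch below demonstrates this on a synthetic 3-tap FIR system (the taps, sequence length, and seed are illustrative assumptions, not the paper's measurement parameters).

```python
import random

def xcorr_impulse_response(h, n=4096, seed=0):
    """Estimate FIR taps h by exciting the system with a bipolar
    pseudo-random sequence and cross-correlating input with output."""
    rng = random.Random(seed)
    x = [rng.choice((-1.0, 1.0)) for _ in range(n)]   # white +/-1 excitation
    # system output y[t] = sum_j h[j] * x[t - j]
    y = [sum(h[j] * x[t - j] for j in range(len(h)) if t - j >= 0)
         for t in range(n)]
    # for white unit-variance input, R_xy(k) ~ h[k]
    return [sum(x[t] * y[t + k] for t in range(n - k)) / (n - k)
            for k in range(len(h))]
```

In the experimental system, the "input" is the measured base excitation and the "output" is the measured plate response; the same correlation recovers the transfer characteristics while rejecting noise that is uncorrelated with the pseudo-random drive.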
Abstract:
2000 Mathematics Subject Classification: 94A29, 94B70
Abstract:
We present experimental results for wavelength-division multiplexed (WDM) transmission performance using unbalanced proportions of 1s and 0s in pseudo-random bit sequence (PRBS) data. This investigation simulates the effect of local, in time, data unbalancing which occurs in some coding systems such as forward error correction when extra bits are added to the WDM data stream. We show that such local unbalancing, which in practice would give a time-dependent error rate, can be employed to improve legacy long-haul WDM system performance if the system is allowed to operate in the nonlinear power region. We use a recirculating loop to simulate a long-haul fibre system.
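The kind of local unbalancing described above can be mimicked by periodically inserting fixed overhead bits into a balanced PRBS, as FEC framing does. The sketch below uses the standard PRBS7 pattern and an assumed insertion period of one overhead bit per 8 data bits; both choices are illustrative, not the experiment's actual coding parameters.

```python
def prbs7(n):
    """First n bits of the standard PRBS7 pattern (x^7 + x^6 + 1)."""
    state, bits = 0x7F, []
    for _ in range(n):
        bit = ((state >> 6) ^ (state >> 5)) & 1   # feedback from taps 7, 6
        state = ((state << 1) | bit) & 0x7F
        bits.append(bit)
    return bits

def add_overhead(bits, period=8, overhead_bit=1):
    """Insert one fixed overhead bit every `period` data bits, locally
    skewing the proportion of 1s and 0s as FEC framing can."""
    out = []
    for i, b in enumerate(bits):
        out.append(b)
        if (i + 1) % period == 0:
            out.append(overhead_bit)
    return out
```

One PRBS7 period is balanced (64 ones, 63 zeros); after insertion, the mark density rises above 1/2, reproducing the unbalanced traffic whose nonlinear-regime effect the experiment measures.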
Abstract:
This dissertation presents the design of three high-performance successive-approximation-register (SAR) analog-to-digital converters (ADCs) using distinct digital background calibration techniques under the framework of a generalized code-domain linear equalizer. These digital calibration techniques effectively and efficiently remove the static mismatch errors in the analog-to-digital (A/D) conversion. They enable aggressive scaling of the capacitive digital-to-analog converter (DAC), which also serves as the sampling capacitor, to the kT/C limit. As a result, outstanding conversion linearity, high signal-to-noise ratio (SNR), high conversion speed, robustness, superb energy efficiency, and minimal chip area are accomplished simultaneously. The first design is a 12-bit 22.5/45-MS/s SAR ADC in a 0.13-μm CMOS process. It employs a perturbation-based calibration based on the superposition property of linear systems to digitally correct the capacitor mismatch error in the weighted DAC. With 3.0-mW power dissipation at a 1.2-V power supply and a 22.5-MS/s sample rate, it achieves a 71.1-dB signal-to-noise-plus-distortion ratio (SNDR) and a 94.6-dB spurious-free dynamic range (SFDR). At the Nyquist frequency, the conversion figure of merit (FoM) is 50.8 fJ/conversion-step, the best FoM to date (2010) for 12-bit ADCs. The SAR ADC core occupies 0.06 mm2, while the estimated area of the calibration circuits is 0.03 mm2. The second proposed digital calibration technique is a bit-wise-correlation-based digital calibration. It utilizes the statistical independence of an injected pseudo-random signal and the input signal to correct the DAC mismatch in SAR ADCs. This idea is experimentally verified in a 12-bit 37-MS/s SAR ADC fabricated in 65-nm CMOS implemented by Pingli Huang.
This prototype chip achieves a 70.23-dB peak SNDR and an 81.02-dB peak SFDR, while occupying 0.12-mm2 silicon area and dissipating 9.14 mW from a 1.2-V supply with the synthesized digital calibration circuits included. The third work is an 8-bit, 600-MS/s, 10-way time-interleaved SAR ADC array fabricated in 0.13-μm CMOS process. This work employs an adaptive digital equalization approach to calibrate both intra-channel nonlinearities and inter-channel mismatch errors. The prototype chip achieves 47.4-dB SNDR, 63.6-dB SFDR, less than 0.30-LSB differential nonlinearity (DNL), and less than 0.23-LSB integral nonlinearity (INL). The ADC array occupies an active area of 1.35 mm2 and dissipates 30.3 mW, including synthesized digital calibration circuits and an on-chip dual-loop delay-locked loop (DLL) for clock generation and synchronization.
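The bit-wise-correlation calibration of the second design rests on a simple statistical fact: an injected pseudo-random dither is independent of the input, so correlating the output against the known dither sequence isolates the gain the dither experienced. The toy model below is a behavioral sketch of that principle only (the signal model, gain value, and sequence length are all assumptions; it is not the chip's calibration algorithm).

```python
import random

def estimate_dither_gain(n=50_000, true_gain=0.37, seed=3):
    """Recover the gain seen by an injected +/-1 pseudo-random dither
    buried in an uncorrelated signal, by correlating the observed
    output against the known dither sequence."""
    rng = random.Random(seed)
    d = [rng.choice((-1.0, 1.0)) for _ in range(n)]   # known PRN dither
    s = [rng.uniform(-1.0, 1.0) for _ in range(n)]    # unknown input signal
    out = [si + true_gain * di for si, di in zip(s, d)]
    # E[out * d] = true_gain, because the signal and dither are independent
    return sum(oi * di for oi, di in zip(out, d)) / n
```

In the ADC, comparing such correlation-derived gains across bit weights exposes the capacitor mismatch, which the digital equalizer then corrects in the code domain.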
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to unbiasedly estimate the likelihood, and perform inference and make decisions based on an exact-approximate algorithm. Two estimates are proposed: using Quasi Monte Carlo methods and using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
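The exact-approximate step above relies on an unbiased Monte Carlo estimate of the intractable observed-data likelihood, obtained by integrating out the random effects by simulation. The sketch below shows the plainest such estimator for a one-block random-intercept model (the model, variances, and sample sizes are illustrative assumptions; the paper's quasi-Monte Carlo and Laplace-importance-sampling estimators are variance-reduced refinements of this idea).

```python
import math, random

def mc_likelihood(y, sigma=1.0, tau=1.0, m=2000, seed=0):
    """Unbiased Monte Carlo estimate of the marginal likelihood of block
    data y under y_i = b + e_i, with b ~ N(0, tau^2) integrated out and
    e_i ~ N(0, sigma^2)."""
    rng = random.Random(seed)

    def norm_pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    total = 0.0
    for _ in range(m):
        b = rng.gauss(0.0, tau)            # draw the random effect from its prior
        lik = 1.0
        for yi in y:                        # conditional likelihood given b
            lik *= norm_pdf(yi, b, sigma)
        total += lik
    return total / m                        # unbiased for p(y)
```

For this Gaussian toy model the marginal likelihood is available in closed form (y_i are jointly normal with variance sigma² + tau²), which makes it a convenient check; in the mixed-effects designs of the paper no such closed form exists, hence the estimated-likelihood machinery.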