262 results for randomness
Abstract:
The ability to exchange keys between users is vital in any wireless-based security system. A key generation technique which exploits the randomness of the wireless channel is a promising alternative to existing key distribution techniques, e.g., public key cryptography. In this paper, a secure key generation scheme based on the subcarriers' channel responses in orthogonal frequency-division multiplexing (OFDM) systems is proposed. We first implement a time-variant multipath channel with its channel impulse response modelled as a wide-sense stationary (WSS) uncorrelated scattering random process and demonstrate that each subcarrier's channel response is also a WSS random process. We then define the X% coherence time as the time required to produce an X% correlation coefficient in the autocorrelation function (ACF) of each channel tap, and find that when all the channel taps have the same Doppler power spectrum, all subcarriers' channel responses have the same ACF as the channel taps. The subcarrier's channel response is then sampled every X% coherence time and quantized into key bits. The randomness of all the key sequences is tested using the National Institute of Standards and Technology (NIST) statistical test suite, and the results indicate that the commonly used sampling interval of 50% coherence time cannot guarantee the randomness of the key sequence.
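A minimal sketch (Python/NumPy) of the two steps this abstract describes: estimating the X% coherence time from the autocorrelation function of a channel response, then sampling at that interval and 1-bit quantizing about the median to obtain key bits. The synthetic fading process, probing rate and thresholds are illustrative assumptions, not the paper's configuration.

    import numpy as np

    def coherence_time(h, fs, x=0.5):
        """Lag (seconds) at which the normalized ACF of h first falls below x."""
        h = h - h.mean()
        acf = np.correlate(h, h, mode="full")[h.size - 1:]
        acf /= acf[0]
        lag = int(np.argmax(acf < x))      # first lag below the X% threshold
        return lag / fs

    def quantize_to_bits(h, interval, fs):
        """Sample h every `interval` seconds; 1-bit quantize about the median."""
        step = max(1, int(round(interval * fs)))
        samples = h[::step]
        return (samples > np.median(samples)).astype(int)

    # Illustrative subcarrier response: a low-pass filtered Gaussian process
    # standing in for a fading channel tap (an assumption, not Jakes' model).
    rng = np.random.default_rng(0)
    fs = 1000.0                            # probing rate in Hz (assumed)
    h = np.convolve(rng.standard_normal(5000), np.ones(50) / 50, mode="same")

    tc = coherence_time(h, fs, x=0.5)      # the 50% coherence time
    bits = quantize_to_bits(h, tc, fs)
    print(f"50% coherence time: {tc * 1e3:.1f} ms, {bits.size} key bits")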
Abstract:
Key generation from the randomness of wireless channels is a promising alternative to public key cryptography for the establishment of cryptographic keys between any two users. This paper reviews the current techniques for wireless key generation. The principles, performance metrics and key generation procedure are comprehensively surveyed. Methods for optimizing the performance of key generation are also discussed. Key generation applications in various environments are then introduced along with the challenges of applying the approach in each scenario. The paper concludes with some suggestions for future studies.
Abstract:
This paper presents a key generation system derived from the channel responses of individual subcarriers in orthogonal frequency-division multiplexing (OFDM) systems. Practical aspects of the security were investigated by implementing our key generation scheme on a wireless open-access research platform (WARP), which enables us to obtain channel estimates for individual OFDM subcarriers, a feature not currently available in most commercial wireless interface cards. The channel response of an individual OFDM subcarrier is usually a wide-sense stationary random process, which allows us to find the optimal probing period and maximize the key generation rate. The implementation requires cross-layer design, as it involves interaction between the physical and MAC layers. We have experimentally verified the feasibility and principles of key generation, and also evaluated the performance of our system in terms of randomness, key generation rate and key disagreement rate, which demonstrates that OFDM subcarriers' channel responses are valid for key generation.
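The evaluation metrics named here have simple concrete forms. Below is a sketch (Python/NumPy, hypothetical bit streams) of the key disagreement rate, the fraction of positions where the two users' quantized bits differ, and of the key generation rate as bits per second of probing; the 3% bit-flip error is an assumed figure.

    import numpy as np

    def key_disagreement_rate(bits_a, bits_b):
        """Fraction of mismatched bits between the two users' key streams."""
        return float(np.mean(np.asarray(bits_a) != np.asarray(bits_b)))

    def key_generation_rate(n_bits, probing_seconds):
        """Key bits produced per second of channel probing."""
        return n_bits / probing_seconds

    # Illustrative reciprocal measurements: Bob observes Alice's bits
    # through estimation noise that flips each bit with probability 0.03.
    rng = np.random.default_rng(1)
    alice = rng.integers(0, 2, 256)
    flips = rng.random(256) < 0.03
    bob = np.where(flips, 1 - alice, alice)

    print(f"KDR = {key_disagreement_rate(alice, bob):.3f}")
    print(f"KGR = {key_generation_rate(alice.size, 10.0):.1f} bit/s")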
Abstract:
This paper examines to what extent individual measures of well-being are correlated with daily weather patterns in the United Kingdom. Merging daily weather data with data from the British Household Panel Survey (BHPS) allows us to test whether measures of well-being are correlated with temperature, sunshine, rainfall and wind speed. We are able to make a strong case for causality due to the 'randomness' of weather, in addition to using regression methods that eliminate time-invariant individual-level heterogeneity. Results suggest that some weather parameters (such as sunshine) are correlated with some measures of well-being (job satisfaction); in general, however, the effect of weather on subjective measures of well-being is very small.
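The fixed-effects strategy mentioned above, eliminating time-invariant individual heterogeneity, reduces to demeaning each person's outcomes and regressors before ordinary least squares. A toy sketch under assumed column names; the BHPS variables and the weather merge are not reproduced here.

    import numpy as np
    import pandas as pd

    # Hypothetical merged panel: one row per person-day (columns assumed).
    df = pd.DataFrame({
        "person":     [1, 1, 1, 2, 2, 2],
        "well_being": [5.0, 6.0, 5.5, 3.0, 4.0, 3.5],
        "sunshine":   [2.0, 6.0, 4.0, 1.0, 5.0, 3.0],
    })

    # Within transformation: subtracting each person's mean removes any
    # time-invariant individual effect from both sides of the regression.
    cols = ["well_being", "sunshine"]
    demeaned = df[cols] - df.groupby("person")[cols].transform("mean")

    # OLS slope on the demeaned data is the fixed-effects estimate.
    x = demeaned["sunshine"].to_numpy()
    y = demeaned["well_being"].to_numpy()
    beta = (x @ y) / (x @ x)
    print(f"Fixed-effects estimate of the sunshine effect: {beta:.3f}")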
Abstract:
This paper proposes a new methodology to reduce the probability of occurrence of states that cause load curtailment, while minimizing the costs involved in achieving that reduction. The methodology is supported by a hybrid method based on fuzzy sets and Monte Carlo simulation to capture both the randomness and the fuzziness of component outage parameters of the transmission power system. The novelty of this research work consists in proposing two fundamental approaches: 1) a global steady approach, which builds the model of a faulted transmission power system aiming at minimizing the unavailability corresponding to each faulted component; this results in the minimal global investment cost for the faulted components in a sample of system states of the transmission network; 2) a dynamic iterative approach, which checks individually the effect of each investment on the transmission network. A case study using the IEEE 24-bus Reliability Test System (RTS) 1996 is presented to illustrate in detail the application of the proposed methodology.
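A toy sketch of the hybrid fuzzy/Monte Carlo idea common to these abstracts: each component's outage rate is a triangular fuzzy number, a crisp rate is drawn from a chosen alpha-cut (one possible mechanization, assumed here), and component up/down states are then sampled. It illustrates the state-sampling step only, not the paper's full algorithm.

    import numpy as np

    rng = np.random.default_rng(2)

    def alpha_cut_sample(low, mode, high, alpha):
        """Draw a crisp value uniformly inside the alpha-cut of a
        triangular fuzzy number (low, mode, high)."""
        lo = low + alpha * (mode - low)
        hi = high - alpha * (high - mode)
        return rng.uniform(lo, hi)

    # Hypothetical fuzzy forced-outage rates for three components.
    fuzzy_for = [(0.01, 0.02, 0.04), (0.02, 0.05, 0.08), (0.005, 0.01, 0.02)]

    def sample_system_state(alpha):
        """Return a 0/1 availability vector for one sampled system state."""
        rates = [alpha_cut_sample(*f, alpha) for f in fuzzy_for]
        return np.array([rng.random() >= r for r in rates], dtype=int)

    # Crude risk index: share of sampled states with any component down.
    states = np.array([sample_system_state(alpha=0.5) for _ in range(10000)])
    print("P(at least one outage) ~", float(np.mean(states.min(axis=1) == 0)))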
Fuzzy Monte Carlo mathematical model for load curtailment minimization in transmission power systems
Abstract:
This paper presents a methodology that is based on statistical failure and repair data of transmission power system components and uses fuzzy-probabilistic modeling of system component outage parameters. Using statistical records allows the fuzzy membership functions of the system component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models captures both the randomness and the fuzziness of component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on optimal power flow, to reschedule generation and alleviate constraint violations and, at the same time, to avoid any load curtailment if possible or, otherwise, to minimize the total load curtailment, for the states identified by the contingency analysis. In order to illustrate the application of the proposed methodology to a practical case, the paper includes a case study for the IEEE 24-bus Reliability Test System (RTS) 1996.
Abstract:
This paper presents a methodology for choosing the distribution network reconfiguration with the lowest power losses. The proposed methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modeling of system component outage parameters. The proposed hybrid method using fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models captures both the randomness and the fuzziness of component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a logic programming algorithm is applied to obtain all possible reconfigurations for each system state. To evaluate line flows and bus voltages, and to identify any overloading and/or voltage violation, an AC load flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 115-bus distribution network.
Abstract:
Final Master's project submitted for the degree of Master in Civil Engineering, specialization area of Structures.
Abstract:
It has been shown that in reality at least two general scenarios of data structuring are possible: (a) a self-similar (SS) scenario, when the measured data form an SS structure, and (b) a quasi-periodic (QP) scenario, when the repeated (strongly correlated) data form random sequences that are almost periodic with respect to each other. In the second case it becomes possible to describe their behavior and express a part of their randomness quantitatively in terms of the deterministic amplitude-frequency response belonging to the generalized Prony spectrum. This possibility allows us to re-examine the conventional concept of measurements and opens a new way to describe a wide set of different data. In particular, it concerns different complex systems for which no 'best-fit' model pretending to describe the measured data exists, but where there is a pressing need to describe these data in terms of a reduced number of quantitative parameters. The possibilities of the proposed approach and of the detection algorithm for QP processes were demonstrated on actual data: spectroscopic data recorded for pure water and acoustic data for a test hole. The suggested methodology allows revising the accepted classification of different incommensurable and self-affine spatial structures and finding an accurate interpretation of generalized Prony spectroscopy, which includes Fourier spectroscopy as a special case.
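The generalized Prony spectrum describes quasi-periodic data through a deterministic sum of damped exponentials. A stripped-down sketch of the classical Prony step, offered as background rather than the paper's generalized procedure: estimate the signal poles from a linear prediction system, then the amplitudes by least squares. The model order and the test signal are illustrative assumptions.

    import numpy as np

    def prony(x, p):
        """Classical Prony fit: x[n] ~ sum_k c_k * z_k**n with p poles."""
        N = len(x)
        # Linear prediction: x[n] = -a_1 x[n-1] - ... - a_p x[n-p].
        A = np.column_stack([x[p - 1 - k: N - 1 - k] for k in range(p)])
        a = np.linalg.lstsq(A, -x[p:], rcond=None)[0]
        z = np.roots(np.concatenate(([1.0], a)))       # signal poles
        V = np.vander(z, N, increasing=True).T          # columns of z_k**n
        c = np.linalg.lstsq(V, x.astype(complex), rcond=None)[0]
        return z, c

    # Illustrative quasi-periodic signal: two damped cosines plus noise.
    n = np.arange(200)
    x = (np.exp(-0.010 * n) * np.cos(0.3 * n)
         + 0.5 * np.exp(-0.005 * n) * np.cos(0.8 * n)
         + 0.01 * np.random.default_rng(3).standard_normal(n.size))

    z, c = prony(x, p=4)       # 4 poles: two conjugate pairs (assumed order)
    print("pole magnitudes:", np.round(np.abs(z), 3))
    print("pole angles:   ", np.round(np.angle(z), 3))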
Abstract:
The intense exchange between countries resulting from globalization has added importance to the capital market. Developing countries seek to open their economies to receive foreign investment; the more developed an economy, the more active its capital market. However, there has been a tendency to shift the economic focus, previously directed mainly at business planning, towards goals more closely tied to the environment. The capital market is a system for distributing securities whose purpose is to provide liquidity to the securities issued by companies, in order to enable the capitalization of those instruments. It comprises the stock exchanges, brokerage firms and other financial institutions authorized by the Comissão de Valores dos Mercados Mobiliários (CMVM). The stock market is part of the capital market. In these markets, it is important to jointly maximize resources (returns) and minimize costs (risks). The main purpose of stock exchanges is to provide an environment for trading companies' securities. Many investors have their own way of investing, according to their individual profile. Beyond the investor's profile, it is also pertinent to analyse the question of risk. Vaughan (1997) observes that, nowadays, risk management is present in everyone's life. This work aims to demonstrate the need for tools for selecting assets and for measuring the risk and return of investments in capital-market assets, by any type of investor, specifically in buying shares and assembling an investment portfolio. To this end, the Elton and Gruber method was used, and returns, risks and the Treynor and Sharpe performance indices were analysed. Statistical tests on the stock returns were performed to assess the randomness of the data. This work concludes that there may be advantages in using the Elton and Gruber method for investors inclined to use shares of socially responsible companies.
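The Treynor and Sharpe performance indices used in the analysis have standard definitions: Sharpe divides mean excess return by total volatility, Treynor by the portfolio's beta against the market. A sketch with made-up monthly return series; the thesis's actual data and the Elton and Gruber ranking step are not reproduced.

    import numpy as np

    def sharpe_ratio(returns, rf):
        """Mean excess return per unit of total risk (standard deviation)."""
        excess = returns - rf
        return excess.mean() / excess.std(ddof=1)

    def treynor_ratio(returns, market, rf):
        """Mean excess return per unit of systematic risk (beta)."""
        beta = np.cov(returns, market)[0, 1] / market.var(ddof=1)
        return (returns - rf).mean() / beta

    # Illustrative monthly returns for a portfolio and a market index.
    rng = np.random.default_rng(4)
    market = rng.normal(0.008, 0.04, 60)
    portfolio = 0.002 + 1.2 * market + rng.normal(0.0, 0.02, 60)
    rf = 0.003                         # assumed monthly risk-free rate

    print(f"Sharpe : {sharpe_ratio(portfolio, rf):.3f}")
    print(f"Treynor: {treynor_ratio(portfolio, market, rf):.4f}")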
Abstract:
ABSTRACT - Mainland Portugal, like other European countries, was affected by a heat wave of great intensity in the summer of 2003, with effects on population mortality. The excess deaths associated with the heat wave were estimated by comparing the number of deaths observed between 30 July and 15 August 2003 with the number of deaths expected had the population been exposed to the average mortality rates of the 2000-2001 biennium over the corresponding period. Expected deaths were calculated with adjustment for age. The number of observed deaths (O) exceeded the expected number (E) on every day of the study period, and the overall excess was estimated at 1953 deaths (a relative excess of 43%), of which 1317 (61%) occurred among women and 1742 (89%) in the 75-and-over age group. At district level, Portalegre had the largest relative increase in deaths (+89%) and Aveiro the smallest (+18%). In a contiguous inland area of the territory (Guarda, Castelo Branco, Portalegre and Évora), relative increases exceeded 80%. In absolute terms, the largest excesses occurred in the districts of Lisboa (about 396 additional deaths) and Porto (about 183). The causes of death 'heat stroke' and 'dehydration and other metabolic disorders' showed the highest relative increases (O/E ratios of 70 and 8.65, respectively). The largest absolute increases in deaths occurred in 'diseases of the circulatory system' (758 more), 'diseases of the respiratory system' (255 more) and 'all malignant neoplasms' combined (131 more). During the heat wave and in the comparison period, the percentages of deaths occurring in hospitals (52% and 56%), at home (32% and 33%) and in 'other places' were similar. Discussion of the factors affecting the values obtained, for excess deaths by sex, age group, district, cause and place of death, supports the conclusion that they are adequate for measuring the order of magnitude and characterizing the effect of the heat wave on mortality. Random error, measured by confidence intervals, and some possible systematic errors associated with the chosen comparison period should not materially affect the estimates.
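The excess-death computation is simple arithmetic on observed (O) and expected (E) counts. A sketch back-solving O and E from the reported figures (1953 excess deaths, +43% relative), with an approximate 95% interval treating O as Poisson; the normal approximation is an assumption and not the study's own interval method.

    import math

    excess, rel = 1953, 0.43           # figures reported in the abstract
    expected = excess / rel            # implied E, about 4542 deaths
    observed = expected + excess       # implied O, about 6495 deaths

    oe = observed / expected
    se = math.sqrt(observed) / expected    # Poisson normal approximation
    lo, hi = oe - 1.96 * se, oe + 1.96 * se
    print(f"O = {observed:.0f}, E = {expected:.0f}, "
          f"O/E = {oe:.2f} (approx. 95% CI {lo:.2f}-{hi:.2f})")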
Abstract:
Feature selection plays an important role in knowledge discovery and data mining nowadays. In traditional rough set theory, feature selection using a reduct, the minimal discerning set of attributes, is an important area. Nevertheless, the original definition of a reduct is restrictive, so previous research proposed taking into account not only the horizontal reduction of information by feature selection, but also a vertical reduction considering suitable subsets of the original set of objects. Following the work mentioned above, a new approach to generating bireducts using a multi-objective genetic algorithm was proposed. Although genetic algorithms were used to calculate reducts in some previous works, we did not find any work where genetic algorithms were adopted to calculate bireducts. Compared to earlier work in this area, the proposed method has less randomness in generating bireducts. The genetic algorithm system estimated the quality of each bireduct by the values of two objective functions as evolution progressed, so that a set of bireducts with optimized values of these objectives was obtained. Different fitness evaluation methods and genetic operators, such as crossover and mutation, were applied and the prediction accuracies were compared. Five datasets were used to test the proposed method and two datasets were used to perform a comparison study. Statistical analysis using the one-way ANOVA test was performed to determine the significance of differences between the results. The experiments showed that the proposed method was able to reduce the number of bireducts necessary to achieve good prediction accuracy. The influence of different genetic operators and fitness evaluation strategies on the prediction accuracy was also analyzed. It was shown that the prediction accuracies of the proposed method are comparable with the best results in the machine learning literature, and in some cases outperform them.
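A bireduct pairs a subset of attributes with a subset of objects that those attributes still discern consistently. A toy sketch of the two objectives such a multi-objective genetic algorithm could score on an attribute bit-mask chromosome: fewer attributes, more consistently covered objects. The decision table, the encoding and the tie-break rule are illustrative assumptions, not the paper's operators.

    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical decision table: 8 objects, 4 attributes, binary decision.
    X = rng.integers(0, 3, size=(8, 4))
    y = rng.integers(0, 2, size=8)

    def consistent_objects(attr_mask):
        """Drop one object from every pair that the chosen attributes fail
        to discern despite the pair having different decisions."""
        keep = np.ones(len(y), dtype=bool)
        cols = np.flatnonzero(attr_mask)
        for i in range(len(y)):
            for j in range(i + 1, len(y)):
                if (keep[i] and keep[j] and y[i] != y[j]
                        and np.all(X[i, cols] == X[j, cols])):
                    keep[j] = False      # arbitrary tie-break (assumed)
        return keep

    def fitness(chromosome):
        """Two objectives: few attributes and many covered objects."""
        kept = consistent_objects(chromosome)
        return (-int(chromosome.sum()), int(kept.sum()))

    mask = rng.integers(0, 2, size=4)    # one random chromosome
    print("attributes:", mask, "fitness:", fitness(mask))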
Abstract:
This paper presents a new theory of random consumer demand. The primitive is a collection of probability distributions, rather than a binary preference. Various assumptions constrain these distributions, including analogues of common assumptions about preferences such as transitivity, monotonicity and convexity. Two results establish a complete representation of theoretically consistent random demand. The purpose of this theory of random consumer demand is application to empirical consumer demand problems. To this end, the theory has several desirable properties. It is intrinsically stochastic, so the econometrician can apply it directly without adding extrinsic randomness in the form of residuals. Random demand is parsimoniously represented by a single function on the consumption set. Finally, we have a practical method for statistical inference based on the theory, described in McCausland (2004), a companion paper.
Abstract:
This thesis addresses the problems of option pricing and hedging in an exponential-Lévy model with regime switching. Such a model is built on a Markov additive process, much as the Black-Scholes model is based on Brownian motion. Because several sources of randomness are present, the market is incomplete, which renders inoperative the theoretical developments initiated by Black, Scholes and Merton in the complete-market setting. We show in this thesis that results from the theory of Markov additive processes provide solutions to the pricing and hedging problems. In particular, we characterize the martingale measure that minimizes the relative entropy with respect to the historical probability measure; we also derive explicitly, under certain conditions, the optimal portfolio that allows an agent to locally minimize the associated quadratic risk. Furthermore, from a more practical perspective, we characterize the price of a European option as the unique viscosity solution of a system of nonlinear integro-differential equations. This is a first step towards constructing numerical schemes to approximate that price.
Abstract:
Secret sharing schemes allow a secret to be shared among a group of participants so that only qualified subsets of participants can recover the secret. A visual cryptography scheme (VCS) is a special kind of secret sharing scheme in which the secret to be shared is an image and the shares are xeroxed transparencies which are stacked to recover the shared image. In this thesis we give the theoretical background of secret sharing schemes and the historical development of the subject. We include a few examples to improve the readability of the thesis, and we have tried to maintain the rigor of the treatment of the subject. The limitations and disadvantages of the various forms of secret sharing schemes are brought out. Several new schemes for both dealing and combining are included in the thesis. We introduce a new number system, called the POB number system, and present representation using it. Algorithms for finding the POB number and the POB value are given. We also prove that representation using the POB number system is unique and more efficient. Being a new system, there is much scope for further development in this area.
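The POB value function is closely related to the standard combinatorial number system, which ranks binary strings of fixed length and fixed number of ones by their one-positions. The sketch below implements that standard ranking and its inverse as an analogue; it is not claimed to be the thesis's exact POB definition, and the bit ordering is an assumption.

    from math import comb

    def rank(bits):
        """Rank of a binary string among all strings of equal length with
        the same number of ones (combinatorial number system)."""
        value, ones_seen = 0, 0
        for pos, b in enumerate(bits):   # pos 0 taken as least significant
            if b:
                ones_seen += 1
                value += comb(pos, ones_seen)
        return value

    def unrank(value, n, r):
        """Inverse: the n-bit string with r ones that has this rank."""
        bits = [0] * n
        for pos in range(n - 1, -1, -1):
            if r > 0 and value >= comb(pos, r):
                value -= comb(pos, r)
                bits[pos] = 1
                r -= 1
        return bits

    s = [1, 0, 1, 1, 0]                  # 5 bits with three ones
    v = rank(s)
    print(v, unrank(v, 5, 3))            # round-trips to the same string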