923 results for Key Block Theory


Relevance:

100.00%

Publisher:

Abstract:

Brazil has numerous structures and civil works that have been in operation for decades and, given their importance, require periodic monitoring. For this reason, this dissertation presents the case of an old tunnel affected by block falls, aiming to encourage new research and broaden knowledge on the subject. Field inspections were carried out in several unlined tunnels of the Estrada de Ferro Vitória-Minas (EFVM) railway, together with laboratory and in situ tests on rock samples and on the rock mass to characterize the problem. The Monte Seco tunnel, Lines 1 and 2, was selected for the study; inclined, oriented rotary boreholes were drilled close to the tunnel axis to investigate the discontinuity planes. Key Block Theory concepts were applied to the discontinuity sets identified in the Monte Seco L1 and L2 tunnels to identify the potentially unstable blocks formed by the excavations. Geotechnical strength and deformability parameters were obtained from uniaxial compression tests instrumented with strain gauges, and tensile strength was determined by the diametral compression (Brazilian) test. In the field, the Schmidt hammer was used to assess the rock in situ. The data analysis made it possible to distinguish the sectors where block falls are most frequent and to classify the rock mass according to Bieniawski's proposal.
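As a minimal illustration of how the laboratory results mentioned above are typically reduced, the sketch below computes the indirect tensile strength from a diametral compression (Brazilian) test using the standard relation σt = 2P/(πDt); the specimen dimensions and failure load are hypothetical, not values from the dissertation.

```python
import math

def brazilian_tensile_strength(peak_load_n: float, diameter_m: float, thickness_m: float) -> float:
    """Indirect tensile strength (Pa) from a diametral compression (Brazilian) test.

    Uses the standard relation sigma_t = 2P / (pi * D * t).
    """
    return 2.0 * peak_load_n / (math.pi * diameter_m * thickness_m)

# Hypothetical specimen: 54 mm diameter core, 27 mm thick, failing at 25 kN.
sigma_t = brazilian_tensile_strength(peak_load_n=25e3, diameter_m=0.054, thickness_m=0.027)
print(f"Indirect tensile strength: {sigma_t / 1e6:.1f} MPa")
```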

Relevance:

90.00%

Publisher:

Abstract:

Block theory is an effective method for the stability analysis of fractured, rigid rock masses. Numerous discontinuity planes are developed in the rock mass of the Jinping II hydropower station conveyor tunnel, so the stability of the tunnel depends on whether unstable blocks form on the excavation surfaces. This paper analyses the stability of the conveyor tunnel using the stereo-analytical method of block theory, based on a detailed investigation of the rock mass data, and identifies the sliding modes of the movable blocks produced by the rock discontinuity planes and each excavation surface of the tunnel. The analysis shows that single-plane sliding is the dominant mode, with relatively few cases of double-plane sliding and vertical block falls. The statistical distribution of movable blocks in the conveyor tunnel also indicates that slightly more unstable blocks occur in the left wall and in the left and right arches than in the right wall. A stochastic probability model is then incorporated into block theory to study the sliding probability of key blocks, based on the detailed rock mass data and the development of the discontinuity planes at the Jinping II conveyor tunnel. The following conclusions are obtained: the instability probability of a key block is inversely related to joint trace length; key blocks formed by primary joints 1-3 m long have a relatively high instability probability; key blocks containing joint set J2 are relatively stable; and reinforcement of the arch is crucial for the conveyor tunnel. These results support effective reinforcement design and have significant engineering value.
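The sliding-mode results above rest on limit-equilibrium checks of individual removable blocks. As a generic illustration only (not the paper's stereo-analytical implementation), the sketch below evaluates the factor of safety for the single-plane sliding case using the standard relation FS = (cA + W cosψ tanφ) / (W sinψ); all block and joint parameters are hypothetical.

```python
import math

def planar_sliding_fos(weight_kn: float, dip_deg: float,
                       friction_deg: float, cohesion_kpa: float, area_m2: float) -> float:
    """Factor of safety for a block sliding on a single joint plane.

    FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)), with W in kN, c in kPa, A in m^2.
    """
    psi = math.radians(dip_deg)
    phi = math.radians(friction_deg)
    resisting = cohesion_kpa * area_m2 + weight_kn * math.cos(psi) * math.tan(phi)
    driving = weight_kn * math.sin(psi)
    return resisting / driving

# Hypothetical key block: 500 kN resting on a joint dipping 42 deg,
# friction angle 35 deg, cohesion 20 kPa over a 4 m^2 sliding surface.
print(f"FS = {planar_sliding_fos(500.0, 42.0, 35.0, 20.0, 4.0):.2f}")
```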

Relevance:

90.00%

Publisher:

Abstract:

A zone-based systems design framework is described and used to implement a message authentication code (MAC) algorithm based on symmetric-key block ciphers. The resulting block-cipher-based MAC algorithm may be used to provide assurance of the authenticity, and hence the integrity, of binary data. Using software simulation to benchmark against the de facto cipher block chaining MAC (CBC-MAC) variant used in the TinySec security protocol for wireless sensor networks and against CMAC, the NIST cipher-based MAC standard, we show that our zone-based systems design framework can lead to block-cipher-based MAC constructs with improved message processing efficiency, throughput and latency.
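For reference, the baseline construction the abstract benchmarks against is classic CBC-MAC. The sketch below is a minimal generic CBC-MAC over AES (using the pycryptodome package), not the paper's zone-based construct; the key and message are placeholders.

```python
# Minimal CBC-MAC over AES, assuming the pycryptodome package (pip install pycryptodome).
from Crypto.Cipher import AES

BLOCK = 16  # AES block size in bytes

def cbc_mac(key: bytes, message: bytes) -> bytes:
    """Classic CBC-MAC with zero IV and zero padding.

    Note: plain CBC-MAC is only secure for fixed-length messages; CMAC adds
    subkey tweaks on the final block to handle variable-length input.
    """
    ecb = AES.new(key, AES.MODE_ECB)
    padded = message + b"\x00" * (-len(message) % BLOCK)  # zero-pad to a block multiple
    state = bytes(BLOCK)                                   # zero IV
    for i in range(0, len(padded), BLOCK):
        block = padded[i:i + BLOCK]
        state = ecb.encrypt(bytes(a ^ b for a, b in zip(state, block)))
    return state  # the final chaining value is the tag

tag = cbc_mac(b"\x00" * 16, b"authenticate this payload")
print(tag.hex())
```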

Relevance:

90.00%

Publisher:

Abstract:

In recent years, the protection of information in digital form has become increasingly important. Image and video encryption has applications in various fields, including Internet communications, multimedia systems, medical imaging, telemedicine and military communications. Both in storage and in transmission, multimedia information is exposed to unauthorized entities unless adequate security measures are built around the information system. Many kinds of security threats arise during the transmission of vital classified information through insecure communication channels, and various encryption schemes are available today to deal with information security issues. Data encryption is widely used to protect sensitive data against threats to confidentiality. Secure transmission of information through insecure channels requires encryption at the sending side and decryption at the receiving side. Encrypting large text messages and images takes time before they can be transmitted, causing considerable delay in the successive transmission of information in real time. To minimize this latency, efficient encryption algorithms are needed: multimedia encryption applications call for procedures with adequate security and high throughput. Traditional symmetric-key block ciphers such as the Data Encryption Standard (DES), the Advanced Encryption Standard (AES) and the Escrowed Encryption Standard (EES) are not efficient when the data size is large; with fast computing tools and communication networks now available at relatively low cost, these standards appear slower than one would like. High-throughput encryption and decryption are becoming increasingly important in high-speed networking, and fast encryption algorithms are needed for high-speed secure communication of multimedia data. It has been shown that public-key algorithms are not a substitute for symmetric-key algorithms: public-key algorithms are slow, whereas symmetric-key algorithms generally run much faster, and public-key systems are vulnerable to chosen-plaintext attack. In this research work, a fast symmetric-key encryption scheme entitled Matrix Array Symmetric Key (MASK) encryption, based on matrix and array manipulations, has been conceived and developed. Fast conversion is achieved through the matrix table look-up substitution, array-based transposition and circular shift operations performed in the algorithm. MASK encryption is a new concept in symmetric-key cryptography: it employs matrix and array manipulation techniques using secret information and data values. It is a block cipher that operates on plaintext message (or image) blocks of 128 bits, using a secret key of 128 bits, and produces ciphertext message (or cipher image) blocks of the same size. This cipher has two advantages over traditional ciphers. First, the encryption and decryption procedures are much simpler and, consequently, much faster. Second, the key avalanche effect produced in the ciphertext output is better than that of AES.
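The closing claim concerns the key avalanche effect, i.e. how many ciphertext bits change when a single key bit is flipped. MASK itself is not reproduced here; as a hedged illustration of how such a measurement is typically made, the sketch below measures key avalanche for a single block using AES (via pycryptodome) purely as a stand-in cipher.

```python
# Measure the key avalanche effect: flip one key bit and count how many
# ciphertext bits change for the same plaintext block. AES is used here only
# as a stand-in cipher; the MASK algorithm itself is not shown.
import os
from Crypto.Cipher import AES

def encrypt_block(key: bytes, block: bytes) -> bytes:
    return AES.new(key, AES.MODE_ECB).encrypt(block)

def bit_difference(a: bytes, b: bytes) -> int:
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

key = os.urandom(16)
plaintext = os.urandom(16)
baseline = encrypt_block(key, plaintext)

# Flip the lowest bit of the first key byte and re-encrypt the same block.
flipped_key = bytes([key[0] ^ 0x01]) + key[1:]
changed = bit_difference(baseline, encrypt_block(flipped_key, plaintext))

print(f"{changed} of 128 ciphertext bits changed after a 1-bit key flip")
```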

Relevance:

90.00%

Publisher:

Abstract:

Systems based on OFDM (Orthogonal Frequency Division Multiplexing) are an evolution of traditional FDM (Frequency Division Multiplexing) systems that achieves better use of the available bandwidth. OFDM systems and their variants currently play a very important role in communications, being implemented in standards such as DVB-T (digital terrestrial television), ADSL, LTE, WiMAX and DAB (digital radio). For this reason, this project implements an OFDM system in which various simulations can be run to better understand how it works, using Matlab as the simulation tool. The main objectives of the simulation are to test the use of turbo codes (compared with traditional convolutional codes) and of an equalizer, with the intention of improving the quality of the system (receiving fewer erroneous bits) under increasingly adverse conditions: low signal-to-noise ratios and multipath propagation. To this end, the necessary Matlab functions have been implemented, together with a graphical user interface that makes the program easier to use and more didactic. The second and third chapters of this project study the fundamentals of OFDM systems: the second focuses on a purely theoretical study, while the third concentrates on the theory behind the blocks implemented in the OFDM system developed in this project. The fourth chapter explains the options available through the implemented interface and provides a manual for its correct use. The fifth chapter is divided into two parts: the first shows the representations the program can produce, and the second presents simulations to check how the system responds to different channel configurations and to different configurations of the system itself (using one coding scheme or another, using the equalizer or the cyclic prefix, etc.). Finally, the last chapter presents the conclusions of this project and possible lines of work for future versions.
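To make the transmit chain described above concrete, the sketch below is a minimal numpy model (in Python rather than the project's Matlab) of OFDM modulation: QPSK mapping, a per-symbol IFFT and cyclic prefix insertion. The FFT size and prefix length are hypothetical, not the project's parameters.

```python
# Minimal OFDM transmit-side sketch in numpy: map bits to QPSK, apply an IFFT
# per symbol and prepend a cyclic prefix.
import numpy as np

N_SUBCARRIERS = 64   # hypothetical FFT size
CP_LEN = 16          # hypothetical cyclic prefix length

def qpsk_map(bits: np.ndarray) -> np.ndarray:
    """Map pairs of bits to unit-energy QPSK symbols."""
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def ofdm_modulate(bits: np.ndarray) -> np.ndarray:
    symbols = qpsk_map(bits).reshape(-1, N_SUBCARRIERS)
    time_domain = np.fft.ifft(symbols, axis=1)                    # one OFDM symbol per row
    with_cp = np.hstack([time_domain[:, -CP_LEN:], time_domain])  # prepend the cyclic prefix
    return with_cp.ravel()

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N_SUBCARRIERS * 10)  # 10 OFDM symbols' worth of bits
tx = ofdm_modulate(bits)
print(tx.shape)  # (10 * (64 + 16),) = (800,)
```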

Relevance:

80.00%

Publisher:

Abstract:

Purpose The purpose of this work was to explore how men and women construct their experiences living with lymphoedema following treatment for any cancer in the context of everyday life. Methods The design and conduct of this qualitative study was guided by Charmaz’ social constructivist grounded theory. To collect data, focus groups and telephone interviews were conducted. Audiotapes were transcribed verbatim and imported into NVivo8 to organise data and codes. Data were analysed using key grounded theory principles of constant comparison, data saturation and initial, focused and theoretical coding. Results Participants were 3 men and 26 women who had developed upper- or lower-limb lymphoedema following cancer treatment. Three conceptual categories were developed during data analysis and were labelled ‘accidental journey’, ‘altered normalcy’ and ‘ebb and flow of control’. ‘Altered normalcy’ reflects the physical and psychosocial consequences of lymphoedema and its relationship to everyday life. ‘Accidental journey’ explains the participants’ experiences with the health care system, including the prevention, treatment and management of their lymphoedema. ‘Ebb and flow of control’ draws upon a range of individual and social elements that influenced the participants’ perceived control over lymphoedema. These conceptual categories were inter-related and contributed to the core category of ‘sense of self’, which describes their perceptions of their identity and roles. Conclusions Results highlight the need for greater clinical and public awareness of lymphoedema as a chronic condition requiring prevention and treatment, and one that has far-reaching effects on physical and psychosocial well-being as well as overall quality of life.

Relevance:

80.00%

Publisher:

Abstract:

For high-stress, large-span underground tunnels, a method named DEFLAC is proposed in this paper. DEFLAC uses disturbing energy as its stability criterion and is built on the FLAC simulation software. The method is applied to the underground powerhouse project of the Jinping First-level Hydropower Station with good results, and the following conclusions are drawn. (1) Based on the geological features of the excavation unloading phenomena, these phenomena are grouped into three types, and three corresponding mechanical modes are proposed to explain them. (2) The relationship between the GB50287-99 (T) and BQ classification standards is studied, together with the main differences between them when they are applied in high-stress zones. (3) The DEFLAC method, which combines the disturbing-energy method with the FLAC software, is formulated, and its two-dimensional and three-dimensional explicit finite-difference expressions are derived in this paper. (4) Comparing the instability zones obtained by FLAC and by DEFLAC with the measured results shows that DEFLAC gives the more accurate result. (5) For the first-layer excavation of the main powerhouse, a procedure for searching for and analysing unstable blocks is studied; comparing the results obtained by FLAC and DEFLAC shows that DEFLAC can judge the stability of blocks bounded by intermittent joints, which Block Theory cannot, so DEFLAC yields a more accurate count of unstable blocks. It is an effective method for judging block stability.
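The DEFLAC formulation itself is not reproducible from the abstract; as a generic illustration of the explicit finite-difference time stepping that FLAC-style codes rely on, the sketch below advances the 1D wave equation u_tt = c² u_xx with an explicit leapfrog update. All parameters are illustrative assumptions.

```python
# Generic explicit finite-difference time stepping for the 1D wave equation
# (not the DEFLAC formulation): interior nodes are updated from values that
# are already known, so no system of equations is solved at each step.
import numpy as np

c, dx, dt = 1.0, 0.01, 0.005                 # wave speed and grid steps (CFL = c*dt/dx = 0.5)
x = np.arange(0.0, 1.0 + dx, dx)
u_prev = np.exp(-200.0 * (x - 0.5) ** 2)     # initial Gaussian displacement pulse
u = u_prev.copy()                            # previous level equals current (simple zero-velocity start)

r2 = (c * dt / dx) ** 2
for _ in range(200):
    u_next = np.zeros_like(u)                # ends stay at zero: fixed boundaries
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next

print(f"max |displacement| after 200 steps: {np.abs(u).max():.3f}")
```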

Relevance:

80.00%

Publisher:

Abstract:

This article reflects on the contribution that the social communication media can make to the public debate on environmental concerns. Agenda 21 and several other UN documents call for the need to inform and educate society. Communication theories, for their part, share the imperative that the emitter be understood by the recipient, so that recipients can become agents of change and not merely observers. The first step is therefore to study the environmental question, focus the theme adequately and convey clarification. This is not what happened, for instance, with some technical concepts from the field, in particular sustainability, which emerged in the 1970s to guide public policies in the service of preserving life and, especially, the future, and which is today used even as a parameter for preserving the profit and advantage derived from exploiting nature. Keywords: Theory of Communication, Environmental Sustainability, MCM.

Relevance:

80.00%

Publisher:

Abstract:

Today, portable devices have become the driving force of the consumer market, and new challenges are emerging to increase their performance while maintaining a reasonable battery life. The digital domain is the best solution for implementing signal-processing functions, thanks to the scalability of CMOS technology, which pushes integration toward the sub-micrometre level. Indeed, the reduction of the supply voltage introduces severe limitations on achieving an acceptable dynamic range in the analogue domain. Lower cost, lower power consumption, higher yield and greater reconfigurability are the main advantages of processing signals in the digital domain. For more than a decade, several purely analogue functions have been moved into the digital domain. This means that analogue-to-digital converters (ADCs) are becoming the key components in many electronic systems: they are the bridge between the digital and analogue worlds, and their efficiency and accuracy therefore often determine the overall performance of the system. Sigma-Delta converters are the key building block used as the interface in high-resolution, low-power mixed-signal circuits. Modelling and simulation tools are effective and essential instruments in the design flow. Although transistor-level simulations give more precise and accurate results, this approach is extremely time-consuming because of the oversampling nature of this type of converter. For this reason, high-level behavioural models of the modulator are essential for the designer to run fast simulations that identify the specifications the converter needs in order to achieve the required performance. The objective of this thesis is to model the behaviour of the Sigma-Delta modulator, taking into account several non-idealities such as the integrator dynamics and its thermal noise. Transistor-level simulation results and experimental data show that the proposed model is precise and accurate with respect to the behavioural simulations.
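As a minimal sketch of the kind of behavioural model the thesis describes, the code below simulates a first-order discrete-time Sigma-Delta modulator with two example non-idealities: integrator leakage and input-referred thermal noise. The modulator order and all parameter values are illustrative assumptions, not those of the thesis.

```python
# First-order discrete-time Sigma-Delta modulator behavioural model with a
# leaky integrator and input-referred thermal noise (illustrative only).
import numpy as np

def sigma_delta_1st_order(x, leak=0.999, noise_std=1e-4, seed=0):
    """Return the 1-bit output stream (+1/-1) for input samples x in [-1, 1]."""
    rng = np.random.default_rng(seed)
    y = np.empty_like(x)
    integ, fb = 0.0, 0.0                       # integrator state and previous feedback
    for n, xn in enumerate(x):
        vin = xn + rng.normal(0.0, noise_std)  # input-referred thermal noise
        integ = leak * integ + (vin - fb)      # leaky discrete-time integrator
        y[n] = 1.0 if integ >= 0.0 else -1.0   # 1-bit quantizer
        fb = y[n]                              # DAC feedback
    return y

n = 1 << 14
t = np.arange(n)
x = 0.5 * np.sin(2 * np.pi * t / 1024)         # slow input tone, well below the sample rate
bits = sigma_delta_1st_order(x)
print("output bit mean tracks input mean:", round(bits.mean(), 4), round(x.mean(), 4))
```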

Relevance:

40.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to explain variations in the discretionary information shared between buyers and key suppliers. The paper also aims to examine how the extent of information shared affects buyers' performance in terms of resource usage, output and flexibility. Design/methodology/approach: The data for the paper comprise 221 Finnish and Swedish non-service companies obtained through a mail survey. The hypothesized relationships were tested using partial least squares modelling with reflective and formative constructs. Findings: The results of the study suggest that (environmental and demand) uncertainty and interdependency can to some degree explain the extent of information shared between a buyer and a key supplier. Furthermore, information sharing improves buyers' performance with respect to resource usage, output and flexibility. Research limitations/implications: A limitation of the paper relates to the data, which only included buyers; a better approach would have been to collect data from both buyers and key suppliers. Practical implications: Companies face a wide range of supply chain solutions that enable and encourage collaboration across organizations. This paper suggests a more selective and balanced approach toward adopting the solutions offered, as the benefits are contingent on a number of factors such as uncertainty. Also, the risks of information sharing are far too high for a one-size-fits-all approach. Originality/value: The paper illustrates the applicability of transaction cost theory to the contemporary era of e-commerce. With this finding, transaction cost economics can provide a valuable lens through which to view and interpret interorganizational information sharing, a topic that has received much attention in recent years.