963 results for Key recovery attack
Abstract:
Bushfires are regular occurrences in the Australian landscape which can, under adverse weather conditions, give rise to losses of life, property, infrastructure, and environmental and cultural values. Where property loss is involved, historical surveys of house losses have focussed on ember, radiant heat and flame contact as the key bushfire attack mechanisms. Although often noted, little work has been done to quantify the impact of fire-generated or fire-enhanced wind and pyro-convective events on house loss, or to improve construction practice within Australia accordingly. It is well known that strong winds are always associated with bushfire events. It is less well known, although increasingly shown through anecdotal evidence, that bushfires are not a passive companion of wind: they interact with winds and can together cause significant damage to exposed buildings and ecological structures. Previous studies have revealed the effects of wind, fire and structure interactions, resulting in increased pressure-coefficient distributions on the windward side of a building downstream of a fire front. This paper presents a further analysis of that result in relation to the relevant standards and fire weather conditions. A review of the wind code and the bushfire code was conducted. Based on the results of the current study, the authors believe it is necessary to consider wind as an attack mechanism in bushfire events. The results of the study also have implications for bushfire emergency management, the design of emergency shelters, the perception of danger, emergency evacuation, and risk assessment.
Abstract:
We consider the problem of building robust fuzzy extractors, which allow two parties holding similar random variables W, W' to agree on a secret key R in the presence of an active adversary. Robust fuzzy extractors were defined by Dodis et al. in Crypto 2006 [6] to be noninteractive, i.e., only one message P, which can be modified by an unbounded adversary, passes from one party to the other. This allows them to be used by a single party at different points in time (e.g., for key recovery or biometric authentication), but also presents an additional challenge: what if R is used, and thus possibly observed by the adversary, before the adversary has a chance to modify P? Fuzzy extractors secure against such a strong attack are called post-application robust. We construct a fuzzy extractor with post-application robustness that extracts a shared secret key of up to (2m − n)/2 bits (depending on error-tolerance and security parameters), where n is the bit-length and m is the entropy of W. The previously best known result, also of Dodis et al. [6], extracted up to (2m − n)/3 bits (depending on the same parameters).
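As a quick illustration of the improvement, the two extraction bounds can be compared directly. The parameter values below are illustrative, not taken from the paper, and the error-tolerance and security-parameter deductions are ignored:

```python
def extractable_bits(n, m, construction="new"):
    """Upper bound on the extracted key length for a post-application
    robust fuzzy extractor, ignoring error-tolerance and security
    parameter deductions (illustrative values only)."""
    gap = 2 * m - n  # only the entropy in excess of n/2 is usable
    return gap / 2 if construction == "new" else gap / 3

# Example: a 1000-bit source W with 700 bits of entropy.
n, m = 1000, 700
print(extractable_bits(n, m, "new"))   # (2*700 - 1000)/2 = 200.0
print(extractable_bits(n, m, "old"))   # (2*700 - 1000)/3 ≈ 133.3
```

Both bounds vanish when m ≤ n/2, reflecting that robustness consumes half of the source length regardless of the construction.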
Abstract:
Traditionally, attacks on cryptographic algorithms looked for mathematical weaknesses in the underlying structure of a cipher. Side-channel attacks, however, seek to extract secret-key information from the leakage of the device on which the cipher is implemented, be it a smart-card, microprocessor, dedicated hardware or personal computer. Attacks based on power consumption, electromagnetic emanations and execution time have all been practically demonstrated on a range of devices to reveal partial secret-key information from which the full key can be reconstructed. The focus of this thesis is power analysis, more specifically a class of attacks known as profiling attacks. These attacks assume a potential attacker has access to, or can control, a device identical to the one under attack, allowing him to profile the power consumption of operations or data flow during encryption. This assumes a stronger adversary than traditional non-profiling attacks such as differential or correlation power analysis; however, the ability to model a device allows templates to be used post-profiling to extract key information from many different target devices using the power consumption of very few encryptions. This allows an adversary to overcome protocols intended to prevent secret-key recovery by restricting the number of available traces. In this thesis a detailed investigation of template attacks is conducted, examining how the selection of various attack parameters affects the efficiency of secret-key recovery in practice, as well as the underlying assumption of profiling attacks: that the power consumption of one device can be used to extract secret keys from another. Trace-only attacks, where the corresponding plaintext or ciphertext data is unavailable, are then investigated against both symmetric and asymmetric algorithms with the goal of key recovery from a single trace.
This allows an adversary to bypass many of the currently proposed countermeasures, particularly in the asymmetric domain. An investigation into machine-learning methods for side-channel analysis as an alternative to template or stochastic methods is also conducted, with support vector machines, logistic regression and neural networks investigated from a side-channel viewpoint. Both binary and multi-class classification attack scenarios are examined in order to explore the relative strengths of each algorithm. Finally, these machine-learning-based alternatives are empirically compared with template attacks, and their respective merits examined with regard to attack efficiency.
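The core profiling idea can be sketched in a few lines. The single leakage point, the Hamming-weight leakage model, and all parameter values below are simplifying assumptions for illustration, not the thesis's experimental setup:

```python
import math
import random
import statistics

def profile(rng, classes=range(9), samples=500, noise=0.3):
    """Profiling phase: on a device the attacker controls, estimate the
    mean and standard deviation of one leakage sample per class (here,
    the Hamming-weight classes of a single intermediate byte)."""
    templates = {}
    for c in classes:
        obs = [c + rng.gauss(0.0, noise) for _ in range(samples)]
        templates[c] = (statistics.mean(obs), statistics.stdev(obs))
    return templates

def match(sample, templates):
    """Attack phase: score a target-device sample against each Gaussian
    template and return the most likely class."""
    def log_likelihood(mu, sigma):
        return -math.log(sigma) - (sample - mu) ** 2 / (2 * sigma ** 2)
    return max(templates, key=lambda c: log_likelihood(*templates[c]))

rng = random.Random(1)
templates = profile(rng)
print(match(5.0, templates))  # a sample leaking Hamming weight 5 -> 5
```

A real attack would use many leakage points per template (multivariate Gaussians with covariance matrices) and map the recovered class back to key-byte hypotheses; the cross-device question studied in the thesis is whether `templates` built on one device still score correctly on another.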
Abstract:
Side-channel analysis of cryptographic systems can allow for the recovery of secret information by an adversary even where the underlying algorithms have been shown to be provably secure. This is achieved by exploiting the unintentional leakages inherent in the underlying implementation of the algorithm in software or hardware. Within this field of research, a class of attacks known as profiling attacks, or more specifically as used here template attacks, have been shown to be extremely efficient at extracting secret keys. Template attacks assume a strong adversarial model, in which an attacker has an identical device with which to profile the power consumption of various operations; this profile can then be used to attack the target device efficiently. Inherent in this assumption is that the power consumption across the devices under test is similar. This central tenet of the attack is largely unexplored in the literature, with the research community generally performing the profiling stage on the same device as is being attacked. This is beneficial for evaluation or penetration testing, as it is essentially the best-case scenario for an attacker, where the model built during the profiling stage matches that of the target device exactly; however, it is not necessarily a reflection of how the attack will work in reality. In this work, a large-scale evaluation of this assumption is performed, comparing the key-recovery performance of a profiling attack across 20 identical smart-cards.
Abstract:
Literally, the word compliance suggests conformity in fulfilling official requirements. The thesis presents the results of the analysis and design of a class of protocols called compliant cryptologic protocols (CCP), together with a notion of compliance in cryptosystems that is conducive as a cryptologic goal. CCP are employed in security systems used by at least two mutually mistrusting sets of entities. The individuals in these sets trust only the design of the security system and any trusted third party it may include. Such a security system can be thought of as a broker between the mistrusting sets of entities. In order to provide confidence in operation for the mistrusting sets of entities, CCP must provide compliance verification mechanisms. These mechanisms are employed either by all the entities or by a set of authorised entities in the system to verify that the behaviour of the various participating entities complies with the rules of the system. It is often stated that confidentiality, integrity and authentication are the primary interests of cryptology. It is evident from the literature that authentication mechanisms employ confidentiality and integrity services to achieve their goal. Therefore, the fundamental services that any cryptographic algorithm may provide are confidentiality and integrity only. Since controlling the behaviour of the entities is not a feasible cryptologic goal, the verification of the confidentiality of any data is a futile cryptologic exercise. For example, there exists no cryptologic mechanism that would prevent an entity from willingly or unwillingly exposing the private key corresponding to a certified public key. The confidentiality of the data can only be assumed. Therefore, any verification in cryptologic protocols must take the form of integrity verification mechanisms, and hence compliance verification must take the form of integrity verification.
A definition of compliance that is conducive as a cryptologic goal is presented as a guarantee on the confidentiality and integrity services. The definitions are employed to provide a classification mechanism for the various message formats in a cryptologic protocol. The classification assists in the characterisation of protocols, which in turn provides a focus for the goals of the research. The resulting concrete goal of the research is the study of those protocols that employ message formats to provide restricted confidentiality and universal integrity services to selected data. The thesis proposes an informal technique to understand, analyse and synthesise the integrity goals of a protocol system. The thesis contains a study of key recovery, electronic cash, peer-review, electronic auction, and electronic voting protocols. All these protocols contain message formats that provide restricted confidentiality and universal integrity services to selected data. The study of key recovery systems aims to achieve robust key recovery relying only on the certification procedure, without the need for tamper-resistant system modules. The result of this study is a new technique for the design of key recovery systems called hybrid key escrow. The thesis identifies a class of compliant cryptologic protocols called secure selection protocols (SSP). The uniqueness of this class of protocols is the similarity in the goals of the member protocols, namely peer-review, electronic auction and electronic voting. The problem statement describing the goals of these protocols contains a tuple (I, D), where I usually refers to the identity of a participant and D to the data selected by that participant. SSP aim to provide a confidentiality service to the tuple, hiding the relationship between I and D, and an integrity service to the tuple after its formation, preventing its modification.
The thesis provides a schema to solve instances of SSP by employing electronic cash technology. The thesis makes a distinction between electronic cash technology and electronic payment technology: it treats electronic cash technology as a certification mechanism that allows participants to obtain a certificate on their public key without revealing the certificate or the public key to the certifier. The thesis abstracts the certificate and the public key as a data structure called an anonymous token. It proposes design schemes for the peer-review, e-auction and e-voting protocols by employing the schema with the anonymous token abstraction. The thesis concludes with a variety of problem statements for future research that would further enrich the literature.
Abstract:
HIGHLIGHTS FOR FY 2003
1. Continued a 3-year threatened Gulf sturgeon population estimate in the Escambia River, Florida, and conducted presence-absence surveys in 4 other Florida river systems and 1 bay.
2. Five juvenile Gulf sturgeon collected near the mouth of the Choctawhatchee River, Florida, were equipped with sonic tags and monitored while over-wintering in Choctawhatchee Bay.
3. Continued to examine Gulf sturgeon marine habitat use.
4. Implemented the Gulf Striped Bass Restoration Plan by coordinating the 20th Annual Morone Workshop, leading the technical committee, transporting broodfish, and coordinating the stocking on the Apalachicola-Chattahoochee-Flint (ACF) river system.
5. Over 73,000 Phase II Gulf striped bass were marked with sequential coded wire tags and stocked in the Apalachicola River. Post-stocking evaluations were conducted at 31 sites.
6. Three stream fisheries assessments were completed to evaluate the fish community at sites slated for habitat restoration by the Partners for Fish and Wildlife Program (PFW).
7. The PFW program identified restoration needs and opportunities for 10 areas.
8. Developed an Unpaved Road Evaluation Handbook.
9. Completed restoration of the Chipola River Greenway, Seibenhener Streambank Restoration, Blackwater River State Forest, and Anderson Property.
10. Assessments of fluvial geomorphic conditions for design criteria were completed for 3 projects.
11. Geomorphology work in Florida streams initiated development of Rosgen regional curves for Northwest Florida for use by the Florida Department of Transportation.
12. Developed a Memorandum of Understanding between partners for enhancing, protecting, and restoring stream, wetland, and upland habitat in northwest Florida.
13. Completed aquatic fauna and fish surveys with new emphasis on integrating data from the reach level into watershed and landscape scales, and on keeping the database current.
14. Conducted compliance-based sampling of impaired waterbodies on Eglin Air Force Base in conjunction with the Florida Department of Environmental Protection to support Total Maximum Daily Load development.
15. Surveyed 20 sites for the federally endangered Okaloosa darter, provided habitat descriptions, and worked with partners to implement key recovery tasks and set priorities for restoration.
16. Worked with partners to develop a freshwater mussel survey protocol to provide standard operating procedures for establishing the presence/absence of federally listed mussel species within a Federal project area.
17. Created a GIS database to identify all known freshwater mussel records from the northeast Gulf ecosystem.
18. Completed a recovery plan for seven freshwater mussels and drafted a candidate elevation package for seven additional mussels. Developed proposals to implement the recovery plan.
19. Worked with the Corps of Engineers and State partners to develop improved reservoir operating policies to benefit both riverine and reservoir fisheries in the ACF river system.
20. Completed multiple outreach projects detailing aquatic resources conservation opportunities.
21. Initiated or completed multiple stream restoration and watershed management projects (see Appendix A).
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Lattices have been applied to cryptography in different ways. Initially used to break cryptosystems, they were later applied to the construction of new schemes, including asymmetric cryptosystems, blind signature schemes, and the first methods for fully homomorphic encryption. However, their performance is still prohibitively slow in many cases. In this work, we extend techniques originally developed for homomorphic encryption, making them more generic and applying them to the GGH-YK-M scheme, a public-key encryption scheme, and to the LMSV scheme, the only homomorphic construction that has not succumbed to IND-CCA1 key recovery attacks to date. In our tests, we reduced the key sizes of GGH-YK-M by one order of complexity, specifically from O(n² lg n) to O(n lg n), where n is a public parameter of the scheme. The new technique also achieves faster processing in all operations involved in an asymmetric cryptosystem, that is, key generation, encryption, and decryption. The most significant improvement is in key generation, which becomes more than 3 orders of magnitude faster than previous results, while encryption becomes about 2 orders of magnitude faster. For decryption, our implementation is ten times faster than the literature. We also show that it is possible to increase the security of the LMSV scheme against the quantum key recovery attacks recently published by the British agency GCHQ. This is done by adopting non-cyclotomic lattices based on almost-circulant irreducible polynomial rings. In our implementation, encryption performance is virtually identical, and decryption becomes slightly slower, a small price to pay for the increased security. Key generation, however, is much slower, owing to the need to use a more generic and costly method. The existence of highly efficient dedicated methods for key generation in this more secure variant of LMSV remains an open problem.
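To make the asymptotic key-size reduction concrete, the ratio of the two growth orders can be computed for sample parameters. Constant factors are ignored and the values of n are illustrative, not the scheme's recommended parameters:

```python
import math

def old_size(n):
    """Key size growth before the reduction: O(n^2 lg n), constants ignored."""
    return n * n * math.log2(n)

def new_size(n):
    """Key size growth after the reduction: O(n lg n), constants ignored."""
    return n * math.log2(n)

for n in (256, 1024, 4096):
    # The ratio old/new equals n: the saving itself grows with the parameter.
    print(n, old_size(n) / new_size(n))
```

This is why the abstract calls it a reduction by "one order of complexity" rather than a constant-factor speedup: the gap widens as n grows.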
Abstract:
Recent advances in electronic and computer technologies have led to the wide-spread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environment monitoring, and smart environments. Many WSNs, such as military deployments, have mission-critical tasks; thus, security issues in WSNs remain at the forefront of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated due to the constrained capabilities of sensor nodes and the properties of the deployment, such as large scale and hostile environments. Security issues mainly come from attacks. In general, attacks in WSNs can be classified as external attacks and internal attacks. In an external attack, the attacking node is not an authorized participant of the sensor network. Cryptography and other security methods can prevent some external attacks. However, node compromise, the major and unique problem that leads to internal attacks, can undermine all efforts to prevent attacks. Knowing the probability of node compromise helps systems detect and defend against it. Although there are some approaches that can be used to detect and defend against node compromise, few of them have the ability to estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform and intelligent gradient models for the distribution of node compromise, using probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise. Applying these models in system security designs can improve system security and decrease overheads in nearly every security area.
Moreover, based on these models, we design a novel secure routing algorithm to defend against the routing security issue posed by nodes that have already been compromised but have not yet been detected by the node-compromise detection mechanism. The routing paths in our algorithm detour around nodes that have already been detected as compromised or that have a high probability of being compromised. Simulation results show that our algorithm effectively protects routing paths from node compromise, whether detected or not.
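The shape of the two basic distribution models described above can be sketched as follows; the exponential decay and all parameter names are illustrative assumptions, not the paper's exact formulation:

```python
import math

def uniform_model(num_nodes, total_compromised):
    """Basic uniform model sketch: every node is equally likely to be
    compromised, e.g. an attacker picking targets at random."""
    return [total_compromised / num_nodes] * num_nodes

def gradient_model(distances, total_compromised, decay=0.5):
    """Basic gradient model sketch: compromise probability is highest
    near an attack origin and decays with distance from it."""
    weights = [math.exp(-decay * d) for d in distances]
    total = sum(weights)
    return [total_compromised * w / total for w in weights]

def safe_nodes(probs, threshold=0.2):
    """A routing layer can detour around nodes whose estimated
    compromise probability exceeds a threshold."""
    return [i for i, p in enumerate(probs) if p < threshold]

# Four nodes at increasing distance from the attack origin:
print(gradient_model([0, 1, 2, 3], 1.0))
```

The secure routing algorithm in the abstract would then prefer paths through the indices returned by `safe_nodes`, trading path length for a lower chance of traversing an undetected compromised node.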
Abstract:
Increasing the size of the training data in many computer vision tasks has been shown to be very effective. Using large-scale image datasets (e.g. ImageNet) with simple learning techniques (e.g. linear classifiers), one can achieve state-of-the-art performance in object recognition compared to sophisticated learning techniques on smaller image sets. Semantic search on visual data has become very popular. There are billions of images on the internet and the number is increasing every day. Dealing with large-scale image sets is demanding in itself: they take a significant amount of memory, which makes it impractical to process the images with complex algorithms on single-CPU machines. Finding an efficient image representation can be key to attacking this problem. Being efficient is not enough for image understanding, however; a representation should also be comprehensive and rich in carrying semantic information. In this proposal we develop an approach to computing binary codes that provide a rich and efficient image representation. We demonstrate several tasks in which binary features can be very effective. We show how binary features can speed up large-scale image classification. We present techniques to learn the binary features from a supervised image set (with different types of semantic supervision: class labels, textual descriptions). We propose several problems that are important in finding and using efficient image representations.
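A minimal sketch of why compact binary codes make large-scale search cheap: each image becomes a short bit-string, and similarity search reduces to Hamming distance, computable with a single XOR and a popcount. The codes below are hypothetical; the proposal's actual learning of the codes is not shown:

```python
def hamming(a, b):
    """Hamming distance between two binary codes stored as integers:
    XOR marks the differing bits, then count them."""
    return bin(a ^ b).count("1")

def nearest(query, codes):
    """Return the index of the stored code closest to the query."""
    return min(range(len(codes)), key=lambda i: hamming(query, codes[i]))

# Three hypothetical 8-bit image codes:
codes = [0b10110010, 0b10110011, 0b01001100]
print(nearest(0b10110110, codes))  # index 0: differs in only 1 bit
```

A 64-bit code per image means a billion images fit in 8 GB, and the distance computation is a handful of machine instructions, which is what makes linear scans over web-scale collections feasible.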
Abstract:
Side channel attacks permit the recovery of the secret key held within a cryptographic device. This paper presents a new EM attack in the frequency domain, using a power spectral density analysis that permits the use of variable spectral window widths for each trace of the data set, and demonstrates how this attack can therefore overcome both inter- and intra-round random-insertion countermeasures. We also propose a novel re-alignment method exploiting the minimal-power markers exhibited by electromagnetic emanations. The technique can be used for the extraction and re-alignment of round data in the time domain.
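The reason a frequency-domain analysis tolerates random-insertion (trace misalignment) countermeasures is that the magnitude spectrum of a trace is invariant under a circular time shift, so key-dependent spectral features survive misalignment. A minimal sketch of this property, assuming a toy trace and a naive DFT (not the paper's implementation):

```python
import cmath

def psd(x):
    """Power spectral density via a naive DFT: |X_k|^2 / n for each bin."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n)]

trace = [0.1, 0.9, 0.4, 0.2, 0.8, 0.3, 0.5, 0.6]   # toy power trace
shifted = trace[3:] + trace[:3]                     # misaligned copy

a, b = psd(trace), psd(shifted)
print(all(abs(p - q) < 1e-9 for p, q in zip(a, b)))  # True: PSD unchanged
```

A circular shift multiplies each DFT coefficient by a unit-magnitude phase factor, which the squared magnitude discards; an attack that correlates on PSD bins therefore needs no trace re-alignment at all.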
Abstract:
CONTEXT Enhanced Recovery after Surgery (ERAS) programs are multimodal care pathways that aim to decrease intra-operative blood loss, decrease postoperative complications, and reduce recovery times. OBJECTIVE To provide an overview of the use and key elements of ERAS pathways, and to define needs for future clinical trials. EVIDENCE ACQUISITION A comprehensive systematic MEDLINE search was performed for English-language reports published before May 2015 using the terms "postoperative period," "postoperative care," "enhanced recovery after surgery," "enhanced recovery," "accelerated recovery," "fast track recovery," "recovery program," "recovery pathway," "ERAS," and "urology" or "cystectomy" or "urologic surgery." EVIDENCE SYNTHESIS We identified 18 eligible articles. Patient counseling, physical conditioning, avoiding excessive alcohol and smoking, and good nutrition appeared to protect against postoperative complications. Fasting from solid food for only 6 h and perioperative liquid-carbohydrate loading up to 2 h prior to surgery appeared to be safe and reduced recovery times. Restricted, balanced, and goal-directed fluid replacement is effective when individualized, depending on patient morbidity and surgical procedure. Decreased intraoperative blood loss may be achieved by several measures. Deep vein thrombosis prophylaxis, antibiotic prophylaxis, and thermoregulation were found to help reduce postsurgical complications, as was a multimodal approach to postoperative nausea, vomiting, and analgesia. Chewing gum, prokinetic agents, oral laxatives, and an early resumption of a normal diet appear to aid a faster return to normal bowel function. Further studies should compare anesthetic protocols, refine analgesia, and evaluate the importance of robot-assisted surgery and the need for, and timing of, drains and catheters. CONCLUSIONS ERAS regimens are multidisciplinary, multimodal pathways that optimize postoperative recovery.
PATIENT SUMMARY This review provides an overview of the use and key elements of Enhanced Recovery after Surgery programs, which are multimodal, multidisciplinary care pathways that aim to optimize postoperative recovery. Additional conclusions include identifying effective procedures within Enhanced Recovery after Surgery programs and defining needs for future clinical trials.
Abstract:
With the advent of cloud computing, many applications have embraced the ensuing paradigm shift towards modern distributed key-value data stores, like HBase, in order to benefit from the elastic scalability on offer. However, many applications still hesitate to make the leap from the traditional relational database model simply because they cannot compromise on the standard transactional guarantees of atomicity, isolation, and durability. To get the best of both worlds, one option is to integrate an independent transaction management component with a distributed key-value store. In this paper, we discuss the implications of this approach for durability. In particular, if the transaction manager provides durability (e.g., through logging), then we can relax durability constraints in the key-value store. However, if a component fails (e.g., a client or a key-value server), then we need a coordinated recovery procedure to ensure that commits are persisted correctly. In our research, we integrate an independent transaction manager with HBase. Our main contribution is a failure recovery middleware for the integrated system, which tracks the progress of each commit as it is flushed down by the client and persisted within HBase, so that we can recover reliably from failures. During recovery, commits that were interrupted by the failure are replayed from the transaction management log. Importantly, the recovery process does not interrupt transaction processing on the available servers. Using a benchmark, we evaluate the impact of component failure, and subsequent recovery, on application performance.
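The coordinated-recovery idea can be sketched as a log-then-replay protocol: the transaction manager's log is the durability point, and recovery replays any commit that was logged but never fully persisted in the key-value store. All class and method names below are illustrative, not the paper's middleware API:

```python
class RecoverableStore:
    """Toy model of a key-value store paired with a durable commit log."""

    def __init__(self):
        self.log = []        # durable transaction-manager log: (txn_id, writes)
        self.store = {}      # key-value store with relaxed durability
        self.applied = set() # txn ids whose writes reached the store

    def commit(self, txn_id, writes, crash_before_persist=False):
        self.log.append((txn_id, dict(writes)))  # durability point: log first
        if crash_before_persist:
            return                                # simulate a component failure
        self.store.update(writes)
        self.applied.add(txn_id)

    def recover(self):
        """Replay logged commits that never reached the store; commits that
        were already applied are skipped, so replay is idempotent."""
        for txn_id, writes in self.log:
            if txn_id not in self.applied:
                self.store.update(writes)
                self.applied.add(txn_id)

db = RecoverableStore()
db.commit("t1", {"a": 1})
db.commit("t2", {"b": 2}, crash_before_persist=True)  # interrupted commit
db.recover()
print(db.store)  # {'a': 1, 'b': 2}
```

The paper's middleware does this tracking per commit as it is flushed into HBase, and crucially runs recovery without pausing transaction processing on the surviving servers, which the toy model above does not capture.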