954 results for multi-factor authentication


Relevance: 20.00%

Abstract:

We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In such a model, packet overhead and computational efficiency are two parameters to be taken into account when designing a multicast stream protocol. In this paper, we propose to use two families of erasure codes to deal with this problem, namely rateless codes and maximum distance separable codes. Our constructions have the following advantages. First, the packet overhead is small. Second, the number of signature verifications to be performed at the receiver is O(1). Third, every receiver is able to recover all the original data packets emitted by the sender despite the losses and injections that occurred during transmission.
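The recovery principle behind such codes can be illustrated with the simplest maximum distance separable construction, a single-parity (k+1, k) code that survives one erasure. This is a minimal sketch only; the paper's actual rateless and MDS constructions are far stronger and are not reproduced here.

```python
# Illustrative sketch: a single-parity (k+1, k) erasure code, the simplest
# MDS code. It tolerates exactly one lost packet; real multicast schemes
# use stronger rateless/MDS codes.

def encode(packets):
    """Append one XOR parity packet to k equal-length data packets."""
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return packets + [parity]

def recover(received):
    """Recover a single lost packet (marked None) by XOR-ing the rest."""
    missing = received.index(None)
    fill = bytes(len(next(p for p in received if p is not None)))
    for p in received:
        if p is not None:
            fill = bytes(a ^ b for a, b in zip(fill, p))
    received[missing] = fill
    return received[:-1]  # drop the parity packet

data = [b"pkt1", b"pkt2", b"pkt3"]
coded = encode(data)
coded[1] = None            # simulate one dropped packet in transit
assert recover(coded) == data
```

A full MDS code such as Reed-Solomon generalises this idea, letting any k of n coded packets reconstruct the original k.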

Relevance: 20.00%

Abstract:

A set of resistance-type strain sensors has been fabricated from metal-coated carbon nanofiller (CNF)/epoxy composites. Two nanofillers, i.e., multi-walled carbon nanotubes and vapor-grown carbon fibers (VGCFs), with nickel, copper and silver coatings, were used. Ultrahigh strain sensitivity was observed in these novel sensors compared with sensors made from CNFs without metal coating and with conventional strain gauges. The gauge factor of the sensor made of silver-coated VGCFs is estimated to be 155, around 80 times higher than that of a metal-foil strain gauge. The possible mechanism responsible for the high sensitivity, and its dependence on the networks of the CNFs with and without metal coating and on the geometries of the CNFs, were thoroughly investigated.
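The gauge factor quoted above is the standard figure of merit relating fractional resistance change to applied strain, GF = (ΔR/R₀)/ε. A minimal sketch, with hypothetical resistance values chosen to reproduce a GF of 155:

```python
# Gauge factor: fractional resistance change per unit strain.
#     GF = ((R - R0) / R0) / strain
def gauge_factor(r0, r, strain):
    return ((r - r0) / r0) / strain

# Hypothetical numbers: a sensor with GF ~ 155 changes resistance by
# ~15.5% at 0.1% strain; a metal-foil gauge (GF ~ 2) changes by ~0.2%.
gf = gauge_factor(r0=100.0, r=115.5, strain=0.001)
print(round(gf, 1))  # 155.0
```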

Relevance: 20.00%

Abstract:

In this paper, we propose a new multi-class steganalysis for binary images. The proposed method can identify the type of steganographic technique used by examining the given binary image. In addition, it is also capable of differentiating an image carrying a hidden message from one without. To do so, we extract a set of features from the binary image; the feature extraction method combines methods extended from our previous work with new methods proposed in this paper. Based on the extracted feature sets, we construct our multi-class steganalysis using an SVM classifier. We also present empirical results demonstrating that the proposed method can effectively identify five different types of steganography.
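The classification stage can be sketched with scikit-learn's multi-class SVM. The feature vectors below are synthetic placeholders, not the paper's binary-image features, and the six-class setup (five steganographic methods plus a "clean" class) is an assumption inferred from the abstract.

```python
# Sketch of the multi-class SVM classification stage only; the paper's
# binary-image feature extraction is not reproduced. Features here are
# synthetic, well-separated Gaussian clusters.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
classes = 6                      # 5 steganographic methods + "no hidden message"
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(40, 8)) for c in range(classes)])
y = np.repeat(np.arange(classes), 40)

clf = SVC(kernel="rbf")          # SVC handles multi-class via one-vs-one voting
clf.fit(X, y)
print(clf.predict(np.full((1, 8), 2.0)))  # a point at the class-2 centroid
```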

Relevance: 20.00%

Abstract:

Recently, a new human authentication scheme called PAS (predicate-based authentication service) was proposed, which does not require the assistance of any supplementary device. The main security claim of PAS is resistance to passive adversaries who can observe the whole authentication session between the human user and the remote server. In this paper we show that PAS is insecure against both a brute-force attack and a probabilistic attack. In particular, we show that its security against brute-force attack was strongly overestimated. Furthermore, we introduce a probabilistic attack that can break part of the password even with a very small number of observed authentication sessions. Although the proposed attack cannot completely break the password, it downgrades the PAS system to a much weaker system similar to common OTP (one-time password) systems.

Relevance: 20.00%

Abstract:

The 21st century will see monumental change. Either the human race will use its knowledge and skills and change the way it interacts with the environment, or the environment will change the way it interacts with its inhabitants. In the first case, the focus of this book, we would see our sophisticated understanding in areas such as physics, chemistry, engineering, biology, planning, commerce, business and governance, accumulated over the last 1,000 years, brought to bear on the challenge of dramatically reducing our pressure on the environment. The second case is the opposite scenario, involving the decline of the planet's ecosystems until they reach thresholds beyond which recovery is not possible, and after which we have no idea what happens. For instance, if we fail to respond to Sir Nicholas Stern's call to meet appropriate stabilisation trajectories for greenhouse gas emissions, and we allow the average temperature of our planet's surface to increase by 4-6 degrees Celsius, we will see staggering changes to our environment, including rapidly rising sea levels, withering crops, diminishing water reserves, droughts, cyclones and floods. Allowing this to happen would be the failure of our species, and those who survive will inherit a deadly legacy. In this update to the 1997 international best seller Factor Four, Ernst von Weizsäcker again leads a team to present a compelling case for sector-wide advances that can deliver significant resource productivity improvements over the coming century. The purpose of this book is to inspire hope and then to inform meaningful action in the coming decades in response to the greatest challenge our species has ever faced: that of living in harmony with our planet and its other inhabitants.

Relevance: 20.00%

Abstract:

Secure multi-party computation (MPC) protocols enable a set of n mutually distrusting participants P_1, ..., P_n, each with their own private input x_i, to compute a function Y = F(x_1, ..., x_n), such that at the end of the protocol all participants learn the correct value of Y while secrecy of the private inputs is maintained. Classical results in unconditionally secure MPC indicate that, in the presence of an active adversary, every function can be computed if and only if the number of corrupted participants, t_a, is smaller than n/3. Relaxing the requirement of perfect secrecy and utilizing broadcast channels, one can improve this bound to t_a < n/2. All existing MPC protocols assume that uncorrupted participants are truly honest, i.e., they are not even curious to learn other participants' secret inputs. Based on this assumption, some MPC protocols are designed in such a way that, after elimination of all misbehaving participants, the remaining ones learn all information in the system. This is not consistent with maintaining privacy of the participants' inputs. Furthermore, an improvement of the classical results given by Fitzi, Hirt and Maurer indicates that, in addition to t_a actively corrupted participants, the adversary may simultaneously corrupt some participants passively. This is in contrast to the assumption that participants who are not corrupted by an active adversary are truly honest. This paper examines the privacy of MPC protocols and introduces the notion of an omnipresent adversary, which cannot be eliminated from the protocol. The omnipresent adversary can be passive, active or mixed. We assume that up to a minority of participants who are not corrupted by an active adversary can be corrupted passively, with the restriction that at any time the number of corrupted participants does not exceed a predetermined threshold.
We also show that the existence of a t-resilient protocol for a group of n participants implies the existence of a t'-private protocol for a group of n' participants. That is, the elimination of misbehaving participants from a t-resilient protocol leads to the decomposition of the protocol. Our adversary model stipulates that an MPC protocol never operates with a set of truly honest participants (which is a more realistic scenario). Therefore, the privacy of all participants who properly follow the protocol will be maintained. We present a novel disqualification protocol to avoid a loss of privacy of participants who properly follow the protocol.
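The information-theoretic idea underlying such protocols can be sketched with additive secret sharing over a prime field: each participant splits their input into random shares, and summing one share from every participant reveals only the total, never an individual input. This is a minimal illustration of the secrecy mechanism only; it includes none of the verification machinery that real MPC protocols need against active adversaries.

```python
# Minimal sketch of additive secret sharing over a prime field, the basic
# secrecy building block of unconditionally secure MPC. Tolerates no
# misbehaviour; real protocols add verifiable sharing and disqualification.
import random

P = 2**61 - 1  # a Mersenne prime modulus

def share(x, n):
    """Split x into n random shares that sum to x modulo P."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

inputs = [5, 11, 2]                       # private inputs of 3 participants
all_shares = [share(x, 3) for x in inputs]
# participant j locally sums the j-th share of every input...
partial = [sum(s[j] for s in all_shares) % P for j in range(3)]
# ...and the partial sums reconstruct only the total, not individual inputs
print(sum(partial) % P)  # 18
```

Any n-1 shares of a single input are uniformly random, so no proper subset of participants learns anything about another participant's x_i.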

Relevance: 20.00%

Abstract:

Australia’s building stock includes many older commercial buildings with numerous factors that impact energy performance and indoor environment quality. The built environment industry has generally focused heavily on improving physical building design elements for greater energy efficiency (such as retrofits and environmental upgrades); however, there are noticeable ‘upper limits’ to performance improvements in these areas. To achieve a step-change improvement in building performance, the authors propose that additional components need to be addressed in a whole-of-building approach, including the way building design elements are managed and the level of stakeholder engagement between owners, tenants and building managers. This paper focuses on the opportunities provided by this whole-of-building approach, presenting the findings of a research project undertaken through the Sustainable Built Environment National Research Centre (SBEnrc) in Australia. Researchers worked with a number of industry partners over two years to investigate issues facing stakeholders at base building and tenancy levels, and the barriers to improving building performance. Through a mixed-method, industry-led research approach, five ‘nodes’ were identified in whole-of-building performance evaluation, each with interlinking and overlapping complexities that can influence performance. The nodes cover building management, occupant experience, indoor environment quality, agreements and culture, and design elements. This paper outlines the development and testing of these nodes and their interactions, and the resultant multi-nodal tool, called the ‘Performance Nexus’ tool. The tool is intended to be of most benefit in evaluating opportunities for performance improvement in the vast stock of existing low-performing buildings.

Relevance: 20.00%

Abstract:

The contemporary default materials for multi-storey buildings, namely concrete and steel, are significant generators of carbon emissions, and the use of timber products provides a technically, economically and environmentally viable alternative. In particular, timber’s sustainability can drive increased use and the subsequent evolution of the Blue Economy as a new economic model. National research to date, however, indicates resistance to the uptake of timber technologies in Australia. To investigate this further, a preliminary study involving a convenience sample of 15 experts was conducted to identify the main barriers to the use of timber frames in multi-storey buildings. A closed-ended questionnaire survey involving 74 experienced construction industry participants was then undertaken to rate the relative importance of the barriers. The survey confirmed the most significant barriers to be a perceived increase in maintenance costs and fire risk, together with limited awareness of the emerging timber technologies available. It is expected that the results will benefit government and the timber industry, contributing to environmental improvement by informing strategies to increase the use of timber technologies in multi-storey buildings by countering perceived barriers in the Australian context.

Relevance: 20.00%

Abstract:

This study presents an acoustic emission (AE) based fault diagnosis for low speed bearings using a multi-class relevance vector machine (RVM). A low speed test rig was developed to simulate various defects at shaft speeds as low as 10 rpm under several loading conditions. The data were acquired using an AE sensor with the test bearing operating under constant loading (5 kN) and at speeds ranging from 20 to 80 rpm. This study is aimed at finding a reliable method/tool for fault diagnosis of low speed machines based on AE signals. In the present study, component analysis was performed to extract the bearing features and to reduce the dimensionality of the original feature data. The results show that the multi-class RVM offers a promising approach for fault diagnosis of low speed machines.
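The dimensionality-reduction step can be sketched as follows. The abstract says only "component analysis"; interpreting it as principal component analysis via SVD is an assumption here, and the AE feature vectors below are synthetic placeholders rather than real bearing data.

```python
# Sketch of PCA-style dimensionality reduction via SVD (an assumed reading
# of the paper's "component analysis"). Feature vectors are synthetic.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 16))      # 200 AE signals x 16 raw features
Xc = X - X.mean(axis=0)             # centre each feature column
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_reduced = Xc @ Vt[:3].T           # project onto the 3 strongest components
print(X_reduced.shape)              # (200, 3)
```

The reduced vectors would then be fed to the multi-class classifier (here an RVM) in place of the raw 16-dimensional features.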

Relevance: 20.00%

Abstract:

This study investigated the population genetics, demographic history and invasion pathway of the Russian wheat aphid (RWA) from its native range in Central Asia, the Middle East and Europe to South Africa and the Americas. We screened microsatellite markers, mitochondrial DNA and endosymbiont genes in 504 RWA clones from nineteen populations worldwide. Following pathway analyses of the microsatellite and endosymbiont data, we postulate that Turkey and Syria were the most likely sources of the invasions of Kenya and South Africa, respectively. Furthermore, we found that one clone transferred between South Africa and the Americas was most likely responsible for the New World invasion. Finally, endosymbiont DNA was found to be a high-resolution population genetic marker, extremely useful for studies of invasion over a relatively short evolutionary time frame. This study has provided valuable insights into the factors that may have facilitated the recent global invasion by this damaging pest.

Relevance: 20.00%

Abstract:

Ramp signalling is an access control for motorways in which a traffic signal is placed at on-ramps to regulate the rate of vehicles entering the motorway and thus preserve motorway capacity. In general, ramp signalling algorithms fall into two categories by their effective scope: local control and coordinated control. Coordinated ramp signalling strategies use measurements from the entire motorway network to operate individual ramp signals for optimal performance at the network level. This study proposes a multi-hierarchical strategy for coordinated ramp signalling. The strategy is structured in two layers. At the higher layer, with a longer update interval, coordination groups are assembled and disassembled based on the location of flow at high risk of breakdown. At the lower layer, with a shorter update interval, individual ramps are recruited to serve the coordination and are released based on the prevailing congestion level on the ramp. The strategy was modelled and applied to a micro-simulation platform (AIMSUN) of the northbound Pacific Motorway. The simulation results show that the proposed strategy mitigates congestion effectively.
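The two-layer structure described above can be sketched as two update rules running at different intervals. This is a toy illustration of the hierarchy only: the flow threshold, queue limit and upstream-grouping rule are invented for the example, and the paper's actual breakdown-risk model is not reproduced.

```python
# Toy sketch of a two-layer coordinated ramp signalling structure.
# All thresholds and traffic values are hypothetical.

HIGH_RISK_FLOW = 1800      # veh/h/lane at which coordination is triggered
MAX_RAMP_QUEUE = 40        # vehicles; a fuller ramp is released

def update_group(mainline_flows):
    """Higher layer (long interval): group the ramp at the first high-risk
    segment together with all ramps upstream of it."""
    risky = [i for i, f in enumerate(mainline_flows) if f >= HIGH_RISK_FLOW]
    return set(range(min(risky) + 1)) if risky else set()

def update_ramps(group, ramp_queues):
    """Lower layer (short interval): keep only ramps whose own queue is
    still acceptable; over-congested ramps are released."""
    return {i for i in group if ramp_queues[i] <= MAX_RAMP_QUEUE}

group = update_group([1500, 1900, 1600])           # bottleneck near ramp 1
print(sorted(group))                               # [0, 1]
print(sorted(update_ramps(group, [10, 55, 5])))    # ramp 1 released -> [0]
```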

Relevance: 20.00%

Abstract:

The structures of the compounds from the reaction of the drug dapsone [4-(4-aminophenylsulfonyl)aniline] with 3,5-dinitrosalicylic acid, the salt hydrate [4-(4-aminophenylsulfonyl)anilinium 2-carboxy-4,6-dinitrophenolate monohydrate] (1) and the 1:1 adduct with 5-nitroisophthalic acid [4-(4-aminophenylsulfonyl)aniline 5-nitrobenzene-1,3-dicarboxylic acid] (2), have been determined. Crystals of 1 are triclinic, space group P-1, with unit cell dimensions a = 8.2043(3), b = 11.4000(6), c = 11.8261(6) Å, α = 110.891(5), β = 91.927(3), γ = 98.590(4) deg. and Z = 4. Compound 2 is orthorhombic, space group Pbcn, with unit cell dimensions a = 20.2662(6), b = 12.7161(4), c = 15.9423(5) Å and Z = 8. In 1, intermolecular anilinium N-H…O and water O-H…O and O-H…N hydrogen-bonding interactions with sulfone, carboxyl, phenolate and nitro O-atom and aniline N-atom acceptors give a two-dimensional layered structure. With 2, the intermolecular interactions involve both aniline N-H…O and carboxylic acid O-H…O and O-H…N hydrogen bonds to sulfone, carboxyl, nitro and aniline acceptors, giving a three-dimensional network structure. In both structures, π-π aromatic ring associations are present.

Relevance: 20.00%

Abstract:

The ineffectiveness of current design processes has been well studied and has resulted in widespread calls for the evolution and development of new management processes. Even following the advent of BIM, we continue to move from one stage to another without necessarily having resolved all the issues. CAD design technology, if well handled, could have significantly raised the level of quality and efficiency of current processes, but in practice this was not fully realized. Technology alone, therefore, cannot solve all the problems, and the advent of BIM could result in a similar bottleneck. For a precise definition of the problem to be solved, we should start by understanding the main current bottlenecks that have yet to be overcome by either new technologies or management processes, and the human behaviour-related issues that affect the adoption and utilization of new technologies. The fragmented and dispersed nature of the AEC sector, and the huge number of small organizations that comprise it, are a major limiting factor. Several authors have addressed this issue, and more recently IDDS has been defined as the highest level of achievement. However, what has been written on IDDS describes an extremely idealized end state: a holistic, utopian proposition intended to set the research agenda for moving towards that state. Key to IDDS is the framing of a new management model that addresses the problems associated with four key aspects: technology, processes, policies and people. One of the primary areas to be further studied is the process of collaborative work and understanding, together with the development of proposals to overcome the many cultural barriers that currently exist and impede the advance of new management methods. The purpose of this paper is to define and delimit the problems to be solved so that it becomes possible to implement a new management model for a collaborative design process.

Relevance: 20.00%

Abstract:

In attempting to build intelligent litigation support tools, we have moved beyond first-generation, production-rule legal expert systems. Our work integrates rule-based and case-based reasoning with intelligent information retrieval. When using the case-based reasoning methodology, or in our case the specialisation of case-based retrieval, we need to be aware of how to retrieve relevant experience. Our research, in the legal domain, specifies an approach to the retrieval problem that relies heavily on an extended object-oriented/rule-based system architecture supplemented with causal background information. We use a distributed agent architecture to help support the reasoning process of lawyers. Our approach to integrating rule-based reasoning, case-based reasoning and case-based retrieval is contrasted with the CABARET and PROLEXS architectures, which rely on a centralised blackboard architecture. We discuss in detail how our various cooperating agents interact, and provide examples of the system at work. The IKBALS system uses a specialised induction algorithm to induce rules from cases. These rules are then used as indices during the case-based retrieval process. Because we aim to build legal support tools that can be modified to suit various domains, rather than single-purpose legal expert systems, we focus on the principles behind developing legal knowledge-based systems. The original domain chosen was the Accident Compensation Act 1989 (Victoria, Australia), which relates to the provision of benefits for employees injured at work. For various reasons, which are indicated in the paper, we changed our domain to the Credit Act 1984 (Victoria, Australia). This Act regulates the provision of loans by financial institutions. The rule-based part of our system, which provides advice on the Credit Act, has been commercially developed in conjunction with a legal firm.
We indicate how this work has led to the development of a methodology for constructing rule-based legal knowledge-based systems. We explain the process of integrating this existing commercial rule-based system with the case-based reasoning and retrieval architecture.
