286 results for correctness verification


Relevance:

10.00%

Publisher:

Abstract:

This paper presents a formal security analysis of the current Australian e-passport implementation using the model checking tools CASPER/CSP/FDR. We highlight security issues in the current implementation and identify new threats that arise when an e-passport system is integrated with an automated processing system such as SmartGate. The paper also provides a security analysis of the European Union (EU) proposal for Extended Access Control (EAC), which is intended to provide improved protection for the biometric information of the e-passport bearer. The current e-passport specification fails to provide an adequate list of security goals that could be used for security evaluation. We fill this gap by presenting a collection of security goals for the evaluation of e-passport protocols. Our analysis confirms previously identified security weaknesses and shows that both the Australian e-passport implementation and the EU proposal fail to address many security and privacy aspects that are paramount in implementing a secure border control mechanism. ACM Classification: C.2.2 (Communication/Networking and Information Technology – Network Protocols – Model Checking), D.2.4 (Software Engineering – Software/Program Verification – Formal Methods), D.4.6 (Operating Systems – Security and Privacy Protection – Authentication)
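
One weakness repeatedly identified in e-passport analyses is that the Basic Access Control (BAC) keys are derived deterministically from the printed machine readable zone (MRZ), giving them low effective entropy. The sketch below shows the ICAO 9303 BAC key derivation in Python as a minimal illustration (the final parity-bit adjustment of the 3DES keys is omitted); it is context for the kind of weakness a formal model can expose, not part of the paper's model.

```python
import hashlib

def derive_bac_keys(mrz_info: str) -> tuple[bytes, bytes]:
    """Derive BAC keys from the MRZ per ICAO 9303.

    mrz_info is the concatenation of document number, date of birth
    and date of expiry, each followed by its check digit.
    """
    seed = hashlib.sha1(mrz_info.encode("ascii")).digest()[:16]
    k_enc = hashlib.sha1(seed + b"\x00\x00\x00\x01").digest()[:16]
    k_mac = hashlib.sha1(seed + b"\x00\x00\x00\x02").digest()[:16]
    return k_enc, k_mac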

Relevance:

10.00%

Publisher:

Abstract:

Recently, botnets, networks of compromised computers, have been recognized as one of the biggest threats to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called a Command and Control (C&C) channel. There are three main types of C&C channel: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, web-based botnets have reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C&C channel and as temporary storage for stolen information. We then propose a novel approach to thwart this type of attack. Our method combines a unique identifier for the computer, an encryption algorithm with session keys, and CAPTCHA verification.
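
The abstract names three ingredients for the countermeasure but does not spell out how they fit together. The following is a speculative sketch, assuming the idea is to bind session keys to a machine identifier so that material exfiltrated by a bot cannot be replayed elsewhere; all names here are hypothetical, and the CAPTCHA step is represented only by a boolean stub.

```python
import hashlib
import hmac
import os
import uuid

def machine_id() -> bytes:
    # Hypothetical stable identifier; uuid.getnode() returns the
    # hardware (MAC) address on most platforms.
    return hashlib.sha256(str(uuid.getnode()).encode()).digest()

def issue_session_key(server_secret: bytes, client_id: bytes,
                      captcha_passed: bool):
    # The CAPTCHA gate ensures a human, not a bot, requested the key.
    if not captcha_passed:
        return None
    nonce = os.urandom(16)
    key = hmac.new(server_secret, client_id + nonce, hashlib.sha256).digest()
    return nonce, key  # the server stores the nonce to recompute the key
```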

Relevance:

10.00%

Publisher:

Abstract:

The increasing importance and use of infrastructure such as bridges demands more effective structural health monitoring (SHM) systems. SHM addresses damage detection through several methods, such as modal strain energy (MSE). Many of the available MSE methods either have been validated only for a limited range of structures, such as beams, or their performance is unsatisfactory; they therefore require further improvement and validation for different types of structures. In this study, an MSE method was mathematically improved to precisely quantify structural damage at an early stage of formation. Initially, the MSE equation was accurately formulated taking the damaged stiffness into account, and this formulation was then used to derive a more accurate sensitivity matrix. The improved method was verified on two plane structures, a steel truss bridge model and a concrete frame bridge model, representative of short- and medium-span bridges. Two damage scenarios, single and multiple damage, were considered for each structure. For each structure, both intact and damaged, modal analysis was performed using STRAND7. The effects of up to 5 per cent noise were also included. The simulated mode shapes and natural frequencies were then imported into MATLAB code. The results indicate that the improved method converges quickly, agrees well with the numerical assumptions within a few computational cycles, and also performs well in the presence of noise. The findings of this study can be numerically extended to 2D infrastructure, particularly short- and medium-span bridges, to detect and quantify damage more accurately. The method is capable of providing proper SHM that facilitates timely maintenance of bridges, minimising the possible loss of lives and property.
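
For context, the textbook quantity underlying MSE methods is the energy stored in element j under mode i, MSE_ij = phi_i^T K_j phi_i. The NumPy sketch below computes this quantity and a simple damaged-to-intact MSE ratio indicator; it is a generic illustration of the MSE idea, not the paper's improved sensitivity-matrix formulation.

```python
import numpy as np

def elemental_mse(phi: np.ndarray, k_elem: np.ndarray) -> float:
    """MSE_ij = phi_i^T K_j phi_i, with K_j expanded to global DOFs."""
    return float(phi @ k_elem @ phi)

def mse_damage_indicator(phis_intact, phis_damaged, k_elem) -> float:
    """Ratio of damaged to intact MSE for one element, summed over
    modes; values above 1 flag a likely damage location."""
    mse_d = sum(elemental_mse(p, k_elem) for p in phis_damaged)
    mse_u = sum(elemental_mse(p, k_elem) for p in phis_intact)
    return mse_d / mse_u
```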

Relevance:

10.00%

Publisher:

Abstract:

Amiton (O,O-diethyl-S-[2-(diethylamino)ethyl]phosphorothiolate), otherwise known as VG, is listed in schedule 2 of the Chemical Weapons Convention (CWC) and has a structure closely related to VX (O-ethyl-S-(2-diisopropylamino)ethylmethylphosphonothiolate). Fragmentation of protonated VG in the gas phase was performed using electrospray ionisation ion trap mass spectrometry (ESI-ITMS) and revealed several characteristic product ions. Quantum chemical calculations provide the most probable structures for these ions as well as the likely unimolecular mechanisms by which they are formed. The decomposition pathways predicted by computation are consistent with deuterium-labeling studies. The combination of experimental and theoretical data suggests that the fragmentation pathways of VG and analogous organophosphorus nerve agents, such as VX and Russian VX, are predictable and thus ESI tandem mass spectrometry is a powerful tool for the verification of unknown compounds listed in the CWC. Copyright (c) 2006 Commonwealth of Australia. Published by John Wiley & Sons, Ltd.

Relevance:

10.00%

Publisher:

Abstract:

This paper provides a detailed description of the current Australian e-passport implementation and formally verifies it using the model checking tools CASPER/CSP/FDR. We highlight security issues present in the current e-passport implementation and identify new threats that arise when an e-passport system is integrated with an automated processing system such as SmartGate. Because the current e-passport specification does not provide adequate security goals, we identify and describe a set of security goals for the evaluation of e-passport protocols in order to perform a rational security analysis. Our analysis confirms security issues that were previously identified informally and presents weaknesses that exist in the current e-passport implementation.
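
In the Casper/FDR approach, authentication goals are typically stated as agreement properties: whenever one agent commits to a session, the peer must previously have been running the protocol with matching data. FDR checks this over every trace of the CSP model; the toy checker below tests the property on a single explicit trace, just to make the shape of such a goal concrete (agent names and data are illustrative).

```python
def satisfies_agreement(trace) -> bool:
    """Events are (signal, agent, peer, data). Every Commit by an agent
    must be preceded by a matching Running event from its peer."""
    running = set()
    for signal, agent, peer, data in trace:
        if signal == "Running":
            running.add((agent, peer, data))
        elif signal == "Commit" and (peer, agent, data) not in running:
            return False
    return True

print(satisfies_agreement([
    ("Running", "Passport", "Reader", "k1"),
    ("Commit", "Reader", "Passport", "k1"),
]))  # True: the reader's commit is preceded by a matching run
```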

Relevance:

10.00%

Publisher:

Abstract:

We present efficient protocols for private set disjointness tests. We start with the intuition behind our protocols, which applies Sylvester matrices. Unfortunately, this simple construction is insecure, as it reveals information about the cardinality of the intersection; more specifically, it discloses a lower bound on it. Using Lagrange interpolation, we provide a protocol for the honest-but-curious case that reveals no additional information. Finally, we describe a protocol that is secure against malicious adversaries; it applies a verification test to detect misbehaving participants. Both protocols require O(1) rounds of communication. Our protocols are more efficient than previous protocols in terms of communication and computation overhead, and unlike previous protocols, whose security relies on computational assumptions, they provide information-theoretic security. To our knowledge, our protocols are the first to be designed without generic secure function evaluation. More importantly, they are the most efficient protocols for private disjointness tests in the malicious adversary case.
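
The underlying algebra can be seen without any cryptography: encode each set as the polynomial whose roots are its elements; the degree of the gcd of the two polynomials (equivalently, the rank deficiency of their Sylvester matrix) is exactly the intersection cardinality, which is why a naive construction leaks a bound on it. A small SymPy sketch of this non-private intuition, assuming distinct integer elements:

```python
from sympy import Poly, gcd, prod, symbols

x = symbols("x")

def set_poly(s):
    # Roots of the polynomial are exactly the set elements.
    return prod(x - e for e in s)

def intersection_size(a, b) -> int:
    # deg(gcd) = |a & b|: precisely the leakage the paper's
    # Lagrange-interpolation construction removes.
    return Poly(gcd(set_poly(a), set_poly(b)), x).degree()

print(intersection_size({1, 2, 3}, {3, 4}))  # 1, so not disjoint
```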

Relevance:

10.00%

Publisher:

Abstract:

Process models define allowed process execution scenarios. The models are usually depicted as directed graphs, with gateway nodes regulating the control flow routing logic and with edges specifying the execution order constraints between tasks. While arbitrarily structured control flow patterns in process models complicate model analysis, they also permit creativity and full expressiveness when capturing non-trivial process scenarios. This paper gives a classification of arbitrarily structured process models based on the hierarchical process model decomposition technique. We identify a structural class of models consisting of block structured patterns which, when combined, define complex execution scenarios spanning across the individual patterns. We show that complex behavior can be localized by examining structural relations of loops in hidden unstructured regions of control flow. The correctness of the behavior of process models within these regions can be validated in linear time. These observations allow us to suggest techniques for transforming hidden unstructured regions into block-structured ones.
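
As a toy illustration of what "block structured" means here, the checker below treats a linearised walk over a model's gateways like balanced parentheses: every split must be closed by a matching join, properly nested. Real process models are graphs and the paper works with hierarchical decomposition, so this only sketches the structural intuition.

```python
def is_block_structured(gateways) -> bool:
    """Toy check on a linearised gateway sequence: each split must be
    closed by a join of the same type, with proper nesting."""
    matching = {"XOR-join": "XOR-split", "AND-join": "AND-split"}
    stack = []
    for g in gateways:
        if g.endswith("-split"):
            stack.append(g)
        elif g in matching:
            if not stack or stack.pop() != matching[g]:
                return False
    return not stack

print(is_block_structured(
    ["AND-split", "XOR-split", "XOR-join", "AND-join"]))  # True
print(is_block_structured(
    ["AND-split", "XOR-split", "AND-join", "XOR-join"]))  # False: overlap
```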

Relevance:

10.00%

Publisher:

Abstract:

Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is analysing to what degree a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions are exponential to compute and yield only a Boolean result. In many cases, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
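
To make the profile relations concrete, here is a sketch that derives them from an event log rather than from the model structure (the article computes them structurally, and also covers causality, which is omitted here). Each pair of activities is classified from the weak order observed across traces:

```python
from itertools import combinations

def weak_order(traces):
    """Pairs (a, b) such that a occurs before b in at least one trace."""
    order = set()
    for t in traces:
        for i, a in enumerate(t):
            for b in t[i + 1:]:
                order.add((a, b))
    return order

def behavioural_profile(traces):
    """Classify each activity pair by order, exclusiveness or interleaving."""
    succ = weak_order(traces)
    tasks = {a for t in traces for a in t}
    profile = {}
    for a, b in combinations(sorted(tasks), 2):
        if (a, b) in succ and (b, a) not in succ:
            profile[(a, b)] = "strict order"
        elif (b, a) in succ and (a, b) not in succ:
            profile[(a, b)] = "reverse strict order"
        elif (a, b) in succ and (b, a) in succ:
            profile[(a, b)] = "interleaving"
        else:
            profile[(a, b)] = "exclusive"
    return profile

print(behavioural_profile([["a", "b", "c"], ["a", "c"]]))
```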

Relevance:

10.00%

Publisher:

Abstract:

Background Radiographic examinations of the ankle are important in the clinical management of ankle injuries in hospital emergency departments. National (Australian) Emergency Access Targets (NEAT) stipulate that 90 percent of presentations should leave the emergency department within 4 hours. For a radiological report to have clinical usefulness and relevance to clinical teams treating patients with ankle injuries in emergency departments, the report would need to be prepared and available to the clinical team within the NEAT 4 hour timeframe, before the patient has left the emergency department. However, little is known about the demand profile of ankle injuries requiring radiographic examination, or the time until radiological reports are available for this clinical group, in Australian public hospital emergency settings. Methods This study utilised a prospective cohort of consecutive cases of ankle examinations from patients (n=437) with suspected traumatic ankle injuries presenting to the emergency department of a tertiary hospital facility. Time stamps from the hospital Picture Archiving and Communication System were used to record the timing of three processing milestones for each patient's radiographic examination: the time of image acquisition, the time a provisional radiological report was made available for viewing by referring clinical teams, and the time of final verification of the radiological report. Results Radiological reports and all three time stamps were available for 431 (98.6%) cases and were included in analysis. The total time between image acquisition and final radiological report verification exceeded 4 hours for 404 (92.5%) cases. The peak demand for radiographic examination of ankles was on weekend days, and in the afternoon and evening. The majority of examinations were provisionally reported and verified during weekday daytime shift hours. Conclusions Provisional or final radiological reports were frequently not available within 4 hours of image acquisition among this sample. Effective and cost-efficient strategies to improve the support provided to referring clinical teams from medical imaging departments may enhance emergency care interventions for people presenting to emergency departments with ankle injuries, particularly those with imaging findings that may be challenging for junior clinical staff to interpret without a definitive radiological report.

Relevance:

10.00%

Publisher:

Abstract:

Integer ambiguity resolution is an indispensable procedure for all high precision GNSS applications. The correctness of the estimated integer ambiguities is the key to achieving highly reliable positioning, but the solution cannot be validated with classical hypothesis testing methods. Integer aperture estimation theory unifies all existing ambiguity validation tests and provides a new perspective from which to review existing methods, enabling a better understanding of the ambiguity validation problem. This contribution analyses two simple but efficient ambiguity validation tests, the ratio test and the difference test, from three aspects: acceptance region, probability basis and numerical results. The major contributions of this paper can be summarized as follows. (1) The ratio test acceptance region is an overlap of ellipsoids, while the difference test acceptance region is an overlap of half-spaces. (2) The probability basis of these two popular tests is analysed for the first time: the difference test is an approximation to optimal integer aperture estimation, while the ratio test follows an exponential relationship in probability. (3) The limitations of the two tests are identified for the first time: both may under-evaluate the failure risk if the model is not strong enough or the float ambiguities fall in a particular region. (4) Extensive numerical results are used to compare the performance of the two tests. The simulation results show that the ratio test outperforms the difference test in some models, while the difference test performs better in others. In particular, in the medium-baseline kinematic model the difference test outperforms the ratio test; this superiority is independent of frequency number, observation noise and satellite geometry, but depends on the success rate and the failure rate tolerance. A smaller failure rate leads to a larger performance discrepancy.
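
Both tests compare the quadratic forms of the best and second-best integer candidates in the metric of the float ambiguity covariance matrix. A minimal NumPy sketch of the two test statistics (the thresholds here are illustrative placeholders; in practice they are chosen against a failure rate tolerance):

```python
import numpy as np

def sq_norm(v: np.ndarray, Q_inv: np.ndarray) -> float:
    """Squared Mahalanobis norm ||v||^2_Q = v^T Q^{-1} v."""
    return float(v @ Q_inv @ v)

def ambiguity_tests(a_float, a_best, a_second, Q,
                    c_ratio=2.0, d_diff=10.0):
    """Ratio test:      ||x - a2||_Q^2 / ||x - a1||_Q^2 >= c
       Difference test: ||x - a2||_Q^2 - ||x - a1||_Q^2 >= d
    Returns (ratio_accepts, difference_accepts) for the best candidate."""
    Q_inv = np.linalg.inv(Q)
    q1 = sq_norm(a_float - a_best, Q_inv)
    q2 = sq_norm(a_float - a_second, Q_inv)
    return q2 / q1 >= c_ratio, q2 - q1 >= d_diff
```

The ratio of quadratic forms bounds an ellipsoidal region around each integer candidate, while the difference bounds a half-space, which matches the acceptance region characterisation in the abstract.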

Relevance:

10.00%

Publisher:

Abstract:

Background Early feeding practices lay the foundation for children’s eating habits and weight gain. Questionnaires are available to assess parental feeding, but overlapping and inconsistent items, subscales and terminology limit conceptual clarity and between-study comparisons. Our aim was to consolidate a range of existing items into a parsimonious and conceptually robust questionnaire for assessing feeding practices with very young children (<3 years). Methods Data were from 462 mothers and children (aged 21–27 months) in the NOURISH trial. Items from five questionnaires, plus two study-specific items, were submitted to a priori item selection, allocation and verification before the theoretically derived factors were tested using confirmatory factor analysis. Construct validity of the new factors was examined by correlating them with child eating behaviours and weight. Results Following expert review, 10 factors were specified. Of these, 9 factors (40 items) showed acceptable model fit and internal reliability (Cronbach’s α: 0.61–0.89). Four factors reflected non-responsive feeding practices: ‘Distrust in Appetite’, ‘Reward for Behaviour’, ‘Reward for Eating’, and ‘Persuasive Feeding’. Five factors reflected structure of the meal environment and limits: ‘Structured Meal Setting’, ‘Structured Meal Timing’, ‘Family Meal Setting’, ‘Overt Restriction’ and ‘Covert Restriction’. Feeding practices generally showed the expected pattern of associations with child eating behaviours, but none with weight. Conclusion The Feeding Practices and Structure Questionnaire (FPSQ) provides a new, reliable and valid measure of parental feeding practices, specifically maternal responsiveness to children’s hunger/satiety signals as facilitated by routine and structure in feeding. Further validation in more diverse samples is required.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we tackle the problem of finding an efficient signature verification scheme when the number of signatures is significantly large and the verifier is relatively weak. In particular, we tackle the problem of message authentication in many-to-one communication networks known as concast communication. The paper presents three signature screening algorithms for a variant of ElGamal-type digital signatures. The cost of these schemes is n applications of hash functions, 2n modular multiplications, and n modular additions, plus the verification of one digital signature, where n is the number of signatures. The paper also presents a solution to the open problem of finding a fast screening scheme for non-RSA digital signatures.
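
The paper's schemes target an ElGamal-type variant; as a self-contained illustration of the screening idea, here is a Schnorr-style analogue over a deliberately tiny group, in which n signatures are screened with one combined equation instead of n separate verifications. The usual screening caveat applies: the combined check certifies that the signer authenticated the batch as a whole, which is weaker than verifying each signature individually.

```python
import hashlib
import secrets

# Toy group: p = 23 is a safe prime (p = 2q + 1, q = 11); g = 2 has order q.
p, q, g = 23, 11, 2

def H(R: int, m: str) -> int:
    return int.from_bytes(hashlib.sha256(f"{R}|{m}".encode()).digest(), "big") % q

def keygen():
    sk = secrets.randbelow(q - 1) + 1
    return sk, pow(g, sk, p)            # (private key, public key y)

def sign(sk: int, m: str):
    k = secrets.randbelow(q - 1) + 1
    R = pow(g, k, p)
    s = (k + H(R, m) * sk) % q          # verifies: g^s == R * y^H(R,m) mod p
    return R, s

def screen_batch(y: int, sigs) -> bool:
    """Screen n signatures at once: g^(sum s_i) == (prod R_i) * y^(sum c_i)."""
    s_sum = c_sum = 0
    r_prod = 1
    for m, (R, s) in sigs:
        s_sum = (s_sum + s) % q
        c_sum = (c_sum + H(R, m)) % q
        r_prod = r_prod * R % p
    return pow(g, s_sum, p) == r_prod * pow(y, c_sum, p) % p

sk, y = keygen()
batch = [(m, sign(sk, m)) for m in ("alpha", "beta", "gamma")]
print(screen_batch(y, batch))  # True for an honestly signed batch
```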

Relevance:

10.00%

Publisher:

Abstract:

A secure protocol for electronic, sealed-bid, single-item auctions is presented. The protocol caters to both first and second price (Vickrey) auctions and provides full price flexibility. Both computational and communication costs are linear in the number of bidders, and only standard cryptographic primitives are used. The protocol strictly divides knowledge of the bidders' identities and their actual bids between, respectively, a registration authority and an auctioneer, who are assumed not to collude but may be separately corrupt. This assures strong bidder anonymity, though only weak bid privacy. The protocol is structured in two phases, each involving only off-line communication. Registration, which requires the use of the public key infrastructure, is simultaneous with hash-sealed bid commitment and generates a receipt for the bidder containing a pseudonym. This phase is followed by encrypted bid submission. Both phases involve the registration authority acting as a communication conduit, but the actual message size is quite small. It is argued that this structure guarantees non-repudiation by both the winner and the auctioneer. Second-price correctness is enforced either by observing the absence of registration of the claimed second-price bid or, where that bid is registered but lower than the actual second price, by cooperation of the second-price bidder, presumably motivated by self-interest. The use of the registration authority in other contexts is also considered, with a view to developing an architecture for efficient secure multiparty transactions.
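
The hash-sealed bid commitment at registration can be as simple as hashing the bid together with a random nonce: the nonce hides the bid until opening, and collision resistance prevents the bidder from changing it afterwards. A minimal sketch, with SHA-256 standing in for whatever primitive the protocol actually specifies:

```python
import hashlib
import secrets

def seal_bid(bid: int) -> tuple[bytes, bytes]:
    """Hash-seal a bid: commitment = SHA-256(nonce || bid)."""
    nonce = secrets.token_bytes(16)
    commitment = hashlib.sha256(nonce + bid.to_bytes(8, "big")).digest()
    return commitment, nonce

def open_bid(commitment: bytes, nonce: bytes, bid: int) -> bool:
    """Check an opened bid against the sealed commitment."""
    return hashlib.sha256(nonce + bid.to_bytes(8, "big")).digest() == commitment

c, r = seal_bid(4200)
print(open_bid(c, r, 4200), open_bid(c, r, 4300))  # True False
```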

Relevance:

10.00%

Publisher:

Abstract:

We determine the affine equivalence classes of the eight-variable degree-three homogeneous bent functions using a new algorithm. The algorithm applies to general bent functions and can systematically determine their automorphism groups. We also provide a partial verification of the enumeration of eight-variable degree-three homogeneous bent functions obtained by Meng et al.
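
Recall that a Boolean function on n (even) variables is bent iff every coefficient of its Walsh transform has magnitude 2^(n/2). The brute-force check below is feasible for small n, including n = 8 (2^8 inputs); it only tests bentness, not the affine classification the paper computes.

```python
from itertools import product

def walsh(f, n):
    """Walsh transform W_f(w) = sum_x (-1)^(f(x) + w.x) over GF(2)^n."""
    coeffs = []
    for w in product((0, 1), repeat=n):
        total = 0
        for x in product((0, 1), repeat=n):
            dot = sum(wi * xi for wi, xi in zip(w, x)) % 2
            total += (-1) ** ((f(x) + dot) % 2)
        coeffs.append(total)
    return coeffs

def is_bent(f, n) -> bool:
    """f is bent iff every Walsh coefficient has magnitude 2^(n/2)."""
    return all(abs(c) == 2 ** (n // 2) for c in walsh(f, n))

# The two-variable function f(x1, x2) = x1*x2 is the classic bent example.
print(is_bent(lambda x: (x[0] * x[1]) % 2, 2))  # True
```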

Relevance:

10.00%

Publisher:

Abstract:

A parallel authentication and public-key encryption scheme is introduced and exemplified by joint encryption and signing, which compares favorably with sequential Encrypt-then-Sign (EtS) or Sign-then-Encrypt (StE) schemes as far as both efficiency and security are concerned. A security model for signcryption, and thus joint encryption and signing, has recently been defined which considers possible attacks and security goals. Such a scheme is considered secure if the encryption part guarantees indistinguishability and the signature part prevents existential forgeries, for outsider but also insider adversaries. We propose two schemes of parallel signcryption that are efficient alternatives to Commit-then-Encrypt-and-Sign (CtE&S). Both are provably secure in the random oracle model. The first, called generic parallel encrypt-and-sign, is secure if the encryption scheme is semantically secure against chosen-ciphertext attacks and the signature scheme prevents existential forgeries against random-message attacks. The second, called optimal parallel encrypt-and-sign, applies random oracles similar to the OAEP technique in order to achieve security using encryption and signature components with very weak security requirements: encryption is expected to be one-way under chosen-plaintext attacks, while the signature needs to be secure only against universal forgeries under random-plaintext attack, which is actually the case for both plain RSA encryption and signatures under the usual RSA assumption. Both proposals are generic in the sense that any suitable encryption and signature schemes (i.e., ones that achieve the required security) can be used. Furthermore, they allow parallel encryption and signing as well as parallel decryption and verification. Properties of parallel encrypt-and-sign schemes are considered, and a new security standard for parallel signcryption is proposed.
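
The commit-based pattern these schemes build on is easy to sketch: commit to the message, then encrypt the decommitment and sign the commitment at the same time. Below, a hash construction stands in for a proper commitment scheme and encrypt/sign are parameters standing in for any schemes meeting the paper's requirements; this shows only the parallel wiring, under those stated assumptions.

```python
import hashlib
import secrets
from concurrent.futures import ThreadPoolExecutor

def commit(message: bytes):
    """(c, d) = commit(m): c hides the message, d opens the commitment."""
    r = secrets.token_bytes(16)
    d = r + message
    return hashlib.sha256(d).digest(), d

def parallel_encrypt_and_sign(message: bytes, encrypt, sign):
    """Encrypt the decommitment d and sign the commitment c concurrently."""
    c, d = commit(message)
    with ThreadPoolExecutor(max_workers=2) as pool:
        ciphertext = pool.submit(encrypt, d)
        signature = pool.submit(sign, c)
        return c, ciphertext.result(), signature.result()

# Toy stand-ins, for wiring only -- not secure primitives:
c, ct, sig = parallel_encrypt_and_sign(
    b"hello", encrypt=lambda d: d[::-1], sign=lambda c: c[:8])
```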