917 results for Two-wavelength HPLC fingerprinting, Cassia seeds, Chemometrics, Authentication


Abstract:

Introduction. Endoscopic anterior scoliosis correction has recently been employed as a less invasive and level-sparing approach compared with open surgical techniques. We have previously demonstrated a mean loss of rib hump correction of 1.4 degrees over the two-year post-operative period. The purpose of this study was to determine whether intra- or inter-vertebral rotational deformity during the post-operative period could account for the loss of rib hump correction. Materials and Methods. Ten consecutive patients diagnosed with adolescent idiopathic scoliosis were treated with an endoscopic anterior scoliosis correction. Following institutional ethics approval and patient consent, low-dose computed tomography scans of the instrumented segment were obtained at 6 and 24 months post-operatively. Three-dimensional multi-planar reconstruction software (OsiriX Imaging Software, Pixmeo, Switzerland) was used to create axial slices of each vertebral level, corrected in both the coronal and sagittal planes. Vertebral rotation was measured using Ho's method for every available superior and inferior endplate at 6 and 24 months. Positive changes in rotation indicate a reduction, and hence an improvement, in vertebral rotation. Intra-observer variability analysis was performed on a subgroup of images. Results. The mean change in rotation of vertebral endplates between 6 and 24 months post-operatively was -0.26˚ (range -3.5 to 4.9˚) within the fused segment and +1.26˚ (range -7.2 to 15.1˚) for the un-instrumented vertebrae above and below the fusion. The mean change in clinically measured rib hump for the 10 patients was -1.6˚ (range -3 to 0˚). The small change in rotation within the fused segment accounts for only 16.5% of the change in rib hump measured clinically, whereas the change in rotation of the un-instrumented vertebrae above and below the construct accounts for 78.8%. There was no clear association between rib hump recurrence and intra- or inter-vertebral rotation in individual patients. Intra-rater variability was ± 3˚. Conclusions. Intra- and inter-vertebral rotation continues post-operatively, both within the instrumented and the un-instrumented segments of the immature spine. The mean change in rotation of the un-instrumented vertebrae above and below the fusion was +1.26˚, suggesting that these vertebrae de-rotated slightly, and thus improved, after surgery. This may play a role in rib hump recurrence; however, the effect remains clinically insignificant.
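
The percentage attributions quoted in the Results follow from dividing each mean rotational change by the mean change in clinically measured rib hump. A quick check of that arithmetic using the rounded means reported above (small differences from the quoted 16.5% and 78.8% presumably reflect rounding of the means):

```latex
\[
\frac{0.26^{\circ}}{1.6^{\circ}} \approx 16\% \quad\text{(fused segment)},
\qquad
\frac{1.26^{\circ}}{1.6^{\circ}} \approx 79\% \quad\text{(un-instrumented levels)}.
\]
```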

Abstract:

Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step alternative of first encrypting the message to provide confidentiality and then, in a separate pass, generating a Message Authentication Code (MAC) to provide integrity protection. AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms. It analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal states of the cipher to generate a MAC. The third classification is based on whether the sequence that is used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state. This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model, namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
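
The direct and indirect accumulation styles that the two matrix-based models formalise can be illustrated with a deliberately small toy. The sketch below is not the thesis's constructions nor any of the named ciphers; it only contrasts XORing message bits straight into a register's state (direct injection) with using message bits to select which keystream bits enter a separate accumulation register (indirect injection), using an 8-bit LFSR as a stand-in keystream generator.

```python
# Toy illustration only: contrasting direct and indirect message accumulation
# for MAC generation, with an 8-bit LFSR standing in for the keystream generator.

def lfsr_step(state, taps=(0, 2, 3, 5)):
    """Clock a toy 8-bit Fibonacci LFSR once and return (new_state, output_bit)."""
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    out = state & 1
    return ((state >> 1) | (fb << 7)) & 0xFF, out

def mac_direct(message_bits, key_state):
    """Direct injection: each message bit is XORed straight into the register
    before it is clocked, so the message shapes the internal state itself."""
    state = key_state
    for m in message_bits:
        state ^= m               # message bit enters the internal state directly
        state, _ = lfsr_step(state)
    return state & 0xFF          # toy 8-bit tag read off the final state

def mac_indirect(message_bits, key_state):
    """Indirect injection: the message only controls whether the current
    keystream bit is accumulated into a separate accumulation register."""
    state, acc = key_state, 0
    for i, m in enumerate(message_bits):
        state, z = lfsr_step(state)
        if m:                    # message bit acts as a selector, not as state input
            acc ^= z << (i % 8)
    return acc & 0xFF

if __name__ == "__main__":
    msg = [1, 0, 1, 1, 0, 0, 1, 0]
    print(mac_direct(msg, key_state=0b1011_0110))
    print(mac_indirect(msg, key_state=0b1011_0110))
```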

Abstract:

Two grape varieties, a white grape and a red grape grown in the Campania region of Italy, were selected for a study of drying characteristics. Comparisons were made between treated and untreated grapes under a constant drying condition of 50 °C in a conventional drying system; this temperature was selected to represent farm drying conditions. Grapes were purchased from the same supplier at a local market to keep the size and properties of the samples consistent. An abrasive physical treatment was used as the pretreatment. Drying curves were constructed and the drying kinetics were modelled using several commonly available models. It was found that treated samples showed better drying characteristics than untreated samples. The objective of this study was to obtain drying kinetics that can be used to optimize grape drying operations.
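
As an illustration of the kind of thin-layer model commonly fitted to such drying curves, the sketch below fits the Page model, MR = exp(-k·t^n), to hypothetical moisture-ratio data; the data points, parameter values and use of SciPy are assumptions for demonstration only, not the study's measurements or its chosen model.

```python
# Illustrative sketch: fitting the Page thin-layer drying model to made-up data.
import numpy as np
from scipy.optimize import curve_fit

def page_model(t, k, n):
    """Page model for the moisture ratio MR(t) = exp(-k * t**n)."""
    return np.exp(-k * t**n)

# Hypothetical drying data: time in hours, moisture ratio MR = (M - Me) / (M0 - Me)
t_hours = np.array([0, 2, 4, 8, 12, 18, 24, 36, 48], dtype=float)
mr = np.array([1.00, 0.91, 0.83, 0.70, 0.59, 0.47, 0.38, 0.25, 0.17])

(k, n), _ = curve_fit(page_model, t_hours, mr, p0=(0.05, 1.0))
rmse = np.sqrt(np.mean((page_model(t_hours, k, n) - mr) ** 2))
print(f"Page model fit: k = {k:.4f} 1/h^n, n = {n:.3f}, RMSE = {rmse:.4f}")
```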

Abstract:

Bovine intestine samples were dried in an atmospheric-pressure heat pump fluidized bed dryer, equipped with a continuous monitoring system, at temperatures below and above the material's freezing point. The drying characteristics were investigated over the temperature range -10 to 25 °C and air velocities of 1.5 to 2.5 m/s. Some experiments were conducted as single-temperature drying experiments and others as two-stage drying experiments employing two temperatures. The effective diffusion coefficient of moisture transfer was determined by a Fickian method assuming uni-dimensional moisture movement, both for moisture removal by evaporation alone and for combined sublimation and evaporation. An Arrhenius-type equation was used to interpret, in terms of activation energy, the influence of the drying air parameters on the effective diffusivity calculated with the method of slopes, and the diffusivity was found to be sensitive to temperature. Correlations expressing the effective moisture diffusivity as a function of drying temperature are reported.
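
A sketch of how the method of slopes and an Arrhenius-type fit are typically combined is given below. The slab geometry, half-thickness L, time points and moisture ratios are hypothetical stand-ins, not the study's data; the first-term Fick solution for an infinite slab is assumed.

```python
# Illustrative sketch (made-up numbers): method of slopes for the effective
# moisture diffusivity, then an Arrhenius fit for its temperature dependence.
# Assumes the first-term Fick solution for an infinite slab of half-thickness L:
#   MR ~ (8/pi^2) * exp(-pi^2 * D_eff * t / (4 L^2))
import numpy as np

R = 8.314          # J/(mol K), universal gas constant
L = 2.5e-3         # m, assumed sample half-thickness (hypothetical)

def diffusivity_from_slope(time_s, moisture_ratio):
    """Slope of ln(MR) vs t equals -pi^2 * D_eff / (4 L^2); solve for D_eff."""
    slope, _ = np.polyfit(time_s, np.log(moisture_ratio), 1)
    return -slope * 4.0 * L**2 / np.pi**2

def arrhenius_fit(temps_K, d_eff):
    """ln(D_eff) = ln(D0) - Ea / (R T); fit a line in 1/T to recover D0 and Ea."""
    slope, intercept = np.polyfit(1.0 / np.array(temps_K), np.log(d_eff), 1)
    return np.exp(intercept), -slope * R          # D0 [m^2/s], Ea [J/mol]

if __name__ == "__main__":
    # Hypothetical falling-rate data at three drying temperatures
    t = np.array([0, 1800, 3600, 7200, 10800], dtype=float)          # s
    runs = {263.15: [1.0, 0.86, 0.74, 0.55, 0.41],                    # -10 C
            283.15: [1.0, 0.80, 0.64, 0.41, 0.26],                    #  10 C
            298.15: [1.0, 0.74, 0.55, 0.30, 0.17]}                    #  25 C
    d = {T: diffusivity_from_slope(t, np.array(mr)) for T, mr in runs.items()}
    D0, Ea = arrhenius_fit(list(d), list(d.values()))
    print({T: f"{De:.2e} m^2/s" for T, De in d.items()})
    print(f"D0 = {D0:.2e} m^2/s, Ea = {Ea/1000:.1f} kJ/mol")
```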

Abstract:

Availability has become a primary goal of information security and is as significant as the other goals, in particular confidentiality and integrity. Maintaining the availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables the secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavours. We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme that has this property together with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
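
For flavour, the sketch below shows the general idea behind modular-exponentiation puzzles with trapdoor-cheap verification, in the style of repeated-squaring (time-lock) puzzles: the solver must perform t sequential squarings, while the issuer, knowing the factorisation of the modulus, checks the answer with a single short exponentiation. This illustrates the genre only, not the puzzle scheme proposed in the thesis, and the parameters are toy-sized.

```python
# Toy repeated-squaring puzzle with trapdoor-assisted verification.
# NOT the thesis's scheme; illustrative parameters only.
import math, secrets

p, q = 10007, 10009                      # toy primes; real schemes use large moduli
n, phi = p * q, (p - 1) * (q - 1)

def new_puzzle(t):
    """Issuer picks a random challenge x coprime to n and a difficulty t."""
    while True:
        x = secrets.randbelow(n - 2) + 2
        if math.gcd(x, n) == 1:
            return x, t

def solve(x, t):
    """Solver has no trapdoor: t sequential modular squarings of x."""
    y = x
    for _ in range(t):
        y = (y * y) % n
    return y

def verify(x, t, y):
    """Issuer collapses the exponent 2^t modulo phi(n), so checking costs a
    single short modular exponentiation rather than t squarings."""
    return pow(x, pow(2, t, phi), n) == y

if __name__ == "__main__":
    x, t = new_puzzle(t=20000)
    print("solution accepted:", verify(x, t, solve(x, t)))
```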

Abstract:

Denial-of-service (DoS) attacks are a growing concern for networked services such as the Internet. In recent years, major Internet e-commerce and government sites have been disabled by various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to serve honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model in which to analyse them. We revisit a few key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle whose security holds in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed for analysing client puzzles, and this study leads us to introduce a better security model for analysing client puzzles. We demonstrate the utility of our new security definitions by introducing two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse its DoS-resilience properties. We also prove that the original security claim of JFK does not hold. We then apply an existing technique to reduce the server cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique that reduces the computation cost of the server significantly, and we employ this technique in the most important network protocol, TLS, to analyse the security of the resultant protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie–Hellman based key exchange protocol to reduce the Diffie–Hellman exponentiation cost of a party by one multiplication and one addition.
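
The textbook hash-based client puzzle (a Hashcash-style partial pre-image search) is sketched below purely to fix ideas: the server issues a fresh nonce, the client searches for a counter whose hash has d leading zero bits, and verification costs one hash. The stronger puzzles proposed in the thesis are defined within its security model and are not reproduced here.

```python
# Minimal Hashcash-style client puzzle: partial pre-image search on SHA-256.
import hashlib, itertools, os

def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def make_challenge() -> bytes:
    return os.urandom(16)                      # server-chosen fresh nonce

def solve(challenge: bytes, difficulty: int) -> int:
    for counter in itertools.count():
        digest = hashlib.sha256(challenge + counter.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return counter                     # expected ~2**difficulty hash trials

def verify(challenge: bytes, difficulty: int, counter: int) -> bool:
    digest = hashlib.sha256(challenge + counter.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty

if __name__ == "__main__":
    ch = make_challenge()
    sol = solve(ch, difficulty=16)             # a few tens of thousands of hashes
    print("counter", sol, "accepted:", verify(ch, 16, sol))
```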

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuited to traditional cryptographic hashing methods. Traditional hash functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant, and as a result the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are based not on cryptographic methods but on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features better suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded, and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
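
A minimal sketch of the quantization and binary-encoding stage discussed above, in which per-dimension thresholds are learnt from training data (here, simply the training medians) and hashes are compared by Hamming distance. The Gaussian feature data and the median-threshold rule are illustrative assumptions, not any particular published robust hashing algorithm.

```python
# Illustrative sketch: learnt-threshold quantization and binarization of
# (already randomised) real-valued features, compared by Hamming distance.
import numpy as np

def train_quantizer(train_features: np.ndarray) -> np.ndarray:
    """Learn one threshold per feature dimension (the training median)."""
    return np.median(train_features, axis=0)

def robust_hash(features: np.ndarray, thresholds: np.ndarray) -> np.ndarray:
    """1-bit quantisation of each feature dimension against its learnt threshold."""
    return (features >= thresholds).astype(np.uint8)

def hamming_distance(h1: np.ndarray, h2: np.ndarray) -> int:
    return int(np.count_nonzero(h1 != h2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.normal(size=(1000, 64))          # stand-in for extracted features
    thresholds = train_quantizer(train)

    original = rng.normal(size=64)
    perturbed = original + rng.normal(scale=0.05, size=64)   # minor distortion
    unrelated = rng.normal(size=64)

    h_orig = robust_hash(original, thresholds)
    print("distorted copy :", hamming_distance(h_orig, robust_hash(perturbed, thresholds)))
    print("unrelated input:", hamming_distance(h_orig, robust_hash(unrelated, thresholds)))
```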

Abstract:

This study presents a segmentation pipeline that fuses colour and depth information to automatically separate objects of interest in video sequences captured from a quadcopter. Many approaches assume that cameras are static with known position, a condition which cannot be preserved in most outdoor robotic applications. In this study, the authors compute depth information and camera positions from a monocular video sequence using structure from motion and use this information as an additional cue to colour for accurate segmentation. The authors model the problem similarly to standard segmentation routines as a Markov random field and perform the segmentation using graph cuts optimisation. Manual intervention is minimised and is only required to determine pixel seeds in the first frame which are then automatically reprojected into the remaining frames of the sequence. The authors also describe an automated method to adjust the relative weights for colour and depth according to their discriminative properties in each frame. Experimental results are presented for two video sequences captured using a quadcopter. The quality of the segmentation is compared to a ground truth and other state-of-the-art methods with consistently accurate results.
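
A simplified sketch of the adaptive cue-weighting idea is shown below: each cue's discriminability is estimated from how well it separates the foreground and background seed regions in a frame, and the per-pixel label costs of the two cues are blended accordingly. The histogram-overlap measure, the toy cost terms and the synthetic frame are assumptions for illustration; in the authors' pipeline the fused unary costs would be passed, together with a smoothness term, to a graph-cuts solver, which is omitted here.

```python
# Sketch of per-frame adaptive weighting of colour vs depth cues for segmentation.
import numpy as np

def cue_discriminability(values, fg_mask, bg_mask, bins=32):
    """1 minus the histogram overlap between fg and bg seed pixels: higher
    values mean the cue separates the seed regions better in this frame."""
    lo, hi = float(values.min()), float(values.max()) + 1e-9
    fg, _ = np.histogram(values[fg_mask], bins=bins, range=(lo, hi), density=True)
    bg, _ = np.histogram(values[bg_mask], bins=bins, range=(lo, hi), density=True)
    bin_width = (hi - lo) / bins
    return 1.0 - np.minimum(fg, bg).sum() * bin_width

def fuse_costs(colour_cost, depth_cost, colour, depth, fg_mask, bg_mask):
    """Blend per-pixel label costs of the two cues with discriminability weights."""
    dc = cue_discriminability(colour, fg_mask, bg_mask)
    dd = cue_discriminability(depth, fg_mask, bg_mask)
    w = dc / (dc + dd + 1e-9)              # relative weight of the colour cue
    return w * colour_cost + (1.0 - w) * depth_cost, w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    h, w = 120, 160
    # Synthetic frame: depth clearly separates the object (left) from background
    # (right), while the colour channel is uninformative noise.
    depth = np.where(np.arange(w) < 80, 1.0, 4.0) + rng.normal(0, 0.1, (h, w))
    colour = rng.uniform(0.0, 1.0, (h, w))
    fg_mask = np.zeros((h, w), dtype=bool); fg_mask[50:70, 10:40] = True
    bg_mask = np.zeros((h, w), dtype=bool); bg_mask[50:70, 110:150] = True
    # Toy per-pixel foreground costs for each cue (distance to the seed mean).
    colour_cost = np.abs(colour - colour[fg_mask].mean())
    depth_cost = np.abs(depth - depth[fg_mask].mean())
    fused, w_colour = fuse_costs(colour_cost, depth_cost, colour, depth, fg_mask, bg_mask)
    print(f"colour weight this frame: {w_colour:.2f} (depth dominates, as expected)")
```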

Abstract:

In most developing countries, the overall quality of labourers' livelihoods, the workplace environment and the implementation of labour rights do not progress at the same rate as industrial development. To address this situation, the ILO has introduced the concept of 'decent work' to assist regulators in articulating labour-related social policy goals. Against this backdrop, this article assesses the Bangladesh Labour Law 2006 by reference to the four social principles developed by the ILO for ensuring 'decent work'. It explains the impact of the absence of these principles in this Law on labour administration in the ready-made garment and ship-breaking industries. It finds that an appropriate legislative framework needs to be based on the principles of 'decent work' to establish a solid platform for sound labour regulation in Bangladesh.

Abstract:

Yates et al (1996) provided a review of the literature on educational approaches to improving the psychosocial care of terminally ill patients and their families and suggested that there was an urgent need for innovation in this area. A programme of professional development currently being offered to 181 palliative care nurses in Queensland, Australia, was also described. This paper presents research in progress evaluating this programme, which involves the use of a quasi-experimental pre-test/post-test design. It also includes process and outcome measures to assess the programme's effectiveness in improving participants' ability to provide psychosocial care to patients and families. Research examining the effectiveness of various educational programmes on care of the dying has offered equivocal results (Yates et al 1996). Degner and Gow (1988a) noted that the inconsistencies found in research into death education result from inadequate study designs, variations in the conceptualisation and measurement of programme outcomes, and flaws in data analysis. Such studies have often lacked a theoretical basis, few have employed well-controlled experimental designs, and the programme outcomes have generally been limited to participants' 'death anxiety' or other death attitudes, which have been variously defined and measured. Whilst Degner and Gow (1988b) reported that undergraduate nursing students who participated in a care-of-the-dying educational programme demonstrated more 'approach caring' behaviours than a control group, the impact of education programmes on patient care has rarely been examined. Failure to link education to nursing practice and subsequent clinical outcomes has, however, been seen as a major limitation of nursing knowledge in this area (Degner et al 1991). This paper describes an approach to researching the effectiveness of professional development programmes for palliative care nurses.

Abstract:

This editorial on health and guardianship law provides an overview of the causation issues that precluded recovery in two medical negligence claims, Wallace v Kam [2013] HCA 19 and Waller v James [2013] NSWSC 497.

Abstract:

Two recent decisions of the Supreme Court of New South Wales in the context of obstetric management have highlighted, firstly, the importance of keeping legible, accurate and detailed medical records and, secondly, the challenges faced by those seeking to establish causation, particularly where epidemiological evidence is relied upon...

Abstract:

In a classification problem we typically face two challenging issues: the diverse characteristics of negative documents, and the fact that many negative documents are close to positive documents. It is therefore hard for a single classifier to clearly classify incoming documents into classes. This paper proposes a novel gradual problem-solving approach that creates a two-stage classifier. The first stage identifies reliable negatives (negative documents with weak positive characteristics); it concentrates on minimizing the number of false negative documents (it is recall-oriented). We use Rocchio, an existing recall-based classifier, for this stage. The second stage is a precision-oriented “fine tuning” that concentrates on minimizing the number of false positive documents by applying pattern (statistical phrase) mining techniques. In this stage, pattern-based scoring is followed by threshold setting (thresholding). Experiments show that our statistical-phrase-based two-stage classifier is promising.
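
A toy sketch of the two-stage idea follows: stage one is a Rocchio-style centroid test tuned for recall (it only discards documents that look like reliable negatives), and stage two re-scores the survivors with a simple phrase-pattern score against a threshold. The documents, phrases, weights and threshold are made up for illustration and do not reproduce the paper's system.

```python
# Toy two-stage text classifier: recall-oriented centroid filter, then
# precision-oriented phrase-pattern scoring with a threshold.
from collections import Counter
import math

def vec(doc):                     # bag-of-words vector
    return Counter(doc.lower().split())

def cos(a, b):
    num = sum(a[t] * b.get(t, 0) for t in a)
    return num / (math.sqrt(sum(v * v for v in a.values())) *
                  math.sqrt(sum(v * v for v in b.values())) + 1e-9)

def centroid(docs):
    c = Counter()
    for d in docs:
        c.update(vec(d))
    return Counter({t: v / len(docs) for t, v in c.items()})

def stage1_pass(doc, pos_centroid, neg_centroid):
    """Recall-oriented: keep the document unless it is clearly a reliable negative."""
    return cos(vec(doc), pos_centroid) >= cos(vec(doc), neg_centroid)

def stage2_score(doc, patterns):
    """Precision-oriented: score by weighted occurrence of mined phrase patterns."""
    text = doc.lower()
    return sum(w for phrase, w in patterns if phrase in text)

def classify(doc, pos_centroid, neg_centroid, patterns, threshold):
    return stage1_pass(doc, pos_centroid, neg_centroid) and \
           stage2_score(doc, patterns) >= threshold

if __name__ == "__main__":
    pos_docs = ["machine learning for text mining", "text classification with patterns"]
    neg_docs = ["football match results today", "stock market closes higher"]
    patterns = [("text mining", 2.0), ("classification", 1.5), ("pattern", 1.0)]
    cp, cn = centroid(pos_docs), centroid(neg_docs)
    print(classify("pattern mining for text classification", cp, cn, patterns, threshold=1.5))
    print(classify("today's football results", cp, cn, patterns, threshold=1.5))
```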

Abstract:

The Transport Layer Security (TLS) protocol is the most widely used security protocol on the Internet. It supports negotiation of a wide variety of cryptographic primitives through different cipher suites, various modes of client authentication, and additional features such as renegotiation. Despite its widespread use, only recently has the full TLS protocol been proven secure, and even then only for the core cryptographic protocol with no additional features. These additional features have been the cause of several practical attacks on TLS. In 2009, Ray and Dispensa demonstrated how TLS renegotiation allows an attacker to splice together its own session with that of a victim, resulting in a man-in-the-middle attack on TLS-reliant applications such as HTTP. TLS was subsequently patched with two defence mechanisms for protection against this attack. We present the first formal treatment of renegotiation in secure channel establishment protocols. We add optional renegotiation to the authenticated and confidential channel establishment model of Jager et al., an adaptation of the Bellare–Rogaway authenticated key exchange model. We describe the attack of Ray and Dispensa on TLS within our model. We show generically that the proposed fixes for TLS offer good protection against renegotiation attacks, and give a simple new countermeasure that provides renegotiation security for TLS even in the face of stronger adversaries.
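
The channel-binding principle behind one of the deployed countermeasures (the RFC 5746 renegotiation_info extension, in which a renegotiating handshake must carry the verify_data of the handshake it renews) can be illustrated schematically. The sketch below is a toy stand-in, not TLS itself and not the paper's security model; the Finished computation is replaced by a simple HMAC.

```python
# Toy illustration of binding a renegotiation to the previous handshake's
# verify_data, so a handshake spliced in by a man-in-the-middle is rejected.
import hmac, hashlib, os

def finished_verify_data(session_key: bytes, transcript: bytes) -> bytes:
    """Stand-in for the TLS Finished computation over the handshake transcript."""
    return hmac.new(session_key, b"finished" + transcript, hashlib.sha256).digest()[:12]

class Endpoint:
    def __init__(self):
        self.prev_verify_data = b""        # empty until a first handshake completes

    def complete_handshake(self, session_key: bytes, transcript: bytes,
                           claimed_previous: bytes) -> bool:
        # A (re)negotiation is only accepted if the peer proves knowledge of the
        # verify_data of the handshake this connection previously completed.
        if claimed_previous != self.prev_verify_data:
            return False
        self.prev_verify_data = finished_verify_data(session_key, transcript)
        return True

if __name__ == "__main__":
    server = Endpoint()
    # Initial handshake on a fresh connection: nothing to bind to yet.
    assert server.complete_handshake(os.urandom(32), b"client-hello-1", b"")
    honest_binding = server.prev_verify_data

    # Honest renegotiation: the client echoes the first handshake's verify_data.
    print("honest renegotiation:",
          server.complete_handshake(os.urandom(32), b"client-hello-2", honest_binding))
    # Spliced handshake from an attacker who never saw the first handshake.
    print("spliced handshake  :",
          server.complete_handshake(os.urandom(32), b"client-hello-x", b""))
```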