929 results for Oracle bones
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument between empiricists and deductionists is that, just as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
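As a purely illustrative aside (not from the paper): estimating accuracy in this literature is commonly summarised by the distribution of percentage errors of estimates against accepted tenders - bias as the mean error and consistency as its spread. A minimal sketch, with hypothetical figures:

```python
# Illustrative sketch (not from the paper): summarising estimating accuracy
# as bias (mean percentage error) and consistency (its standard deviation).
# The sample figures are hypothetical.
from statistics import mean, stdev

estimates = [1020, 980, 1150, 870, 1300]   # hypothetical pre-tender estimates
actuals   = [1000, 1000, 1100, 900, 1200]  # hypothetical accepted bids

pct_errors = [100 * (e - a) / a for e, a in zip(estimates, actuals)]
print(f"bias (mean % error):         {mean(pct_errors):+.1f}%")
print(f"consistency (sd of % error): {stdev(pct_errors):.1f}%")
```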
Abstract:
This dissertation analyses how physical objects are translated into digital artworks using techniques which can lead to ‘imperfections’ in the resulting digital artwork that are typically removed to arrive at a ‘perfect’ final representation. The dissertation discusses the adaptation of existing techniques into an artistic workflow that acknowledges and incorporates the imperfections of translation into the final pieces. It presents an exploration of the relationship between physical and digital artefacts and the processes used to move between the two. The work explores the 'craft' of digital sculpting and the technology used in producing what the artist terms ‘a naturally imperfect form’, incorporating knowledge of traditional sculpture, an understanding of anatomy and an interest in the study of bones (osteology). The outcomes of the research are presented as a series of digital sculptural works, exhibited as a collection of curiosities in multiple mediums, including interactive game spaces, augmented reality (AR), rapid prototype prints (RP) and video displays.
Abstract:
Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one, and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavors. We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme having this property together with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
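To make the flavour of such puzzles concrete, here is a minimal illustrative sketch (not the thesis's construction) of a repeated-squaring puzzle in the style of Rivest-Shamir-Wagner time-lock puzzles: the solver must perform t sequential squarings, while a verifier holding the factorisation of n checks the solution with a single short exponentiation.

```python
# Illustrative repeated-squaring puzzle (not the thesis's scheme).
# Toy parameters only; real schemes use large random primes.
p, q = 1009, 1013                # toy primes, kept secret by the verifier
n = p * q
phi = (p - 1) * (q - 1)

def solve(x, t):
    """Solver: t sequential modular squarings -> x^(2^t) mod n."""
    y = x % n
    for _ in range(t):
        y = y * y % n
    return y

def verify(x, t, y):
    """Verifier with trapdoor phi(n): one short exponentiation.
    Valid when gcd(x, n) == 1."""
    e = pow(2, t, phi)           # reduce the exponent 2^t modulo phi(n)
    return pow(x, e, n) == y

t, x = 100_000, 5
y = solve(x, t)
assert verify(x, t, y)
```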
Abstract:
Denial-of-service (DoS) attacks are a growing concern for networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled due to various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model to analyse client puzzles. We revisit a few key establishment protocols to analyse their DoS-resilient properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle whose security holds in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed to analyse client puzzles, and this study leads us to introduce a better security model for analysing client puzzles. We demonstrate the utility of our new security definitions by presenting two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilient properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse DoS-resilient properties. We also prove that the original security claim of JFK does not hold. We then apply an existing technique to reduce the server cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which reduces the computation cost of the server significantly, and employ the technique in the most important network protocol, TLS, to analyse the security of the resultant protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman-based key exchange protocol to reduce the Diffie-Hellman exponential cost of a party by one multiplication and one addition.
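For intuition, a generic hash-based client puzzle (illustrative of the idea, not the thesis's constructions) works as follows: the server issues a fresh nonce and difficulty k; the client searches for a value whose hash with the nonce has k leading zero bits; the server verifies with a single hash.

```python
# Generic hash-based client puzzle (proof of work); illustrative only.
# Verification costs the server one hash; solving costs the client
# roughly 2^k hashes on average.
import hashlib, os

def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def solve(nonce: bytes, k: int) -> int:
    counter = 0
    while True:
        d = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if leading_zero_bits(d) >= k:
            return counter
        counter += 1

def verify(nonce: bytes, k: int, counter: int) -> bool:
    d = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
    return leading_zero_bits(d) >= k

nonce = os.urandom(16)               # fresh per-connection challenge
solution = solve(nonce, k=16)        # client-side work
assert verify(nonce, 16, solution)   # cheap server-side check
```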
Abstract:
This paper presents a comprehensive formal security framework for key derivation functions (KDFs). The major security goal for a KDF is to produce cryptographic keys from a private seed value such that the derived cryptographic keys are indistinguishable from random binary strings. We form a framework of five security models for KDFs. This consists of four security models that we propose: Known Public Inputs Attack (KPM, KPS), Adaptive Chosen Context Information Attack (CCM) and Adaptive Chosen Public Inputs Attack (CPM); and another security model, previously defined by Krawczyk [6], which we refer to as Adaptive Chosen Context Information Attack (CCS). These security models are formulated as indistinguishability games. In addition, we prove the relationships between these five security models and analyse KDFs using the framework (in the random oracle model).
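As a concrete illustration of the syntax being modelled (Krawczyk's extract-then-expand HKDF, per RFC 5869; shown for orientation, not the paper's proofs): a KDF maps a private seed, a public salt, and public context information to keys that should look random.

```python
# Illustrative extract-then-expand KDF (HKDF, RFC 5869), showing the inputs
# the security models speak about: a private seed (source key material),
# a public salt, and public context information.
import hmac, hashlib

def hkdf_extract(salt: bytes, seed: bytes) -> bytes:
    return hmac.new(salt or b"\x00" * 32, seed, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, context: bytes, length: int) -> bytes:
    out, block, i = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + context + bytes([i]), hashlib.sha256).digest()
        out += block
        i += 1
    return out[:length]

prk = hkdf_extract(salt=b"public salt", seed=b"private seed value")
key = hkdf_expand(prk, context=b"encryption key v1", length=32)
```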
Abstract:
This thesis presents a study using mechanical testing techniques combined with advanced computational methods to examine the mechanics of bone. It contributes novel observations and analysis of how bones fail at the microscopic level, which will be valuable in furthering our understanding and the treatment of bone damage in health and disease, including osteoporosis.
Abstract:
Injured bone initiates the healing process by forming a blood clot at the damaged site. However, in severe damage, synthetic bone implants are used to provide structural integrity and restore the healing process. The implant unavoidably comes into direct contact with whole blood, leading to blood clot formation on its surface. Despite this, most research in bone tissue engineering virtually ignores the important role of a blood clot in supporting healing. The surface chemistry of a biomaterial is a crucial property in mediating blood-biomaterial interactions, and hence the formation of the resultant blood clot. Surfaces presenting mixtures of carboxyl (–COOH) and methyl (–CH3) functional groups have been shown to enhance platelet response and coagulation activation, leading to the formation of fibrin fibres. In addition, it has been shown that varying the compositions of these functional groups and the length of the alkyl groups further modulates the immune complement response. In this study, we hypothesised that a biomaterial surface with mixtures of –COOH and –CH3 (methyl), –CH2CH3 (ethyl) or –(CH2)3CH3 (butyl) groups at different ratios would modulate blood coagulation and complement activation, and eventually tailor the structural and functional properties of the blood clot formed on the surface, which subsequently impacts new bone formation. Firstly, we synthesised a series of materials composed of acrylic acid (AA) and methyl (MMA), ethyl (EMA) or butyl methacrylates (BMA) at different ratios, and coated them on the inner surfaces of incubation vials. Our surface analysis showed that the amount of –COOH groups on the surface coatings was lower than the ratios of AA prepared in the materials, even though the surface content of –COOH groups increased with increasing AA ratios. The results indicated that surface hydrophobicity increased with increasing alkyl chain length: –CH3 > –CH2CH3 > –(CH2)3CH3, and decreased with increasing –COOH groups. No significant differences in surface hydrophobicity were found between surfaces with –CH3 and –CH2CH3 groups in the presence of –COOH groups. The material coating was as smooth as uncoated glass and without any major flaws. The average roughness of the material-coated surface (3.99 ± 0.54 nm) was slightly higher than that of the uncoated glass surface (2.22 ± 0.29 nm). However, no significant differences in surface average roughness were found among surfaces with the same functionalities at different –COOH ratios, nor among surfaces with different alkyl groups but the same –COOH ratios. These results suggested that the surface functional groups and their compositions had a combined effect on modulating surface hydrophobicity but not surface roughness. The second part of our study was to investigate the effect of surface functional groups and their compositions on blood cascade activation and the structural properties of the formed clots. It was found that surfaces with –COOH/–(CH2)3CH3 induced faster coagulation activation than those with –COOH/–CH3 and –COOH/–CH2CH3, regardless of the –COOH ratios. An increase in –COOH ratios on –COOH/–CH3 and –COOH/–CH2CH3 surfaces decreased the rate of activation. Moreover, all material-coated surfaces markedly reduced complement activation compared to uncoated glass surfaces, and the pattern of complement activation was entirely similar to that of surface-induced coagulation, suggesting there is an interaction between the two cascades.
The clots formed on material-coated surfaces had thicker fibrin with a tighter network at the exterior when compared to uncoated glass surfaces. Compared to the clot exteriors, thicker fibrins with a looser network were found in the clot interiors. Coated surfaces resulted in more rigid clots with significantly slower fibrinolysis after 1 h of lysis when compared to uncoated glass surfaces. Significant differences in fibrinolysis after 1 h of lysis among clots on material-coated surfaces correlated well with the differences in fibrin thickness and density at the clot exterior. In addition, more growth factors were released during clot formation than during clot lysis. For an intact clot, there was a correlation between the amount of PDGF-AB released and fibrin density. The highest amount of PDGF-AB was released from clots formed on surfaces with 40% –COOH/60% –CH3 (i.e. 65MMA). During clot lysis, the release of PDGF-AB also correlated with the fibrinolytic rate, while the release of TGF-β1 was influenced by the fibrin thickness. This suggested that different clot structures led to different release profiles of growth factors during the intact and degrading stages of the clot. We further validated whether the clots formed on material coatings provide a microenvironment for improved bone healing by using a rabbit femoral defect model. In this pilot study, the implantation of clots formed on 65MMA coatings significantly increased new bone formation with enhanced chondrogenesis, osteoblast activity and vascularisation, but decreased inflammatory macrophage numbers at the defects after 4 weeks when compared to commercial ChronOS™ β-TCP granule bone grafts. Empty defects were observed when blood clot formation was inhibited. In summary, our study demonstrated that surface functional groups and their relative ratios on material coatings synergistically modulate the activation of blood cascades and the resultant fibrin architecture, rigidity, susceptibility to fibrinolysis and growth factor release of the formed clots, which ultimately alter the healing microenvironment of injured bones.
Abstract:
Runt-related transcription factor 2 (RUNX2) is a key regulator of osteoblast differentiation. Several variations within RUNX2 have been found to be associated with significant changes in bone mineral density (BMD), a major risk factor for fracture. In this study we report that an 18-bp deletion within the polyalanine tract (17A>11A) of RUNX2 is significantly associated with fracture. Carriers of the 11A allele were found to be nearly twice as likely to have sustained fracture. Within the fracture category, there was a significant tendency for 11A carriers to present with fractures of bones of intramembranous origin compared to bones of endochondral origin (p=0.005). In a population of random subjects, the 11A allele was associated with decreased levels of serum collagen cross-links (CTx, p=0.01), suggesting decreased bone turnover. The transactivation function of the 11A allele was quantitatively decreased. Interestingly, we found no effect of the 11A allele on BMD at multiple skeletal sites, although these were not the sites where a relationship with fracture was most evident. These findings suggest that the 11A allele is a biologically relevant polymorphism that influences serum CTx and confers enhanced fracture risk in a site-selective manner related to intramembranous bone ossification.
Abstract:
This project’s aim was to create new experimental models in small animals for the investigation of infections related to bone fracture fixation implants. Animal models are essential in orthopaedic trauma research, and this study evaluated new implants and surgical techniques designed to improve standardisation in these experiments, and ultimately to minimise the number of animals needed in future work. This study developed and assessed procedures using plates and interlocking nails to stabilise fractures in rabbit thigh bones. Fracture healing was examined with mechanical testing and histology. The results of this work contribute to improvements in future small animal infection experiments.
Abstract:
To understand the survival status of cancer patients and its influencing factors, an analysis was undertaken using data on 6450 cancer patients living in Linqu County, Shandong, diagnosed between 1993 and 1999. Survival rates were calculated with the life-table method using SAS 9.0 software. Overall 1- to 5-year survival rates for all patients were 53.16%, 28.65%, 21.57%, 18.36% and 17.87%, respectively. Cancers with a 5-year survival rate over 25% included ovarian, breast, uterine, stomach and colorectal cancers. Cancers with a 5-year survival rate lower than 10% included liver, cervical and lung cancers and malignant bone tumours. Survival rates differed significantly across gender, age of onset, economic status, year of diagnosis and evidence of diagnosis. Patients' economic status, age at diagnosis and year of diagnosis appear to have strong effects on survival.
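For readers unfamiliar with the life-table method mentioned above, a minimal sketch of the actuarial estimator (with hypothetical interval counts, not the study's data):

```python
# Illustrative actuarial life-table survival estimate (hypothetical counts,
# not the study's data). Subjects censored within an interval are assumed
# to be at risk for half of it on average.
intervals = [  # (deaths, censored) per year of follow-up, hypothetical
    (2800, 400), (1400, 300), (500, 250), (200, 220), (50, 180),
]
at_risk = 6450
survival = 1.0
for year, (deaths, censored) in enumerate(intervals, start=1):
    effective = at_risk - censored / 2      # actuarial adjustment
    survival *= 1 - deaths / effective      # cumulative survival
    print(f"{year}-year survival: {survival:.2%}")
    at_risk -= deaths + censored
```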
Abstract:
The notion of plaintext awareness (PA) has many applications in public key cryptography: it offers unique, stand-alone security guarantees for public key encryption schemes, has been used as a sufficient condition for proving indistinguishability against adaptive chosen-ciphertext attacks (IND-CCA), and can be used to construct privacy-preserving protocols such as deniable authentication. Unlike many other security notions, plaintext awareness is very fragile when it comes to differences between the random oracle and standard models; for example, many implications involving PA in the random oracle model are not valid in the standard model and vice versa. Similarly, strategies for proving PA of schemes in one model cannot be adapted to the other model. Existing research addresses PA in detail only in the public key setting. This paper gives the first formal exploration of plaintext awareness in the identity-based setting and, as initial work, proceeds in the random oracle model. The focus lies mainly on identity-based key encapsulation mechanisms (IB-KEMs), for which the paper presents the first definitions of plaintext awareness, highlights the role of PA in proof strategies of IND-CCA security, and explores relationships between PA and other security properties. On the practical side, our work offers the first, highly efficient, general approach for building IB-KEMs that are simultaneously plaintext-aware and IND-CCA-secure. Our construction is inspired by the Fujisaki-Okamoto (FO) transform, but demands weaker and more natural properties of its building blocks. This result comes from a new look at the notion of γ-uniformity that was inherent in the original FO transform. We show that for IB-KEMs (and PK-KEMs), this assumption can be replaced with a weaker computational notion, which is in fact implied by one-wayness. Finally, we give the first concrete IB-KEM scheme that is PA and IND-CCA-secure by applying our construction to a popular IB-KEM and optimizing it for better performance.
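For a flavour of the Fujisaki-Okamoto idea underlying such constructions (a minimal public-key KEM sketch over an assumed toy cipher, not the paper's IB-KEM): encrypt a random message with coins derived by hashing the message, and re-encrypt on decapsulation to check ciphertext well-formedness.

```python
# Minimal FO-style KEM transform sketch (illustrative only). `toy_enc` and
# `toy_dec` stand in for any PKE taking explicit random coins; here a toy
# XOR cipher with pk == sk -- NOT secure, purely to show the transform.
import hashlib, os

def H(m: bytes) -> bytes:   # random oracle deriving encryption coins
    return hashlib.sha256(b"coins" + m).digest()

def G(m: bytes) -> bytes:   # random oracle deriving the session key
    return hashlib.sha256(b"key" + m).digest()

def toy_enc(pk: bytes, m: bytes, coins: bytes) -> bytes:
    pad = hashlib.sha256(pk + coins).digest()
    return coins + bytes(a ^ b for a, b in zip(m, pad))

def toy_dec(sk: bytes, c: bytes) -> bytes:
    coins, body = c[:32], c[32:]
    pad = hashlib.sha256(sk + coins).digest()   # toy: pk == sk here
    return bytes(a ^ b for a, b in zip(body, pad))

def encaps(pk: bytes):
    m = os.urandom(32)
    c = toy_enc(pk, m, H(m))            # derandomised encryption
    return c, G(m)                      # ciphertext and session key

def decaps(sk: bytes, pk: bytes, c: bytes):
    m = toy_dec(sk, c)
    if toy_enc(pk, m, H(m)) != c:       # re-encryption check
        return None                     # reject malformed ciphertexts
    return G(m)

pk = sk = os.urandom(32)                # toy symmetric "keypair"
c, k = encaps(pk)
assert decaps(sk, pk, c) == k
```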
Abstract:
Fossils and sediments preserved in caves are an excellent source of information for investigating impacts of past environmental changes on biodiversity. Until recently, studies have relied on morphology-based palaeontological approaches, but recent advances in molecular analytical methods offer excellent potential for extracting a greater array of biological information from these sites. This study presents a thorough assessment of DNA preservation from late Pleistocene–Holocene vertebrate fossils and sediments from Kelly Hill Cave, Kangaroo Island, South Australia. Using a combination of extraction techniques and sequencing technologies, ancient DNA was characterised from over 70 bones and 20 sediment samples from 15 stratigraphic layers ranging in age from >20 ka to ∼6.8 ka. A combination of primers targeting marsupial and placental mammals, reptiles, and two universal plant primers were used to reveal genetic biodiversity for comparison with the mainland and with the morphological fossil record for Kelly Hill Cave. We demonstrate that Kelly Hill Cave has excellent long-term DNA preservation, back to at least 20 ka. This contrasts with the majority of Australian cave sites thus far explored for ancient DNA preservation, and highlights the great promise Kangaroo Island caves hold for yielding the hitherto-elusive DNA of extinct Australian Pleistocene species.
Abstract:
Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky's schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt the schemes of Lyubashevsky to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme that provides strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion which is stronger than the classical full key exposure setting: even if the keypair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Considering unforgeability, the best lattice-based ring signature schemes provide either unforgeability against arbitrary chosen-subring attacks or against insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in those two settings. By increasing signature and key sizes by a factor k (typically 80-100), we provide a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.
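For intuition on the Fiat-Shamir framework these schemes build on, here is the classical discrete-log Schnorr signature (an illustrative sketch of the framework only; the paper's schemes are lattice-based): a commitment, a hash-derived challenge, and a response replace the interaction of an identification protocol.

```python
# Classical Schnorr signature as a minimal Fiat-Shamir example.
# Tiny toy group (p = 2q + 1, g of order q) -- NOT secure.
import hashlib, secrets

p, q, g = 227, 113, 4

def H(r: int, m: str) -> int:
    data = f"{r}|{m}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

sk = secrets.randbelow(q - 1) + 1   # secret key x
pk = pow(g, sk, p)                  # public key y = g^x

def sign(m: str):
    k = secrets.randbelow(q - 1) + 1
    r = pow(g, k, p)                # commitment
    e = H(r, m)                     # Fiat-Shamir: challenge from a hash
    s = (k + sk * e) % q            # response
    return e, s

def verify(m: str, e: int, s: int) -> bool:
    r = pow(g, s, p) * pow(pk, (-e) % q, p) % p   # g^s * y^(-e) = g^k
    return H(r, m) == e

e, s = sign("hello")
assert verify("hello", e, s)
```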
Abstract:
Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike (the few) prior constructions of PRE and KP-PRE that typically rely on bilinear maps under ad hoc assumptions, security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice hard problems that are conjectured to be immune to quantum cryptanalysis, or “post-quantum”. Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan’s exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other primitives based on LWE published in the literature.
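To make the PRE interface itself concrete, here is the classic ElGamal-based scheme of Blaze, Bleumer and Strauss (illustrative only: it is bidirectional and uses a toy group, whereas the paper's scheme is unidirectional, multi-hop and LWE-based):

```python
# Illustrative BBS-style proxy re-encryption over a tiny Schnorr group
# (p = 2q + 1) -- NOT secure, shown only for the PRE interface.
import secrets

p, q, g = 227, 113, 4

def keygen():
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)                   # (secret a, public y = g^a)

def enc(pk: int, m: int):
    r = secrets.randbelow(q - 1) + 1
    return m * pow(g, r, p) % p, pow(pk, r, p)   # (m*g^r, g^(a*r))

def dec(sk: int, c):
    c1, c2 = c
    gr = pow(c2, pow(sk, -1, q), p)          # (g^(a*r))^(1/a) = g^r
    return c1 * pow(gr, -1, p) % p

def rekey(sk_a: int, sk_b: int) -> int:
    return sk_b * pow(sk_a, -1, q) % q       # rk = b / a mod q

def reenc(rk: int, c):
    c1, c2 = c
    return c1, pow(c2, rk, p)                # g^(a*r) -> g^(b*r)

a_sk, a_pk = keygen()
b_sk, _ = keygen()
m = pow(g, 42, p)                            # message encoded in the subgroup
c = enc(a_pk, m)
assert dec(a_sk, c) == m                     # Alice decrypts
c_b = reenc(rekey(a_sk, b_sk), c)            # proxy re-encrypts, learns nothing
assert dec(b_sk, c_b) == m                   # Bob decrypts
```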
Abstract:
We revisit the venerable question of access credentials management, which concerns the techniques that we, humans with limited memory, must employ to safeguard our various access keys and tokens in a connected world. Although many existing solutions can be employed to protect a long secret using a short password, those solutions typically require certain assumptions on the distribution of the secret and/or the password, and are helpful against only a subset of the possible attackers. After briefly reviewing a variety of approaches, we propose a user-centric comprehensive model to capture the possible threats posed by online and offline attackers, from the outside and the inside, against the security of both the plaintext and the password. We then propose a few very simple protocols, adapted from the Ford-Kaliski server-assisted password generator and the Boldyreva unique blind signature in particular, that provide the best protection against all kinds of threats, for all distributions of secrets. We also quantify the concrete security of our approach in terms of online and offline password guesses made by outsiders and insiders, in the random-oracle model. The main contribution of this paper lies not in the technical novelty of the proposed solution, but in the identification of the problem and its model. Our results have an immediate and practical application for the real world: they show how to implement single-sign-on stateless roaming authentication for the Internet, in an ad hoc, user-driven fashion that requires no change to protocols or infrastructure.
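To convey the flavour of the underlying tools (a blind-RSA-style password-hardening sketch in the spirit of Ford-Kaliski server-assisted generation; toy parameters, not the paper's protocols): the client blinds a hash of the password, the server applies its secret exponent without learning the password, and the client unblinds to obtain a strong secret.

```python
# Illustrative server-assisted password hardening via blind RSA-style
# signing (toy parameters -- NOT secure). The server never sees the
# password or the derived secret.
import hashlib, secrets

p_, q_ = 1009, 1013                 # toy RSA primes held by the server
n = p_ * q_
e = 17
d = pow(e, -1, (p_ - 1) * (q_ - 1))

def h(password: str) -> int:
    return int.from_bytes(hashlib.sha256(password.encode()).digest(), "big") % n

# Client: blind the hashed password with a random factor r.
pw = h("correct horse battery staple")
while True:
    r = secrets.randbelow(n - 2) + 2
    try:
        r_inv = pow(r, -1, n)       # need gcd(r, n) == 1
        break
    except ValueError:
        continue
blinded = pw * pow(r, e, n) % n

# Server: applies its secret exponent to the blinded value only.
signed_blinded = pow(blinded, d, n)

# Client: unblind; the result equals pw^d mod n, recomputable only
# with the server's help, and is hashed into the final key.
secret = signed_blinded * r_inv % n
assert secret == pow(pw, d, n)
key = hashlib.sha256(str(secret).encode()).digest()
```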