991 results for THRESHOLD FUNCTIONS
Abstract:
This paper presents a comprehensive formal security framework for key derivation functions (KDFs). The major security goal for a KDF is to produce cryptographic keys from a private seed value such that the derived keys are indistinguishable from random binary strings. We form a framework of five security models for KDFs. This consists of four security models that we propose: Known Public Inputs Attack (KPM, KPS), Adaptive Chosen Context Information Attack (CCM) and Adaptive Chosen Public Inputs Attack (CPM); and another security model, previously defined by Krawczyk [6], which we refer to as Adaptive Chosen Context Information Attack (CCS). Each security model is expressed as an indistinguishability game. In addition, we prove the relationships between these five security models and analyse KDFs using the framework (in the random oracle model).
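The abstract does not give the game's formalization, so the following is only a minimal sketch of what a KDF indistinguishability game can look like, assuming an HKDF-style extract/expand construction built from HMAC-SHA256; the function names and the trivial random-guessing adversary are illustrative choices, not the paper's definitions.

```python
# Illustrative sketch of a KDF indistinguishability game (not the paper's exact
# formalization). The KDF is an HKDF-like extract/expand built on HMAC-SHA256.
import hmac, hashlib, os

def kdf(seed: bytes, context: bytes, length: int = 32) -> bytes:
    """Derive `length` key bytes from a private seed and public context info."""
    prk = hmac.new(b"\x00" * 32, seed, hashlib.sha256).digest()          # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                             # expand
        block = hmac.new(prk, block + context + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def indistinguishability_game(adversary, trials: int = 1000) -> float:
    """Challenger flips a bit b; the adversary sees (context, challenge), where the
    challenge is the derived key if b = 1 and a random string if b = 0, and must
    guess b. Returns the adversary's success rate over many trials."""
    wins = 0
    for _ in range(trials):
        seed, context = os.urandom(32), os.urandom(16)
        b = os.urandom(1)[0] & 1
        challenge = kdf(seed, context) if b else os.urandom(32)
        wins += int(adversary(context, challenge) == b)
    return wins / trials

# A random-guessing adversary wins with probability ~1/2, the baseline that a
# secure KDF should hold every efficient adversary close to.
print(indistinguishability_game(lambda ctx, ch: os.urandom(1)[0] & 1))
```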
Abstract:
Migraine is a painful and debilitating neurovascular disease. Current migraine head pain treatments work with differing efficacies in migraineurs. The opioid system plays an important role in diverse biological functions including analgesia, drug response and pain reduction. The A118G single nucleotide polymorphism (SNP) in exon 1 of the μ-opioid receptor gene (OPRM1) has been associated with elevated pain responses and decreased pain threshold in a variety of populations. The aim of this preliminary study was to test whether genotypes of the OPRM1 A118G SNP are associated with head pain severity in a clinical cohort of female migraineurs. A total of 153 chronic migraine with aura sufferers were assessed for migraine head pain using the Migraine Disability Assessment Score instrument and classified into high and low pain severity groups. DNA was extracted and genotypes obtained for the A118G SNP. Logistic regression analysis adjusting for age effects showed the A118G SNP of the OPRM1 gene to be significantly associated with migraine pain severity in the test population (P = 0.0037). In particular, G118 allele carriers were more likely to be high pain sufferers than homozygous carriers of the A118 allele (OR = 3.125, 95% CI = 1.41-6.93, P = 0.0037). These findings suggest that A118G genotypes of the OPRM1 gene may influence migraine-associated head pain in females. Further investigations are required to fully understand the effect of this gene variant on migraine head pain, including studies in males and in different migraine subtypes, as well as in response to head pain medication.
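As a point of method, the reported odds ratio comes from an age-adjusted logistic regression; the short sketch below shows how such an OR and its 95% CI are obtained. The data, column names and effect size are made up for illustration and do not reproduce the study's results.

```python
# Hypothetical illustration of an age-adjusted logistic regression producing a
# genotype odds ratio, as described in the abstract (synthetic data only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 153
df = pd.DataFrame({
    "g_carrier": rng.integers(0, 2, n),   # 1 = carries the G118 allele (hypothetical coding)
    "age": rng.normal(40, 10, n),
})
# Simulate high/low pain severity with a genuine genotype effect (true OR ~ 3).
true_logit = -0.5 + np.log(3.0) * df["g_carrier"] + 0.01 * (df["age"] - 40)
df["high_pain"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

model = smf.logit("high_pain ~ g_carrier + age", data=df).fit(disp=False)
print(np.exp(model.params["g_carrier"]))           # estimated odds ratio for G carriers
print(np.exp(model.conf_int().loc["g_carrier"]))   # its 95% confidence interval
```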
Abstract:
The importance of applying unsaturated soil mechanics to geotechnical engineering design is well understood. However, the time required and the need for specific laboratory testing apparatus to measure unsaturated soil properties have limited the application of unsaturated soil mechanics theories in practice. Although methods for predicting unsaturated soil properties have been developed, these methods need to be verified for a wide range of soil types in order to increase the confidence of practicing engineers in using them. In this study, a new permeameter was developed to measure the hydraulic conductivity of unsaturated soils using the steady-state method and directly measured suction (negative pore-water pressure) values. The apparatus is instrumented with two tensiometers for the direct measurement of suction during the tests, and can be used to obtain the hydraulic conductivity function of sandy soil over a low suction range (0-10 kPa). First, the repeatability of the unsaturated hydraulic conductivity measurement using the new permeameter was verified by conducting tests on two identical sandy soil specimens, which gave similar results. The hydraulic conductivity functions of the two sandy soils were then measured during the drying and wetting processes of the soils. A significant hysteresis was observed when the hydraulic conductivity was plotted against suction; however, the hysteresis effects were not apparent when the conductivity was plotted against the volumetric water content. Furthermore, the measured unsaturated hydraulic conductivity functions were compared with predictions from three different predictive methods that are widely incorporated into numerical software. The results suggest that these predictive methods are capable of capturing the measured behavior with reasonable agreement.
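The abstract does not name the three predictive methods it compares. One method that is widely built into numerical software is the van Genuchten-Mualem model, so the following sketch uses it purely as an assumed example; the parameter values are not from the study.

```python
# Hedged example: predicting relative hydraulic conductivity from suction with the
# van Genuchten-Mualem model (an assumed choice; parameters are illustrative only).
import numpy as np

def vg_mualem_kr(suction_kpa, alpha=0.5, n=2.5):
    """Relative hydraulic conductivity k_r as a function of suction (kPa).
    alpha (1/kPa) and n are van Genuchten fitting parameters for a sandy soil."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * suction_kpa) ** n) ** (-m)   # effective saturation
    return np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# Evaluate over the low suction range (0-10 kPa) covered by the new permeameter.
k_sat = 1e-5                                          # assumed saturated conductivity (m/s)
for s in np.linspace(0.0, 10.0, 6):
    print(f"suction = {s:4.1f} kPa, k = {k_sat * vg_mualem_kr(s):.3e} m/s")
```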
Abstract:
The Chemistry Discipline Network has recently completed two distinct mapping exercises. The first is a snapshot of chemistry taught at 12 institutions around Australia in 2011. There were many similarities but also important differences in the content taught and assessed at different institutions. There were also significant differences in delivery, particularly laboratory contact hours, as well as in the forms and weightings of assessment. The second exercise mapped the chemistry degrees at three institutions to the Threshold Learning Outcomes (TLOs) for chemistry. Importantly, some of the TLOs were addressed by multiple units at all institutions, while others were not met, or were met at an introductory level only. The exercise also exposed some challenges in using the TLOs as currently written.
Abstract:
Expert searchers engage with information as information brokers, researchers, reference librarians, information architects, faculty who teach advanced search, and in a variety of other information-intensive professions. Their experience is characterized by a profound understanding of information concepts and skills, and by an agile ability to apply this knowledge when interacting with, and having an impact on, the information environment. This study explored the learning experiences of searchers to understand the acquisition of search expertise. The research question was: What can be learned about becoming an expert searcher from the learning experiences of proficient novice searchers and highly experienced searchers? The key objectives were: (1) to explore the existence of threshold concepts in search expertise; and (2) to improve our understanding of how search expertise is acquired and how novice searchers, intent on becoming experts, can learn to search in more expert-like ways. The participant sample drew from two population groups: (1) highly experienced searchers with a minimum of 20 years of relevant professional experience, including LIS faculty who teach advanced search, information brokers, and search engine developers (11 subjects); and (2) MLIS students who had completed coursework in information retrieval and online searching and demonstrated exceptional ability (9 subjects). Using these two groups allowed a nuanced understanding of the experience of learning to search in expert-like ways, with data from those who search at a very high level as well as those who may be actively developing expertise. The study used semi-structured interviews, search tasks with think-aloud narratives, and talk-after protocols. Searches were screen-captured with simultaneous audio-recording of the think-aloud narrative. Data were coded and analyzed using NVivo 9 and manually. Grounded theory allowed categories and themes to emerge from the data. Categories represented conceptual knowledge and attributes of expert searchers. In accord with grounded theory method, once theoretical saturation was achieved, the data were viewed during the final stage of analysis through the lenses of existing theoretical frameworks. For this study, threshold concept theory (Meyer & Land, 2003) was used to explore which concepts might be threshold concepts. Threshold concepts have been used to explore transformative learning portals in subjects ranging from economics to mathematics. A threshold concept has five defining characteristics: it is transformative (causing a shift in perception), irreversible (unlikely to be forgotten), integrative (unifying separate concepts), troublesome (initially counter-intuitive), and may be bounded. The themes that emerged provided evidence of four concepts that had the characteristics of threshold concepts. Three were: information environment (the total information environment is perceived and understood); information structures (content, index structures, and retrieval algorithms are understood); and information vocabularies (fluency in search behaviors related to language, including natural language, controlled vocabulary, and finesse using proximity, truncation, and other language-based tools). The fourth threshold concept was concept fusion, the integration of the other three threshold concepts, further defined by three properties: visioning (anticipating next moves), being light on one's 'search feet' (the dancing property), and a profound ontological shift (identity as searcher).
In addition to the threshold concepts, findings were reported that were not concept-based, including the praxes and traits of expert searchers. A model of search expertise is proposed with the four threshold concepts at its core; the model also integrates the traits and praxes elicited from the study, attributes long recognized in LIS research as present in professional searchers. The research provides a deeper understanding of the transformative learning experiences involved in the acquisition of search expertise. It adds to our understanding of search expertise in the context of today's information environment and has implications for teaching advanced search, for research more broadly within library and information science, and for the methodologies used to explore threshold concepts.
Abstract:
This paper explores the theoretical framework of threshold concepts and its potential for LIS education. Threshold concepts are key ideas, often troublesome and counter-intuitive, that are critical to profound understanding of a domain. Once understood, they allow mastery of significant aspects of the domain, opening up new, previously inaccessible ways of thinking. The paper is developed in three parts. First, threshold concept theory is introduced and studies of its use in higher education are described, including emergent work related to LIS. Second, results of a recent study on learning experiences integral to learning to search are presented along with their implications for search expertise and search education, forming a case illustration of what threshold concept theory may contribute to this and other areas of LIS education. Third, the potential of threshold concept theory for LIS education is discussed. The paper concludes that threshold concept theory has much to offer LIS education, particularly for researching critical concepts and competencies, and considerations for a research agenda are put forth.
Abstract:
Gemcitabine is indicated in combination with cisplatin as first-line therapy for solid tumours including non-small cell lung cancer (NSCLC), bladder cancer and mesothelioma. Gemcitabine is an analogue of the pyrimidine nucleoside cytidine and functions as an anti-metabolite. Structurally, however, gemcitabine has similarities to 5-aza-2′-deoxycytidine (decitabine/Dacogen®), a DNA methyltransferase inhibitor (DNMTi). NSCLC, mesothelioma and prostate cancer cell lines were treated with decitabine and gemcitabine. Reactivation of epigenetically silenced genes was examined by RT-PCR/qPCR. DNA methyltransferase activity in nuclear extracts and recombinant proteins was measured using a DNA methyltransferase assay, and alterations in DNA methylation status were examined using methylation-specific PCR (MS-PCR) and pyrosequencing. We observed reactivation of several epigenetically silenced genes including GSTP1, IGFBP3 and RASSF1A. Gemcitabine functionally inhibited DNA methyltransferase activity in both nuclear extracts and recombinant proteins. Gemcitabine dramatically destabilised DNMT1 protein. However, DNA CpG methylation was for the most part unaffected by gemcitabine. In conclusion, gemcitabine both inhibits and destabilises DNA methyltransferases and reactivates epigenetically silenced genes, with activity equivalent to decitabine at concentrations significantly lower than those achieved in the treatment of patients with solid tumours. This property may contribute to the anticancer activity of gemcitabine.
Abstract:
This chapter describes an innovative method of curriculum design that combines phenomenographic research, and the associated variation theory of learning, with the notion of disciplinary threshold concepts to focus specialised design attention on the most significant and difficult parts of the curriculum. The method involves three primary stages: (i) identification of disciplinary concepts worthy of intensive curriculum design attention, using the criteria for threshold concepts; (ii) action research into variation in students' understandings and misunderstandings of those concepts, using phenomenography as the research approach; and (iii) design of learning activities to address the poorer understandings identified in the second stage, using variation theory as a guiding framework. The curriculum design method is inherently theory- and evidence-based. It was developed and trialled during a two-year project funded by the Australian Learning and Teaching Council, using the physics and law disciplines as case studies. Disciplinary teachers perceived the method as having a profound impact on their teaching and on their understanding of student learning. Attempts to measure the impact on student learning were less conclusive; teachers often unintentionally deviated from the design when putting it into practice for the first time. Suggestions for improved implementation of the method are discussed.
Abstract:
Dengue virus (DENV) transmission in Australia is driven by weather factors and imported dengue fever (DF) cases. However, uncertainty remains regarding the threshold effects of high-order interactions among weather factors and imported DF cases and the impact of these factors on autochthonous DF. A time-series regression tree model was used to assess the threshold effects of natural temporal variations of weekly weather factors and weekly imported DF cases in relation to incidence of weekly autochthonous DF from 1 January 2000 to 31 December 2009 in Townsville and Cairns, Australia. In Cairns, mean weekly autochthonous DF incidence increased 16.3-fold when the 3-week lagged moving average maximum temperature was <32 °C, the 4-week lagged moving average minimum temperature was ≥24 °C and the sum of imported DF cases in the previous 2 weeks was >0. When the 3-week lagged moving average maximum temperature was ≥32 °C and the other two conditions mentioned above remained the same, mean weekly autochthonous DF incidence only increased 4.6-fold. In Townsville, the mean weekly incidence of autochthonous DF increased 10-fold when 3-week lagged moving average rainfall was ≥27 mm, but it only increased 1.8-fold when rainfall was <27 mm during January to June. Thus, we found different responses of autochthonous DF incidence to weather factors and imported DF cases in Townsville and Cairns. Imported DF cases may also trigger and enhance local outbreaks under favorable climate conditions.
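The threshold cut points quoted above (e.g. maximum temperature 32 °C, rainfall 27 mm) are the kind of data-driven splits a regression tree produces. The sketch below illustrates that idea on entirely synthetic weekly data; the column names, lags and simulated effect are assumptions for illustration and are not the study's time-series regression tree model.

```python
# Minimal sketch of the threshold-splitting idea behind a regression tree analysis
# of lagged weather variables and imported cases (hypothetical data and columns).
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(1)
weeks = 520                                   # roughly ten years of weekly observations
df = pd.DataFrame({
    "tmax_lag3": rng.normal(31, 2, weeks),    # 3-week lagged moving average max temperature
    "tmin_lag4": rng.normal(23, 2, weeks),    # 4-week lagged moving average min temperature
    "rain_lag3": rng.gamma(2, 15, weeks),     # 3-week lagged moving average rainfall
    "imported_2wk": rng.poisson(0.3, weeks),  # imported DF cases in the previous 2 weeks
})
# Simulate local cases that only take off when importation and warm nights coincide.
rate = 0.2 + 3.0 * ((df["imported_2wk"] > 0) & (df["tmin_lag4"] >= 24))
df["local_cases"] = rng.poisson(rate)

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20).fit(
    df.drop(columns="local_cases"), df["local_cases"])
print(export_text(tree, feature_names=list(df.columns[:-1])))  # shows the learned cut points
```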
Abstract:
It is widely acknowledged that effective asset management requires an interdisciplinary approach, in which synergies should exist between traditional disciplines such as accounting, engineering, finance, humanities, logistics, and information systems technologies. Asset management is also an important yet complex business practice. Business process modelling is proposed as an approach to manage the complexity of asset management through the modelling of asset management processes. A sound foundation for the systematic application and analysis of business process modelling in asset management is, however, yet to be developed. Fundamentally, a business process consists of activities (termed functions), events/states, and control flow logic. As both events/states and control flow logic are somewhat dependent on the functions themselves, it is a logical step to first identify the functions within a process. This research addresses the current gap in knowledge by developing a method to identify functions common to various industry types (termed core functions). This lays the foundation for extracting such functions, so as to identify both commonalities and variation points in asset management processes. The method uses manual text mining and a taxonomy-based approach. An example is presented.
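The abstract does not publish its text-mining procedure, so the following is only a toy sketch of the underlying idea: extract candidate function labels from process descriptions and keep those shared across industries as core functions. The documents, the sentence-level extraction rule and the threshold of two industries are all hypothetical.

```python
# Toy illustration of identifying "core" functions common to several industries.
# The documents and the simplistic extraction rule are hypothetical examples only.
import re
from collections import Counter

process_docs = {
    "utilities": "inspect asset condition. schedule maintenance work. record failure data.",
    "rail":      "inspect asset condition. plan renewal work. record failure data.",
    "mining":    "inspect asset condition. schedule maintenance work. review spare parts.",
}

def extract_functions(text):
    # Treat each short sentence as a candidate function (activity) label.
    return {phrase.strip() for phrase in re.split(r"[.\n]", text) if phrase.strip()}

counts = Counter()
for functions in map(extract_functions, process_docs.values()):
    counts.update(functions)

# Core functions: labels appearing in at least two of the three industries.
core = [label for label, c in counts.items() if c >= 2]
print(core)
```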
Abstract:
Objectives: To investigate the relationship between two assessments to quantify delayed onset muscle soreness [DOMS]: visual analog scale [VAS] and pressure pain threshold [PPT]. Methods: Thirty-one healthy young men [25.8 ± 5.5 years] performed 10 sets of six maximal eccentric contractions of the elbow flexors with their non-dominant arm. Before and one to four days after the exercise, muscle pain perceived upon palpation of the biceps brachii at three sites [5, 9 and 13 cm above the elbow crease] was assessed by VAS with a 100 mm line [0 = no pain, 100 = extremely painful], and PPT of the same sites was determined by an algometer. Changes in VAS and PPT over time were compared amongst three sites by a two-way repeated measures analysis of variance, and the relationship between VAS and PPT was analyzed using a Pearson product-moment correlation. Results: The VAS increased one to four days after exercise and peaked two days post-exercise, while the PPT decreased most one day post-exercise and remained below baseline for four days following exercise [p < 0.05]. No significant difference among the three sites was found for VAS [p = 0.62] or PPT [p = 0.45]. The magnitude of change in VAS did not significantly correlate with that of PPT [r = −0.20, p = 0.28]. Conclusion: These results suggest that the level of muscle pain is not region-specific, at least among the three sites investigated in the study, and VAS and PPT provide different information about DOMS, indicating that VAS and PPT represent different aspects of pain.
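For clarity on the correlation step, the sketch below shows a Pearson product-moment correlation between per-participant changes in VAS and PPT, as described in the Methods. The numbers are invented and do not reproduce the study's data (which gave r = -0.20).

```python
# Illustration of the correlation analysis described: Pearson correlation between the
# change in VAS and the change in PPT across participants (synthetic values only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 31
delta_vas = rng.normal(45, 15, n)    # peak increase in VAS (mm) for each participant
delta_ppt = rng.normal(-80, 40, n)   # largest drop in PPT (kPa) for each participant

r, p = stats.pearsonr(delta_vas, delta_ppt)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```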
Abstract:
Many cell types form clumps or aggregates when cultured in vitro through a variety of mechanisms including rapid cell proliferation, chemotaxis, or direct cell-to-cell contact. In this paper we develop an agent-based model to explore the formation of aggregates in cultures where cells are initially distributed uniformly, at random, on a two-dimensional substrate. Our model includes unbiased random cell motion, together with two mechanisms which can produce cell aggregates: (i) rapid cell proliferation, and (ii) a biased cell motility mechanism where cells can sense other cells within a finite range, and will tend to move towards areas with higher numbers of cells. We then introduce a pair-correlation function which allows us to quantify aspects of the spatial patterns produced by our agent-based model. In particular, these pair-correlation functions are able to detect differences between domains populated uniformly at random (i.e. at the exclusion complete spatial randomness (ECSR) state) and those where the proliferation and biased motion rules have been employed - even when such differences are not obvious to the naked eye. The pair-correlation function can also detect the emergence of a characteristic inter-aggregate distance which occurs when the biased motion mechanism is dominant, and is not observed when cell proliferation is the main mechanism of aggregate formation. This suggests that applying the pair-correlation function to experimental images of cell aggregates may provide information about the mechanism associated with observed aggregates. As a proof of concept, we perform such analysis for images of cancer cell aggregates, which are known to be associated with rapid proliferation. The results of our analysis are consistent with the predictions of the proliferation-based simulations, which supports the potential usefulness of pair correlation functions for providing insight into the mechanisms of aggregate formation.
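As a concrete illustration of the quantification step, the sketch below computes a simple pair-correlation function for point locations on a rectangular domain: pairwise distances are binned and normalised by the count expected under complete spatial randomness, so uniformly random points give values near one while aggregation shows up as values above one at short range. This is a generic estimator that ignores edge effects, not the paper's exact formulation.

```python
# Simple pair-correlation estimator for 2D point patterns (generic sketch, edge
# effects ignored; not the paper's exact estimator).
import numpy as np

def pair_correlation(points, domain=(1.0, 1.0), dr=0.02, r_max=0.3):
    points = np.asarray(points)
    n = len(points)
    lx, ly = domain
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))[np.triu_indices(n, k=1)]
    edges = np.arange(0.0, r_max + dr, dr)
    counts, _ = np.histogram(dists, bins=edges)
    density = n / (lx * ly)
    # Expected pairs per annulus under complete spatial randomness.
    annulus_area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    expected = 0.5 * n * density * annulus_area
    r_mid = 0.5 * (edges[1:] + edges[:-1])
    return r_mid, counts / expected

# Uniformly random points should give g(r) close to 1 at all separations;
# clustered points give g(r) > 1 at small r.
rng = np.random.default_rng(3)
r, g = pair_correlation(rng.random((500, 2)))
print(np.round(g[:5], 2))
```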
Abstract:
In 2012 the New Zealand government spent $3.4 billion, or nearly $800 per person, on responses to crime via the justice system. Research shows that much of this spending does little to reduce the chances of re-offending. Relatively little money is spent on victims, the rehabilitation of offenders, or support for the families of offenders. This book is based on papers presented at the Costs of Crime forum held by the Institute of Policy Studies in February 2011. It presents lessons from what is happening in Australia, Britain and the United States and focuses on how best to manage crime, respond to victims, and reduce offending in a cost-effective manner in a New Zealand context. It is clear that strategies are needed that are based on better research and a more informed approach to policy development. Such strategies must assist victims constructively while also reducing offending. Using public resources to lock as many people in our prisons as possible cannot be justified by the evidence and is fiscally unsustainable; nor does such an approach make society safer. To reduce the costs of crime we need to reinvest resources in effective strategies that build positive futures for those at risk and the communities needed to sustain them.
An improved chemically inducible gene switch that functions in the monocotyledonous plant sugar cane
Abstract:
Chemically inducible gene switches can provide precise control over gene expression, enabling more specific analyses of gene function and expanding the plant biotechnology toolkit beyond traditional constitutive expression systems. The alc gene expression system is one of the most promising chemically inducible gene switches in plants because of its potential in both fundamental research and commercial biotechnology applications. However, there are no published reports demonstrating that this versatile gene switch is functional in transgenic monocotyledonous plants, which include some of the most important agricultural crops. We found that the original alc gene switch was ineffective in the monocotyledonous plant sugar cane, and describe a modified alc system that is functional in this globally significant crop. A promoter consisting of tandem copies of the ethanol receptor inverted repeat binding site, in combination with a minimal promoter sequence, was sufficient to give enhanced sensitivity and significantly higher levels of ethanol inducible gene expression. A longer CaMV 35S minimal promoter than was used in the original alc gene switch also substantially improved ethanol inducibility. Treating the roots with ethanol effectively induced the modified alc system in sugar cane leaves and stem, while an aerial spray was relatively ineffective. The extension of this chemically inducible gene expression system to sugar cane opens the door to new opportunities for basic research and crop biotechnology.
Abstract:
To this day, standard-model realizations of (lossy) trapdoor functions from discrete-log-type assumptions require large public key sizes, e.g., about Θ(λ²) group elements for a reduction from the decisional Diffie-Hellman assumption (where λ is a security parameter). We propose two realizations of lossy trapdoor functions that achieve a public key size of only Θ(λ) group elements in bilinear groups, with a reduction from the decisional Bilinear Diffie-Hellman assumption. Our first construction achieves this result at the expense of a long common reference string of Θ(λ²) elements, albeit reusable in multiple LTDF instantiations. Our second scheme also achieves public keys of size Θ(λ), entirely in the standard model and in particular without any reference string, at the cost of a slightly more involved construction. The main technical novelty, developed for the second scheme, is a compact encoding technique for generating compressed representations of certain sequences of group elements for the public parameters.