254 results for Complexity of Distribution
Abstract:
This paper forms one part of a broadly-based study into the use of humour within tertiary teaching. One theme to emerge from semi-structured, in-depth interviews with university academics concerns the setting of boundaries to the appropriate use of humour within lectures and tutorials. Following the ‘benign violations’ theory of humour—wherein, to be funny, a situation/statement must be some kind of a social violation, that violation must be regarded as relatively benign, and the two ideas must be held simultaneously—this paper suggests that the willingness of academics to use particular types of humour in their teaching revolves around the complexities of determining the margins of the benign. These margins are shaped in part by pedagogic limitations, but also by professional delimitations. In terms of limitations, the boundaries of humour are set by the academic environment of the university, by the characteristics of different cohorts of students, and by what those students are prepared to laugh at. In terms of delimitations—where humour choice is moderated, not by the possibility of immediate laughter, but rather by the consequences of that choice—academic seniority and security play a large role in determining what kinds of humour will be used, and where boundaries are to be set. The central conclusion here is that formal maxims of humour use—‘Never tease students’, ‘Don’t joke about potentially sensitive issues’—fail to account for the complexity of teaching relationships, for the differences between student cohorts, and for the talents and standing of particular teachers.
Abstract:
Injection velocity has been recognized as a key variable in thermoplastic injection molding. Its closed-loop control is, however, difficult due to the complexity of the process dynamic characteristics. The basic requirements of the control system include tracking of a pre-determined injection velocity curve defined in a profile, load rejection and robustness. It is difficult for a conventional control scheme to meet all these requirements. Injection velocity dynamics are first analyzed in this paper. Then a novel double-controller scheme is adopted for the injection velocity control. This scheme allows an independent design of set-point tracking and load rejection and has good system robustness. The implementation of the double-controller scheme for injection velocity control is discussed. Special techniques such as profile transformation and shifting are also introduced to improve the velocity responses. The proposed velocity control has been experimentally demonstrated to be effective for a wide range of processing conditions.
Abstract:
Intensity Modulated Radiotherapy (IMRT) is a well established technique for delivering highly conformal radiation dose distributions. The complexity of the delivery techniques and high dose gradients around the target volume make verification of the patient treatment crucial to the success of the treatment. Conventional treatment protocols involve imaging the patient prior to treatment, comparing the patient set-up to the planned set-up and then making any necessary shifts in the patient position to ensure target volume coverage. This paper presents a method for calibrating electronic portal imaging device (EPID) images acquired during IMRT delivery so that they can be used for verifying the patient set-up.
Abstract:
Objective To examine the clinical utility of the Cornell Scale for Depression in Dementia (CSDD) in nursing homes. Setting 14 nursing homes in Sydney and Brisbane, Australia. Participants 92 residents with a mean age of 85 years. Measurements Consenting residents were assessed by care staff for depression using the CSDD as part of their routine assessment. Specialist clinicians conducted assessments of depression using the Semi-structured Clinical Diagnostic Interview for DSM-IV-TR Axis I Disorders for residents without dementia, or the Provisional Diagnostic Criteria for Depression in Alzheimer Disease for residents with dementia, to establish expert clinical diagnoses of depression. The diagnostic performance of the staff-completed CSDD was analyzed against expert diagnosis using receiver operating characteristic (ROC) curves. Results The CSDD showed low diagnostic accuracy, with areas under the ROC curve of 0.69, 0.68 and 0.70 for the total sample, residents with dementia and residents without dementia, respectively. At the standard CSDD cutoff score, the sensitivity and specificity were 71% and 59% for the total sample, 69% and 57% for residents with dementia, and 75% and 61% for residents without dementia. The Youden index (for optimizing cut-points) suggested different depression cutoff scores for residents with and without dementia. Conclusion When administered by nursing home staff, the clinical utility of the CSDD in identifying depression is highly questionable. The complexity of the scale, the time required for collecting relevant information, and staff skills and knowledge of assessing depression in older people must be considered when using the CSDD in nursing homes.
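The Youden index referenced in this abstract is simply J = sensitivity + specificity − 1; the cut-point maximizing J is the usual "optimal" threshold. A minimal sketch, using only the sample figures the abstract reports (no new data):

```python
def youden_index(sensitivity: float, specificity: float) -> float:
    """Youden's J statistic: J = sensitivity + specificity - 1.
    The cutoff maximizing J balances true-positive and true-negative rates."""
    return sensitivity + specificity - 1.0

# Figures reported for the staff-completed CSDD at the standard cutoff:
j_total = youden_index(0.71, 0.59)        # total sample
j_dementia = youden_index(0.69, 0.57)     # residents with dementia
j_no_dementia = youden_index(0.75, 0.61)  # residents without dementia
```

A J of around 0.26-0.36, as these figures give, is consistent with the abstract's conclusion of low diagnostic accuracy (J ranges from 0 for a useless test to 1 for a perfect one).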
Abstract:
Design deals with improving the lives of people. As such, interactions with products, interfaces, and systems should not only serve usable and practical concerns but also mediate emotionally meaningful experiences. This paper presents an integrated and comprehensive model of experience, labeled the 'Unified User Experience Model', covering the most prominent perspectives from across the design field. It is intended to support designers from different disciplines in considering the complexity of user experience. The vision of the model is to support both the analysis of existing products, interfaces, and systems, and the development of new designs that take this complexity into account. In essence, we hope the model can enable designers to develop more marketable, appropriate, and enhanced products to improve experiences and ultimately the lives of people.
Abstract:
The current state of the prefabricated housing market in Australia is systematically profiled, guided by a theoretical systems model. Particular focus is given to two original data collections. The first identifies manufacturers and builders using prefabrication innovations, and the second compares the context for prefabricated housing in Australia with that of key international jurisdictions. The results indicate a small but growing market for prefabricated housing in Australia, often building upon expertise developed through non-residential building applications. The international comparison highlighted the complexity of the interactions between macro policy decisions and historical influences, and the uptake of prefabricated housing. The data suggest that factors such as the small scale of the Australian market and a lack of investment in research, development and training have not encouraged prefabrication. A lack of clear regulatory policy surrounding prefabricated housing is common both in Australia and internationally, with local effects regarding home warranties and housing finance highlighted. Future research should target the continuing lack of consideration of prefabrication from within the housing construction industry, and build upon the research reported in this paper to further quantify the potential end-user market and the continuing development of the industry.
Abstract:
Monogenetic volcanoes have long been regarded as simple in nature, involving single magma batches and uncomplicated evolutions; however, recent detailed research into individual centres is challenging that assumption. Mt Rouse (Kolor) is the volumetrically largest volcano in the monogenetic Newer Volcanics Province of southeast Australia. This study presents new major, trace and Sr–Nd–Pb isotope data for samples selected on the basis of a detailed stratigraphic framework analysis of the volcanic products from Mt Rouse. The volcano is the product of three magma batches geochemically similar to ocean–island basalts, featuring increasing LREE enrichment with each magma batch (batches A, B and C) but no evidence of crustal contamination; the Sr–Nd–Pb isotopes define two groupings. Modelling suggests that the magmas were sourced from a zone of partial melting crossing the lithosphere–asthenosphere boundary, with batch A forming a large-volume partial melt in the deep lithosphere (1.7 GPa/55.5 km); and batches B and C from similar areas within the shallow asthenosphere (1.88 GPa/61 km and 1.94 GPa/63 km, respectively). The formation and extraction of these magmas may have been due to high deformation rates in the mantle caused by edge-driven convection and asthenospheric upwelling. The lithosphere–asthenosphere boundary is important with respect to NVP volcanism. An eruption chronology involves sequential eruption of magma batches A, C and B, followed by simultaneous eruption of batches A and B. Mt Rouse is a complex polymagmatic monogenetic volcano that illustrates the complexity of monogenetic volcanism and demonstrates the importance of combining detailed stratigraphic analysis alongside systematic geochemical sampling.
Abstract:
In this paper, the security of two recent RFID mutual authentication protocols is investigated. The first protocol is a scheme proposed by Huang et al. [7] and the second one by Huang, Lin and Li [6]. We show that these two protocols have several weaknesses. In Huang et al.’s scheme, an adversary can determine the 32-bit secret password with a probability of 2^-2, and in the Huang-Lin-Li scheme, a passive adversary can recognize a target tag with a success probability of 1 - 2^-4 and an active adversary can determine all 32 bits of the Access password with a success probability of 2^-4. The computational complexity of these attacks is negligible.
Abstract:
In this paper, we analyze the SHAvite-3-512 hash function, as proposed and tweaked for round 2 of the SHA-3 competition. We present cryptanalytic results on 10 out of 14 rounds of the hash function SHAvite-3-512, and on the full 14-round compression function of SHAvite-3-512. We show a second preimage attack on the hash function reduced to 10 rounds with a complexity of 2^497 compression function evaluations and 2^16 memory. For the full 14-round compression function, we give a chosen-counter, chosen-salt preimage attack with 2^384 compression function evaluations and 2^128 memory (or complexity 2^448 without memory), and a collision attack with 2^192 compression function evaluations and 2^128 memory.
Abstract:
In this paper we present concrete collision and preimage attacks on a large class of compression function constructions making two calls to the underlying ideal primitives. The complexity of the collision attack is above the theoretical lower bound for constructions of this type, but below the birthday complexity; the complexity of the preimage attack, however, is equal to the theoretical lower bound. We also present undesirable properties of some of Stam’s compression functions proposed at CRYPTO ’08. We show that when one of the n-bit to n-bit components of the proposed 2n-bit to n-bit compression function is replaced by a fixed-key cipher in the Davies-Meyer mode, the complexity of finding a preimage would be 2^(n/3). We also show that the complexity of finding a collision in a variant of the 3n-bit to 2n-bit scheme with its output truncated to 3n/2 bits is 2^(n/2). The complexity of our preimage attack on this hash function is about 2^n. Finally, we present a collision attack on a variant of the proposed (m+s)-bit to s-bit scheme, truncated to s-1 bits, with a complexity of O(1). However, none of our results compromise Stam’s security claims.
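The "birthday complexity" this abstract compares against is the generic bound for any ideal n-bit hash: a collision is expected after roughly 2^(n/2) queries, regardless of the construction. A minimal illustrative sketch of that bound (not code from the paper):

```python
import math

def birthday_bound(n_bits: int) -> int:
    """Approximate number of random queries needed for a ~50% collision
    chance against an ideal hash with n_bits of output: about 2**(n_bits/2)."""
    return math.isqrt(2 ** n_bits)

# For an n-bit compression function output with n = 256,
# the generic collision bound is 2**128 queries:
print(birthday_bound(256) == 2 ** 128)  # True
```

An attack "below the birthday complexity", as claimed above, means fewer than 2^(n/2) queries, i.e. the construction is weaker than an ideal hash of the same output size.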
Abstract:
Many RFID protocols use cryptographic hash functions for their security. The resource-constrained nature of RFID systems forces the use of lightweight cryptographic algorithms. Tav-128 is one such 128-bit lightweight hash function proposed by Peris-Lopez et al. for a low-cost RFID tag authentication protocol. Apart from some statistical tests for randomness by the designers themselves, Tav-128 has not undergone any other thorough security analysis. Based on these tests, the designers claimed that Tav-128 does not possess any trivial weaknesses. In this article, we carry out the first third-party security analysis of Tav-128 and show that this hash function is neither collision resistant nor second preimage resistant. First, we show a practical collision attack on Tav-128 with a complexity of 2^37 calls to the compression function, producing message pairs of arbitrary length which hash to the same value under this hash function. We then show a second preimage attack on Tav-128 which succeeds with a complexity of 2^62 calls to the compression function. Finally, we study the constituent functions of Tav-128 and show that the concatenation of the nonlinear functions A and B produces a 64-bit permutation from 32-bit messages. This could be a useful lightweight primitive for future RFID protocols.
Abstract:
This study evaluated the complexity of calcium ion exchange with sodium-exchanged weak acid cation resin (DOW MAC-3). Exchange equilibria recorded for a range of different solution normalities revealed profiles which were represented by conventional “L” or “H” type isotherms at low values of equilibrium concentration (Ce) of calcium ions, plus a superimposed region of increasing calcium uptake observed at high Ce values. The loading of calcium ions was determined to be ca. 53.5 to 58.7 g/kg of resin when modelling only the sorption curve created at low Ce values, which exhibited a well-defined plateau. The calculated calcium ion loading capacity for DOW MAC-3 resin appeared to correlate with the manufacturer's recommendation. The phenomenon of super-equivalent ion exchange (SEIX) was observed when the “driving force” for the exchange process was increased in excess of 2.25 mmol calcium ions per gram of resin in the starting solution. This latter event was explained in terms of displacement of sodium ions from sodium hydroxide solution which remained in the resin bead following the initial conversion of the as-supplied “H+”-exchanged resin sites to the “Na+” version required for softening studies. Evidence for hydrolysis of a small fraction of the sites on the sodium-exchanged resin surface was noted. The importance of carefully choosing experimental parameters was discussed, especially in relation to application of the Langmuir–Vageler expression. This latter model, which compares the ratio of the initial calcium ion concentration in solution to resin mass against the final equilibrium loading of calcium ions on the resin, was found to be an excellent means of following the progress of the calcium–sodium ion exchange process. Moreover, the Langmuir–Vageler model facilitated standardization of various calcium–sodium ion exchange experiments, which allowed systematic experimental design.
Abstract:
This thesis critically explored the concept of collaboration through an analysis of the experiences of midwives, child health nurses and women in the process of transition from hospital to community care, and of related policy documents. The research concluded that the concept serves an important social function in obscuring the complexity of social relations in healthcare. Rather than adopt an unquestioning attitude to what is represented as collaboration, this thesis argues for a more critical examination of what is occurring, what is potentially hidden and how specific interests are served through its use.
Abstract:
"Book Description: The phenomenon which dialogism addresses is human interaction. It enables us to conceptualise human interaction as intersubjective, symbolic, cultural, transformative and conflictual, in short, as complex. The complexity of human interaction is evident in all domains of human life, for example, in therapy, education, health intervention, communication, and coordination at all levels. A dialogical approach starts by acknowledging that the social world is perspectival, that people and groups inhabit different social realities. This book stands apart from the proliferation of recent books on dialogism, because rather than applying dialogism to this or that domain, the present volume focuses on dialogicality itself to interrogate the concepts and methods which are taken for granted in the burgeoning literature. (Imprint: Nova Press)"--Publisher website
Abstract:
We show the first deterministic construction of an unconditionally secure multiparty computation (MPC) protocol in the passive adversarial model over black-box non-Abelian groups which is both optimal (secure against an adversary who possesses any t