379 results for Mutual Gains
Abstract:
AIM: To draw on empirical evidence to illustrate the core role of nurse practitioners in Australia and New Zealand. BACKGROUND: Enacted legislation provides for mutual recognition of qualifications, including nursing, between New Zealand and Australia. As the nurse practitioner role is relatively new in both countries, there is no consistency in role expectation and hence mutual recognition has not yet been applied to nurse practitioners. A study jointly commissioned by both countries' Regulatory Boards developed information on the core role of the nurse practitioner, to develop shared competency and educational standards. Reporting on this study's process and outcomes provides insights that are relevant both locally and internationally. METHOD: This interpretive study used multiple data sources, including published and grey literature, policy documents, nurse practitioner program curricula and interviews with 15 nurse practitioners from the two countries. Data were analysed according to the appropriate standard for each data type and included both deductive and inductive methods. The data were aggregated thematically according to patterns within and across the interview and material data. FINDINGS: The core role of the nurse practitioner was identified as having three components: dynamic practice, professional efficacy and clinical leadership. Nurse practitioner practice is dynamic and involves the application of high level clinical knowledge and skills in a wide range of contexts. The nurse practitioner demonstrates professional efficacy, enhanced by an extended range of autonomy that includes legislated privileges. The nurse practitioner is a clinical leader with a readiness and an obligation to advocate for their client base and their profession at the systems level of health care. 
CONCLUSION: A clearly articulated and research informed description of the core role of the nurse practitioner provides the basis for development of educational and practice competency standards. These research findings provide new perspectives to inform the international debate about this extended level of nursing practice. RELEVANCE TO CLINICAL PRACTICE: The findings from this research have the potential to achieve a standardised approach and internationally consistent nomenclature for the nurse practitioner role.
Abstract:
This paper considers the question of designing a fully image-based visual servo control for a class of dynamic systems. The work is motivated by the ongoing development of image-based visual servo control of small aerial robotic vehicles. The kinematics and dynamics of a rigid-body dynamical system (such as a vehicle airframe) maneuvering over a flat target plane with observable features are expressed in terms of an unnormalized spherical centroid and an optic flow measurement. The image-plane dynamics with respect to force input are dependent on the height of the camera above the target plane. This dependence is compensated by introducing virtual height dynamics and adaptive estimation in the proposed control. A fully nonlinear adaptive control design is provided that ensures asymptotic stability of the closed-loop system for all feasible initial conditions. The choice of control gains is based on an analysis of the asymptotic dynamics of the system. Results from a realistic simulation are presented that demonstrate the performance of the closed-loop system. To the author's knowledge, this paper documents the first time that an image-based visual servo control has been proposed for a dynamic system using vision measurement for both position and velocity.
Abstract:
Minimizing complexity of group key exchange (GKE) protocols is an important milestone towards their practical deployment. An interesting approach to achieve this goal is to simplify the design of GKE protocols by using generic building blocks. In this paper we investigate the possibility of founding GKE protocols on a primitive called a multi-key encapsulation mechanism (mKEM) and describe advantages and limitations of this approach. In particular, we show how to design a one-round GKE protocol which satisfies the classical requirement of authenticated key exchange (AKE) security, yet without forward secrecy. As a result, we obtain the first one-round GKE protocol secure in the standard model. We also conduct our analysis using recent formal models that take into account both outsider and insider attacks as well as the notion of key compromise impersonation resilience (KCIR). In contrast to previous models, we show how to model both outsider and insider KCIR within the definition of mutual authentication. Our analysis additionally implies that the insider security compiler by Katz and Shin from ACM CCS 2005 can be used to achieve more than what is shown in the original work, namely both outsider and insider KCIR.
Abstract:
This paper investigates whether Socially Responsible Investment (SRI) is more or less sensitive to market downturns than conventional investment, and examines the legal implications for fund managers and trustees. Using a market model methodology, we find that over the past 15 years, the beta risk of SRI, both in Australia and internationally, increased more than that of conventional investment during economic downturns. This implies that companies acting as fund trustees, managed investment schemes and traditional institutional fund managers risk breaching their fiduciary or statutory duties if they go long - or remain long - in SRI funds during market downturns, unless perhaps relevant legislation is reformed. If reform is viewed as desirable, possible reforms could include: explicitly overriding the common law to allow all traditional funds to invest in SRI; granting immunity to directors of trustee companies from potential personal liability under sections 197 or 588G et seq of the Corporations Act; allowing companies acting as trustees, managed investment schemes and traditional institutional fund managers and trustees to invest in SRI without triggering a substantial capital gains tax liability through trust resettlement; tax concessions for SRI (e.g. introducing a 150% tax deduction or investment allowance for SRI); and allowing SRI sub-funds to obtain “deductible gift recipient” status or the equivalent from relevant taxation authorities. The research is important and original insofar as the assessment of risk in SRIs during market downturns is an area which has hitherto not been subjected to rigorous empirical investigation, despite its serious legal implications.
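As a concrete illustration of the market model methodology mentioned above, beta can be estimated as the OLS slope of fund returns regressed on market returns. The sketch below is a minimal illustration only; the `estimate_beta` helper and the return series are hypothetical, not taken from the study:

```python
import numpy as np

def estimate_beta(fund_returns, market_returns):
    """Market-model beta: the OLS slope of fund returns on market returns,
    i.e. cov(fund, market) / var(market)."""
    cov = np.cov(fund_returns, market_returns, ddof=1)  # 2x2 sample covariance
    return cov[0, 1] / cov[1, 1]

# Hypothetical monthly returns; this fund amplifies market moves, so beta > 1.
market = [0.02, -0.03, 0.01, -0.05, 0.04]
fund = [0.025, -0.04, 0.012, -0.065, 0.05]
print(round(estimate_beta(fund, market), 2))  # 1.28
```

A beta above 1 estimated over a downturn window is precisely the heightened market sensitivity the paper flags for SRI funds.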
Abstract:
Transport regulators consider that, with respect to pavement damage, heavy vehicles (HVs) are the riskiest vehicles on the road network. That HV suspension design contributes to road and bridge damage has been recognised for some decades. This thesis deals with some aspects of HV suspension characteristics, particularly (but not exclusively) air suspensions. It covers the development of low-cost in-service HV suspension testing, the effects of larger-than-industry-standard longitudinal air lines and the characteristics of on-board mass (OBM) systems for HVs. All these areas, whilst seemingly disparate, seek to inform the management of HVs, reduce their impact on the network asset and/or provide a measurement mechanism for worn HV suspensions. A number of project management groups at the State and National level in Australia have been, and will be, presented with the results of the project that resulted in this thesis. This should serve to inform their activities applicable to this research. A number of HVs were tested for various characteristics. These tests were used to form a number of conclusions about HV suspension behaviours. Wheel forces from road test data were analysed. A “novel roughness” measure was developed and applied to the road test data to determine dynamic load sharing, amongst other research outcomes. Further, it was proposed that this approach could inform future development of pavement models incorporating roughness and peak wheel forces. Left/right variations in wheel forces and wheel force variations for different speeds were also presented. This led to some conclusions regarding suspension and wheel force frequencies, their transmission to the pavement and repetitive wheel loads in the spatial domain. An improved method of determining dynamic load sharing was developed and presented. It used the correlation coefficient between two elements of a HV to determine dynamic load sharing.
This was validated against a mature dynamic load sharing metric, the dynamic load sharing coefficient (de Pont, 1997). This was the first time that the technique of measuring correlation between elements on a HV had been used for a test case vs. a control case for two different sized air lines. That dynamic load sharing was improved at the air springs was shown for the test case of the large longitudinal air lines. The statistically significant improvement in dynamic load sharing at the air springs from larger longitudinal air lines varied from approximately 30 percent to 80 percent. Dynamic load sharing at the wheels was improved only for low air line flow events for the test case of larger longitudinal air lines. Statistically significant improvements to some suspension metrics across the range of test speeds and “novel roughness” values were evident from the use of larger longitudinal air lines, but these were not uniform. Of note were improvements to suspension metrics involving peak dynamic forces, ranging from below the error margin to approximately 24 percent. Abstract models of HV suspensions were developed from the results of some of the tests. Those models were used to propose further development of, and future directions of research into, further gains in HV dynamic load sharing, via alterations to currently available damping characteristics combined with implementation of large longitudinal air lines. In-service testing of HV suspensions was found to be possible within a documented error range from below the error margin to approximately 16 percent. These results were in comparison with either the manufacturer’s certified data or test results replicating the Australian standard for “road-friendly” HV suspensions, Vehicle Standards Bulletin 11.
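The correlation-based load-sharing idea described above can be sketched briefly. The function name, the synthetic wheel-force traces and the signal model below are illustrative inventions, not the thesis's instrumentation or data:

```python
import numpy as np

def load_sharing_correlation(force_a, force_b):
    """Correlation coefficient between two wheel-force (or air-spring) signals:
    values near 1 indicate the two elements' dynamic loads rise and fall
    together, i.e. good dynamic load sharing; values near 0 indicate poor sharing."""
    return float(np.corrcoef(force_a, force_b)[0, 1])

# Hypothetical wheel-force traces (kN) for two axles over a rough segment:
# a shared body-bounce component plus independent axle-hop noise per axle.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 2.0, 400)
shared = 45.0 + 5.0 * np.sin(2 * np.pi * 1.5 * t)
axle_1 = shared + rng.normal(0.0, 1.0, t.size)
axle_2 = shared + rng.normal(0.0, 1.0, t.size)
print(load_sharing_correlation(axle_1, axle_2))  # close to 1: loads move together
```

The appeal of a correlation measure is that it is dimensionless and directly comparable between a test case and a control case, which is how the thesis uses it.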
OBM accuracy testing and development of tamper evidence from OBM data were detailed for over 2000 individual data points across twelve test and control OBM systems from eight suppliers installed on eleven HVs. The results indicated that 95 percent of contemporary OBM systems available in Australia are accurate to +/- 500 kg. The total variation in OBM linearity, after three outliers in the data were removed, was 0.5 percent. A tamper indicator and other OBM metrics that could be used by jurisdictions to determine tamper events were developed and documented. That OBM systems could be used as one vector for in-service testing of HV suspensions was one of a number of synergies between the seemingly disparate streams of this project.
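The ±500 kg accuracy finding above suggests a simple tolerance check of the following form; the helper name and the readings are hypothetical, not the study's data:

```python
def fraction_within_tolerance(obm_readings_kg, reference_kg, tol_kg=500):
    """Fraction of on-board mass readings falling within +/- tol_kg of the
    corresponding reference (e.g. weighbridge) measurements."""
    hits = sum(
        1 for obm, ref in zip(obm_readings_kg, reference_kg)
        if abs(obm - ref) <= tol_kg
    )
    return hits / len(obm_readings_kg)

# Hypothetical OBM readings against a 42 t weighbridge reference.
obm = [42100, 41800, 42600, 41950]
ref = [42000, 42000, 42000, 42000]
print(fraction_within_tolerance(obm, ref))  # 0.75: one reading is 600 kg out
```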
Abstract:
Four studies report on outcomes for long-term unemployed individuals who attend occupational skills/personal development training courses in Australia. Levels of distress, depression, guilt, anger, helplessness, positive and negative affect, life satisfaction and self-esteem were used as measures of well-being. Employment value, employment expectations and employment commitment were used as measures of work attitude. Social support, financial strain, and use of community resources were used as measures of life situation. Other variables investigated were causal attribution, unemployment blame, levels of coping, self-efficacy, the personality variable of neuroticism, the psycho-social climate of the training course, and changes to occupational status. Training courses were (a) government-funded occupational skills-based programs which included some components of personal development training, and (b) a specially developed course which focused exclusively on improving well-being and which utilised the cognitive-behavioural therapy (CBT) approach. Data for all studies were collected longitudinally by having subjects complete questionnaires pre-course, post-course, and (for 3 of the 4 studies) at 3 months follow-up, in order to investigate long-term effects. One of the studies utilised the case-study methodology and was designed to be illustrative and assist in interpreting the quantitative data from the other 3 evaluations. The outcomes for participants were contrasted with control subjects who met the same selection criteria for training. Results confirmed earlier findings that the experiences of unemployment were negative. Immediate effects of the courses were to improve well-being. Improvements were greater for those who attended courses with higher levels of personal development input, and the best results were obtained from the specially developed CBT program.
Participants who had lower levels of well-being at the beginning of the courses did better as a result of training than those who were already functioning at higher levels. Course participants gained only marginal advantages over control subjects in relation to improving their occupational status. Many of the short-term well-being gains made as a result of attending the courses were still evident at 3 months follow-up. Best results were achieved for the specially designed CBT program. Results were discussed in the context of prevailing theories of unemployment (Fryer, 1986, 1988; Jahoda, 1981, 1982; Warr, 1987a, 1987b).
Abstract:
The effective daylighting of multistorey commercial building interiors poses an interesting problem for designers in Australia’s tropical and subtropical context. Given that a building exterior receives adequate sun and skylight as dictated by location-specific factors such as weather, siting and external obstructions, the availability of daylight throughout its interior is dependent on certain building characteristics: the distance from a window façade (room depth), ceiling or window head height, window size and the visible transmittance of daylighting apertures. The daylighting of general stock, multistorey commercial buildings is made difficult by their design limitations with respect to some of these characteristics. The admission of daylight to these interiors is usually exclusively by vertical windows. Using conventional glazing, such windows can only admit sun and skylight to a depth of approximately 2 times the window height. This penetration depth is typically much less than the depth of the office interiors, so that core areas of these buildings receive little or no daylight. This issue is particularly relevant where deep, open-plan office layouts prevail. The resulting interior daylight pattern is a relatively narrow perimeter zone bathed in (sometimes too intense) light, contrasted with a poorly daylit core zone. The broad luminance range this may present to a building occupant’s visual field can be a source of discomfort glare. Furthermore, the need in most tropical and subtropical regions to restrict solar heat gains to building interiors for much of the year has resulted in the widespread use of heavily tinted or reflective glazing on commercial building façades. This strategy reduces the amount of solar radiation admitted to the interior, thereby decreasing daylight levels proportionately throughout. However, this technique does little to improve the way light is distributed throughout the office space.
Where clear skies dominate weather conditions, at different times of day or year direct sunlight may pass unobstructed through vertical windows causing disability or discomfort glare for building occupants and as such, its admission to an interior must be appropriately controlled. Any daylighting system to be applied to multistorey commercial buildings must consider these design obstacles, and attempt to improve the distribution of daylight throughout these deep, sidelit office spaces without causing glare conditions. The research described in this thesis delineates first the design optimisation and then the actual prototyping and manufacture process of a daylighting device to be applied to such multistorey buildings in tropical and subtropical environments.
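The rule of thumb quoted above, that conventional vertical glazing daylights a zone roughly twice the window height deep, can be expressed as a trivial calculation; the function name and the dimensions are illustrative only:

```python
def daylit_depth_m(window_head_height_m, factor=2.0):
    """Rule-of-thumb penetration depth of useful daylight from a vertical
    window with conventional glazing: roughly `factor` times the window
    head height."""
    return factor * window_head_height_m

# A hypothetical 2.7 m window head height daylights roughly the first 5.4 m
# of floor depth, leaving the core of a 12 m deep open-plan office under-lit.
print(daylit_depth_m(2.7))  # 5.4
```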
Abstract:
This study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2006). IS-Impact is defined as "a measure at a point in time, of the stream of net benefits from the IS [Information System], to date and anticipated, as perceived by all key-user-groups" (Gable, Sedera and Chan, 2008). Track efforts have yielded the bicameral IS-Impact measurement model; the "impact" half includes Organizational-Impact and Individual-Impact dimensions; the "quality" half includes System-Quality and Information-Quality dimensions. The IS-Impact model, by design, is intended to be robust, simple and generalisable, to yield results that are comparable across time, stakeholders, different systems and system contexts. The model and measurement approach employs perceptual measures and an instrument that is relevant to key stakeholder groups, thereby enabling the combination or comparison of stakeholder perspectives. Such a validated and widely accepted IS-Impact measurement model has both academic and practical value. It facilitates systematic operationalisation of a main dependent variable in research (IS-Impact), which can also serve as an important independent variable. For IS management practice it provides a means to benchmark and track the performance of information systems in use. From examination of the literature, the study proposes that IS-Impact is an Analytic Theory. Gregor (2006) defines Analytic Theory simply as theory that ‘says what is’, base theory that is foundational to all other types of theory. The overarching research question thus is "Does IS-Impact positively manifest the attributes of Analytic Theory?" In order to address this question, we must first answer the question "What are the attributes of Analytic Theory?"
The study identifies the main attributes of analytic theory as: (1) Completeness, (2) Mutual Exclusivity, (3) Parsimony, (4) Appropriate Hierarchy, (5) Utility, and (6) Intuitiveness. The value of empirical research in Information Systems is often assessed along two main dimensions - rigor and relevance. Those Analytic Theory attributes associated with the ‘rigor’ of the IS-Impact model, namely completeness, mutual exclusivity, parsimony and appropriate hierarchy, have been addressed in prior research (e.g. Gable et al., 2008). Though common tests of rigor are widely accepted and relatively uniformly applied (particularly in relation to positivist, quantitative research), relevance has seldom been given the same systematic attention. This study assumes a mainly practice perspective, and emphasises the methodical evaluation of the Analytic Theory ‘relevance’ attributes represented by the Utility and Intuitiveness of the IS-Impact model. Thus, related research questions are: "Is the IS-Impact model intuitive to practitioners?" and "Is the IS-Impact model useful to practitioners?" March and Smith (1995) identify four outputs of Design Science: constructs, models, methods and instantiations (Design Science research may involve one or more of these). IS-Impact can be viewed as a design science model, composed of Design Science constructs (the four IS-Impact dimensions and the two model halves), and instantiations in the form of management information (IS-Impact data organised and presented for management decision making). In addition to methodically evaluating the Utility and Intuitiveness of the IS-Impact model and its constituent constructs, the study aims to also evaluate the derived management information. Thus, further research questions are: "Is the IS-Impact derived management information intuitive to practitioners?" and "Is the IS-Impact derived management information useful to practitioners?"
The study employs a longitudinal design entailing three surveys over 4 years (the 1st involving secondary data) of the Oracle Financials application at QUT, interspersed with focus groups involving senior financial managers. The study also entails a survey of Financials at four other Australian universities. The three focus groups respectively emphasise: (1) the IS-Impact model, (2) the 2nd survey at QUT (descriptive), and (3) comparison across surveys within QUT, and between QUT and the group of universities. Aligned with the track goal of producing IS-Impact scores that are highly comparable, the study also addresses the more specific utility-related questions, "Is IS-Impact derived management information a useful comparator across time?" and "Is IS-Impact derived management information a useful comparator across universities?" The main contribution of the study is evidence of the utility and intuitiveness of IS-Impact to practice, thereby further substantiating the practical value of the IS-Impact approach, and also thereby motivating continuing and further research on the validity of IS-Impact and research employing the IS-Impact constructs in descriptive, predictive and explanatory studies. The study also has value methodologically as an example of relatively rigorous attention to relevance. A further key contribution is the clarification and instantiation of the full set of analytic theory attributes.
Abstract:
Stream ciphers are encryption algorithms used for ensuring the privacy of digital telecommunications. They have been widely used for encrypting military communications, satellite communications and pay TV, and for voice encryption on both fixed-line and wireless networks. The current multi-year European project eSTREAM, which aims to select stream ciphers suitable for widespread adoption, reflects the importance of this area of research. Stream ciphers consist of a keystream generator and an output function. Keystream generators produce a sequence that appears to be random, which is combined with the plaintext message using the output function. Most commonly, the output function is binary addition modulo two. Cryptanalysis of these ciphers focuses largely on analysis of the keystream generators and of relationships between the generator and the keystream it produces. Linear feedback shift registers (LFSRs) are widely used components in building keystream generators, as the sequences they produce are well understood. Many types of attack have been proposed for breaking various LFSR-based stream ciphers. A recent attack type is known as an algebraic attack. Algebraic attacks transform the problem of recovering the key into the problem of solving a multivariate system of equations, whose solution eventually recovers the internal state bits or the key bits. This type of attack has been shown to be effective on a number of regularly clocked LFSR-based stream ciphers. In this thesis, algebraic attacks are extended to a number of well known stream ciphers where at least one LFSR in the system is irregularly clocked. Applying algebraic attacks to these ciphers has only been discussed previously in the open literature for LILI-128. In this thesis, algebraic attacks are first applied to keystream generators using stop-and-go clocking.
Four ciphers belonging to this group are investigated: the Beth-Piper stop-and-go generator, the alternating step generator, the Gollmann cascade generator and the eSTREAM candidate, the Pomaranch cipher. It is shown that algebraic attacks are very effective on the first three of these ciphers. Although no effective algebraic attack was found for Pomaranch, the algebraic analysis led to some interesting findings, including weaknesses that may be exploited in future attacks. Algebraic attacks are then applied to keystream generators using (p, q) clocking. Two well known examples of such ciphers, the step1/step2 generator and the self-decimated generator, are investigated. Algebraic attacks are shown to be very powerful in recovering the internal state of these generators. A more complex clocking mechanism than either stop-and-go or (p, q) clocking is known as mutual clock control. In mutual clock control generators, the LFSRs control the clocking of each other. Four well known stream ciphers belonging to this group are investigated with respect to algebraic attacks: the bilateral stop-and-go generator, the A5/1 stream cipher, the Alpha 1 stream cipher, and the more recent eSTREAM proposal, the MICKEY stream ciphers. Some theoretical results with regard to the complexity of algebraic attacks on these ciphers are presented. The algebraic analysis of these ciphers showed that, generally, it is hard to generate the system of equations required for an algebraic attack on these ciphers. As the algebraic attack could not be applied directly to these ciphers, a different approach was used, namely guessing some bits of the internal state in order to reduce the degree of the equations. Finally, an algebraic attack on Alpha 1 that requires only 128 bits of keystream to recover the 128 internal state bits is presented. An essential process associated with stream cipher proposals is key initialization.
Many recently proposed stream ciphers use an algorithm to initialize the large internal state with a smaller key and possibly publicly known initialization vectors. The effect of key initialization on the performance of algebraic attacks is also investigated in this thesis. The relationships between the two have not been investigated before in the open literature. The investigation is conducted on Trivium and Grain-128, two eSTREAM ciphers. It is shown that the key initialization process has an effect on the success of algebraic attacks, unlike other conventional attacks. In particular, the key initialization process allows an attacker to first generate a small number of equations of low degree and then perform an algebraic attack using multiple keystreams. The effect of the number of iterations performed during key initialization is investigated. It is shown that both the number of iterations and the maximum number of initialization vectors to be used with one key should be carefully chosen. Some experimental results on Trivium and Grain-128 are then presented. Finally, the security with respect to algebraic attacks of the well known LILI family of stream ciphers, including the unbroken LILI-II, is investigated. These are irregularly clock-controlled nonlinear filtered generators. While the structure is defined for the LILI family, a particular parameter choice defines a specific instance. Two well known such instances are LILI-128 and LILI-II. The security of these and other instances is investigated to identify which instances are vulnerable to algebraic attacks. The feasibility of recovering the key bits using algebraic attacks is then investigated for both LILI-128 and LILI-II. Algebraic attacks which recover the internal state with less effort than exhaustive key search are possible for LILI-128 but not for LILI-II.
Given the internal state at some point in time, the feasibility of recovering the key bits is also investigated, showing that the parameters used in the key initialization process, if poorly chosen, can lead to a key recovery using algebraic attacks.
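Stop-and-go clocking, the first class of generators analysed above, can be sketched as a toy. This is a simplified, insecure illustration of the Beth-Piper idea (a control LFSR gating the clock of a second register, XORed with a regularly clocked third); the register sizes and tap positions are arbitrary choices for demonstration, not those of any cipher discussed:

```python
def lfsr(initial_state, taps):
    """Fibonacci LFSR over GF(2): yields one output bit per clock.
    `initial_state` is a list of bits; `taps` are 0-indexed feedback positions."""
    state = list(initial_state)
    while True:
        out = state[-1]
        feedback = 0
        for t in taps:
            feedback ^= state[t]
        state = [feedback] + state[:-1]  # shift right, insert feedback bit
        yield out

def stop_and_go_keystream(ctrl, data, mix, n):
    """Toy stop-and-go generator: the control LFSR decides whether the
    stop/go register `data` is clocked (its output is held otherwise);
    the keystream is the XOR with a regularly clocked `mix` register."""
    bits = []
    current = next(data)
    for _ in range(n):
        if next(ctrl) == 1:   # control bit 1: clock the stop/go register
            current = next(data)
        bits.append(current ^ next(mix))
    return bits

# Arbitrary small registers, for illustration only.
ks = stop_and_go_keystream(
    lfsr([1, 0, 0], [0, 2]),
    lfsr([1, 1, 0, 1], [0, 3]),
    lfsr([0, 1, 1, 1, 1], [0, 4]),
    16,
)
print(ks)
```

The irregular clocking of `data` is exactly what complicates writing the attacker's equations: which internal state bits contribute to a given keystream bit depends on the (unknown) control sequence.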
Abstract:
Expert knowledge is valuable in many modelling endeavours, particularly where data is not extensive or sufficiently robust. In Bayesian statistics, expert opinion may be formulated as informative priors, to provide an honest reflection of the current state of knowledge, before updating this with new information. Technology is increasingly being exploited to help support the process of eliciting such information. This paper reviews the benefits that have been gained from utilizing technology in this way. These benefits can be structured within a six-step elicitation design framework proposed recently (Low Choy et al., 2009). We assume that the purpose of elicitation is to formulate a Bayesian statistical prior, either to provide a standalone expert-defined model, or for updating with new data within a Bayesian analysis. We also assume that the model has been pre-specified before selecting the software. In this case, technology has the most to offer in targeting what experts know (E2), eliciting and encoding expert opinions (E4) whilst enhancing accuracy (E5), and providing an effective and efficient protocol (E6). Benefits include:
- providing an environment with familiar nuances (to make the expert comfortable) where experts can explore their knowledge from various perspectives (E2);
- automating tedious or repetitive tasks, thereby minimizing calculation errors, as well as encouraging interaction between elicitors and experts (E5);
- cognitive gains by educating users, enabling instant feedback (E2, E4-E5), and providing alternative methods of communicating assessments and feedback information, since experts think and learn differently; and
- ensuring a repeatable and transparent protocol is used (E6).
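One common way to encode an elicited opinion as an informative prior is the conjugate Beta-binomial model: the expert's best guess and a pseudo-sample-size weight fix a Beta prior, which new data then updates. The helper names and the numbers below are hypothetical illustrations, not taken from the reviewed software:

```python
def beta_prior_from_expert(best_guess, equivalent_n):
    """Encode an elicited probability as a Beta prior: `best_guess` is the
    expert's point estimate, `equivalent_n` the number of pseudo-observations
    their knowledge is judged to be worth."""
    return best_guess * equivalent_n, (1 - best_guess) * equivalent_n

def update_beta(alpha, beta, successes, failures):
    """Conjugate Bayesian update of a Beta prior with binomial data."""
    return alpha + successes, beta + failures

# Hypothetical elicitation: an expert puts site occupancy at about 30%, with
# confidence worth roughly 20 site visits, giving a Beta(6, 14) prior.
a, b = beta_prior_from_expert(0.3, 20)
a, b = update_beta(a, b, successes=9, failures=11)  # survey: 9 of 20 sites occupied
print(round(a / (a + b), 3))  # 0.375, the posterior mean
```

Elicitation software typically automates exactly this kind of encoding and feeds the implied prior back to the expert for checking, which is the E4/E5 feedback loop described above.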
Abstract:
Assessment for Learning (AfL) practices observed in case studies in a North Queensland school were analysed from a sociocultural theoretical perspective. The AfL practices of feedback, dialogue and peer assessment were viewed as an opportunity for students to learn the social expectations about being an autonomous learner, or central participant, within the classroom community of practice. This process of becoming more expert and belonging within the community of practice involved students negotiating identities of participation that included knowing both academic skills and social expectations within the classroom. This paper argues that when AfL practices are viewed as ways of enhancing participation, there is potential for learners to negotiate identities as autonomous learners. AfL practices within the daily classroom interactions and pedagogy that enabled students to develop a shared repertoire, joint enterprise and mutual engagement in the classroom communities of practice are described. The challenges for teachers in shifting their gaze to patterns of participation are also briefly discussed.
Abstract:
Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result, Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n²) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVMs) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task.
Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans, it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
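Purity is one simple extrinsic measure of the kind described: each cluster is credited with its majority ground-truth label, and purity is the fraction of documents so credited. A small illustrative sketch; the documents and topic labels are invented:

```python
from collections import Counter

def purity(cluster_ids, true_labels):
    """Extrinsic clustering quality: sum, over clusters, the count of each
    cluster's majority ground-truth label, divided by the number of documents.
    1.0 means the clusters reproduce the ground truth exactly."""
    credited = 0
    for c in set(cluster_ids):
        members = [lbl for cid, lbl in zip(cluster_ids, true_labels) if cid == c]
        credited += Counter(members).most_common(1)[0][1]
    return credited / len(true_labels)

# Hypothetical: six documents, two clusters, ground-truth topics A and B.
clusters = [0, 0, 0, 1, 1, 1]
topics = ["A", "A", "B", "B", "B", "A"]
print(round(purity(clusters, topics), 3))  # 0.667: each cluster has one stray document
```

Because purity compares against human-assigned topics, it inherits exactly the subjectivity caveat raised above: two annotators may disagree on which topic a document belongs to.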