405 results for revisions
Abstract:
I used to think of the connection between a particular and a universal that it instantiates as a contingent one. Now I think that this is not quite right. This revision, as I now see it, is not a very large one. I still think that the states of affairs (Russell’s facts in his great Lectures on Logical Atomism) that unite particulars and universals are contingent beings. But the connection within states of affairs is, in a certain way, necessary.
Abstract:
This article reviews an exhibition of artworks at the Institute for Modern Art that dealt with the idea of history. The review suggests that history is an unstable product of our collective imaginations and is constantly open to revisions and individual perspectives. Each of the artists deals with these issues by reinterpreting past events. The artists were Pierre Huyghe, Thomas Demand, Mike Kelley, Paul McCarthy, Jeremy Deller, Danius Kesminas, Gerard Byrne, Emma Kay, and Omer Fast.
Abstract:
Clearly the world is a different place from what it was 40 years ago, and much of that difference can be characterised as disturbances to the local driven by globalism, particularly through changes in communication and information technology. As it did with modernism before it, this societal change calls for, or more aptly calls to, designers to reformulate their practices to reflect this significant new paradigm. This is a rationale that has driven much avant-garde activity in the 20th century and, in this case, drives 'landscape urbanism' in the 21st. In this discussion, it is important to recognise the avant-garde cycle at work in the development of the discipline, not only to contextualise its production but so that its greatest values can be welcomed: despite the propaganda and arrogance, important revisions to the canons occurred after all the -isms. That said, I do find myself asking: do we need another -ism?
Abstract:
Background Patella resurfacing in total knee arthroplasty is a contentious issue. The literature suggests that resurfacing of the patella is based on surgeon preference, and little is known about the role and timing of resurfacing and how this affects outcomes. Methods We analyzed 134,799 total knee arthroplasties using data from the Australian Orthopaedic Association National Joint Replacement Registry. Hazard ratios (HRs) were used to compare rates of early revision between patella resurfacing at the primary procedure (the resurfacing group, R) and primary arthroplasty without resurfacing (the no-resurfacing group, NR). We also analyzed the outcomes of NR knees that were revised for isolated patella addition. Results At 5 years, the R group showed a lower revision rate than the NR group: cumulative per cent revision (CPR) 3.1% and 4.0%, respectively (HR = 0.75, p < 0.001). Revisions for patellofemoral pain were more common in the NR group (17%) than in the R group (1%), and “patella only” revisions were more common in the NR group (29%) than in the R group (6%). Non-resurfaced knees revised for isolated patella addition had a higher revision rate than knees with patella resurfacing at the primary procedure, with 4-year CPRs of 15% and 2.8%, respectively (HR = 4.1, p < 0.001). Interpretation Rates of early revision of primary total knees were higher when the patella was not resurfaced, suggesting that surgeons may be inclined to resurface later if there is patellofemoral pain. However, 15% of non-resurfaced knees revised for patella addition are re-revised by 4 years. Our results suggest an early beneficial outcome for patella resurfacing at primary arthroplasty based on revision rates up to 5 years.
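The hazard ratios above come from standard time-to-event analysis. A minimal sketch of that kind of calculation, using the Python lifelines library on synthetic data (the column names, sample size and simulated effect are invented for illustration; only the target HR of roughly 0.75 echoes the abstract):

```python
# Illustrative Cox proportional-hazards comparison of revision rates,
# in the spirit of the registry analysis above (synthetic data only).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
# 1 = patella resurfaced at the primary procedure, 0 = not resurfaced
resurfaced = rng.integers(0, 2, n)
# Assume a lower revision hazard when resurfaced (HR ~ 0.75, as reported).
baseline_hazard = 0.01  # revisions per knee-year (invented)
time_to_revision = rng.exponential(
    1.0 / (baseline_hazard * np.where(resurfaced, 0.75, 1.0)))
follow_up = rng.uniform(0.0, 5.0, n)          # years of observation
observed = time_to_revision <= follow_up      # revision seen before censoring
duration = np.minimum(time_to_revision, follow_up)

df = pd.DataFrame({"duration": duration,
                   "revised": observed.astype(int),
                   "resurfaced": resurfaced})
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="revised")
print(cph.summary[["coef", "exp(coef)", "p"]])  # exp(coef) estimates the HR
```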
Abstract:
Frontline employee behaviours are recognised as vital to achieving a competitive advantage for service organisations. The services marketing literature has comprehensively examined ways to improve frontline employee behaviours in service delivery and recovery. However, limited attention has been paid to frontline employee behaviours that favour customers in ways that go against organisational norms or rules. This study examines these behaviours by introducing the behavioural concept of Customer-Oriented Deviance (COD). COD is defined as “frontline employees exhibiting extra-role behaviours that they perceive to defy existing expectations or prescribed rules of higher authority through service adaptation, communication and use of resources to benefit customers during interpersonal service encounters.” This thesis develops a COD measure and examines the key determinants of these behaviours from a frontline employee perspective. Existing research on similar behaviours, which originated in the positive deviance and pro-social behaviour domains, has limitations and is inadequate for examining COD in the services context. The absence of a well-developed body of knowledge on non-conforming service behaviours has implications for both theory and practice. The provision of ‘special favours’ increases customer satisfaction, but over-servicing customers is also counterproductive for service delivery and costly for the organisation. Despite these implications, there is little understanding of the nature of these behaviours and their key drivers. This research builds on the inadequacies of prior positive deviance, pro-social and pro-customer research to develop the theoretical foundation of COD. The concept of positive deviance, which has predominantly been used to study organisational behaviours, is applied within a services marketing setting. Further, the research addresses previous limitations of the pro-social and pro-customer behavioural literature, which has examined only limited forms of behaviour without a clear understanding of their nature. Building upon these literature streams, this research adopts a holistic approach to the conceptualisation of COD. It addresses previous shortcomings in the literature by providing a well-bounded definition, a psychometrically sound measure and a conceptually well-founded model of COD. COD was examined across three separate studies, grounded in role theory and social identity theory. Study 1 was exploratory and based on in-depth interviews using the Critical Incident Technique (CIT). Its aim was to understand the nature of COD and qualitatively identify its key drivers. Thematic analysis revealed two potential dimensions of COD behaviours, Deviant Service Adaptation (DSA) and Deviant Service Communication (DSC). In addition, themes representing the potential influences on COD were broadly classified as individual, situational and organisational factors. Study 2 was a scale development procedure that involved the generation and purification of items for the measure based on two student samples working in customer service roles (pilot sample, N=278; initial validation sample, N=231). The reliability and Exploratory Factor Analysis (EFA) results for the pilot sample suggested the scale had poor psychometric properties.
As a result, major revisions were made to item wordings and new items were developed from the literature to reflect a new dimension, Deviant Use of Resources (DUR). The revised items were tested on the initial validation sample, with the EFA suggesting a four-factor structure of COD. The aim of Study 3 was to further purify the COD measure and test for nomological validity based on its theoretical relationships with key antecedents and similar constructs (key correlates). The theoretical model of COD, consisting of nine hypotheses, was tested on retail and hospitality samples of frontline employees (retail N=311; hospitality N=305) drawn from a market research panel using an online survey. The data were analysed using Structural Equation Modelling (SEM). The results supported a re-specified second-order three-factor model of COD consisting of 11 items. Overall, the COD measure was found to be reliable and valid, demonstrating convergent validity, discriminant validity and marginal partial invariance for the factor loadings. The results supported nomological validity, although the antecedents had differing impacts on COD across samples. Specifically, empathy and perspective-taking, role conflict, and job autonomy significantly influenced COD in the retail sample, whereas empathy and perspective-taking, risk-taking propensity and role conflict were significant predictors in the hospitality sample. In addition, customer orientation-selling orientation, the altruistic dimension of organisational citizenship behaviours, workplace deviance, and social desirability responding were found to correlate with COD. This research makes several contributions to theory. First, the findings extend the literature on positive deviance, pro-social and pro-customer behaviours. Second, the research provides an empirically tested model of the antecedents of COD. Third, it contributes a reliable and valid measure of COD. Finally, it investigates the differential effects of the key antecedents on COD across service sectors. The findings also contribute to services marketing practice. Service practitioners can better understand the phenomenon of COD and utilise the measurement tool to calibrate COD levels within their organisations. Knowledge of the key determinants of COD will help improve recruitment and training programs and drive internal initiatives within the firm.
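A rough sketch of the scale-purification step described above: an exploratory factor analysis on synthetic Likert-style responses using the Python factor_analyzer package. The item count, the four-factor choice and the data are illustrative stand-ins, not the thesis's instrument:

```python
# Sketch of the EFA step in scale development: simulate responses with a
# known 4-factor structure (4 items per factor), then recover it.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(1)
n_respondents, n_items = 231, 16          # mirrors the validation sample size
latent = rng.normal(size=(n_respondents, 4))
loadings = np.kron(np.eye(4), np.ones((1, 4)))   # each factor drives 4 items
responses = latent @ loadings + rng.normal(scale=0.5,
                                           size=(n_respondents, n_items))
items = pd.DataFrame(responses,
                     columns=[f"cod_{i + 1}" for i in range(n_items)])

# Oblique rotation, since COD dimensions would plausibly correlate.
efa = FactorAnalyzer(n_factors=4, rotation="oblimin")
efa.fit(items)
print(np.round(efa.loadings_, 2))  # pattern matrix: items grouped by factor
```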
Abstract:
Background: The objective of routine outpatient assessment of well-functioning patients after primary total hip arthroplasty (THA) is to detect asymptomatic failure of prostheses to guide recommendations for early intervention. We have observed that revision of THAs in asymptomatic patients is highly uncommon. We therefore question the need for routine follow-up of patients after THA. Methods: A prospective analysis of an orthopaedic database identified 158 patients who received 177 revision THAs over a 4-year period. A retrospective chart review was conducted. Patient demographics, primary and revision surgery parameters and follow-up information were recorded and cross-referenced with AOA NJRR data. Results: 110 THAs in 104 patients (average age 70.4 (SD 9.8) years) were included in the analysis. There were 70 (63.6%) total, 13 (11.8%) femoral and 27 (24.5%) acetabular revisions. The indications for revision were aseptic loosening (70%), dislocation (8.2%), peri-prosthetic fracture (7.3%), osteolysis (6.4%) and infection (4.5%). Only 4 (3.6%) were asymptomatic revisions. A mean of 5.3 (SD 5.2) and 1.9 (SD 5.3) follow-up appointments were required before revision in patients with and without symptoms, respectively. The average time from primary to revision surgery was 11.8 (SD 7.23) years. Conclusions: We conclude that, for patients with prostheses with excellent long-term clinical results as validated by joint registries, routine follow-up of asymptomatic THA should be questioned and requires further investigation. Based on this study, the current practice of routine follow-up of asymptomatic THA may be excessively costly and unnecessary, and a less resource-intensive review method may be more appropriate.
Abstract:
A group key exchange (GKE) protocol allows a set of parties to agree upon a common secret session key over a public network. In this thesis, we focus on designing efficient GKE protocols using public key techniques and appropriately revising security models for GKE protocols. For the purpose of modelling and analysing the security of GKE protocols we apply the widely accepted computational complexity approach. The contributions of the thesis to the area of GKE protocols are manifold. We propose the first GKE protocol that requires only one round of communication and is proven secure in the standard model. Our protocol is generically constructed from a key encapsulation mechanism (KEM). We also suggest an efficient KEM from the literature, which satisfies the underlying security notion, to instantiate the generic protocol. We then concentrate on enhancing the security of one-round GKE protocols. A new model of security for forward secure GKE protocols is introduced and a generic one-round GKE protocol with forward security is then presented. The security of this protocol is also proven in the standard model. We also propose an efficient forward secure encryption scheme that can be used to instantiate the generic GKE protocol. Our next contributions are to the security models of GKE protocols. We observe that the analysis of GKE protocols has not been as extensive as that of two-party key exchange protocols. Particularly, the security attribute of key compromise impersonation (KCI) resilience has so far been ignored for GKE protocols. We model the security of GKE protocols addressing KCI attacks by both outsider and insider adversaries. We then show that a few existing protocols are not secure against KCI attacks. A new proof of security for an existing GKE protocol is given under the revised model assuming random oracles. Subsequently, we treat the security of GKE protocols in the universal composability (UC) framework. We present a new UC ideal functionality for GKE protocols capturing the security attribute of contributiveness. An existing protocol with minor revisions is then shown to realize our functionality in the random oracle model. Finally, we explore the possibility of constructing GKE protocols in the attribute-based setting. We introduce the concept of attribute-based group key exchange (AB-GKE). A security model for AB-GKE and a one-round AB-GKE protocol satisfying our security notion are presented. The protocol is generically constructed from a new cryptographic primitive called encapsulation policy attribute-based KEM (EP-AB-KEM), which we introduce in this thesis. We also present a new EP-AB-KEM with a proof of security assuming generic groups and random oracles. The EP-AB-KEM can be used to instantiate our generic AB-GKE protocol.
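For readers unfamiliar with the building block, the KEM abstraction that the generic construction relies on is a triple (KeyGen, Encap, Decap). Below is a minimal sketch instantiated with a Diffie-Hellman-style KEM over X25519 via the Python cryptography package; it illustrates the primitive only and is not the thesis's GKE protocol or its proven-secure KEM:

```python
# KEM sketch: (keygen, encap, decap). Encapsulation sends an ephemeral
# public key; both sides derive the same session key from the DH value.
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey)
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def keygen():
    sk = X25519PrivateKey.generate()
    return sk, sk.public_key()

def encap(pk_receiver: X25519PublicKey):
    """Return (ciphertext, shared_key); the 'ciphertext' here is just an
    ephemeral public key, as in a DH-based KEM."""
    eph = X25519PrivateKey.generate()
    shared = eph.exchange(pk_receiver)
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"kem-demo").derive(shared)
    return eph.public_key(), key

def decap(sk_receiver: X25519PrivateKey, ct: X25519PublicKey) -> bytes:
    shared = sk_receiver.exchange(ct)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"kem-demo").derive(shared)

sk, pk = keygen()
ct, k_sender = encap(pk)
assert decap(sk, ct) == k_sender  # both sides hold the same session key
```

In a one-round group protocol built generically from such a primitive, each party can broadcast a single message containing encapsulations, which is what makes the single round of communication possible.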
Abstract:
In the summer of 2008, the Jönköping International Business School invited a selection of prominent (and not yet past-zenith) scholars of our field to a workshop at which they were asked to present their visions of where the future of entrepreneurship research is headed. An important inspiration for this initiative was a similar gathering in Jönköping 10 years earlier, which led to a special issue of Entrepreneurship Theory and Practice that has become one of the most cited in the history of the journal (see Davidsson, Low, & Wright, 2001). Similarly, the current special issue is based on ideas first presented at the 2008 workshop, although they have since been thoroughly discussed and developed through extensive commentary and revisions.
Abstract:
As in many Western countries, the political context of Japan has been transformed since the 1975 UN World Conference on Women, which eventually led to the establishment of the Basic Law for a Gender-equal Society in Japan in 1999. The Basic Law sets out a series of general guidelines across every field of society, including education. This policy trajectory research study targets gender issues in Japanese higher education and follows the development of the Basic Law and, in particular, how it has been interpreted by bureaucrats and implemented within the field of higher education. This feminist policy research study examines Japanese power relationships within the field of gender and identifies gender discourses embedded within Japanese gender equity policy documents. The study documents the experiences of, and strategies used by, Japanese feminists in relation to gender equity policies in education. Drawing on critical feminist theory and feminist critical discourse theory, the study explores the relationship between gender discourses and social practices and analyses how unequal gender relations have been sustained through the implementation of Japanese gender equity policy. Feminist critical policy analysis and feminist critical discourse analysis have been used to examine data collected through interviews with key players, including policy makers and policy administrators from the national government and from higher education institutions offering teacher education courses. The study also scrutinises the minutes of government meetings and other relevant policy documents. The study highlights the struggles between policy makers in the government and bureaucracy, and feminist educators working for change. Following an anti-feminist backlash, feminist discourses in the original policy documents were weakened or marginalised in revisions, ultimately diminishing the impact of the Basic Law in higher education institutions. Four key findings are presented: 1) the tracking of the original feminist teachers’ movement that existed just prior to the development of the Basic Law in 1999; 2) the formation of the Basic Law, and how the policy resulted in a weakening of the main tenets of women’s policy from a feminist perspective; 3) the problematic manner in which the Basic Law was interpreted at the bureaucratic level; and 4) the limited impact of the Basic Law on higher education and the strategies and struggles of feminist scholars in reaction to this law.
Abstract:
Metallic materials exposed to oxygen-enriched atmospheres – as commonly used in the medical, aerospace, aviation and numerous chemical processing industries – represent a significant fire hazard which must be addressed during design, maintenance and operation. Hence, accurate knowledge of metallic material flammability is required. Reduced gravity (i.e. space-based) operations present additional unique concerns, where the absence of gravity must also be taken into account. The flammability of metallic materials has historically been quantified using three standardised test methods developed by NASA, ASTM and ISO. These tests typically involve the forceful (promoted) ignition of a test sample (typically a 3.2 mm diameter cylindrical rod) in pressurised oxygen. A test sample is defined as flammable when it undergoes burning that is independent of the ignition process utilised. In the standardised tests, this is indicated by the propagation of burning further than a defined amount, or “burn criterion”. The burn criterion in use at the onset of this project was arbitrarily selected: it did not accurately reflect the length a sample must burn in order to be burning independently of the ignition event and, in some cases, required complete consumption of the test sample for a metallic material to be considered flammable. It has been demonstrated that a) a metallic material’s propensity to support burning is altered by any increase in test sample temperature greater than ~250–300 °C and b) promoted ignition causes an increase in temperature of the test sample in the region closest to the igniter, a region referred to as the Heat Affected Zone (HAZ). If a test sample continues to burn past the HAZ (where the HAZ is defined as the region of the test sample above the igniter that undergoes an increase in temperature of greater than or equal to 250 °C by the end of the ignition event), it is burning independently of the igniter and should be considered flammable. The extent of the HAZ, therefore, can be used to justify the selection of the burn criterion. A two-dimensional mathematical model was developed to predict the extent of the HAZ created in a standard test sample by a typical igniter. The model was validated against previous theoretical and experimental work performed in collaboration with NASA, and then used to predict the extent of the HAZ for different metallic materials in several configurations. The predicted extent of the HAZ varied significantly, ranging from ~2–27 mm depending on the test sample thermal properties and test conditions (i.e. pressure). The magnitude of the HAZ was found to increase with increasing thermal diffusivity and decreasing pressure (due to slower ignition times). Based upon the findings of this work, a new burn criterion requiring 30 mm of the test sample to be consumed (from the top of the ignition promoter) was recommended and validated. This new burn criterion was subsequently included in the latest revisions of the ASTM G124 and NASA 6001B international test standards that are used to evaluate metallic material flammability in oxygen. These revisions also have the added benefit of enabling reduced gravity metallic material flammability testing in strict accordance with the ASTM G124 standard, allowing measurement and comparison of the relative flammability (i.e. Lowest Burn Pressure (LBP), Highest No-Burn Pressure (HNBP) and average Regression Rate of the Melting Interface (RRMI)) of metallic materials in normal and reduced gravity, as well as determination of the applicability of normal gravity test results to reduced gravity use environments. This is important, as currently most space-based applications typically use normal gravity information to qualify systems and/or components for reduced gravity use. This is shown here to be non-conservative for metallic materials, which are more flammable in reduced gravity. The flammability of two metallic materials, Inconel® 718 and 316 stainless steel (both commonly used to manufacture components for oxygen service in both terrestrial and space-based systems), was evaluated in normal and reduced gravity using the new ASTM G124-10 test standard. This allowed direct comparison of the flammability of the two metallic materials in normal and reduced gravity respectively. The results of this work clearly show, for the first time, that metallic materials are more flammable in reduced gravity than in normal gravity when testing is conducted as described in the ASTM G124-10 test standard. This was shown to be the case in terms of both higher regression rates (i.e. faster consumption of the test sample – fuel) and burning at lower pressures in reduced gravity. Specifically, it was found that the LBP for 3.2 mm diameter Inconel® 718 and 316 stainless steel test samples decreased by 50%, from 3.45 MPa (500 psia) in normal gravity to 1.72 MPa (250 psia) in reduced gravity, for the Inconel® 718, and by 25%, from 3.45 MPa (500 psia) in normal gravity to 2.76 MPa (400 psia) in reduced gravity, for the 316 stainless steel. The average RRMI increased by factors of 2.2 (27.2 mm/s in 2.24 MPa (325 psia) oxygen in reduced gravity compared to 12.8 mm/s in 4.48 MPa (650 psia) oxygen in normal gravity) for the Inconel® 718 and 1.6 (15.0 mm/s in 2.76 MPa (400 psia) oxygen in reduced gravity compared to 9.5 mm/s in 5.17 MPa (750 psia) oxygen in normal gravity) for the 316 stainless steel. Reasons for the increased flammability of metallic materials in reduced gravity compared to normal gravity are discussed, based upon observations made during reduced gravity testing and previous work. Finally, the implications (for fire safety and engineering applications) of these results are presented and discussed, in particular examining methods for mitigating the risk of a fire in reduced gravity.
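The HAZ argument is essentially a transient heat-conduction estimate: how far along the rod does the igniter raise the temperature by at least 250 °C? A deliberately simplified 1D finite-difference sketch of that idea follows (the thesis uses a 2D model; the diffusivity, igniter temperature and burn time below are assumed values for illustration only):

```python
# Rough 1D transient-conduction estimate of the Heat Affected Zone (HAZ):
# hold one end of a rod hot for the ignition duration, then find the
# farthest point whose temperature rise is >= 250 C.
import numpy as np

alpha = 4.4e-6        # thermal diffusivity, m^2/s (approx. 316 stainless steel)
L, nx = 0.10, 200     # 100 mm rod discretised into 200 nodes
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha          # explicit scheme, stable since r = 0.4 < 0.5
t_ignition = 5.0                  # assumed igniter burn time, s

T = np.zeros(nx)                  # temperature rise above ambient, C
T[0] = 1500.0                     # assumed hot-end temperature while burning
for _ in range(int(t_ignition / dt)):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0] = 1500.0                 # igniter keeps the end hot
    T[-1] = 0.0                   # far end stays at ambient

haz_mm = np.max(np.where(T >= 250.0)[0]) * dx * 1000
print(f"Estimated HAZ extent: {haz_mm:.1f} mm")  # falls in the ~2-27 mm range
```

With these assumed inputs the estimate lands around 9 mm, comfortably inside the ~2–27 mm range the 2D model predicted, which is the kind of result that motivates a 30 mm burn criterion.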
Abstract:
Purpose – The purpose of this paper is to jointly assess the impact of regulatory reform for corporate fundraising in Australia (CLERP Act 1999) and the relaxation of ASX admission rules in 1999 on the accuracy of management earnings forecasts in initial public offering (IPO) prospectuses. The relaxation of ASX listing rules permitted a new category of new economy firms (commitments test entities (CTEs)) to list without a prior history of profitability, while the CLERP Act (introduced in 2000) was accompanied by tighter disclosure obligations and stronger enforcement action by the corporate regulator (ASIC). Design/methodology/approach – All IPO earnings forecasts in prospectuses lodged between 1998 and 2003 are examined to assess the pre- and post-CLERP Act impact. Given active ASIC enforcement action in the post-reform period, IPO firms are hypothesised to provide more accurate forecasts, particularly CTE firms, which are less likely to have a reasonable basis for forecasting. Research models are developed to empirically test the impact of the reforms on CTE and non-CTE IPO firms. Findings – The new regulatory environment has had a positive impact on management forecasting behaviour. In the post-CLERP Act period, the accuracy of prospectus forecasts and their revisions significantly improved and, as expected, the results are primarily driven by CTE firms. However, the majority of prospectus forecasts continue to be materially inaccurate. Originality/value – The results highlight the need to control for both the changing nature of listed firms and the level of enforcement action when examining responses to regulatory changes to corporate fundraising activities.
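Forecast accuracy in this literature is commonly operationalised as an absolute forecast error scaled by the forecast. A toy illustration of that generic definition follows (invented figures; not necessarily the paper's exact research model):

```python
# Absolute forecast error (AFE): |actual - forecast| / |forecast|.
# Large values flag materially inaccurate prospectus forecasts.
import pandas as pd

ipos = pd.DataFrame({
    "firm":     ["A", "B", "C"],
    "forecast": [10.0, 5.0, 8.0],   # prospectus earnings forecast ($m)
    "actual":   [9.0, 6.5, 2.0],    # realised earnings ($m)
})
ipos["afe"] = (ipos["actual"] - ipos["forecast"]).abs() / ipos["forecast"].abs()
print(ipos)  # firm C's 75% error would count as materially inaccurate
```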
Abstract:
Over the last decade there has been an expansion in the number of Juris Doctor (JD) courses in the Australian legal education marketplace. Across the board it is graduate-entry, but it is currently offered in undergraduate, postgraduate and ‘hybrid’ forms. In this article we will discuss recent research conducted as part of an Australian Learning and Teaching Council grant. This project included an exploration of whether JD courses in Australia were applying different and higher level academic standards to those operating in Bachelor of Laws degrees. Our research findings reveal justification for concerns about the academic standards of some JD courses, particularly where masters level students were being taught alongside their undergraduate counterparts. They also provide some insights into perceptions in the marketplace of JD graduates. Finally, we will discuss the future viability of such courses in light of recent revisions to the Australian Qualifications Framework.
Abstract:
Young adult literature is a socialising genre that encourages young readers to take up very particular ways of relating to historical or cultural materials. Recent years have seen a boom in Sherlockian YA fiction inviting reader identification either with the Baker Street Irregulars or an adolescent Holmes. In works by Anthony Read, Andrew Lane, Tracy Mack & Michael Citrin, and Tony Lee, the Sherlock canon provides a vocabulary for neo-Victorian young adult fiction to simultaneously invoke and defer a range of competing visions of working childhood as both at-risk and autonomous; of education as both oppression and emancipation; and of literary-cultural history as both populist and elitist. Such tensions can be traced in Conan Doyle’s own constructions of working children, and in the circulation of the Sherlock stories as popular or literary fictions. Drawing both on the Sherlock canon and its revisions, this paper reads current YA fiction’s deployment of Conan Doyle’s fictional universe as a tool for negotiating contemporary anxieties of adolescence.
Abstract:
Background: Previous research identified that primary brain tumour patients have significant psychological morbidity and unmet needs, particularly the need for more information and support. However, the utility of strategies to improve information provision in this setting is unknown. This study involved the development and piloting of a brain tumour specific question prompt list (QPL). A QPL is a list of questions patients may find useful to ask their health professionals, and is designed to facilitate communication and information exchange. Methods: Thematic analysis of QPLs developed for other chronic diseases and of brain tumour specific patient resources informed a draft QPL. Subsequent refinement of the QPL involved an iterative process of interviews and review with 12 recently diagnosed patients and six caregivers. Final revisions were made following readability analyses and review by health professionals. Piloting of the QPL is underway using a non-randomised control group trial with patients undergoing treatment for a primary brain tumour in Brisbane, Queensland. Following baseline interviews, consenting participants are provided with the QPL or standard information materials. Follow-up interviews four to six weeks later allow assessment of the acceptability of the QPL, how it is used by patients, its impact on information needs, and the feasibility of recruitment, implementation and outcome assessment. Results: The final QPL was determined to be readable at the sixth grade level. It contains seven sections: diagnosis; prognosis; symptoms and changes; the health professional team; support; treatment and management; and post-treatment concerns. At this time, fourteen participants have been recruited for the pilot, and data collection has been completed for eleven. Data collection and preliminary analysis are expected to be completed by the time of the conference and will be presented there. Conclusions: If acceptable to participants, the QPL may encourage patients, doctors and nurses to communicate more effectively, reducing unmet information needs and ultimately improving psychological wellbeing.
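The "sixth grade level" figure refers to a readability index; the Flesch-Kincaid grade level is one standard choice for such analyses. A sketch with a deliberately naive syllable counter follows (published readability analyses use more careful counters; the sample questions are invented, not drawn from the QPL):

```python
# Flesch-Kincaid grade level:
#   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
import re

def syllables(word: str) -> int:
    # Naive heuristic: count runs of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

sample = ("What is my diagnosis? What will happen to me? "
          "Who can I talk to if I feel worried?")
print(round(fk_grade(sample), 1))  # lower grades indicate easier reading
```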
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
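As a flavour of what a design-level security metric can look like, here is a toy attribute-exposure ratio computed over a simple class model. It is an illustrative stand-in in the spirit of the thesis's design-stage metrics, not its actual metric definitions:

```python
# Toy design-level metric: the fraction of security-critical
# ("classified") attributes a class exposes through public accessors.
# Lower is better; comparing this ratio across two revisions of the
# same design gives a relative security signal.
from dataclasses import dataclass, field

@dataclass
class ClassDesign:
    name: str
    classified_attrs: set[str]                       # high-security members
    public_accessors: set[str] = field(default_factory=set)

def classified_accessor_ratio(cls: ClassDesign) -> float:
    """0.0 = no classified attribute publicly readable; 1.0 = all are."""
    if not cls.classified_attrs:
        return 0.0
    exposed = cls.classified_attrs & cls.public_accessors
    return len(exposed) / len(cls.classified_attrs)

wallet = ClassDesign("Wallet",
                     classified_attrs={"key", "pin", "balance"},
                     public_accessors={"balance"})
print(classified_accessor_ratio(wallet))  # ~0.33: one of three exposed
```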