Abstract:
Background: The objective of routine outpatient assessment of well-functioning patients after primary total hip arthroplasty (THA) is to detect asymptomatic failure of prostheses to guide recommendations for early intervention. We have observed that the revision of THAs in asymptomatic patients is highly uncommon. We therefore question the need for routine follow-up of patients after THA. Methods: A prospective analysis of an orthopaedic database identified 158 patients who received 177 revision THAs over a 4-year period. A retrospective chart review was conducted. Patient demographics, primary and revision surgery parameters and follow-up information were recorded and cross-referenced with AOA NJRR data. Results: 110 THAs in 104 patients were analysed (average age 70.4 years, SD 9.8). There were 70 (63.6%) total, 13 (11.8%) femoral and 27 (24.5%) acetabular revisions. The indications for revision were aseptic loosening (70%), dislocation (8.2%), peri-prosthetic fracture (7.3%), osteolysis (6.4%) and infection (4.5%). Only 4 (3.6%) were asymptomatic revisions. A mean of 5.3 (SD 5.2) and 1.9 (SD 5.3) follow-up appointments were required before revision in patients with and without symptoms, respectively. The average time from the primary to revision surgery was 11.8 (SD 7.23) years. Conclusions: We conclude that for patients with prostheses that have excellent long-term clinical results, as validated by joint registries, routine follow-up of asymptomatic THA should be questioned and requires further investigation. Based on the work of this study, the current practice of routine follow-up of asymptomatic THA may be excessively costly and unnecessary, and a less resource-intensive review method may be more appropriate.
Abstract:
This report explains the objectives, datasets and evaluation criteria of both the clustering and classification tasks set in the INEX 2009 XML Mining track. The report also describes the approaches and results obtained by the different participants.
Abstract:
When the colonisers first came to Australia there was an urgent desire to map, name and settle. This desire, in part, stemmed from a fear of the unknown. Once these tasks were completed it was thought that a sense of identity and belonging would automatically come. In Anglo-Australian geography the map of Australia was always perceived in relationship to the larger map of Europe and Britain. The quicker Australia could be mapped, the quicker its connection with the ‘civilised’ world could be established. Official maps could be taken up in official history books and a detailed monumental history could begin. Australians would feel secure in where they were placed in the world. However, this was not the case and anxieties about identity and belonging remained. One of the biggest hurdles was the fear of the open spaces and not knowing how to move across the land. Attempts to transpose colonisers’ use of space onto the Australian landscape did not work and led to confusion. Using authors who are often perceived as writers of national fictions (Henry Lawson, Barbara Baynton, Patrick White, David Malouf and Peter Carey), I will reveal how writing about space becomes a way to create a sense of belonging. It is through spatial knowledge and its application that we begin to gain a sense of closeness and identity. I will also look at how one of the greatest fears for the colonisers was the Aboriginal spatial command of the country. Aborigines already had a strongly developed awareness of spatial belonging and their stories reveal this authority (seen in the work of Lorna Little and Mick McLean). Colonisers attempted to discredit this knowledge but the stories and the land continue to recognise its legitimacy. From the beginning, Australian spaces have been spaces of hybridity, and the more the colonisers attempted to force predetermined structures onto these spaces the more hybrid they became.
Abstract:
A series of short performance experiments demonstrating the creative potential of motion capture technology as a tool within performance for exploring audience behaviour and interaction. Examples highlight the possibilities for future work and give basic demonstrations of what is technically possible for the stage. Please note that people attending this performance may be videoed and motion captured for research and/or publication purposes. This work has been funded by the EPSRC c4dm platform grant.
Abstract:
The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal, caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems it became apparent that additional information was contained in the backscattered signal, and that this information could in fact be used to describe the shape of the target itself. This is because a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers thus corresponds to a mapping of the target's range and cross-range, producing an image of the target. Much research has been done in this area since the early radar imaging work of the 1960s. At present there are two main categories into which radar imaging falls. The first of these is related to the case where the backscattered signal is considered to be deterministic. The second is related to the case where the backscattered signal is of a stochastic nature. In both cases the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal. In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function.
This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions. The principal aim of this thesis was the development and discussion of the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer simulated using an optical compiler.
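The ambiguity function mentioned in this abstract correlates a signal with delayed and Doppler-shifted copies of itself. The sketch below is not from the thesis; it is a minimal illustration of the standard narrowband ambiguity function for a sampled complex baseband signal, using a circular shift as a simplified delay model. By the Cauchy-Schwarz inequality, the surface of a signal's self-ambiguity peaks at zero delay and zero Doppler, which the example checks with a linear chirp.

```python
import numpy as np

def ambiguity(signal, delays, dopplers, fs):
    """Magnitude of the narrowband ambiguity function:
    chi(tau, f) = sum_t s(t) * conj(s(t - tau)) * exp(j*2*pi*f*t).
    delays are in samples, dopplers in Hz, fs is the sample rate."""
    n = len(signal)
    t = np.arange(n) / fs
    amb = np.zeros((len(delays), len(dopplers)))
    for i, d in enumerate(delays):
        shifted = np.roll(signal, d)         # circular shift as a simple delay model
        prod = signal * np.conj(shifted)
        for j, f in enumerate(dopplers):
            amb[i, j] = np.abs(np.sum(prod * np.exp(2j * np.pi * f * t)))
    return amb

# A linear chirp (illustrative parameters, 1 s at 1 kHz, chirp rate 100 Hz/s).
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
chirp = np.exp(1j * np.pi * 100 * t**2)

delays = list(range(-5, 6))                  # -5..+5 samples
dopplers = np.linspace(-20, 20, 9)           # -20..+20 Hz in 5 Hz steps
surface = ambiguity(chirp, delays, dopplers, fs)

# The self-ambiguity of any signal is maximised at (0, 0).
peak_delay, peak_doppler = np.unravel_index(surface.argmax(), surface.shape)
print("peak at delay", delays[peak_delay], "samples, Doppler", dopplers[peak_doppler], "Hz")
```

For a bistatic system, as the abstract notes, the transmitted reference needed to form this correlation is not available at the receiver, which motivates the time-frequency-distribution approach developed in the thesis.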
Abstract:
Hydrogel polymers are used for the manufacture of soft (or disposable) contact lenses worldwide today, but have a tendency to dehydrate on the eye. In vitro methods that can probe the potential for a given hydrogel polymer to dehydrate in vivo are much sought after. Nuclear magnetic resonance (NMR) has been shown to be effective in characterising water mobility and binding in similar systems (Barbieri, Quaglia et al., 1998; Larsen, Huff et al., 1990; Peschier, Bouwstra et al., 1993), predominantly through measurement of the spin-lattice relaxation time (T1), the spin-spin relaxation time (T2) and the water diffusion coefficient (D). The aim of this work was to use NMR to quantify the molecular behaviour of water in a series of commercially available contact lens hydrogels, and relate these measurements to the binding and mobility of the water, and ultimately the potential for the hydrogel to dehydrate. As a preliminary study, in vitro evaporation rates were measured for a set of commercial contact lens hydrogels. Following this, comprehensive measurement of the temperature and water content dependencies of T1, T2 and D was performed for a series of commercial hydrogels that spanned the spectrum of equilibrium water content (EWC) and common compositions of contact lenses that are manufactured today. To quantify material differences, the data were then modelled based on theory that had been used for similar systems in the literature (Walker, Balmer et al., 1989; Hills, Takacs et al., 1989). The differences were related to differences in water binding and mobility. The evaporative results suggested that the EWC of the material was important in determining a material's potential to dehydrate in this way. Similarly, the NMR water self-diffusion coefficient was also found to be largely (if not wholly) determined by the water content.
A specific binding model confirmed that the EWC was the dominant factor in determining the diffusive behaviour, but also suggested that subtle differences existed between the materials used, based on their equilibrium water content (EWC). However, an alternative modified free volume model suggested that only the current water content of the material was important in determining the diffusive behaviour, and not the equilibrium water content. It was shown that T2 relaxation was dominated by chemical exchange between water and exchangeable polymer protons for materials that contained such protons. The data were analysed using a proton exchange model, and the results were again reasonably well correlated with EWC. Specifically, it was found that the average water mobility increased with increasing EWC, approaching that of free water. The T1 relaxation was also shown to be reasonably well described by the same model. The main conclusion that can be drawn from this work is that the hydrogel EWC is an important parameter, which largely determines the behaviour of water in the gel. A higher EWC results in a hydrogel whose water behaves more like bulk water on average, or is less strongly 'bound' on average, compared with a lower-EWC material. Based on the set of materials used, significant differences due to composition (for materials of the same or similar water content) could not be found. Similar studies could be used in the future to highlight hydrogels that deviate significantly from this 'average' behaviour, and may therefore have the least/greatest potential to dehydrate on the eye.
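The water self-diffusion coefficient D discussed in this abstract is conventionally measured by pulsed-field-gradient NMR, where the signal attenuation follows the Stejskal-Tanner relation S(b) = S0 · exp(−bD). The sketch below is not from the thesis; it is an illustrative example with synthetic, hypothetical values showing how D is recovered from such attenuation data by a linearised least-squares fit.

```python
import numpy as np

# Synthetic pulsed-field-gradient attenuation data: S(b) = S0 * exp(-b * D).
# D_true is on the order of free water at room temperature (~2.3e-9 m^2/s);
# in a hydrogel D would be reduced according to its water content.
D_true = 2.3e-9                      # self-diffusion coefficient, m^2/s
S0 = 1.0                             # signal at b = 0
b = np.linspace(0, 1.5e9, 8)         # b-values, s/m^2
S = S0 * np.exp(-b * D_true)

# Linearise: ln S = ln S0 - b * D, then fit a straight line; the slope gives -D.
slope, intercept = np.polyfit(b, np.log(S), 1)
D_fit = -slope
print(f"fitted D = {D_fit:.3e} m^2/s")
```

With noisy experimental data the same fit applies, though weighting or a nonlinear fit of the exponential directly is often preferred at low signal-to-noise.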
Abstract:
The human-technology nexus is a strong focus of Information Systems (IS) research; however, very few studies have explored this phenomenon in anaesthesia. Anaesthesia has a long history of adoption of technological artifacts, ranging from early apparatus to present-day information systems such as electronic monitoring and pulse oximetry. This prevalence of technology in modern anaesthesia and the rich human-technology relationship provides a fertile empirical setting for IS research. This study employed a grounded theory approach that began with a broad initial guiding question and, through simultaneous data collection and analysis, uncovered a core category of technology appropriation. This emergent basic social process captures a central activity of anaesthetists and is supported by three major concepts: knowledge-directed medicine, complementary artifacts and culture of anaesthesia. The outcomes of this study are: (1) a substantive theory that integrates the aforementioned concepts and pertains to the research setting of anaesthesia and (2) a formal theory, which further develops the core category of appropriation from anaesthesia-specific to a broader, more general perspective. These outcomes fulfill the objective of a grounded theory study, being the formation of theory that describes and explains observed patterns in the empirical field. In generalizing the notion of appropriation, the formal theory is developed using the theories of Karl Marx. This Marxian model of technology appropriation is a three-tiered theoretical lens that examines appropriation behaviours at a highly abstract level, connecting the stages of natural, species and social being to the transition of a technology-as-artifact to a technology-in-use via the processes of perception, orientation and realization.
The contributions of this research are two-fold: (1) the substantive model contributes to practice by providing a model that describes and explains the human-technology nexus in anaesthesia, and thereby offers potential predictive capabilities for designers and administrators to optimize future appropriations of new anaesthetic technological artifacts; and (2) the formal model contributes to research by drawing attention to the philosophical foundations of appropriation in the work of Marx, and subsequently expanding the current understanding of contemporary IS theories of adoption and appropriation.
Abstract:
Through a grant received from the Australian Library and Information Association (ALIA), members of Health Libraries Australia (HLA) are collaborating with a researcher/educator to conduct a twelve-month research project with the goal of developing an educational framework for the Australian health librarianship workforce of the future. The collaboration comprises the principal researcher and a representative group of practitioners from different sectors of the health industry who are affiliated with ALIA in various committees, advisory groups and roles. The research has two main aims: to determine the future skills requirements for the health librarian workforce in Australia; and to develop a structured, modular education framework for specialist post-graduate qualifications together with a structure for ongoing continuing professional development. The paper highlights some of the major trends in the health sector and some of the main environmental influences that may act as drivers for change for health librarianship as a profession, and particularly for educating the future workforce. The research methodology is outlined and the main results are described; the findings are discussed with regard to their implications for the development of a structured, competency-based education framework.