899 results for "Return of results"


Relevance: 90.00%

Abstract:

Driving is a vigilance task, requiring sustained attention to maintain performance and avoid crashes. Hypovigilance (i.e., a marked reduction in vigilance) while driving manifests as poor driving performance and is commonly attributed to fatigue (Dinges, 1995). However, poor driving performance has been found to be more frequent when driving in monotonous road environments, suggesting that monotony plays a role in generating hypovigilance (Thiffault & Bergeron, 2003b). Research to date has tended to conceptualise monotony as a uni-dimensional task characteristic, typically used over a prolonged period of time to facilitate other factors under investigation, most notably fatigue. More often than not, however, more than one exogenous factor relating to the task or operating environment is manipulated to vary or generate monotony (Mascord & Heath, 1992). Here we aimed to explore whether monotony is a multi-dimensional construct determined by characteristics of both the task proper and the task environment. The general assumption that monotony is a task characteristic used solely to elicit hypovigilance or fatigue-related poor performance appears to have discouraged rigorous investigation into the exact nature of the relationship. While the two concepts are undoubtedly linked, the independent effect of monotony on hypovigilance remains largely ignored. Nevertheless, there is evidence that monotony effects can emerge very early in vigilance tasks and are not necessarily accompanied by fatigue (see Meuter, Rakotonirainy, Johns, & Wagner, 2005). This raises a largely untested empirical question, explored here in two studies: can hypovigilance emerge as a consequence of task and/or environmental monotony, independent of time on task and fatigue? In Study 1, using a short computerised vigilance task requiring responses to be withheld to infrequent targets, we explored the differential impacts of stimulus and task-demand manipulations on the development of a monotonous context and the associated effects on vigilance performance (as indexed by response errors and response times), independent of fatigue and time on task. The role of individual differences (sensation seeking, extroversion and cognitive failures) in moderating monotony effects was also considered. The results indicate that monotony affects sustained attention, with hypovigilance and the associated performance decrements greater in monotonous than in non-monotonous contexts. Critically, performance decrements emerged early in the task (within 4.3 minutes) and remained consistent over the course of the experiment (21.5 minutes), suggesting that monotony effects can operate independently of time on task and fatigue. A combination of low task demands and low stimulus variability forms a monotonous context characterised by hypovigilance and poor task performance. Variations to task demand and stimulus variability were also found to independently affect performance, suggesting that monotony is a multi-dimensional construct relating to both task monotony (associated with the task itself) and environmental monotony (related to characteristics of the stimulus). Consequently, it can be concluded that monotony is multi-dimensional and is characterised by low variability in stimuli and/or task demands. The proposition that individual differences emerge under conditions of varying monotony, with high sensation seekers and/or extroverts performing worse in monotonous contexts, was only partially supported.
Using a driving simulator, the findings of Study 1 were extended to a driving context to identify the behavioural and psychophysiological indices of monotony-related hypovigilance associated with variations to road design and roadside scenery (Study 2). Supporting the proposition that monotony is a multi-dimensional construct, road design variability emerged as a key moderating characteristic of environmental monotony, resulting in poor driving performance indexed by decrements in steering wheel measures (mean lateral position). Sensation seeking also emerged as a moderating factor, with participants high in sensation-seeking tendencies displaying worse driving behaviour in monotonous conditions. Importantly, impaired driving performance was observed within 8 minutes of commencing the driving task characterised by environmental monotony (low variability in road design) and was not accompanied by a decline in psychophysiological arousal. In addition, no subjective declines in alertness were reported. Given that fatigue effects are associated with prolonged driving (van der Hulst, Meijman, & Rothengatter, 2001) and are indexed by drowsiness, this pattern of results indicates that monotony can affect driver vigilance independently of time on task and fatigue. Perceptual load theory (Lavie, 1995, 2005) and mindlessness theory (Robertson, Manly, Andrade, Baddeley, & Yiend, 1997) provide useful theoretical frameworks for explaining and predicting monotony effects by positing that the low load (of task and/or stimuli) associated with a monotonous task results in spare attentional capacity which spills over involuntarily, resulting in the processing of task-irrelevant stimuli or task-unrelated thoughts. That is, individuals, even when not fatigued, become easily distracted when performing a highly monotonous task, resulting in hypovigilance and impaired performance. The implications for road safety, including the likely effectiveness of fatigue countermeasures in mitigating monotony-related driver hypovigilance, are discussed.

Relevance: 90.00%

Abstract:

This paper gives an overview of the INEX 2009 Ad Hoc Track. The main goals of the Ad Hoc Track were three-fold. The first goal was to investigate the impact of collection scale and markup, by using a new collection that is again based on the Wikipedia but is over four times larger, with longer articles and additional semantic annotations. For this reason the Ad Hoc Track tasks stayed unchanged, and the Thorough Task of INEX 2002–2006 returned. The second goal was to study the impact of more verbose queries on retrieval effectiveness, by using the available markup as structural constraints (now using both the Wikipedia's layout-based markup and the enriched semantic markup) and by the use of phrases. The third goal was to compare different result granularities by allowing systems to retrieve XML elements, ranges of XML elements, or arbitrary passages of text. This investigates the value of the internal document structure (as provided by the XML markup) for retrieving relevant information. The INEX 2009 Ad Hoc Track featured four tasks: for the Thorough Task, a ranked list of results (elements or passages) ordered by estimated relevance was required; for the Focused Task, a ranked list of non-overlapping results (elements or passages); for the Relevant in Context Task, non-overlapping results (elements or passages) grouped by the article from which they came; and for the Best in Context Task, a single starting point (element start tag or passage start) for each article. We discuss the setup of the track and the results for the four tasks.
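The Focused Task's non-overlap constraint lends itself to a simple post-processing step. Below is a minimal sketch, not taken from the INEX paper itself, of how a participating system might greedily filter a ranked run; the (doc_id, start, end) extent representation is a hypothetical data layout for illustration:

```python
def remove_overlaps(ranked):
    """Greedily filter a ranked run so no two kept results overlap.

    `ranked` is a list of (doc_id, start, end) tuples in descending
    relevance order, where start/end delimit an element or passage
    extent within the document (half-open intervals).
    """
    kept = []
    for doc_id, start, end in ranked:
        overlaps = any(
            d == doc_id and start < e and end > s  # interval intersection test
            for d, s, e in kept
        )
        if not overlaps:
            kept.append((doc_id, start, end))
    return kept

# Example: the second result is nested inside the first and is dropped.
print(remove_overlaps([("d1", 0, 100), ("d1", 10, 20), ("d1", 200, 300)]))
# -> [('d1', 0, 100), ('d1', 200, 300)]
```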

Relevance: 90.00%

Abstract:

Windows are one of the most significant elements in the design of buildings. Whether they are small punched openings in the facade or a completely glazed curtain wall, windows are usually a dominant feature of a building's exterior appearance. From the energy use perspective, however, windows may also be regarded as thermal holes in a building. Window design and selection must therefore take both aesthetics and serviceability into consideration. In this paper, using building computer simulation techniques, the effects of glass type on the thermal and energy performance of a sample air-conditioned office building in Australia are studied. It is found that a glass type with a lower shading coefficient leads to a lower building cooling load and lower total energy use. Through a comparison of results between current and future weather scenarios, it is identified that the pattern found under the current weather scenario would also hold under the future weather scenario, although the scale of the change would become smaller. The possible implication of glazing selection in the face of global warming is also examined. It is found that, compared with its influence on building thermal performance, the influence of glazing selection on building energy use is relatively small or insignificant.
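To make the role of the shading coefficient concrete, the sketch below estimates instantaneous solar gain through glazing using the textbook relation that solar heat gain is proportional to the shading coefficient, referenced to 3 mm clear glass (solar heat gain coefficient of roughly 0.87). The function, glass properties and figures are illustrative assumptions, not values from the paper's simulations:

```python
def solar_heat_gain(shading_coeff, area_m2, irradiance_w_m2):
    """Approximate instantaneous solar gain through glazing (watts).

    gain = SC * (gain through reference 3 mm clear glass),
    with the reference glass passing about 0.87 of incident solar
    irradiance. A rough hand calculation only; the paper's results
    come from full building energy simulations.
    """
    reference_glass_shgc = 0.87
    return shading_coeff * reference_glass_shgc * irradiance_w_m2 * area_m2

# Hypothetical comparison: clear vs. reflective glass on a 50 m2 facade.
clear = solar_heat_gain(0.95, area_m2=50, irradiance_w_m2=600)
reflective = solar_heat_gain(0.35, area_m2=50, irradiance_w_m2=600)
print(f"clear: {clear:.0f} W, reflective: {reflective:.0f} W")
# A lower shading coefficient directly cuts the solar component
# of the cooling load.
```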

Relevance: 90.00%

Abstract:

Discovering proper search intents is a vital process in returning desired results, and it has been a hot research topic in information retrieval in recent years. Existing methods mainly rely on context-based mining, query expansion, and user profiling techniques, which still suffer from the ambiguity of search queries. In this paper, we introduce a novel ontology-based approach that draws on a world knowledge base to construct personalized ontologies, identifying adequate concept levels for matching user search intents. An iterative mining algorithm is designed to evaluate potential intents level by level until the best result is reached. The proposed approach is evaluated on the large-volume RCV1 data set, and experimental results indicate a distinct improvement in top precision compared with baseline models.
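The level-by-level evaluation can be pictured as a greedy descent through a concept hierarchy. The sketch below is a toy reconstruction under stated assumptions: a hypothetical (label, children) tree and a simple Jaccard term-overlap score stand in for the paper's actual world knowledge base and evaluation function:

```python
def jaccard(label, query_terms):
    """Toy relevance score: term overlap between a concept label and the query."""
    terms = set(label.lower().split())
    return len(terms & query_terms) / len(terms | query_terms)

def best_intent(root, query):
    """Descend the concept tree level by level, always following the
    best-scoring child, and return the concept with the highest score
    seen on the path (stopping once scores start to fall)."""
    query_terms = set(query.lower().split())
    node = root
    best_label, best_score = root[0], jaccard(root[0], query_terms)
    while node[1]:  # while the current concept has children
        child = max(node[1], key=lambda n: jaccard(n[0], query_terms))
        child_score = jaccard(child[0], query_terms)
        if child_score < best_score:  # more specific concepts no longer help
            break
        best_label, best_score, node = child[0], child_score, child
    return best_label

# Hypothetical fragment of a knowledge-base taxonomy.
tree = ("science", [
    ("computer science", [
        ("information retrieval", [("web search ranking", [])]),
        ("databases", []),
    ]),
    ("physics", []),
])
print(best_intent(tree, "information retrieval search intents"))
# -> 'information retrieval': the adequate concept level for this query.
```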

Relevance: 90.00%

Abstract:

There is an intimate interconnectivity between the policy guidelines defining reform and the delineation of which research methods will subsequently be applied to determine reform success. Research is guided as much by the metaphors describing it as by the ensuing empirical definition of the actions or results obtained from it. In a call for different reform policy metaphors, Lumby and English (2010) note: "The primary responsibility for the parlous state of education... lies with the policy makers that have racked our schools with reductive and dehumanizing processes, following the metaphors of market efficiency, and leadership models based on accounting and the characteristics of machine bureaucracy" (p. 127).

Relevance: 90.00%

Abstract:

The growth of solid tumours beyond a critical size is dependent upon angiogenesis, the formation of new blood vessels from an existing vasculature. Tumours may remain dormant at microscopic sizes for some years before switching to a mode in which growth of a supportive vasculature is initiated. The new blood vessels supply nutrients, oxygen, and access to routes by which tumour cells may travel to other sites within the host (metastasize). In recent decades an abundance of biological research has focused on tumour-induced angiogenesis in the hope that treatments targeted at the vasculature may result in a stabilisation or regression of the disease: a tantalizing prospect. The complex and fascinating process of angiogenesis has also attracted the interest of researchers in the field of mathematical biology, a discipline that is, for mathematics, relatively new. The challenge in mathematical biology is to produce a model that captures the essential elements and critical dependencies of a biological system. Such a model may ultimately be used as a predictive tool. In this thesis we examine a number of aspects of tumour-induced angiogenesis, focusing on growth of the neovasculature external to the tumour. Firstly, we present a one-dimensional continuum model of tumour-induced angiogenesis in which elements of the immune system or other tumour-cytotoxins are delivered via the newly formed vessels. This model, based on observations from experiments by Judah Folkman et al., is able to show regression of the tumour for some parameter regimes. The modelling highlights a number of interesting aspects of the process that may be characterised further in the laboratory. The next model examines the initiation positions of blood vessel sprouts on an existing vessel, in a two-dimensional domain. This model hypothesises that a simple feedback inhibition mechanism may describe the spacing of these sprouts, with the inhibitor being produced by breakdown of the existing vessel's basement membrane. Finally, we have developed a stochastic model of blood vessel growth and anastomosis in three dimensions. The model has been implemented in C++, includes an OpenGL interface, and uses a novel algorithm for calculating the proximity of the line segments representing a growing vessel. This choice of programming language and graphics interface allows for near-simultaneous calculation and visualisation of blood vessel networks on a contemporary personal computer. In addition, the visualised results may be transformed interactively, and drop-down menus facilitate changes to the parameter values. Visualisation of results is of vital importance in communicating mathematical information to a wide audience, and we aim to incorporate this philosophy in the thesis. As biological research further uncovers the intriguing processes involved in tumour-induced angiogenesis, we conclude with a comment from mathematical biologist Jim Murray: "Mathematical biology is ... the most exciting modern application of mathematics."
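The proximity test at the heart of the anastomosis check is, in essence, a closest-distance computation between two line segments. The thesis's C++/OpenGL implementation is not reproduced here; below is a minimal Python sketch of the standard clamped closest-point algorithm (as described, for example, in Ericson's Real-Time Collision Detection), assuming non-degenerate segments:

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Minimum distance between 3D segments [p1, q1] and [p2, q2].
    Assumes both segments have non-zero length."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1 @ d1, d2 @ d2            # squared segment lengths
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b              # zero when segments are parallel
    # Parameter s on segment 1, clamped to [0, 1].
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    # Parameter t on segment 2, then re-clamp s if t left [0, 1].
    t = (b * s + f) / e
    if t < 0.0:
        t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
    elif t > 1.0:
        t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return float(np.linalg.norm((p1 + s * d1) - (p2 + t * d2)))

# Two vessel segments passing close to each other: a sufficiently small
# distance could be treated as a candidate anastomosis (fusion) event.
a0, a1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
b0, b1 = np.array([0.5, 0.1, 0.0]), np.array([0.5, 1.0, 0.0])
print(segment_distance(a0, a1, b0, b1))  # -> 0.1
```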

Relevance: 90.00%

Abstract:

Global warming can have a significant impact on building energy performance and the indoor thermal environment, as well as on the health and productivity of the people living and working inside buildings. Using building simulation techniques, this paper investigates the potential of different selections of building physical properties to adapt to increased outdoor temperatures in Australia. It is found that, overall, an office building with a lower insulation level, a smaller window-to-wall ratio (WWR) and/or a glass type with a lower shading coefficient, and a lower internal load density will have a lower building cooling load and lower total energy use, and will therefore adapt better to a warming external climate. Compared with clear glass, the use of reflective glass in the sample building with a WWR of 0.5 is shown to reduce the building cooling load by more than 12%. A lower internal load can also significantly reduce the building cooling load, as well as the building energy use. Through a comparison of results between current and future weather scenarios, it is found that the patterns found under the current weather scenario also hold under the future weather scenarios, but to a smaller extent.

Relevance: 90.00%

Abstract:

Construction is undoubtedly the most dangerous industry in Hong Kong, being responsible for 76 percent of all fatal industrial accidents in the region, around twenty times more than any other industry. In this paper, it is argued that while this rate can be largely reduced by improved production practices in isolation from the project's physical design, there is also some scope for the design team to contribute to site safety. A new safety assessment method, the Virtual Safety Assessment System (VSAS), is described that offers such assistance. It involves presenting individual construction workers with 3D virtual risky scenarios drawn from their project, together with a range of possible actions to select from. The method provides an analysis of the results, including an assessment of the correctness or otherwise of the user's selections, contributing to an iterative process of retraining and testing until a satisfactory level of knowledge and skill is achieved.

Relevance: 90.00%

Abstract:

Background: In-depth investigations of crash risks inform prevention and safety promotion programmes. Traditionally, such investigations are conducted using exposure-controlled or case-control methodologies. However, these studies need either observational data for control cases or exogenous exposure data, such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive data collection on a system-wide basis. Aim: The objective of this research is to propose an alternative methodology for investigating the crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate the crash risks of a road user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road user group, under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user group for the same set of explanatory variables. The combination of the two techniques therefore provides relative measures of crash risk under various roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the quasi-induced exposure technique is a promising methodology that can be applied to better estimate the crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
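The quasi-induced exposure idea reduces to a small calculation: in two-unit crashes, the not-at-fault parties are taken as a sample of road users exposed at the same times and places, so a group's relative crash involvement is its share of at-fault drivers divided by its share of not-at-fault drivers. A minimal sketch with invented counts for illustration:

```python
def relative_crash_involvement(at_fault_counts, not_at_fault_counts, group):
    """Quasi-induced exposure: not-at-fault parties in two-unit crashes
    proxy for exposure, so relative risk for a group is its share of
    at-fault drivers divided by its share of not-at-fault drivers.
    Counts are dicts keyed by road-user group (illustrative data only)."""
    at_share = at_fault_counts[group] / sum(at_fault_counts.values())
    exposure_share = (not_at_fault_counts[group]
                      / sum(not_at_fault_counts.values()))
    return at_share / exposure_share

# Hypothetical two-unit crash counts, not the Brisbane data.
at_fault = {"motorcycle": 180, "car": 1820}
not_at_fault = {"motorcycle": 90, "car": 1910}
print(round(relative_crash_involvement(at_fault, not_at_fault, "motorcycle"), 2))
# 9% of at-fault vs. 4.5% of not-at-fault drivers -> involvement ratio 2.0
```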

Relevance: 90.00%

Abstract:

Nutrition interventions, in the form of both self-management education and individualised diet therapy, are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes in typical intake to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome limitations of current approaches to the nutritional management of T2DM, in particular through the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM.

Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM. The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education, was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean BMI (±SD) = 34.2±7.0 kg/m2). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, in body weight of 0.7 kg, and in waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group. The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in the interpretation of results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reported dietary intake in this instance. This study provided an example of the methodological challenges experienced when measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes.

Study 2: The development and trial of the NuDAM recording protocol. The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam, but forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of the dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement the dietary information collected via Nutricam. Modifications were made to the method to allow for clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.

Study 3: The development and trial of the NuDAM analysis protocol. Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids, and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced the mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was most effective in reducing estimation error. These findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record. Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m2). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared to estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs. 274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.

Study 4: Evaluation of the NuDAM. The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake was evaluated in a sample of 10 adults (6 males; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with an EI:TEE ratio of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures for energy and all macronutrients except fat (r=0.24). High agreement was observed between dietitians for the estimates of energy and macronutrients derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods.

This research program explored two novel approaches which utilised distinct technologies to aid in the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record. The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
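Study 4's plausibility screen rests on a simple ratio: reported energy intake divided by doubly-labelled-water TEE, with values well below 1.0 flagging under-reporting. A minimal sketch follows, using an assumed illustrative cutoff band rather than the thesis's actual criterion:

```python
def reporting_status(energy_intake_mj, tee_mj, lower=0.76, upper=1.24):
    """Classify a dietary record against doubly-labelled-water TEE.

    An EI:TEE ratio near 1.0 indicates plausible reporting. The band
    used here is an illustrative placeholder, not the thesis's cutoff.
    """
    ratio = energy_intake_mj / tee_mj
    if ratio < lower:
        return ratio, "under-reporter"
    if ratio > upper:
        return ratio, "over-reporter"
    return ratio, "plausible"

# Hypothetical subject: reports 7.5 MJ/day against a measured TEE of 10.8.
print(reporting_status(7.5, 10.8))  # ratio ~0.69 -> under-reporter
```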

Relevance: 90.00%

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by the implementation of multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach, where not only is the importance of each individual protein explicit, but the proteins are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
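As a concrete illustration of this n≪p pipeline, the sketch below runs a PLS-based classifier (PLS-DA: regression on an indicator-coded class, thresholded at 0.5) against a linear SVM on simulated peak-intensity data. It is a toy reconstruction using scikit-learn, not the paper's actual code, data, or tuning:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC

# Simulated n << p proteomics data: 60 samples, 1000 peak intensities,
# with a handful of informative features in the case group.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 1000))
y = rng.integers(0, 2, size=60)
X[y == 1, :10] += 1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# PLS-DA: regress the indicator-coded class on X, then threshold at 0.5.
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
pls_acc = ((pls.predict(X_te).ravel() > 0.5) == y_te).mean()

# Linear SVM applied directly to the high-dimensional features.
svm = SVC(kernel="linear").fit(X_tr, y_tr)
svm_acc = (svm.predict(X_te) == y_te).mean()
print(f"PLS-DA accuracy: {pls_acc:.2f}  SVM accuracy: {svm_acc:.2f}")
```

The PLS route keeps the latent components and loadings available for inspection, which is the interpretability advantage the abstract highlights over black-box classifiers.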

Relevance: 90.00%

Abstract:

Studies of orthographic skills transfer between languages focus mostly on working memory (WM) ability in alphabetic first language (L1) speakers when learning another, often alphabetically congruent, language. We report two studies that, instead, explored the transferability of L1 orthographic processing skills in WM in logographic-L1 and alphabetic-L1 speakers. English-French bilingual and English monolingual (alphabetic-L1) speakers, and Chinese-English (logographic-L1) speakers, learned a set of artificial logographs and associated meanings (Study 1). The logographs were used in WM tasks with and without concurrent articulatory or visuo-spatial suppression. The logographic-L1 bilinguals were markedly less affected by articulatory suppression than alphabetic-L1 monolinguals (who did not differ from their bilingual peers). Bilinguals overall were less affected by spatial interference, reflecting superior phonological processing skills or, conceivably, greater executive control. A comparison of span sizes for meaningful and meaningless logographs (Study 2) replicated these findings. However, the logographic-L1 bilinguals’ spans in L1 were measurably greater than those of their alphabetic-L1 (bilingual and monolingual) peers; a finding unaccounted for by faster articulation rates or differences in general intelligence. The overall pattern of results suggests an advantage (possibly perceptual) for logographic-L1 speakers, over and above the bilingual advantage also seen elsewhere in third language (L3) acquisition.

Relevance: 90.00%

Abstract:

BACKGROUND: Demineralized freeze-dried bone allografts (DFDBAs) have been proposed as a useful adjunct in periodontal therapy to induce periodontal regeneration through the induction of new bone formation. The presence of bone morphogenetic proteins (BMPs) within the demineralized matrix has been proposed as a possible mechanism through which DFDBA may exert its biologic effect. However, in recent years, the predictability of results using DFDBA has been variable and has led to its use being questioned. One reason for the variability in tissue response may be differences in the processing of DFDBA, which can lead to loss of activity of any bioactive substances within the DFDBA matrix. Therefore, the purpose of this investigation was to determine whether there are detectable levels of bone morphogenetic proteins in commercial DFDBA preparations. METHODS: A single preparation of DFDBA was obtained from each of three commercial sources. Each preparation was studied in triplicate. Proteins within the DFDBA samples were first extracted with 4 M guanidinium HCl for seven days at 4 degrees Celsius, and the residue was further extracted with 4 M guanidinium HCl/EDTA for seven days at 4 degrees Celsius. Two anti-human BMP-2 and BMP-4 antibodies were used to detect the presence of BMPs in the extracts. RESULTS: Neither BMP-2 nor BMP-4 was detected in any of the extracts. When recombinant human BMP-2 and BMP-4 were added throughout the DFDBA extraction process, not only were intact proteins detected, but smaller molecular weight fragments were also noted in the extract. CONCLUSIONS: These results indicate that none of the DFDBA samples tested had detectable amounts of BMP-2 or BMP-4. In addition, an unknown substance present in the DFDBA may be responsible for degradation of whatever BMPs might be present.

Relevance: 90.00%

Abstract:

This paper explores how mandated literacy assessment is reorganising teachers’ work in the context of Australia’s National Assessment Program – Literacy and Numeracy (NAPLAN), which was implemented in 2008. Students in Years 3, 5, 7 and 9 are tested annually, with school results publicly available. The wider policy context and the emergence of different forms of interconnected educational work associated with the testing phenomenon are described. Taking an Institutional Ethnography approach, the local effects of the federal policy regime are examined through a case study of one school. What mandated literacy assessment does to educators’ work in a culturally diverse low socioeconomic school community is discussed. Key themes include strategic exclusions of students from the testing process, appropriations and adaptations of literacy theory, work intensification, and ethical mediation of results. Questions concerning equity are raised about the differential effects of policy in different school contexts.

Relevance: 90.00%

Abstract:

Background: Total hip arthroplasty (THA) is a commonly performed procedure, and numbers are increasing with ageing populations. One of the most serious complications of THA is surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.

Objectives: The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. Identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty
2. Evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty
3. Construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies
4. Quantification of the effect of uncertainty in the model
5. Assessment of the value of perfect information among model parameters to inform future data collection

Methods: The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts; to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences; and to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed, and their limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. The strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect sizes and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients' transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. The uncertainty of model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.

Results: The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty).
Deterministic and probabilistic analyses (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared to using antibiotic prophylaxis alone, and the error probability that this strategy does not have the largest NMB was very low (<5%). Not using antibiotic prophylaxis (No AP), or using antibiotic prophylaxis combined with laminar air operating rooms (AP & LOR), resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and to the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and for higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.

Conclusions: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. Implementing a more beneficial infection control strategy allows scarce health care resources to be used more efficiently, to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage the risks of infection in THA.
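The decision rule behind these comparisons is the standard net-monetary-benefit calculation, NMB = willingness-to-pay × QALYs − cost, with the strategy maximising expected NMB preferred. The sketch below illustrates the rule; the willingness-to-pay threshold and the cost/QALY figures are invented placeholders chosen only to echo the reported ordering, not outputs of the project's model:

```python
# Net monetary benefit: NMB = WTP * QALYs - cost.
WTP = 50_000  # assumed willingness to pay per QALY (A$), a placeholder

strategies = {
    # name: (total cost, total QALYs) for a 30,000-patient cohort;
    # all figures are illustrative, not the thesis's model outputs.
    "No AP":    (312_000_000, 328_900),
    "AP":       (308_400_000, 329_163),
    "AP & ABC": (308_000_000, 329_217),
    "AP & LOR": (313_000_000, 329_100),
}

# Compute NMB per strategy and pick the maximiser.
nmb = {name: WTP * q - c for name, (c, q) in strategies.items()}
best = max(nmb, key=nmb.get)
print(best, f"NMB = ${nmb[best]:,.0f}")
# With these placeholder figures, 'AP & ABC' wins, with an incremental
# NMB of $3.1m over 'AP' alone (mirroring the abstract's ordering).
```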