153 results for Barrett, Ellen M.
Abstract:
In this study, sustained, selective, divided, and switching attention, and the reloading of working memory, were investigated in schizophrenia using a newly developed Visual Attention Battery (VAB). Twenty-four outpatients with schizophrenia and 24 control participants were studied using the VAB. Performance on VAB components was correlated with performance on standard tests. Patients with schizophrenia were significantly impaired on VAB tasks that required switching of attention and reloading of working memory but performed normally on tasks involving sustained attention or attention to multiple stimulus features. Switching attention and reloading of working memory were highly correlated with the Trails (B - A) score for patients. The decline in performance on the switching-attention task in patients with schizophrenia met criteria for a differential deficit in switching attention. Future research should examine the neurophysiological basis of the switching deficit and its sensitivity and specificity to schizophrenia.
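The Trails (B - A) difference score mentioned above is simply the Trail Making Test Part B time minus the Part A time, which can then be correlated with a switching measure. A minimal sketch is shown below; the data values and the `vab_switching` variable are hypothetical placeholders, not study data.

```python
# Minimal sketch: correlating a hypothetical VAB switching-attention score with
# the Trails (B - A) difference score. All values are illustrative only.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-patient completion times (seconds) and VAB switching scores
trails_a = np.array([32.0, 41.0, 29.0, 55.0, 38.0])
trails_b = np.array([78.0, 95.0, 64.0, 140.0, 90.0])
vab_switching = np.array([0.62, 0.55, 0.71, 0.40, 0.58])  # e.g. proportion correct

trails_diff = trails_b - trails_a          # Trails (B - A): set-shifting cost
r, p = pearsonr(vab_switching, trails_diff)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```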
Groundwater flow model of the Logan River alluvial aquifer system, Josephville, South East Queensland
Abstract:
The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville, near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day (Douglas Partners, 2004) of waste water containing high levels of dissolved ions. At present a series of treatment ponds is used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from this. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer composed of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed from bore monitoring and rainfall data using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered with the alluvium represented as a single hydrogeological unit. Both models were calibrated to steady state conditions, and sensitivity analyses of the parameters demonstrated that both models are very stable for changes in the range of ± 10% for all parameters and still reasonably stable for changes up to ± 20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site: water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, while the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken, it is necessary to establish a regular program of groundwater monitoring and maintain a long term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores to provide adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by pump testing to investigate the hydrogeological properties of the aquifer.
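The sensitivity test described above (perturbing calibrated parameters by ±10% and ±20% and checking that the RMS error of simulated heads stays below 10%) can be sketched generically as follows. This is not the GMS/PEST workflow; `simulate_heads` is a hypothetical, purely illustrative stand-in for a steady-state MODFLOW run, and all numbers are placeholders.

```python
# Generic sensitivity-analysis sketch: scale a parameter, rerun the (toy) model,
# and report the RMS error of simulated vs observed heads at the monitoring bores.
import numpy as np

observed_heads = np.array([13.6, 13.1, 12.4, 11.9])       # m AHD, illustrative
bore_distances = np.array([50.0, 120.0, 200.0, 310.0])    # distance from river (m), illustrative

def simulate_heads(k: float) -> np.ndarray:
    """Toy stand-in for a MODFLOW run: head falls off with distance / K (1-D Darcy-style)."""
    river_stage = 14.0       # m AHD, illustrative
    recharge_flux = 0.05     # m/day per unit width, illustrative
    return river_stage - recharge_flux * bore_distances / k

base_k = 5.0                 # calibrated hydraulic conductivity (m/day), illustrative
for factor in (0.8, 0.9, 1.0, 1.1, 1.2):
    rms = np.sqrt(np.mean((simulate_heads(base_k * factor) - observed_heads) ** 2))
    print(f"K scaled by {factor:.1f}: RMS error = {rms:.2f} m")
```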
Abstract:
The evolution of the laptop computer as a musical instrument in the 1990s provided a tool for empowering the solo musician, yet divergent approaches to the application of this technology in performance remain consistently debated. The increasing ubiquity of digital media, combined with the power of current-generation notebook technology, has provided the perfect platform to realise integrated audio-visual toolsets that respond to musical controllers and provide mixed-media results. Despite emerging practitioners increasingly availing themselves of the musical affordances of this technology, theoretical discussion in the field ignores the various approaches a solo musician might take in developing integrated media works for performance. In an increasingly crowded niche there is a clear compulsion to consider expanded modes of performance, yet in the absence of any formal framework these integrations can easily alienate an audience, distract from the performance and lead to criticisms of novelty for novelty's sake.
Abstract:
We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invasive invertebrates whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
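One simple way such a device-allocation problem might be framed: for each risk stratum, find the smallest number of detection devices whose combined probability of detecting an incursion meets a target power, then check the plan against a budget. The sketch below is an assumption about the general approach only; the strata, per-device detection probabilities, costs and budget are hypothetical and not from the study.

```python
# Minimal allocation sketch: devices needed per stratum to reach a detection power,
# using 1 - (1 - p)^n >= target, then a budget check. All figures are illustrative.
import math

strata = {                       # stratum: (per-device detection probability, cost per device)
    "port_surrounds": (0.15, 40.0),
    "urban_fringe":   (0.08, 25.0),
    "rural":          (0.05, 20.0),
}
target_power = 0.95
budget = 5000.0

plan, total_cost = {}, 0.0
for name, (p_detect, cost) in strata.items():
    # smallest n with 1 - (1 - p)^n >= target_power
    n = math.ceil(math.log(1.0 - target_power) / math.log(1.0 - p_detect))
    plan[name] = n
    total_cost += n * cost

status = "within budget" if total_cost <= budget else "over budget"
print(plan, f"total cost = {total_cost:.0f} ({status})")
```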
Abstract:
The researcher’s professional role as an Education Officer was the impetus for this study. Designing and implementing professional development activities is a significant component of the researcher’s position description, and as a result of reflection and feedback from participants and colleagues, the creation of a more effective model of professional development became the focus for this study. Few studies have examined all three links between the purposes of professional development, that is, increasing teacher knowledge, improving teacher practice, and improving student outcomes. This study is significant in that it investigates the nature of the growth of teachers who participated in a model of professional development based upon the principles of Lesson Study. The research provides qualitative and empirical data to establish some links between teacher knowledge, teacher practice, and student learning outcomes. Teacher knowledge in this study refers to mathematics content knowledge as well as pedagogical-content knowledge. The outcomes for students include achievement outcomes, attitudinal outcomes, and behavioural outcomes. As the study was conducted at one school site, existence proof research was the focus of the methodology and data collection. Developed over the 2007 school year, with five teacher-participants and approximately 160 students from Year Levels 6 to 9, the Lesson Study-principled model of professional development provided the teacher-participants with on-site, on-going, and reflective learning based on their classroom environment. The focus area for the professional development was strategising the engagement with, and solution of, worded mathematics problems. A design experiment was used to develop the professional development as an intervention in prevailing teacher practice, for which data were collected prior to and after the period of intervention. A model of teacher change was developed as an underpinning framework for the study and was useful in making decisions about data collection and analyses. Data sources consisted of questionnaires, pre-tests and post-tests, interviews, and researcher observations and field notes. The data clearly showed that content knowledge and pedagogical-content knowledge increased among the teacher-participants, that teacher practice changed in a positive manner, and that a majority of students demonstrated improved learning outcomes. The positive changes to teacher practice are described in this study as the demonstrated use of mixed pedagogical practices rather than a polarisation to either traditional or contemporary pedagogical practices. The improvement in student learning outcomes was most evident in improved achievement outcomes, as indicated by the comparison of pre-test and post-test scores. The effectiveness of the Lesson Study-principled model of professional development used in this study was evaluated using Guskey’s (2005) Five Levels of Professional Development Evaluation.
Abstract:
Highway construction often requires a significant capital input and therefore often has serious financial implications for developers, owners and operators. The recent industry-wide focus on sustainability has added a new dimension to the evaluation of highway projects, particularly regarding the economics of ‘going green’. Comprehensive analysis of whole-of-life highway development that responds to sustainability challenges is one of the primary concerns for stakeholders. Principles of engineering economics and life cycle costing have been used to determine the incremental capacity investments for highway projects. However, the consideration of costs and issues associated with sustainability is still very limited in current studies on highway projects. Previous studies have identified that highway project investments are primarily concerned with direct market costs that can be quantified through life cycle costing analysis (LCCA), but they tend to ignore costs that are difficult to calculate, such as those related to environmental and social elements. On a more positive note, these studies proved that the inclusion of such costs is an essential part of the overall development investment and a primary concern for decision making by the stakeholders. This paper discusses a research attempt to identify and categorise sustainability cost elements for highway projects. Through a questionnaire survey, a set of sustainability cost elements for highway projects has been proposed. These cost elements are incorporated into extensions of existing LCCA models in order to produce a holistic financial picture of the highway project. It is expected that a new LCCA model will be established to serve as a suitable tool for decision making by highway project stakeholders.
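In broad terms, folding sustainability cost elements into an LCCA means adding environmental and social cost streams to the discounted cash flows already used for agency costs. The sketch below illustrates that idea only; the cost categories, amounts, discount rate and analysis period are hypothetical placeholders, not figures from the paper or from any existing LCCA model.

```python
# Simplified whole-of-life cost sketch: discount conventional and sustainability
# cost streams to present value over the analysis period. Values are illustrative.
def present_value(amount: float, rate: float, year: int) -> float:
    return amount / (1.0 + rate) ** year

discount_rate = 0.05
analysis_period = 30  # years

initial_construction = 120_000_000.0
annual_costs = {
    "routine_maintenance": 1_500_000.0,
    "carbon_emissions":    400_000.0,   # example environmental cost element
    "noise_and_amenity":   250_000.0,   # example social cost element
}

life_cycle_cost = initial_construction + sum(
    present_value(amount, discount_rate, year)
    for amount in annual_costs.values()
    for year in range(1, analysis_period + 1)
)
print(f"Whole-of-life cost (present value): ${life_cycle_cost:,.0f}")
```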
Abstract:
The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to give slightly increased accuracy at the cost of markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution. This could be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with the time-segmented summary data (dataset F) having an MR of 9.8 and the raw time-series summary data (dataset A) 9.92. However, for all datasets based on time-series data alone, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets are of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) being 8.85 and dataset RF_F (time-segmented time-series variables and RF) being 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MRs of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MRs of 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MRs of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The predictive accuracy increase achieved by the addition of risk factor variables to time-series variable based models is significant. The addition of time-series derived variables to models based on risk factor variables alone is associated with a trend to improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables when compared to the use of risk factors alone is similar to recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used together as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values outside the accepted normal range, is associated with some improvement in model performance.
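The evaluation strategy described above (majority-class under-sampling, then judging models by misclassification rate, Kappa and AUC) can be sketched as follows. The methods named in the thesis (Cfs, J48, SMO, NNge) are Weka components; this sketch uses scikit-learn equivalents and randomly generated data purely for illustration, not the study's pipeline or data.

```python
# Sketch: under-sample the majority class, fit a classifier, and report
# misclassification rate, Cohen's kappa and AUC. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                  # stand-in features
y = (rng.random(1000) < 0.1).astype(int)         # unbalanced classes (~10% positive)

# Under-sample the majority class to the size of the minority class
pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
keep = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])
X_bal, y_bal = X[keep], y[keep]

X_tr, X_te, y_tr, y_te = train_test_split(X_bal, y_bal, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred, prob = model.predict(X_te), model.predict_proba(X_te)[:, 1]

print(f"Misclassification rate: {1 - accuracy_score(y_te, pred):.3f}")
print(f"Kappa: {cohen_kappa_score(y_te, pred):.3f}")
print(f"AUC: {roc_auc_score(y_te, prob):.3f}")
```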
Abstract:
Serotonergic hypofunction is associated with a depressive mood state, an increased drive to eat and a preference for sweet (SW) foods. High-trait anxiety individuals are characterised by a functional shortage of serotonin during stress, which in turn increases their susceptibility to experience a negative mood and an increased drive for SW foods. The present study examined whether an acute dietary manipulation, intended to increase circulating serotonin levels, alleviated the detrimental effects of a stress-inducing task on subjective appetite and mood sensations, and on preference for SW foods, in high-trait anxiety individuals. Thirteen high- (eleven females and two males; anxiety scores 45·5 (sd 5·9); BMI 22·9 (sd 3·0) kg/m2) and twelve low- (ten females and two males; anxiety scores 30·4 (sd 4·8); BMI 23·4 (sd 2·5) kg/m2) trait anxiety individuals participated in a placebo-controlled, two-way crossover design. Participants were provided with 40 g α-lactalbumin (LAC; l-tryptophan (Trp):large neutral amino acid (LNAA) ratio of 7·6) and 40 g casein (placebo; Trp:LNAA ratio of 4·0) in the form of a snack and lunch on two test days. On both test days, participants completed a stress-inducing task 2 h after the lunch. Mood and appetite were assessed using visual analogue scales. Changes in food hedonics for different taste and nutrient combinations were assessed using a computer task. The results demonstrated that the LAC manipulation did not exert any immediate effects on mood or appetite. However, LAC did have an effect on food hedonics in individuals with high-trait anxiety after acute stress. These individuals expressed a lower liking (P = 0·012) and SW food preference (P = 0·014) after the stressful task when supplemented with LAC.
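A small illustration of how a Trp:LNAA ratio like those quoted above (7·6 for α-lactalbumin versus 4·0 for casein) is defined: tryptophan content divided by the summed content of the competing large neutral amino acids. The amino acid amounts below are hypothetical placeholders, not the study's assay values, and the ×100 scaling is an assumption made to match the quoted figures.

```python
# Sketch of the Trp:LNAA ratio calculation with illustrative amino acid contents.
def trp_lnaa_ratio(trp_mg: float, lnaa_mg: dict) -> float:
    """Tryptophan divided by the sum of large neutral amino acids (scaled x100 here)."""
    return 100.0 * trp_mg / sum(lnaa_mg.values())

lnaa_content = {  # mg per 40 g protein portion, illustrative only
    "tyrosine": 1900, "phenylalanine": 1700, "leucine": 4200,
    "isoleucine": 2500, "valine": 1900,
}
print(f"Trp:LNAA ratio = {trp_lnaa_ratio(930, lnaa_content):.1f}")
```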
Abstract:
Staphylococcus aureus is a common pathogen that causes a variety of infections including soft tissue infections, impetigo, septicemia, toxic shock and scalded skin syndrome. Traditionally, methicillin-resistant Staphylococcus aureus (MRSA) was considered a hospital-acquired (HA) infection. It is now recognised that the frequency of infections with MRSA is increasing in the community, and that these infections are not originating from hospital environments. A 2007 report by the Centers for Disease Control and Prevention (CDC) stated that Staphylococcus aureus is the most important cause of serious and fatal infections in the USA. Community-acquired MRSA (CA-MRSA) strains are genetically diverse and distinct, meaning they can be identified and tracked by way of genotyping. Genotyping of MRSA using single nucleotide polymorphisms (SNPs) is a rapid and robust method for monitoring the dissemination of MRSA, specifically ST93 (Queensland clone), in the community. It has been shown that a large proportion of CA-MRSA infections in Queensland and New South Wales are caused by ST93. The rationale for this project was that SNP analysis of MLST genes is a rapid and cost-effective method for genotyping and monitoring MRSA dissemination in the community. In this study, 16 different sequence types (STs) were identified, with 41% of isolates identified as ST93, making it the predominant clone. Males and females were infected equally, with an average patient age of 45 years. Phenotypically, all of the ST93 isolates had an identical antimicrobial resistance pattern: they were resistant to the β-lactams penicillin, flu(di)cloxacillin and cephalothin but sensitive to all other antibiotics tested. Virulence factors play an important role in allowing S. aureus to cause disease by way of colonisation, replication and damage to the host. One virulence factor of particular interest is the toxin Panton-Valentine leukocidin (PVL), which is composed of two separate proteins encoded by two adjacent genes. PVL-positive CA-MRSA have been shown to cause recurrent, chronic or severe skin and soft tissue infections. As a result, it is important that PVL-positive CA-MRSA is genotyped and tracked, especially now that CA-MRSA infections are more prevalent than HA-MRSA infections and are deemed endemic in Australia. In this study, 98% of all isolates tested positive for the PVL toxin gene. The study showed that PVL is present in many different community-based STs, not just ST93 isolates, which were all PVL positive. With this toxin becoming entrenched in CA-MRSA, genotyping would provide more accurate data and a way of tracking its dissemination. The PVL gene can be sub-typed using allele-specific real-time PCR (RT-PCR) followed by high-resolution melt analysis. This allows the identification of PVL subtypes within the CA-MRSA population and the tracking of these clones in the community.
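Schematically, SNP-based genotyping of this kind reduces to matching an isolate's alleles at a small set of informative positions against known profiles to call a sequence type such as ST93. The sketch below illustrates only that lookup idea; the SNP positions and profiles are invented placeholders, not the real MLST-derived SNP scheme used in the study.

```python
# Toy sequence-type caller: match an isolate's SNP profile against known profiles.
SNP_PROFILES = {   # hypothetical allele profiles at eight informative SNP positions
    ("A", "G", "T", "C", "A", "G", "T", "T"): "ST93 (Queensland clone)",
    ("G", "G", "C", "C", "A", "A", "T", "C"): "ST30",
    ("A", "A", "T", "G", "G", "G", "C", "T"): "ST1",
}

def call_sequence_type(alleles: tuple) -> str:
    """Return the sequence type matching this SNP profile, or flag it as unresolved."""
    return SNP_PROFILES.get(alleles, "unresolved profile - full MLST required")

isolate = ("A", "G", "T", "C", "A", "G", "T", "T")   # hypothetical isolate
print(call_sequence_type(isolate))
```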
Abstract:
Queensland University of Technology has a long-standing role in providing tertiary education and training in ionising radiation, and the radiological laboratory plays an important part in this education and training. As radiological applications diversify across the fields of health and the environment, the laboratory supports a range of experimental scenarios in radiation detection and radiation protection. This paper discusses the role that a radiological laboratory technician plays in the functionality of a radiological laboratory.
Abstract:
Background: Despite the recognition of obesity in young people as a key health issue, there is limited evidence to inform health professionals regarding the most appropriate treatment options. The Eat Smart study aims to contribute to the knowledge base of effective dietary strategies for the clinical management of the obese adolescent and to examine the cardiometabolic effects of a reduced-carbohydrate diet versus a low-fat diet.
Methods and design: Eat Smart is a randomised controlled trial that aims to recruit 100 adolescents over a 2½-year period. Families will be invited to participate following referral by their health professional who has recommended weight management. Participants will be overweight as defined by a body mass index (BMI) greater than the 90th percentile, using CDC 2000 growth charts. An accredited 6-week psychological life skills program, ‘FRIENDS for Life’, which is designed to provide behaviour change and coping skills, will be undertaken before volunteers are randomised to a group. The intervention arms include a structured reduced-carbohydrate or a structured low-fat dietary program based on an individualised energy prescription. The intervention will involve a series of dietetic appointments over 24 weeks. The control group will commence the dietary program of their choice after a 12-week period. Outcome measures will be assessed at baseline, week 12 and week 24. The primary outcome measure will be change in BMI z-score. A range of secondary outcome measures, including body composition, lipid fractions, inflammatory markers, and social and psychological measures, will also be assessed.
Discussion: The chronic and difficult nature of treating the obese adolescent is increasingly recognised by clinicians and has highlighted the need for research aimed at providing effective intervention strategies, particularly for use in the tertiary setting. A structured reduced-carbohydrate approach may provide a dietary pattern that some families will find more sustainable and effective than the conventional low-fat dietary approach currently advocated. This study aims to investigate the acceptability and effectiveness of a structured reduced-carbohydrate intervention and will compare the outcomes of this approach with those of a structured low-fat eating plan.
Trial Registration: The protocol for this study is registered with the International Clinical Trials Registry (ISRCTN49438757).
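The primary outcome, change in BMI z-score against the CDC 2000 growth reference, relies on the standard LMS transformation z = ((BMI/M)^L - 1)/(L·S). A minimal sketch of that conversion is shown below; the L, M and S values used are illustrative placeholders, not actual CDC table entries, which are published by age and sex.

```python
# LMS-to-z-score conversion sketch for BMI, with placeholder LMS parameters.
from math import log

def bmi_z_score(bmi: float, L: float, M: float, S: float) -> float:
    """Standard LMS transformation: z = ((BMI/M)**L - 1) / (L*S), or log(BMI/M)/S when L == 0."""
    if L == 0:
        return log(bmi / M) / S
    return ((bmi / M) ** L - 1.0) / (L * S)

# Hypothetical LMS parameters for a 14-year-old (illustration only)
z = bmi_z_score(bmi=28.5, L=-2.0, M=19.6, S=0.12)
print(f"BMI z-score = {z:.2f}")
```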