944 results for lab
Abstract:
Many of the undergraduate and postgraduate programs of the former Faculty of Built Environment and Engineering and the Faculty of Science and Technology are changing as a result of the merger of these two large organisations, with some disciplines relocating to the faculties of Creative Industries and Health respectively. The new STEM precinct under construction has begun rising from the proverbial hole in the ground. Existing Surveying and Spatial Sciences programs, assets and staff are being repositioned within the newly formed School of Earth, Environment and Biological Sciences. In 2011, a golden graduates morning tea was organised by QUT Alumni. Technology upgrades to the Mapping Sciences lab benefit 3-D learning experiences. Second- and third-year students are undertaking Work Integrated Learning (WIL) over the summer vacation period. Final-year students recently presented capstone projects at a mini-conference in the Gibson Rooms overlooking a vibrant Southbank and sparkling Brisbane River. The end-of-year graduation ceremony held at QPAC is also discussed.
Abstract:
This project involved the complete refurbishment and extension of a 1980s two-storey domestic brick building, previously used as a Boarding House (Class 3), into Middle School facilities (Class 9b) on a heritage-listed site at Nudgee College secondary school, Brisbane. The building now accommodates 12 technologically advanced classrooms, a computer lab and learning support rooms, a tuckshop, an art room, a mini library/reading/stage area, dedicated work areas for science and large projects with access to water on both floors, staff facilities, and an undercover play area suitable for assemblies and presentations. The project was based on a Reggio Emilia approach, in which the organisation of the physical environment is referred to as the child's third teacher, creating opportunities for complex, varied, sustained and changing relationships between people and ideas. Classrooms open to a communal central piazza and are integrated with the rest of the school, and the school with the surrounding community. In order to achieve this linkage of the building with the overall masterplan of the site, a key strategy of the internal planning was to orientate teaching areas around a well-defined active circulation space that breaks out of the building form to legibly define the new access points to the building and connect up to the pathway network of the campus. The width of the building allowed for classrooms and a generous corridor that has become 'breakout' teaching areas for art, IT, and small group activities. Large sliding glass walls allow teachers to maintain supervision of students across all areas and allow maximum light penetration through small domestic window openings into the deep and low-height spaces. The building was also designed with an effort to uphold cultural characteristics from the Edmund Rice Education Charter (2004).
Coherent planning is accompanied by a quality fit-out, creating a vibrant and memorable environment in which to deliver the upper primary curriculum. Consistent with the Reggio Emilia approach, materials expressive of the school's colours are used in a contemporary, adventurous manner to create panels of colour useful for massing and defining the 'breakout' teaching areas and paths of travel, and storage elements are detailed and arranged to draw attention to their aesthetic features. Modifications were difficult due to the random placement of load-bearing walls, minimum ceiling heights, the general standard of finishes, and new fire and energy requirements; however, the reuse of this building was assessed to be up to 30% cheaper than an equivalent new building. The fit-out integrates information technology and services at a level not usually found in primary school facilities. This has been achieved within the existing building fabric through thoughtful detailing and co-ordination with allied disciplines.
Abstract:
A time-resolved inverse spatially offset Raman spectrometer was constructed for depth profiling of Raman-active substances in both lab and field environments. The system's operating principles and performance are discussed, along with its advantages relative to a traditional continuous-wave spatially offset Raman spectrometer. The developed spectrometer uses a combination of space- and time-resolved detection to obtain high-quality Raman spectra from substances hidden behind coloured opaque surface layers, such as plastics and garments, with a single measurement. The time-gated spatially offset Raman spectrometer was successfully used to detect concealed explosives and drug precursors under incandescent and fluorescent background light as well as under daylight. The average screening time was 50 s per measurement. The excitation energy requirements were relatively low (20 mW), which makes the probe safe for screening hazardous substances. The unit was designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than previous picosecond-based systems, to provide a functional platform for in-line or in-field sensing of chemical substances.
Abstract:
A central topic in economics is the existence of social preferences. Behavioural economics in general has approached the issue from several angles. Controlled experimental settings, surveys, and field experiments are able to show that in a number of economic environments, people usually care about immaterial things such as fairness or the equity of allocations. Findings from experimental economics specifically have led to a large increase in theories addressing social preferences. Most (pro)social phenomena are well understood in experimental settings but very difficult to observe 'in the wild'. One criticism in this regard is that many findings are bound by the artificial environment of the computer lab or the survey method used. A further criticism is that the traditional methods also fail to directly attribute the observed behaviour to the mental constructs that are expected to stand behind it. This thesis will first examine the usefulness of sports data to test social preference models in a field environment, thus overcoming limitations of the lab with regard to applicability to other - non-artificial - environments. The second major contribution of this research establishes a new neuroscientific tool - the measurement of heart rate variability - to observe participants' emotional reactions in a traditional experimental setup.
Abstract:
The research field was curatorship of the Machinima genre - a film-making practice that uses real-time 3D computer graphics engines to create cinematic productions. The context was the presentation of gallery non-specific work for large-scale exhibition, as an investigation in thinking beyond traditional strategies of the white cube. Strongly influenced by Christiane Paul's seminal edited text, 'New Media in the White Cube and Beyond: Curatorial Models for Digital Art', the context was the repositioning of a genre traditionally focussed on delivery through small-screen, indoor, personal spaces to large exhibition hall spaces. Beyond the core questions of collecting, documenting, expanding and rethinking the place of Machinima within the history of contemporary digital arts, the curatorial premise asked how best to invert the relationship between context of media production within the gaming domain, using novel presentational strategies that might best promote the 'take-home' impulse. The exhibition was used not as the ultimate destination for the work but rather as a place to experience, sort and choose from a high volume of possible works for subsequent investigation by audiences within their own game-ready, domestic environments. In pursuit of this core aim, the exhibition intentionally promoted 'sensory overload'. The exhibition also included a gaming lab experience where audiences could begin to learn the DIY concepts of the medium, and be stimulated to revisit, consider and re-make their own relationship to this genre. The research was predominantly practice-led and collaborative (in close concert with the Machinima community), and ethnographic in that it sought to work with, understand and promote the medium in a contemporary art context. This benchmark exhibition, building on the 15-year history of the medium, was warmly received by the global Machinima community, as evidenced by the significant debate, feedback and general interest recorded.
The exhibition has recently begun an ongoing Australian touring schedule. To date, the exhibition has received critical attention nationally and internationally in Das Superpaper, the Courier Mail, Machinimart, 4ZZZ-FM, the Sydney Morning Herald, Games and Business, Australian Gamer, Kotaku Australia, and the Age.
Abstract:
Honing and Ladinig (2008) assert that while the internal validity of web-based studies may be reduced, this is offset by the increase in external validity possible when experimenters can sample a wider range of participants and experimental settings. In this paper, the issue of internal validity is more closely examined, and it is argued that there is no necessary reason why the internal validity of a web-based study should be worse than that of a lab-based one. Errors of measurement or inconsistencies of manipulation will typically balance across the conditions of the experiment, and thus need not necessarily threaten the validity of a study's findings.
Abstract:
This architectural and urban design project was conducted as part of the Brisbane Airport Corporation's master-planning Atelier, run in conjunction with City Lab. This creation and innovation event brought together approximately 80 designers, associated professionals, and both local and state government representatives to research concepts for the future development and planning of the Brisbane airport site. The Team Delta research project explored the development of a new precinct cluster around the existing international terminal building, with a view to reinforcing the sense of place and arrival. The development zone explores options for developing a subtropical character through landscape elements such as open plazas, tourist attractions, links to existing adjacent waterways, and localised rapid transport options. The proposal tests the possibilities of developing a cultural hub in conjunction with transport infrastructure and the airport terminal(s).
Abstract:
In this study, a feasibility study was carried out, on the basis of lab data and resources available in Bangladesh, for a pyrolysis process converting solid tire wastes into pyrolysis oil, solid char and gases. The process considered for detailed analysis was a fixed-bed, fire-tube-heating pyrolysis reactor system. A comparative techno-economic assessment was carried out in US$ for three different plant sizes: medium commercial scale (144 tons/day), small commercial scale (36 tons/day), and pilot scale (3.6 tons/day). The assessment showed that the medium commercial scale plant was economically feasible, with a lower unit production cost than the small commercial and pilot scale plants, for the production of crude pyrolysis oil that could be used as boiler fuel oil and for the production of upgraded liquid products.
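The scale effect behind this kind of unit-cost comparison can be sketched in a few lines. This is an illustrative simplification only: the cost figures, plant lifetime, and straight-line annualisation below are hypothetical assumptions, not values from the study, which would also account for discounting, labour, and maintenance.

```python
def unit_production_cost(capital_cost_usd, annual_operating_cost_usd,
                         annual_output_tons, plant_life_years=20):
    """Simplified levelised unit cost (US$/ton): straight-line annualised
    capital plus yearly operating cost, divided by annual product output.
    Illustrative only; real assessments discount cash flows over time."""
    annualised_capital = capital_cost_usd / plant_life_years
    return (annualised_capital + annual_operating_cost_usd) / annual_output_tons

# Larger plants spread fixed costs over more output (economies of scale);
# throughputs assume roughly 330 operating days/year (hypothetical).
medium_scale = unit_production_cost(2_000_000, 300_000, 144 * 330)  # 144 t/day
pilot_scale = unit_production_cost(150_000, 40_000, 3.6 * 330)      # 3.6 t/day
```

Under these assumed inputs the medium-scale plant's unit cost comes out well below the pilot plant's, mirroring the direction of the result reported above.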
Abstract:
Australia requires decisive action on climate change and issues of sustainability. The Urban Informatics Research Lab has been funded by the Queensland State Government to conduct a three-year study (2009–2011) exploring ways to support Queensland residents in making more sustainable consumer and lifestyle choices. We conduct user-centred design research that informs the development of real-time, mobile, locational, networked information interfaces, feedback mechanisms, and persuasive and motivational approaches that in turn assist in-situ decision making and environmental awareness in everyday settings. The study aims to deliver usable and useful prototypes offering individual and collective visualisations of ecological impact and opportunities for engagement and collaboration, in order to foster a participatory and sustainable culture of life in Australia. Raising people's awareness with environmental data and educational information does not necessarily trigger sufficient motivation to change their habits towards a more environmentally friendly and sustainable lifestyle. Our research seeks to develop a better understanding of how to go beyond merely informing, towards motivating and encouraging action and change. Drawing on participatory culture, ubiquitous computing, and real-time information, the study delivers research that leads to viable new design approaches and information interfaces which will strengthen Australia's position to meet the targets of the Clean Energy Future strategy, and contribute to the sustainability of a low-carbon future in Australia. As part of this program of research, the Urban Informatics Research Lab has been invited to partner with GV Community Energy Pty Ltd on a project funded by the Victorian Government Sustainability Fund. This feasibility report specifically looks at the challenges and opportunities of energy monitoring in Victorian households that include a PV solar installation.
The report is structured into two parts. In Part 1, we first review a range of energy monitoring solutions, both stand-alone and internet-enabled; this section focusses primarily on technical capabilities. However, in order to understand this information and make an informed decision, it is crucial to understand the basic principles and limitations of energy monitoring, as well as the opportunities and challenges of a networked approach towards energy monitoring, which are discussed in Section 2.
Abstract:
When wheels pass over insulated rail joints (IRJs) a vertical impact force is generated. The ability to measure the impact force is valuable as the force signature helps understand the behaviour of the IRJs, in particular their potential for failure. The impact forces are thought to be one of the main factors that cause damage to the IRJ and track components. Study of the deterioration mechanism helps finding new methods to improve the service life of IRJs in track. In this research, the strain-gage-based wheel load detector, for the first time, is employed to measure the wheel–rail contact-impact force at an IRJ in a heavy haul rail line. In this technique, the strain gages are installed within the IRJ assembly without disturbing the structural integrity of IRJ and arranged in a full wheatstone bridge to form a wheel load detector. The instrumented IRJ is first tested and calibrated in the lab and then installed in the field. For comparison purposes, a reference rail section is also instrumented with the same strain gage pattern as the IRJ. In this paper the measurement technique, the process of instrumentation, and tests as well as some typical data obtained from the field and the inferences are presented.
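The relationship between strain and bridge output in a full Wheatstone bridge can be sketched as follows. This is a generic, idealised four-active-gauge model; the gauge factor and excitation voltage are assumed illustrative values, not the parameters of the instrumented IRJ described above.

```python
def full_bridge_output_v(strain, gauge_factor=2.0, excitation_v=5.0):
    """Ideal full Wheatstone bridge with four active strain gages arranged
    so all arms contribute additively: Vout ≈ Vex * GF * strain.
    GF and Vex here are assumed values for illustration."""
    return excitation_v * gauge_factor * strain

def strain_from_output(v_out, gauge_factor=2.0, excitation_v=5.0):
    """Invert the ideal bridge equation to recover strain from a
    measured output voltage (the basis of calibration)."""
    return v_out / (excitation_v * gauge_factor)
```

In practice the detector is calibrated against known wheel loads, so the raw bridge voltage maps to load rather than to strain alone; the sketch shows only the ideal-bridge sensitivity underlying that calibration.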
Abstract:
Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors, grouped by their regulatory role, and corresponding promoter strength.
Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E. coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where the corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as to the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships, and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques, which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
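The spectrum kernel central to the SVM study above can be illustrated with a minimal sketch: the k-spectrum feature map counts every length-k substring (k-mer) of a sequence, and the kernel value is the inner product of two such count vectors. This is a generic textbook formulation over DNA sequences; the parameters and sequences are illustrative, not those used in the thesis.

```python
from collections import Counter
from itertools import product

def spectrum_features(seq, k=3, alphabet="ACGT"):
    """k-spectrum feature map: counts of every possible length-k k-mer,
    in a fixed ordering over all |alphabet|**k k-mers."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    return [counts.get(m, 0) for m in kmers]

def spectrum_kernel(seq_a, seq_b, k=3):
    """Spectrum kernel value: inner product of the two k-mer count vectors.
    Sequences sharing many k-mers score high; disjoint spectra score 0."""
    fa = spectrum_features(seq_a, k)
    fb = spectrum_features(seq_b, k)
    return sum(x * y for x, y in zip(fa, fb))
```

Because the kernel is an explicit inner product, it can be plugged directly into any kernelised SVM; efficient implementations avoid materialising the full |alphabet|^k vector, but the small-k sketch above conveys the idea.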
Abstract:
The making of the modern world has long been fuelled by utopian images that are blind to ecological reality. Botanical gardens are but one example, typically portraying themselves as miniature, isolated 'edens on earth', whereas in many cases they are now self-evidently also the vital 'lungs' of crowded cities, as well as critical habitats for threatened biodiversity. In 2010 the 'Remnant Emergency Art Lab' set out to question utopian thinking through a creative provocation called the 'Botanical Gardens X-Tension': an imagined city-wide, distributed network of 'ecological gardens' suited to both bat and human needs, in order to ask what now needs to be better understood, connected and therefore ultimately conserved.
Abstract:
Introduction: QC and EQA are integral to good pathology laboratory practice. Medical Laboratory Science students undertake a project exploring internal QC and EQA procedures used in chemical pathology laboratories. Each student represents an individual lab, and the class group represents a peer group of labs performing the same assay using the same method. Methods: Using a manual BCG assay for serum albumin, normal and abnormal controls are run with a patient sample over 7 weeks. The QC results are assessed each week using calculated z-scores and both 2S & 3S control rules to determine whether a run is 'in control'. At the end of the 7 weeks, a completed LJ chart is assessed using the Westgard Multirules. Students investigate causes of error and the implications for both lab practice and patient care if runs are not 'in control'. Twice in the 7 weeks, two EQA samples (with target values unknown) are assayed alongside the weekly QC and patient samples. Results from each student are collated and form the basis of an EQA program. ALP are provided and students complete a Youden Plot, which is used to analyse the performance of each 'lab' and the method to identify bias. Students explore possible clinical implications of a biased method and address the actions that should be taken if a lab is not in consensus with the peer group. Conclusion: This project is a model of 'real world' practice in which students demonstrate an understanding of the importance of QC procedures in a pathology laboratory, apply and interpret statistics and QC rules and charts, apply critical thinking and analytical skills to quality performance data to make recommendations for further practice, and improve their technical competence and confidence.
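The weekly z-score check with 2S and 3S control limits described above can be sketched in a few lines. The control target mean and SD used in the example are hypothetical, not the assay's actual targets; the sketch also shows only the single-run 2S/3S rules, not the full Westgard Multirules applied to the completed LJ chart.

```python
def z_score(value, target_mean, target_sd):
    """Standardised deviation of a control result from its target."""
    return (value - target_mean) / target_sd

def run_in_control(z, warn_sd=2.0, reject_sd=3.0):
    """Single-run check: |z| > 3 rejects the run (3S rule); 2 < |z| <= 3
    is a warning only (2S rule), so the run is still accepted."""
    if abs(z) > reject_sd:
        return "reject"
    if abs(z) > warn_sd:
        return "warning"
    return "in control"

# Hypothetical albumin control: target 40 g/L, SD 1.5 g/L.
status = run_in_control(z_score(44.8, 40.0, 1.5))  # z = 3.2 -> run rejected
```

Extending this to the multirule chart means tracking the sequence of weekly z-scores (e.g. two consecutive results beyond the same 2S limit), which is the pattern-based analysis the students perform on the LJ chart.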
Abstract:
Background: Extracorporeal membrane oxygenation (ECMO) is a complex rescue therapy used to provide cardiac and/or respiratory support for critically ill patients who have failed maximal conventional medical management. ECMO is based on a modified cardiopulmonary bypass (CPB) circuit, and can provide cardiopulmonary support for up to several months. It can be used in a veno-venous configuration (VV-ECMO) for isolated respiratory failure, or in a veno-arterial configuration (VA-ECMO) where support is necessary for cardiac +/- respiratory failure. The ECMO circuit consists of five main components: large-bore cannulae (access cannulae) for drainage of the venous system, and return cannulae to either the venous (in VV-ECMO) or arterial (in VA-ECMO) system; an oxygenator, with a vast surface area of hollow filaments, which allows addition of oxygen and removal of carbon dioxide; a centrifugal blood pump, which propels blood through the circuit at up to 10 L/minute; a control module; and a thermoregulatory unit, which allows exact temperature control of the extracorporeal blood. Methods: The first successful use of ECMO for ARDS in adults occurred in 1972, and its use has become more commonplace over the last 30 years, supported by improvements in the design and biocompatibility of the equipment, which have reduced the morbidity associated with this modality. Whilst the use of ECMO in the neonatal population has been supported by numerous studies, the evidence upon which ECMO was integrated into adult practice was substantially less robust. Results: Recent data, including the CESAR study (Conventional Ventilatory Support versus Extracorporeal Membrane Oxygenation for Severe Adult Respiratory Failure), have added a degree of evidence to the role of ECMO in this patient population. The CESAR study analysed 180 patients, and confirmed that ECMO was associated with an improved rate of survival.
More recently, ECMO has been utilised in numerous situations within the critical care arena, including support for high-risk percutaneous interventions in the cardiac catheter lab, in the operating room and emergency department, as well as in specialised inter-hospital retrieval services. The increased understanding of the risk:benefit profile of ECMO, along with a reduction in the morbidity associated with its use, will doubtless lead to a substantial rise in the utilisation of this modality. As with all extracorporeal circuits, ECMO opposes the basic premises of the mammalian inflammation and coagulation cascades: when blood comes into contact with a foreign circuit, both cascades are activated. Anticoagulation is readily dealt with through the use of agents such as heparin, but the inflammatory excess, whilst less macroscopically obvious, continues unabated. Platelet consumption and neutrophil activation occur rapidly, and the clinician is faced with balancing the need for anticoagulation of the circuit against haemostasis in an acutely bleeding patient. Alterations in pharmacokinetics may result in inadequate levels of disease-modifying therapeutics, such as antibiotics, hence paradoxically delaying recovery from conditions such as pneumonia. Key elements of nutrition and the innate immune system may similarly be affected. Summary: This presentation will discuss the basic features of ECMO for the non-specialist, and review the clinical conundrum faced by the team treating these most complex cases.
Abstract:
The Community Service-learning Lab (the Lab) was initiated as a university-wide service-learning experience at an Australian university. The Lab engages students, academics, and key community organisations in interdisciplinary action research projects to support student learning and to explore complex, ongoing problems nominated by the community partners. The current study uses feedback from the first offering of the Lab and focuses on exploring student experiences of the service-learning project using an action research framework. Student reflections on this experience have revealed some positive outcomes of the Lab, such as an appreciation for positive and strengths-based change. These outcomes are corroborated by reflections collected from community partners and academics. The students also identified challenges in balancing the requirements for assessment with their goals to serve the community partners' needs. This feedback has provided vital information for the academic team, highlighting the difficulties in balancing the agenda of the academic framework with the desire to give students authentic experiences.