543 results for Intercolonial Exhibition (1866-1867 : Melbourne, Vic.)


Relevance: 100.00%

Publisher:

Abstract:

As ambient computing blends into the fabric of the modern urban environment, developing a positive interplay between people, places, and technology to create enlivened, interactive cities becomes a necessary priority in how we imagine, understand, design, and develop cities. Designing technology for art, culture, and gastronomic experiences that are rich in community can provide the means for collaborative action to (re)create cities that are lively and engaging and that promote a sense of well-being as well as belonging.


Abstraction, in its resistance to evident meaning, has the capacity to interrupt, or at least provide tools with which to question, an overly compliant reception of the information to which we are subject. It does so by highlighting a latency or potentiality inherent in materiality that points to the possibility of a critical resistance to this ceaseless flow of sound/image/data. This resistance has been remarked on in differing ways by a number of commentators, such as Lyotard in his exploration of the avant-garde and the sublime. This joint paper will initially map the collaborative project by Daniel Mafe and Andrew Brown, Affecting Interference, which conjoins painting with digital sound and animations into a single, large-scale, immersive exhibition/installation. The work acts as an interstitial point between contrasting approaches to abstraction: the visual and the aural, the digital and the analogue. The paper will then explore the ramifications of this through an examination of abstraction as ‘noise’, that is, as that raw, inassimilable materiality within which lies the creative possibility to forge and embrace the as-yet-unthought and almost-forgotten. It does so by establishing a space for a more poetic and slower-paced critical engagement with the viewing and receiving of information or data. This slowing of perception through the suspension of easy recognition runs counter to our current ‘high performance’ culture and its requisite demand for speedy assimilation of content, representing instead a poetic encounter with a potentiality or latency inherent in the nameless particularity of that which is.


This presentation discusses topics and issues that connect closely with the Conference Themes and themes in the ARACY Report Card. For example, developing models of public space that are safe, welcoming and relevant to children and young people will impact on their overall wellbeing and may help to prevent many of the tensions occurring in Australia and elsewhere around the world. This area is the subject of ongoing international debate, research and policy formation, relevant to concerns in the ARACY Report Card about children and young people’s health and safety, participation, behaviours and risks, and peer and family relationships.


The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself and the place of the event have the potential to create their own stories. Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling. [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia


The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself and the place of the event have the potential to create their own stories. The 2014 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning making that the participants engage in about the event. We continued the data collection commenced in 2012: "Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling."
[excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia


Introduction: Natural product provenance is important in the food, beverage and pharmaceutical industries, for consumer confidence and for its health implications. Raman spectroscopy has powerful molecular fingerprinting abilities, and the sharp peaks of Surface Enhanced Raman Spectroscopy (SERS) allow distinction between minimally different molecules, so it should be suitable for this purpose.

Methods: Naturally caffeinated beverages containing Guarana extract and coffee, with Red Bull energy drink as a synthetic caffeinated beverage for comparison (20 µL each), were reacted 1:1 for 10 minutes with gold nanoparticles functionalised with anti-caffeine antibody (ab15221), air dried and analysed in a micro-Raman instrument. The spectral data were processed using Principal Component Analysis (PCA).

Results: The PCA showed that Guarana-sourced caffeine varied significantly from synthetic caffeine (Red Bull) on component 1 (containing 76.4% of the variance in the data); see Figure 1. The coffee-containing beverages, and in particular Robert Timms (instant coffee), were very similar on component 1, but the barista espresso showed minor variance on component 1. Both coffee-sourced caffeine samples varied from Red Bull on component 2 (20% of the variance).

Figure 1: PCA comparing a naturally caffeinated beverage containing Guarana with coffee.

Discussion: PCA is an unsupervised multivariate statistical method that determines patterns within data. Figure 1 shows that caffeine in Guarana is notably different to synthetic caffeine. Other researchers have revealed that caffeine in Guarana plants is complexed with tannins. In Figure 1, naturally sourced or lightly processed caffeine (Monster Energy, espresso) differs more from synthetic (Red Bull) or highly processed (Robert Timms) caffeine, which is consistent with this finding and demonstrates this technique’s applicability.

Guarana provenance is important because Guarana is still largely hand produced and demand is escalating with recognition of its benefits. This could be a powerful technique for Guarana provenance, and it may extend to other industries where provenance or authentication is required, e.g. the wine or natural pharmaceutical industries.
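The PCA step above can be sketched as follows. This is a minimal illustration on synthetic data, using an SVD-based PCA in place of whatever analysis software the study actually used; the sample count, spectral length and values are all assumptions.

```python
import numpy as np

# Rows are SERS spectra (one per beverage sample), columns are intensities
# at each Raman shift. Synthetic data stand in for measured spectra.
rng = np.random.default_rng(0)
n_samples, n_shifts = 12, 500
spectra = rng.normal(size=(n_samples, n_shifts))

# PCA via singular value decomposition of the mean-centred data matrix.
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

scores = centred @ Vt[:2].T          # sample coordinates on PC1/PC2,
                                     # one point per sample as in Figure 1
explained = S**2 / np.sum(S**2)      # variance fraction per component

print(scores.shape)
print(explained[:2])
```

Plotting the two columns of `scores` against each other gives the kind of scatter plot described for Figure 1, with samples separating along component 1 when their spectra differ most.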


Aim: A recent Monte Carlo based study has shown that it is possible to design a diode that measures small field output factors equivalent to those in water. This is accomplished by placing an appropriately sized air gap above the silicon chip (1), with experimental results subsequently confirming that a particular Monte Carlo design was accurate (2). The aim of this work was to test whether a new correction-less diode could be designed using an entirely experimental methodology.

Method: All measurements were performed on a Varian iX at a depth of 5 cm, SSD of 95 cm and field sizes of 5, 6, 8, 10, 20 and 30 mm. Firstly, the experimental transfer of the correction factor k(Qclin,Qmsr) from a commonly used diode detector (IBA stereotactic field diode, SFD) to another diode detector (Sun Nuclear unshielded diode, EDGEe) was tested. These results were compared to Monte Carlo calculated values for the EDGEe. Secondly, the air gap above the EDGEe silicon chip was optimised empirically. Nine different air gap “tops” were placed above the EDGEe (air depth = 0.3, 0.6, 0.9 mm; air width = 3.06, 4.59, 6.13 mm). The sensitivity of the EDGEe was plotted as a function of air gap thickness for the field sizes measured.

Results: The transfer of k(Qclin,Qmsr) from the SFD to the EDGEe was correct to within the simulation and measurement uncertainties. The EDGEe detector can be made “correction-less” for field sizes of 5 and 6 mm, but was ~2% from being “correction-less” at field sizes of 8 and 10 mm.

Conclusion: Different materials will perturb small fields in different ways. A detector is only “correction-less” if all these perturbations happen to cancel out. Designing a “correction-less” diode is a complicated process, so it is reasonable to expect that Monte Carlo simulations should play an important role.
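The experimental transfer of k(Qclin,Qmsr) between detectors rests on the field output factor being detector-independent: OF = (Mclin/Mmsr)_SFD x k_SFD = (Mclin/Mmsr)_EDGEe x k_EDGEe. A minimal sketch of that arithmetic follows; all numerical values here are assumed for illustration and are not measured values from this work.

```python
def transfer_k(k_ref, m_ratio_ref, m_ratio_new):
    """Transfer k(Qclin,Qmsr) from a reference detector to a new detector.

    Since the field output factor must be the same whichever detector
    measured it:  k_new = k_ref * m_ratio_ref / m_ratio_new.
    """
    return k_ref * m_ratio_ref / m_ratio_new

# Hypothetical values for a single small field (e.g. 5 mm):
k_sfd = 0.962          # assumed correction factor for the reference SFD
m_ratio_sfd = 0.684    # assumed SFD reading ratio Mclin/Mmsr
m_ratio_edgee = 0.700  # assumed EDGEe reading ratio Mclin/Mmsr

k_edgee = transfer_k(k_sfd, m_ratio_sfd, m_ratio_edgee)
print(round(k_edgee, 3))
```

A detector is "correction-less" for a field size when its transferred k(Qclin,Qmsr) is unity within uncertainty, which is the condition the air-gap optimisation above is chasing.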


Aim: A new method of penumbral analysis is implemented which allows an unambiguous determination of field size, penumbra size and penumbra quality for small fields and other non-standard fields. Both source occlusion and lateral electronic disequilibrium will affect the size and shape of cross-axis profile penumbrae; each is examined in detail.

Method: A new method of penumbral analysis is implemented in which the square of the derivative of the cross-axis profile is plotted. The resultant graph displays two peaks in place of the two penumbrae. This allows a strong visualisation of the quality of a field penumbra, as well as a mathematically consistent method of determining field size (the distance between the two peaks’ maxima) and penumbra (the full width at tenth maximum of each peak). Cross-axis profiles were simulated in a water phantom at a depth of 5 cm using Monte Carlo modelling, for field sizes between 5 and 30 mm. The field size and penumbra size of each field were calculated using the method above, as well as the traditional definitions set out in IEC 976. The effects of source occlusion and lateral electronic disequilibrium on the penumbrae were isolated by repeating the simulations with electron transport removed and with an electron spot size of 0 mm, respectively.

Results: All field sizes calculated using the traditional and proposed methods agreed within 0.2 mm. The penumbra size measured using the proposed method was systematically 1.8 mm larger than that from the traditional method at all field sizes. The size of the source had a larger effect on the size of the penumbra than did lateral electronic disequilibrium, particularly at very small field sizes.

Conclusion: Traditional methods of calculating field size and penumbra proved mathematically adequate for small fields. However, the field size definition proposed in this study would be more robust for other non-standard fields, such as flattening-filter-free beams. Source occlusion plays a bigger role than lateral electronic disequilibrium in small field penumbra size.
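The squared-derivative method can be sketched as follows, applied to a synthetic 10 mm cross-axis profile built from tanh functions. The analytic profile is an illustrative stand-in, not the Monte Carlo data used in the study; grid spacing, field width and penumbra slope are assumptions.

```python
import numpy as np

x = np.linspace(-15.0, 15.0, 3001)      # off-axis position (mm)
half_width, s = 5.0, 1.0                # nominal 10 mm field, penumbra slope
profile = 0.5 * (np.tanh((x + half_width) / s) - np.tanh((x - half_width) / s))

# Square of the derivative of the profile: one peak per penumbra.
d2 = np.gradient(profile, x) ** 2

# Field size: distance between the maxima of the two peaks.
n_left = np.count_nonzero(x < 0)
i_left = np.argmax(d2[:n_left])
i_right = n_left + np.argmax(d2[n_left:])
field_size = x[i_right] - x[i_left]

# Penumbra: full width at tenth maximum (FWTM) of one peak.
peak = d2[:n_left]
above = np.where(peak >= peak.max() / 10.0)[0]
penumbra = x[above[-1]] - x[above[0]]

print(round(field_size, 2), round(penumbra, 2))
```

For this symmetric analytic profile the recovered field size is 10 mm, and the FWTM-based penumbra scales with the slope parameter `s`, which is what makes the two peak widths a direct visual indicator of penumbra quality.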


Aim: The assessment of treatment plans is an important component in the education of radiation therapists. The establishment of a grade for a plan is currently based on subjective assessment of a range of criteria. The automation of assessment could provide a number of advantages, including faster feedback, reduced chance of human error, and simpler aggregation of past results.

Method: A collection of treatments planned by a cohort of 27 second-year radiation therapy students was selected for quantitative evaluation. Treatment sites included the bladder, cervix, larynx, parotid and prostate, although only the larynx plans had been assessed in detail. The plans were designed with the Pinnacle system and exported using the DICOM framework. Assessment criteria included beam arrangement optimisation, volume contouring, target dose coverage and homogeneity, and organ-at-risk sparing. The in-house Treatment and Dose Assessor (TADA) software was evaluated for suitability in assisting with the quantitative assessment of these plans. Dose-volume data were exported in per-student and per-structure data tables, along with beam complexity metrics, dose-volume histograms, and reports on naming conventions.

Results: The treatment plans were exported and processed using TADA, with the processing of all 27 plans for each treatment site taking less than two minutes. Naming conventions were successfully checked against a reference protocol. Significant variations between student plans were found. Correlation with assessment feedback was established for the larynx plans.

Conclusion: The data generated could be used to inform the selection of future assessment criteria, monitor student development, and provide useful feedback to the students. The provision of objective, quantitative evaluations of plan quality would be a valuable addition not only to radiotherapy education programmes but also to staff development and potentially credentialing methods. New functionality within TADA developed for this work could also be applied clinically, for example to evaluate protocol compliance.
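One of the quantitative outputs described above, the dose-volume histogram, can be sketched as a cumulative DVH computed from a dose grid and a boolean structure mask. This is an illustrative stand-in for that step, not TADA's actual implementation; the grid, mask and function name are assumptions.

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Cumulative DVH: percent of structure volume receiving at least
    each dose level, from 0 up to the structure's maximum dose."""
    doses = dose[mask]
    levels = np.linspace(0.0, doses.max(), bins)
    volume = np.asarray([(doses >= d).mean() * 100.0 for d in levels])
    return levels, volume

# Synthetic 20x20x20 dose grid (Gy) and a cubic "target" structure mask.
rng = np.random.default_rng(1)
dose_grid = rng.uniform(0.0, 70.0, size=(20, 20, 20))
target = np.zeros_like(dose_grid, dtype=bool)
target[5:15, 5:15, 5:15] = True

levels, vol = cumulative_dvh(dose_grid, target)
print(vol[0])   # 100.0 -- every voxel receives at least 0 Gy
```

Comparing such curves per student and per structure against a reference plan is one way the kind of dose coverage and organ-at-risk criteria listed above can be scored objectively.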


Many infrastructure agencies adopt sustainability objectives at a corporate level and incorporate sustainability targets and indicators as part of corporate reporting processes. These objectives are expected to translate to all stages of the project delivery process, including project selection. For infrastructure capital works projects and programs, a robust project management approach involves the development of a business case to guide investment decision making. A key tool in the assessment of project options and selection of a delivery strategy is Cost Benefit Analysis (CBA). Infrastructure providers are required to undertake cost benefit analysis to support project selection through regulatory approval and budgetary processes. This tool has emerged through the prism of economic analysis rather than sustainability. A literature review reveals the limitations of CBA alone to effectively evaluate economic, environmental and social externalities or impacts that apply over a long time frame, and that are ultimately irreversible. Multi-Criteria Analysis (MCA) has been introduced as a means to incorporate a wider array of factors into decision making such as sustainability. This, however, presents new challenges with issues around how to transparently represent wider community values in the selection of a preferred solution. Are these tools effective in assessing the wider sustainability costs and benefits taking into account that these are public works with long life spans and significant impacts across institutional boundaries? The research indicates a need to develop clear guidelines for investment decision making in order to better align with corporate sustainability objectives. 
Findings from the literature review indicate that a more sustainable investment decision-making framework should include: the incorporation of sustainability goals from corporate planning documents; problem definition and option generation using best-practice investment management guidelines; improved guidelines for business case development using a combination of Cost Benefit Analysis and Multi-Criteria Analysis; and an integrated public participation process.
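A weighted-sum Multi-Criteria Analysis of the kind discussed above can be sketched as follows. The criteria, weights, option names and scores are all hypothetical examples, not values from any real business case.

```python
# Criterion weights (summing to 1.0) chosen by the decision-making body.
criteria_weights = {"economic": 0.4, "environmental": 0.3, "social": 0.3}

# Hypothetical project options scored 0-10 against each criterion.
options = {
    "Option A (new corridor)": {"economic": 8, "environmental": 3, "social": 5},
    "Option B (upgrade)":      {"economic": 6, "environmental": 7, "social": 6},
    "Option C (demand mgmt)":  {"economic": 4, "environmental": 9, "social": 7},
}

def mca_score(scores, weights):
    """Weighted sum of criterion scores."""
    return sum(weights[c] * s for c, s in scores.items())

ranked = sorted(options,
                key=lambda o: mca_score(options[o], criteria_weights),
                reverse=True)
for name in ranked:
    print(name, round(mca_score(options[name], criteria_weights), 2))
```

The transparency challenge noted above shows up directly in this sketch: the ranking can flip entirely with a different choice of weights, which is why the weights themselves need to represent wider community values.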


Aim: In 2013 QUT introduced the Medical Imaging Training Immersive Environment (MITIE), a virtual reality (VR) platform that allows students to practise general radiography. The system software has now been expanded to include the C-Arm. The aim of this project was to investigate the use of this technology in the pedagogy of undergraduate medical imaging students who have limited or no experience in the use of the C-Arm clinically.

Method: The MITIE application provides students with realistic and fully interactive 3D models of C-Arm equipment. As with VR initiatives in other health disciplines (1, 2), the software mimics clinical practice as much as possible and uses 3D technology to enhance 3D spatial awareness and realism. The application allows students to set up and expose a virtual patient in a 3D environment, as well as creating the resultant “image” for comparison with a gold standard. Automated feedback highlights ways for the student to improve their patient positioning, equipment setup or exposure factors. The students' equipment knowledge was tested using an online assessment quiz, and surveys provided information on the students' pre-clinical confidence, with post-clinical data comparisons. Ethical approval for the project was provided by the university ethics panel.

Results: This study is currently under way, and this paper will present analysis of initial student feedback relating to the perceived value of the application for confidence in a high-risk environment (i.e. the operating theatre) and related clinical skills development. Further in-depth evaluation is ongoing, with full results to be presented.

Conclusion: MITIE C-Arm has a role to play in pre-clinical skills training for Medical Radiation Science students. It will augment their theoretical understanding prior to their clinical experience.

References
1. Bridge P, Appleyard R, Ward J, Phillips R, Beavis A. The development and evaluation of a virtual radiotherapy treatment machine using an immersive visualisation environment. Computers and Education 2007; 49(2): 481–494.
2. Gunn T, Berry C, Bridge P et al. 3D Virtual Radiography: Development and Initial Feedback. Paper presented at the 10th Annual Scientific Meeting of Medical Imaging and Radiation Therapy, March 2013, Hobart, Tasmania.


Background: From conservative estimates of registrants with the National Diabetes Services Scheme, the number of Australians affected by all types of diabetes will soon pass 1.1 million. The diabetes complications of foot ulceration and amputation are costly to all. These costs can be reduced with appropriate prevention strategies, starting with identifying people at risk through primary care diabetic foot screening. However, levels of diabetic foot screening in Australia are difficult to quantify.

Methods: This presentation reports on foot screening rates as recorded in the academic literature, national health surveys and national database reports. The focus is on type 1 and type 2 diabetes in adults, not gestational diabetes or diabetes in children. Literature searches covered diabetic foot screening in the primary care setting for populations over 2000 people, from 2002 to 2014. Searches were performed using Medline and CINAHL, as well as internet searches of OECD health databases. The primary outcome measure was the foot screening rate as a percentage of the adult diabetic population.

Results: The lack of a national diabetes database and register hampers efforts to analyse diabetic foot screening levels. The most recent and accurate Australian population review was the AusDiab (Australian Diabetes, Obesity and Lifestyle) survey from 2004, which reported screening in primary care to be as low as 50%. Countries such as the United Kingdom and the United States of America report much higher rates of foot screening (67-88%), using national databases and web-based initiatives that involve patients and clinicians.

Conclusions: Australian rates of diabetic foot screening in primary care centres remain ambiguous. Uptake of national registers, incentives and web-based systems improves levels of diabetic foot assessment, which is the first step to a healthier diabetic population.


In an ever-changing and globalised world, higher education needs to adapt and evolve its models of learning and teaching. The old industrial model has lost traction, and new patterns of creative engagement are required. These new models potentially increase relevance and better equip students for the future. Although creativity is recognised as an attribute that can contribute much to the development of these pedagogies, and is valued by universities as a graduate capability, some educators understandably struggle to translate this vision into practice. This paper reports on selected survey findings from a mixed methods research project which aimed to shed light on how creativity can be designed for in higher education learning and teaching settings. A social constructivist epistemology underpinned the research, and data were gathered using survey and case study methods. Descriptive statistical methods and informed grounded theory were employed for the analysis reported here. The findings confirm that creativity is valued for its contribution to the development of students’ academic work, employment opportunities and life in general; however, tensions arise between individual educators’ creative pedagogical goals and the provision of institutional support for implementing those goals. Designing for creativity becomes, paradoxically, a matter of navigating and limiting complexity and uncertainty, while simultaneously designing for those same states or qualities.


Since 2006, we have been conducting urban informatics research that we define as “the study, design, and practice of urban experiences across different urban contexts that are created by new opportunities of real-time, ubiquitous technology and the augmentation that mediates the physical and digital layers of people networks and urban infrastructures” [1]. Various new research initiatives under the label “urban informatics” have since been started by universities (e.g., NYU’s Center for Urban Science and Progress) and industry (e.g., Arup, McKinsey) worldwide. Yet many of these initiatives are limited to what Townsend calls “data-driven approaches to urban improvement” [2]. One of the key challenges is that aggregated data, in any quantity, does not easily translate into quality insights to better understand cities. In this talk, I will raise questions about the purpose of urban informatics research beyond data, and show examples of media architecture, participatory city making, and citizen activism. I argue for (1) broadening the disciplinary foundations that urban science approaches draw on; (2) maintaining a hybrid perspective that considers both the bird’s eye view and the citizen’s view; and (3) employing design research not merely to understand, but to bring about actionable knowledge that will drive change for good.


The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself and the place of the event have the potential to create their own stories. The 2015 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning making that the participants engage in about the event. We are continuing the data collection commenced in 2012: "Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling."
[excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia