296 results for Scientists reflexivity
Abstract:
This article examines the moment of exchange between artist, audience and culture in Live Art. Drawing on historical and contemporary examples, including examples from the Exist in 08 Live Art Event in Brisbane, Australia, in October 2008, it argues that Live Art - be it body art, activist art, site-specific performance, or other sorts of performative intervention in the public sphere - is characterised by a common set of claims about activating audiences, asking them to reflect on the cultural norms challenged in the work. Live Art presents risky actions, in a context that blurs the boundaries between art and reality, to position audients as ‘witnesses’ who are personally implicated in, and responsible for, the actions unfolding before them. This article problematises assumptions about the way the uncertainties embedded in the Live Art encounter contribute to its deconstructive agenda. It uses the ethical theory of Emmanuel Levinas, Hans-Thies Lehmann and Dwight Conquergood to examine the mechanics of reductive, culturally-recuperative readings that can limit the efficacy of the Live Art encounter. It argues that, though ‘witnessing’ in Live Art depends on a relation to the real - real people, taking real risks, in real places - if it fails to foreground the theatrical frame, it is difficult for audients to develop the dual consciousness of the content, and their complicity in that content, that is the starting point for reflexivity, and response-ability, in the ethical encounter.
Abstract:
It is more than 20 years since the “Social Control of the Drink Driver”, edited by Laurence, Snortum and Zimring (1988), was published. It was, and remains, a major examination of the issue, involving 17 scientists from all relevant disciplines and policy centres, and it represented current practice and experience at the time. While much of, though by no means all, the content is centred on the North American experience, the scholarship and range of research data explored through the investigative lens of lawyers, pharmacologists, psychologists, sociologists, criminologists and economists covers all the major issues being examined in Europe and Australia at the time. More importantly, it presents the policy aspirations and goals of nine countries and includes a comparison of deterrence and the legal context in six countries, emerging technologies for control, and the potential contributions of education and rehabilitation. Promoting evidence-based policies and practices is generally experienced in all countries as laborious and painfully slow. However, this ICADTS meeting in Norway provides an opportunity to challenge these feelings by re-examining the current situation compared with that documented over 20 years ago. This presentation will undertake a reality check on just what we have achieved within that time and try to turn our successes and failures into recommendations for our future endeavours.
Abstract:
Ocean processes are dynamic and complex events that occur on multiple spatial and temporal scales. To obtain a synoptic view of such events, ocean scientists focus on the collection of long-term time series data sets. Generally, these time series measurements are provided continually, in real or near-real time, by fixed sensors, e.g., buoys and moorings. In recent years, increased utilization of mobile sensor platforms, e.g., Autonomous Underwater Vehicles (AUVs), has enabled dynamic acquisition of time series data sets. However, these mobile assets are not utilized to their full capabilities, generally performing only repeated transects or user-defined patrolling loops. Here, we provide an extension to repeated patrolling of a designated area. Our algorithms provide the ability to adapt a standard mission to increase information gain in areas of greater scientific interest. By implementing a velocity control optimization along the predefined path, we are able to increase or decrease spatiotemporal sampling resolution to satisfy the sampling requirements necessary to properly resolve an oceanic phenomenon. We present a path planning algorithm that defines a sampling path optimized for repeatability. This is followed by the derivation of a velocity controller that defines how the vehicle traverses the given path. The application of these tools is motivated by an ongoing research effort to understand the oceanic region off the coast of Los Angeles, California. The computed paths are implemented, with the computed velocities, on autonomous vehicles for data collection during sea trials. Results from this data collection are presented and compared for analysis of the proposed technique.
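To make the velocity-control idea concrete, the following is a minimal Python sketch (not the authors' algorithm) of how segment speeds along a fixed patrol path could be scaled inversely to a scientific-interest score and then fitted to a mission time budget; the function name plan_segment_speeds, the interest function and all numeric limits are illustrative assumptions.

```python
import numpy as np

def plan_segment_speeds(waypoints, interest, v_min=0.5, v_max=1.5, t_budget=3600.0):
    """Illustrative velocity controller: slow down where scientific interest is
    high (denser sampling), speed up elsewhere, then rescale so the traversal
    roughly fits the mission time budget. Names and units are assumptions.

    waypoints : (N, 2) array of path vertices [m]
    interest  : callable mapping a segment midpoint (x, y) -> interest score
    """
    waypoints = np.asarray(waypoints, dtype=float)
    seg = np.diff(waypoints, axis=0)
    lengths = np.linalg.norm(seg, axis=1)            # segment lengths [m]
    mids = 0.5 * (waypoints[:-1] + waypoints[1:])    # segment midpoints

    scores = np.array([interest(m) for m in mids])
    scores = (scores - scores.min()) / (np.ptp(scores) + 1e-9)  # normalise to [0, 1]

    # High interest -> low speed (finer spatiotemporal resolution).
    speeds = v_max - scores * (v_max - v_min)

    # Rescale speeds (within vehicle limits) so total traversal time approaches the budget.
    t_total = np.sum(lengths / speeds)
    speeds = np.clip(speeds * (t_total / t_budget), v_min, v_max)
    return speeds

# Example: a square patrol loop with a hypothetical hotspot near (50, 50).
loop = [(0, 0), (100, 0), (100, 100), (0, 100), (0, 0)]
hotspot = lambda p: np.exp(-np.sum((p - np.array([50.0, 50.0]))**2) / (2 * 30.0**2))
print(plan_segment_speeds(loop, hotspot))
```

The key design choice in this sketch is that the path itself is left unchanged, so repeatability is preserved, and only the traversal speed is modulated to redistribute samples.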
Abstract:
In recent years, ocean scientists have started to employ many new forms of technology as integral pieces in oceanographic data collection for the study and prediction of complex and dynamic ocean phenomena. One area of technological advancement in ocean sampling is the use of Autonomous Underwater Vehicles (AUVs) as mobile sensor platforms. Currently, most AUV deployments execute a lawnmower-type pattern or repeated transects for surveys and sampling missions. An advantage of these missions is that the regularity of the trajectory design generally makes it easier to extract the exact path of the vehicle via post-processing. However, if the deployment region for the pattern is poorly selected, the AUV can entirely miss collecting data during an event of specific interest. Here, we consider an innovative technology toolchain to assist in determining the deployment location and executed paths for AUVs to maximize scientific information gain about dynamically evolving ocean phenomena. In particular, we provide an assessment of computed paths, based on ocean model predictions, designed to put AUVs in the right place at the right time to gather data related to the understanding of algal and phytoplankton blooms.
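As a rough illustration of using model predictions to choose where to deploy, the sketch below (an assumption for illustration, not the paper's toolchain) picks the grid cell of a forecast field with the highest mean predicted value over the planned mission window; the array layout, variable names and function name are hypothetical.

```python
import numpy as np

def best_deployment_cell(forecast, t_start, t_end):
    """Illustrative deployment-site selection, assuming `forecast` is an
    ocean-model prediction indexed as [time, lat, lon] (e.g. surface
    chlorophyll). Returns the grid cell with the highest mean predicted value
    over the mission window, i.e. the "right place at the right time".
    """
    window = forecast[t_start:t_end]            # slice of model time steps
    mean_field = window.mean(axis=0)            # time-averaged prediction
    iy, ix = np.unravel_index(np.argmax(mean_field), mean_field.shape)
    return iy, ix, mean_field[iy, ix]

# Example with a synthetic 48-step forecast on a 20 x 30 grid.
rng = np.random.default_rng(0)
forecast = rng.random((48, 20, 30))
print(best_deployment_cell(forecast, t_start=12, t_end=36))
```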
Abstract:
Autonomous Underwater Vehicles (AUVs) are revolutionizing oceanography through their versatility, autonomy and endurance. However, they are still an underutilized technology. For coastal operations, the ability to track a certain feature is of interest to ocean scientists. Adaptive and predictive path planning requires frequent communication with significant data transfer. Currently, most AUVs rely on satellite phones as their primary means of communication. This communication protocol is expensive and slow. To reduce communication costs and provide adequate data transfer rates, we present a hardware modification along with a software system that provides an alternative, robust, disruption-tolerant communications framework enabling cost-effective glider operation in coastal regions. The framework is specifically designed to address multi-sensor deployments. We provide a system overview and present testing and coverage data for the network. Additionally, we include an application of ocean-model-driven trajectory design, which can benefit from the use of this network and communication system. Simulation and implementation results are presented for single and multiple vehicle deployments. The presented combination of infrastructure, software development and deployment experience brings us closer to the goal of providing a reliable and cost-effective data transfer framework to enable real-time, optimal trajectory design, based on ocean model predictions, to gather in situ measurements of interesting and evolving ocean features and phenomena.
Abstract:
The screenplay, “Perfect Blood” (Frank and Stein), is the first two-hour episode of the two-part television miniseries Frank and Stein. This creative work is a science fiction story that speculates on the future of Western nations in a world where petroleum is scarce. A major theme explored in the miniseries is the tension between the advantages and dangers of scientific progress pursued without regard to human consequences. “Perfect Blood” (Frank and Stein) was written as part of my personal creative journey, which has been the transformation from research scientist to creative writer. In the exegetical component of this thesis, I propose that a key challenge for any scientist writing science fiction is the shift from conducting empirical research in a laboratory-based situation to engaging in creative practice research. During my personal creative journey, I found that a predominant difficulty in conducting research within a creative practice-led paradigm was unleashing my creativity and personal viewpoint, practices that are frowned upon in scientific research. The aim of the exegesis is to demonstrate that the transformative process from science to art is not neat and well-structured. My personal creative journey was fraught with many ‘wrong’ turns. However, after reflecting on the experience, I realise that every varied piece of research that I undertook allowed me to progress to the next stage, the next draft of Frank and Stein. And via the disorder of the creative process, a screenplay finally emerged that was both structured and creative, which are equally essential elements in screenwriting.
Abstract:
Scientists need to transfer semantically similar queries across multiple heterogeneous linked datasets. These queries may require data from different locations and the results are not simple to combine due to differences between datasets. A query model was developed to make it simple to distribute queries across different datasets using RDF as the result format. The query model, based on the concept of publicly recognised namespaces for parts of each scientific dataset, was implemented with a configuration that includes a large number of current biological and chemical datasets. The configuration is flexible, providing the ability to transparently use both private and public datasets in any query. A prototype implementation of the model was used to resolve queries for the Bio2RDF website, including both Bio2RDF datasets and other datasets that do not follow the Bio2RDF URI conventions.
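A minimal sketch of the namespace-routing idea described above might look as follows; the prefixes, endpoints, URI convention and SPARQL template are illustrative assumptions rather than the actual Bio2RDF configuration.

```python
# Minimal sketch of namespace-based query routing, in the spirit of the model
# described above. All endpoints and templates below are illustrative assumptions.

NAMESPACE_ENDPOINTS = {
    "geneid":  ["http://example.org/sparql/ncbi-gene"],
    "uniprot": ["http://example.org/sparql/uniprot",
                "http://private.example.org/sparql/uniprot-mirror"],  # public + private
}

QUERY_TEMPLATE = """CONSTRUCT {{ <{uri}> ?p ?o }} WHERE {{ <{uri}> ?p ?o }}"""

def plan_query(identifier):
    """Map an identifier such as 'uniprot:P12345' to the SPARQL queries that
    should be sent to each dataset registered for its namespace; the RDF
    results can then be merged into a single graph."""
    namespace, local_id = identifier.split(":", 1)
    uri = f"http://example.org/{namespace}:{local_id}"   # assumed URI convention
    return [(endpoint, QUERY_TEMPLATE.format(uri=uri))
            for endpoint in NAMESPACE_ENDPOINTS.get(namespace, [])]

for endpoint, query in plan_query("uniprot:P12345"):
    print(endpoint, "->", query)
```

Because the results are RDF, graphs returned by different endpoints can simply be merged, which is what makes this kind of transparent distribution across public and private datasets workable.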
Abstract:
If Australian scientists are to fully and actively participate in international scientific collaborations utilising online technologies, policies and laws must support the data access and reuse objectives of these projects. To date Australia lacks a comprehensive policy and regulatory framework for environmental information and data generally. Instead there exists a series of unconnected Acts that adopt historically-based, sector-specific approaches to the collection, use and reuse of environmental information. This paper sets out the findings of an analysis of a representative sample of Australian statutes relating to environmental management and protection to determine the extent to which they meet best practice criteria for access to and reuse of environmental information established in international initiatives. It identifies issues that need to be addressed in the legislation governing environmental information to ensure that Australian scientists are able to fully engage in international research collaborations.
Abstract:
Cell-based therapies, as they apply to tissue engineering and regenerative medicine, require cells capable of self-renewal and differentiation, and a prerequisite is the ability to prepare an effective dose of ex vivo expanded cells for autologous transplants. The in vivo identification of a source of physiologically relevant cell types suitable for cell therapies therefore figures as an integral part of tissue engineering. Stem cells serve as a reserve for biological repair, having the potential to differentiate into a number of specialised cell types within the body; they therefore represent the most useful candidates for cell-based therapies. The primary goal of stem cell research is to produce cells that are both patient-specific and have properties suited to the specific conditions they are intended to remedy. From a purely scientific perspective, stem cells allow scientists to gain a deeper understanding of developmental biology and regenerative therapies. Stem cells have acquired a number of uses in regenerative medicine, immunotherapy and gene therapy, but it is in the area of tissue engineering that they generate most excitement, primarily as a result of their capacity for self-renewal and pluripotency. A unique feature of stem cells is their ability to maintain an uncommitted quiescent state in vivo and then, once triggered by conditions such as disease, injury or natural wear and tear, serve as a reservoir and natural support system to replenish lost cells. Although these cells retain the plasticity to differentiate into various tissues, controlling this differentiation process is still one of the biggest challenges facing stem cell research. In an effort to harness the potential of these cells, a number of studies have been conducted using both embryonic/foetal and adult stem cells. The use of embryonic stem cells (ESCs) has been hampered by strong ethical and political concerns, despite their perceived versatility due to their pluripotency. Ethical issues aside, other concerns raised with ESCs relate to the possibility of tumorigenesis, immune rejection and complications with immunosuppressive therapies, all of which add layers of complication to the application of ESCs in research and have led to the search for alternative sources of stem cells. The adult tissues of higher organisms harbour cells, termed adult stem cells, which are reminiscent of unprogrammed stem cells. A number of sources of adult stem cells have been described. Bone marrow is by far the most accessible source of two potent populations of adult stem cells, namely haematopoietic stem cells (HSCs) and bone marrow mesenchymal stem cells (BMSCs). Autologously harvested adult stem cells can, in contrast to embryonic stem cells, readily be used in autografts, since immune rejection is not an issue, and their use in scientific research has not attracted the ethical concerns raised by embryonic stem cells. The major limitation to their use, however, is that adult stem cells are exceedingly rare in most tissues, which makes identifying and isolating these cells problematic; bone marrow is perhaps the only notable exception. Unlike the case of HSCs, there are as yet no rigorous criteria for characterizing MSCs. Changing perceptions of the pluripotency of MSCs in recent studies have expanded their potential applications; however, the underlying molecular pathways which impart the features distinctive to MSCs remain elusive.
Furthermore, the sparse in vivo distribution of these cells imposes a clear limitation on their study in vitro. Also, when MSCs are cultured in vitro, there is a loss of the in vivo microenvironment, resulting in a progressive decline in proliferation potential and multipotentiality. This is further exacerbated with increased passage numbers in culture, characterized by the onset of senescence-related changes. As a consequence, it is necessary to establish protocols for generating large numbers of MSCs without affecting their differentiation potential. MSCs are capable of differentiating into mesenchymal tissue lineages, including bone, cartilage, fat, tendon, muscle, and marrow stroma. Recent findings indicate that adult bone marrow may also contain cells that can differentiate into the mature, nonhematopoietic cells of a number of tissues, including cells of the liver, kidney, lung, skin, gastrointestinal tract, and myocytes of heart and skeletal muscle. MSCs can readily be expanded in vitro, can be genetically modified by viral vectors, and can be induced to differentiate into specific cell lineages by changing the microenvironment, properties which make these cells ideal vehicles for cellular gene therapy. MSCs can also exert profound immunosuppressive effects via modulation of both cellular and innate immune pathways, and this property allows them to overcome the issue of immune rejection. Despite the many attractive features associated with MSCs, there are still many hurdles to overcome before these cells are readily available for use in clinical applications. The main concern relates to the in vivo characterization and identification of MSCs. The lack of a universal biomarker, sparse in vivo distribution, and a steady age-related decline in their numbers make it necessary to decipher the reprogramming pathways and critical molecular players which govern the characteristics unique to MSCs. This book presents a comprehensive insight into the biology of adult stem cells and their utility in current regeneration therapies. The adult stem cell populations reviewed in this book include bone marrow-derived MSCs, adipose-derived stem cells (ASCs), umbilical cord blood stem cells, and placental stem cells. Features such as MSC circulation and trafficking, neuroprotective properties, and the nurturing roles and differentiation potential of multiple lineages are discussed in detail. In terms of therapeutic applications, the strengths of MSCs are presented, and their roles in treatments for conditions such as osteoarthritis and Huntington’s disease, in periodontal regeneration, and in pancreatic islet transplantation are discussed. An analysis comparing osteoblast differentiation of umbilical cord blood stem cells and MSCs is reviewed, as is a comparison of human placental stem cells and ASCs in terms of isolation, identification and the therapeutic applications of ASCs in bone, cartilage and myocardial regeneration. It is my sincere hope that this book will update the reader on the research progress of MSC biology and the potential use of these cells in clinical applications. It will be the best reward for all contributors to this book if their efforts herein in some way help readers in their study, research, and career development.
Abstract:
The purpose of this work is to validate and automate the use of DYNJAWS, a new component module (CM) in the BEAMnrc Monte Carlo (MC) user code. The DYNJAWS CM simulates dynamic wedges and can be used in three modes: dynamic, step-and-shoot and static. The step-and-shoot and dynamic modes require an additional input file defining the positions of the jaw that constitutes the dynamic wedge at regular intervals during its motion. A method for automating the generation of the input file is presented, which will allow for more efficient use of the DYNJAWS CM. Wedged profiles have been measured and simulated for 6 and 10 MV photons at three field sizes (5 cm x 5 cm, 10 cm x 10 cm and 20 cm x 20 cm) and four wedge angles (15, 30, 45 and 60 degrees), at dmax and at 10 cm depth. Results of this study show agreement between the measured and MC profiles to within 3% of absolute dose or 3 mm distance to agreement for all wedge angles, at both energies and depths. The gamma analysis suggests that the dynamic mode is more accurate than the step-and-shoot mode. The DYNJAWS CM is an important addition to the BEAMnrc code and will enable MC verification of patient treatments involving dynamic wedges.
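As an illustration of how generation of the jaw-position file could be automated, the Python sketch below writes control points for a linearly sweeping jaw at regular fractional-dose intervals; the column layout, the linear sweep and the end gap are simplifying assumptions for illustration, not the actual BEAMnrc/DYNJAWS input format or a clinical segmented treatment table.

```python
def write_wedge_control_points(path, field_size_cm=10.0, n_points=20, gap_cm=0.5):
    """Illustrative generator of a dynamic-wedge control-point file: the moving
    jaw sweeps linearly from the fully open position towards the fixed jaw,
    leaving a small gap, as the fractional dose index runs from 0 to 1.
    The file layout and linear sweep are assumptions, not the DYNJAWS format.
    """
    half = field_size_cm / 2.0
    with open(path, "w") as f:
        f.write(f"{n_points}\n")                       # number of control points
        for i in range(n_points):
            frac = i / (n_points - 1)                  # fractional dose index, 0..1
            moving_jaw = half - frac * (field_size_cm - gap_cm)  # [cm]
            fixed_jaw = -half                          # stationary jaw position [cm]
            f.write(f"{frac:.4f} {fixed_jaw:.3f} {moving_jaw:.3f}\n")

write_wedge_control_points("wedge_control_points.txt")
```

Automating this step is what makes it practical to regenerate the input file for every combination of field size, wedge angle and number of control points studied.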
Abstract:
We all live in a yellow submarine… When I go to work in the morning, in the office building that hosts our BPM research group, on the way up to our level I come by this big breakout room that hosts a number of computer scientists, working away at the next generation software algorithms and iPad applications (I assume). I have never actually been in that room, but every now and then the door is left ajar for a while and I can spot couches, lots (I mean, lots!) of monitors, the odd scientist, a number of Lara Croft posters, and the usual room equipment you’d probably expect from computer scientists (and, no, it’s not like that evil Dennis guy from the Jurassic Park movie, buried in chips, coke, and flickering code screens… It’s also not like the command room from the Nebuchadnezzar, Neo’s hovercraft in the Matrix movies, although I still strongly believe these green lines of code make a good screensaver).
Abstract:
In a recent journal article, Luke Jaaniste and I identified an emergent model of exegesis. From a content analysis of submitted exegeses within a local archive, we identified an approach that is quite different from the traditional thesis, but is also distinct from previously identified forms of exegesis, which Milech and Schilo have described as a ‘context model’ (which assumes the voice of academic objectivity and provides an historical or theoretical context for the creative practice) and a ‘commentary model’ (which takes the form of a first-person reflection on the challenges, insights and achievements of the practice). The model we identified combines these dichotomous forms and assumes a dual orientation: looking outwards to the established field of research, exemplars and theories, and inwards to the methodologies, processes and outcomes of the practice. We went on to argue that this ‘connective’ exegesis offers clear benefits to the researcher in connecting the practice to an established field while allowing the researcher to demonstrate how the methods have led to outcomes that advance the field in some way. And, while it helps the candidate to articulate objective claims for research innovation, it enables them to retain a voiced, personal relationship with their practice. However, it also poses considerable complexities and challenges in the writing. It requires a reconciliation of multi-perspectival subject positions: the disinterested perspective and academic objectivity of an observer/ethnographer/analyst/theorist at times, and the invested perspective of the practitioner/producer at others. The author must also contend with a range of writing styles, speech genres and voices: from the formal, polemical voice of the theorist to the personal, questioning and sometimes emotive voice of reflexivity. Moreover, the connective exegesis requires the researcher to synthesize various perspectives, subject positions, writing styles, and voices into a unified and coherent text. In this paper I consider strategies for writing a hybrid, connective exegesis. I first ground the discussion of polyvocality and alternate textual structures through reference to recent discussions in philosophy and critical theory, and point to examples of emergent approaches to texts and practices in related fields. I then return to the collection of archived exegeses to investigate the strategies that postgraduate candidates have adopted to resolve the problems that arise from a polyvocal, connective exegesis.
Abstract:
In this feasibility study, an organic plastic scintillator is calibrated against ionisation chamber measurements and then embedded in a polymer gel dosimeter to obtain a quasi-4D experimental measurement of a radiation field. This hybrid dosimeter was irradiated with a linear accelerator, with temporal measurements of the dose rate acquired by the scintillator and spatial measurements acquired with the gel dosimeter. The detectors employed in this work are radiologically equivalent, and we show that neither detector perturbs the intensity of the radiation field of the other. By employing these detectors in concert, spatial and temporal variations in radiation intensity can now be detected, and gel dosimeters can be calibrated for absolute dose from a single irradiation.
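The calibration chain can be illustrated with a short sketch, assuming simple linear detector responses: the scintillator is first calibrated against the ionisation chamber, and its reading during the gel irradiation then scales the gel's relative distribution to absolute dose. All variable names and the linear model are assumptions for illustration, not the study's actual procedure.

```python
import numpy as np

def calibrate_hybrid(scint_signal, chamber_dose, gel_relative, gel_at_scint, scint_reading):
    """Illustrative two-step calibration chain:
    (1) fit scintillator signal against ionisation-chamber dose (linear model),
    (2) scale the gel dosimeter's relative distribution to absolute dose using
        the scintillator-derived dose at the scintillator's own location.
    """
    # Step 1: linear scintillator calibration (dose = a * signal + b).
    a, b = np.polyfit(scint_signal, chamber_dose, 1)
    dose_at_scint = a * scint_reading + b          # absolute dose [Gy] at the scintillator

    # Step 2: absolute calibration of the gel from a single irradiation.
    return gel_relative * (dose_at_scint / gel_at_scint)

# Example with synthetic numbers.
signal = np.array([0.1, 0.2, 0.4, 0.8])
dose = np.array([0.5, 1.0, 2.0, 4.0])
gel = np.array([[0.2, 0.5], [0.9, 1.0]])           # normalised gel readings
print(calibrate_hybrid(signal, dose, gel, gel_at_scint=0.9, scint_reading=0.6))
```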
Abstract:
Technology-oriented young firms play an important role in the innovation and commercialisation of new ideas. These firms are often founded by engineers, scientists or academics who possess great scientific/technological knowledge but limited know-how in other aspects of managing a business, including knowledge management. Successfully managing and integrating their specialised knowledge is of particular importance when it comes to developing a new product or process. This article therefore focuses on the particularities of the knowledge management process in technopreneurial firms. Using a qualitative investigation of a sample of Australian SMEs, a number of key observations are derived which show the challenges of managing knowledge and how important knowledge management is as a management tool for the R&D and innovation process in technology-oriented SMEs. Findings suggest that knowledge management and integration processes in these firms are very much project-focused, mainly based on ad hoc and informal processes, and not embedded within the overall organisational routines.
Abstract:
Bioinformatics is dominated by online databases and sophisticated web-accessible tools. As such, it is ideally placed to benefit from the rapid, purpose-specific combination of services achievable via web mashups. The recent introduction of a number of sophisticated frameworks has greatly simplified the mashup creation process, making mashups accessible to scientists with limited programming expertise. In this paper we investigate the feasibility of mashups as a new approach to bioinformatic experimentation, focusing on an exploratory niche between interactive web usage and robust workflows, and attempting to identify the range of computations for which mashups may be employed. While we treat each of the major frameworks, we illustrate the ideas with a series of examples developed under the Popfly framework.