28 results for Los Alamos Scientific Laboratory. Theoretical Division.
Abstract:
This paper examines the way psychologists and others in the helping professions can deal with stressors in their lives and still work effectively. Three questions will be asked. First, "What are the essential ingredients of an environment that supports psychologists going through personal stressors?" Second, "What are the personal characteristics and strategies that give resilience to a professional during this period?" And third, "How does the stressor or grieving process influence a psychologist's therapy?" The whole will be fitted into a visual framework and the interaction of the three main variables (client, therapist and stressor) will be explored.
Abstract:
Information literacy has been a significant issue in the library community for many years. It is now being recognised as an important issue by the higher education community. This theoretical framework draws together important elements of the information literacy agenda specifically for tertiary educators and administrators. The framework examines three areas of primary concern: the possible outcomes of information literacy education (through outlining the characteristics of information literate people); the nature of information literacy education; and the potential role of stakeholders (including information services, faculty, staff developers and learning counsellors) in helping staff and students to be information literate. This theoretical framework forms part of the Griffith University Information Literacy Blueprint. The Blueprint was designed between June and August of 1994. The project, a quality initiative of the Division of Information Services, was led by Janice Rickards, University Librarian.
Abstract:
Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including the measurement of dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs, and setting acceptable uncertainties on OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for various square field sizes of side length from 4 mm to 100 mm, using a nominal photon energy of 6 MV. Results: According to the practical definition established in this project, field sizes < 15 mm were considered to be very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or field size uncertainties are 0.5 mm, field sizes < 12 mm were considered to be very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. Thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes < 12 mm. Source occlusion also caused a large change in OPF for field sizes < 8 mm.
Based on the results of this study, field sizes < 12 mm were considered to be theoretically very small for 6 MV beams. Conclusions: Extremely careful experimental methodology, including the measurement of dosimetric field size at the same time as the output factor measurement for each field size setting, and also very precise detector alignment, is required at field sizes at least < 12 mm and more conservatively < 15 mm for 6 MV beams. These recommendations should be applied in addition to all the usual considerations for small field dosimetry, including careful detector selection.
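The practical definition above lends itself to a simple numerical test: a field size counts as "very small" when a ±1 mm field-size error shifts the OPF by more than the 1% tolerance. A minimal sketch with a hypothetical OPF curve (illustrative values only; the real curve must come from measurement or Monte Carlo simulation):

```python
import numpy as np

# Hypothetical OPF vs. square field side length in mm (illustrative only)
field_mm = np.array([4, 6, 8, 10, 12, 15, 20, 30, 50, 100], dtype=float)
opf = np.array([0.55, 0.70, 0.80, 0.86, 0.90, 0.93, 0.95, 0.97, 0.99, 1.00])

def is_very_small(f, df=1.0, tol=0.01):
    """Practical test: does a +/- df mm field-size error change OPF by > tol?"""
    o = np.interp(f, field_mm, opf)
    o_lo = np.interp(f - df, field_mm, opf)
    o_hi = np.interp(f + df, field_mm, opf)
    return bool(max(abs(o_hi - o), abs(o - o_lo)) / o > tol)

# Largest field size (scanning downwards) that still fails the tolerance
threshold = next(f for f in range(100, 3, -1) if is_very_small(float(f)))
```

With this hypothetical curve the scan yields a threshold of 15 mm; with a measured 6 MV dataset the same procedure would reproduce the paper's practical definition.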
Abstract:
Measuring Earth material behaviour on time scales of millions of years transcends our current capability in the laboratory. We review an alternative path considering multiscale and multiphysics approaches with quantitative structure-property relationships. This approach allows a sound basis to incorporate physical principles such as chemistry, thermodynamics, diffusion and geometry-energy relations into simulations and data assimilation on the vast range of length and time scales encountered in the Earth. We identify key length scales for Earth systems processes and find a substantial scale separation between chemical, hydrous and thermal diffusion. We propose that this allows a simplified two-scale analysis where the outputs from the micro-scale model can be used as inputs for meso-scale simulations, which in turn become the micro-model for the next scale up. We present two fundamental theoretical approaches to link the scales through asymptotic homogenisation from a macroscopic thermodynamic view and percolation renormalisation from a microscopic, statistical mechanics view.
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth, and from femtoseconds to the 4.5 Ga history of our planet. We neglect in this review electromagnetic modelling of the processes in the Earth’s core, and focus on four types of couplings that underpin fundamental instabilities in the Earth. These are thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes which are driven and controlled by the transfer of heat to the Earth’s surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional necessity to consider microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation).
Another challenge is to consider the important factor of time, which implies that the geomaterial is often very far from initial yield and flowing on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. This framework allows, unlike consistency plasticity, the description of both solid mechanical and fluid dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, where ductile compaction bands appear out of the fluid dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales and they are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while in turn the centre of the thrust shows a mm-cm wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid dynamic instability.
Abstract:
The basic reproduction number of a pathogen, R0, determines whether a pathogen will spread (R0 > 1) when introduced into a fully susceptible population, or fade out (R0 < 1) because infected hosts do not, on average, replace themselves. In this paper we develop a simple mechanistic model for the basic reproduction number for a group of tick-borne pathogens that wholly, or almost wholly, depend on horizontal transmission to and from vertebrate hosts. This group includes the causative agent of Lyme disease, Borrelia burgdorferi, and the causative agent of human babesiosis, Babesia microti, for which transmission between co-feeding ticks and vertical transmission from adult female ticks are both negligible. The model has only 19 parameters, all of which have a clear biological interpretation and can be estimated from laboratory or field data. The model takes into account the transmission efficiency from the vertebrate host as a function of the days since infection, in part because of the potential for this dynamic to interact with tick phenology, which is also included in the model. This sets the model apart from previous, similar models for R0 for tick-borne pathogens. We then define parameter ranges for the 19 parameters using estimates from the literature, as well as laboratory and field data, and perform a global sensitivity analysis of the model. This enables us to rank the importance of the parameters in terms of their contribution to the observed variation in R0. We conclude that the transmission efficiency from the vertebrate host to Ixodes scapularis ticks, the survival rate of Ixodes scapularis from fed larva to feeding nymph, and the fraction of nymphs finding a competent host are the most influential factors for R0. This contrasts with other vector-borne pathogens, where it is usually the abundance of the vector or host, or the vector-to-host ratio, that determines conditions for emergence.
These results are a step towards a better understanding of the geographical expansion of currently emerging horizontally transmitted tick-borne pathogens such as Babesia microti, as well as providing a firmer scientific basis for targeted use of acaricide or the application of wildlife vaccines that are currently in development.
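The global sensitivity analysis described above (sample the parameter ranges, compute R0 for each sample, rank parameters by how strongly they correlate with the output) can be sketched generically. The toy model and parameter ranges below are illustrative stand-ins, not the paper's 19-parameter mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical stand-ins for three of the influential parameters named in
# the abstract; the ranges are illustrative, not taken from the paper.
params = {
    "host_to_tick_efficiency": rng.uniform(0.1, 0.9, n),
    "larva_to_nymph_survival": rng.uniform(0.05, 0.5, n),
    "nymph_host_finding":      rng.uniform(0.01, 0.2, n),
}

# Toy R0: a simple product model (scaled), NOT the paper's mechanistic model.
r0 = (params["host_to_tick_efficiency"]
      * params["larva_to_nymph_survival"]
      * params["nymph_host_finding"] * 400.0)

def spearman(x, y):
    """Spearman rank correlation via Pearson correlation of the ranks."""
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

# Rank parameters by the absolute strength of their association with R0
ranking = sorted(params, key=lambda k: -abs(spearman(params[k], r0)))
```

In the actual study the ranking would be driven by the realistic parameter ranges; here it merely illustrates the Monte Carlo sampling plus rank-correlation workflow.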
Abstract:
Three anion isomers of formula C7H- have been synthesised in the mass spectrometer by unequivocal routes. The structures of the isomers are [HCCC(C2)2]-, C6CH- and C2CHC4-. One of these, [HCCC(C2)2]-, is formed in sufficient yield to allow it to be charge stripped to the corresponding neutral radical.
Abstract:
In this paper, a novel 2×2 multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) testbed based on an Analog Devices AD9361 highly integrated radio frequency (RF) agile transceiver was specifically implemented for the purpose of estimating and analyzing MIMO-OFDM channel capacity in vehicle-to-infrastructure (V2I) environments using the 920 MHz industrial, scientific, and medical (ISM) band. We implemented two-dimensional discrete cosine transform-based filtering to reduce the channel estimation errors and show its effectiveness on our measurement results. We have also analyzed the effects of channel estimation error on the MIMO channel capacity by simulation. Three different scenarios of subcarrier spacing were investigated, corresponding to the IEEE 802.11p, Long-Term Evolution (LTE), and Digital Video Broadcasting Terrestrial (DVB-T) 2k standards. An extensive MIMO-OFDM V2I channel measurement campaign was performed in a suburban environment. Analysis of the measured MIMO channel capacity results as a function of the transmitter-to-receiver (TX-RX) separation distance up to 250 m shows that the variance of the MIMO channel capacity is larger for the near-range line-of-sight (LOS) scenarios than for the long-range non-LOS cases, using a fixed receiver signal-to-noise ratio (SNR) criterion. We observed that the largest capacity values were achieved at LOS propagation despite the common assumption of a degenerated MIMO channel in LOS. We consider that this is due to the large angular spacing between MIMO subchannels which occurs when the receiver vehicle rooftop antennas pass by the fixed transmitter antennas at close range, causing MIMO subchannels to be orthogonal. In addition, analysis on the effects of different subcarrier spacings on MIMO-OFDM channel capacity showed negligible differences in mean channel capacity for the subcarrier spacing range investigated. Measured channels described in this paper are available on request.
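The capacity quantity analysed above is conventionally the log-det expression C = log2 det(I + (SNR/Nt) H Hᴴ) for an Nt×Nr channel with equal power allocation; the abstract's observation that orthogonal LOS subchannels maximise capacity can be illustrated with hypothetical 2×2 channel matrices (the testbed's estimated channels are not reproduced here):

```python
import numpy as np

def mimo_capacity(H, snr_linear):
    """Capacity in bits/s/Hz of one channel realisation H (Nr x Nt),
    equal power split across transmit antennas."""
    nr, nt = H.shape
    m = np.eye(nr) + (snr_linear / nt) * H @ H.conj().T
    return float(np.log2(np.linalg.det(m).real))

snr = 10 ** (10 / 10)  # 10 dB receiver SNR

# Orthogonal LOS-like 2x2 channel vs. a fully correlated (rank-1) one
H_orth = np.eye(2)
H_keyhole = np.array([[1.0, 1.0], [1.0, 1.0]])

c_orth = mimo_capacity(H_orth, snr)        # two parallel subchannels
c_keyhole = mimo_capacity(H_keyhole, snr)  # effectively a single subchannel
```

The orthogonal channel yields the higher capacity, matching the paper's finding that close-range LOS passes, where subchannels decorrelate angularly, outperform the degenerate correlated-LOS case.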
Abstract:
The microbial mediated production of nitrous oxide (N2O) and its reduction to dinitrogen (N2) via denitrification represents a loss of nitrogen (N) from fertilised agro-ecosystems to the atmosphere. Although denitrification has received great interest from biogeochemists in recent decades, the magnitude of N2 losses and related N2:N2O ratios from soils are still largely unknown due to methodological constraints. We present a novel 15N tracer approach, based on a previously developed tracer method to study denitrification in pure bacterial cultures, which was modified for use on soil incubations in a completely automated laboratory set-up. The method replaces the background air in the incubation vessels with a helium-oxygen gas mixture with a 50-fold reduced N2 background (2% v/v). This method allows for a direct and sensitive quantification of the N2 and N2O emissions from the soil with isotope-ratio mass spectrometry after 15N labelling of denitrification N substrates, and minimises the sensitivity to the intrusion of atmospheric N2 at the same time. The incubation set-up was used to determine the influence of different soil moisture levels on N2 and N2O emissions from a sub-tropical pasture soil in Queensland, Australia. The soil was labelled with an equivalent of 50 μg N per gram dry soil by broadcast application of KNO3 solution (4 at.% 15N) and incubated for 3 days at 80% and 100% water-filled pore space (WFPS), respectively. The headspace of the incubation vessel was sampled automatically over 12 hrs each day, and 3 samples (0, 6, and 12 hrs after incubation start) of headspace gas were analysed for N2 and N2O with an isotope-ratio mass spectrometer (DELTA V Plus, Thermo Fisher Scientific, Bremen, Germany). In addition, the soil was analysed for 15N NO3- and NH4+ using the 15N diffusion method, which enabled us to obtain a complete N balance.
The method proved to be highly sensitive for N2 and N2O emissions, detecting N2O emissions ranging from 20 to 627 μg N kg-1 soil hr-1 and N2 emissions ranging from 4.2 to 43 μg N kg-1 soil hr-1 for the different treatments. The main end-product of denitrification was N2O for both water contents, with N2 accounting for 9% and 13% of the total denitrification losses at 80% and 100% WFPS, respectively. Between 95% and 100% of the added 15N fertiliser could be recovered. Gross nitrification over the 3 days amounted to 8.6 μg N g-1 soil and 4.7 μg N g-1 soil, and denitrification to 4.1 μg N g-1 soil and 11.8 μg N g-1 soil, at 80% and 100% WFPS, respectively. The results confirm that the tested method allows for a direct and highly sensitive detection of N2 and N2O fluxes from soils and hence offers a sensitive tool to study denitrification and N turnover in terrestrial agro-ecosystems.
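The N2 share of total denitrification losses quoted above (9% and 13%) is simply the N2 flux over the combined N2 + N2O flux; a one-line sketch with hypothetical fluxes (the study's raw flux pairs are not reproduced in the abstract):

```python
def n2_fraction(n2_flux, n2o_flux):
    """Fraction of total denitrification losses (N2 + N2O) emitted as N2.
    Both fluxes must be in the same units, e.g. ug N kg-1 soil hr-1."""
    return n2_flux / (n2_flux + n2o_flux)

# Hypothetical flux pairs chosen to reproduce the reported shares
share_80 = n2_fraction(9.0, 91.0)    # 0.09 -> 9% at 80% WFPS
share_100 = n2_fraction(13.0, 87.0)  # 0.13 -> 13% at 100% WFPS
```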
Abstract:
Deterrence-based initiatives form a cornerstone of many road safety countermeasures. This approach is informed by Classical Deterrence Theory, which proposes that individuals will be deterred from committing offences if they fear the perceived consequences of the act, especially the perceived certainty, severity and swiftness of sanctions. While deterrence-based countermeasures have proven effective in reducing a range of illegal driving behaviours known to cause crashes, such as speeding and drink driving, the exact level of exposure required, and how the process works, remain unknown. As a result, the current study involved a systematic review of the literature to identify theoretical advancements within deterrence theory that have informed evidence-based practice. Studies that reported on perceptual deterrence between 1950 and June 2015 were searched in electronic databases including PsychINFO and ScienceDirect, both within road safety and non-road safety fields. This review indicated that scientific efforts to understand deterrence processes for road safety were most intense during the 1970s and 1980s. This era produced competing theories that postulated that both legal and non-legal factors can influence offending behaviours. Since this time, little theoretical progression has been made in the road safety arena, apart from Stafford and Warr's (1993) reconceptualisation of deterrence, which illuminated the important issue of punishment avoidance. In contrast, the broader field of criminology has continued to advance theoretical knowledge by investigating a range of individual difference-based factors proposed to influence deterrent processes, including moral inhibition, social bonding, self-control and tendencies to discount the future. However, this scientific knowledge has not been directed towards identifying how to best utilise deterrence mechanisms to improve road safety.
This paper will highlight the implications of this lack of progression and provide direction for future research.
Abstract:
Perceiving students, science students especially, as mere consumers of facts and information belies the need to engage them with the principles underlying those facts, and is counter-productive to the facilitation of knowledge and understanding. Traditional didactic lecture approaches need a re-think if student classroom engagement and active learning are to be valued over fact memorisation and fact recall. In our undergraduate biomedical science programs across Years 1, 2 and 3 in the Faculty of Health at QUT, we have developed an authentic learning model with an embedded suite of pedagogical strategies that foster classroom engagement and allow for active learning in the sub-discipline area of medical bacteriology. The suite of pedagogical tools we have developed has been designed to enable its translation, with appropriate fine-tuning, to most biomedical and allied health discipline teaching and learning contexts. Indeed, aspects of the pedagogy have been successfully translated to the nursing microbiology study stream at QUT. The aims underpinning the pedagogy are for our students to: (1) connect scientific theory with scientific practice in a more direct and authentic way, (2) construct factual knowledge and facilitate a deeper understanding, and (3) develop and refine their higher order flexible thinking and problem solving skills, both semi-independently and independently. The mindset and role of the teaching staff are critical to this approach, since for the strategy to be successful tertiary teachers need to abandon traditional instructional modalities based on one-way information delivery. Face-to-face classroom interactions between students and lecturer enable realisation of pedagogical aims (1), (2) and (3). The strategy we have adopted encourages teachers to view themselves more as expert guides in what is very much a student-focused process of scientific exploration and learning.
Specific pedagogical strategies embedded in the authentic learning model we have developed include: (i) interactive lecture-tutorial hybrids or lectorials featuring teacher role-plays as well as class-level question-and-answer sessions, (ii) inclusion of “dry” laboratory activities during lectorials to prepare students for the wet laboratory to follow, (iii) real-world problem-solving exercises conducted during both lectorials and wet laboratory sessions, and (iv) designing class activities and formative assessments that probe a student’s higher order flexible thinking skills. Flexible thinking in this context encompasses analytical, critical, deductive, scientific and professional thinking modes. The strategic approach outlined above is designed to provide multiple opportunities for students to apply principles flexibly according to a given situation or context, to adapt methods of inquiry strategically, to go beyond mechanical application of formulaic approaches, and, as much as possible, to self-appraise their own thinking and problem solving. The pedagogical tools have been developed within both workplace (real-world) and theoretical frameworks. The philosophical core of the pedagogy is a coherent pathway of teaching and learning which we, and many of our students, believe is more conducive to student engagement and active learning in the classroom. Qualitative and quantitative data derived from online and hardcopy evaluations, solicited and unsolicited student and graduate feedback, anecdotal evidence as well as peer review indicate that: (i) our students are engaging with the pedagogy, (ii) a constructivist, authentic-learning approach promotes active learning, and (iii) students are better prepared for workplace transition.
Abstract:
It has been said that we are living in a golden age of innovation. New products, systems and services aimed at enabling a better future have emerged from novel interconnections between design and design research and science, technology and the arts. These intersections are now, more than ever, catalysts that enrich daily activities for health and safety, education, personal computing, entertainment and sustainability, to name a few. Interactive functions made possible by new materials, technology, and emerging manufacturing solutions demonstrate an ongoing interplay between cross-disciplinary knowledge and research. Such interactive interplay brings up questions concerning: (i) how art and design provide a focus for developing design solutions and research in technology; (ii) how theories emerging from the interactions of cross-disciplinary knowledge inform both the practice and research of design; and (iii) how research and design work together in a mutually beneficial way. The IASDR2015 INTERPLAY EXHIBITION provides some examples of these interconnections of design research with science, technology and the arts. This is done through the presentation of objects, artefacts and demonstrations that are contextualised into everyday activities across various areas including health, education, safety, furniture, fashion and wearable design. The exhibits provide a setting to explore the various ways in which design research interacts across discipline knowledge and approaches to stimulate innovation. In education, Designing South African Children’s Health Education as Generative Play (A Bennett, F Cassim, M van der Merwe, K van Zijil, and M Ribbens) presents a set of toolkits that resulted from design research entailing generative play. The toolkits are systems that engender pleasure and responsibility, and are aimed at cultivating South African youths’ awareness of nutrition, hygiene, disease awareness and prevention, and social health.
In safety, AVAnav: Avalanche Rescue Helmet (Jason Germany) delivers an interactive system as a tool to help reduce the time taken to locate buried avalanche victims. Helmet-mounted, this system responds to the contextual needs of rescuers and has since led to further design research on the interface design of rescue devices. In apparel design and manufacturing, Shrinking Violets: Fashion design for disassembly (Alice Payne) proposes design for disassembly through the use of beautiful reversible mono-material garments that respond interactively to the challenges of garment construction in the fashion industry, capturing the metaphor of the interplay between technology and craft in the fashion manufacturing industry. Harvest: A biotextile future (Dean Brough and Alice Payne) explores the interplay of biotechnology, materiality and textile design in the creation of a sustainable, biodegradable vegan textile through the process of a symbiotic culture of bacteria and yeast (SCOBY). SCOBY is a pellicle curd that can be harvested, machine washed, dried and cut into a variety of designs and texture combinations. The exploration of smart materials, wearable design and micro-electronics led to creative and aesthetically coherent stimulus-reactive jewellery: Symbiotic Microcosms: Crafting Digital Interaction (K Vones). This creation aims to bridge the gap between craft practitioner and scientific discovery, proposing a move towards the notion of a post-human body, where wearable design is seen as potential ground for new human-computer interactions, affording the development of visually engaging multifunctional enhancements. In furniture design, Smart Assistive chair for older adults (Chao Zhao) demonstrates how cross-disciplinary knowledge interacting with design strategies provides solutions that employ new technological developments in aged care, with the participation of multiple stakeholders: designers, the health care system and community-based health systems.
In health, Molecular diagnosis system for newborns deafness genetic screening (Chao Zhao) presents an ambitious and complex project that includes a medical device aimed at resolving a number of challenges: technical feasibility for city and rural contexts, compatibility with standard laboratory and hospital systems, access to the health system, and supporting the work of different hospital specialists. The interplay between cross-disciplines is evident in this work, demonstrating how design research moves forward through technology developments. These works exemplify the intersection between domains as a means to innovation. Novel design problems are identified as design intersects with the various areas. Research informs this process in different ways. We see the background investigation into the contextualising domain (e.g. on-snow studies, garment recycling, South African health concerns, the post-human body) to identify gaps in the area and design criteria; the technologies and materials reviews (e.g. AR, biotextiles) to offer plausible technical means to solve these, as well as design criteria. Theoretical reviews can also inform the design (e.g. play, flow). These work together to equip the design practitioner with a robust set of ‘tools’ for design innovation, tools that are based in research. The process identifies innovative opportunity and criteria for design and this, in turn, provides a means for evaluating the success of the design outcomes. Such an approach has the potential to come full circle between research and design, where the design can function as an exemplar, evidencing how the research-articulated problems can be solved. Core to this, however, is the evaluation of the design outcome itself and identifying knowledge outcomes. In some cases, this is fairly straightforward, that is, easily measurable. For example, the efficacy of Jason Germany’s helmet can be determined by measuring the reduced response time in the rescuer.
Similarly, the improved ability to recycle Payne’s panel garments can be clearly determined by comparing it to existing recycling processes (and her identified criteria of separating textile elements!); while the sustainability and durability of Brough and Payne’s biotextile can be assessed by documenting the growth and decay processes, or by comparative strength studies. There are, however, situations where knowledge outcomes and insights are not so easily determined. Many of the works here are open-ended in nature, as they emphasise the holistic experience of one or more designs in context: “the end result of the art activity that provides the health benefit or outcome but rather, the value lies in the delivery and experience of the activity” (Bennett et al.). Similarly, reconfiguring layers of laser-cut silk in Payne’s Shrinking Violets constitutes a customisable, creative process of clothing oneself, since it “could be layered to create multiple visual effects”. Symbiotic Microcosms also has room for facilitating experience, as the work is described as facilitating “serendipitous discovery”. These examples show the diverse emphasis of enquiry on the experience versus the product. Open-ended experiences are ambiguous, multifaceted and differ from person to person and moment to moment (Eco 1962). Determining success is not always clear or immediately discernible; it may also not be the most useful question to ask. Rather, research that seeks to understand the nature of the experience afforded by the artefact is most useful in these situations. It can inform the design practitioner by helping them with subsequent re-design, as well as potentially being generalisable to other designers and design contexts. Bennett et al. exemplify how this may be approached from a theoretical perspective. This work is concerned with facilitating engaging experiences to educate and, ultimately, impact on that community.
The research is concerned with the nature of that experience as well, and to address this the authors have employed theoretical lenses: here, those of flow, pleasure and play. An alternative or complementary approach to using theory is qualitative study, such as interviews with users, asking them what they experienced. Here the user insights become evidence for generalising across cases, potentially revealing insight into relevant concerns, such as the range of possible ‘playful’ experiences that may be afforded, or the situation that preceded a ‘serendipitous discovery’. As shown, the IASDR2015 INTERPLAY EXHIBITION provides a platform for exploration, discussion and interrogation around the interplay of design research across diverse domains. We look forward with excitement as IASDR continues to bring research and design together, and as our communities of practitioners continue to push the envelope of what design is and how it can be expanded and better understood through research to foster new work and, ultimately, stimulate innovation.