669 results for Los Alamos Scientific Laboratory
Abstract:
In this response to Tom G. K. Bryce and Stephen P. Day's (Cult Stud Sci Educ. doi:10.1007/s11422-013-9500-0, 2013) original article, I share their interest in the teaching of climate change in school science, but I widen it to include other contemporary, complex socio-scientific issues that also need to be discussed. I use an alternative view of the relationship between science, technology and society, supported by evidence from both science and society, to suggest science-informed citizens as a more realistic image of the outcome of school science than the authors' image of mini-scientists. The intellectual independence of students that Bryce and Day assume, and intend school science to foster, is countered with a notion of active intellectual dependence. It is only in relation to emerging and uncertain scientific contexts that students should be taught about scepticism, but they also need to learn when, and why, to trust science as an antidote to expressions of doubt about it. Some suggestions are made for pedagogies that could lead to these new learnings. The very recent fifth report of the IPCC answers many of their concerns about climate change.
Abstract:
Scientific visualisations such as computer-based animations and simulations are increasingly a feature of high school science instruction. Visualisations are adopted enthusiastically by teachers and embraced by students, and there is good evidence that they are popular and well received. There is limited evidence, however, of how effective they are in enabling students to learn key scientific concepts. This paper reports the results of a quantitative study conducted in Australian chemistry classrooms. The visualisations chosen came from free online sources, to model the ways in which classroom teachers use visualisations, but were found to have serious flaws for conceptual learning. There were also challenges in the degree of interactivity available to students using the visualisations. Within these limitations, no significant difference was found between teaching with and without these visualisations. Further study using better-designed visualisations, and with explicit attention to the pedagogy surrounding them, will be required to gather high-quality evidence of the effectiveness of visualisations for conceptual development.
Abstract:
The creation of the term health resilience is an important step towards building more resilient communities that are better able to cope with future disasters. To date, however, there appears to be little literature on how the concept of health resilience should be defined. This article aims to construct a comprehensive health disaster management approach guided by the concept of resilience. Electronic health databases were searched to retrieve key publications that may have contributed to the aims and objectives of the research. A total of 61 publications were included in the final analysis of this paper, focusing on those that provide a full description of theories and definitions of disaster resilience and those that propose a definition and conceptual framework for health resilience. Resilience is an inherent adaptive capacity for coping with the uncertainty of the future. It involves the use of multiple strategies, an all-hazards approach, and the pursuit of a positive outcome through linkage and cooperation among the different elements of the community. Health resilience can be defined as the capacity of health organisations to resist, absorb and respond to the impact of disasters while maintaining essential functions and recovering to their original state or adapting to a new one. It can be assessed by criteria such as robustness, redundancy, resourcefulness and rapidity, and it includes the key dimensions of vulnerability and safety, resources and disaster preparedness, continuity of essential health services, and recovery and adaptation. This new concept brings together the disaster management capacities of health organisations, management tasks, activities and disaster outcomes in a comprehensive overall view, and uses an integrated approach with an achievable goal. Future research on its measurement is urgently needed.
Abstract:
Visual information is central to several of the scientific disciplines. This paper studies how scientists working in a multidisciplinary field produce scientific evidence through building and manipulating scientific visualizations. Using ethnographic methods, we studied the visualization practices of eight scientists working in the domain of tissue engineering research. Tissue engineering is an emerging field of research that deals with replacing or regenerating human cells, tissues, or organs to restore or establish normal function. We spent 3 months in the field, where we recorded laboratory sessions of these scientists and used semi-structured interviews to gain insight into their visualization practices. From our results, we elicit two themes characterizing their visualization practices: multiplicity and physicality. In this article, we provide several examples of scientists' visualization practices to describe these two themes and show that the multimodality of such practices plays an important role in scientific visualization.
Abstract:
Measuring Earth material behaviour on time scales of millions of years transcends our current capability in the laboratory. We review an alternative path that considers multiscale and multiphysics approaches with quantitative structure-property relationships. This approach provides a sound basis for incorporating physical principles such as chemistry, thermodynamics, diffusion and geometry-energy relations into simulations and data assimilation over the vast range of length and time scales encountered in the Earth. We identify key length scales for Earth systems processes and find a substantial scale separation between chemical, hydrous and thermal diffusion. We propose that this allows a simplified two-scale analysis in which the outputs from the micro-scale model are used as inputs for meso-scale simulations, which in turn become the micro-scale model for the next scale up. We present two fundamental theoretical approaches to link the scales: asymptotic homogenisation from a macroscopic thermodynamic view and percolation renormalisation from a microscopic, statistical mechanics view.
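A minimal sketch of the two-scale hand-off described above, assuming a toy effective-property upscaling (the harmonic mean, i.e. the classical one-dimensional layers-in-series result) in place of the paper's asymptotic homogenisation or percolation renormalisation; the array sizes and the lognormal micro-scale heterogeneity are illustrative assumptions only:

```python
import numpy as np

def effective_diffusivity(local_values):
    """Harmonic mean of cell diffusivities: the classical 1-D result for
    diffusion through layers in series, used here as a stand-in for a
    proper homogenisation step."""
    local_values = np.asarray(local_values)
    return local_values.size / np.sum(1.0 / local_values)

rng = np.random.default_rng(0)

# Hypothetical micro-scale heterogeneity: 64 meso cells, each resolved by
# 100 micro cells with lognormally distributed diffusivities.
micro = rng.lognormal(mean=-2.0, sigma=1.0, size=(64, 100))

# Scale 1 -> 2: each meso cell inherits the homogenised micro-scale property.
meso = np.array([effective_diffusivity(cells) for cells in micro])

# Scale 2 -> 3: the meso-scale output becomes the "micro" input one scale up.
macro = effective_diffusivity(meso)
print(f"macro-scale effective diffusivity: {macro:.4g}")
```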
Abstract:
The University of Queensland (UQ) has extensive laboratory facilities associated with each course in the undergraduate electrical engineering program. The laboratories include machines and drives, power systems simulation, power electronics and intelligent equipment diagnostics. A number of postgraduate coursework programs are available at UQ, and the courses associated with these programs also use laboratories. The machine laboratory is currently being renovated with i-lab-style, web-based experimental facilities that can be accessed remotely. Senior-level courses include independent projects that use laboratory facilities, and this has been found to be very useful for improving students' learning skills. Laboratory experiments are always an integral part of a course. Most of the experiments are conducted in groups of 2-3 students, while thesis projects in the BE and major projects in the ME are always individual work. Assessment is done in class for performance as well as for the report and analysis.
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of processes in the Earth's core and focus on four types of coupling that underpin fundamental instabilities in the Earth. These are thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth's surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional need to consider microstructural information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation). Another challenge is the important factor of time, which implies that the geomaterial is often very far from initial yield and is flowing on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. Unlike consistency plasticity, this framework allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands appear out of the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and also a much lower driving force to emerge. These low-stress solutions cannot be obtained on short laboratory time scales and are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while the centre of the thrust shows a mm-cm-wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.
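As a rough illustration of the diffusion-length yardstick mentioned above, the sketch below evaluates L = sqrt(kappa * t) for order-of-magnitude thermal, hydraulic and chemical diffusivities over one million years; the numerical values are assumptions for illustration only, not figures from the paper:

```python
import math

# Order-of-magnitude diffusivities (m^2/s), assumed purely for illustration;
# the point is their wide separation, not the specific values.
diffusivities = {
    "thermal (Fourier)":  1e-6,
    "hydraulic (Darcy)":  1e-8,
    "chemical (Fick)":    1e-12,
}

t = 1e6 * 365.25 * 24 * 3600.0  # one million years, in seconds

for name, kappa in diffusivities.items():
    L = math.sqrt(kappa * t)    # diffusion length scale L = sqrt(kappa * t)
    print(f"{name:20s} L ~ {L:,.0f} m")
```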
Abstract:
IODP Expedition 339 drilled five sites in the Gulf of Cadiz and two off the west Iberian margin (November 2011 to January 2012), and recovered 5.5 km of sediment cores with an average recovery of 86.4%. The Gulf of Cadiz was targeted for drilling as a key location for the investigation of Mediterranean outflow water (MOW) through the Gibraltar Gateway and its influence on global circulation and climate. It is also a prime area for understanding the effects of tectonic activity on evolution of the Gibraltar Gateway and on margin sedimentation. We penetrated into the Miocene at two different sites and established a strong signal of MOW in the sedimentary record of the Gulf of Cadiz, following the opening of the Gibraltar Gateway. Preliminary results show the initiation of contourite deposition at 4.2–4.5 Ma, although subsequent research will establish whether this dates the onset of MOW. The Pliocene succession, penetrated at four sites, shows low bottom current activity linked with a weak MOW. Significant widespread unconformities, present in all sites but with hiatuses of variable duration, are interpreted as a signal of intensified MOW, coupled with flow confinement. The Quaternary succession shows a much more pronounced phase of contourite drift development, with two periods of MOW intensification separated by a widespread unconformity. Following this, the final phase of drift evolution established the contourite depositional system (CDS) architecture we see today. There is a significant climate control on this evolution of MOW and bottom-current activity. However, from the closure of the Atlantic–Mediterranean gateways in Spain and Morocco just over 6 Ma and the opening of the Gibraltar Gateway at 5.3 Ma, there has been an even stronger tectonic control on margin development, downslope sediment transport and contourite drift evolution. The Gulf of Cadiz is the world's premier contourite laboratory and thus presents an ideal testing ground for the contourite paradigm. Further study of these contourites will allow us to resolve outstanding issues related to depositional processes, drift budgets, and recognition of fossil contourites in the ancient record on shore. The expedition also verified an enormous quantity and extensive distribution of contourite sands that are clean and well sorted. These represent a relatively untapped and important exploration target for potential oil and gas reservoirs.
Abstract:
The basic reproduction number of a pathogen, R0, determines whether a pathogen will spread (R0 > 1) or fade out (R0 < 1) when introduced into a fully susceptible population, the latter because infected hosts do not, on average, replace themselves. In this paper we develop a simple mechanistic model of the basic reproduction number for a group of tick-borne pathogens that wholly, or almost wholly, depend on horizontal transmission to and from vertebrate hosts. This group includes the causative agent of Lyme disease, Borrelia burgdorferi, and the causative agent of human babesiosis, Babesia microti, for which transmission between co-feeding ticks and vertical transmission from adult female ticks are both negligible. The model has only 19 parameters, all of which have a clear biological interpretation and can be estimated from laboratory or field data. The model takes into account the transmission efficiency from the vertebrate host as a function of the days since infection, in part because of the potential for this dynamic to interact with tick phenology, which is also included in the model. This sets the model apart from previous, similar models of R0 for tick-borne pathogens. We then define ranges for the 19 parameters using estimates from the literature, as well as laboratory and field data, and perform a global sensitivity analysis of the model. This enables us to rank the parameters in terms of their contribution to the observed variation in R0. We conclude that the transmission efficiency from the vertebrate host to Ixodes scapularis ticks, the survival rate of Ixodes scapularis from fed larva to feeding nymph, and the fraction of nymphs finding a competent host are the most influential factors for R0. This contrasts with other vector-borne pathogens, for which it is usually the abundance of the vector or host, or the vector-to-host ratio, that determines the conditions for emergence. These results are a step towards a better understanding of the geographical expansion of currently emerging horizontally transmitted tick-borne pathogens such as Babesia microti, as well as providing a firmer scientific basis for the targeted use of acaricides or the application of wildlife vaccines that are currently in development.
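A hedged sketch of what a sampling-based global sensitivity analysis of such an R0 model might look like; the toy multiplicative R0 and the four hypothetical parameters stand in for the paper's 19-parameter mechanistic model and are not taken from it:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical parameter ranges (not the paper's 19 parameters), chosen only
# to illustrate a sampling-based global sensitivity analysis of an R0 model.
params = {
    "host_to_tick_transmission":  rng.uniform(0.2, 0.9, n),
    "larva_to_nymph_survival":    rng.uniform(0.1, 0.5, n),
    "nymph_finds_competent_host": rng.uniform(0.05, 0.4, n),
    "tick_to_host_transmission":  rng.uniform(0.3, 0.9, n),
}

# Toy multiplicative stand-in for the mechanistic R0 expression.
r0 = 40.0 * np.prod(np.column_stack(list(params.values())), axis=1)

# Rank each parameter by the strength of its monotone association with R0.
for name, values in params.items():
    rho, _ = spearmanr(values, r0)
    print(f"{name:28s} Spearman rho = {rho:+.2f}")
```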
Abstract:
Production of recycled concrete aggregates (RCA) from construction and demolition (C&D) waste has become popular all over the world because the land available for disposal is limited, so it is important to seek alternative applications for RCA. The use of RCA in base and sub-base layers of granular pavements is a viable solution. In mechanistic pavement design, rutting (permanent deformation) is considered the major failure mechanism of the pavement; rutting is the accumulation of permanent deformation of the pavement layers caused by repetitive vehicle loading. In Queensland, Australia, a maximum of 20% reclaimed asphalt pavement (RAP) is accepted in RCA, and it is therefore important to investigate the effect of RAP on the permanent deformation properties of RCA. In this study, a series of repeated load triaxial (RLT) tests was conducted on RCA blended with different percentages of RAP to investigate the permanent deformation and resilient modulus properties of RCA. The vertical deformation and resilient modulus values were used to determine the response of RCA to cyclic loading under standard pressure and loading conditions.
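For readers unfamiliar with RLT outputs, a minimal sketch of how a resilient modulus value is obtained from one load stage; the stress and strain readings below are hypothetical and are not data from this study:

```python
def resilient_modulus(deviator_stress_kpa, recoverable_strain):
    """Resilient modulus Mr = cyclic deviator stress / recoverable axial strain."""
    return deviator_stress_kpa / recoverable_strain

# Hypothetical readings from one load stage of a repeated load triaxial test.
deviator_stress = 350.0    # kPa, applied cyclic deviator stress
total_strain = 1.8e-3      # axial strain at peak load
permanent_strain = 0.4e-3  # strain remaining after unloading (the rutting component)
recoverable = total_strain - permanent_strain

mr = resilient_modulus(deviator_stress, recoverable)
print(f"resilient modulus ~ {mr / 1000:.0f} MPa")  # 350 kPa / 1.4e-3 = 250 MPa
```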
Abstract:
An alternative learning approach for destructive testing of structural specimens in civil engineering is explored using a remote laboratory experimentation method. The remote laboratory approach focuses on overcoming the constraints of hands-on experimentation without compromising students' understanding of the concepts and mechanics of reinforced concrete structures. The goal of this study is to evaluate whether or not the remote laboratory experimentation approach can become a standard in civil engineering teaching. The teaching activity using remote-laboratory experimentation is presented here and the outcomes of this activity are outlined. The experience and feedback gathered from this study will be used to improve the remote-laboratory experimentation approach and to extend it in future years to other aspects of civil engineering where destructive testing is essential.
Abstract:
In the six decades since the discovery of the double helix structure of DNA by Watson and Crick in 1953, developments in genetic science have transformed our understanding of human health and disease. These developments, along with those in other areas such as computer science, biotechnology, and nanotechnology, have opened exciting new possibilities for the future. In addition, the increasing trend for technologies to converge and build upon each other potentially increases the pace of change, constantly expanding the boundaries of the scientific frontier. At the same time, however, scientific advances are often accompanied by public unease over the potential for unforeseen, negative outcomes. For governments, these issues present significant challenges for effective regulation. This Article analyzes the challenges associated with crafting laws for rapidly changing science and technology. It considers whether we need to regulate, how best to regulate for converging technologies, and how best to ensure the continued relevance of laws in the face of change.
Abstract:
Service processes such as giving financial advice, booking a business trip or conducting a consulting project have emerged as units of analysis of high interest for the business process and service management communities, in both practice and academia. While the transactional nature of production processes is relatively well understood and deployed, the less predictable and highly interactive nature of service processes still lacks appropriate methodological grounding in many areas. This paper proposes the framework of a process laboratory as a new IT artefact to facilitate the holistic analysis and simulation of such service processes. Using financial services as an example, it will be shown how such a process laboratory can be used to reduce the complexity of service process analysis and to facilitate operational service process control.
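A minimal sketch of the kind of service-process simulation such a process laboratory could support, assuming an invented three-step financial-advice process and arbitrary duration distributions rather than the framework's actual models:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "process laboratory" run: Monte Carlo simulation of a simplified
# financial-advice process (intake -> advisory meeting -> proposal) with
# stochastic step durations, to study end-to-end cycle time. Step names and
# distributions are illustrative assumptions, not the paper's actual model.
steps = {
    "intake":           lambda: rng.exponential(0.5),          # hours
    "advisory_meeting": lambda: max(rng.normal(1.5, 0.3), 0.0),
    "proposal":         lambda: rng.exponential(2.0),
}

cycle_times = np.array(
    [sum(draw() for draw in steps.values()) for _ in range(10_000)]
)
print(f"mean cycle time {cycle_times.mean():.2f} h, "
      f"95th percentile {np.percentile(cycle_times, 95):.2f} h")
```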
Abstract:
The paper provides a systematic approach to designing the laboratory phase of a multiphase experiment, taking into account the previous phases. General principles are outlined for experiments in which orthogonal designs can be employed. Multiphase experiments occur widely, although their multiphase nature is often not recognized. The need to randomize, in the laboratory phase, the material produced from the first phase is emphasized. Factor-allocation diagrams are used to depict the randomizations in a design, and the use of skeleton analysis-of-variance (ANOVA) tables to evaluate their properties is discussed. The methods are illustrated using a scenario and a case study. A basis for categorizing designs is suggested. This article has supplementary material online.
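A small sketch of the laboratory-phase randomization principle, assuming a hypothetical two-phase experiment (4 field treatments, 2 replicates, 2 laboratory runs); it is only an illustration of the idea, not the paper's factor-allocation machinery:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical first (field) phase: 4 treatments x 2 replicates = 8 samples.
treatments = ["A", "B", "C", "D"]
samples = {t: [f"{t}{r}" for r in (1, 2)] for t in treatments}

# Laboratory phase: 2 runs of 4 positions. Giving each run one replicate of
# every treatment keeps treatments orthogonal to runs, while randomizing
# which replicate goes to which run and the order within each run supplies
# the laboratory-phase randomization the paper emphasizes.
runs = {1: [], 2: []}
for t in treatments:
    first, second = rng.permutation(samples[t])
    runs[1].append(first)
    runs[2].append(second)

for run, members in runs.items():
    print(f"run {run}: {list(rng.permutation(members))}")
```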
Abstract:
The branded pause advertisement is a recently developed online television-advertising format that displays a full-screen still-image banner ad whenever a viewer pauses a streaming-video program. This study used a controlled lab experiment to compare the effectiveness of branded pause advertisements with normal online television advertisements. The results demonstrate that branded pause advertisements are effective but only when combined with a long-exposure advertisement for the same brand. Despite their short exposure time, pause advertisements function as effective reminders, building awareness through repeat exposure. The findings of the current study were similar regardless of whether pause advertisements were activated as a result of viewers’ pausing at a time of their own choosing or whether viewers were interrupted.