18 results for Geodetic astronomy
at Queensland University of Technology - ePrints Archive
Abstract:
This article describes the development and launching of a stargazing activity on two cruise ships, Pacific Dawn and Pacific Sun, which sail from Australian ports. The session included a presentation entitled “Voyage to the Stars” that gave passengers an overview of the life cycle of stars from star-birth nebulae to white dwarfs and black holes. In the presentation it was noted that ancient mariners used the celestial sphere to navigate. The presentation was followed by on-deck observing sessions in which objects shown in the presentation were viewed with the naked eye, binoculars and a small telescope. The activity seemed to be well received and prompted numerous questions to the presenter. Many people said that the activity had kindled or rekindled their interest in astronomy.
Abstract:
Malaysia’s Vision 2020 for enhancing its education system includes the development of scientific literacy commencing at the primary school level. This Vision focuses on using English as the Medium of Instruction (EMI) for teaching primary science, as Malaysia has English as a Foreign Language (EFL) in its curriculum. What changes need to occur in preservice teacher education programs for learning about primary science using EMI? This paper investigates the education of Malaysian preservice teachers in learning how to teach one strand of science education (i.e., space, primary astronomy) in an English-language context. Ninety-six second-year preservice teachers from two Malaysian institutes were involved in a 16-week “Earth and Space” course; half the course involved education about primary astronomy. Seventy-five of these preservice teachers provided written responses about the course and their development as potential teachers of primary astronomy using EMI. Preservice teacher assessments and multimedia presentations provided further evidence on learning how to teach primary astronomy. Many of these preservice teachers claimed that learning to teach primary astronomy needs to focus on teaching strategies, content knowledge with easy-to-understand concepts, computer simulations (e.g., Earth Centered Universe, Stellarium, Celestia), other ICT media, and field experiences that use naked-eye observations and telescopes to investigate celestial bodies. Although generally proficient in using ICT, they claimed there were EFL barriers to learning some new terminology. Nevertheless, PowerPoint presentations, animations, videos, and simulations were identified as effective ICT tools for providing clear visual representations of abstract concepts and ways to enhance the learning process.
Abstract:
This article places the 6 June 2012 transit of Venus in the context of James Cook’s voyage from England to the South Pacific to observe the 1769 transit of Venus. A description is given of how to use a computer program called Stellarium to ‘observe’ the 1769 transit of Venus exactly as Cook saw it from the island of Tahiti in the South Pacific.
Abstract:
This report studies an algebraic equation whose solution gives the image system of a source of light as seen by an observer inside a reflecting spherical surface. The equation is examined numerically using GeoGebra. Under the hypothesis that our galaxy is enveloped by a reflecting interface, this becomes a possible model for many mysterious extragalactic observations.
Abstract:
The unusual behaviour of fine lunar regolith, such as stickiness and low heat conductivity, is dominated by the structural arrangement of its finest fraction in the outermost topsoil layer. Here we show the previously unknown phenomenon of a globular 3-D superstructure forming within the dust fraction of the regolith. A new technology, Transmission X-ray Microscopy (TXM) with tomographic reconstruction, reveals a highly porous cellular void network in aggregates of the finest lunar dust fraction. These porous chained aggregates are composed of sub-micron-sized particles that build cellular void networks, with voids a few micrometres in diameter. The discovery of such a superstructure within the finest fraction of the lunar topsoil allows the construction of a heat-transfer model, which is discussed.
Abstract:
In this article some basic laboratory bench experiments are described that are useful for teaching high school students some of the basic principles of stellar astrophysics. For example, in one experiment, students slam a plastic water-filled bottle down onto a bench, ejecting water towards the ceiling and illustrating the physics associated with a type II supernova explosion. In another experiment, students roll marbles up and down a double ramp in an attempt to get a marble to enter a tube halfway up the slope, which illustrates quantum tunnelling in stellar cores. The experiments are reasonably low cost to either purchase or manufacture.
Abstract:
This thesis aimed to investigate the way in which distance runners modulate their speed in an effort to understand the key processes and determinants of speed selection when encountering hills in natural outdoor environments. One factor which has limited the expansion of knowledge in this area has been a reliance on the motorized treadmill, which constrains runners to constant speeds and gradients and only linear paths. Conversely, limits in the portability or storage capacity of available technology have restricted field research to brief durations and level courses. Therefore another aim of this thesis was to evaluate the capacity of lightweight, portable technology to measure running speed in outdoor undulating terrain. The first study of this thesis assessed the validity of a non-differential GPS to measure speed, displacement and position during human locomotion. Three healthy participants walked and ran over straight and curved courses for 59 and 34 trials respectively. A non-differential GPS receiver provided speed data by Doppler shift and by change in GPS position over time, which were compared with actual speeds determined by chronometry. Displacement data from the GPS were compared with a surveyed 100 m section, while static positions were collected for 1 hour and compared with the known geodetic point. GPS speed values on the straight course were found to be closely correlated with actual speeds (Doppler shift: r = 0.9994, p < 0.001; Δ GPS position/time: r = 0.9984, p < 0.001). Actual speed errors were lowest using the Doppler shift method (90.8% of values within ±0.1 m.s-1). Speed was slightly underestimated on a curved path, though still highly correlated with actual speed (Doppler shift: r = 0.9985, p < 0.001; Δ GPS distance/time: r = 0.9973, p < 0.001). Distance measured by GPS was 100.46 ± 0.49 m, while 86.5% of static points were within 1.5 m of the actual geodetic point (mean error: 1.08 ± 0.34 m, range 0.69-2.10 m).
Non-differential GPS demonstrated a highly accurate estimation of speed across a wide range of human locomotion velocities using only the raw signal data, with a minimal decrease in accuracy around bends. This high level of resolution was matched by accurate displacement and position data. Coupled with reduced size, cost and ease of use, a non-differential receiver offers a valid alternative to differential GPS in the study of overground locomotion. The second study of this dissertation examined speed regulation during overground running on a hilly course. Following an initial laboratory session to calculate physiological thresholds (VO2 max and ventilatory thresholds), eight experienced long distance runners completed a self-paced time trial over three laps of an outdoor course involving uphill, downhill and level sections. A portable gas analyser, GPS receiver and activity monitor were used to collect physiological, speed and stride frequency data. Participants ran 23% slower on uphills and 13.8% faster on downhills compared with level sections. Speeds on level sections were significantly different for 78.4 ± 7.0 seconds following an uphill and 23.6 ± 2.2 seconds following a downhill. Speed changes were primarily regulated by stride length, which was 20.5% shorter uphill and 16.2% longer downhill, while stride frequency was relatively stable. Oxygen consumption averaged 100.4% of runners’ individual ventilatory thresholds on uphills, 78.9% on downhills and 89.3% on level sections. Group level speed was highly predicted using a modified gradient factor (r2 = 0.89). Individuals adopted distinct pacing strategies, both across laps and as a function of gradient. Speed was best predicted using a weighted factor to account for prior and current gradients. Oxygen consumption (VO2) limited runners’ speeds only on uphill sections, and was maintained in line with individual ventilatory thresholds.
Running speed showed larger individual variation on downhill sections, while speed on the level was systematically influenced by the preceding gradient. Runners who varied their pace more as a function of gradient showed a more consistent level of oxygen consumption. These results suggest that optimising time on the level sections after hills offers the greatest potential to minimise overall time when running over undulating terrain. The third study of this thesis investigated the effect of implementing an individualised pacing strategy on running performance over an undulating course. Six trained distance runners completed three trials involving four laps (9968m) of an outdoor course involving uphill, downhill and level sections. The initial trial was self-paced in the absence of any temporal feedback. For the second and third field trials, runners were paced for the first three laps (7476m) according to two different regimes (Intervention or Control) by matching desired goal times for subsections within each gradient. The fourth lap (2492m) was completed without pacing. Goals for the Intervention trial were based on findings from study two using a modified gradient factor and elapsed distance to predict the time for each section. To maintain the same overall time across all paced conditions, times were proportionately adjusted according to split times from the self-paced trial. The alternative pacing strategy (Control) used the original split times from this initial trial. Five of the six runners increased their range of uphill to downhill speeds on the Intervention trial by more than 30%, but this was unsuccessful in achieving a more consistent level of oxygen consumption with only one runner showing a change of more than 10%. Group level adherence to the Intervention strategy was lowest on downhill sections. Three runners successfully adhered to the Intervention pacing strategy which was gauged by a low Root Mean Square error across subsections and gradients. 
Of these three, the two who had the largest change in uphill-downhill speeds ran their fastest overall times. This suggests that for some runners the strategy of varying speed systematically to account for gradients and transitions may benefit race performance on courses involving hills. In summary, a non-differential receiver was found to offer highly accurate measures of speed, distance and position across the range of human locomotion speeds. Self-selected speed was found to be best predicted using a weighted factor to account for prior and current gradients. Oxygen consumption limited runners’ speeds only on uphills, speed on the level was systematically influenced by preceding gradients, and there was much larger individual variation on downhill sections. Individuals were found to adopt distinct but unrelated pacing strategies as a function of durations and gradients, while runners who varied pace more as a function of gradient showed a more consistent level of oxygen consumption. Finally, the implementation of an individualised pacing strategy to account for gradients and transitions greatly increased runners’ range of uphill-downhill speeds and was able to improve performance in some runners. The efficiency of various gradient-speed trade-offs and the factors limiting faster downhill speeds will, however, require further investigation to improve the effectiveness of the suggested strategy.
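The Δ GPS position/time method validated in the first study can be sketched in a few lines: successive fixes are converted to great-circle distances and divided by the sampling interval. The coordinates, 1 Hz rate and function names below are hypothetical illustrations, not data or code from the thesis.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude fixes."""
    R = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def speeds_from_fixes(fixes):
    """fixes: list of (t_seconds, lat_deg, lon_deg).
    Returns one speed estimate (m/s) per consecutive pair of fixes."""
    out = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        d = haversine_m(la0, lo0, la1, lo1)
        out.append(d / (t1 - t0))
    return out

# Illustrative 1 Hz track of a runner heading due east (made-up coordinates):
track = [(0, -27.47700, 153.02800),
         (1, -27.47700, 153.02804),
         (2, -27.47700, 153.02808)]
print(speeds_from_fixes(track))  # roughly 4 m/s per interval
```

The Doppler-shift method, which the study found more accurate, reads speed directly from the receiver rather than differencing positions, so it avoids the position noise that this sketch inherits.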
Abstract:
The Space Day has been running at QUT for about a decade. It started out as a single lecture on the stars delivered to a group of high school students from Brisbane State High School (BSHS), just across the river from QUT and therefore convenient for the school to visit. I was contacted by Victor James of St. Laurence’s College (SLC), Brisbane, asking if he could bring a group of boys to QUT for a lecture similar to that delivered to BSHS. However, for SLC a hands-on laboratory session was added to the lecture, and thus the Space Day was born. For the Space Day we have concentrated on Year 7-10 students. Subsequently, many other schools from Brisbane and further afield in Queensland have attended a Space Day.
Abstract:
Staff and students of the Surveying and Spatial Sciences discipline at QUT have worked collaboratively with the Institute of Sustainable Resources in the creation and development of spatial information layers and infrastructure to support multi-disciplinary research efforts at the Samford Ecological Research Facility (SERF). The SERF property is unique in that it provides staff and students with a semi-rural controlled research base for multiple users. This paper aims to describe the development of a number of spatial information layers and a network of survey monuments that assist and support research infrastructure at SERF. A brief historical background about the facility is presented along with descriptions of the surveying and mapping activities undertaken. These broad-ranging activities include introducing monument infrastructure and a geodetic control network; surveying activities for aerial photography ground-control targets, including precise levelling with barcode instruments; development of an ortho-rectified image spatial information layer; Real-Time-Kinematic Global Positioning System (RTK-GPS) surveying for constructing 100 metre confluence points/monuments to support science-based disciplines undertaking environmental research transects and long-term ecological sampling; and a real-world learning initiative to assist with water engineering projects and student experiential learning. The spatial information layers and physical infrastructure have been adopted by two specific yet diverse user groups with an interest in the long-term research focus of SERF.
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, with these signals being referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools to deal with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, like the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the Lecture Notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr.
Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
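As a minimal illustration of the first core topic listed above, discrete convolution (the time-domain operation underlying FIR filtering) can be sketched in a few lines. The two-tap moving-average filter and short signal are arbitrary examples chosen here, not material from the book.

```python
def convolve(x, h):
    """Direct-form discrete convolution: y[n] = sum over k of x[k] * h[n - k]."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]
    return y

# A two-tap moving-average filter smooths a short ramp signal:
x = [1.0, 2.0, 3.0, 4.0]   # input signal
h = [0.5, 0.5]             # filter impulse response
print(convolve(x, h))      # [0.5, 1.5, 2.5, 3.5, 2.0]
```

The interior output samples are averages of adjacent input samples, while the edge samples reflect the filter running onto and off the signal; this is the same operation that the Fourier and Z transforms diagonalize into pointwise multiplication.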
Abstract:
The underlying objective of this study was to develop a novel approach to evaluate the potential for commercialisation of a new technology. More specifically, this study examined the 'ex-ante' evaluation of the technology transfer process. For this purpose, a technology originating from the high technology sector was used. The technology relates to the application of software for the detection of weak signals from space, which is an established method of signal processing in the field of radio astronomy. This technology has the potential to be used in commercial and industrial areas other than astronomy, such as detecting water leakages in pipes. Its applicability to detecting water leakage was chosen owing to several problems with detection in the industry as well as the impact it can have on saving water in the environment. This study, therefore, will demonstrate the importance of interdisciplinary technology transfer. The study employed both technical and business evaluation methods, including laboratory experiments and the Delphi technique, to address the research questions. There are several findings from this study. Firstly, scientific experiments were conducted and these resulted in a proof-of-concept stage of the chosen technology. Secondly, validation as well as refinement of criteria from the literature that can be used for 'ex-ante' evaluation of technology transfer has been undertaken. Additionally, after testing the chosen technology's overall transfer potential using the modified set of criteria, it was found that the technology is still in its early stages and will require further development for it to be commercialised. Furthermore, a final evaluation framework was developed encompassing all the criteria found to be important. This framework can help in assessing the overall readiness of the technology for transfer as well as in recommending a viable mechanism for commercialisation.
On the whole, the commercial potential of the chosen technology was tested through expert opinion, thereby focusing on the impact of a new technology and the feasibility of alternate applications and potential future applications.
Abstract:
This paper describes a simple activity for plotting and characterising the light curve from an exoplanet transit event by way of differential photometry analysis. Using free digital imaging software, participants analyse a series of telescope images with the goal of calculating various exoplanet parameters, including its size, orbital radius and habitability. The activity has been designed for a high-school or undergraduate university level and introduces fundamental concepts in astrophysics and an understanding of the basis for exoplanetary science, the transit method and digital photometry.
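The parameter estimates the activity asks for follow from simple relations: the fractional dip in the light curve gives the planet-to-star radius ratio, and Kepler's third law gives the orbital radius from the period. A minimal sketch of both calculations, where the 1% depth and 1-year period are hypothetical example values rather than figures from the paper:

```python
import math

def planet_radius_ratio(depth):
    """Transit depth ΔF/F ≈ (Rp/Rs)^2, so the radius ratio is sqrt(depth)."""
    return math.sqrt(depth)

def orbital_radius_au(period_years, stellar_mass_msun=1.0):
    """Kepler's third law in solar units: a^3 = M * P^2
    (a in AU, P in years, M in solar masses)."""
    return (stellar_mass_msun * period_years ** 2) ** (1.0 / 3.0)

# A hypothetical 1% transit depth around a Sun-like star with a 1-year period:
print(planet_radius_ratio(0.01))   # 0.1 -> planet radius is 10% of the star's
print(orbital_radius_au(1.0))      # 1.0 AU, an Earth-like orbit
```

Differential photometry supplies the depth by dividing the target star's flux by that of nearby comparison stars, cancelling atmospheric variations; habitability is then judged from whether the derived orbital radius falls in the star's habitable zone.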
Abstract:
A recent theoretical investigation by Terzieva & Herbst of linear carbon chains Cn, where n ≥ 6, in the interstellar medium has shown that these species can undergo efficient radiative association to form the corresponding anions. An experimental study by Barckholtz, Snow & Bierbaum of these anions has demonstrated that they do not react efficiently with molecular hydrogen, leading to the possibility of detectable abundances of cumulene-type anions in dense interstellar and circumstellar environments. Here we present a series of electronic structure calculations which examine possible anionic candidates for detection in these media, namely the anion analogues of the previously identified interstellar cumulenes CnH and Cn-1CH2 and heterocumulenes CnO (where n = 2-10). The extraordinary electron affinities calculated for these molecules suggest that efficient radiative electron attachment could occur, and the large dipole moments of these simple (generally) linear molecules point to the possibility of detection by radio astronomy.
Abstract:
Since 1995 the eruption of the andesitic Soufrière Hills Volcano (SHV), Montserrat, has been studied in substantial detail. As an important contribution to this effort, the Seismic Experiment with Airgun-source - Caribbean Andesitic Lava Island Precision Seismo-geodetic Observatory (SEA-CALIPSO) experiment was devised to image the arc crust underlying Montserrat and, if possible, the magma system at SHV using tomography and reflection seismology. Field operations were carried out in October-December 2007, with deployment of 238 seismometers on land supplementing seven volcano observatory stations, and with an array of 10 ocean-bottom seismometers deployed offshore. The RRS James Cook on NERC cruise JC19 towed a tuned airgun array plus a digital 48-channel streamer on encircling and radial tracks for 77 h about Montserrat during December 2007, firing 4414 airgun shots and yielding about 47 Gb of data. The main objectives of the experiment were achieved. Preliminary analyses of these data published in 2010 generated images of heterogeneous high-velocity bodies representing the cores of volcanoes and subjacent intrusions, and shallow areas of low velocity on the flanks of the island that reflect volcaniclastic deposits and hydrothermal alteration. The resolution of this preliminary work did not extend beyond 5 km depth. An improved three-dimensional (3D) seismic velocity model was then obtained by inversion of 181 665 first-arrival travel times from a more complete sampling of the dataset, yielding clear images to 7.5 km depth of a low-velocity volume that was interpreted as the magma chamber which feeds the current eruption, with an estimated volume of 13 km3. Coupled thermal and seismic modelling revealed properties of the partly crystallized magma. Seismic reflection analyses aimed at imaging structures under southern Montserrat had limited success, and suggest subhorizontal layering interpreted as sills at depths of between 6 and 19 km.
Seismic reflection profiles collected offshore reveal deep fans of volcaniclastic debris and fault offsets, leading to new tectonic interpretations. This chapter presents the project goals and planning concepts, describes in detail the campaigns at sea and on land, summarizes the major results, and identifies the key lessons learned.