983 results for Intelligent Environments
Abstract:
Schools bring people together. Yet for many children there are major discontinuities between their lives in and out of school, and such differences impact on literacy teaching and learning in both predictable and unpredictable ways. However, if schools were reconceptualised as meeting places, where different people are thrown together (Massey, 2005), curriculum and pedagogy could be designed to take into account students’ and teachers’ different experiences and histories and to make those differences a resource for literacy learning. This paper draws on a long-term project with administrators and teachers working in a school situated in a site of urban regeneration and significant demographic shifts. It draws particularly on the ways in which one teacher repositioned her grade 4/5 students as researchers, designers and journalists exploring student and staff memories of the school. It argues that place, and people’s relationships with places, can be a rich resource for literacy learning when teachers make it the object of study.
Abstract:
Sophisticated models of human social behaviour are fast becoming highly desirable in an increasingly complex and interrelated world. Here, we propose that rather than taking established theories from the physical sciences and naively mapping them into the social world, the advanced concepts and theories of social psychology should be taken as a starting point, and used to develop a new modelling methodology. In order to illustrate how such an approach might be carried out, we attempt to model the low elaboration attitude changes of a society of agents in an evolving social context. We propose a geometric model of an agent in context, where individual agent attitudes are seen to self-organise to form ideologies, which then serve to guide further agent-based attitude changes. A computational implementation of the model is shown to exhibit a number of interesting phenomena, including a tendency for a measure of the entropy in the system to decrease, and a potential for externally guiding a population of agents towards a new desired ideology.
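To make the flavour of such an agent-based attitude model concrete, the following Python sketch is illustrative only and is not the geometric model of the paper: agents hold attitude angles on a circle, drift slightly toward a crude "ideology" (the circular mean of the population) with added noise, and a histogram entropy of the attitude distribution typically decreases over time.

    # Illustrative toy only; not the model described in the abstract.
    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, n_steps, rate, noise = 200, 300, 0.05, 0.02
    theta = rng.uniform(0, 2 * np.pi, n_agents)  # each agent's attitude, as an angle

    def attitude_entropy(angles, bins=20):
        counts, _ = np.histogram(angles % (2 * np.pi), bins=bins, range=(0, 2 * np.pi))
        p = counts / counts.sum()
        p = p[p > 0]
        return -(p * np.log(p)).sum()

    print("initial entropy:", attitude_entropy(theta))
    for _ in range(n_steps):
        # crude "ideology": the circular mean attitude of the whole population
        ideology = np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
        # low-elaboration change: small drift toward the ideology, plus noise
        theta += rate * np.sin(ideology - theta) + noise * rng.standard_normal(n_agents)
    print("final entropy:", attitude_entropy(theta))  # typically lower than the initial value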
Abstract:
Long-term exposure to vehicle emissions has been associated with harmful health effects. Children are amongst the most susceptible groups, and schools represent an environment where they can experience significant exposure to vehicle emissions. However, there are limited studies on children’s exposure to vehicle emissions in schools. The aim of this study was to quantify the concentration of organic aerosol, and in particular vehicle emissions, that children are exposed to during school hours. Therefore, an Aerodyne compact time-of-flight aerosol mass spectrometer (TOF-AMS) was deployed at five urban schools in Brisbane, Australia. The TOF-AMS enabled the chemical composition of the non-refractory PM1 (NR-PM1) to be analysed with high temporal resolution to assess the concentration of vehicle emissions and other organic aerosols during school hours. At each school the organic fraction comprised the majority of NR-PM1, with secondary organic aerosols as the main constituent. At two of the schools, a significant source of the organic aerosol (OA) was slightly aged vehicle emissions from nearby highways. More aged and oxidised OA was observed at the other three schools, which also recorded strong biomass burning influences. Primary emissions were found to dominate the OA at only one school, which had an O:C ratio of 0.17, due to fuel-powered gardening equipment used near the TOF-AMS. The diurnal cycle of OA concentration varied between schools and was found to be at a minimum during school hours. The major organic component that school children were exposed to during school hours was secondary OA. Peak exposure of school children to hydrocarbon-like OA (HOA) occurred during school drop-off and pick-up times. Unless a school is located near major roads, children are exposed predominantly to regional secondary OA, as opposed to local emissions, during school hours in urban environments.
Abstract:
It has not yet been established whether the spatial variation of particle number concentration (PNC) within a microscale environment can affect exposure estimation results. In general, the degree of spatial variation within microscale environments remains unclear, since previous studies have only focused on spatial variation within macroscale environments. The aims of this study were to determine the spatial variation of PNC within microscale school environments, in order to assess the importance of the number of monitoring sites for exposure estimation, to identify which parameters have the largest influence on spatial variation, and to determine the relationship between those parameters and spatial variation. Air quality measurements were conducted for two consecutive weeks at each of 25 schools across Brisbane, Australia. PNC was measured at three sites within the grounds of each school, along with meteorological and several other air quality parameters. Traffic density was recorded for the busiest road adjacent to the school. Spatial variation at each school was quantified using the coefficient of variation (CV). The portion of CV associated with instrument uncertainty was found to be 0.3, and CV was therefore corrected so that only non-instrument uncertainty was analysed in the data. The median corrected CV (CVc) ranged from 0 to 0.35 across the schools, with 12 schools found to exhibit spatial variation. The study determined the number of monitoring sites required at schools with spatial variability and tested the deviation in exposure estimation arising from using only a single site. Nine schools required two measurement sites and three schools required three sites. Overall, the deviation in exposure estimation from using only one monitoring site was as much as one order of magnitude. The study also tested the association of spatial variation with wind speed/direction and traffic density, using partial correlation coefficients to identify sources of variation and non-parametric function estimation to quantify the level of variability. Traffic density and road-to-school wind direction were found to have a positive effect on CVc, and therefore also on spatial variation. Wind speed was found to have a decreasing effect on spatial variation when it exceeded a threshold of 1.5 m/s, while it had no effect below this threshold. The effect of traffic density on spatial variation increased up to a density of 70 vehicles per five minutes, at which point it plateaued and did not increase further with increasing traffic density.
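As an illustration of how spatial variation of this kind can be quantified, the Python sketch below computes the coefficient of variation across the three monitoring sites and removes an instrument-related portion of 0.3; the quadrature subtraction and the example PNC values are assumptions for illustration, not the study's exact correction procedure.

    import numpy as np

    CV_INSTRUMENT = 0.3  # portion of CV attributed to instrument uncertainty in the study

    def corrected_cv(pnc_per_site):
        """pnc_per_site: simultaneous PNC values, one per monitoring site (particles/cm^3)."""
        pnc = np.asarray(pnc_per_site, dtype=float)
        cv = pnc.std(ddof=1) / pnc.mean()
        # assumed correction: subtract instrument variance in quadrature, floored at zero
        return float(np.sqrt(max(cv**2 - CV_INSTRUMENT**2, 0.0)))

    print(corrected_cv([8.2e3, 1.1e4, 9.5e3]))  # hypothetical values at three school sites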
Abstract:
The contextuality of changing attitudes makes them extremely difficult to model. This paper scales up Quantum Decision Theory (QDT) to a social setting, using it to model the manner in which social contexts can interact with the process of low elaboration attitude change. The elements of this extended theory are presented, along with a proof of concept computational implementation in a low dimensional subspace. This model suggests that a society's understanding of social issues will settle down into a static or frozen configuration unless that society consists of a range of individuals with varying personality types and norms.
Abstract:
Laboratories and hands-on technical learning have always been a part of engineering- and science-based university courses. They provide the interface where theory meets practice and where students may develop professional skills through interacting with real objects in an environment that models appropriate standards and systems. Laboratories in many countries are facing challenges to their sustainable operation and effectiveness. In some countries, such as Australia, significantly reduced funding and staffing is eroding a once-strong base of technical infrastructure. Other countries, such as Thailand, are seeking to develop their laboratory infrastructure and are in need of staff skill development, management and staffing structures in technical areas. In this paper the authors address the need for technical development with reference to work undertaken in Thailand and Australia. The authors identify the roads which their respective university sectors are on and point out problems and opportunities. It is hoped that the crossroads where we meet will result in better directions for both.
Abstract:
Vision-based SLAM is mostly a solved problem, provided that clear, sharp images can be obtained. However, in outdoor environments a number of factors such as rough terrain, high speeds and hardware limitations can result in these conditions not being met. High-speed transit on rough terrain can lead to image blur and under- or over-exposure, problems that cannot easily be dealt with using low-cost hardware. Furthermore, there has recently been a growth of interest in lifelong autonomy for robots, which brings with it the challenge, in outdoor environments, of dealing with a moving sun and a lack of constant artificial lighting. In this paper, we present a lightweight approach to visual localization and visual odometry that addresses the challenges posed by perceptual change and low-cost cameras. The approach combines low-resolution imagery with the SLAM algorithm RatSLAM. We test the system using a cheap consumer camera mounted on a small vehicle in a mixed urban and vegetated environment, at times ranging from dawn to dusk and in conditions ranging from sunny weather to rain. We first show that the system is able to provide reliable mapping and recall over the course of the day and to incrementally incorporate new visual scenes from different times into an existing map. We then restrict the system to learning visual scenes at only one time of day, and show that it is still able to localize and map at other times of day. The results demonstrate the viability of the approach in situations where image quality is poor and environmental or hardware factors preclude the use of visual features.
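The core of such a lightweight approach is matching heavily downsampled, intensity-normalised images rather than extracting visual features. A minimal Python sketch of that idea (not the authors' implementation) is:

    import numpy as np

    def to_template(gray_image, size=(32, 24)):
        """Downsample a 2D grayscale image to a tiny, zero-mean, unit-variance template."""
        h, w = gray_image.shape
        rows = np.linspace(0, h - 1, size[1]).astype(int)
        cols = np.linspace(0, w - 1, size[0]).astype(int)
        t = gray_image[np.ix_(rows, cols)].astype(float)
        return (t - t.mean()) / (t.std() + 1e-6)

    def match_score(template_a, template_b):
        """Mean absolute difference between two templates; lower means more similar scenes."""
        return float(np.abs(template_a - template_b).mean())

    # Hypothetical usage with two random frames standing in for camera images.
    frame_now, frame_stored = np.random.rand(240, 320), np.random.rand(240, 320)
    print(match_score(to_template(frame_now), to_template(frame_stored)))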
Abstract:
Biorobotics has the potential to provide an integrated understanding, from neural systems to behavior, that would be neither ethical nor technically feasible to obtain with living systems alone. Robots that can interact with animals in their natural environment open new possibilities for empirical studies in neuroscience. However, designing a robot that can interact with a rodent requires considerations that span a range of disciplines. For the rat's safety, the robot's body form and movements need to suit the animal, be of an appropriate size for rodent arenas, and support safe behaviors for interaction. For the robot's safety, its form must be robust in the face of typically inquisitive and potentially aggressive behaviors by the rodent, which can include chewing on exposed parts, including electronics, and deliberate or accidental fouling. We designed a rat-sized robot, the iRat (intelligent rat animat technology), for studies in neuroscience. The iRat is about the same size as a rat and has the ability to navigate autonomously around small environments. In this study we report the first interactions between the iRat and real rodents in a free exploration task. Studies with five rats show that the rats and the iRat interact safely for both parties.
Abstract:
Evidence concerning the impact of child care on child development suggests that higher-quality environments, particularly those that are more responsive, predict more favourable social and behavioural outcomes. However, the extent of this effect is not as great as might be expected. Impacts on child outcomes are, at best, modest. One recent explanation emerging from a new theoretical perspective of development, differential susceptibility theory, is that a minority of children are more reactive to both positive and negative environments, while the majority are relatively unaffected. These 'quirky' children have temperamental traits that are more extreme, and are often described in research studies as having 'difficult temperaments'. This paper reviews the literature on such children and argues for the need for further research to identify components of childcare environments that optimise the potential of these more sensitive, quirky individuals.
Abstract:
Ions play an important role in climate and particle formation in the atmosphere. Small ions rapidly attach to particles in the air and, consequently, studies have shown that they are suppressed in polluted environments. Urban environments, in particular, are dominated by motor vehicle emissions; since motor vehicles are a source of both particles and small ions, the relationship between these two parameters in such environments is not well known. In order to gain a better understanding of this relationship, an intensive campaign was undertaken in which particles and small ions of both signs were monitored over two-week periods at each of three sites (A, B and C) that were affected to varying degrees by vehicle emissions. Site A was close to a major road and recorded the highest particle number and lowest small ion concentrations. Precursors from motor vehicle emissions gave rise to clear particle formation events on five days and, on each of these days, this was accompanied by a suppression of small ions. Observations at Site B, which was located within the urban airshed, though not adjacent to motor traffic, showed particle enhancement but no formation events. Site C was a clean site, away from urban sources, and recorded the lowest particle number and highest small ion concentration. The positive small ion concentration was 10% to 40% higher than the corresponding negative value at all sites. These results confirm previous findings that there is a clear inverse relationship between small ions and particles in urban environments dominated by motor vehicle emissions.
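The reported inverse relationship is consistent with the standard small-ion balance relation (a textbook expression, not necessarily the analysis used in this study):

    \frac{dn}{dt} = q - \alpha n^{2} - \beta n N,
    \qquad
    n_{\mathrm{ss}} \approx \frac{q}{\beta N} \quad \text{for large } N,

where n is the small-ion concentration, q the ion production rate, \alpha the ion-ion recombination coefficient, \beta the ion-aerosol attachment coefficient and N the particle number concentration; as N increases, the steady-state small-ion concentration falls.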
Abstract:
In this paper, we present an unsupervised graph-cut-based object segmentation method using 3D information provided by Structure from Motion (SFM), called GrabCutSFM. Rather than focusing on the segmentation problem using a trained model or human intervention, our approach aims to achieve meaningful segmentation autonomously, with direct application to vision-based robotics. Generally, object (foreground) and background have certain discriminative geometric information in 3D space. By exploring the 3D information from multiple views, our proposed method can segment potential objects correctly and automatically, compared to conventional unsupervised segmentation using only 2D visual cues. Experiments with real video data collected from indoor and outdoor environments verify the proposed approach.
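As a rough illustration of how 3D cues might seed an otherwise unsupervised graph-cut segmentation, the Python/OpenCV sketch below marks pixels covered by a hypothetical projected SFM point cluster as foreground and lets GrabCut refine the rest; the function name and seeding rule are assumptions for illustration, not the GrabCutSFM pipeline itself.

    import numpy as np
    import cv2

    def segment_with_3d_seed(image_bgr, fg_pixels):
        """image_bgr: HxWx3 uint8 image; fg_pixels: (N, 2) array of (row, col) pixels
        believed to lie on the object, e.g. projections of a 3D point cluster from SFM."""
        mask = np.full(image_bgr.shape[:2], cv2.GC_PR_BGD, np.uint8)  # probable background
        mask[fg_pixels[:, 0], fg_pixels[:, 1]] = cv2.GC_FGD           # definite foreground seeds
        bgd_model = np.zeros((1, 65), np.float64)
        fgd_model = np.zeros((1, 65), np.float64)
        cv2.grabCut(image_bgr, mask, None, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_MASK)
        return np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD)).astype(np.uint8)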
Abstract:
This paper presents practical vision-based collision avoidance for objects approximating a single point feature. Using a spherical camera model, a visual predictive control scheme guides the aircraft around the object along a conical spiral trajectory. Visibility, state and control constraints are considered explicitly in the controller design by combining image and vehicle dynamics in the process model, and solving the nonlinear optimization problem over the resulting state space. Importantly, range is not required. Instead, the principles of conical spiral motion are used to design an objective function that simultaneously guides the aircraft along the avoidance trajectory, whilst providing an indication of the appropriate point to stop the spiral behaviour. Our approach is aimed at providing a potential solution to the See and Avoid problem for unmanned aircraft and is demonstrated through a series.
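A generic visual predictive control problem of the kind described, written here only to fix ideas and not reproducing the paper's actual objective or constraints, takes the form:

    \min_{u_{0},\dots,u_{N-1}} \; \sum_{k=0}^{N-1} \Big( \lVert s_{k} - s_{k}^{\mathrm{ref}} \rVert_{Q}^{2} + \lVert u_{k} \rVert_{R}^{2} \Big)
    \quad \text{subject to} \quad x_{k+1} = f(x_{k}, u_{k}), \;\; s_{k} = h(x_{k}), \;\; s_{k} \in \mathcal{S}, \;\; u_{k} \in \mathcal{U},

where x_k combines vehicle and image dynamics, s_k is the spherical image feature of the obstacle, s_k^{ref} its position on the desired conical-spiral avoidance path, and the sets \mathcal{S} and \mathcal{U} encode visibility and control constraints.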
Abstract:
Teaching introductory programming has challenged educators through the years. Although Intelligent Tutoring Systems that teach programming have been developed to try to reduce the problem, none have been developed to teach web programming. This paper describes the design and evaluation of the PHP Intelligent Tutoring System (PHP ITS), which addresses this problem. The evaluation showed that students who used the PHP ITS achieved a significant improvement in test scores.
Abstract:
Sol-gel synthesis in varied gravity is a relatively new topic in the literature, and further investigation is required to explore its full potential as a method for synthesising novel materials. Although trialled for systems such as silica, the specific application of varied-gravity synthesis to other sol-gel systems such as titanium has not previously been undertaken. Current literature methods for the synthesis of sol-gel materials in reduced gravity could not be applied to titanium sol-gel processing, so a new strategy had to be developed in this study. To successfully conduct experiments in varied gravity, a refined titanium sol-gel chemical precursor had to be developed which allowed the single-solution precursor to remain unreactive at temperatures up to 50 °C and to begin reacting only when exposed to a pressure decrease under vacuum. Because of the new nature of this precursor, a thorough characterisation of the reaction precursors was undertaken using techniques such as nuclear magnetic resonance, infrared and UV-Vis spectroscopy, in order to achieve a sufficient understanding of precursor chemistry and kinetic stability. This understanding was then used to propose gelation reaction mechanisms under varied-gravity conditions. Two reactor systems were designed and built specifically to allow the effects of varied gravity (high, normal, reduced) during the synthesis of titanium sol-gels to be studied. The first system was a centrifuge capable of providing high-gravity environments of up to 70 g for extended periods, whilst applying a 100 mbar vacuum and a temperature of 40-50 °C to the reaction chambers. The second system, used in the QUT Microgravity Drop Tower Facility, was required to provide the same thermal and vacuum conditions as the centrifuge, but had to operate autonomously during free fall. Through post-synthesis characterisation techniques such as Raman spectroscopy, X-ray diffraction (XRD) and N2 adsorption, it was found that increased gravity levels during synthesis had the greatest effect on the final products. Samples produced in reduced and normal gravity appeared to form amorphous gels containing very small particles with moderate surface areas, whereas crystalline anatase (TiO2) was found to form in samples synthesised above 5 g, with significant increases in crystallinity, particle size and surface area observed when samples were produced at gravity levels up to 70 g. It is proposed that for samples produced in higher gravity, an increased concentration gradient of water forms at the bottom of the reacting film due to forced convection. The particles formed in higher gravity diffuse downward towards this excess of water, which favours the condensation reaction of the remaining sol-gel precursors with the particles, promoting increased particle growth. Because downward convection is removed in reduced gravity, particle growth through condensation reactions is physically hindered and hydrolysis reactions are favoured instead. Another significant finding from this work was that anatase could be produced at relatively low temperatures of 40-50 °C, solely through sol-gel synthesis at higher gravity levels, instead of by the conventional method of calcination above 450 °C.
It is hoped that the outcomes of this research will lead to an increased understanding of the effects of gravity on the chemical synthesis of titanium sol-gels, potentially leading to the development of improved products suitable for diverse applications such as semiconductor or catalyst materials, as well as to significant reductions in production and energy costs through manufacturing these materials at much lower temperatures.
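For a sense of the rotation rates involved, a back-of-envelope sketch in Python estimates the spin required for a centrifuge to impose a given g-level at the sample; the 0.25 m arm radius is a hypothetical value, not a specification of the reactor described above.

    import math

    def rpm_for_g_level(g_level, radius_m=0.25, g0=9.81):
        """Rotation rate (rpm) giving centripetal acceleration g_level * g0 at radius_m."""
        omega = math.sqrt(g_level * g0 / radius_m)  # rad/s, from a = omega^2 * r
        return omega * 60.0 / (2.0 * math.pi)

    print(round(rpm_for_g_level(70)))  # roughly 500 rpm at the assumed 0.25 m radius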
Abstract:
Reliability analysis is crucial to reducing unexpected downtime, severe failures and the ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest because hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution accurately describes the population concerned, and that the assumed form of covariate effects on hazards holds. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations that the two assumptions impose on statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue when involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications, and this research therefore investigates the incomplete-covariates problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and their effects on reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainties into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical approaches to handling incomplete covariates. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. Commonly used covariate extrapolation methods are therefore unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure is able to generate more accurate analysis results.
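To indicate the general shape of a neural network hazard model that accepts incomplete covariates, the Python sketch below maps an asset's age, its covariates and 0/1 missingness flags to a non-negative hazard; the architecture, inputs and values are illustrative assumptions and training is omitted, so this is not the thesis' NNHM.

    import numpy as np

    rng = np.random.default_rng(1)

    def hazard_net(age, covariates, missing_mask, hidden=8):
        """One forward pass of a tiny untrained network; inputs include missingness flags."""
        x = np.concatenate(([age], covariates, missing_mask))
        W1 = rng.standard_normal((hidden, x.size)); b1 = np.zeros(hidden)
        W2 = rng.standard_normal(hidden);           b2 = 0.0
        h = np.tanh(W1 @ x + b1)
        return float(np.log1p(np.exp(W2 @ h + b2)))  # softplus keeps the hazard non-negative

    # Hypothetical call: the second covariate is missing (filled with 0 and flagged).
    print(hazard_net(1200.0, np.array([0.7, 0.0]), np.array([0, 1])))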