943 results for high dependancy unit


Relevance: 20.00%

Abstract:

Engineering is pivotal to any country's development, yet in many countries, including Australia, there are insufficient engineers to fill available positions (Engineers Australia, 2008). Engineering education is limited in Australia at the primary, middle and high school levels. One starting point for addressing this shortfall lies in preservice teacher education. This study explores second-year preservice teachers' potential to teach engineering in middle school, following their engagement with engineering concepts in their science curriculum unit and their teaching of engineering activities to Year 7 students. Items in a literature-based pretest-posttest survey were categorised into four constructs (i.e., personal professional attributes, student motivation, pedagogical knowledge and fused curricula). Results indicated that the preservice teachers' responses had not changed for instilling positive attitudes (88%) and accepting advice from colleagues (94%); however, responses to 9 of the 25 survey items changed to a statistically significant degree (p < 0.05) after the preservice teachers' involvement in engineering activities. Fusing engineering education with other subjects, such as mathematics and science, is an essential first step in promoting preservice teachers' potential to implement engineering education in the middle school.
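As a minimal illustration of the pretest-posttest comparison reported above: the abstract does not name the statistical test used, so the paired t-test and the synthetic Likert-style responses below are assumptions, shown only to make the analysis pattern concrete.

```python
# Hedged sketch of a pretest-posttest comparison for one survey item.
# Test choice and 5-point Likert data are illustrative assumptions; the
# study only reports that 9 of 25 items reached p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.integers(2, 5, size=30)                        # responses before
post = np.clip(pre + rng.integers(0, 2, size=30), 1, 5)  # responses after

t, p = stats.ttest_rel(pre, post)  # paired t-test on the same respondents
verdict = "significant at p < 0.05" if p < 0.05 else "not significant"
print(f"t = {t:.2f}, p = {p:.4f} ({verdict})")
```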

Relevance: 20.00%

Abstract:

The degradation of high voltage electrical insulation is a prime factor that can significantly influence both the reliability and the cost of maintaining high voltage electricity networks. Little is known about the mechanism of localized degradation from corona discharges on the relatively new silicone rubber sheathed composite insulators now widely used in high voltage applications. This work draws on the fundamental principles of electrical corona discharge phenomena to provide further insight into where damaging surface discharges may localize, and examines how these discharges may degrade the silicone rubber material. Although water drop corona has been identified by many authors as a major cause of deterioration of silicone rubber high voltage insulation, no thorough studies of this phenomenon have been made until now. Results are compared from systematic measurements, taken with modern digital instrumentation that simultaneously records the discharge current pulses and visible images associated with corona discharges between metal electrodes, between metal electrodes and water drops, and between water drops on the surface of silicone rubber insulation, over a range of 50 Hz voltages. Visual images of wet electrodes show how water drops can help to encourage flashover, and the first reproducible visual images of water drop corona at the triple junction of water, air and silicone rubber insulation are presented. A study of the atomic emission spectra of the corona produced by the discharge, from its onset up to and including spark-over, using a high resolution digital spectrometer with a fiber optic probe, provides further understanding of the roles of the active species of atoms and molecules produced by the discharge; these species may be responsible not only for chemical changes of insulator surfaces but may also contribute to the degradation of the metal fittings that support the high voltage insulators. Examples of real insulators and further work specific to the electrical power industry are discussed. A new design concept to prevent or reduce the damaging effects of water drop corona is also presented.

Relevance: 20.00%

Abstract:

The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville, near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day (Douglas Partners, 2004) of waste water containing high levels of dissolved ions. At present a series of treatment ponds is used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from this. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer comprised of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments, both for environmental monitoring and for the sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed from bore monitoring and rainfall data using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered, with the alluvium represented as a single hydrogeological unit. Both models were calibrated to steady state conditions, and sensitivity analyses of the parameters demonstrated that both models are very stable for changes in the range of ±10% for all parameters, and still reasonably stable for changes up to ±20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site: water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, while the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady state model based on groundwater levels during a drought period; forecasts for varying hydrological conditions (e.g. short-term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken, it will be necessary to establish a regular program of groundwater monitoring and maintain a long-term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by pump testing to investigate hydrogeological properties of the aquifer.
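The ±10%/±20% sensitivity screening reported above follows a generic pattern: perturb each calibrated parameter, re-run the model, and track the RMS error against observed heads. The sketch below uses a hypothetical stand-in for the model run (the study used MODFLOW and PEST under GMS 6.5) and illustrative parameter and head values, so it shows only the procedure.

```python
# Hedged sketch of +/-10% / +/-20% parameter sensitivity screening.
# `simulate_heads` is a toy stand-in for a calibrated groundwater model
# run; all values are illustrative, not the Josephville site data.
import numpy as np

observed = np.array([12.3, 11.8, 11.1, 10.6])  # illustrative bore heads (m)

def simulate_heads(k_aquifer, k_confining):
    # Placeholder physics: heads fall slightly as conductivities rise.
    base = np.array([12.4, 11.9, 11.2, 10.5])
    return base - 0.15 * np.log(k_aquifer) - 0.05 * np.log(k_confining)

def rms_error(simulated, observed):
    return float(np.sqrt(np.mean((simulated - observed) ** 2)))

calibrated = {"k_aquifer": 25.0, "k_confining": 0.05}  # m/day, illustrative

for name in calibrated:
    for frac in (-0.2, -0.1, 0.1, 0.2):
        params = dict(calibrated)
        params[name] *= 1.0 + frac
        rms = rms_error(simulate_heads(**params), observed)
        print(f"{name} {frac:+.0%}: RMS = {rms:.3f} m")
```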

Relevance: 20.00%

Abstract:

The issue of what an effective high quality / high equity education system might look like remains contested. Indeed, there is more educational commentary on systems that do not achieve this goal (see, for example, Luke & Woods, 2009 for a detailed review of the No Child Left Behind policy initiatives put forward in the United States under the Bush Administration) than there is detailed consideration of what such a system might enact and represent. A long-held critique of sociocultural and critical perspectives in education has been their focus on deconstruction to the supposed detriment of reconstructive work. This critique is less warranted in recent times, given work in the field, especially the plethora of qualitative research focusing on case studies of 'best practice'. However, it certainly remains the case that there is more work to be done in investigating the characteristics of a socially just system. This issue of Point and Counterpoint aims to progress such a discussion. Several of the authors call for a reconfiguration of the use of large-scale comparative assessment measures, and all suggest new ways of thinking about quality and equity for school systems. Each of the papers tackles a different aspect of the problem of how to achieve high equity without compromising quality within a large education system. They each take a reconstructive focus, highlighting ways forward for education systems in Australia and beyond. While each paper investigates different aspects of the issue, the clearly stated objective of seeking to delineate and articulate the characteristics of socially just education is consistent throughout the issue.

Relevance: 20.00%

Abstract:

Objective: In most exercise intervention studies, the reported aggregate weight loss is small, and the efficacy of exercise as a weight loss tool remains in question. The aim of the present study was to investigate the variability in appetite and body weight when participants engaged in a supervised and monitored exercise programme. Design: Fifty-eight obese men and women (BMI = 31.8 ± 4.5 kg/m²) were prescribed exercise to expend approximately 2092 kJ (500 kcal) per session, five times a week, at an intensity of 70% of maximum heart rate for 12 weeks under supervised conditions in the research unit. Body weight and composition, total daily energy intake and various health markers were measured at weeks 0, 4, 8 and 12. Results: The mean reduction in body weight (3.2 ± 1.98 kg) was significant (P < 0.001); however, there was large individual variability (−14.7 to +2.7 kg), which could largely be attributed to differences in energy intake over the 12-week intervention. Those participants who failed to lose meaningful weight increased their food intake and reduced their intake of fruits and vegetables. Conclusion: These data demonstrate that even when exercise energy expenditure is high, a healthy diet is still required for weight loss to occur in many people.

Relevance: 20.00%

Abstract:

Purpose: Television viewing time, independently of leisure-time physical activity, has cross-sectional relationships with the metabolic syndrome and its individual components. We examined whether baseline and five-year changes in self-reported television viewing time are associated with changes in continuous biomarkers of cardio-metabolic risk (waist circumference, triglycerides, high-density lipoprotein cholesterol, systolic and diastolic blood pressure, and fasting plasma glucose, plus a clustered cardio-metabolic risk score) in Australian adults. Methods: AusDiab is a prospective, population-based cohort study with biological, behavioral, and demographic measures collected in 1999–2000 and 2004–2005. Non-institutionalized adults aged ≥ 25 years were measured at baseline (11,247; 55% of those completing an initial household interview); 6,400 took part in the five-year follow-up biomedical examination, and 3,846 met the inclusion criteria for this analysis. Multiple linear regression analysis was used, and unstandardized B coefficients (95% CI) are reported. Results: Baseline television viewing time (per 10 hours/week) was not significantly associated with change in any of the biomarkers of cardio-metabolic risk. Increases in television viewing time over five years (per 10 hours/week) were associated with increases in waist circumference (cm) (men: 0.43 (0.08, 0.78), P = 0.02; women: 0.68 (0.30, 1.05), P < 0.001), diastolic blood pressure (mmHg) (women: 0.47 (0.02, 0.92), P = 0.04), and the clustered cardio-metabolic risk score (women: 0.03 (0.01, 0.05), P = 0.007). These associations were independent of baseline television viewing time, baseline and change in physical activity, and other potential confounders. Conclusion: These findings indicate that an increase in television viewing time is associated with adverse cardio-metabolic biomarker changes. Further prospective studies using objective measures of several sedentary behaviors are required to confirm the causality of the associations found.
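The regression analysis reported above can be sketched with statsmodels, which returns exactly the quantities quoted: unstandardized B coefficients and their 95% confidence intervals. The column names and synthetic data below are illustrative assumptions, not AusDiab variables.

```python
# Hedged sketch of multiple linear regression with unstandardized B
# coefficients and 95% CIs; synthetic data, illustrative column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "tv_change": rng.normal(0, 1, n),    # change in TV time (10 h/wk units)
    "tv_baseline": rng.normal(2, 1, n),  # baseline TV time (10 h/wk units)
    "pa_change": rng.normal(0, 1, n),    # change in physical activity
})
# Outcome loosely mimicking the reported waist circumference association
df["waist_change"] = 0.5 * df["tv_change"] + rng.normal(0, 3, n)

model = smf.ols("waist_change ~ tv_change + tv_baseline + pa_change",
                data=df).fit()
print(model.params)      # unstandardized B coefficients
print(model.conf_int())  # 95% confidence intervals
```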

Relevance: 20.00%

Abstract:

To understand the diffusion of high technology products such as PCs, digital cameras and DVD players, it is necessary to consider the dynamics of successive generations of technology. From the consumer's perspective, these technology changes may manifest themselves either as a new generation product substituting for the old (for instance, digital cameras) or as multiple generations of a single product (for example, PCs). To date, research has been confined to aggregate-level sales models. These models consider the demand relationship between one generation of a product and a successor generation, but they do not give insights into the disaggregate-level decisions of individual households: whether to adopt the newer generation, and if so, when. This paper makes two contributions. It is the first large-scale empirical study to collect household data for successive generations of technologies in an effort to understand the drivers of adoption. Second, in contrast to traditional analysis in diffusion research that conceptualizes technology substitution as an "adoption of innovation" type process, we propose that from a consumer's perspective, technology substitution combines elements of both adoption (adopting the new generation technology) and replacement (replacing the generation I product with generation II).

Key Propositions: In some cases, successive generations are clear substitutes for the earlier generation (e.g. PCs: Pentium I to II to III). More commonly, the new generation II technology is a partial substitute for the existing generation I technology (e.g. DVD players and VCRs). Some consumers will purchase generation II products as substitutes for their generation I product, while other consumers will purchase generation II products as additional products to be used alongside their generation I product. We propose that substitute generation II purchases combine elements of both adoption and replacement, whereas additional generation II purchases are a solely adoption-driven process. Moreover, drawing on adoption theory, consumer innovativeness is the most important consumer characteristic for the adoption timing of new products. Hence, we hypothesize that consumer innovativeness influences the timing of both additional and substitute generation II purchases, with a stronger impact on additional generation II purchases. We further propose that substitute generation II purchases act partially as a replacement purchase for the generation I product; thus, we hypothesize that households with older generation I products will make substitute generation II purchases earlier.

Methods: We employ Cox hazard modeling (sketched below) to study factors influencing the timing of a household's adoption of generation II products; a separate hazard model is estimated for additional and for substitute purchases. The age of the generation I product is calculated from the most recent household purchase of that product. Control variables include the size and income of the household and the age and education of the decision-maker.

Results and Implications: Our preliminary results confirm both our hypotheses. Consumer innovativeness has a strong influence on both additional purchases and substitute purchases. Also consistent with our hypotheses, the age of the generation I product has a dramatic influence on substitute purchases of VCRs/DVD players and a strong influence for PCs/notebooks; yet, as hypothesized, it has no influence on additional purchases. This implies that there is a clear distinction between additional and substitute purchases of generation II products, each with different drivers. For substitute purchases, product age is a key driver. Marketers of high technology products can therefore utilize data on generation I product age (e.g. from warranty or loyalty programs) to target customers who are more likely to make a purchase.
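A minimal sketch of the hazard modeling setup described in the Methods paragraph above, using the lifelines library's Cox proportional hazards fitter. The column names and synthetic household data are illustrative stand-ins, not the authors' survey variables, and only one of the two models (the substitute-purchase model) is shown.

```python
# Hedged sketch of a Cox hazard model for generation II adoption timing.
# Synthetic data with illustrative column names; not the study's dataset.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "months_to_gen2": rng.exponential(24, n),  # time until gen II purchase
    "purchased": rng.integers(0, 2, n),        # 1 = purchase observed
    "innovativeness": rng.normal(0, 1, n),     # consumer trait score
    "gen1_age_years": rng.uniform(0, 8, n),    # age of the gen I product
    "household_income": rng.normal(0, 1, n),   # control variable
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_gen2", event_col="purchased")
cph.print_summary()  # hazard ratios for innovativeness, product age, etc.
```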

Relevance: 20.00%

Abstract:

The Restrung New Chamber Festival was a practice-led research project which explored the intricacies of musical relationships: specifically, the relationships between new music ensembles and pop-oriented bands inspired by the new music genre. The festival, held at the Brisbane Powerhouse (28 February – 2 March 2009), comprised 17 diverse groups including the Brodsky Quartet, Topology, Wood, Fourplay and CODA. Restrung used a new and distinctive model which presented new music and syncretic musical genres within an immersive environment, bringing together approaches used in both contemporary classical and popular music festivals and using musical, visual and spatial aspects to engage audiences. Interactivity was encouraged through video and sound installations, workshops and forums. This paper investigates some of the issues surrounding the conception and design of the Restrung model, within the context of an overview of European new music trends. It includes a discussion of curating such an event in a musically sensitive and effective way, and of approaches to identifying new and receptive audiences. As a guide to programming Restrung, I formulated a working definition of new music, further developed through interviews with specialists in Australia and Europe, which is outlined below.

Relevance: 20.00%

Abstract:

The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs when applying importance sampling to high-dimensional problems. The precision of the computed estimate in this simplified setting is measured by the asymptotic variance of the estimate, under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension and show that the optimal covariance matrix for the importance function can be estimated in special cases.
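The dimensional effect described above is easy to reproduce numerically. The sketch below is illustrative rather than the paper's setting: a standard normal target, a slightly wider normal proposal with the same per-coordinate mismatch in every dimension, and the effective sample size of the importance weights as a proxy for the growing weight variance.

```python
# Sketch: importance-sampling weight degeneracy grows with dimension.
# Target N(0, I_d); proposal N(0, sigma^2 I_d) with sigma != 1. The
# relative effective sample size (ESS) collapses as d increases.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sigma = 1.5  # per-coordinate proposal std (mismatch with the target's 1.0)

for d in (1, 5, 10, 20):
    x = rng.normal(0.0, sigma, size=(100_000, d))   # proposal draws
    log_w = (stats.norm.logpdf(x).sum(axis=1)
             - stats.norm.logpdf(x, scale=sigma).sum(axis=1))
    w = np.exp(log_w - log_w.max())                 # stabilized weights
    ess = w.sum() ** 2 / (w ** 2).sum()
    print(f"d = {d:2d}: relative ESS = {ess / len(w):.4f}")
```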

Relevance: 20.00%

Abstract:

While increasing numbers of young high school students engage in part-time work, there is no consensus about its impact on educational outcomes; indeed, the field has seen a dearth of research. The present paper reviews recent research, primarily from Australia and the US, while acknowledging that there are considerable contextual differences between the two. Suggestions are presented for how school counsellors can harness students' work experiences to assist in educational and career decision-making.

Relevance: 20.00%

Abstract:

Objectives. To evaluate the performance of the dynamic-area high-speed videokeratoscopy technique in the assessment of tear film surface quality, with and without soft contact lenses on the eye. Methods. Retrospective data from a tear film study using basic high-speed videokeratoscopy, captured at 25 frames per second (Kopf et al., 2008, J Optom), were used. Eleven subjects had tear film analysis conducted in the morning, at midday and in the evening on the first and seventh day of one week of no lens wear. Five of the eleven subjects then completed an extra week of hydrogel lens wear followed by a week of silicone hydrogel lens wear. Analysis was performed on a 6-second period of the inter-blink recording. The dynamic-area high-speed videokeratoscopy technique uses the maximum available area of the Placido ring pattern reflected from the tear interface and eliminates regions of disturbance due to shadows from the eyelashes. A value of tear film surface quality was derived using image processing techniques, based on the quality of the reflected ring pattern orientation (a sketch of one such measure is given below). Results. The group mean tear film surface quality and the standard deviations for each of the conditions (bare eye, hydrogel lens, and silicone hydrogel lens) showed a much lower coefficient of variation than previous methods (an average reduction of about 92%). Bare-eye measurements from the right and left eyes of the eleven individuals were highly correlated (Pearson's r = 0.73, p < 0.05). Repeated measures ANOVA across the 6-second measurement period in the normal inter-blink period for the bare-eye condition showed no statistically significant changes. However, across the 6-second inter-blink period with both contact lenses, statistically significant changes were observed (p < 0.001) for both types of contact lens material. Overall, wearing hydrogel and silicone hydrogel lenses caused the tear film surface quality to worsen compared with the bare-eye condition (repeated measures ANOVA, p < 0.0001 for both hydrogel and silicone hydrogel). Conclusions. The results suggest that the dynamic-area method of high-speed videokeratoscopy was able to distinguish and quantify the subtle but systematic worsening of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between the bare-eye and contact lens wearing conditions.
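The exact image processing pipeline is not given in the abstract; the sketch below shows one standard way to score the local orientation regularity of a reflected ring pattern, via structure-tensor coherence, purely as an illustration of the kind of measure described.

```python
# Hedged sketch: orientation coherence of a ring pattern as a TFSQ proxy.
# Structure-tensor coherence is a standard measure; it is an assumption
# here, not the authors' published algorithm.
import numpy as np
from scipy import ndimage

def ring_pattern_quality(image, sigma=3.0):
    """Mean gradient-orientation coherence (0..1) over the image."""
    gy, gx = np.gradient(image.astype(float))
    jxx = ndimage.gaussian_filter(gx * gx, sigma)  # smoothed structure
    jyy = ndimage.gaussian_filter(gy * gy, sigma)  # tensor components
    jxy = ndimage.gaussian_filter(gx * gy, sigma)
    # Coherence = (l1 - l2) / (l1 + l2) for the tensor eigenvalues l1 >= l2
    coherence = np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2) / (jxx + jyy + 1e-12)
    return float(coherence.mean())

# Synthetic concentric rings stand in for a videokeratoscopy frame
y, x = np.mgrid[-128:128, -128:128]
rings = np.sin(np.hypot(x, y) / 4.0)
noisy = rings + np.random.default_rng(4).normal(0, 0.8, rings.shape)
print(f"clean rings:    {ring_pattern_quality(rings):.3f}")
print(f"degraded rings: {ring_pattern_quality(noisy):.3f}")
```

A well-ordered pattern scores near 1; disruptions that bend or blur the rings pull the mean coherence down, matching the intuition of a surface quality index.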

Relevance: 20.00%

Abstract:

A new method for noninvasive assessment of tear film surface quality (TFSQ) is proposed. The method is based on high-speed videokeratoscopy in which the corneal area used for the analysis is dynamically estimated in a manner that removes the videokeratoscopy interference caused by shadows from the eyelashes but retains the interference related to the poor quality of the precorneal tear film, which is of interest. The separation between these two types of seemingly similar videokeratoscopy interference is achieved by region-based classification: the overall noise is first separated from the useful signal (the unaltered videokeratoscopy pattern), and a dedicated interference classification algorithm then distinguishes between the two considered interferences (a schematic sketch follows below). The proposed technique provides a much wider corneal area for the analysis of TFSQ than previously reported techniques. A preliminary study with the proposed technique, carried out for a range of anterior eye conditions, showed effective noise-to-signal separation and interference classification, as well as consistent TFSQ results. The method subsequently proved able not only to discriminate between the bare-eye and lens-on-eye conditions but also showed the potential to discriminate between the two types of contact lenses.
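The two-stage separation described above can be sketched schematically. Everything below is a hypothetical reconstruction: the local-contrast signal test, the brightness feature and both thresholds are invented placeholders standing in for the published classification algorithm.

```python
# Illustrative two-stage region-based interference classification.
# Features and thresholds are hypothetical placeholders.
import numpy as np
from scipy import ndimage

def classify_interference(image, signal_thresh=0.15, shadow_level=0.2):
    # Stage 1: flag pixels whose local contrast is too low to be a
    # valid ring pattern (noise vs useful-signal separation).
    local_std = ndimage.generic_filter(image, np.std, size=9)
    noise = local_std < signal_thresh
    # Stage 2: classify each connected noise region; here eyelash
    # shadows are assumed dark, tear-film disruptions mid-brightness.
    labels, n = ndimage.label(noise)
    eyelash = np.zeros_like(noise)
    tear_film = np.zeros_like(noise)
    for region in range(1, n + 1):
        sel = labels == region
        if image[sel].mean() < shadow_level:
            eyelash |= sel      # excluded from the analysis area
        else:
            tear_film |= sel    # retained: the disturbance of interest
    return eyelash, tear_film
```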

Relevance: 20.00%

Abstract:

High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this technique produces a very large amount of digital data, for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions traditionally used for modeling corneal surfaces may not correctly represent given corneal surface data in terms of optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the RMS surface error and the cross-correlation of the point spread function. The parameters of the approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform traditional Zernike polynomial approximations with the same number of coefficients.
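A minimal sketch of the fitting idea, under stated assumptions: a simple radial polynomial stands in for the Zernike basis, the surface is a synthetic radial profile, and scipy's Levenberg-Marquardt solver estimates the rational-function coefficients. A same-size polynomial fit is shown for comparison; because this synthetic target is itself rational, the rational model should reach the noise floor where the polynomial cannot.

```python
# Hedged sketch: rational-function surface fit via Levenberg-Marquardt.
# A plain radial polynomial replaces the Zernike basis, and the data are
# synthetic; this illustrates the procedure, not the authors' code.
import numpy as np
from scipy.optimize import least_squares

def poly(r, coeffs, start):
    return sum(c * r ** (start + k) for k, c in enumerate(coeffs))

def rational_surface(params, r, n_num):
    a, b = params[:n_num], params[n_num:]
    # Leading 1 in the denominator keeps the model identifiable
    return poly(r, a, start=0) / (1.0 + poly(r, b, start=1))

def residuals(params, r, z, n_num):
    return rational_surface(params, r, n_num) - z

rng = np.random.default_rng(5)
r = np.linspace(0.0, 1.0, 200)
z = 1.0 / (1.0 + 25.0 * r ** 2) + rng.normal(0, 1e-3, r.size)  # toy surface

n_num, n_den = 4, 3
x0 = np.r_[z[0], np.zeros(n_num + n_den - 1)]  # start from a flat surface
fit = least_squares(residuals, x0, args=(r, z, n_num), method="lm")
rms_rational = np.sqrt(np.mean(fit.fun ** 2))

p = np.polyfit(r, z, deg=n_num + n_den - 1)    # same coefficient count
rms_poly = np.sqrt(np.mean((np.polyval(p, r) - z) ** 2))
print(f"RMS error -- rational: {rms_rational:.2e}, polynomial: {rms_poly:.2e}")
```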