792 results for Potentially mineralizable N


Relevance: 10.00%

Abstract:

Credentials are a salient form of cultural capital, and if a student’s learning and productions are not assessed, they are invisible in current social systems of education and employment. In this field, invisible equals non-existent. This paper arises from the context of an alternative education institution where conventional educational assessment techniques currently fail to recognise the creativity and skills of a cohort of marginalised young people. In order to facilitate a new assessment model, an electronic portfolio system (EPS) is being developed and trialled to capture evidence of students’ learning and their productions. In so doing, a dynamic system of arranging, exhibiting, exploiting and disseminating assessment data in the form of coherent, meaningful and valuable reports will be maintained. The paper investigates the notion of assessing the development of creative thinking and skills through a computerised system that operates in an area described as the efield. A model of the efield is delineated and explained as a zone existing within the internet where free users exploit the cloud and cultivate social and cultural capital. Drawing largely on sociocultural theory and Bourdieu’s concepts of field, habitus and capitals, the article positions the efield as a potentially productive instrument in assessment for learning practices. An important aspect of the dynamics of this instrument is the recognition of teachers as learners. This is seen as an integral factor in the sociocultural approach to assessment for learning practices that will be deployed with the EPS. What actually takes place is argued to be assessment for learning as a field of exchange. The model produced in this research is aimed at delivering visibility and recognition through an engaging instrument that will enhance the prospects of marginalised young people and shift the paradigm for assessment in a creative world.

Relevance: 10.00%

Abstract:

Determining the ecologically relevant spatial scales for predicting species occurrences is important when characterising species–environment relationships. Species distribution modelling should therefore consider all ecologically relevant spatial scales. While several recent studies have addressed this problem in artificially fragmented landscapes, few have researched relevant ecological scales for organisms that live in naturally fragmented landscapes. This situation is exemplified by the Australian rock-wallabies’ preference for rugged terrain, and we addressed the issue of scale using the threatened brush-tailed rock-wallaby (Petrogale penicillata) in eastern Australia. We surveyed for brush-tailed rock-wallabies at 200 sites in southeast Queensland, collecting potentially influential site-level and landscape-level variables. We applied classification trees at each scale to capture a hierarchy of relationships between the explanatory variables and brush-tailed rock-wallaby presence/absence. Habitat complexity at the site level and geology at the landscape level were the best predictors of where we observed brush-tailed rock-wallabies. Our study showed that the distribution of the species is affected by both site-scale and landscape-scale factors, reinforcing the need for a multi-scale approach to understanding the relationship between a species and its environment. We demonstrate that careful design of data collection, using coarse-scale spatial datasets and finer-scale field data, can provide useful information for identifying the ecologically relevant scales for studying species–environment relationships. Our study highlights the need to determine patterns of environmental influence at multiple scales to conserve specialist species such as the brush-tailed rock-wallaby in naturally fragmented landscapes.
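The core move of the classification-tree analysis described above can be sketched minimally: pick the predictor and threshold that best separate presence from absence, here by Gini impurity. The variables and data below are invented illustrations, not the study's dataset or its actual implementation.

```python
# Toy single-split ("stump") step of a classification tree for
# presence/absence data. Variable names and values are hypothetical.

def gini(labels):
    """Gini impurity of a list of 0/1 presence labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(rows, labels):
    """Return (feature index, threshold) minimising weighted Gini impurity."""
    n = len(rows)
    best = (None, None, float("inf"))
    for j in range(len(rows[0])):
        for t in sorted({r[j] for r in rows}):
            left = [labels[i] for i in range(n) if rows[i][j] <= t]
            right = [labels[i] for i in range(n) if rows[i][j] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best[0], best[1]

# Columns: habitat complexity score (site level), igneous geology flag (landscape level).
sites = [(5, 1), (4, 1), (5, 0), (2, 0), (1, 0), (2, 1), (4, 1), (1, 0)]
present = [1, 1, 1, 0, 0, 0, 1, 0]

feature, threshold = best_split(sites, present)
print(feature, threshold)  # → 0 2 (splits on habitat complexity <= 2)
```

A full tree simply applies this split search recursively to each resulting subset, which is how a hierarchy of site-level and landscape-level effects emerges.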

Relevance: 10.00%

Abstract:

Many jurisdictions have developed mature infrastructures, both administratively and legislatively, to promote competition. Substantial funds have been expended to monitor activities that are anticompetitive, and many jurisdictions have also adopted a form of "Cartel Leniency Program", first developed by the US Federal Trade Commission, to assist in cartel detection. Further, some jurisdictions are now criminalizing cartel behaviour so that cartel participants can be held criminally liable, with substantial custodial penalties imposed. Notwithstanding these multijurisdictional approaches, a new form of possibly anticompetitive behaviour is looming. Synergistic monopolies ("synopolies") involve not competitors within a horizontal market but complementors within separate vertical markets. Where two complementary corporations are monopolists in their own markets they can, through various technologies, assist each other to expand their respective monopolies, thus creating a barrier to new entrants and/or blocking existing participants from further participation in that market. The nature of the technologies involved means that it is easy for this potentially anti-competitive activity to enter and affect the global marketplace. Competition regulators need to be aware of this potential for abuse and ensure that their respective competition frameworks appropriately address this activity. This paper discusses how new technologies can be used to create a synopoly.

Relevance: 10.00%

Abstract:

Adolescents experience many benefits from bicycling; however, there are also potentially significant injury consequences. One effective countermeasure for preventing adolescent bicycling injuries is to promote bicycle helmet wearing. An overview is provided of the injury risks of bicycle riding, with particular attention to the role of helmet wearing and associated countermeasures such as legislation and school and community approaches. The findings are presented of an Australian study that examined the effectiveness of a theory-based injury prevention program, Skills for Preventing Injury in Youth (SPIY), for ninth-grade students (aged 13 to 14 years). The findings showed a significant 20.2% decrease in cycling without a helmet among the intervention students (n = 360) and no change among the comparison group (n = 363) after 6 months. In addition, failing to wear a helmet was significantly associated with engaging in other transport-related risks, being male, having friends who do not wear a helmet (friends' behaviour being a specific target of change in the SPIY program), showing a negative attitude toward risk, failing to intervene in friends' risk-taking, and having low knowledge of first aid. Overall, the SPIY program appeared to be an effective theory-based intervention to increase helmet wearing among early adolescents, a group not often targeted in school and community helmet-wearing programs.

Relevance: 10.00%

Abstract:

Games and related virtual environments have been a much-hyped area of the entertainment industry. The classic quote is that games are now approaching the size of Hollywood box office sales [1]. Books are now appearing that talk up the influence of games on business [2], and games are one of the key drivers of present hardware development. Some of this 3D technology is now embedded right down at the operating system level via the Windows Presentation Foundation – hit Windows/Tab on your Vista box to find out... In addition to this continued growth in the area of games, there are a number of factors that impact its development in the business community. Firstly, the average age of gamers is approaching the mid-thirties, so a number of people in management positions in large enterprises are experienced in using 3D entertainment environments. Secondly, driven by the demand for more computational power in both CPUs and Graphical Processing Units (GPUs), the average desktop, and indeed any decent laptop, can run a game or virtual environment. In fact, the demonstrations at the end of this paper were developed at the Queensland University of Technology (QUT) on a standard Software Operating Environment, with an Intel Dual Core CPU and a basic Intel graphics option. What this means is that the potential exists for the easy uptake of such technology because: 1. a broad range of workers is regularly exposed to 3D virtual environment software via games; and 2. present desktop computing power is now strong enough to roll out a virtual environment solution across an entire enterprise. We believe such visual simulation environments can have a great impact in the area of business process modeling.
Accordingly, in this article we outline the communication capabilities of such environments, which offer fantastic possibilities for business process modeling applications, where enterprises need to create, manage, and improve their business processes, and then communicate those processes to stakeholders, both process-cognizant and non-process-cognizant. The article concludes with a demonstration of the work we are doing in this area at QUT.

Relevance: 10.00%

Abstract:

It has been established that mixed venous oxygen saturation (SvO2) reflects the balance between systemic oxygen delivery and consumption. The literature indicates that it is a valuable clinical indicator with good prognostic value early in the patient's course. This article aims to establish the usefulness of SvO2 as a clinical indicator. A secondary aim was to determine whether central venous oxygen saturation (ScvO2) and SvO2 are interchangeable. Of particular relevance to cardiac nurses is the link between decreased SvO2 and cardiac failure in patients with myocardial infarction, and with decline in myocardial function, clinical shock and arrhythmias. While absolute values of ScvO2 and SvO2 are not interchangeable, ScvO2 and SvO2 are equivalent in terms of clinical course. Additionally, ScvO2 monitoring is a safer and less costly alternative to SvO2 monitoring. It can be concluded that continuous ScvO2 monitoring should potentially be undertaken in patients at risk of haemodynamic instability.

Relevance: 10.00%

Abstract:

Background: There is a sound rationale for the population-based approach to falls injury prevention, but there is currently insufficient evidence to advise governments and communities on how they can use population-based strategies to achieve desired reductions in the burden of falls-related injury.---------- Aim: To quantify the effectiveness of a streamlined (and thus potentially sustainable and cost-effective), population-based, multi-factorial falls injury prevention program for people over 60 years of age.---------- Methods: Population-based falls-prevention interventions were conducted at two geographically defined and separate Australian sites: Wide Bay, Queensland, and Northern Rivers, NSW. Changes in the prevalence of key risk factors and changes in rates of injury outcomes within each community were compared before and after program implementation, and changes in rates of injury outcomes in each community were also compared with the rates in their respective States.---------- Results: The interventions did not substantially decrease the rate of falls-related injury among people aged 60 years or older in either community, although there was some evidence of reductions in the occurrence of multiple falls reported by women. In addition, there was some indication of improvements in fall-related risk factors, but the magnitudes were generally modest.---------- Conclusion: The evidence suggests that low-intensity population-based falls prevention programs may not be as effective as those that are intensively implemented.

Relevance: 10.00%

Abstract:

Road curves are an important feature of road infrastructure, and many serious crashes occur on them. In Queensland, the number of fatalities on curves is twice that on straight roads. Therefore, there is a need to reduce drivers’ exposure to crash risk on road curves. Road crash rates in Australia and across the Organisation for Economic Co-operation and Development (OECD) have plateaued in the last five years (2004 to 2008), and the road safety community is desperately seeking innovative interventions to reduce the number of crashes. However, designing an innovative and effective intervention may prove difficult, as it relies on providing theoretical foundation, coherence, understanding, and structure to both the design and the validation of the efficiency of the new intervention. Researchers from multiple disciplines have developed various models to determine the contributing factors for crashes on road curves with a view to reducing the crash rate. However, most existing methods are based on statistical analysis of contributing factors described in government crash reports. In order to further explore the contributing factors related to crashes on road curves, this thesis designs a novel method to analyse and validate them. The use of crash claim reports from an insurance company is proposed for analysis using data mining techniques. To the best of our knowledge, this is the first attempt to use data mining techniques to analyse crashes on road curves. Text mining is employed because the reports consist of thousands of textual descriptions, from which it is able to identify the contributing factors. Beyond identifying the contributing factors, few studies to date have investigated the relationships between these factors, especially for crashes on road curves. Thus, this study proposes the use of the rough set analysis technique to determine these relationships.
The results from this analysis are used to assess the effect of these contributing factors on crash severity. The findings obtained through the use of data mining techniques presented in this thesis have been found to be consistent with existing identified contributing factors. Furthermore, this thesis has identified new contributing factors to crashes and the relationships between them. A significant pattern related to crash severity is the time of day: severe road crashes occur more frequently in the evening or at night. Tree collision is another common pattern: crashes that occur in the morning and involve hitting a tree are likely to have a higher severity. Another factor that influences crash severity is the age of the driver. Most age groups face a high crash severity, except for drivers between 60 and 100 years old, who have the lowest. The significant relationship identified between contributing factors involves the time of the crash, the year of manufacture of the vehicle, the age of the driver, and hitting a tree. Having identified new contributing factors and relationships, a validation process is carried out using a traffic simulator in order to determine their accuracy. The validation process indicates that the results are accurate. This demonstrates that data mining techniques are a powerful tool in road safety research and can be usefully applied within the Intelligent Transport System (ITS) domain. The research presented in this thesis provides an insight into the complexity of crashes on road curves. The findings of this research have important implications for both practitioners and academics. For road safety practitioners, the results illustrate practical benefits for the design of interventions for road curves that will potentially help decrease related injuries and fatalities.
For academics, this research opens up a new methodology for assessing the severity of crashes on road curves.
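The text-mining step described above, extracting candidate contributing factors and their co-occurrences from free-text claim descriptions, can be sketched in a few lines. The reports and the factor vocabulary here are invented examples, not the insurance data or the technique actually used in the thesis.

```python
# Toy factor extraction and co-occurrence counting over free-text crash
# descriptions. Vocabulary and reports are hypothetical illustrations.
from collections import Counter
from itertools import combinations

FACTORS = {"wet", "night", "speed", "tree", "curve", "fatigue"}

reports = [
    "Vehicle left the road on a wet curve at night and hit a tree",
    "Driver lost control on curve, excess speed suspected",
    "Fatigue related run-off on a curve at night",
]

factor_counts = Counter()
pair_counts = Counter()
for text in reports:
    # Crude tokenisation: lowercase, strip trailing punctuation.
    found = sorted(FACTORS & {w.strip(",.").lower() for w in text.split()})
    factor_counts.update(found)
    pair_counts.update(combinations(found, 2))

print(factor_counts.most_common(3))
print(pair_counts.most_common(2))
```

The pair counts are the raw material for relationship analysis; rough set analysis, as proposed in the thesis, would then look for minimal combinations of such factors that discriminate severe from non-severe crashes.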

Relevance: 10.00%

Abstract:

Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for developing more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or breakouts of new automated malicious codes, such as worms. 
The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes through grouping activities that exhibit similar packet inter-arrival time distributions, using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components’ residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
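The residual-space idea in the list above can be sketched concretely: model "normal" traffic features with the leading principal components, then flag a new observation whose square prediction error (SPE) in the residual space is unusually large. The features, component count, and threshold choice below are illustrative assumptions, not the thesis's actual implementation.

```python
# PCA residual-space anomaly detection sketch with synthetic traffic features.
import numpy as np

rng = np.random.default_rng(1)

# Training matrix: rows = time windows, columns = traffic features
# (e.g. packet counts, distinct ports, mean inter-arrival time).
X = rng.normal(size=(500, 5)) @ np.diag([3.0, 2.0, 0.3, 0.2, 0.1])

mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2].T                      # retain 2 principal components

def spe(x):
    """Square prediction error of x in the PCA residual space."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return float(r @ r)

# Simple empirical threshold from the training data.
threshold = np.percentile([spe(row) for row in X], 99)

normal = mu + P @ np.array([1.0, -0.5])          # lies in the model plane
attack = mu + np.array([0, 0, 8.0, 8.0, 8.0])    # energy in residual space
print(spe(normal) < threshold, spe(attack) > threshold)  # → True True
```

A recursive variant, as listed among the outcomes, would update `mu` and `P` incrementally as new windows arrive instead of refitting the SVD from scratch.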

Relevance: 10.00%

Abstract:

Building construction is a highly competitive and risky business. This competitiveness is compounded where conflicting objectives amongst contracting and subcontracting firms set the stage for an adversarial and potentially destructive approach. There is a need for change in the construction industry: from a confrontationist and adversarial attitude to a harmonious relationship, and towards a more cooperative approach that builds mutual trust, respect and good faith. It is necessary to change the culture to create a win-win situation. "Strategic Alliances" is one such concept. A strategic alliance is a cooperative arrangement between two or more organisations that forms part of their overall strategies and contributes to achieving their major goals and objectives. This paper begins with an overview of the Australian building construction industry, then reviews the literature and describes an analysis framework comprising six attributes of strategic alliances: trust, commitment, interdependence, cooperation, communication, and joint problem solving. Given the trend towards greater emphasis on broader contracting firm performance criteria, indicators are proposed as a component of the tender evaluation process for public works.

Relevance: 10.00%

Abstract:

Timberland is seen as a long-term investment which has recently received increased institutional investor attention in many countries and potentially provides added value in a mixed-asset portfolio. Using the National Council of Real Estate Investment Fiduciaries (NCREIF) timberland series, this paper analyses the risk-adjusted performance and portfolio diversification benefits of timberland in the United States over the period 1987-2007. U.S. timberland is seen to have been a strongly performing asset class with significant portfolio diversification benefits over this period, with a significant portfolio role separate from that of real estate. However, recent years have seen reduced risk-adjusted returns, with some loss of the portfolio diversification benefits of timberland with stocks and real estate. Global drivers are likely to see increased future demand for timberland investment.
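"Risk-adjusted performance" in analyses like the one above is commonly summarised by a Sharpe ratio: mean excess return over its volatility. The quarterly return series below is purely illustrative, not the NCREIF timberland data.

```python
# Annualised Sharpe ratio from per-period returns; sample figures are invented.
import statistics

def sharpe(returns, risk_free, periods_per_year=4):
    """Annualised Sharpe ratio from per-period total and risk-free returns."""
    excess = [r - rf for r, rf in zip(returns, risk_free)]
    mean = statistics.mean(excess)
    sd = statistics.stdev(excess)           # sample standard deviation
    return (mean / sd) * periods_per_year ** 0.5

timberland_q = [0.035, 0.021, 0.048, 0.012, 0.029, 0.040, 0.018, 0.033]
tbill_q = [0.012] * 8                       # flat risk-free rate, for simplicity
print(round(sharpe(timberland_q, tbill_q), 2))  # → 2.92
```

Comparing such ratios across timberland, stocks and real estate, together with return correlations, is what underpins the diversification claims in the abstract.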

Relevance: 10.00%

Abstract:

Location based games (LBGs) provide an opportunity to look at how new technologies can support a reciprocal relationship between formal classroom learning and the learning that can potentially occur in other everyday environments. Fundamentally, many games are intensely engaging due to the social interactions and technical challenges they provide to individual and group players. By introducing the use of mobile devices, we can transport these characteristics of games into everyday spaces. LBGs are understood as a broad genre incorporating ideas and tools that provide many unique opportunities for us to reveal, create and even subvert various social, cultural, technical, and scientific interpretations of place, in particular places where learning is sometimes problematic.--------- A team of Queensland game developers has learnt a great deal through designing a range of LBGs, such as SCOOT, for various user groups and places. While these LBGs were primarily designed as social events, we found that the players recognised and valued the game as an opportunity to learn about their environment: its history, cultural significance, inhabitants, services, etc. Since identifying the strong pedagogical outcomes of LBGs, the team has created a set of authoring tools for people to design and host their own LBGs. A particular version of this is known as MiLK, the mobile learning kit for schools.---------- This presentation will include examples of how LBGs have been used to improve teaching and learning outcomes in various contexts. Participants will be introduced to MiLK and invited to trial it in their own classrooms with students.

Relevance: 10.00%

Abstract:

The little grey cat engine (greyCat) is part of a series of projects exploring software that can enable access to the potentially empowering nature of represented space and game design. GreyCat is the result of research into the culture of the software itself, undertaken in order to provide participatory environments which enable the telling of ‘small stories’ – stories and experiences of the everyday, or of a cultural perspective other than that prioritised by most world-building software or game engines. GreyCat offers a simple framework which allows participants to use their own image materials (photographs for the most part) as a basis for spatial exploration of their own places.

Truna aka j.turner (2008) The little grey cat engine: telling small stories (Demo), Australasian Computer Human Interaction Conference, OZCHI 2008, December 8th-12th, Cairns, Australia

Research publications:
Truna aka j.turner & Browning, D. (2009) Designing spatial story telling software, in proceedings OZCHI09, Melbourne
Truna aka j.turner, Browning, D. & Champion, E. (2008) Designing for Engaged Experience, in proceedings Australasian Computer Human Interaction Conference, OZCHI 2008, December 8th-12th, Cairns, Australia
Truna aka j.turner & Bidwell, N. (2007) Through the looking glass: game worlds as representations and views from elsewhere, in proceedings of the 4th Australasian Conference on Interactive Entertainment, Melbourne, Australia
Truna aka j.turner, Browning, D. & Bidwell, N. (2007) Wanderer beyond game worlds, in proceedings, Hutchinson, A (ed) PerthDAC 2007: The Seventh International Digital Arts and Culture Conference: The future of digital media culture, 15-18 September 2007, Perth, Australia, Curtin University of Technology
Truna aka j.turner (2006) To explore strange new worlds: experience design in 3 dimensional immersive environments - role and place in a world as object of interaction, in proceedings, Australasian Computer Human Interaction Conference, OZCHI 2006, November 20th-24th 2006, Sydney, Australia, pp 26-29
Truna aka j.turner (2006) Digital songlines environment (Demonstration), in proceedings 2006 International Conference on Game Research and Development, Perth, Australia
Truna aka j.turner (2006) Destination Space: Experiential Spatiality and Stories, Special Session on Experiential Spatiality, in proceedings 2006 International Conference on Game Research and Development, Perth, Australia

Relevance: 10.00%

Abstract:

An Approach with Vertical Guidance (APV) is an instrument approach procedure which provides horizontal and vertical guidance to a pilot on approach to landing in reduced visibility conditions. APV approaches can greatly reduce the safety risk to general aviation by improving the pilot’s situational awareness. In particular, the incidence of Controlled Flight Into Terrain (CFIT), which has occurred in a number of fatal air crashes in general aviation over the past decade in Australia, can be reduced. APV approaches can also improve general aviation operations. If implemented at Australian airports, APV approach procedures are expected to bring a cost saving of millions of dollars to the economy due to fewer missed approaches, diversions and an increased safety benefit. The provision of accurate horizontal and vertical guidance is achievable using the Global Positioning System (GPS). Because aviation is a safety-of-life application, an aviation-certified GPS receiver must have integrity monitoring or augmentation to ensure that its navigation solution can be trusted. However, the difficulty of the current GPS satellite constellation alone meeting APV integrity requirements, the susceptibility of GPS to jamming or interference, and the potential shortcomings of proposed augmentation solutions for Australia such as the Ground-based Regional Augmentation System (GRAS) justify the investigation of Aircraft Based Augmentation Systems (ABAS) as an alternative integrity solution for general aviation. ABAS augments GPS with other sensors at the aircraft to help it meet the integrity requirements. Typical ABAS designs assume high-quality inertial sensors to provide an accurate reference trajectory for Kalman filters. Unfortunately, high-quality inertial sensors are too expensive for general aviation.
In contrast to these approaches, the purpose of this research is to investigate fusing GPS with lower-cost Micro-Electro-Mechanical System (MEMS) Inertial Measurement Units (IMU) and a mathematical model of aircraft dynamics, referred to as an Aircraft Dynamic Model (ADM) in this thesis. The use of a model of aircraft dynamics in navigation systems has been studied before in the available literature and shown to be useful, particularly for aiding inertial coasting or attitude determination. In contrast to these applications, this thesis investigates its use in ABAS. This thesis presents an ABAS architecture concept which makes use of a MEMS IMU and ADM, named the General Aviation GPS Integrity System (GAGIS) for convenience. GAGIS includes a GPS, MEMS IMU, ADM and a bank of Extended Kalman Filters (EKF), and uses the Normalized Solution Separation (NSS) method for fault detection. The GPS, IMU and ADM information is fused together in a tightly-coupled configuration, with frequent GPS updates applied to correct the IMU and ADM. The use of both IMU and ADM allows for a number of different possible configurations. Three are investigated in this thesis: a GPS-IMU EKF, a GPS-ADM EKF and a GPS-IMU-ADM EKF. The integrity monitoring performance of the GPS-IMU EKF, GPS-ADM EKF and GPS-IMU-ADM EKF architectures is compared against each other and against a stand-alone GPS architecture in a series of computer simulation tests of an APV approach. Typical GPS, IMU, ADM and environmental errors are simulated. The simulation results show the GPS integrity monitoring performance achievable by augmenting GPS with an ADM and low-cost IMU for a general aviation aircraft on an APV approach. A contribution to research is made in determining whether a low-cost IMU or ADM can provide improved integrity monitoring performance over stand-alone GPS.
It is found that a reduction of approximately 50% in protection levels is possible using the GPS-IMU EKF or GPS-ADM EKF as well as faster detection of a slowly growing ramp fault on a GPS pseudorange measurement. A second contribution is made in determining how augmenting GPS with an ADM compares to using a low-cost IMU. By comparing the results for the GPS-ADM EKF against the GPS-IMU EKF it is found that protection levels for the GPS-ADM EKF were only approximately 2% higher. This indicates that the GPS-ADM EKF may potentially replace the GPS-IMU EKF for integrity monitoring should the IMU ever fail. In this way the ADM may contribute to the navigation system robustness and redundancy. To investigate this further, a third contribution is made in determining whether or not the ADM can function as an IMU replacement to improve navigation system redundancy by investigating the case of three IMU accelerometers failing. It is found that the failed IMU measurements may be supplemented by the ADM and adequate integrity monitoring performance achieved. Besides treating the IMU and ADM separately as in the GPS-IMU EKF and GPS-ADM EKF, a fourth contribution is made in investigating the possibility of fusing the IMU and ADM information together to achieve greater performance than either alone. This is investigated using the GPS-IMU-ADM EKF. It is found that the GPS-IMU-ADM EKF can achieve protection levels approximately 3% lower in the horizontal and 6% lower in the vertical than a GPS-IMU EKF. However this small improvement may not justify the complexity of fusing the IMU with an ADM in practical systems. Affordable ABAS in general aviation may enhance existing GPS-only fault detection solutions or help overcome any outages in augmentation systems such as the Ground-based Regional Augmentation System (GRAS). 
Countries such as Australia which currently do not have an augmentation solution for general aviation could especially benefit from the economic savings and safety benefits of satellite navigation-based APV approaches.
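The solution-separation principle behind the NSS fault detection used above can be illustrated in one dimension: compare the all-measurement estimate against sub-estimates that each exclude one measurement; a faulted measurement pulls the full solution away from its own sub-solution. This toy version uses a plain mean in place of the full EKF bank, and the measurement values and noise level are invented; it is a sketch of the principle, not the GAGIS implementation.

```python
# 1-D solution-separation fault detection sketch with hypothetical numbers.

def solution_separation(measurements, sigma, alarm_sigma=3.0):
    """Return the index of a suspected faulty measurement, or None."""
    n = len(measurements)
    full = sum(measurements) / n
    worst, worst_stat = None, 0.0
    for k in range(n):
        sub = (sum(measurements) - measurements[k]) / (n - 1)
        # Std dev of (full - sub) for independent noise of std sigma
        # is sigma / sqrt(n * (n - 1)).
        sep_sigma = sigma / (n * (n - 1)) ** 0.5
        stat = abs(full - sub) / sep_sigma   # normalized separation
        if stat > worst_stat:
            worst, worst_stat = k, stat
    return worst if worst_stat > alarm_sigma else None

clean = [100.1, 99.8, 100.0, 100.2, 99.9]
faulty = [100.1, 99.8, 100.0, 100.2, 103.0]   # ramp fault has grown on #4
print(solution_separation(clean, sigma=0.2))   # → None
print(solution_separation(faulty, sigma=0.2))  # → 4
```

In the thesis's setting, each "sub-solution" is a full EKF running without one satellite, and protection levels are derived from the statistics of these separations rather than from a fixed sigma multiple.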

Relevance: 10.00%

Abstract:

BACKGROUND: Previous epidemiological investigations of associations between dietary glycemic intake and insulin resistance have used average daily measures of glycemic index (GI) and glycemic load (GL). We explored multiple and novel measures of dietary glycemic intake to determine which was most predictive of an association with insulin resistance.
METHODS: Usual dietary intakes were assessed by diet history interview in women aged 42-81 years participating in the Longitudinal Assessment of Ageing in Women. Daily measures of dietary glycemic intake (n = 329) were carbohydrate, GI, GL, and GL per megacalorie (GL/Mcal), while meal-based measures (n = 200) were breakfast, lunch and dinner GL, and a new measure, GL peak score, to represent meal peaks. Insulin-resistant status was defined as a homeostasis model assessment (HOMA) value of >3.99; HOMA as a continuous variable was also investigated.
RESULTS: GL, GL/Mcal, carbohydrate (all P < 0.01), GL peak score (P = 0.04) and lunch GL (P = 0.04) were positively and independently associated with insulin-resistant status. Daily measures were more predictive than meal-based measures, with minimal difference between GL/Mcal, GL and carbohydrate. No significant associations were observed with HOMA as a continuous variable.
CONCLUSION: A dietary pattern with high peaks of GL above the individual's average intake was a significant independent predictor of insulin resistance in this population; however, the contribution was smaller than that of the daily GL and carbohydrate variables. Accounting for energy intake slightly increased the predictive ability of GL, which is potentially important when examining disease risk in more diverse populations with wider variations in energy requirements.
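The insulin-resistance cut-off above (HOMA > 3.99) refers to the standard homeostasis model assessment index. The function below implements the widely used HOMA-IR approximation (fasting glucose in mmol/L times fasting insulin in µU/mL, divided by 22.5); the sample values are illustrative, not the study's data.

```python
# HOMA-IR approximation and the abstract's insulin-resistance cut-off.

def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """Homeostasis model assessment of insulin resistance (HOMA-IR)."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

def insulin_resistant(glucose_mmol_l, insulin_uU_ml, cutoff=3.99):
    """Classify against the >3.99 threshold used in the study."""
    return homa_ir(glucose_mmol_l, insulin_uU_ml) > cutoff

print(round(homa_ir(5.0, 10.0), 2))   # → 2.22
print(insulin_resistant(6.5, 16.0))   # → True
```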