Abstract:
The focus of governments on increasing active travel has motivated renewed interest in cycling safety. Bicyclists are up to 20 times more likely than drivers to be involved in serious injury crashes, so understanding the relationships among factors in bicyclist crash risk is critically important for identifying effective policy tools, informing bicycle infrastructure investments, and identifying high-risk bicycling contexts. This study aims to better understand the complex relationships between bicyclist self-reported injuries resulting from crashes (e.g. hitting a car) and non-crashes (e.g. spraining an ankle) and perceived risk of cycling as a function of cyclist exposure, rider conspicuity, riding environment, rider risk aversion, and rider ability. Self-reported data from 2,500 Queensland cyclists are used to estimate a series of seemingly unrelated regressions to examine the relationships among factors. The major findings suggest that perceived risk does not appear to influence injury rates, nor do injury rates influence perceived risks of cycling. Riders who perceive cycling as risky tend not to be commuters, do not engage in group riding, tend always to wear mandatory helmets and front lights, and lower their perception of risk by increasing days per week of riding and by increasing the proportion of riding done on bicycle paths. Riders who always wear helmets have lower crash injury risk. Increasing the number of days per week riding tends to decrease both crash injury and non-crash injury risk (e.g. a sprain). Further work is needed to replicate some of the findings in this study.
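The abstract names seemingly unrelated regressions (SUR) as its estimation method. The sketch below shows the core of that technique on synthetic data: equation-by-equation OLS residuals are used to estimate the cross-equation error covariance, followed by GLS on the stacked system. All variable names, coefficients, and data here are illustrative stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the study's factors (exposure, conspicuity, etc.);
# the coefficients below are invented for illustration only.
x1 = rng.normal(size=(n, 2))  # regressors, equation 1 (e.g. crash injuries)
x2 = rng.normal(size=(n, 2))  # regressors, equation 2 (e.g. perceived risk)

# Errors correlated across equations -- the situation SUR is designed for.
errs = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
y1 = 1.0 + x1 @ np.array([0.5, -0.3]) + errs[:, 0]
y2 = -0.5 + x2 @ np.array([0.2, 0.8]) + errs[:, 1]

def sur_fgls(ys, xs):
    """Two-step feasible GLS for a seemingly-unrelated-regressions system."""
    xs = [np.column_stack([np.ones(len(x)), x]) for x in xs]
    # Step 1: equation-by-equation OLS to estimate the error covariance.
    betas = [np.linalg.lstsq(x, y, rcond=None)[0] for x, y in zip(xs, ys)]
    resid = np.column_stack([y - x @ b for y, x, b in zip(ys, xs, betas)])
    sigma = resid.T @ resid / len(resid)  # cross-equation covariance
    # Step 2: GLS on the stacked, block-diagonal system.
    nobs = len(ys[0])
    X = np.zeros((len(xs) * nobs, sum(x.shape[1] for x in xs)))
    col = 0
    for i, x in enumerate(xs):
        X[i * nobs:(i + 1) * nobs, col:col + x.shape[1]] = x
        col += x.shape[1]
    y = np.concatenate(ys)
    omega_inv = np.kron(np.linalg.inv(sigma), np.eye(nobs))
    return np.linalg.solve(X.T @ omega_inv @ X, X.T @ omega_inv @ y)

beta = sur_fgls([y1, y2], [x1, x2])  # [a1, b11, b12, a2, b21, b22]
```

The efficiency gain over per-equation OLS comes entirely from the off-diagonal terms of `sigma`; with uncorrelated errors the two estimators coincide.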
Abstract:
Introduction: Feeding on demand supports an infant’s innate capacity to respond to hunger and satiety cues and may promote later self-regulation of intake. Our aim was to examine whether feeding style (on demand vs to schedule) is associated with weight gain in early life. Methods: Participants were first-time mothers of healthy term infants enrolled in NOURISH, an RCT evaluating an intervention to promote positive early feeding practices. Baseline assessment occurred when infants were aged 2-7 months. Infants who could be categorised clearly as fed on demand or to schedule (by mothers’ self-report) were included in the logistic regression analysis. The model was adjusted for gender, breastfeeding, and maternal age, education and BMI. Weight gain was defined as a positive difference between baseline and birthweight z-scores (WHO standards), which indicated tracking above weight percentile. Results: Data from 356 infants with a mean age of 4.4 (SD 1.0) months were available. Of these, 197 (55%) were fed on demand and 42 (12%) were fed to schedule. There was no statistical association between feeding style and weight gain [OR=0.72 (95%CI 0.35-1.46), P=0.36]. Formula-fed infants were three times more likely to be fed to schedule, and formula feeding was independently associated with increased weight gain [OR=2.02 (95%CI 1.11-3.66), P=0.021]. Conclusion: In this preliminary analysis the association between feeding style and weight gain did not reach statistical significance; however, the effect size may be clinically relevant, and future analysis will include the full study sample (N=698).
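The abstract reports effect sizes as odds ratios with Wald-style 95% confidence intervals. As a minimal illustration of that arithmetic, the function below computes an unadjusted odds ratio and its CI from a 2x2 table; the counts used are hypothetical and are not the NOURISH data (the study's estimates come from an adjusted logistic regression, which this does not reproduce).

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (formula-fed vs not, gained vs not):
or_, lo, hi = odds_ratio_ci(30, 40, 20, 60)  # OR = (30*60)/(40*20) = 2.25
```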
Abstract:
A hospital consists of a number of wards, units and departments that provide a variety of medical services and interact on a day-to-day basis. Nearly every department within a hospital schedules patients for the operating theatre (OT) and most wards receive patients from the OT following post-operative recovery. Because of the interrelationships between units, disruptions and cancellations within the OT can have a flow-on effect to the rest of the hospital. This often results in dissatisfied patients, nurses and doctors, escalating waiting lists, inefficient resource usage and undesirable waiting times. The objective of this study is to use Operational Research methodologies to enhance the performance of the operating theatre by improving elective patient planning using robust scheduling and improving the overall responsiveness to emergency patients by solving the disruption management and rescheduling problem. OT scheduling considers two types of patients: elective and emergency. Elective patients are selected from a waiting list and scheduled in advance based on resource availability and a set of objectives. This type of scheduling is referred to as ‘offline scheduling’. Disruptions to this schedule can occur for various reasons including variations in length of treatment, equipment restrictions or breakdown, unforeseen delays and the arrival of emergency patients, which may compete for resources. Emergency patients consist of acute patients requiring surgical intervention or in-patients whose conditions have deteriorated. These may or may not be urgent and are triaged accordingly. Most hospitals reserve theatres for emergency cases, but when these or other resources are unavailable, disruptions to the elective schedule result, such as delays in surgery start time, elective surgery cancellations or transfers to another institution. Scheduling of emergency patients and the handling of schedule disruptions is an ‘online’ process typically handled by OT staff. 
This means that decisions are made ‘on the spot’ in a ‘real-time’ environment. There are three key stages to this study: (1) Analyse the performance of the operating theatre department using simulation. Simulation is used as a decision support tool and involves changing system parameters and elective scheduling policies and observing the effect on the system’s performance measures; (2) Improve the viability of elective schedules by making offline schedules more robust to differences between expected and actual treatment times, using robust scheduling techniques. This will improve access to care and responsiveness to emergency patients; (3) Address the disruption management and rescheduling problem (which incorporates emergency arrivals) using innovative robust reactive scheduling techniques. The robust schedule will form the baseline schedule for the online robust reactive scheduling model.
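The study's actual robust scheduling and simulation models are not specified in the abstract, but the core idea of stage (2) can be sketched in a few lines: plan each case's start time with a slack buffer proportional to its duration variability, then simulate realised durations to measure how much lateness the buffer absorbs. Case names, durations, and the slack rule below are all illustrative assumptions.

```python
import random

def robust_schedule(cases, slack_factor=0.5):
    """Plan start times as cumulative mean duration plus a slack buffer.
    cases: list of (name, mean_minutes, std_minutes); values illustrative."""
    start, plan = 0.0, []
    for name, mu, sd in cases:
        plan.append((name, start))
        start += mu + slack_factor * sd  # buffer absorbs duration variability
    return plan

def simulated_lateness(cases, plan, rng):
    """One realisation of stochastic durations; total minutes of late starts."""
    t, late = 0.0, 0.0
    for (name, mu, sd), (_, planned) in zip(cases, plan):
        t = max(t, planned)               # a case cannot start before its slot
        late += t - planned               # accumulate start-time delay
        t += max(5.0, rng.gauss(mu, sd))  # realised (stochastic) duration
    return late

cases = [("hip", 90, 30), ("knee", 60, 15), ("appendix", 45, 20)]
plan = robust_schedule(cases)  # planned starts: 0.0, 105.0, 172.5
lateness = simulated_lateness(cases, plan, random.Random(1))
```

Raising `slack_factor` trades theatre idle time for fewer delayed starts; a simulation loop over many seeds is how such a trade-off would be quantified.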
Abstract:
Swelling social need and competing calls on government funds have heightened the philanthropic dollar’s value. Yet Australia is not regarded as having a robust giving culture: while 86% of adults give, a mere 16% plan their giving, with those who do donating four times as much as spontaneous givers (Giving Australia, 2005). Traditionally, the prime planned giving example is a charitable bequest, a revenue stream not prevalent here (Baker, 2007). In fact, Baker’s Victorian probate data show under 5% of estates provide a charitable bequest and just over 1% of estate assets is bequeathed. The UK, in contrast, sources 30% and the US 10% of charitable income through bequests (NCVO, 2004; Sargeant, Wymer and Hilton, 2006). Australian charities could boost bequest giving. Understanding the donor market, those who have remembered or may remember a charity in their will, is critical. This paper reports donor perceptions of Australian charities’ bequest communication/marketing. The data form part of a wider study of Australian donors’ bequest attitudes and behaviour. Charities spend heavily on bequest promotion, from advertising to personal selling to public relations and promotion. Infrastructure funds are scarce, so guidance on what works for donors is important. Guy and Patton (1988) made their classic call for a nonprofit marketing perspective and identified the need for charities to better understand the motivations and behaviour of their supporters. In a similar vein, this study aims to improve the way nonprofits and givers interact and, ultimately, to enhance the giving experience and thus multiply planned giving participation. Academically, it offers insights into Australian bequest motivations and attitudes not studied empirically before.
Abstract:
This thesis addresses one of the fundamental issues that remains unresolved in patent law today. It is a question that strikes at the heart of what a patent is and what it is supposed to protect. That question is whether an invention must produce a physical effect or cause a physical transformation of matter to be patentable, or whether it is sufficient that an invention involves a specific practical application of an idea or principle to achieve a useful result. In short, the question is whether patent law contains a physicality requirement. Resolving this issue will determine whether only traditional mechanical, industrial and manufacturing processes are patent eligible, or whether patent eligibility extends to include purely intangible, or non-physical, products and processes. To this end, this thesis seeks to identify where the dividing line lies between patentable subject matter and the recognised categories of excluded matter, namely, fundamental principles of nature, physical phenomena, and abstract ideas. It involves determining which technological advances are worth the inconvenience monopoly protection causes the public at large, and which should remain free for all to use without restriction. This is an issue that has important ramifications for innovation in the ‘knowledge economy’ of the Information Age. Determining whether patent law contains a physicality requirement is integral to deciding whether much of the valuable innovation we are likely to witness, in what are likely to be the emerging areas of technology in the near future, will receive the same encouragement as industrial and manufacturing advances of previous times.
Abstract:
We live in uncertain times. The sub-prime crisis that commenced in the U.S. in 2007, the global economic crisis that followed, and the recent sovereign debt crisis in various European countries have led to ongoing instability in global financial markets that continues to receive daily media attention. These uncertain times create enormous opportunities for researchers across many disciplines to research capital markets and business practices. From an accounting perspective, accounting regulators have been active in developing new standards to address risk management issues arising from the crises and have continued to develop and refine financial reporting standards. With the adoption of, or transition to international financial reporting standards (IFRS) in many countries, the globalisation of financial reporting standards is close to becoming a reality. However, doubts still remain about whether the IFRS will lead to any real long-term improvement in financial reporting and transparency (see Sunder, 2011)...
Abstract:
Aims: A multi-method study comprising two parts. Study One: three sets of observations in two regional areas of Queensland. Study Two: two sets of parent intercept interviews conducted in Toowoomba, Queensland. The aim of Study Two is to determine parents’ views, opinions and knowledge of child restraint practices and the Queensland legislative amendment.
Abstract:
Purpose: The Cobb technique is the universally accepted method for measuring the severity of spinal deformities. Traditionally, Cobb angles have been measured using protractor and pencil on hardcopy radiographic films. The new generation of mobile phones make accurate angle measurement possible using an integrated accelerometer, providing a potentially useful clinical tool for assessing Cobb angles. The purpose of this study was to compare Cobb angle measurements performed using an Apple iPhone and traditional protractor in a series of twenty Adolescent Idiopathic Scoliosis patients. Methods: Seven observers measured major Cobb angles on twenty pre-operative postero-anterior radiographs of Adolescent Idiopathic Scoliosis patients with both a standard protractor and using an Apple iPhone. Five of the observers repeated the measurements at least a week after the original measurements. Results: The mean absolute difference between pairs of iPhone/protractor measurements was 2.1°, with a small (1°) bias toward lower Cobb angles with the iPhone. 95% confidence intervals for intra-observer variability were ±3.3° for the protractor and ±3.9° for the iPhone. 95% confidence intervals for inter-observer variability were ±8.3° for the iPhone and ±7.1° for the protractor. Both of these confidence intervals were within the range of previously published Cobb measurement studies. Conclusions: We conclude that the iPhone is an equivalent Cobb measurement tool to the manual protractor, and measurement times are about 15% less. The widespread availability of inclinometer-equipped mobile phones and the ability to store measurements in later versions of the angle measurement software may make these new technologies attractive for clinical measurement applications.
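The abstract above describes measuring Cobb angles with a phone's accelerometer. The underlying arithmetic is simple: the accelerometer senses the gravity vector, the device's inclination follows from `atan2`, and the Cobb angle is the angle between the two endplate inclinations. This is a minimal sketch of that geometry, not the code of the actual iPhone application, and the sensor readings below are invented.

```python
import math

def tilt_deg(ax, ay):
    """Inclination of the device, in degrees, from a two-axis accelerometer
    reading of gravity; (ax, ay) = (0, 1) means the device is upright."""
    return math.degrees(math.atan2(ax, ay))

def cobb_angle(upper_tilt_deg, lower_tilt_deg):
    """Cobb angle: the angle between the superior endplate of the upper end
    vertebra and the inferior endplate of the lower end vertebra."""
    return abs(upper_tilt_deg - lower_tilt_deg)

# Device edge aligned with each endplate in turn (illustrative readings):
upper = tilt_deg(0.259, 0.966)    # roughly +15 degrees
lower = tilt_deg(-0.174, 0.985)   # roughly -10 degrees
angle = cobb_angle(upper, lower)  # roughly 25 degrees
```

Using `atan2` rather than `atan(ax / ay)` keeps the sign and quadrant correct when the device is tilted past horizontal.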
Abstract:
Zeolite N, an EDI-type framework structure with ideal chemical formula K₁₂Al₁₀Si₁₀O₄₀Cl₂·5H₂O, was produced from kaolin between 100°C and 200°C in a continuously stirred reactor using potassic and potassic+sodic liquors containing a range of anions. Reactions using liquors such as KOH, KOH + KX (where X = F, Cl, Br, I, NO₃, NO₂), K₂X (where X = CO₃), KOH + NaCl or NaOH + KCl were complete (>95% product) in less than two hours depending on the batch composition and temperature of reaction. With KOH and KCl in the reaction mixture and H₂O/Al₂O₃ ~ 49, zeolite N was formed over a range of concentrations (1M < [KOH] < 18M) and reaction times (0.5h < t < 60h). At higher temperatures or higher KOH molarity, other potassic phases such as kalsilite or kaliophilite formed. In general, temperature and KOH molarity defined the extent of zeolite N formation under these conditions. The introduction of sodic reagents to the starting mixture, or the use of only one potassic reagent in the starting mixture, reduced the stability field for zeolite N formation. Zeolite N was also formed using zeolite 4A as a source of Al and Si, albeit with longer reaction times at a given temperature compared with kaolin as the source material.
Abstract:
A full architectural education typically involves five years of formal education and two years of practice experience under the supervision of a registered architect. In many architecture courses some of this period of internship can be taken either as a ‘year out’ between years of study, or during enrolment as credited study: work place learning or work integrated learning. This period of learning can be characterised as an internship in which the student, as an adult learner, is supervised by their employer. This is a highly authentic learning environment, but one in which the learner is both student and employee, and the architect is both teacher and employer; roles that at times conflict. While the educational advantages of such authentic practice experience are well recognised, there are also concerns about the quality and variability of such experiences. This paper reviews the current state of practice with respect to architectural internships, and analyses such practice using Laurillard’s ‘conversational framework’ (2002). The framework highlights the interactions and affordances between teacher and student in the form of concepts, adaptations, reflections, actions and feedback. A review of common practice in architectural work place learning, internships in other fields of education, and focused research at the author’s own university are discussed, then analysed for ‘affordances’ of learning. Such analysis shows both the potential of work place learning to offer a unique environment for learning, and the need to organise and construct such experiences in ways that facilitate learning.
Abstract:
The project is a book collection of 65 poems, primarily with an environmental focus. This practice-led project draws on eco-critical theory (Wilson, 1992; Bate, 2000) and Darwinian literary theory (Carroll, 2004) to explore ideas of ecology, the ‘natural’, and conservation. The poems explore a proposal of synthesis: that nature is for us both a construction of language/culture (as argued by post-structuralism/cultural studies) and also a pragmatic, empirical entity that can be experienced through the senses as well as through culture. For example, individual poems describe genres of ‘forest’ (‘Literary Forests’, ‘The Conservative Forest’, ‘The Imperial Forest’) which demonstrate how ‘nature’ can be culturally constructed, yet also remain an empirical entity with which we experience a more immediate, physical connection, as posited by Bate (following Heidegger’s ‘being-in-the-world’). The work also explores through satire the concept of evolutionary adaptation, for example the integration of machine into forest (‘The Black Forest’), animals adopting human characteristics (‘In Praise of Bears’), and ‘nature’ as a damaged or absent ‘other’. Without an Alibi makes various strands of theoretical thinking concrete and manifest by ‘showing not telling’, in creative practice. The work has been very positively reviewed in the prestigious Australian Book Review, and by Professor Peter Pierce in the Canberra Times. Several of the poems have since been reproduced in national anthologies including The Penguin Anthology of Australian Poetry (2000) and Australian Poetry Since 1988 (Uni of NSW Press).
Abstract:
Drawing on the example of a recent study (Wang, 2010), this paper discusses the application of a sociocultural approach to information literacy research and curriculum design. First, it describes the foundation of this research approach in sociocultural theories, in particular Vygotsky's sociocultural theory. Then it presents key theoretical principles arising from the research and describes how the sociocultural approach enabled the establishment of collaborative partnerships between information professionals and academic and teaching support staff in a community of practice for information literacy integration.
Abstract:
This paper reviews the current state of the application of infrared methods, particularly mid-infrared (mid-IR) and near infrared (NIR), for the evaluation of the structural and functional integrity of articular cartilage. It is noted that while a considerable amount of research has been conducted on tissue characterization using mid-IR, it is almost certain that full-thickness cartilage assessment is not feasible with this method. In contrast, the considerably greater penetration capacity of NIR suggests that it is a suitable candidate for full-thickness cartilage evaluation. Nevertheless, significant research is still required to improve the specificity and clinical applicability of the method if it is to be used to distinguish between functional and dysfunctional cartilage.
Abstract:
Childhood sun exposure has been associated with increased risk of developing melanoma later in life. Sunscreen, children’s preferred method of sun protection, has been shown to reduce skin cancer risk. However, the effectiveness of sunscreen is largely dependent on user compliance, such as the thickness of application. To achieve the stated sun protection factor (SPF), sunscreen must be applied at a thickness of 2mg/cm2. It has been demonstrated that adults tend to apply less than half of the recommended 2mg/cm2. This was the first study to measure the thickness at which children apply sunscreen. We recruited 87 primary school aged children (n=87, median age 8.7, range 5-12 years) from seven state schools within one Brisbane education district (32% consent rate). The children were supplied with sunscreen in three dispenser types (pump, squeeze and roll-on) and were asked to use each for one week. We measured the weight of the sunscreen before and after use, and calculated the children’s body surface area (based on height and weight) and the area to which sunscreen was applied (based on children’s self-reported body coverage of application). Combined, these measurements yielded an average thickness of sunscreen application, which was our main outcome measure. We asked parents to complete a self-administered questionnaire which captured information about potential explanatory variables. Children applied sunscreen at a median thickness of 0.48mg/cm2, significantly less than the recommended 2mg/cm2 (p<0.001). When using the roll-on dispenser (median 0.22mg/cm2), children applied significantly less sunscreen compared to the pump (median 0.75mg/cm2, p<0.001) and squeeze (median 0.57mg/cm2, p<0.001) dispensers. School grade (1-7) was significantly associated with thickness of application (p=0.032), with children in the youngest grades applying the most.
Other variables significantly associated with the outcome variable included number of siblings (p=0.001), household annual income (p<0.001), and the number of lifetime sunburns the child had experienced (p=0.007). This work is the first to measure children’s sunscreen application thickness and demonstrates that, regardless of their age or the type of dispenser they use, children do not apply enough sunscreen to achieve the advertised SPF. It is envisaged that this study will assist in the formulation of recommendations for future research, practice and policy aimed at improving childhood sun protection to reduce skin cancer incidence in the future.
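The outcome measure above is simple arithmetic: sunscreen mass used divided by the skin area it covered. The sketch below reproduces that calculation on invented numbers; the body surface area formula (Mosteller) is an assumption, since the abstract does not state which height/weight formula the study used.

```python
def application_thickness(mass_used_mg, covered_area_cm2):
    """Average application thickness in mg/cm2 (the study's outcome measure)."""
    return mass_used_mg / covered_area_cm2

def bsa_cm2(height_cm, weight_kg):
    """Body surface area via the Mosteller formula (an assumption; the study's
    exact formula is not stated), converted from m2 to cm2."""
    return ((height_cm * weight_kg / 3600.0) ** 0.5) * 10_000

# Invented example: a child covering 40% of their body surface
# with 1,900 mg of sunscreen over one application.
area = 0.40 * bsa_cm2(130, 28)                   # about 4,000 cm2
thickness = application_thickness(1900, area)    # about 0.47 mg/cm2
```

At roughly 0.47 mg/cm2, this hypothetical child sits near the study's median of 0.48 mg/cm2, well under the 2 mg/cm2 needed to achieve the labelled SPF.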
Abstract:
Acting in the best interests of students is central to the moral and ethical work of schools. Yet tensions can arise between principals and school counsellors as they work from at times opposing professional paradigms. In this article we report on principals’ and counsellors’ responses to scenarios covering confidentiality and the law, student/teacher relationships, student welfare and psychological testing of students. This discussion takes place against an examination of ethics, ethical dilemmas and professional codes of ethics. While there were a number of commonalities among principals and school counsellors that arose from their common belief in education as a moral venture, there were also some key differences among them. These differences centred on the principals’ focus on the school as a whole and counsellors’ focus on the welfare of the individual student. A series of recommendations is offered to assist principals to navigate ethical dilemmas such as those considered in this article.