123 results for vertically vibrated beds
Abstract:
Aim: To describe the positioning of patients managed in an intensive care unit (ICU); to assess how frequently these patients were repositioned; and to determine whether any specific factors influenced how, why or when patients were repositioned in the ICU. Background: Alterations in the body position of ICU patients are important for patient comfort and are believed to prevent and/or treat pressure ulcers, improve respiratory function and combat the adverse effects of immobility. There is a paucity of research on the positioning of critically ill patients in Saudi Arabian ICUs. Design and Methods: A prospective observational study was undertaken. Participant demographic data were collected, as were clinical factors (i.e. ventilation status, primary diagnosis, co-morbidities and Ramsay sedation score) and organizational factors (i.e. time of day, type of mattress or bed used, nurse/patient ratio and the patient's position). Clinical and some organizational data were recorded over a continuous 48-hour period. Results: Twenty-eight participants were recruited to the study. No participant was managed in either a flat or a prone position. Obese participants were most likely to be managed in a supine position. The mean time between turns was two hours. There was no significant association between the mean time between turns and the recorded variables relating to patient demographic and organizational considerations. Conclusion: The results indicate that patient positioning in the ICU was a direct result of unit policy: patients appeared to be repositioned not on the basis of an evaluation of their clinical condition but according to a two-hour ICU timetable.
Abstract:
This report discusses the geologic framework and petroleum geology used to assess undiscovered petroleum resources in the Bohaiwan basin province for the 2000 World Energy Assessment Project of the U.S. Geological Survey. The Bohaiwan basin in northeastern China is the largest petroleum-producing region in China. Two total petroleum systems have been identified in the basin. The first, the Shahejie–Shahejie/Guantao/Wumishan Total Petroleum System, involves oil and gas generated from mature pods of lacustrine source rock that are associated with six major rift-controlled subbasins. Two assessment units are defined in this total petroleum system: (1) a Tertiary lacustrine assessment unit consisting of sandstone reservoirs interbedded with lacustrine shale source rocks, and (2) a pre-Tertiary buried hills assessment unit consisting of carbonate reservoirs that are overlain unconformably by Tertiary lacustrine shale source rocks. The second total petroleum system identified in the Bohaiwan basin is the Carboniferous/Permian Coal–Paleozoic Total Petroleum System, a hypothetical total petroleum system involving natural gas generated from multiple pods of thermally mature coal beds. Low-permeability Permian sandstones and possibly Carboniferous coal beds are the reservoir rocks. Most of the natural gas is inferred to be trapped in continuous accumulations near the center of the subbasins. This total petroleum system is largely unexplored and has good potential for undiscovered gas accumulations. One assessment unit, coal-sourced gas, is defined in this total petroleum system.
Abstract:
An oriented graphitic nanostructured carbon film has been employed as a conductometric hydrogen gas sensor. The carbon film was energetically deposited using a filtered cathodic vacuum arc with a -75 V bias applied to a stainless steel grid placed 1 cm from the surface of the Si substrate. The substrate was heated to 400°C prior to deposition. Electron microscopy showed evidence that the film consisted largely of vertically oriented graphitic sheets and had a density of 2.06 g/cm³, with 76% of the atoms bonded in sp² (graphitic) configurations. A change in device resistance of >1.5% was exhibited upon exposure to 1% hydrogen gas (in synthetic, zero-humidity air) at 100°C. The time for the sensor resistance to increase by 1.5% under these conditions was approximately 60 s, and the baseline (zero hydrogen exposure) resistance remained constant to within 0.01% during and after the hydrogen exposures.
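To illustrate the sensing figures of merit quoted above (a >1.5% relative resistance change and a roughly 60 s response time), the minimal Python sketch below shows one way such quantities could be extracted from a resistance-versus-time trace. The function name, threshold parameter and synthetic data are assumptions for illustration only, not part of the reported work.

```python
import numpy as np

def sensor_response(t, resistance, t_exposure, threshold=0.015):
    """Estimate the relative resistance change and response time of a
    conductometric sensor trace (illustrative helper, not the authors' code).

    t          -- time stamps in seconds
    resistance -- measured resistance at each time stamp (ohms)
    t_exposure -- time at which the hydrogen exposure starts
    threshold  -- fractional change defining the response time (1.5% here)
    """
    baseline = np.mean(resistance[t < t_exposure])    # pre-exposure baseline R0
    rel_change = (resistance - baseline) / baseline   # delta-R / R0 over the trace
    crossed = (t >= t_exposure) & (rel_change >= threshold)
    response_time = t[crossed][0] - t_exposure if crossed.any() else None
    return rel_change.max(), response_time

# Synthetic example: exposure begins at 100 s and the resistance rises ~2%
# with a ~60 s time constant, loosely mimicking the behaviour described.
t = np.arange(0.0, 400.0, 1.0)
r0 = 1.0e4
r = r0 * (1 + 0.02 * (1 - np.exp(-np.clip(t - 100.0, 0.0, None) / 60.0)))
print(sensor_response(t, r, t_exposure=100.0))
```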
Abstract:
The function of environmental governance and the principle of the rule of law are both controversial and challenging. To apply the principle of the rule of law to the function of environmental governance is perhaps even more controversial and challenging. A system of environmental governance seeks to bring together the range of competitive and potentially conflicting interests in how the environment and its resources are managed. Increasingly it is the need for economic, social and ecological sustainability that brings these interests – both public and private – together. Then there is the relevance of the principle of the rule of law. Economic, social and ecological sustainability will be achieved – if at all – by a complex series of rules of law that are capable of enforcement so as to ensure compliance with them. To what extent do these rules of law reflect the principle of the rule of law? Is the principle of the rule of law the formally unstated value that is expected to underpin the legal system or is it the normative predicate that directs the legal system both vertically and horizontally? Is sustainability an aspirational value or a normative predicate according to which the environment and its resources are managed? Let us deal sequentially with these issues by reviewing a number of examples that demonstrate the relationship between environmental governance and the rule of law.
Abstract:
Policy makers increasingly recognise that an educated workforce with a high proportion of Science, Technology, Engineering and Mathematics (STEM) graduates is a prerequisite for a knowledge-based, innovative economy. Over the past ten years, the proportion of first university degrees awarded in Australia in STEM fields has been below the global average and has continued to decrease, from 22.2% in 2002 to 18.8% in 2010 [1]. These trends are mirrored by declines of between 20% and 30% in the proportions of high school students enrolled in science or maths. These trends are not unique to Australia, but their impact is of concern throughout the policy-making community. To redress these demographic trends, QUT embarked upon a long-term investment strategy to integrate education and research into the physical and virtual infrastructure of the campus, recognising that the expectations of students change as rapidly as technology and learning practices change. To implement this strategy, physical infrastructure refurbishment/re-building is accompanied by upgraded technologies not only for learning but also for research. QUT’s vision for its city-based campuses is to create vibrant and attractive places to learn and research and to link strongly to the wider surrounding community. Over a five-year period, physical infrastructure at the Gardens Point campus was substantially reconfigured in two key stages: (a) a >$50m refurbishment of heritage-listed buildings to encompass public, retail and social spaces, learning and teaching “test beds” and research laboratories; and (b) demolition of five buildings to be replaced by a $230m, >40,000 m² Science and Engineering Centre designed to accommodate retail, recreation, services, education and research in an integrated, coordinated precinct. This landmark project is characterised by (i) self-evident, collaborative spaces for learning, research and social engagement; (ii) sustainable building practices and sustainable ongoing operation; and (iii) dynamic and mobile re-configuration of spaces or staffing to meet demand. Innovative spaces allow for transformative, cohort-driven learning and the collaborative use of space to pursue joint class projects. Research laboratories are aggregated, centralised and “on display” to the public, students and staff. A major visualisation space – the largest multi-touch, multi-user facility constructed to date – is a centrepiece feature that focuses on demonstrating scientific and engineering principles or science-oriented scenes at large scale (e.g. the Great Barrier Reef). Content on this visualisation facility is integrated with the regional school curricula and supports an in-house schools program for student and teacher engagement. Researchers are accommodated in a combination of open-plan and office floor-space (80% open plan) to encourage interdisciplinary engagement and cross-fertilisation of skills, ideas and projects. This combination of spaces re-invigorates the on-campus experience, extends educational engagement across all ages and rapidly enhances research collaboration.
Abstract:
Myopia (short-sightedness) is a common ocular disorder of children and young adults. Studies, primarily using animal models, have shown that the retina controls eye growth and that the outer retina is likely to have a key role. One theory is that the proportion of L (long-wavelength-sensitive) and M (medium-wavelength-sensitive) cones is related to myopia development, with a high L/M cone ratio predisposing individuals to myopia. However, not all dichromats (persons with red-green colour vision deficiency) with extreme L/M cone ratios have high refractive errors. We predict that the L/M cone ratio will vary in individuals with normal trichromatic colour vision but not show a systematic difference simply due to refractive error. The aim of this study was to determine whether L/M cone ratios in the central 30° differ between myopic and emmetropic young, colour-normal adults. Information about L/M cone ratios was obtained using the multifocal visual evoked potential (mfVEP). The mfVEP can be used to measure the response of visual cortex to different visual stimuli. The visual stimuli were generated and measurements performed using the Visual Evoked Response Imaging System (VERIS 5.1). The mfVEP was measured when the L and M cone systems were separately stimulated using the method of silent substitution. Silent substitution alters the output of three primary lights, each with a physically different spectral distribution, to control the excitation of one or more photoreceptor classes without changing the excitation of the unmodulated photoreceptor classes. The stimulus was a dartboard array subtending 30° horizontally and 30° vertically on a calibrated LCD screen, with an m-sequence length of 2^15 − 1. The N1-P1 amplitude ratio of the mfVEP was used to estimate the L/M cone ratio. Data were collected for 30 young adults (22 to 33 years of age), consisting of 10 emmetropes (+0.3±0.4 D) and 20 myopes (–3.4±1.7 D). The stimulus and analysis techniques were confirmed using responses of two dichromats. For the entire participant group, the estimated central L/M cone ratios ranged from 0.56 to 1.80 in the central 3°-13° diameter ring and from 0.94 to 1.91 in the more peripheral 13°-30° diameter ring. Within 3°-13°, the mean L/M cone ratio of the emmetropic group was 1.20±0.33 and the mean was similar, 1.20±0.26, for the myopic group. For the 13°-30° ring, the mean L/M cone ratio of the emmetropic group was 1.48±0.27 and it was slightly lower in the myopic group, 1.30±0.27. Independent-samples t-tests indicated no significant difference between the L/M cone ratios of the emmetropic and myopic groups for either the central 3°-13° ring (p=0.986) or the more peripheral 13°-30° ring (p=0.108). The similar distributions of estimated L/M cone ratios in the samples of emmetropes and myopes indicate that there is likely to be no association between the L/M cone ratio and refractive error in humans.
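The silent-substitution stimulus described above amounts to solving a small linear system: choose modulations of the display primaries whose combined effect excites one cone class while leaving the others unchanged. The Python sketch below illustrates the idea; the primary-to-cone excitation matrix and the target contrast are placeholder values, not those of the calibrated VERIS 5.1 setup.

```python
import numpy as np

# Rows: L, M, S cone excitations produced by unit modulation of the three
# display primaries (R, G, B). These numbers are placeholders; in practice
# they are derived from the primaries' measured spectra and cone fundamentals.
cone_from_primary = np.array([
    [0.60, 0.35, 0.05],   # L cone
    [0.30, 0.55, 0.15],   # M cone
    [0.02, 0.10, 0.88],   # S cone
])

# Target cone contrast: modulate L cones only, keep M and S "silent".
target = np.array([0.10, 0.0, 0.0])

# Solve for the primary modulations that achieve the target excitations.
primary_modulation = np.linalg.solve(cone_from_primary, target)
print("R,G,B modulation:", primary_modulation)
print("cone check:", cone_from_primary @ primary_modulation)  # ~ [0.10, 0, 0]
```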
Abstract:
This paper presents the findings of an analysis of the activities of rural nurses from a national audit of the role and function of the rural nurse (Hegney, Pearson and McCarthy 1997). The results suggest that the size of the health service (defined by the number of acute beds) influences the activities of rural nurses. Further, the study reports on differences in the context of practice between rural health services of different sizes and the impact this has on the scope of rural nursing practice. The paper concludes that the size of the health service is an outcome of rurality (small population densities, distance from larger health facilities, and lack of on-site medical and allied health staff). It also notes that the size of the health service is a major contextual determinant of patient acuity and staff skill-mix in small rural hospitals, and therefore of the scope of rural nursing practice.
Abstract:
Executive Summary: Emergency health is a critical component of Australia’s health system, and emergency departments (EDs) are increasingly congested from growing demand and blocked access to inpatient beds. The Emergency Health Services Queensland (EHSQ) study aims to identify the factors driving increased demand for emergency health and to evaluate strategies which may safely reduce future demand growth. This monograph addresses the perspectives of users of both ambulance services and EDs. The research reported here aimed to identify the perspectives of users of emergency health services, both ambulance services and public hospital EDs, and to identify the factors that they took into consideration when exercising their choice of location for acute health care. A cross-sectional survey design was used, involving a survey of patients or their carers presenting to the EDs of a stratified sample of eight hospitals. A specific-purpose questionnaire was developed based on a novel theoretical model derived from analysis of the literature (Monograph 1). Two survey versions were developed: one for adult patients (self-complete) and one for children (to be completed by parents/guardians). The questionnaires measured perceptions of social support, health status, illness severity and self-efficacy; beliefs and attitudes towards ED and ambulance services; reasons for using these services; and actions taken prior to the service request. The survey was conducted at a stratified sample of eight hospitals representing major cities (four), inner regional (two) and outer regional and remote (two) areas. Due to practical limitations, data were collected for ambulance and ED users within hospital EDs while patients were waiting for or under treatment. A sample size quota was determined for each ED based on its 2009/10 presentation volumes. Data collection was conducted by four members of the research team and a group of eight interviewers between March and May 2011 (autumn). Of the 1,608 patients in the eight emergency departments, interviewers were able to approach 1,361 (85%) and seek their consent to participate in the study. In total, 911 valid surveys were available for analysis (response rate = 67%). These studies demonstrate that patients elected to attend hospital EDs in a considered fashion after weighing up alternatives, and there is no evidence of deliberate or ill-informed misuse.
• Patients attending EDs have high levels of social support and self-efficacy, which speak to the considered and purposeful nature of the exercise of choice.
• About one third of patients had new conditions, while two thirds had chronic illnesses.
• More than half the attendees (53.1%) had consulted a healthcare professional prior to making the decision.
• The decision to seek urgent care at an ED was mostly constructed around the patient’s perception of the urgency and severity of their illness, reinforced by a strong perception that the hospital ED was the correct location for them (better specialised staff, better care for my condition, other options not as suitable).
• 33% of respondents held private hospital insurance but nevertheless attended a public hospital ED.
Similarly, patients exercised considered and rational judgements in their choice to seek help from the ambulance service.
• The decision to call for ambulance assistance was based on a strong perception about the severity of the illness (too severe to use other means of transport) and that other options were not considered appropriate.
• The decision also appeared influenced by a perception that the ambulance provided appropriate access to the ED which was considered most appropriate for their particular condition (too severe to go elsewhere, all facilities in one spot, better specialised and better care).
• In 43.8% of cases a healthcare professional advised use of the ambulance.
• Only a small number of people perceived that the ambulance should be freely available regardless of severity or appropriateness.
These findings confirm a growing understanding that the choice of professional emergency health care services is not made lightly, but rather is made by reasonable people exercising a judgement which is influenced by public awareness of the risks of acute illness and which is most often informed by health professionals. It is also made on the basis of a rational weighing up of alternatives and a deliberate and considered choice to seek assistance from a service which the patient perceived was most appropriate to their needs at that time. These findings add weight to dispensing with public perceptions that ED and ambulance congestion is a result of inappropriate choice by patients. The challenge for health services is to better understand the patient’s needs and to design and validate services that meet those needs. The failure of our health system to do so should not be grounds for blaming the patient by claiming inappropriate patient choices.
Abstract:
Background: Diabetes foot complications are a leading cause of avoidable hospital admissions. Since 2006, the Queensland Diabetes Clinical Network has implemented programs aimed at reducing diabetes-related hospitalisation. The aim of this retrospective observational study was to determine the incidence of diabetes foot-related hospital admissions in Queensland from 2005 to 2010. Methods: Data on all primary diabetes foot-related admissions in Queensland from 2005 to 2010 were obtained using diabetes foot-related ICD-10-AM (hospital discharge) codes. Queensland diabetes foot-related admission incidences were calculated using general population data from the Australian Bureau of Statistics. Diabetes foot-related sub-group admissions were also analysed. Chi-squared tests were used to assess changes in admissions over time. Results: Overall, 24,917 diabetes foot-related admissions occurred, accounting for 260,085 bed days, or 1.4% of all available Queensland hospital bed days (18,352,152). The primary reasons for these admissions were foot ulcers (49.8%), cellulitis (20.7%), peripheral vascular disease (17.8%) and osteomyelitis (3.8%). The diabetes foot-related admission incidence among the general population (per 100,000) fell by 22% (from 103.0 in 2005 to 80.7 in 2010, p < 0.001); bed days decreased by 18% (from 1,099 to 904, p < 0.001). Conclusion: Diabetes foot complications appear to be the primary reason for 1.4 of every 100 hospital bed days used in Queensland. There was a significant reduction in the incidence of diabetes foot-related admissions in Queensland between 2005 and 2010. This decrease coincided with a corresponding decrease in amputations and the implementation of several diabetes foot clinical programs throughout Queensland.
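As a quick check on the figures quoted above, the short sketch below reproduces the bed-day share and the relative reduction in admission incidence using only the values reported in the abstract; no other data are assumed.

```python
# Arithmetic behind the reported figures (values taken from the abstract).
admissions = 24_917
bed_days = 260_085
qld_bed_days = 18_352_152

bed_day_share = bed_days / qld_bed_days * 100
print(f"Share of all Queensland bed days: {bed_day_share:.1f}%")        # ~1.4%

# Incidence per 100,000 population, as reported for 2005 and 2010.
incidence_2005, incidence_2010 = 103.0, 80.7
reduction = (incidence_2005 - incidence_2010) / incidence_2005 * 100
print(f"Relative reduction in admission incidence: {reduction:.0f}%")   # ~22%
```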
Abstract:
Information on the foods patients like and dislike is the essential basis for planning menus that are acceptable to patients and promote adequate consumption. The aim of this study was to obtain quantitative data on the food preferences of inpatients at a large metropolitan public hospital for use in menu planning. The methodology was based on a study by Williams et al. (1988) and included additional questions about appetite and taste changes. The survey used a 9-point hedonic scale to rate foods listed in random order and was modified to incorporate more contemporary foods than those used in the original Williams study. Surveys were conducted by final-year University of Queensland dietetics students on Food Service Practicum at the Royal Brisbane and Women’s Hospital (929 beds) in 2012. The first survey (220 questions, n = 157) had a response rate of 61%. The second included more sandwich fillings and salads (231 questions, n = 219, response rate 67%). The total number surveyed was 376. The most preferred foods were roast potato, grilled steak, ice cream, fresh strawberries, roast lamb, roast beef, grapes and banana. The least preferred foods were grapefruit, soybeans, lentils, sardines, prune juice and grapefruit juice. Patients who reported taste changes (10%) had similar food preferences to those who did not. Patients who reported a poor/very poor appetite (10%) generally scored foods lower than those who reported an OK (22%) or good/very good (65%) appetite. The results of this study informed planning for a new patient menu at the RBWH in December 2012.
Abstract:
Objectives: To (i) identify predictors of admission, and (ii) describe outcomes for patients who arrived via ambulance at three Australian public Emergency Departments (EDs), before and after the opening of 41 additional ED beds within the area. Methods: A retrospective, comparative cohort study using deterministically linked health data collected between 3 September 2006 and 2 September 2008. Data included ambulance offload delay, time to see doctor, ED length of stay (ED LOS), admission requirement, access block, hospital length of stay and in-hospital mortality. Logistic regression analysis was undertaken to identify predictors of hospital admission. Results: One third of all 286,037 ED presentations were via ambulance (n = 79,196) and 40.3% required admission. After emergency capacity was increased, the only outcome measure to improve was in-hospital mortality; ambulance offload delay, time to see doctor, ED LOS, admission requirement, access block and hospital length of stay did not improve. Strong predictors of admission both before and after the increase in capacity included: age over 65 years, Australasian Triage Scale (ATS) category 1-3, diagnoses of circulatory or respiratory conditions, and ED LOS > 4 hours. With additional capacity, the odds ratios for these predictors increased for age > 65 years and ED LOS > 4 hours, and decreased for triage category and ED diagnoses. Conclusions: Expanding ED capacity from 81 to 122 beds within a health service area impacted favourably on mortality outcomes but not on time-related service outcomes such as ambulance offload time, time to see doctor and ED LOS. To improve all service outcomes when altering (increasing/decreasing) ED bed numbers, the whole healthcare system needs to be considered.
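The abstract reports predictors of admission as odds ratios from a logistic regression on linked presentation records. The sketch below shows, in Python with statsmodels, how such a model might be fitted and its coefficients expressed as odds ratios; the column names and synthetic data are illustrative assumptions, not the study's actual variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative linked-records table; field names are assumptions.
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "admitted":     rng.integers(0, 2, n),   # outcome: admitted to hospital
    "age_over_65":  rng.integers(0, 2, n),   # age > 65 years
    "ats_1_to_3":   rng.integers(0, 2, n),   # triage category 1-3
    "resp_or_circ": rng.integers(0, 2, n),   # respiratory/circulatory diagnosis
    "ed_los_gt_4h": rng.integers(0, 2, n),   # ED length of stay > 4 hours
})

X = sm.add_constant(df[["age_over_65", "ats_1_to_3", "resp_or_circ", "ed_los_gt_4h"]])
model = sm.Logit(df["admitted"], X).fit(disp=0)

# Exponentiated coefficients give the odds ratio for each predictor.
odds_ratios = np.exp(model.params)
print(odds_ratios)
```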
Abstract:
Traffic state estimation in an urban road network remains a challenge for traffic models, and the question of how such a network performs remains a difficult one for traffic operators to answer. Lack of detailed traffic information has long restricted research in this area. The introduction of Bluetooth into the automotive world presented an alternative that has now developed to a stage where large-scale test-beds are becoming available for traffic monitoring and model validation purposes. But how much confidence should we have in such data? This paper aims to give an overview of the usage of Bluetooth, primarily for the city-scale management of urban transport networks, and to encourage researchers and practitioners to take a more cautious look at what is currently understood as a mature technology for monitoring travellers in urban environments. We argue that the full value of this technology is yet to be realised, for the analytical accuracies peculiar to the data have still to be adequately resolved.
Abstract:
This article describes the first steps toward comprehensive characterization of molecular transport within scaffolds for tissue engineering. The scaffolds were fabricated using a novel melt electrospinning technique capable of constructing 3D lattices of layered polymer fibers with well-defined internal microarchitectures. The general morphology and structural order were then determined using T2-weighted magnetic resonance imaging and X-ray microcomputed tomography. Diffusion tensor microimaging was used to measure the time-dependent diffusivity and diffusion anisotropy within the scaffolds. The measured diffusion tensors were anisotropic and consistent with the cross-hatched geometry of the scaffolds: diffusion was least restricted in the direction perpendicular to the fiber layers. The results demonstrate that the cross-hatched scaffold structure preferentially promotes molecular transport vertically through the layers (z-axis), with more restricted diffusion in the directions of the fiber layers (x–y plane). Diffusivity in the x–y plane was observed to be invariant to the fiber thickness. The characteristic pore size of the fiber scaffolds can be probed by sampling the diffusion tensor at multiple diffusion times. Prospective application of diffusion tensor imaging for the real-time monitoring of tissue maturation and nutrient transport pathways within tissue engineering scaffolds is discussed.
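Diffusion anisotropy of the kind reported above is commonly summarised by the fractional anisotropy of the measured tensor and by its principal eigenvector. The Python sketch below shows that calculation for an illustrative tensor with higher diffusivity along the z-axis, consistent with the scaffold geometry described; the numerical values are assumptions, not measured data.

```python
import numpy as np

def fractional_anisotropy(D):
    """Fractional anisotropy of a 3x3 symmetric diffusion tensor."""
    evals = np.linalg.eigvalsh(D)
    md = evals.mean()                                   # mean diffusivity
    num = np.sqrt(1.5 * np.sum((evals - md) ** 2))
    den = np.sqrt(np.sum(evals ** 2))
    return num / den

# Illustrative tensor (units of 1e-9 m^2/s): diffusivity along z (through the
# fiber layers) larger than in the x-y plane, as reported for the scaffolds.
D = np.diag([1.2, 1.2, 2.0])
evals, evecs = np.linalg.eigh(D)
print("FA:", round(float(fractional_anisotropy(D)), 3))
print("principal direction:", evecs[:, np.argmax(evals)])   # expected ~z-axis
```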
Abstract:
Emergency healthcare is a high-profile component of modern healthcare systems and has been fundamentally transformed over the past three decades in many countries. However, despite this rapid development, and associated investments in service standards, there is a high level of concern about the performance of emergency health services, relating principally to system-wide congestion. The factors driving this problem are complex but relate largely to the combined impact of growing demand, an expanded scope of care and blocked access to inpatient beds. These factors are unlikely to disappear in the medium term despite the National Emergency Access Target. The aim of this article is to stimulate a conversation about the future design and functioning of emergency healthcare systems, examining what we understand about the problem and proposing a rationale that may underpin future strategic approaches. This is also an invitation to join the conversation.
Abstract:
The Wet Tropics region has a unique water asset and is also considered a priority region for the improvement of water quality entering the Great Barrier Reef due to a combination of high rainfall, intensive agricultural use, urban areas and the proximity of valuable reef assets to the coast. Agricultural activities are one of many identified threats to water quality and water flows in the Wet Tropics in terms of sediment and pollutant-related water quality decline. Information describing the current state of agricultural management practices across the region is patchy at best. Based on the best available information on agricultural management practices in the Wet Tropics in 2008, it is clear that opportunities exist to improve nutrient, sediment and pesticide management practice to reduce the impact on the water asset and the Great Barrier Reef. Based on current understandings of practices and the relationship between practices and reef water quality, the greatest opportunities for improved water quality are as follows:
· nutrients – correct rate and placement of fertilisers;
· pesticides – improved weed control planning, herbicide rates and calibration practice; and
· soil and sediment – implementation of new farming system practices.
The 2008-09 Reef Rescue program sought to accelerate the rate of adoption of improved management practices and, through Terrain, invested $6.8M in the 2008-09 year for:
· landholder water quality improvement incentive payments;
· cross-regional catchment repair of wetlands and riparian lands in areas of high sediment or nutrient loss; and
· partnerships in the region to lever resources and support for on-ground practice change.
The program delivered $3,021,999 in on-ground incentives to landholders in the Wet Tropics to improve farm practices from D or C level to B or A level. The landholder Water Quality Incentives Grants program received 300 individual applications for funding and funded 143 individual landholders to implement practice change across 36,098 ha of farmland. It is estimated that the Reef Rescue program facilitated practice change across 21% of the cane industry and 20% of the banana industry. The program levered an additional $2,441,166 in landholder cash contributions and a further $907,653 in non-cash in-kind contributions, bringing the total project value of the landholder grants program in the Wet Tropics to $6,370,819. Most funded projects targeted multiple water quality objectives, with a focus on nutrient and sediment reduction. Of the 143 projects funded, 115 addressed nutrient management either as the primary focus or in combination with strategies that targeted other water quality objectives. Overall, 82 projects addressed two or more water quality targets. Forty-five percent of incentive funds were allocated to new farming system practices (direct drill legumes, zonal tillage equipment, permanent beds, min-till planting equipment, GPS units, laser levelling), followed by 24% allocated to subsurface fertiliser applicators (subsurface application of fertiliser using a stool splitter or beside the stool, at the correct Six Easy Steps rate). As a result, Terrain estimates that the incentive grants achieved considerable reductions in nitrogen, phosphorus, sediment and pesticide loads. The program supported nutrient management training of 167 growers managing farms covering over 20% of the area harvested in 2008, and of 18 industry advisors and resellers. This resulted in 115 growers (155 farms) developing nutrient management plans.
The program also supported Integrated Weed Management training of 80 growers managing farms covering 8% of the area harvested in 2008, and of 6 industry advisors and resellers. This report, which draws on the best available Reef Rescue Management Monitoring, Evaluation, Reporting, and Improvement (MERI) information to evaluate program performance and impact on water quality outcomes, is the first in a series of annual reports that will assess and evaluate the impact of the Reef Rescue program on agricultural practices and water quality outcomes. The assessment is predominantly focused on the cane industry because of data availability. In the next stage, efforts will expand to:
· improve practice data for the banana and grazing industries;
· gain a better understanding of the water quality trends and the factors influencing them in the Wet Tropics; in particular, work will focus on linking the results of the Paddock to Reef monitoring program and practice change data to assess program impact;
· enhance estimations of the impact of practice change on pollutant loads from agricultural land use;
· gain a better understanding of the extent of ancillary practice change (change not directly funded) resulting from Reef Rescue training/education/communication programs; and
· provide a better understanding of the economic cost of practice change across the Wet Tropics region.
From an ecological perspective, water quality trends and the factors that may be contributing to change require further investigation. There is a critical need to work towards an enhanced understanding of the link between catchment land management practice change and reef water quality, so that reduced nutrient, sediment and pesticide discharge to the Great Barrier Reef can be quantified. This will also assist with future prioritisation of grant money to agricultural industries, catchments and sub-catchments. From a social perspective, the program has delivered significant water quality benefits from landholder education and training. It is believed that these activities are giving landholders the information and tools to implement further lasting change in their production systems and, in doing so, creating a change in attitude that is supportive and inclusive of Natural Resource Management (NRM). The program in the Wet Tropics has also considerably strengthened institutional partnerships for NRM, particularly between NRM, industry and extension organisations. As a result of the Reef Rescue program, all institutions are actively working together to collectively improve water quality. The Reef Rescue program is improving water quality entering the Great Barrier Reef Lagoon by catalysing substantial activity in the Wet Tropics region to improve land management practices and reduce the water quality impact of agricultural landscapes. The solid institutional partnerships between the regional body, industry, catchment and government organisations have been fundamental to the successful delivery of the landholder grant and catchment rehabilitation programs. Landholders have generally had a positive perception of and reaction to the program, its intent, and the practical, focused nature of grant-based support. Demand in the program was extremely high in 2008-09 and is expected to increase in 2009-2010.