954 results for appropriate technology
Abstract:
As societal awareness of sustainability gains momentum worldwide, the higher education sector is expected to take the lead in education, research and the promotion of sustainable development. Universities have the diversity of skills and knowledge to explore new concepts and issues, the academic freedom to offer unbiased observations, and the capacity to engage in experimentation for solutions. There is a global trend of universities recognising and responding to the sustainability challenge. By adopting green technologies, buildings on university campuses have the potential to offer highly productive, green environments for a quality learning experience for students while minimising environmental impacts. Despite the potential benefits and the metaphorical link to sustainability, few universities have moved towards widely implementing Green Roofs and Living Walls on campuses, even though these technologies have had more successful applications in commercial and residential buildings. Few past research efforts have examined the fundamental barriers to the implementation of sustainable projects on campuses at the organizational level. To address this deficiency, an on-going research project is being undertaken by Queensland University of Technology in Australia. The research aims to develop a comprehensive framework to facilitate better decision making for the promotion of Green Roof and Living Wall applications on campuses. It will explore and highlight organizational factors, investigate and emphasize project delivery issues, and identify the critical technical indicators for Green Roof and Living Wall implementation. The expected outcomes of this research have the potential to enhance Green Roof and Living Wall delivery in Australian universities, as a vital step towards realizing sustainability in the higher education sector.
Abstract:
In just under three months, worldwide sales of Apple's iPad tablet exceeded 3 million units. The iPad, along with rival products, signifies a shift in the way print and other media products are purchased and consumed. Despite initial skepticism about the uptake of the device, numerous industries have been quick to adapt it to their specific needs. Based around a newly developed six-point typology of "post-PC" device utility, this project undertook a significant review of publicly available material to identify worldwide trends in iPad adoption and use within the tertiary sector.
Abstract:
Understanding the relationship between diet, physical activity and health in humans requires accurate measurement of body composition and daily energy expenditure. Stable isotopes provide a means of measuring total body water (TBW) and daily energy expenditure under free-living conditions. While the use of isotope ratio mass spectrometry (IRMS) for the analysis of 2H (deuterium) and 18O (oxygen-18) is well established in the field of human energy metabolism research, numerous questions remain regarding the factors that influence analytical and measurement error in this methodology. This thesis comprised four studies with the following emphases. The aim of Study 1 was to determine the analytical and measurement error of the IRMS with regard to sample handling under certain conditions. Study 2 compared TEE (total daily energy expenditure) values derived using two commonly employed equations; further, saliva and urine samples collected at different times were used to determine whether clinically significant differences would occur. Study 3 was undertaken to determine the appropriate collection times for TBW estimates and derived body composition values. Finally, Study 4 was a single case study investigating whether TEE measures are affected when the human condition changes due to altered exercise and water intake. Study 1 validated the laboratory approaches used to measure isotopic enrichment, to ensure accurate (to international standards), precise (reproducibility of three replicate samples) and linear (isotope ratio constant over the expected concentration range) results. This established the machine variability of the IRMS equipment in use at Queensland University for both TBW and TEE. Either 0.4 mL or 0.5 mL sample volumes for both oxygen-18 and deuterium were statistically acceptable (p>0.05), with a within-analysis variance of 5.8 delta VSMOW units for deuterium and 0.41 delta VSMOW units for oxygen-18.
This variance was used as "within-analysis noise" to determine sample deviations. It was also found that equilibration time had no influence on oxygen-18 or deuterium values when comparing the minimum (oxygen-18: 24 hr; deuterium: 3 days) and maximum (oxygen-18 and deuterium: 14 days) equilibration times. With regard to preparation using the vacuum line, any order of preparation is suitable, as the TEE values fall within 8% of each other regardless of preparation order; an 8% variation is acceptable for TEE values due to biological and technical errors (Schoeller, 1988). However, for the automated line, deuterium must be assessed first, followed by oxygen-18, as the automated line does not evacuate tubes but merely refills them with an injection of gas for a predetermined time. Any fractionation (which may occur for both isotopes) would cause a slight elevation in the values and hence a lower TEE. The purpose of the second and third studies was to investigate the use of IRMS to measure TEE and TBW and to validate the current IRMS practices with regard to sample collection times for urine and saliva, the use of two TEE equations from different research centres, and the body composition values derived from these TEE and TBW values. Following the collection of a fasting baseline urine and saliva sample, 10 people (8 women, 2 men) were dosed with a doubly labeled water dose comprising 1.25 g of 10% oxygen-18 and 0.1 g of 100% deuterium per kg body weight. Samples were collected hourly for 12 hrs on the first day, and then morning, midday and evening samples were collected for the next 14 days. The samples were analyzed using an isotope ratio mass spectrometer. For TBW, time to equilibration was determined using three commonly employed data analysis approaches. Isotopic equilibration was reached in 90% of the sample by hour 6, and in 100% of the sample by hour 7.
With regard to the TBW estimations, the optimal time for urine collection was found to be between hours 4 and 10, as there was no significant difference between values in this window. In contrast, statistically significant differences in TBW estimations were found between hours 1-3 and hours 11-12 when compared with hours 4-10. Most of the individuals in this study were in equilibrium after 7 hours. The TEE equations of Prof. Dale Schoeller (Chicago, USA, IAEA) and Prof. K. Westerterp were compared with that of Prof. Andrew Coward (Dunn Nutrition Centre). When comparing values derived from samples collected in the morning and evening, there was no effect of time or equation on the resulting TEE values. The fourth study was a pilot study (n=1) to test the variability in TEE resulting from manipulations in fluid consumption and level of physical activity of the magnitude that may be expected in a sedentary adult. Physical activity levels were manipulated by increasing the number of steps per day to mimic the increases that may result when a sedentary individual commences an activity program. The study comprised three sub-studies completed on the same individual over a period of 8 months. There were no significant changes in TBW across the studies, even though the elimination rates changed with the supplemented water intake and additional physical activity. The extra activity may not have been sufficiently strenuous, nor the water intake high enough, to cause a significant change in TBW and hence in the CO2 production and TEE values. The measured TEE values show good agreement with estimated values calculated from an RMR of 1455 kcal/day, a DIT of 10% of TEE, and activity based on measured steps. The covariance values tracked when plotting the residuals were found to be representative of "well-behaved" data and are indicative of the analytical accuracy.
The ratio and product plots were found to reflect the water turnover and CO2 production and thus could, with further investigation, be employed to identify the changes in physical activity.
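The core arithmetic of the doubly labeled water method discussed above can be sketched as follows. This is a hedged illustration only: the correction factors follow the widely cited Schoeller-style two-point approach (CO2 production from the difference between the oxygen-18 and deuterium elimination rates, converted to energy via the Weir equation), but the exact dilution-space ratios, fractionation corrections and food quotient are lab- and subject-specific assumptions here, not the thesis's actual equations.

```python
# Hedged sketch of a two-point doubly labeled water (DLW) calculation.
# Constants (1.01, 1.04, 2.078, 0.0246, 1.05) are illustrative values
# from the Schoeller-style approach; real studies fit multi-point
# elimination curves with lab-specific parameters.

import math

def elimination_rate(enrich_start, enrich_end, days):
    """Elimination rate k (per day) from two post-dose enrichments
    above baseline, assuming mono-exponential isotope decay."""
    return math.log(enrich_start / enrich_end) / days

def tee_kcal_per_day(tbw_mol, k_o18, k_d, food_quotient=0.85):
    """Total energy expenditure (kcal/day) from total body water
    (moles of water) and the two isotope elimination rates (per day)."""
    flux = tbw_mol * (1.01 * k_o18 - 1.04 * k_d)  # isotope flux difference
    r_gf = 1.05 * flux                            # fractionated gaseous water loss
    r_co2_mol = flux / 2.078 - 0.0246 * r_gf      # mol CO2 per day
    r_co2_litres = r_co2_mol * 22.4               # litres per day at STP
    # Weir equation, with VO2 inferred from an assumed food quotient
    return r_co2_litres * (1.106 + 3.941 / food_quotient)
```

For a 40 kg TBW adult (about 2220 mol of water) with plausible elimination rates, this yields a TEE in the low-2000s of kcal/day, consistent in magnitude with the RMR-based estimate quoted in the abstract.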
Abstract:
Older adults, especially those who are acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in Asian elderly populations and none has been validated in Singapore. Due to the multiethnic, cultural and language differences among Singapore older adults, the results from non-Asian validation studies may not be applicable. It is therefore important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)], administered in no particular order. The total scores were not computed during the screening process, so the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using the single question "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to decline in nutritional status (defined as weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. The results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.
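The screening-tool validation described in this abstract rests on standard diagnostic-performance arithmetic: each tool's binary result is cross-tabulated against the SGA reference standard. A minimal sketch follows; the per-subject data in the usage example are invented for illustration (the study itself reports sensitivity 84% and specificity 79% for the TTSH NST, plus an AUC computed from raw scores, which this sketch does not reproduce).

```python
# Minimal sketch of validating a binary screening tool against a
# reference standard (here, the SGA). Inputs are paired per-subject
# booleans; the example data are hypothetical.

def diagnostic_performance(screen_positive, sga_malnourished):
    """Return (sensitivity, specificity) of a binary screen against a
    binary reference standard."""
    pairs = list(zip(screen_positive, sga_malnourished))
    tp = sum(1 for s, r in pairs if s and r)          # correctly flagged
    fn = sum(1 for s, r in pairs if not s and r)      # missed cases
    fp = sum(1 for s, r in pairs if s and not r)      # false alarms
    tn = sum(1 for s, r in pairs if not s and not r)  # correctly cleared
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 10-subject example:
screen = [True, True, True, False, True, False, False, False, False, False]
sga = [True, True, True, True, False, False, False, False, False, True]
sens, spec = diagnostic_performance(screen, sga)  # 0.6, 0.8 on these data
```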
Abstract:
The adoption of IT Governance (ITG) continues to be an important topic for research. Many researchers have focused their attention on how these practices are currently implemented across diverse areas and industries. The literature shows that the majority of these studies are based on industries and organizations in developed countries; very few look specifically at the context of a developing country, and there appears to be a lack of research identifying the barriers or inhibitors to ITG adoption within the context of an emerging yet still developing Asian country. This research sets out to justify, substantiate and improve on an a priori model developed to study the barriers to the adoption of ITG practice, using qualitative data obtained through a series of semi-structured interviews conducted with organizations in Malaysia.
Abstract:
Objective: To highlight the registration issues for nurses who wish to practise nationally, particularly those practising within the telehealth sector. Design: As part of a national clinical research study, applications were made to every state and territory for mutual recognition of nursing registration and a fee waiver for telenursing cross-border practice for a period of three years. These processes are described using a case study approach. Outcome: The aim of this case study was to achieve registration in every state and territory of Australia without paying multiple fees, by using mutual recognition provisions and the cross-border fee waiver policy of the nurse regulatory authorities, in order to practise telenursing. Results: Mutual recognition and the fee waiver for cross-border practice were granted unconditionally in two states, Victoria (Vic) and Tasmania (Tas), and one territory, the Northern Territory (NT). The remaining states and territories would grant only temporary registration for the period of the project, or none at all, due to policy restrictions or nurse regulatory authority (NRA) Board decisions. As a consequence of gaining the fee waiver, the annual cost of registration was a maximum of $145 per annum, as opposed to a potential $959 for initial registration and $625 for annual renewal. Conclusions: Having eight individual Nurses Acts and NRAs for a population of 265,000 nurses clearly indicates a case of over-regulation in this country. The structure of the regulation of nursing in Australia is a barrier to the changing and evolving role of nurses in the 21st century and a significant factor when considering workforce planning.
Abstract:
Background: In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation), in concert with the AO Research Institute (ARI) and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. Methods: The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows. Results & Conclusion: Anaesthesia and pain management protocols for research animals should follow the standard protocols applied in clinical work for the species involved; this will improve morbidity and mortality outcomes. A database should be established to facilitate the selection of anaesthesia and pain management protocols for specific experimental surgical procedures, and adopted as an International Standard (IS) according to the animal species selected. A list of 10 golden rules and requirements for the conduct of animal experiments in musculoskeletal research was drawn up, comprising: 1) intelligent study designs that yield appropriate answers; 2) minimal complication rates (5 to max. 10%); 3) defined end-points for both welfare and scientific outputs, analogous to the quality assessment (QA) audit of protocols in GLP studies; 4) sufficient detail in the materials and methods applied; 5) consideration of potentially confounding variables (genetic background, and seasonal, hormonal, size, histological, and biomechanical differences); 6) post-operative management with emphasis on analgesia and follow-up examinations; 7) study protocols that satisfy the criteria established for a "justified animal study"; 8) surgical expertise to conduct surgery on animals; 9) pilot studies as a critical part of model validation and of powering the definitive study design; and 10) criteria for funding agencies to include requirements related to animal experiments as part of their overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates, related both to improving the welfare and to the scientific rigour of animal-based research models, are urgently needed as part of the international harmonization of standards.
Abstract:
A Networked Control System (NCS) is a feedback-driven control system wherein the control loops are closed through a real-time network. Control and feedback signals in an NCS are exchanged among the system's components in the form of information packets via the network. Nowadays, wireless technologies such as IEEE 802.11 are being introduced to modern NCSs as they offer better scalability, larger bandwidth and lower costs. However, networks of this type were not designed for NCSs: the characteristics of wireless channels introduce a large amount of dropped data and unpredictable, long transmission latencies, which are not acceptable for real-time control systems. Real-time control is a class of time-critical application that requires lossless data transmission and small, deterministic delays and jitter. For a real-time control system, network-introduced problems may degrade the system's performance significantly or even cause system instability. It is therefore important to develop solutions that satisfy real-time requirements in terms of delays, jitter and data losses, and guarantee high levels of performance for time-critical communications in Wireless Networked Control Systems (WNCSs). To improve or even guarantee real-time performance in wireless control systems, this thesis presents several network layout strategies and a new transport layer protocol. Firstly, the real-time performance, in terms of data transmission delays and reliability, of IEEE 802.11b-based UDP/IP NCSs is evaluated through simulations. After analysis of the simulation results, network layout strategies are presented that achieve relatively small and deterministic network-introduced latencies and reduce data loss rates. These are effective in providing better network performance without degrading the performance of other services.
After the investigation into the layout strategies, the thesis presents a new transport protocol which is more efficient than UDP and TCP at guaranteeing reliable and time-critical communications in WNCSs. From the networking perspective, introducing appropriate communication schemes, modifying existing network protocols and devising new protocols have been the most effective and popular ways to improve, or even guarantee to a certain extent, real-time performance. Most previously proposed schemes and protocols were designed for real-time multimedia communication and are not suitable for real-time control systems. Therefore, devising a new network protocol that can satisfy real-time requirements in WNCSs is the main objective of this research project. The Conditional Retransmission Enabled Transport Protocol (CRETP) is the new network protocol presented in this thesis. Retransmitting unacknowledged data packets is effective in compensating for data losses. However, every data packet in a real-time control system has a deadline, and data is assumed invalid, or even harmful, once its deadline expires. CRETP retransmits data only while it is still valid, which guarantees data timeliness and saves memory and network resources. The conditional retransmission mechanism achieves a trade-off between delivery reliability, transmission latency and network resource usage. Protocol performance was evaluated through extensive simulations, including comparative studies between CRETP, UDP and TCP. The results showed that CRETP significantly: 1) improved the reliability of communication; 2) guaranteed the validity of received data; 3) reduced transmission latency to an acceptable value; and 4) made delays relatively deterministic and predictable.
Furthermore, CRETP achieved the best overall performance in comparative studies which makes it the most suitable transport protocol among the three for real-time communications in a WNCS.
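The conditional-retransmission idea at the heart of CRETP can be sketched in a few lines: an unacknowledged packet is retransmitted only while its deadline is still in the future, and expired packets are dropped so their buffers can be reclaimed. The class and method names below are illustrative only, not the actual protocol implementation or its state machine.

```python
# Hedged sketch of deadline-conditional retransmission (the CRETP idea).
# Names and structure are hypothetical; CRETP itself also handles
# acknowledgement timing, sequencing, and flow over a wireless channel.

from dataclasses import dataclass

@dataclass
class Packet:
    seq: int
    payload: bytes
    deadline: float  # absolute time by which the data is still valid

class ConditionalRetransmitter:
    def __init__(self):
        self.unacked = {}  # seq -> Packet awaiting acknowledgement

    def send(self, pkt):
        self.unacked[pkt.seq] = pkt

    def on_ack(self, seq):
        self.unacked.pop(seq, None)  # acknowledged: free the buffer

    def due_for_retransmission(self, now):
        """Packets still worth retransmitting; expired ones are dropped,
        since stale control data is invalid (or even harmful)."""
        expired = [s for s, p in self.unacked.items() if p.deadline <= now]
        for s in expired:
            del self.unacked[s]  # retransmitting stale data wastes bandwidth
        return [self.unacked[s] for s in sorted(self.unacked)]
```

This is the trade-off the abstract describes: reliability is improved for data that can still meet its deadline, while memory and bandwidth are not spent on data that cannot.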
Abstract:
The measurement error model is a well-established statistical method for regression problems in the medical sciences, although it is rarely used in ecological studies. While the situations in which it is appropriate may be less common in ecology, there are instances in which its use may benefit prediction and the estimation of parameters of interest. We have chosen to explore this topic using a conditional independence model in a Bayesian framework with a Gibbs sampler, as this gives a great deal of flexibility, allowing us to analyse a number of different models without losing generality. Using simulations and two examples, we show how the conditional independence model can be used in ecology, and when it is appropriate.
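A toy version of such a model can be sketched to show why the machinery matters. Below, the covariate of interest x is latent; we observe w = x + noise, and y depends on the true x. Regressing y directly on w attenuates the slope, while a Gibbs sampler that alternates between the latent x, the slope and the intercept recovers it. This is a minimal illustration under simplifying assumptions (known error standard deviations, a normal prior on x, flat priors on the regression coefficients), not the paper's actual models.

```python
# Minimal Gibbs sampler for a classical measurement error regression,
# assuming KNOWN error scales (a deliberate simplification).

import math
import random

random.seed(42)

# Simulate: latent x drives y; we only observe the error-prone w.
n = 300
a_true, b_true = 1.0, 2.0
sd_x, sd_u, sd_e = 1.0, 0.5, 0.5  # prior/error scales, assumed known
x_true = [random.gauss(0.0, sd_x) for _ in range(n)]
w = [xi + random.gauss(0.0, sd_u) for xi in x_true]
y = [a_true + b_true * xi + random.gauss(0.0, sd_e) for xi in x_true]

# Naive regression of y on w attenuates the slope toward zero.
mw, my = sum(w) / n, sum(y) / n
naive_b = (sum((wi - mw) * (yi - my) for wi, yi in zip(w, y))
           / sum((wi - mw) ** 2 for wi in w))

# Gibbs sampler: alternate between latent x, slope b, intercept a.
a, b = 0.0, 0.0
x = list(w)  # initialise the latent covariate at the observed values
draws_b = []
for it in range(1500):
    for i in range(n):
        # x_i | rest: combine prior N(0, sd_x^2) with the w_i and y_i likelihoods
        prec = 1 / sd_x**2 + 1 / sd_u**2 + b**2 / sd_e**2
        mean = (w[i] / sd_u**2 + b * (y[i] - a) / sd_e**2) / prec
        x[i] = random.gauss(mean, math.sqrt(1 / prec))
    # b | rest (flat prior): standard regression conditional
    prec_b = sum(xi * xi for xi in x) / sd_e**2
    mean_b = sum(xi * (yi - a) for xi, yi in zip(x, y)) / sd_e**2 / prec_b
    b = random.gauss(mean_b, math.sqrt(1 / prec_b))
    # a | rest (flat prior): mean of the residuals
    mean_a = sum(yi - b * xi for xi, yi in zip(x, y)) / n
    a = random.gauss(mean_a, math.sqrt(sd_e**2 / n))
    if it >= 500:  # discard burn-in
        draws_b.append(b)

b_hat = sum(draws_b) / len(draws_b)  # posterior mean of the slope
```

On this simulated data the naive slope sits well below the true value of 2 (attenuated by the reliability ratio), while the posterior mean from the Gibbs sampler recovers it, which is the practical payoff the abstract refers to.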
Abstract:
The School of Electrical and Electronic Systems Engineering at Queensland University of Technology, Brisbane, Australia (QUT), offers three bachelor degree courses in electrical and computer engineering. In all its courses there is a strong emphasis on signal processing. A newly established Signal Processing Research Centre (SPRC) has played an important role in the development of the signal processing units in these courses. This paper describes the unique design of the undergraduate program in signal processing at QUT, the laboratories developed to support it, and the criteria that influenced the design.
Abstract:
OBJECTIVE: This paper reviews the epidemiological evidence on the relationship between ambient temperature and morbidity, assesses the methodological issues in previous studies, and proposes future research directions. DATA SOURCES AND DATA EXTRACTION: We searched the PubMed database for epidemiological studies on ambient temperature and morbidity of non-communicable diseases published in refereed English-language journals before June 2010. Forty relevant studies were identified; of these, 24 examined the relationship between ambient temperature and morbidity, 15 investigated the short-term effects of heatwaves on morbidity, and 1 assessed both temperature and heatwave effects. DATA SYNTHESIS: Descriptive and time-series studies were the two main research designs used to investigate the temperature–morbidity relationship. Measurements of temperature exposure and health outcomes used in these studies differed widely. The majority of studies reported a significant relationship between ambient temperature and total or cause-specific morbidities. However, there were some inconsistencies in the direction and magnitude of non-linear lag effects. The lag effect of hot temperatures on morbidity was shorter (several days) than that of cold temperatures (up to a few weeks). The temperature–morbidity relationship may be confounded and/or modified by socio-demographic factors and air pollution. CONCLUSIONS: There is a significant short-term effect of ambient temperature on total and cause-specific morbidities. However, further research is needed to determine an appropriate temperature measure, to consider a diverse range of morbidities, and to use consistent methodology so that different studies are more comparable.