554 results for wide-area surveillance
Abstract:
Insufficient physical activity is a common and growing health problem that leads to poorer health, whereas regular physical activity can reduce the risk of chronic disease and improve health and well-being. To address this, 25 studies have used community-wide programs that combine more than one approach in a single program. When we examined the available research, we found a lack of well-designed studies that could show whether this approach was beneficial: some studies claimed that community-wide programs improved physical activity while others did not, so it was not possible to determine what might work. Future research is needed with improved designs, better outcome measures and larger samples of participants.
Abstract:
In November 1999, the Queensland Health (QH) Transition to Practice Nurse Education Program - Intensive Care (TPNEP-IC) was initiated in QH Intensive Care Units (ICUs) across Queensland. This 12-month, state-wide, workplace-based education program has set minimum standards for intensive care nursing education and, therefore, minimum standards for intensive care nursing practice in QH. In its 12 years of operation, 824 nurses have completed TPNEP-IC, 761 achieving academic credit status and 453 using this credit to undertake postgraduate study in critical/intensive care nursing at three Queensland universities. These outcomes were achieved through the appointment of nurse educators within ICUs who, through a united and strong commitment to this state-wide approach, formed collaborative professional networks that led to the development, implementation and maintenance of the program. These networks also provided a framework of support for discussing and disseminating evidence-based practice, endorsing quality processes for TPNEP-IC and nurturing leadership potential among educators. Challenges to overcome included obtaining adequate resources to support all aspects of the program, gaining local management and administrative support, and embedding TPNEP-IC within ICU culture. The program's 12 years of operation have demonstrated its long-term sustainability. It is now being relaunched through a blended learning approach utilising e-learning strategies. To capitalise on its current success, a strong commitment from all stakeholders will be required to ensure its ongoing sustainability.
Genome-wide association study identifies a common variant associated with risk of endometrial cancer
Abstract:
Background Cohort studies can provide valuable evidence of cause and effect relationships but are subject to loss of participants over time, limiting the validity of findings. Computerised record linkage offers a passive and ongoing method of obtaining health outcomes from existing routinely collected data sources. However, the quality of record linkage is reliant upon the availability and accuracy of common identifying variables. We sought to develop and validate a method for linking a cohort study to a state-wide hospital admissions dataset with limited availability of unique identifying variables. Methods A sample of 2000 participants from a cohort study (n = 41 514) was linked to a state-wide hospitalisations dataset in Victoria, Australia using the national health insurance (Medicare) number and demographic data as identifying variables. Availability of the health insurance number was limited in both datasets; therefore linkage was undertaken both with and without use of this number and agreement tested between both algorithms. Sensitivity was calculated for a sub-sample of 101 participants with a hospital admission confirmed by medical record review. Results Of the 2000 study participants, 85% were found to have a record in the hospitalisations dataset when the national health insurance number and sex were used as linkage variables and 92% when demographic details only were used. When agreement between the two methods was tested the disagreement fraction was 9%, mainly due to "false positive" links when demographic details only were used. A final algorithm that used multiple combinations of identifying variables resulted in a match proportion of 87%. Sensitivity of this final linkage was 95%. Conclusions High quality record linkage of cohort data with a hospitalisations dataset that has limited identifiers can be achieved using combinations of a national health insurance number and demographic data as identifying variables.
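The multi-pass strategy described above, linking first on the insurance number plus sex and then falling back to demographic details, can be sketched as a simple deterministic procedure. This is an illustrative sketch only; the field names (`medicare`, `surname`, `dob`, `postcode`) and the exact pass order are assumptions, not the study's actual algorithm.

```python
# Hypothetical sketch of multi-pass deterministic record linkage.
# Field names and pass order are illustrative assumptions.

def link_records(cohort, hospital):
    """Try successively looser combinations of identifying variables."""
    passes = [
        ("medicare", "sex"),                    # pass 1: insurance number + sex
        ("surname", "dob", "sex", "postcode"),  # pass 2: demographics only
    ]
    links = {}
    for keys in passes:
        # index hospital records by the current key combination
        index = {}
        for rec in hospital:
            if all(rec.get(k) is not None for k in keys):
                index[tuple(rec[k] for k in keys)] = rec["admission_id"]
        for person in cohort:
            if person["id"] in links:
                continue  # already linked on a stricter pass
            if all(person.get(k) is not None for k in keys):
                key = tuple(person[k] for k in keys)
                if key in index:
                    links[person["id"]] = index[key]
    return links

cohort = [
    {"id": 1, "medicare": "1234", "sex": "F",
     "surname": "Smith", "dob": "1950-01-01", "postcode": "3000"},
    {"id": 2, "medicare": None, "sex": "M",
     "surname": "Jones", "dob": "1948-05-12", "postcode": "3121"},
]
hospital = [
    {"admission_id": "A1", "medicare": "1234", "sex": "F",
     "surname": "Smith", "dob": "1950-01-01", "postcode": "3000"},
    {"admission_id": "A2", "medicare": None, "sex": "M",
     "surname": "Jones", "dob": "1948-05-12", "postcode": "3121"},
]
print(link_records(cohort, hospital))  # {1: 'A1', 2: 'A2'}
```

Running the stricter pass first mirrors the paper's finding that demographic-only matching produces more "false positive" links: records claimed by a high-confidence key are never re-matched on looser criteria.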
Abstract:
Listening is a basic and complementary skill in second language learning. In language teaching, the term refers to a complex process that allows us to understand spoken language. Listening, the most widely used language skill, is often employed in conjunction with the other skills of speaking, reading and writing. It is not only a skill area in first language (L1) performance, but also a critical means of acquiring a second language (L2). Listening is the channel through which we process language in real time, employing pacing, units of encoding and decoding (the two processes central to interpretation and meaning making) and pausing (which allows for reflection) that are unique to spoken language. Despite the wide range of areas investigated in listening strategy training, there is a lack of research looking specifically at how effectively L1 listening strategy training may transfer to L2. To investigate the development of any such transfer patterns, the instructional design and implementation of L1 listening strategy training will be critical.
Abstract:
Topographic structural complexity of a reef is highly correlated with coral growth rates, coral cover and overall levels of biodiversity, and is therefore integral in determining ecological processes. Modeling these processes commonly includes measures of rugosity obtained from a wide range of survey techniques that often fail to capture rugosity at different spatial scales. Here we show that accurate estimates of rugosity can be obtained from video footage captured using underwater video cameras (i.e., monocular video). To demonstrate the accuracy of our method, we compared the results to in situ measurements of a 2 m x 20 m area of forereef at Glovers Reef atoll in Belize. Sequential pairs of images were used to compute fine-scale bathymetric reconstructions of the reef substrate, from which precise measurements of rugosity and reef topographic structural complexity can be derived across multiple spatial scales. To achieve accurate bathymetric reconstructions from uncalibrated monocular video, the position of the camera for each image in the video sequence and the intrinsic parameters (e.g., focal length) must be computed simultaneously. We show that these parameters can often be determined when the data exhibit parallax-type motion, and that rugosity and reef complexity can be accurately computed from existing video sequences taken with any type of underwater camera in any reef habitat or location. This technique opens a wide range of possibilities for future coral reef research by providing a cost-effective and automated method of determining structural complexity and rugosity in both new and historical video surveys of coral reefs.
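Once a bathymetric profile has been reconstructed, rugosity is conventionally the ratio of the contoured (chain) length of the substrate to its straight-line length, the digital analogue of the chain-and-tape method. The sketch below illustrates that standard index on a depth profile; the sample values are invented, and this is not the paper's full multi-scale pipeline.

```python
import math

def rugosity(depths, dx):
    """Linear rugosity of a depth profile: chain length over straight-line length.

    depths: depth samples (m) at horizontal spacing dx (m).
    A perfectly flat profile gives 1.0; rougher substrate gives larger values.
    """
    chain = sum(math.hypot(dx, depths[i + 1] - depths[i])
                for i in range(len(depths) - 1))
    straight = dx * (len(depths) - 1)
    return chain / straight

flat = [2.0, 2.0, 2.0, 2.0]       # featureless substrate
rough = [2.0, 2.5, 1.8, 2.6]      # invented, topographically complex profile
print(rugosity(flat, 0.5))   # 1.0
print(rugosity(rough, 0.5))  # > 1.0
```

Because the index depends on the sampling interval `dx`, computing it at several spacings from the same reconstruction is one way to express complexity across multiple spatial scales, as the abstract describes.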
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise over a large area, taking into account dynamic network traffic flow and buildings. For more precise estimation of noise in urban networks, where vehicles are mainly in stop-and-go running conditions, vehicle sound power levels (for accelerating, decelerating, cruising and idling vehicles) are incorporated in DRONE. The calculation performance of DRONE is improved by evaluating noise in two steps: first estimating a unit noise database and then integrating it with the traffic simulation. The paper discusses the process from traffic simulation to contour maps and presents the implementation of DRONE for Tsukuba city.
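The two-step idea, precomputing a per-vehicle "unit noise" level at each receiver and then combining it with simulated vehicle counts, reduces to an energetic (power) sum of decibel contributions at integration time. The sketch below shows that combination step only; the unit levels and vehicle states are invented for illustration and are not DRONE's actual database values.

```python
import math

# Assumed per-vehicle unit noise levels at a receiver, in dB(A).
# These numbers are illustrative placeholders, not values from DRONE.
UNIT_DB = {"cruising": 68.0, "accelerating": 72.0,
           "decelerating": 66.0, "idling": 60.0}

def combined_level(counts):
    """Energetic (power) sum of per-vehicle decibel contributions.

    counts: mapping of vehicle state -> number of vehicles in that state
    at one simulated time step. Decibels cannot be added directly, so each
    level is converted back to a power ratio before summing.
    """
    energy = sum(n * 10 ** (UNIT_DB[state] / 10) for state, n in counts.items())
    return 10 * math.log10(energy)

# one simulated time step at a receiver near an intersection
print(round(combined_level({"cruising": 5, "accelerating": 3, "idling": 4}), 1))
```

Splitting the work this way means the expensive geometry (propagation past buildings) is solved once per receiver when building the unit database, while the cheap energetic sum is repeated every simulation step.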
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is the key factor for an organisation and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now widely applied to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows; then a data capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, several characteristics of control charts in the health context, including the need to monitor attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. A Bayesian approach is also proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. The approach yields highly informative estimates of change point parameters, since the results are based on probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are demonstrated in monitoring hospital outcomes, where the developed Bayesian estimator reported the true time of shifts, compared with known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is reinforced when the probability quantification, flexibility and generalisability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
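The simplest of the change scenarios described, a step change in a Poisson rate, admits a closed-form posterior over the change point when conjugate Gamma priors are placed on the before/after rates, so no MCMC is needed for this special case. The sketch below is a minimal illustration of that idea with invented count data and assumed Gamma(1, 1) priors; it is not the thesis's estimator, which handles trends, multiple changes and risk adjustment.

```python
import math

def changepoint_posterior(y, a=1.0, b=1.0):
    """Posterior over the change point tau of a step change in a Poisson rate.

    Gamma(a, b) priors on the rates before and after the change; uniform
    prior on tau. Returns normalised posterior probabilities for
    tau = 1 .. n-1 (change occurring after observation tau).
    """
    def log_seg(seg):
        s, m = sum(seg), len(seg)
        # log marginal likelihood of one segment under Poisson-Gamma conjugacy
        # (the 1/prod(y_t!) factor is constant in tau and omitted)
        return (a * math.log(b) + math.lgamma(a + s)
                - (a + s) * math.log(b + m) - math.lgamma(a))

    n = len(y)
    logs = [log_seg(y[:t]) + log_seg(y[t:]) for t in range(1, n)]
    mx = max(logs)                       # subtract max for numerical stability
    w = [math.exp(l - mx) for l in logs]
    z = sum(w)
    return [x / z for x in w]

# invented counts with a shift from rate ~2 to rate ~8 after observation 5
y = [2, 1, 3, 2, 2, 9, 7, 8, 10, 8]
post = changepoint_posterior(y)
tau_hat = max(range(len(post)), key=post.__getitem__) + 1
print(tau_hat)  # posterior mode of the change point
```

Because the output is a full probability distribution over tau rather than a single point, it supports the probability quantification the abstract emphasises, e.g. credible intervals for when the shift occurred.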
Abstract:
The year 2010 was the wettest year on record for Queensland, Australia, and the wettest since 1974 for Southeast Queensland. The extremely heavy rain in early January 2011 fell on the heavily saturated catchments of the Brisbane and Stanley River systems, producing significant runoff that rapidly became a widespread and devastating flood event. The area of inundation was equivalent to the total land area of France and Germany combined. Over 200,000 people were affected, leaving 35 people dead and 9 missing. The damage bill was estimated at over $1B and the cost to the economy at over $10B, with over 30,000 homes and 6,000 businesses flooded and 86 towns and regional centres affected. Prompt disbursement of disaster funding to the affected population was paramount in helping individuals return their lives to some normality. However, the payout of insurance claims proved to be a major source of community anger. The ongoing impasse in the payment of insurance compensation is attributed to the nature and number of claims, confusing definitions of flooding, and the lack of accurate information needed to determine which individual properties were affected and the legitimacy of claims. Information was not readily available at the micro level, including the extent and type of inundation, flood heights at the property level, and the cause of damage. Events during the aftermath highlighted widespread community misconceptions concerning the technical factors associated with the flood event and their impact on access to legitimate compensation and assistance. Individual and community-wide concerns and frustration, anger and depression arose from delays in the timely settlement of insurance claims. Lessons learnt during the aftermath are presented in the context of their importance as a basis for building resilience in communities impacted by the flood event.