852 results for Formative assessment framework. Assessment tools. Ames
Abstract:
Situation Background Assessment and Recommendation (SBAR): Undergraduate Perspectives C Morgan, L Adams, J Murray, R Dunlop, IK Walsh. Ian K Walsh, Centre for Medical Education, Queen’s University Belfast, Mulhouse Building, Royal Victoria Hospital, Grosvenor Road, Belfast BT12 6DP Background and Purpose: Structured communication tools are used to improve the quality of team communication.1,2 The Situation Background Assessment and Recommendation (SBAR) tool is widely adopted within patient safety.3 SBAR effectiveness is reportedly equivocal, suggesting its use is not sustained beyond initial training.4-6 Understanding the perspectives of those using SBAR may further improve clinical communication. We investigated senior medical undergraduate perspectives on SBAR, particularly when communicating with senior colleagues. Methodology: Mixed-methods data collection was used. A previously piloted questionnaire with 12 five-point Likert-scale questions and 3 open questions was given to all final-year medical students. A subgroup also participated in 10 focus groups, deploying strictly structured audio-recorded questions. Selection was by convenience sampling; data were gathered by open-text questions, and comments were transcribed verbatim. In vivo coding (iterative, towards data saturation) preceded thematic analysis. Results: 233 of 255 students (91%) completed the survey. 1. There were clearly contradictory viewpoints on SBAR usage. A recurrent theme was a desire for formal feedback and a relative lack of practice/experience with SBAR. 2. Students reported SBAR as having variable interpretation between individuals, limiting its use as a shared mental model. 3. Brief training sessions are insufficient to embed the tool. 4. Most students reported that SBAR helps effective communication, especially by providing structure in stressful situations. 5. Only 18.5% of students felt an alternative resource might be needed. Sub-analysis of the themes highlighted: A. 
Lack of clarity regarding what information to include and where to place it within the acronym; B. Negative responses from senior colleagues to SBAR; C. Lack of conciseness with the tool. Discussion and Conclusions: Despite a wide range of contradictory interpretations of SBAR utility, most students wish to retain the resource. More practice opportunities and feedback may enhance user confidence and understanding. References: (1) Leonard M, Graham S, Bonacum D. The human factor: the critical importance of effective teamwork and communication in providing safe care. Quality & Safety in Health Care 2004 Oct;13(Suppl 1):85-90. (2) d'Agincourt-Canning LG, Kissoon N, Singal M, Pitfield AF. Culture, communication and safety: lessons from the airline industry. Indian J Pediatr 2011 Jun;78(6):703-708. (3) Dunsford J. Structured communication: improving patient safety with SBAR. Nurs Womens Health 2009 Oct;13(5):384-390. (4) Compton J, Copeland K, Flanders S, Cassity C, Spetman M, Xiao Y, et al. Implementing SBAR across a large multihospital health system. Jt Comm J Qual Patient Saf 2012 Jun;38(6):261-268. (5) Ludikhuize J, de Jonge E, Goossens A. Measuring adherence among nurses one year after training in applying the Modified Early Warning Score and Situation-Background-Assessment-Recommendation instruments. Resuscitation 2011 Nov;82(11):1428-1433. (6) Cunningham NJ, Weiland TJ, van Dijk J, Paddle P, Shilkofski N, Cunningham NY. Telephone referrals by junior doctors: a randomised controlled trial assessing the impact of SBAR in a simulated setting. Postgrad Med J 2012 Nov;88(1045):619-626.
Abstract:
This report provides a descriptive summary of tools to measure dietary intake and dietary behaviours - exploring the application, reliability and validity of the various tools.
Abstract:
The Physical Internet (PI) is an initiative that identifies several symptoms of inefficiency and unsustainability in logistics systems and addresses them by proposing a new paradigm called hyperconnected logistics. Much as the Digital Internet links thousands of personal and local computer networks, the PI will interconnect today's fragmented logistics systems. Its main goal is to improve the economic, environmental and social performance of logistics systems. Focusing specifically on distribution systems, this thesis questions the order of magnitude of the performance gains achievable by exploiting PI-enabled hyperconnected distribution. It also addresses the characterisation of hyperconnected distribution planning. To answer the first question, an exploratory research approach based on optimization modelling is applied, in which current and prospective distribution systems are modelled. A set of realistic business samples is then created, and their economic and environmental performance is evaluated while targeting multiple levels of social performance. A conceptual planning framework, including mathematical modelling, is proposed to support decision-making in hyperconnected distribution systems. Based on the results of our study, we demonstrate that a substantial gain can be obtained by migrating to hyperconnected distribution. We also show that the magnitude of the gain varies with the characteristics of the business activities and the targeted social performance. Since the Physical Internet is a new topic, Chapter 1 briefly introduces the PI and hyperconnectivity. Chapter 2 discusses the rationale, objective and methodology of the research. The challenges met during this research are described and the type of contributions sought is highlighted. 
Chapter 3 presents the optimization models. Informed by the characteristics of current and prospective distribution systems, three distribution-system-based models are developed. Chapter 4 deals with the characterisation of the business samples as well as the modelling and calibration of the parameters used in the models. The results of the exploratory research are presented in Chapter 5. Chapter 6 describes the conceptual planning framework for hyperconnected distribution. Chapter 7 summarises the thesis and highlights its main contributions. It also identifies the limitations of the research and potential avenues for future work.
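The comparison the thesis draws between dedicated (current) and hyperconnected (PI-enabled) distribution can be illustrated with a toy routing model. The sketch below is purely illustrative and is not the thesis's actual optimization models: the network, cost figures and function names are invented. Each shipment is served either directly or through the cheapest open hub, so opening shared hubs can only lower total cost:

```python
def cheapest_route_cost(origin, dest, direct_cost, hubs, leg_cost):
    """Cost of serving one shipment: direct, or via the best open hub."""
    best = direct_cost[(origin, dest)]
    for hub in hubs:
        via = leg_cost[(origin, hub)] + leg_cost[(hub, dest)]
        best = min(best, via)
    return best

def network_cost(shipments, direct_cost, hubs, leg_cost):
    """Total cost when every shipment takes its cheapest available route."""
    return sum(cheapest_route_cost(o, d, direct_cost, hubs, leg_cost)
               for o, d in shipments)
```

With two factories sharing one hypothetical hub, each shipment takes whichever of the two routes is cheaper, which mimics, in miniature, the consolidation effect the thesis quantifies at scale.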
Abstract:
The procedures applied before slaughter directly influence meat quality by modulating the physiological state of the pigs; increased body temperature, high blood lactate levels and depletion of glycogen reserves, among other factors, account for most quality losses. The objective of this thesis was to validate stress-indicator tools for pigs for use on farms and in slaughterhouses. These would be applied to animal welfare monitoring and to predicting variation in pork quality at the commercial level. First, the results of the thesis showed that one of the tools developed (a portable lactate analyser) measures the variation in blood lactate level associated with the physiological state of pigs in the peri-mortem phase and helps explain the variation in pork quality at the slaughterhouse, particularly in the ham muscles. Second, the results of animal welfare audits applied from farm to slaughterhouse showed that the quality of the rearing system on the farm of origin and the skills of the truck driver are important criteria affecting the behavioural response of pigs to pre-slaughter handling. These results also showed that housing conditions on the farm (low density and pen enrichment), pig behaviour during the pre-slaughter period (slipping), and handler interventions (use of the electric prod) in the stunning area of the slaughterhouse negatively affect the variation in meat quality. 
The application of audit protocols in the pork chain also showed that compliance with the animal welfare criteria set by a verification tool is essential: it makes it possible to control pig welfare conditions at each stage of the pre-slaughter period, to produce higher-quality meat and to reduce losses. Animal welfare audits are therefore a tool that yields highly relevant results for helping to avoid variation in pork quality. Third, infrared thermography proved to be a promising technique for assessing variation in the animal's body temperature during and after physical stress, particularly when the measurement is taken behind the ears. In conclusion, the tools validated in this thesis are non-invasive methodologies, potentially complementary to other approaches for assessing physiological state and animal welfare under stress, that can help reduce meat quality losses (for example, when used jointly with blood lactate level and behavioural stress indicators, among others).
Abstract:
One of the biggest challenges that contaminant hydrogeology is facing is how to adequately address the uncertainty associated with model predictions. Uncertainty arises from multiple sources, such as interpretive error, calibration accuracy, parameter sensitivity and variability. This critical issue needs to be properly addressed in order to support environmental decision-making processes. In this study, we perform Global Sensitivity Analysis (GSA) on a contaminant transport model for the assessment of hydrocarbon concentration in groundwater. We quantify the environmental impact and, given the incomplete knowledge of hydrogeological parameters, evaluate which are the most influential and therefore require greater accuracy in the calibration process. Parameters are treated as random variables and a variance-based GSA is performed in an optimized numerical Monte Carlo framework. Sobol indices are adopted as sensitivity measures; they are computed by employing meta-models to characterize the migration process while reducing the computational cost of the analysis. The proposed methodology allows us to extend the number of Monte Carlo iterations, identify the influence of uncertain parameters, and achieve considerable savings in computational time while retaining acceptable accuracy.
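As a rough illustration of the variance-based GSA described above (not the authors' meta-model implementation), first-order Sobol indices can be estimated with a plain Monte Carlo pick-freeze scheme. The test function and sample size below are hypothetical:

```python
import random

def sobol_first_order(f, dim, n=20000, seed=1):
    """First-order Sobol indices via the Saltelli pick-freeze estimator.
    f takes a list of `dim` inputs in [0, 1); returns one index per input."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(x) for x in A]
    fB = [f(x) for x in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n   # total output variance
    indices = []
    for i in range(dim):
        # A_B^(i): rows of A with column i swapped in from B
        fABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        num = sum(yb * (yabi - ya)
                  for ya, yb, yabi in zip(fA, fB, fABi)) / n
        indices.append(num / var)
    return indices
```

For a linear model such as f(x) = 3·x1 + x2 with independent uniform inputs, the indices converge to the analytical shares of variance (0.9 and 0.1); the meta-modelling step in the study serves to make each evaluation of f cheap enough for large n.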
Abstract:
This paper reviews objective assessments of Parkinson's disease (PD) motor symptoms, both cardinal symptoms and dyskinesia, using sensor systems. It surveys the manifestation of PD symptoms, the sensors used for their detection, the types of signals (measures), and the signal processing (data analysis) methods applied to them. A summary of the review's findings is presented in a table listing the devices (sensors), measures and methods used in each reviewed motor symptom assessment study. Among the sensors in the gathered studies, accelerometers and touch-screen devices are the most widely used to detect PD symptoms, and among the symptoms, bradykinesia and tremor were the most frequently evaluated. In general, machine learning methods appear promising for this task. PD is a complex disease that requires continuous monitoring and multidimensional symptom analysis. Combining existing technologies into new sensor platforms may help assess the overall symptom profile more accurately and yield useful tools for supporting better treatment.
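A minimal sketch of the kind of signal processing such studies apply to accelerometer data: flag a window as tremor-like when most of its spectral power falls near the typical 4-6 Hz rest-tremor band. The band limits and threshold below are illustrative assumptions, not values taken from any reviewed study:

```python
import math, cmath

def band_power_ratio(signal, fs, low, high):
    """Fraction of total spectral power in [low, high] Hz (naive DFT)."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]            # remove the DC / gravity offset
    band = total = 0.0
    for k in range(1, n // 2):                # positive frequencies only
        coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power = abs(coeff) ** 2
        total += power
        if low <= k * fs / n <= high:
            band += power
    return band / total if total else 0.0

def looks_like_tremor(window, fs, threshold=0.5):
    """Flag an accelerometer window when most power sits near 4-6 Hz."""
    return band_power_ratio(window, fs, 3.5, 7.5) > threshold
```

Real systems would use an FFT and a trained classifier over many such features rather than a single hand-set threshold, which is where the machine learning methods surveyed in the paper come in.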
Abstract:
Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can reach the order of thousands of variability points, along with the dependency relationships that exist among them. These have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped in understanding their major limitations. Based on the findings, a novel approach to managing variability was created that employed two main principles for supporting scalability. First, the separation-of-concerns principle was applied by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualise models (as opposed to the Euclidean-space trees traditionally used). The result was an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on various established usability assessment best practices and standards. 
The framework was then used with several case studies to benchmark the performance of this work against other existing tools.
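The hyperbolic-tree idea can be sketched as follows: mapping each node's hyperbolic distance from the focus onto the unit (Poincaré) disk with tanh keeps every node inside the disk, so arbitrarily deep models remain visible while detail concentrates near the focus. This is a minimal layout sketch under that assumption, not the tool's actual implementation:

```python
import math

def poincare_radius(hyperbolic_distance):
    """Map a hyperbolic distance from the focus onto the unit disk (< 1)."""
    return math.tanh(hyperbolic_distance / 2.0)

def layout(tree, depth_step=1.0):
    """Assign (x, y) disk positions: angle from the sibling wedge, radius
    from depth.  `tree` maps each node to its list of children; the first
    key is taken as the root."""
    positions = {}
    def place(node, depth, a0, a1):
        r = poincare_radius(depth * depth_step)
        mid = (a0 + a1) / 2.0
        positions[node] = (r * math.cos(mid), r * math.sin(mid))
        children = tree.get(node, [])
        for i, child in enumerate(children):
            width = (a1 - a0) / len(children)
            place(child, depth + 1, a0 + i * width, a0 + (i + 1) * width)
    place(next(iter(tree)), 0, 0.0, 2.0 * math.pi)
    return positions
```

Because the radius saturates below 1, a model with thousands of variability points still fits on screen; refocusing on another node simply re-runs the layout with that node at depth 0.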
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Perturbation of natural ecosystems, namely by increasing freshwater use and its degradative use, as well as by topsoil erosion by water in land-use production systems, has been emerging as a topic of high environmental concern. Freshwater use has become a focus of attention in the last few years for all stakeholders involved in the production of goods, mainly agro-industrial and forest-based products, which are freshwater-intensive and require large inputs of green and blue water. This thesis presents a global review of the available Water Footprint Assessment and Life Cycle Assessment (LCA)-based methods for measuring and assessing the environmental relevance of freshwater resource use from a life cycle perspective. Using some of the available midpoint LCA-based methods, the freshwater use-related impacts of a Portuguese wine (white ‘vinho verde’) were assessed. However, the environmental relevance of green water has been neglected because of the absence of a comprehensive impact assessment method for green water flows. To overcome this constraint, this thesis helps to improve and enhance LCA-based methods by providing a midpoint, spatially explicit Life Cycle Impact Assessment (LCIA) method for assessing impacts on terrestrial green water flow and for addressing reductions in surface blue water production caused by reductions in surface runoff due to land-use production systems. The applicability of the proposed method is illustrated by a case study on Eucalyptus globulus conducted in Portugal, as the growth of short-rotation forestry is largely dependent on local precipitation. Topsoil erosion by water has been characterised as one of the most serious problems affecting rivers. Because of this, this thesis also focuses on the ecosystem impacts caused by suspended solids (SS) from topsoil erosion that reach freshwater systems. 
A framework to model spatially distributed SS delivery to freshwater streams, and a fate-and-effect LCIA method to derive site-specific characterisation factors (CFs) for endpoint damage to aquatic ecosystem diversity, namely to algae, macrophyte, and macroinvertebrate organisms, were developed. The applicability of this framework, combined with the derived site-specific CFs, is shown through a case study on E. globulus stands located in Portugal as an example of a land-use-based system. A spatially explicit LCA was shown to be necessary, since the impacts associated with both green water flows and SS vary greatly as a function of spatial location.
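The endpoint aggregation step, combining a spatially distributed inventory with site-specific CFs, reduces to a location-weighted sum. A minimal sketch (the location names, flow amounts and factor values are hypothetical) might look like:

```python
def endpoint_impact(flows, cfs, default_cf=None):
    """Aggregate a spatially explicit inventory into one endpoint score.

    flows: {location: flow amount, e.g. SS load delivered to streams}
    cfs:   {location: site-specific characterisation factor}
    """
    total = 0.0
    for loc, amount in flows.items():
        if loc in cfs:
            cf = cfs[loc]
        elif default_cf is not None:
            cf = default_cf          # fall back to a generic, non-spatial factor
        else:
            raise KeyError(f"no characterisation factor for {loc}")
        total += cf * amount
    return total
```

The spread between site-specific and generic (default) factors is exactly why the thesis argues a spatially explicit LCA is necessary: with strongly varying CFs, the same total flow can yield very different endpoint scores depending on where it occurs.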
Abstract:
Protective factors are neglected in risk assessment in adult psychiatric and criminal justice populations. This review investigated the predictive efficacy of selected tools that assess protective factors. Five databases were searched using comprehensive terms for records up to June 2014, resulting in 17 studies (n = 2,198). Results were combined in a multilevel meta-analysis using the metafor package (Viechtbauer, Journal of Statistical Software, 2010, 36, 1) for R (R Core Team, R: A Language and Environment for Statistical Computing, Vienna, Austria: R Foundation for Statistical Computing, 2015). Prediction of outcomes was poor relative to a reference category of violent offending, with the exception of prediction of discharge from secure units. There were no significant differences between the predictive efficacy of risk scales, protective scales, and summary judgments. Protective factor assessment may be clinically useful, but more development is required. Claims that use of these tools is therapeutically beneficial require testing.
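For readers unfamiliar with the pooling step: a simpler (non-multilevel) random-effects meta-analysis in the DerSimonian-Laird style, sketched below in Python rather than the R/metafor setup the review actually used, shows how per-study effect sizes and variances combine into a pooled estimate and a between-study variance:

```python
def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes."""
    k = len(effects)
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, tau2, se
```

A multilevel model such as metafor's rma.mv additionally accounts for multiple effect sizes nested within the same study, which matters here because several of the 17 studies contributed more than one outcome.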
Abstract:
BACKGROUND: The identification of patients' health needs is pivotal in optimising the quality of health care, increasing patient satisfaction and directing resource allocation. Health needs are complex and not as easily evaluated as health-related quality of life (HRQL), which is becoming increasingly accepted as a means of providing a more global, patient-orientated assessment of the outcome of health care interventions than the simple medical model. The potential of HRQL as a surrogate measure of healthcare needs has not been evaluated. OBJECTIVES AND METHOD: A generic (Short Form-12; SF-12) and a disease-specific questionnaire (Seattle Angina Questionnaire; SAQ) were tested for their potential to predict health needs in patients with acute coronary disease. A wide range of healthcare needs were determined using a questionnaire specifically developed for this purpose. RESULTS: With the exception of information needs, healthcare needs were highly correlated with health-related quality of life. Patients with limited enjoyment of personal interests, weak financial situation, greater dependency on others to access health services, and dissatisfaction with accommodation reported poorer HRQL (SF-12: p < 0.001; SAQ: p < 0.01). Difficulties with mobility, aids to daily living and activities requiring assistance from someone else were strongly associated with both generic and disease-specific questionnaires (SF-12: r = 0.46-0.55, p < 0.01; SAQ: r = 0.53-0.65, p < 0.001). Variables relating to quality of care and health services were more highly correlated with SAQ components (r = 0.33-0.59) than with SF-12 (r = 0.07-0.33). Overall, the disease-specific Seattle Angina Questionnaire was superior to the generic Short Form-12 in detecting healthcare needs in patients with coronary disease. Receiver operating characteristic curves supported the sensitivity of HRQL tools in detecting health needs. 
CONCLUSION: Healthcare needs are complex, and developing suitable questionnaires to measure them is difficult and time-consuming. Without a satisfactory means of measuring these needs, the extent to which disease impacts on health will continue to be underestimated. Further investigation in larger populations is warranted, but HRQL tools appear to be a reasonable proxy for healthcare needs, as they identify the majority of needs in patients with coronary disease, an observation not previously reported in this patient group.
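The receiver operating characteristic analysis mentioned above reduces, for a single HRQL score, to the Mann-Whitney formulation of the AUC. A minimal sketch, assuming (as an illustration, not a claim about the study's scoring) that lower scores indicate worse HRQL and hence a present need:

```python
def auc(scores_with_need, scores_without_need):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen patient with the need scores lower
    (worse HRQL) than a randomly chosen patient without it."""
    wins = ties = 0
    for a in scores_with_need:
        for b in scores_without_need:
            if a < b:
                wins += 1
            elif a == b:
                ties += 1
    n = len(scores_with_need) * len(scores_without_need)
    return (wins + 0.5 * ties) / n
```

An AUC near 1.0 means the HRQL score separates patients with and without a given need almost perfectly; 0.5 means it carries no information about that need.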
Abstract:
Uncovering the mechanisms of unexplained pathologies and of the body's response to administered medication is the driving force toward personalized medicine. In this post-genomic era, all eyes are on the proteomics field, which searches for answers and explanations by investigating the final functional physiological units: proteins and their proteoforms. The development of cutting-edge mass spectrometric technologies and powerful bioinformatics tools has allowed the life-science community to mine disease-specific proteins as biomarkers, which are often hidden by the high complexity of the samples and/or their low abundance. Nowadays, there are several proteomics-based approaches to studying the proteome. This chapter focuses on gold-standard proteomics strategies, and related issues, for candidate biomarker discovery; such biomarkers may have diagnostic/prognostic as well as mechanistic utility.
Abstract:
Although a clear correlation between levels of fungi in the air and health impacts has not been shown in epidemiological studies, fungi must be regarded as potential occupational health hazards. Fungi can have an impact on human health in four different ways: (1) they can infect humans, (2) they may act as allergens, (3) they can be toxigenic, or (4) they may cause inflammatory reactions. The fungi of concern in occupational hygiene are mostly non-pathogenic or facultatively pathogenic (opportunistic) species, but they are relevant as allergens and mycotoxin producers. It is known that the exclusive use of conventional methods for fungal quantification (fungal culture) may underestimate the results for several reasons. The incubation temperature chosen will not be the most suitable for every fungal species, resulting in the inhibition of some species and the favouring of others. Differences in fungal growth rates may also lead to underestimation, since fungal species with higher growth rates may inhibit the growth of other species. Finally, underestimation can result from the collection of non-viable fungal particles, or from fungal species that do not grow in the culture media used even though they may be clinically relevant in the context. Owing to these constraints, occupational exposure assessment in settings with high fungal contamination levels should follow these steps: apply conventional methods to obtain fungal load information (air and surfaces) for the most critical scenario previously selected; compare results against guidelines, applying either legal requirements or limits suggested by scientific and/or technical organizations, and also compare the results with others from the same type of setting (if any exist); and select the most suitable indicators for each setting, applying conventional culture methods as well as molecular tools. 
This methodology will ensure a more realistic characterization of the fungal burden in each setting and, consequently, make it possible to identify further measures regarding the assessment of fungal metabolites, as well as more adequate worker health surveillance. The methodology applied to characterize the fungal burden in several occupational environments, focusing on the prevalence of Aspergillus spp., will be presented and discussed.
Abstract:
It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health has been reflected in the increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity and not taking into account the complex dynamics associated with interactions between different mycotoxins or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards generating knowledge to support a more accurate human risk assessment and risk management. For this purpose, a physiologically based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment. 
On the other hand, the application of predictive mathematical models to estimate the combined effects of mycotoxins from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project aimed at exploring the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial for improving data quality and will contribute to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
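Under the additivity assumption the abstract describes, combined exposure is commonly screened with a hazard index, and left-censored measurements are often handled by LOD/2 substitution before probabilistic modelling. The sketch below is illustrative only: the toxin names, exposure values and tolerable daily intakes (TDIs) are invented, not real regulatory figures:

```python
def substitute_censored(measurements, lod, factor=0.5):
    """Replace left-censored readings (below the LOD, coded as None) by
    factor * LOD -- the simple substitution approach; probabilistic
    alternatives model the censored tail instead."""
    return [m if m is not None else factor * lod for m in measurements]

def hazard_index(exposures, tdis):
    """Concentration-addition screen: HI = sum of exposure_i / TDI_i.
    HI > 1 flags a potential concern under the additivity assumption."""
    return sum(exposures[t] / tdis[t] for t in exposures)
```

This additive screen is precisely the "conventional paradigm" the presentation critiques: it cannot represent synergistic or antagonistic interactions between mycotoxins, which is why interaction-aware and physiologically based approaches are proposed.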