861 results for Human Factors Methods.
Abstract:
Background: The neuropeptide secretoneurin, with potential relevance to leukocyte trafficking, is present in nerves of the nasal mucosa in allergic rhinitis and may be released in response to allergen and histamine exposure. There is no information on the occurrence and mechanisms of release of secretoneurin in healthy human airways. Methods: The presence of secretoneurin in nasal biopsies and its release in response to nasal capsaicin and histamine challenges were examined. Symptoms and lavage fluid levels of fucose were recorded as markers of effects in part produced by neural activity. Bronchial histamine challenges followed by sputum induction and analysis of secretoneurin were also carried out. Results: Nerves displaying secretoneurin immunoreactivity abounded in the nasal mucosa. Nasal capsaicin challenge produced local pain (P < 0.05) and increased the levels of fucose (P < 0.05), but failed to affect the levels of secretoneurin. Nasal histamine challenge produced symptoms (P < 0.05) and increased the mucosal output of secretoneurin (P < 0.05) and fucose (P < 0.05). Bronchial histamine challenge increased the sputum levels of secretoneurin (P < 0.05). Conclusions: We conclude that secretoneurin is present in healthy human airways and that histamine evokes its release in both nasal and bronchial mucosae. The present observations support the possibility that secretoneurin is involved in histamine-dependent responses of the human airway mucosa.
Abstract:
Background: Changes in brain gene expression are thought to be responsible for the tolerance, dependence, and neurotoxicity produced by chronic alcohol abuse, but there has been no large-scale study of gene expression in human alcoholism. Methods: RNA was extracted from postmortem samples of superior frontal cortex of alcoholics and nonalcoholics. Relative levels of RNA were determined by array techniques. We used both cDNA and oligonucleotide microarrays to provide coverage of a large number of genes and to allow cross-validation for those genes represented on both types of arrays. Results: Expression levels were determined for over 4000 genes, and 163 of these were found to differ by 40% or more between alcoholics and nonalcoholics. Analysis of these changes revealed a selective reprogramming of gene expression in this brain region, particularly for myelin-related genes, which were downregulated in the alcoholic samples. In addition, cell cycle genes and several neuronal genes were changed in expression. Conclusions: These gene expression changes suggest a mechanism for the loss of cerebral white matter in alcoholics, as well as alterations that may lead to the neurotoxic actions of ethanol.
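A minimal, hypothetical sketch of the 40% expression-difference criterion described above is shown below. The data, group sizes, and the reading of "differ by 40% or more" as a fold change of at least 1.4 in either direction are assumptions; the abstract does not specify the study's normalization or statistics.

```python
import numpy as np
import pandas as pd

# Hypothetical expression matrix: rows = genes, columns = samples.
# Values and group sizes are illustrative, not the study's data.
rng = np.random.default_rng(0)
genes = [f"gene_{i}" for i in range(4000)]
alcoholic = pd.DataFrame(rng.lognormal(5, 0.5, (4000, 10)), index=genes)
control = pd.DataFrame(rng.lognormal(5, 0.5, (4000, 10)), index=genes)

# Mean expression per group and fold change (alcoholic / control).
fold_change = alcoholic.mean(axis=1) / control.mean(axis=1)

# "Differ by 40% or more" interpreted here as a fold change >= 1.4
# in either direction; the abstract does not state the exact rule.
changed = fold_change[(fold_change >= 1.4) | (fold_change <= 1 / 1.4)]
print(f"{len(changed)} genes differ by >= 40% between groups")
```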
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system and identify the reasons for the differences between theoretical expectations and operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices: Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis: theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempt to place the differences between theory and practice in a system in quantitative perspective, nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention in an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
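As a rough illustration of the two control indices named in this abstract, the sketch below simulates a simple reorder-point stock policy and reports a Service Level and an Average Stock Value. The demand model, policy, costs, and parameters are invented for the example; the thesis's actual system and the Buyers' interventions are not described in the abstract.

```python
import random

def simulate_stock(days=365, reorder_point=40, order_qty=100,
                   lead_time=5, unit_cost=2.5, mean_demand=12, seed=1):
    """Toy reorder-point simulation returning the two control indices."""
    random.seed(seed)
    stock, on_order = 80, []          # on_order: list of (arrival_day, qty)
    demand_total = demand_met = 0
    stock_value_sum = 0.0
    for day in range(days):
        # Receive any replenishment orders arriving today.
        stock += sum(q for (d, q) in on_order if d == day)
        on_order = [(d, q) for (d, q) in on_order if d != day]
        # Satisfy today's demand (drawn uniformly around the mean) from stock.
        demand = random.randint(0, 2 * mean_demand)
        met = min(demand, stock)
        stock -= met
        demand_total += demand
        demand_met += met
        # Reorder when the inventory position falls to the reorder point.
        position = stock + sum(q for (_, q) in on_order)
        if position <= reorder_point:
            on_order.append((day + lead_time, order_qty))
        stock_value_sum += stock * unit_cost
    service_level = demand_met / demand_total
    average_stock_value = stock_value_sum / days
    return service_level, average_stock_value

sl, asv = simulate_stock()
print(f"Service Level: {sl:.1%}, Average Stock Value: {asv:.2f}")
```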
Abstract:
Safety enforcement practitioners within Europe, and the marketers, designers and manufacturers of consumer products, need to determine compliance with the legal test of "reasonable safety" for consumer goods so as to reduce the risks of injury to a minimum. To enable free movement of products, a safety appraisal method is required that can be used as an "expert" system of hazard analysis by non-experts in the safety testing of consumer goods, and that can be implemented consistently throughout Europe. Safety testing approaches and the concepts of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety which seeks to integrate the human factors contributions of risk assessment, hazard perception, and information processing. The model provides a system of hazard identification, hazard analysis and risk assessment which can be applied to a wide range of consumer products through a series of systematic checklists and matrices, and applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. It is then applied in its pilot form by selected "volunteer" Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select the participating Trading Standards Departments, to explore the contribution of potential subjective influences, and to establish views on the usability and reliability of the model and any preferences among the risk assessment scoring systems used. The outcomes of the two-stage hazard analysis and risk assessment process are examined to determine the consistency of the hazard analysis results and of the final decisions regarding the safety of the sample products, and to identify any correlation between the decisions made using the model and those made with alternative scoring methods of risk assessment. The research also identifies a number of opportunities for future work and indicates several areas where further work has already begun.
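The numerical scoring element of such a model might look like the hypothetical sketch below, which combines checklist-style hazard ratings into a single product risk score. The hazard categories, rating scales, scoring rule and interpretation bands are assumptions made for illustration only; the paper's actual checklists and matrices are not reproduced in the abstract.

```python
# Hypothetical hazard-analysis scoring: each identified hazard is rated
# for severity of injury and likelihood of occurrence on 1-5 scales.
hazards = [
    {"hazard": "sharp edge",            "severity": 3, "likelihood": 4},
    {"hazard": "small detachable part", "severity": 5, "likelihood": 2},
    {"hazard": "unstable base",         "severity": 2, "likelihood": 3},
]

def product_risk_score(hazards):
    """Sum of severity x likelihood over all identified hazards (assumed rule)."""
    return sum(h["severity"] * h["likelihood"] for h in hazards)

score = product_risk_score(hazards)
# Illustrative bands for interpreting the total score.
band = "low" if score < 20 else "medium" if score < 40 else "high"
print(f"Risk score: {score} ({band})")
```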
Abstract:
A re-examination of fundamental concepts and a formal structuring of the waveform analysis problem are presented in Part I: for example, the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence for its advantages relative to non-adaptive, fixed-task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of the percentage increment or decrement in performance relative to performance without vibration, over the range 0-0.6 RMS 'g'. Primary task performance was found to vary by as much as 90% between tasks at the same RMS 'g'; differences in task difficulty accounted for this variation, while within tasks the difficulty added by vibration accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except those which involved fine manual adjustment of minor controls. Three experiments are then reported in which an adaptive technique was used to measure the percentage of task difficulty added by vertical random and sinusoidal vibration to a 'Critical Compensatory Tracking' task. At vibration intensities between 0 and 0.09 RMS 'g', random vibration was found to add (24.5 x RMS 'g')/7.4 x 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty. Waveform analyses applied to the experimental data served to validate phase-plane analysis and uncovered the development of a control strategy and possibly a vibration isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.
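Taking the quoted relationship at face value, the added task difficulty for a given vibration intensity can be computed as in the sketch below. The grouping of the expression as (24.5 x RMS 'g' / 7.4), expressed as a percentage, is one reading of the abstract and should be checked against the thesis itself.

```python
def added_difficulty_percent(rms_g):
    """Percentage of task difficulty added by random vertical vibration,
    per the relationship quoted in the abstract (stated for 0-0.09 RMS 'g')."""
    return (24.5 * rms_g) / 7.4 * 100

for g in (0.03, 0.06, 0.09):
    print(f"{g:.2f} RMS 'g' -> +{added_difficulty_percent(g):.1f}% difficulty")
```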
Abstract:
The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to an inadequate consideration shown to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement in a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation; instead, concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s). In response to this identified problem, a set of methods was developed that aimed to encourage developers to consider user interface requirements early in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice, from constraints within the commercial and industrial development environments, and from the state of existing human factors support.
Abstract:
The performance of direct workers has a significant impact on the competitiveness of many manufacturing systems. Unfortunately, system designers are ill-equipped to assess this impact during the design process. An opportunity exists to assist designers by expanding the capabilities of popular simulation modelling tools and using them as a vehicle to better consider human factors during the process of manufacturing system design. To support this requirement, this paper reports on an extensive review of the literature, from which a theoretical framework is developed that summarizes the principal factors and relationships such a modelling tool should incorporate.
Abstract:
INTRODUCTION: Zero-G parabolic flight reproduces the weightlessness of space for short periods of time; however, motion sickness may affect some fliers. The aim was to assess the extent of this problem and to identify possible predictors and modifying factors. METHODS: Airbus Zero-G flights consist of 31 parabolas performed in blocks. Each parabola consisted of 20 s of 0 g sandwiched between 20 s periods of hypergravity at 1.5-1.8 g. The survey covered n = 246 person-flights (193 males, 53 females), aged (M ± SD) 36.0 ± 11.3 years. An anonymous questionnaire included a motion sickness rating (1 = OK to 6 = vomiting), the Motion Sickness Susceptibility Questionnaire (MSSQ), anti-motion sickness medication, prior Zero-G experience, anxiety level, and other characteristics. RESULTS: Participants had lower MSSQ percentile scores (27.4 ± 28.0) than the population norm of 50. Motion sickness was experienced by 33% of fliers, and 12% vomited. Less motion sickness was predicted by older age, greater prior Zero-G flight experience, medication with scopolamine, and lower MSSQ scores, but not by gender or anxiety. Sickness ratings in fliers pre-treated with scopolamine (1.81 ± 1.58) were lower than in non-medicated fliers (2.93 ± 2.16), and the incidence of vomiting in fliers using scopolamine treatment was reduced by half to a third. Possible confounding factors, including age, sex, flight experience, and MSSQ, could not account for this. CONCLUSION: Motion sickness affected one third of Zero-G fliers, despite their being intrinsically less susceptible to motion sickness than the general population; susceptible individuals probably try to avoid such a provocative environment. Risk factors for motion sickness included younger age and higher MSSQ scores. Protective factors included prior Zero-G flight experience (habituation) and anti-motion sickness medication.
Abstract:
Dengue fever is one of the most important mosquito-borne diseases worldwide and is caused by infection with dengue virus (DENV). The disease is endemic in tropical and subtropical regions and has increased remarkably in the last few decades. At present, there is no antiviral or approved vaccine against the virus. Treatment of dengue patients is usually supportive, through oral or intravenous rehydration, or by blood transfusion for more severe dengue cases. Infection with DENV in humans and mosquitoes involves a complex interplay between the virus and host factors, which results in the regulation of numerous intracellular processes, such as signal transduction and gene transcription, and leads to progression of disease. To understand the mechanisms underlying the disease, the study of virus and host factors is therefore essential and could lead to the identification of human proteins modulating an essential step in the virus life cycle. Knowledge of these human proteins could lead to the discovery of potential new drug targets and disease control strategies in the future. Recent advances in high-throughput screening technologies have provided researchers with molecular tools to carry out investigations on a large scale. Several studies have focused on determining the host factors involved during DENV infection in human and mosquito cells. For instance, a genome-wide RNA interference (RNAi) screen has identified host factors that potentially play an important role in both DENV and West Nile virus replication (Krishnan et al. 2008). In the present study, a high-throughput yeast two-hybrid screen was used to identify human factors interacting with DENV non-structural proteins. From the screen, 94 potential human interactors were identified. These include proteins involved in immune signalling regulation, potassium voltage-gated channels, transcriptional regulators, protein transporters and endoplasmic reticulum-associated proteins. Validation of fifteen of these human interactions revealed that twelve of them interacted strongly with DENV proteins. Two proteins of particular interest were selected for further investigation of their functional biological roles at the molecular level. These proteins, the nuclear-associated protein BANP and the voltage-gated potassium channel Kv1.3, were both identified through interaction with DENV NS2A. BANP is known to be involved in the NF-kB immune signalling pathway, whereas Kv1.3 is known to play an important role in regulating the passive flow of potassium ions upon changes in the cell transmembrane potential. This study also initiated the construction of an Aedes aegypti cDNA library for use with DENV proteins in the Y2H screen; however, several issues encountered during the study made the library unsuitable for protein interaction analysis. In parallel, innate immune signalling was also optimised for downstream analysis. Overall, the work presented in this thesis, in particular the Y2H screen, provides a number of human factors potentially targeted by DENV during infection. Nonetheless, more work is required to validate these proteins and determine their functional properties, as well as to test them with infectious DENV to establish biological significance. In the long term, data from this study will be useful for investigating potential human factors for the development of antiviral strategies against dengue.
Abstract:
Introduction: During pregnancy, the increased metabolic demand of the placenta leads to greater production of reactive oxygen species (ROS), which can cause, for example, oxidation of polyunsaturated fatty acids in the placenta; in addition, expression of aromatase and of the estrogen-related receptor gamma (ERRgamma) increases in the human placenta during this period. Objective: The aim of the study was to evaluate oxidative stress parameters and the immunostaining of aromatase and ERRgamma in the human placenta. Methods: Total antioxidant capacity (ACAP), glutamate cysteine ligase (GCL) activity, glutathione (GSH) concentration, lipid peroxidation, and immunostaining of aromatase and ERRgamma were analysed in placental tissue from 58 parturients. These analyses were related to the participants' sociodemographic data. Results: Newborns of smoking mothers had lower birth weight (p=0.001). GSH concentration reduced lipid peroxidation (p<0.05), whereas GCL activity had the opposite effect (p<0.001). We found a decrease in total antioxidant capacity and an increase in lipid peroxidation (p<0.05) in the placenta. Placentas of smoking mothers showed less aromatase staining (p=0.037), while older mothers showed less placental ERRgamma staining (p=0.009). GSH had a positive effect on ERRgamma immunostaining (p=0.001). Conclusions: Placental expression of aromatase and ERRgamma is altered both by exogenous factors, such as cigarette smoking, and by endogenous factors, such as GSH concentration and maternal age. Placental oxidative stress markers are higher in older mothers and in placentas with lower total antioxidant capacity.
Abstract:
Background: In occupational life, a mismatch between high expenditure of effort and few rewards received may promote the co-occurrence of lifestyle risk factors; however, there is insufficient evidence to support or refute this hypothesis. The aim of this study is to examine the extent to which the dimensions of the Effort-Reward Imbalance (ERI) model – effort, rewards and ERI – are associated with the co-occurrence of lifestyle risk factors. Methods: Based on data from the Finnish Public Sector Study, cross-sectional analyses were performed for 28,894 women and 7,233 men. ERI was conceptualized as the ratio of effort to rewards. To control for individual differences in response styles, such as a personal disposition to answer questionnaires negatively, occupational- and organizational-level ecological ERI scores were constructed in addition to individual-level ERI scores. Risk factors included current smoking, heavy drinking, body mass index ≥25 kg/m2, and physical inactivity. Multinomial logistic regression models were used to estimate the likelihood of having one risk factor, two risk factors, and three or four risk factors. The associations between ERI and single risk factors were explored using binary logistic regression models. Results: After adjustment for age, socioeconomic position, marital status, and type of job contract, women and men with high ecological ERI were 40% more likely to have ≥3 lifestyle risk factors simultaneously (vs. 0 risk factors) compared with their counterparts with low ERI. When examined separately, both low ecological effort and low ecological rewards were also associated with an elevated prevalence of risk factor co-occurrence. The results obtained with the individual-level scores pointed in the same direction. The associations of ecological ERI with single risk factors were generally less marked than the associations with the co-occurrence of risk factors. Conclusion: This study suggests that a high ratio of occupational effort relative to rewards may be associated with an elevated risk of having multiple lifestyle risk factors. However, the unexpected association between low effort and a higher likelihood of risk factor co-occurrence, as well as the absence of data on overcommitment (and thereby the lack of a full test of the ERI model), warrant caution regarding the extent to which the entire ERI model is supported by our evidence.
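A minimal sketch of the exposure measure and model family described in this abstract is given below, using synthetic data. The variable coding, the outcome categories, and the use of scikit-learn are assumptions for illustration; the study's actual covariate adjustment and ecological score construction are more involved than shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000

# Effort-Reward Imbalance as a ratio of effort to rewards (higher = worse).
effort = rng.uniform(1, 5, n)
rewards = rng.uniform(1, 5, n)
eri = effort / rewards

# Synthetic count of lifestyle risk factors (0-4), loosely increasing with ERI.
risk_count = np.clip(rng.poisson(0.6 + 0.4 * eri), 0, 4)
# Outcome categories as in the abstract: 0, 1, 2, and 3-4 risk factors.
outcome = np.minimum(risk_count, 3)

# With the default lbfgs solver this fits a multinomial (softmax) logistic
# regression of the outcome category on ERI.
model = LogisticRegression(max_iter=1000)
model.fit(eri.reshape(-1, 1), outcome)
print("Per-category ERI coefficients:", model.coef_.ravel().round(2))
```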
Abstract:
Aim: A retrospective clinical audit was carried out on records of endodontic treatment performed by dental undergraduates, to evaluate the technical quality of the root canal fillings and to determine the associated factors. Methods: 140 records of patients who had received root canal treatment by dental undergraduates were evaluated through periapical radiographs by two examiners (κ = 0.74). The quality of the root canal fillings was evaluated according to extent, condensation and the presence of procedural mishaps. Possible factors associated with technical quality, such as tooth type, canal curvature, student level and quality of record keeping, were evaluated. Data were statistically analyzed using the chi-square test (p < 0.05). Results: Among the 140 root-filled teeth, acceptable extent, acceptable condensation and absence of mishaps were observed in 72.1%, 66.4% and 77.9% of cases, respectively. Overall, the technical quality of 68 (48.6%) root-filled teeth was considered acceptable. Non-acceptable root canal fillings were significantly more likely to be observed in molars (69.2%), in moderately and severely curved canals (71.4%), and in teeth treated by junior students (61.5%). There was no association between acceptable root canal fillings and quality of record keeping. Conclusions: The technical quality of root canal fillings was acceptable in 48.6% of cases and was associated with tooth type, degree of canal curvature and student seniority.
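The association test reported here (chi-square at p < 0.05) can be illustrated with the small sketch below on a made-up contingency table of tooth type versus filling quality; the counts are invented for the example and are not the audit's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = tooth type, columns = filling quality.
#                  acceptable  non-acceptable
table = np.array([[30, 10],   # anterior
                  [25, 15],   # premolar
                  [13, 47]])  # molar

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Tooth type is associated with filling quality at the 5% level")
```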
Abstract:
There has recently been a great deal of interest in the potential of computer games to function as innovative educational tools. However, there is very little evidence of games fulfilling that potential. Indeed, the process of merging the disparate goals of education and games design appears problematic, and there are currently no practical guidelines for how to do so in a coherent manner. In this paper, we describe the successful, empirically validated teaching methods developed by behavioural psychologists and point out how they are uniquely suited to take advantage of the benefits that games offer to education. We conclude by proposing some practical steps for designing educational games, based on the techniques of Applied Behaviour Analysis. It is intended that this paper will both focus educational game designers on the features of games that are genuinely useful for education and introduce a successful form of teaching with which this audience may not yet be familiar.
Abstract:
This study measured care staff's perception of patient safety culture in a first-level-of-complexity hospital by means of a descriptive cross-sectional study. The Spanish version of the 'Hospital Survey on Patient Safety Culture' (HSOPSC) of the Agency for Healthcare Research and Quality (AHRQ), which evaluates twelve dimensions, was used as the measurement instrument. The results showed strengths such as organizational learning, continuous improvement, and management support for patient safety. The dimensions classified as opportunities for improvement were non-punitive culture, staffing, handoffs and transitions, and the degree to which communication is open. It was concluded that although staff perceived the improvement process and management support positively, they also felt they were judged if they reported an adverse event.