55 results for Teachers evaluation performance
Abstract:
PURPOSE: To compare the diagnostic performance of multi-detector CT arthrography (CTA) and 1.5-T MR arthrography (MRA) in detecting hyaline cartilage lesions of the shoulder, with arthroscopic correlation. PATIENTS AND METHODS: CTA and MRA prospectively obtained in 56 consecutive patients following the same arthrographic procedure were independently evaluated for glenohumeral cartilage lesions (modified Outerbridge grade ≥2 and grade 4) by two musculoskeletal radiologists. The cartilage surface was divided into 18 anatomical areas. Arthroscopy was taken as the reference standard. Diagnostic performance of CTA and MRA was compared using ROC analysis. Interobserver and intraobserver agreement was determined by κ statistics. RESULTS: Sensitivity and specificity of CTA varied from 46.4 to 82.4% and from 89.0 to 95.9%, respectively; sensitivity and specificity of MRA varied from 31.9 to 66.2% and from 91.1 to 97.5%, respectively. Diagnostic performance of CTA was statistically significantly better than that of MRA for both readers (all p ≤ 0.04). Interobserver agreement for the evaluation of cartilage lesions was substantial with CTA (κ = 0.63) and moderate with MRA (κ = 0.54). Intraobserver agreement was almost perfect with both CTA (κ = 0.94-0.95) and MRA (κ = 0.83-0.87). CONCLUSION: The diagnostic performance of CTA and MRA for the detection of glenohumeral cartilage lesions is moderate, although statistically significantly better with CTA. KEY POINTS: • CTA has moderate diagnostic performance for detecting glenohumeral cartilage substance loss. • MRA has moderate diagnostic performance for detecting glenohumeral cartilage substance loss. • CTA is more accurate than MRA for detecting cartilage substance loss.
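The accuracy and agreement statistics reported in this abstract (sensitivity, specificity, Cohen's κ) are standard quantities computable from 2x2 tables. A minimal illustrative sketch, not the study's code, with all counts invented:

```python
# Illustrative only: sensitivity, specificity, and Cohen's kappa
# from 2x2 counts. All numbers here are invented, not study data.

def sensitivity(tp, fn):
    # Proportion of true lesions detected.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Proportion of lesion-free areas correctly called negative.
    return tn / (tn + fp)

def cohens_kappa(a, b, c, d):
    """Chance-corrected agreement between two readers:
    a = both positive, d = both negative, b and c = disagreements."""
    n = a + b + c + d
    po = (a + d) / n  # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

print(sensitivity(80, 20), specificity(90, 10), cohens_kappa(40, 10, 10, 40))
```

On the conventional scale used in the abstract, κ of 0.61-0.80 is "substantial" and 0.81-1.00 "almost perfect".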
Abstract:
Background: Variable definitions of outcome (Constant score, Simple Shoulder Test [SST]) have been used to assess outcome after shoulder treatment, although none has been accepted as the universal standard. Physicians lack an objective method to reliably assess the activity of their patients in dynamic conditions. Our purpose was to clinically validate the shoulder kinematic scores given by a portable movement analysis device, using the activities of daily living described in the SST as a reference. The secondary objective was to determine whether this device could be used to document the effectiveness of shoulder treatments (for glenohumeral osteoarthritis and rotator cuff disease) and detect early failures. Methods: A clinical trial including 34 patients and a control group of 31 subjects over an observation period of 1 year was set up. Evaluations were made at baseline and 3, 6, and 12 months after surgery by 2 independent observers. Miniature sensors (3-dimensional gyroscopes and accelerometers) allowed kinematic scores to be computed. They were compared with the regular outcome scores: SST; Disabilities of the Arm, Shoulder and Hand; American Shoulder and Elbow Surgeons; and Constant. Results: Good to excellent correlations (0.61-0.80) were found between kinematic and clinical scores. Significant differences were found at each follow-up in comparison with the baseline status for all the kinematic scores (P < .015). The kinematic scores were able to point out abnormal patient outcomes at the first postoperative follow-up. Conclusion: Kinematic scores add information to the regular outcome tools. They offer an effective way to measure the functional performance of patients with shoulder pathology and have the potential to detect early treatment failures. Level of evidence: Level II, Development of Diagnostic Criteria, Diagnostic Study. (C) 2011 Journal of Shoulder and Elbow Surgery Board of Trustees.
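The correlations (0.61-0.80) between kinematic and clinical scores reported here are ordinary product-moment statistics. As a hedged sketch (the paired scores below are invented, not the trial's data), Pearson's r can be computed as:

```python
# Illustrative Pearson correlation between a kinematic score and a
# clinical score across patients. The paired values are invented.

def pearson_r(xs, ys):
    # Product-moment correlation of two equal-length score lists.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

kinematic = [55, 62, 70, 74, 81]   # hypothetical device scores
clinical = [48, 60, 66, 75, 78]    # hypothetical Constant scores
print(pearson_r(kinematic, clinical))
```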
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is based entirely on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in the searching of ink specimens in ink databases and the interpretation of their evidential value.
Abstract:
In this paper we propose a highly accurate approximation procedure for ruin probabilities in the classical collective risk model, based on a quadrature/rational approximation procedure proposed in [2]. For a certain class of claim size distributions (which contains the completely monotone distributions) we give a theoretical justification for the method. We also show that under weaker assumptions on the claim size distribution, the method may still perform reasonably well in some cases. This in particular provides an efficient alternative to a related method proposed in [3]. A number of numerical illustrations of the performance of this procedure are provided for both completely monotone and other types of random variables.
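For orientation only (standard textbook background, not this paper's specific construction): in the classical collective risk model the ruin probability, and the Pollaczek-Khinchine series that approximation methods of this kind typically target, can be written as

```latex
% Classical (Cramer-Lundberg) collective risk model: surplus
% U(t) = u + ct - \sum_{i=1}^{N(t)} X_i, with Poisson claim arrivals.
\psi(u) = \Pr\Bigl( \inf_{t \ge 0}\, \Bigl( u + ct - \sum_{i=1}^{N(t)} X_i \Bigr) < 0 \Bigr)
% Pollaczek-Khinchine formula, with \rho = \lambda\mu/c < 1 and
% F_I the integrated-tail (equilibrium) claim size distribution:
1 - \psi(u) = (1 - \rho) \sum_{n=0}^{\infty} \rho^{n}\, F_I^{*n}(u)
```

Here λ is the Poisson claim rate, μ the mean claim size, and c the premium rate; complete monotonicity of the claim size distribution is the classical condition under which such quantities admit well-behaved rational approximations.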
Abstract:
BACKGROUND: Tracheal intubation may be more difficult in morbidly obese (MO) patients than in the non-obese. The aim of this study was to evaluate clinically whether the use of the Video Intubation Unit (VIU), a video-optical intubation stylet, could improve the laryngoscopic view compared with the standard Macintosh laryngoscope in this specific population. METHODS: We studied 40 MO patients (body mass index >35 kg/m(2)) scheduled for bariatric surgery. Each patient had a conventional laryngoscopy and a VIU inspection. The laryngoscopic grades (LG) using the Cormack and Lehane scoring system were noted and compared. Thereafter, the patients were randomised to be intubated with one of the two techniques. In one group, the patients were intubated with the help of the VIU, and in the control group, tracheal intubation was performed conventionally. The duration of intubation, as well as the minimal SpO(2) reached during the procedure, were measured. RESULTS: Patient characteristics were similar in both groups. Seventeen patients had a direct LG of 2 or 3 (no patient had a grade of 4). In all of these 17 patients, the LG improved with the VIU and always attained grade 1 (P<0.0001). The intubation time was shorter in the VIU group, but the difference did not reach statistical significance. There was no difference in post-intubation SpO(2). CONCLUSION: In MO patients, the use of the VIU significantly improves the visualisation of the larynx, thereby improving the intubation conditions.
Abstract:
Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models.
A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
INTRODUCTION: Video records are widely used to analyze performance in alpine skiing at the professional or amateur level. Part of these analyses requires the labeling of certain movements (i.e. determining when specific events occur). Although differences among coaches, and differences for the same coach between different dates, are to be expected, they have never been quantified. Knowing these differences is essential to determine which parameters are reliable enough to be used. This study aimed to quantify the precision and repeatability of alpine skiing coaches of various levels, as has been done in other fields (Koo et al, 2005). METHODS: Software similar to commercialized products was designed to allow video analyses. 15 coaches divided into 3 groups (5 amateur coaches (G1), 5 professional instructors (G2) and 5 semi-professional coaches (G3)) were enrolled. They were asked to label 15 timing parameters (TP) according to the Swiss ski manual (Terribilini et al, 2001) for each curve. TP included phases (initiation, steering I-II) and body and ski movements (e.g. rotation, weighting, extension, balance). Three video sequences sampled at 25 Hz were used and one curve per video was labeled. The first video was used to familiarize the analyzer with the software. The two other videos, corresponding to slalom and giant slalom, were considered for the analysis. G1 performed the analysis twice (A1 and A2) on different dates, and TP were randomized between the two analyses. Reference TP were taken as the median of G2 and G3 at A1. Precision was defined as the RMS difference between individual TP and reference TP, whereas repeatability was calculated as the RMS difference between individual TP at A1 and at A2. RESULTS AND DISCUSSION: For G1, G2 and G3, precisions of +/-5.6, +/-3.0 and +/-2.0 frames, respectively, were obtained. These results, showing that G2 was more precise than G1 and G3 more precise than G2, were in accordance with the group levels.
The repeatability for G1 was +/-3.1 frames. Furthermore, differences in precision among TP were observed for G2 and G3, with the largest difference of +/-5.9 frames for "body counter rotation movement in steering phase II" and the smallest of 0.8 frames for "ski unweighting in initiation phase". CONCLUSION: This study quantified coaches' ability to label videos in terms of precision and repeatability. The best precision was obtained for G3 and was +/-0.08 s, which corresponds to +/-6.5% of the curve cycle. Regarding repeatability, we obtained a result of +/-0.12 s for G1, corresponding to +/-12% of the curve cycle. The repeatability of G2 and G3 is expected to be lower than the precision of G1 and will be assessed soon. In conclusion, our results indicate that the labeling of video records is reliable for some TP, whereas caution is required for others. REFERENCES Koo S, Gold MD, Andriacchi TP. (2005). Osteoarthritis, 13, 782-789. Terribilini M, et al. (2001). Swiss Ski manual, 29-46. IASS, Lucerne.
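The precision and repeatability metrics described above are plain RMS differences over labeled frame indices, converted to seconds via the 25 Hz frame rate. A minimal sketch with invented frame values:

```python
# Illustrative RMS difference between one coach's labeled timing
# parameters and the reference labels, in frames. Values are invented.
import math

FPS = 25  # videos were sampled at 25 Hz

def rms_difference(labels, reference):
    # Root-mean-square difference of two equal-length frame sequences.
    diffs = [(l - r) ** 2 for l, r in zip(labels, reference)]
    return math.sqrt(sum(diffs) / len(diffs))

coach_frames = [10, 25, 42, 61]        # hypothetical labeled TP
reference_frames = [12, 24, 40, 60]    # hypothetical median reference
precision_frames = rms_difference(coach_frames, reference_frames)
print(precision_frames, precision_frames / FPS)  # frames, then seconds
```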
Abstract:
PURPOSE: To evaluate the technical quality and diagnostic performance of a protocol using a low volume of contrast medium (25 mL) at 64-detector spiral computed tomography (CT) in the diagnosis and management of adult, nontraumatic subarachnoid hemorrhage (SAH). MATERIALS AND METHODS: This study was performed outside the United States and was approved by the institutional review board. Intracranial CT angiography was performed in 73 consecutive patients with nontraumatic SAH diagnosed at nonenhanced CT. Image quality was evaluated by two observers using two criteria: degree of arterial enhancement and venous contamination. The two independent readers evaluated diagnostic performance (lesion detection and correct therapeutic decision-making process) by using rotational angiographic findings as the standard of reference. Sensitivity, specificity, and positive and negative predictive values were calculated for patients who underwent CT angiography and three-dimensional rotational angiography. The intraclass correlation coefficient was calculated to assess interobserver concordance concerning aneurysm measurements and therapeutic management. RESULTS: All aneurysms, whether ruptured or unruptured, were detected. Arterial opacification was excellent in 62 cases (85%), and venous contamination was absent or minor in 61 cases (84%). In 95% of cases, CT angiographic findings allowed optimal therapeutic management. The intraclass correlation coefficient ranged between 0.93 and 0.95, indicating excellent interobserver agreement. CONCLUSION: With only 25 mL of iodinated contrast medium focused on the arterial phase, 64-detector CT angiography allowed satisfactory diagnostic and therapeutic management of nontraumatic SAH.
Abstract:
Syncope is a frequent clinical symptom, but its origin remains undetermined in up to 60% of patients admitted to an emergency department. The development of specialised syncope clinics has considerably changed the evaluation of patients with unexplained syncope by directing them towards non-invasive investigation strategies such as the tilt test, carotid sinus massage and the hyperventilation test. However, there are few data in the literature on the actual diagnostic performance of these functional tests. Our research analyses the data of the first 939 patients referred to the CHUV outpatient syncope clinic for the investigation of syncope of undetermined origin. The objectives of this thesis are (1) to evaluate the diagnostic performance of the standardised management algorithm and of the various tests performed in our clinic, and (2) to determine the common clinical characteristics of patients with a final diagnosis of syncope of arrhythmic or vasovagal origin. This thesis demonstrates that a standardised management algorithm based on non-invasive tests can identify the cause of two thirds of syncopes of initially undetermined origin. Furthermore, our work shows that benign aetiologies, such as vasovagal or psychogenic syncope, account for half of the causes of syncope, whereas cardiac arrhythmias remain infrequent. Finally, our work demonstrates that the absence of prodromal symptoms, particularly in elderly patients with a functional limitation or a prolonged P-wave duration on the electrocardiogram, suggests syncope of arrhythmic origin.
This thesis will contribute to optimising our standardised management algorithm for syncope of undetermined origin and opens new research perspectives in the development of models based on clinical factors to predict the main causes of syncope.
Abstract:
[Table of contents] Technology assessment in health care in the United States: an historical review / S. Perry. - The aims and methods of technology assessment / JH Glasser. - Evaluation des technologies de la santé / A. Griffiths. - Les données nécessaires pour l'évaluation des technologies médicales / R. Chrzanowski, F. Gutzwiller, F. Paccaud. - Economic issues in technology assessment / DR Lairson, JM Swint. - Two decades of experience in technology assessment: evaluating the safety, performance, and cost effectiveness of medical equipment / JJ Nobel. - Demography and technology assessment / H. Hansluwka. - Méthodes expérimentale et non expérimentale pour l'évaluation des innovations technologiques / R. Chrzanowski, F. Paccaud. - Skull radiography in head trauma: a successful case of technology assessment / NT Racoveanu. - Complications associées à l'anesthésie: une étude prospective en France / L. Tiret et al. - Impact de l'information publique sur les taux opératoires: le cas de l'hystérectomie / G. Domenighetti, P. Luraschi, A. Casabianca. - The clinical effectiveness of acupuncture for the relief of chronic pain / MS Patel, F. Gutzwiller, F. Paccaud, A. Marazzi. - Soins à domicile et hébergement à long terme: à la recherche d'un développement optimum / G. Tinturier. - Economic evaluation of six scenarios for the treatment of stones in the kidney and ureter by surgery or ESWL / MS Patel et al. - Technology assessment and medical practice / F. Gutzwiller. - Technology assessment and health policy / SJ Reiser. - Global programme on appropriate technology for health, its role and place within WHO / K. Staehr Johansen.
Abstract:
The aim of this study was to develop an ambulatory system for three-dimensional (3D) knee kinematics evaluation, which can be used outside a laboratory during long-term monitoring. In order to show the efficacy of this ambulatory system, knee function was analysed using this system after an anterior cruciate ligament (ACL) lesion and after reconstructive surgery. The proposed system was composed of two 3D gyroscopes, fixed on the shank and on the thigh, and a portable data logger for signal recording. The measured parameters were the 3D mean range of motion (ROM), and the healthy knee was used as control. The precision of this system was first assessed using an ultrasound reference system. The repeatability was also estimated. A clinical study was then performed on five unilateral ACL-deficient men (range: 19-36 years) prior to, and a year after, the surgery. The patients were evaluated with the IKDC score and the kinematics measurements were carried out on a 30 m walking trial. The precision in comparison with the reference system was 4.4 degrees, 2.7 degrees and 4.2 degrees for flexion-extension, internal-external rotation, and abduction-adduction, respectively. The repeatability of the results for the three directions was 0.8 degrees, 0.7 degrees and 1.8 degrees. The averaged ROM of the five patients' healthy knees was 70.1 degrees (standard deviation (SD) 5.8 degrees), 24.0 degrees (SD 3.0 degrees) and 12.0 degrees (SD 6.3 degrees) for flexion-extension, internal-external rotation and abduction-adduction before surgery, and 76.5 degrees (SD 4.1 degrees), 21.7 degrees (SD 4.9 degrees) and 10.2 degrees (SD 4.6 degrees) 1 year following the reconstruction. The results for the pathologic knee were 64.5 degrees (SD 6.9 degrees), 20.6 degrees (SD 4.0 degrees) and 19.7 degrees (SD 8.2 degrees) during the first evaluation, and 72.3 degrees (SD 2.4 degrees), 25.8 degrees (SD 6.4 degrees) and 12.4 degrees (SD 2.3 degrees) during the second.
The performance of the system enabled us to detect knee function modifications in the sagittal and transverse planes. Prior to the reconstruction, the ROM of the injured knee was lower in flexion-extension and internal-external rotation in comparison with the contralateral knee. One year after the surgery, four patients were classified as normal (A) and one as almost normal (B) according to the IKDC score, and changes in the kinematics of the five patients remained: lower flexion-extension ROM and higher internal-external rotation ROM in comparison with the contralateral knee. The 3D kinematics was altered after an ACL lesion and remained altered one year after the surgery.
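One plausible way to obtain a range of motion from the shank- and thigh-mounted gyroscopes described above is to integrate the relative joint angular rate over a gait cycle and take the angle excursion. This is a hedged sketch, not the authors' algorithm (real use would need calibration and drift correction), and the rate samples are invented:

```python
# Hypothetical ROM estimate: integrate a joint angular rate (deg/s)
# sampled at interval dt (s) and return max - min of the angle trace.
# The rate samples below are invented.

def range_of_motion(rates_deg_s, dt):
    angle, trace = 0.0, [0.0]
    for w in rates_deg_s:
        angle += w * dt          # simple rectangular integration
        trace.append(angle)
    return max(trace) - min(trace)

print(range_of_motion([10, 10, -10, -10], 0.5))
```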
Abstract:
Professional cleaning is a basic service occupation with a wide variety of tasks carried out in all kinds of sectors and workplaces by a large workforce. One important risk for cleaning workers is exposure to the chemical substances present in cleaning products. Monoethanolamine was found to be frequently present in cleaning products such as general purpose cleaners, bathroom cleaners, floor cleaners and kitchen cleaners. Monoethanolamine can injure the skin, and exposure to monoethanolamine has been associated with asthma even at low air concentrations. It is a strong irritant and is known to be involved in sensitizing mechanisms. It is very likely that the use of cleaning products containing monoethanolamine gives rise to respiratory and dermal exposures. There is therefore a need to further investigate both respiratory and dermal exposures to monoethanolamine. The determination of monoethanolamine has traditionally been difficult, and the available analytical methods are poorly adapted to occupational exposure assessments. For monoethanolamine air concentrations, a sampling and analytical method was already available and could be used. However, a method to analyse samples for skin exposure assessments, as well as samples from skin permeation experiments, was missing. Therefore, one main objective of this master thesis was to identify an already developed and described analytical method for the measurement of monoethanolamine in water solutions, and to set it up in the laboratory. Monoethanolamine was analyzed after a derivatisation reaction with o-phthaldialdehyde. The derivatised, fluorescent monoethanolamine was then separated by high performance liquid chromatography and detected with a fluorescence detector. The method was found to be suitable for qualitative and quantitative analysis of monoethanolamine.
An exposure assessment was conducted in the cleaning sector to measure respiratory and dermal exposures to monoethanolamine during floor cleaning. Stationary air samples (n=36) were collected in 8 companies, and samples for dermal exposures (n=12) were collected in two companies. The air concentrations detected (mean = 0.18 mg/m3, standard deviation = 0.23 mg/m3, geometric mean = 0.09 mg/m3, geometric standard deviation = 3.50) were mostly below 1/10 of the Swiss 8-h time-weighted average occupational exposure limit. Factors that influenced the measured monoethanolamine air concentrations were room size, ventilation system, the concentration of monoethanolamine in the cleaning product and the amount of monoethanolamine used. Measured skin exposures ranged from 0.6 to 128.4 mg/sample. Some cleaning workers who participated in the skin exposure assessment did not use gloves and had direct contact with the solutions containing the cleaning product and monoethanolamine. During the entire sampling campaign, cleaning workers mostly did not use gloves. Cleaning workers are at risk of being regularly exposed to low air concentrations of monoethanolamine. This exposure may be problematic if a worker suffers from allergic reactions (e.g. asthma). In that case, substitution of the cleaning product may be a good prevention measure, as several different cleaning products are available for similar cleaning tasks. Currently there are no occupational exposure limits against which to compare the skin exposures that were found. To prevent skin exposures, adaptations of the cleaning techniques and the use of gloves should be considered. Simultaneous skin and airborne exposures might accelerate adverse health effects. Overall, the risks caused by exposures to monoethanolamine are considered low to moderate when the cleaning products are used correctly. Whenever possible, skin exposures should be avoided.
Further research should focus especially on the dermal exposure routes, as very high exposures might occur through skin contact with cleaning products. Dermatitis and also sensitization might be caused by skin exposures. In addition, new biomedical insights are needed to better understand the risks of dermal exposure. Therefore, skin permeability experiments should be considered.
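The geometric mean and geometric standard deviation used above to summarize the air concentrations are computed on log-transformed values, the usual convention for lognormally distributed exposure data. A minimal sketch (the concentrations below are invented, not the campaign's data):

```python
# Illustrative geometric mean (GM) and geometric standard deviation
# (GSD) of air concentrations in mg/m3. Values are invented.
import math

def geometric_stats(xs):
    # GM = exp(mean of logs); GSD = exp(SD of logs), dimensionless.
    logs = [math.log(x) for x in xs]
    n = len(logs)
    mean_log = sum(logs) / n
    sd_log = math.sqrt(sum((v - mean_log) ** 2 for v in logs) / n)
    return math.exp(mean_log), math.exp(sd_log)

gm, gsd = geometric_stats([0.03, 0.08, 0.12, 0.25, 0.6])
print(gm, gsd)
```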
Abstract:
STUDY OBJECTIVES: Traditionally, sleep studies in mammals are performed using electroencephalogram/electromyogram (EEG/EMG) recordings to determine sleep-wake state. In laboratory animals, this requires surgery and recovery time and causes discomfort to the animal. In this study, we evaluated the performance of an alternative, noninvasive approach utilizing piezoelectric films to determine sleep and wakefulness in mice, validated against simultaneous EEG/EMG recordings. The piezoelectric films detect the animal's movements with high sensitivity, and the regularity of the piezo output signal, related to the regular breathing movements characteristic of sleep, serves to automatically determine sleep. Although the system is commercially available (Signal Solutions LLC, Lexington, KY), this is the first statistical validation of the system across various aspects of sleep. DESIGN: EEG/EMG and piezo signals were recorded simultaneously for 48 h. SETTING: Mouse sleep laboratory. PARTICIPANTS: Nine male and nine female CFW outbred mice. INTERVENTIONS: EEG/EMG surgery. MEASUREMENTS AND RESULTS: The results showed a high correspondence between EEG/EMG-determined and piezo-determined total sleep time and the distribution of sleep over a 48-h baseline recording with 18 mice. Moreover, the piezo system was capable of assessing sleep quality (i.e., sleep consolidation), and interesting observations at transitions to and from rapid eye movement sleep were made that could be exploited in the future to also distinguish the two sleep states. CONCLUSIONS: The piezo system proved to be a reliable alternative to electroencephalogram/electromyogram recording in the mouse and will be useful for first-pass, large-scale sleep screens for genetic or pharmacological studies. CITATION: Mang GM, Nicod J, Emmenegger Y, Donohue KD, O'Hara BF, Franken P. Evaluation of a piezoelectric system as an alternative to electroencephalogram/electromyogram recordings in mouse sleep studies.