982 results for Tape Recording
Abstract:
The quantitative assessment of the hazards of, and exposures to, nanomaterials faces numerous uncertainties that will be resolved only as scientific knowledge of their properties advances. One consequence of these uncertainties is that the occupational exposure limit values currently defined for dusts are not necessarily relevant to nanomaterials. In the absence of a quantitative reference framework, and at the request of the DGS to inform the deliberations of AFNOR and ISO on the subject, a graduated risk-management approach (control banding) was developed within Anses. This development was carried out with the help of a group of expert rapporteurs attached to the Expert Committee specialized in the assessment of risks related to physical agents, new technologies and major infrastructure. The implementation of the proposed graduated risk-management approach rests on four main steps: 1. Information gathering. This step consists of collecting the available information on the hazards of the manufactured nanomaterial under consideration, as well as on the potential exposure of people at the workstations (field observation, measurements, etc.). 2. Assignment of a hazard band. The potential hazard of the manufactured nanomaterial present, whether raw or incorporated into a (liquid or solid) matrix, is evaluated at this step. The hazard band assigned takes into account the hazard of the bulk product or of its analogous substance at the non-nanometric scale, the biopersistence of the material (for fibrous materials), its solubility and its possible reactivity. 3. Assignment of an exposure band. The exposure band of the manufactured nanomaterial under consideration, or of the product containing it, is defined by the product's emission-potential level. It takes into account the product's physical form (solid, liquid, powder, aerosol), its dustiness and its volatility.
The number of workers, the frequency and duration of exposure, and the quantity handled are not taken into account, unlike in a conventional chemical risk assessment. 4. Obtaining a risk-control band. Crossing the previously assigned hazard and exposure bands defines the risk-control level, which maps out the technical and organizational measures to be implemented to keep the risk as low as possible. An action plan is then defined to guarantee the effectiveness of the prevention recommended by the determined control level; it takes existing prevention measures into account and reinforces them where necessary. If the measures indicated by the risk-control level are not feasible, for example for technical or budgetary reasons, an in-depth risk assessment must be carried out by an expert. Control banding is an alternative method for carrying out a qualitative risk assessment and putting prevention measures in place without resorting to a quantitative risk assessment. Its use seems particularly well suited to the context of manufactured nanomaterials, for which the choice of reference values (occupational exposure limit values) and of appropriate measurement techniques suffers from great uncertainty. The proposed approach relies on simple criteria, accessible in the scientific literature or via the technical data for the products used. Nevertheless, its implementation requires minimum competence in the fields of chemical risk prevention (chemistry, toxicology, etc.), nanosciences and nanotechnologies.
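The band-crossing step (step 4) can be pictured as a simple matrix lookup. The band labels and the control-level matrix below are illustrative placeholders, not the actual Anses assignments:

```python
# Illustrative control-banding lookup. The hazard/exposure band labels and
# the control-level matrix are hypothetical, not the official Anses values.

HAZARD_BANDS = ["H1", "H2", "H3", "H4", "H5"]    # increasing hazard
EXPOSURE_BANDS = ["E1", "E2", "E3", "E4"]        # increasing emission potential

# CONTROL_MATRIX[hazard][exposure] -> control level
# (CL1 = general ventilation ... CL5 = full containment / expert review)
CONTROL_MATRIX = [
    ["CL1", "CL1", "CL2", "CL3"],
    ["CL1", "CL2", "CL2", "CL3"],
    ["CL2", "CL3", "CL3", "CL4"],
    ["CL3", "CL4", "CL4", "CL5"],
    ["CL4", "CL5", "CL5", "CL5"],
]

def control_band(hazard: str, exposure: str) -> str:
    """Cross a hazard band with an exposure band to obtain a control level."""
    i = HAZARD_BANDS.index(hazard)
    j = EXPOSURE_BANDS.index(exposure)
    return CONTROL_MATRIX[i][j]

print(control_band("H3", "E2"))  # CL3
```

The matrix is monotone in both directions, which reflects the idea of the method: a higher hazard band or a higher emission potential can only raise, never lower, the required control level.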
Abstract:
A method of making a multiple matched filter that allows the recognition of different characters in successive planes under simple conditions is proposed. The filter is generated by recording on the same plate the Fourier transforms of the different patterns to be recognized, each affected by a different spherical phase factor because the patterns were placed at different distances from the lens. The approach is demonstrated experimentally with a triple filter that satisfactorily recognizes three characters.
Abstract:
Introduction: Difficult tracheal intubation remains a constant and significant source of morbidity and mortality in anaesthetic practice. Insufficient airway assessment in the preoperative period continues to be a major cause of unanticipated difficult intubation. Although many risk factors have already been identified, preoperative airway evaluation is not always regarded as a standard procedure, and the respective weight of each risk factor remains unclear. Moreover, the available predictive scores are poorly sensitive, only moderately specific, and often operator-dependent. In order to improve the preoperative detection of patients at risk for difficult intubation, we developed a system for automated and objective evaluation of morphologic criteria of the face and neck, using video recordings and advanced techniques borrowed from face recognition. Method and results: Frontal video sequences were recorded in 5 healthy volunteers. During the video recording, subjects were asked to perform maximal flexion-extension of the neck and to open the mouth wide with the tongue pulled out. A robust, real-time face tracking system was then applied, automatically identifying and mapping a grid of 55 control points on the face, which were tracked during head motion. These points located important features of the face, such as the eyebrows, the nose, the contours of the eyes and mouth, and the external contours, including the chin. Moreover, based on this face tracking, the orientation of the head could be estimated at each frame of the video sequence. Thus, we could infer for each frame the pitch angle of the head pose (related to the vertical rotation of the head) and obtain the degree of head extension. Morphological criteria used in the most frequently cited predictive scores were also extracted, such as mouth opening, degree of visibility of the uvula, and thyromental distance. Discussion and conclusion: Preliminary results suggest that the technique is highly feasible.
The next step will be to apply the same automated and objective evaluation to patients undergoing tracheal intubation. The difficulties encountered during intubation will then be correlated with the patients' biometric characteristics. The ultimate objective is to analyze the biometric data with artificial intelligence algorithms to build a highly sensitive and specific predictive test.
Abstract:
INTRODUCTION: Electroencephalography (EEG) has a central role in outcome prognostication in subjects with anoxic/hypoxic encephalopathy following a cardiac arrest (CA). Continuous EEG monitoring (cEEG) has been consistently developed and studied; however, its yield as compared to repeated standard EEG (sEEG) is unknown. METHODS: We studied a prospective cohort of comatose adults treated with therapeutic hypothermia (TH) after a CA. cEEG data regarding background activity and epileptiform components were compared to two 20-minute sEEGs extracted from the cEEG recording (one during TH, and one in early normothermia). RESULTS: In this cohort, 34 recordings were studied. During TH, the agreement between cEEG and sEEG was 97.1% (95% CI: 84.6 - 99.9%) for background discontinuity and reactivity evaluation, and 94.1% (95% CI: 80.3 - 99.2%) for epileptiform activity. In early normothermia, we did not find any discrepancies. Thus, concordance was very good during TH (kappa = 0.83) and perfect during normothermia (kappa = 1). The median delay between CA and the first EEG reactivity testing was 18 hours (range: 4.75 - 25) for patients with perfect agreement and 10 hours (range: 5.75 - 10.5) for the three patients in whom there were discordant findings (P=0.02, Wilcoxon). CONCLUSION: Standard intermittent EEG performs comparably to continuous EEG, both for variables important for outcome prognostication (EEG reactivity) and for the identification of epileptiform transients, in this relatively small sample of comatose survivors of CA. This finding has an important practical implication, especially for centers where EEG resources are limited.
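For reference, the kappa statistic used above quantifies agreement between paired categorical ratings beyond chance. A minimal Cohen's kappa sketch (the ratings below are made up, not the study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings.
    kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Chance agreement: probability both raters pick the same category at random
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Perfect agreement gives kappa = 1:
print(cohens_kappa(["reactive", "nonreactive"], ["reactive", "nonreactive"]))  # 1.0
```

With any disagreement, kappa drops below 1; a value of 0.83, as reported during TH, indicates very good but not perfect concordance.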
Abstract:
Given that clay-rich landslides may become mobilized, leading to rapid mass movements (earthflows and debris flows), they pose critical problems in risk management worldwide. The most widely proposed mechanism leading to such flow-like movements is the increase in water pore pressure in the sliding mass, generating partial or complete liquefaction. This solid-to-liquid transition results in a dramatic reduction of mechanical rigidity in the liquefied zones, which could be detected by monitoring shear wave velocity variations. With this purpose in mind, the ambient seismic noise correlation technique has been applied to measure the variation in the seismic surface wave velocity in the Pont Bourquin landslide (Swiss Alps). This small but active composite earthslide-earthflow was equipped with continuously recording seismic sensors during spring and summer 2010. An earthslide of a few thousand cubic meters was triggered in mid-August 2010, after a rainy period. This article shows that the seismic velocity of the sliding material, measured from daily noise correlograms, decreased continuously and rapidly for several days prior to the catastrophic event. From a spectral analysis of the velocity decrease, it was possible to determine the location of the change at the base of the sliding layer. These results demonstrate that ambient seismic noise can be used to detect rigidity variations before failure and could potentially be used to predict landslides.
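The velocity variation measured from daily noise correlograms is commonly estimated with the "stretching" technique; the following is a generic sketch of that idea, not the authors' actual processing chain:

```python
import numpy as np

def stretching_dv_v(reference, current, t, eps_grid=None):
    """Estimate the relative velocity change dv/v between a reference
    correlogram and a current daily correlogram with the stretching method:
    the reference is evaluated at stretched times t*(1+eps), and the eps
    maximizing the correlation with the current trace is the dv/v estimate."""
    if eps_grid is None:
        eps_grid = np.linspace(-0.05, 0.05, 201)
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        stretched = np.interp(t * (1.0 + eps), t, reference)  # reference(t*(1+eps))
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = float(eps), cc
    return best_eps, best_cc

# Synthetic check: a 1% velocity drop delays arrivals, i.e. current(t) = ref(0.99*t)
t = np.linspace(0.0, 10.0, 2001)
ref = np.sin(2 * np.pi * t) * np.exp(-0.2 * t)
cur = np.interp(t * 0.99, t, ref)
dv_v, cc = stretching_dv_v(ref, cur, t)
```

A drop in rigidity lowers the shear wave velocity, so in a monitoring setting a sustained negative dv/v trend in daily correlograms is the precursor signal described above.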
Abstract:
The aim of the present study was to explore the prevalence of acute cerebrovascular symptoms temporally related to carotid Doppler examination (DEx), in order to increase the awareness and recording of such events and to discuss possible mechanisms. All adult patients who complained of acute-onset neurologic symptoms during or shortly after a carotid DEx between 01/2003 and 12/2011 at the University Hospital of Lausanne were prospectively collected. We identified four consecutive patients with acute-onset neurologic symptoms during or shortly after a carotid DEx among approximately 13,500 patients who underwent carotid DEx in our facility during the nine-year period (0.015% of all examined carotids). Clinical data, imaging reports and CTA (CT angiography) and/or ultrasound images are presented for each patient. Ischemic cerebrovascular events during or immediately after cervical Doppler could be due to chance or to several physical factors. They should be promptly recognized by Doppler personnel and properly treated, but they do not call into question the overwhelming benefits of Doppler in cerebrovascular patients.
Abstract:
The graffiti on pottery discovered at the site of Aventicum (Avenches, VD/Switzerland) form the largest corpus of minor inscriptions of the Roman Empire studied to date: a total of 1828 graffiti have been found. The reading and recording of the inscriptions generally depend on the state of conservation of the graffito and its support. In numerous cases, only a pale shadow of the inscription is visible, which makes traditional examination with the naked eye unsuitable for decipherment. Consequently, advanced techniques have been applied to enhance the readability of such inscriptions. In this paper we show the efficiency of 3D laser profilometry, as well as high-resolution photography, as powerful means of deciphering illegible engraved inscriptions. Using such analyses to decipher graffiti on pottery or on other materials enables a better understanding of minor inscriptions and substantially improves our knowledge of the daily life of ancient populations.
Abstract:
Recent findings suggest an association between exposure to cleaning products and respiratory dysfunctions, including asthma. However, little information is available about the quantitative airborne exposures of professional cleaners to volatile organic compounds deriving from cleaning products. During the first phase of the study, a systematic review of cleaning products was performed. Safety data sheets were reviewed to identify the most frequently added volatile organic compounds. It was found that professional cleaning products are complex mixtures of different components (compounds per cleaning product: 3.5 ± 2.8), and more than 130 chemical substances listed in the safety data sheets were identified in 105 products. The main groups of chemicals were fragrances, glycol ethers, surfactants and solvents, and, to a lesser extent, phosphates, salts, detergents, pH stabilizers, acids and bases. Up to 75% of products contained irritant (Xi), 64% harmful (Xn) and 28% corrosive (C) labeled substances. Hazards for eyes (59%), skin (50%) and by ingestion (60%) were the most reported. Monoethanolamine, a strong irritant known to be involved in sensitizing mechanisms as well as allergic reactions, is frequently added to cleaning products. Monoethanolamine determination in air has traditionally been difficult, and the available air sampling and analysis methods were poorly suited to personal occupational air concentration assessments. A convenient method was developed, with air sampling on impregnated glass fiber filters followed by one-step desorption, gas chromatography and nitrogen-phosphorus selective detection. An exposure assessment was conducted in the cleaning sector to determine airborne concentrations of monoethanolamine, glycol ethers and benzyl alcohol during different cleaning tasks performed by professional cleaning workers in different companies, and to determine background air concentrations of formaldehyde, a known indoor air contaminant.
The occupational exposure study was carried out in 12 cleaning companies, and personal air samples were collected for monoethanolamine (n=68), glycol ethers (n=79), benzyl alcohol (n=15) and formaldehyde (n=45). All measured air concentrations were far below (<1/10 of) the Swiss eight-hour occupational exposure limits, except for ethylene glycol mono-n-butyl ether, and except for butoxypropanol and benzyl alcohol, for which no occupational exposure limits were available. Although detected only once, ethylene glycol mono-n-butyl ether air concentrations (n=4) were high (49.5 mg/m3 to 58.7 mg/m3), hovering around the Swiss occupational exposure limit (49 mg/m3). Background air concentrations showed no presence of monoethanolamine, while glycol ethers were often present and formaldehyde was universally detected. Exposures were influenced by the amount of monoethanolamine in the cleaning product, by cross ventilation and by spraying. During the last phase of the study, the collected data were used to test an existing exposure-modeling tool. The exposure estimates of this so-called Bayesian tool converged toward the measured exposure range as more measured air concentrations were added, a behavior best described by an inverse second-order equation. The results suggest that the Bayesian tool is not suited to predicting low exposures; it should also be tested with other datasets describing higher exposures. Low exposures to different chemical sensitizers and irritants should be further investigated to better understand the development of respiratory disorders in cleaning workers. Prevention measures should focus especially on the incorrect use of cleaning products, to avoid high air concentrations near the exposure limits. - Recent studies show a link between exposure to cleaning products and respiratory diseases such as asthma.
However, little information is yet available on the quantitative exposure of cleaning professionals to the volatile organic compounds in the products they use. During the first phase of this study, a systematic inventory of the professional products used in the cleaning sector was carried out. The safety data sheets of these products were then analyzed in order to catalogue the most frequently used volatile organic compounds. Professional cleaning products were shown to be complex mixtures of chemical components (chemical components per cleaning product: 3.5 ± 2.8); in all, more than 130 substances listed in the safety data sheets were found in the 105 products inventoried. The main classes of chemical substances identified were fragrances, glycol ethers, surfactants and solvents; phosphates, salts, detergents, pH regulators, acids and bases were identified to a lesser extent. More than 75% of the products inventoried contained substances described as irritant (Xi), 64% harmful (Xn) and 28% corrosive (C). Hazards to the eyes (59%), the skin (50%) and by ingestion (60%) were the most frequently mentioned. Monoethanolamine, a strong irritant known to be involved in sensitization mechanisms such as allergic reactions, is frequently added to cleaning products. The analysis of monoethanolamine in air has traditionally been difficult, and the available air sampling and analysis methods were poorly suited to assessing personal air concentrations at workstations.
A new, more efficient method was therefore developed, collecting air samples on impregnated glass fiber filters, followed by a desorption step, gas chromatography, and selective detection of nitrogen compounds. An occupational exposure assessment was conducted in the cleaning sector to determine the airborne concentrations of monoethanolamine, glycol ethers and benzyl alcohol during the various cleaning tasks performed by cleaning professionals in different companies, and to determine background airborne concentrations of formaldehyde, a well-known indoor air pollutant. The occupational exposure study was carried out in 12 cleaning companies, and personal air samples were collected for monoethanolamine (n=68), glycol ethers (n=79), benzyl alcohol (n=15) and formaldehyde (n=45). All substances measured in air were below (<1/10 of) the Swiss eight-hour occupational exposure limit, except for 2-butoxyethanol, and except for butoxypropanol and benzyl alcohol, for which no exposure limit values were available. Although detected only once, the 2-butoxyethanol air concentrations (n=4) were high (49.5 mg/m3 to 58.7 mg/m3), around the Swiss occupational exposure limit (49 mg/m3). Background air concentrations showed no monoethanolamine, whereas glycol ethers were often present and formaldehyde was almost always detected. Workers' exposure was influenced by the amount of monoethanolamine in the cleaning products used, by cross ventilation and by the use of sprays.
During the last phase of the study, the collected data were used to test an existing exposure-modeling tool, the Bayesian tool. The tool's exposure estimates converged with the measured exposures; this was best described by an inverse second-order equation. The results suggest that the Bayesian tool is not suited to predicting low exposures. The tool should also be tested with other datasets describing higher exposure levels. Repeated exposure to chemical substances with irritant and sensitizing properties should be investigated further, in order to better understand the development of respiratory diseases among cleaning professionals. Prevention measures should focus in particular on the correct use of cleaning products, in order to avoid high airborne concentrations near the accepted exposure limit.
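The inverse second-order convergence described above can be fit by ordinary least squares on the basis functions 1, 1/n, 1/n². A sketch on synthetic, made-up data (the functional form and coefficients are assumptions, not the study's values):

```python
import numpy as np

# Fit y = a + b/n + c/n**2 by ordinary least squares on a synthetic
# convergence curve (estimate error vs. number of added measurements).
n = np.arange(1.0, 21.0)                       # number of measurements added
y = 0.5 + 3.0 / n + 2.0 / n**2                 # synthetic, noiseless curve

X = np.column_stack([np.ones_like(n), 1.0 / n, 1.0 / n**2])
(a, b, c), *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(float(a), 3), round(float(b), 3), round(float(c), 3))  # 0.5 3.0 2.0
```

The constant term a is the asymptote the estimates converge to; the 1/n and 1/n² terms describe how quickly additional measurements pull the model toward it.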
Abstract:
In many practical applications the state of field soils is monitored by recording the evolution of temperature and soil moisture at discrete depths. We theoretically investigate the systematic errors that arise when mass and energy balances are computed directly from these measurements. We show that, even with no measurement or model errors, large residuals might result when finite difference approximations are used to compute the fluxes and the storage term. To calculate the limits set by the use of spatially discrete measurements on the accuracy of balance closure, we derive an analytical solution to estimate the residual on the basis of two key parameters: the penetration depth and the distance between the measurements. When the thickness of the control layer for which the balance is computed is comparable to the penetration depth of the forcing (which depends on the thermal diffusivity and on the forcing period), large residuals arise. The residual is also very sensitive to the distance between the measurements, which requires accurately controlling the position of the sensors in field experiments. We also demonstrate that, for the same experimental setup, mass residuals are considerably larger than the energy residuals, due to the nonlinearity of the moisture transport equation. Our analysis suggests that a careful assessment of the systematic mass error introduced by the use of spatially discrete data is required before using fluxes and residuals computed directly from field measurements.
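The penetration depth referred to above can be made concrete: for a periodic forcing, the damping (penetration) depth is commonly taken as d = sqrt(D·T/π), with D the diffusivity and T the forcing period. A quick numeric sketch with illustrative values:

```python
import math

def penetration_depth(diffusivity_m2_s: float, period_s: float) -> float:
    """Damping depth of a periodic forcing: d = sqrt(D * T / pi).
    At depth d the amplitude of the surface oscillation is reduced by 1/e."""
    return math.sqrt(diffusivity_m2_s * period_s / math.pi)

# Illustrative values: thermal diffusivity of a moist soil ~5e-7 m^2/s,
# daily forcing period 86400 s
d = penetration_depth(5e-7, 86400)
print(f"{d:.3f} m")  # 0.117 m
```

With a daily forcing penetrating only about a decimeter, a control layer a few decimeters thick is already comparable to the penetration depth, which is the regime in which the abstract predicts large finite-difference residuals.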
Abstract:
INTRODUCTION: Continuous EEG (cEEG) is increasingly used to monitor brain function in neuro-ICU patients. However, its value in patients with coma after cardiac arrest (CA), particularly in the setting of therapeutic hypothermia (TH), is only beginning to be elucidated. The aim of this study was to examine whether cEEG performed during TH may predict outcome. METHODS: From April 2009 to April 2010, we prospectively studied 34 consecutive comatose patients treated with TH after CA who were monitored with cEEG, initiated during hypothermia and maintained after rewarming. EEG background reactivity to painful stimulation was tested. We analyzed the association between cEEG findings and neurologic outcome, assessed at 2 months with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). RESULTS: Continuous EEG recording was started 12 ± 6 hours after CA and lasted 30 ± 11 hours. Nonreactive cEEG background (12 of 15 (75%) among nonsurvivors versus none of 19 (0%) survivors; P < 0.001) and prolonged discontinuous "burst-suppression" activity (11 of 15 (73%) versus none of 19; P < 0.001) were significantly associated with mortality. EEG seizures with absent background reactivity also differed significantly (seven of 15 (47%) versus none of 12 (0%); P = 0.001). In patients with a nonreactive background or seizures/epileptiform discharges on cEEG, no improvement was seen after TH. A nonreactive cEEG background during TH had a positive predictive value of 100% (95% confidence interval (CI), 74 to 100%) and a false-positive rate of 0% (95% CI, 0 to 18%) for mortality. All survivors had cEEG background reactivity, and the majority of them (14 (74%) of 19) had a favorable outcome (CPC 1 or 2). CONCLUSIONS: Continuous EEG monitoring showing a nonreactive or discontinuous background during TH is strongly associated with unfavorable outcome in patients with coma after CA.
These data warrant larger studies to confirm the value of continuous EEG monitoring in predicting prognosis after CA and TH.
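The predictive statistics quoted above follow directly from a 2×2 confusion table; a minimal sketch, with counts patterned on (but not taken verbatim from) the abstract's figures:

```python
def ppv_and_fpr(tp: int, fp: int, tn: int, fn: int):
    """Positive predictive value and false-positive rate from 2x2 counts:
    PPV = TP / (TP + FP), FPR = FP / (FP + TN)."""
    ppv = tp / (tp + fp)
    fpr = fp / (fp + tn)
    return ppv, fpr

# Hypothetical counts echoing the abstract's pattern: 12 nonreactive-background
# patients died (TP), no survivor was nonreactive (FP = 0), 19 survivors were
# reactive (TN), 3 nonsurvivors were reactive (FN).
ppv, fpr = ppv_and_fpr(tp=12, fp=0, tn=19, fn=3)
print(ppv, fpr)  # 1.0 0.0
```

With zero false positives the PPV is 100% and the false-positive rate 0%, matching the point estimates reported; the wide confidence intervals in the abstract reflect the small sample.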
Abstract:
Alzheimer's disease (AD) disrupts functional connectivity in distributed cortical networks. We analyzed changes in the S-estimator, a measure of multivariate intraregional synchronization, in electroencephalogram (EEG) source space in 15 mild AD patients versus 15 age-matched controls to evaluate its potential as a marker of AD progression. All participants underwent 2 clinical evaluations and 2 EEG recording sessions, at diagnosis and one year later. The main effect of AD was hyposynchronization in the medial temporal and frontal regions and relative hypersynchronization in the posterior cingulate, precuneus, cuneus, and parietotemporal cortices. However, the S-estimator did not change over time in either group. This result motivated an analysis of rapidly progressing versus slowly progressing AD patients. Rapidly progressing AD patients showed a significant reduction in synchronization with time, manifest in the left frontotemporal cortex. Thus, the evolution of source EEG synchronization over time is correlated with the rate of disease progression and should be considered as a cost-effective AD biomarker.
Abstract:
At the request of the Iowa State Highway Commission, the Engineering Research Institute observed the traffic operations at the Interstate 29 (I-29) and Interstate 80 (I-80) interchange in the southwest part of Council Bluffs. The general location of the site is shown in Figure 1. Before limiting the analysis to the diverging area, the project staff drove the entire Council Bluffs freeway system and consulted with Mr. Philip Hassenstab (Iowa State Highway Commission, District 4, Resident Maintenance Engineer at Council Bluffs). The final study scope was delineated as encompassing only the operational characteristics of the diverge area where I-29 South and I-80 East divide and the ramp-to-merge area where I-80 West joins I-29 North (both areas being contained within the aforementioned interchange). Supplementing the traffic operations scope was an effort to delineate and document the applicability of video-tape techniques to traffic engineering studies and analyses. Documentation was primarily in the form of a demonstration video-tape.
Abstract:
Objectives: To study and describe, step by step, the technical aspects of targeted percutaneous cryoablation (CPC) therapy. Materials and methods: CPC is performed by an interventional radiologist under ultrasound and/or CT guidance. Results: CPC is a minimally invasive treatment that uses extreme cold to freeze and destroy localized tumors in the liver, kidneys, prostate, lungs, bones, breasts and skin. Imaging is used to guide the placement of one or more applicators through the skin to the target site and to monitor the freezing process. Liquid nitrogen or argon gas circulates inside a needle applicator (CryoProbe), creating intense cold on contact with the target. An independent probe monitors the temperature of the surrounding tissue. The cold is maintained for at least 20 minutes, followed by active thawing of the "ice ball". The procedure is then repeated to maximize tumor destruction. The procedure lasts about 1 hour and is performed on an outpatient basis. Conclusion: CPC is an effective alternative cancer treatment in selected patients. Its tumor-destroying properties are well established for kidney cancer; however, further investigation is needed to determine the long-term efficacy of CPC in other indications.
Abstract:
1. The hypermetabolism frequently observed at rest in patients with chronic obstructive pulmonary disease has been attributed to a high cost of breathing. However, measurement of the cost of breathing by the usual hyperventilation procedure is fraught with methodological problems. The purpose of this study was to measure the cost of breathing more directly in a group of ambulatory patients with stable chronic obstructive pulmonary disease. 2. The cost of breathing was calculated as the difference in oxygen consumption, measured by indirect calorimetry, between spontaneous breathing and noninvasive mechanical ventilation. Inspiratory muscle rest was achieved by negative or positive pressure ventilation and assessed by recording surface electromyograms of the diaphragm and parasternal intercostal muscles. 3. Seven tests were performed in six ambulatory patients with stable chronic obstructive pulmonary disease, four using positive pressure ventilation and three using negative pressure ventilation. During mechanical ventilation, the electromyographic activity of the diaphragm decreased by 70 +/- 22%, while that of the parasternals was suppressed in four tests and remained unchanged in three. However, oxygen consumption was only 1.6 +/- 6.2% lower during mechanical ventilation. 4. The cost of breathing measured in this study was therefore much lower than previously published values. Stress was not likely to have influenced the results, as neither heart rate nor plasma catecholamines changed between spontaneous breathing and mechanical ventilation. These results suggest that the cost of breathing in ambulatory patients with stable chronic obstructive pulmonary disease may be lower than previously estimated.