51 results for Assessment and certification systems
Abstract:
Molar heat capacities of the binary compounds NiAl, NiIn, NiSi, NiGe, NiBi, NiSb, CoSb and FeSb were determined every 10 K by differential scanning calorimetry in the temperature range 310-1080 K. The experimental results were fitted versus temperature according to Cp = a + b·T + c·T⁻² + d·T². Results are given, discussed and compared to estimates found in the literature. Two compounds, NiBi and FeSb, undergo transformations between 460 and 500 K. (C) 1999 Elsevier Science Ltd. All rights reserved.
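For illustration, a minimal fitting sketch in Python (assuming NumPy and SciPy are available): it generates synthetic Cp(T) data and fits the quoted four-term expression with scipy.optimize.curve_fit. The coefficient values, noise level and the name cp_model are placeholders, not the published results.

import numpy as np
from scipy.optimize import curve_fit

def cp_model(T, a, b, c, d):
    # Fitted form quoted in the abstract: Cp = a + b*T + c*T^-2 + d*T^2
    return a + b * T + c / T**2 + d * T**2

# Synthetic (T, Cp) data over 310-1080 K; coefficients and noise are made up.
rng = np.random.default_rng(0)
T = np.linspace(310, 1080, 78)
cp_obs = cp_model(T, 48.0, 0.012, -2.0e5, 1.0e-6) + rng.normal(0.0, 0.2, T.size)

# Starting values help convergence because the coefficients differ by orders of magnitude.
params, _ = curve_fit(cp_model, T, cp_obs, p0=[50.0, 0.01, -1.0e5, 1.0e-6])
print("a, b, c, d =", params)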
Abstract:
BACKGROUND: We aimed to assess the value of a structured clinical assessment and genetic testing for refining the diagnosis of abacavir hypersensitivity reactions (ABC-HSRs) in a routine clinical setting. METHODS: We performed a diagnostic reassessment using a structured patient chart review in individuals who had stopped ABC because of a suspected HSR. Two HIV physicians blinded to the human leukocyte antigen (HLA) typing results independently classified these individuals on a scale from 3 (ABC-HSR highly likely) to -3 (ABC-HSR highly unlikely). Scoring was based on symptoms, onset of symptoms and comedication use. Patients were classified as clinically likely (mean score ≥2), uncertain (mean score ≥-1 and ≤1) and unlikely (mean score ≤-2). HLA typing was performed using sequence-based methods. RESULTS: Of the 131 reassessed individuals, 27 (21%) were classified as likely, 43 (33%) as unlikely and 61 (47%) as uncertain ABC-HSR. Of the 131 individuals with suspected ABC-HSR, 31% were HLA-B*5701-positive compared with 1% of 140 ABC-tolerant controls (P < 0.001). The HLA-B*5701 carriage rate was higher in individuals with likely ABC-HSR than in those with uncertain or unlikely ABC-HSR (78%, 30% and 5%, respectively; P < 0.001). Only six (7%) HLA-B*5701-negative individuals were classified as likely HSR after reassessment. CONCLUSIONS: HLA-B*5701 carriage is highly predictive of clinically diagnosed ABC-HSR. The high proportion of HLA-B*5701-negative individuals with minor symptoms among individuals with suspected HSR indicates overdiagnosis of ABC-HSR in the era preceding genetic screening. A structured clinical assessment and genetic testing could reduce the rate of inappropriate ABC discontinuation and identify individuals at high risk for ABC-HSR.
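A minimal sketch of the reported classification rule, assuming the two physicians' scores are simply averaged as described; the function name classify_abc_hsr and the handling of mean scores of ±1.5 (which fall between the stated bands) are our own additions.

# Sketch of the scoring rule: two blinded physicians each score a case from
# -3 (HSR highly unlikely) to +3 (HSR highly likely); the mean score assigns
# the case to one of three categories, using the thresholds from the abstract.
def classify_abc_hsr(score_physician_1, score_physician_2):
    mean_score = (score_physician_1 + score_physician_2) / 2
    if mean_score >= 2:
        return "likely"
    if mean_score <= -2:
        return "unlikely"
    if -1 <= mean_score <= 1:
        return "uncertain"
    return "unclassified"  # mean scores of +/-1.5 fall outside the stated bands

print(classify_abc_hsr(3, 2))   # -> likely
print(classify_abc_hsr(0, -1))  # -> uncertain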
Abstract:
Proper examination of the pupil provides an objective measure of the integrity of the pregeniculate afferent visual pathway and allows assessment of sympathetic and parasympathetic innervation to the eye. Infrared videography and pupillography are increasingly used to study the dynamic behavior of the pupil in common disorders, such as Horner's syndrome and tonic pupil.
Abstract:
Because of the development of modern transportation facilities, an ever-rising number of individuals, including many patients with preexisting diseases, visit high-altitude locations (>2500 m). High-altitude exposure triggers a series of physiologic responses intended to maintain adequate tissue oxygenation. Even in normal subjects, there is enormous interindividual variability in these responses, which may be further amplified by environmental factors such as cold temperature, low humidity, exercise, and stress. These adaptive mechanisms, although generally tolerated by most healthy subjects, may induce major problems in patients with preexisting cardiovascular diseases in whom the functional reserves are already limited. Preexposure assessment of patients helps to minimize risk and detect contraindications to high-altitude exposure. Moreover, the great variability and unpredictability of the adaptive response should encourage physicians counseling such patients to adopt a cautious approach. Here, we briefly review how high-altitude adjustments may interfere with and aggravate or decompensate preexisting cardiovascular diseases. Moreover, we provide practical recommendations on how to investigate and counsel patients with cardiovascular disease who wish to travel to high-altitude locations.
Abstract:
BACKGROUND AND OBJECTIVES: Evaluation of glomerular hyperfiltration (GH) is difficult; the variety of reported definitions impedes comparisons between studies. A clear and universal definition of GH would help in comparing the results of trials aimed at reducing GH. This study assessed how GH is measured and defined in the literature. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Three databases (Embase, MEDLINE, CINAHL) were systematically searched using the terms "hyperfiltration" or "glomerular hyperfiltration". All studies reporting a GH threshold or studying the effect of a high GFR in a continuous manner against another outcome of interest were included. RESULTS: The literature search was performed from November 2012 to February 2013 and updated in August 2014. Of the 2013 studies retrieved, 405 were included. A threshold to define GH was reported in 55.6% of studies. Of these, 88.4% used a single threshold and 11.6% used several thresholds adapted to participant sex or age. In 29.8% of the studies, the choice of a GH threshold was not based on a control group or literature references. After 2004, the use of a GH threshold increased (P<0.001), but the use of a control group to precisely define that threshold decreased significantly (P<0.001); the threshold did not differ among pediatric, adult, or mixed-age studies. The GH threshold ranged from 90.7 to 175 ml/min per 1.73 m² (median, 135 ml/min per 1.73 m²). CONCLUSION: Thirty percent of studies did not justify their choice of threshold values. The decrease of GFR in the elderly was rarely considered in defining GH. From a methodologic point of view, an age- and sex-matched control group should be used to define a GH threshold.
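A minimal sketch of the recommended approach, i.e. deriving the hyperfiltration cutoff from an age- and sex-matched control group; the mean + 2 SD rule and the sample values are illustrative assumptions, since the abstract does not prescribe a specific statistic.

# Derive a GH cutoff from matched controls instead of a fixed literature value.
# The mean + 2*SD convention and the control GFR values below are assumptions.
import statistics

def gh_threshold(control_gfr):
    # Returns a hyperfiltration cutoff in ml/min per 1.73 m^2.
    mean = statistics.mean(control_gfr)
    sd = statistics.stdev(control_gfr)
    return mean + 2 * sd

controls = [98, 105, 112, 101, 95, 108, 117, 99, 103, 110]  # hypothetical age- and sex-matched controls
print(f"GH threshold: {gh_threshold(controls):.1f} ml/min per 1.73 m2")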
Abstract:
Since the discovery of the hypocretins/orexins (Hcrt/Ox) in 1998, several narcoleptic mouse models have been generated, such as Hcrt-KO, Hcrtr1-KO, Hcrtr2-KO and double-receptor KO mice, and orexin-ataxin transgenic mice. The available Hcrt mouse models do not allow dissection of the specific role of Hcrt in each target region. Dr. Anne Vassalli generated loxP-flanked alleles for each Hcrt receptor, which are manipulated by Cre recombinase to generate mouse lines with Hcrtr1, Hcrtr2, or both disrupted in a cell-type-specific manner. The role of noradrenaline (NA) and dopamine (DA) in the regulation of vigilance states is well documented. The purpose of this thesis is to explore the role of the Hcrt input into these two monoaminergic systems. Chronic loss of Hcrtr1 in NA neurons consolidated paradoxical sleep (PS) and altered waking brain activity at baseline, during sleep deprivation (SD), and when mice were challenged by a novel environment or exposed to nest-building material. The analysis of alterations in sleep EEG delta power showed a consistent correlation with changes in the quality of the preceding waking in these mice. Targeted inactivation of the Hcrt input into DA neurons showed that Hcrtr2 inactivation produces the strongest phenotype. The loss of Hcrtr2 in DA neurons modified brain activity in spontaneous wakefulness, during SD, and in novel environmental conditions. In addition to altering wakefulness quality and quantity, conditional inactivation of Hcrtr2 in DA neurons caused an increase in time spent in PS at baseline and a delayed and less complete PS recovery after SD. In the first 30 min of sleep recovery, single conditional knockout mice (i.e., for Hcrtr1 or Hcrtr2) had opposite changes in delta activity, with an increased power density in the fast delta range upon specific inactivation of Hcrtr2, but a decreased power density in the same range upon specific inactivation of Hcrtr1 in DA cells. These studies demonstrate a complex impact of Hcrt receptor signaling in both the NA and DA systems, not only on the quantity and quality of wakefulness, but also on the regulation of PS amount and on SWS delta power expression.
Abstract:
Adult and pediatric laryngotracheal stenoses (LTS) comprise a wide array of conditions that require precise preoperative assessment and classification to improve comparison of different therapeutic modalities in matched series of patients. This consensus paper of the European Laryngological Society proposes a five-step endoscopic airway assessment and a standardized reporting system to better differentiate fresh, incipient LTSs from mature, cicatricial LTSs; simple one-level from complex multilevel LTSs; and "healthy" from "severely morbid" patients. The proposed scoring system, which integrates all of these parameters, may be used to help define different groups of LTS patients, choose the best treatment modality for each individual patient and assess distinct post-treatment outcomes accordingly.
Abstract:
Characterizing geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Detailed 3D data, such as LiDAR point clouds, allow accurate study of hazard processes and of the structure of geologic features, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential to be applied to a wide range of geological investigations, both in research and in applied geology projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral/hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remotely sensed data. During the past ten years, several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even when inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to improve the forecasting of potential future landslides, it is crucial to understand the evolution of rock slope stability. Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land-use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale. Similar procedures were already available to evaluate rockfall susceptibility based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The computed most probable rockfall source areas in granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared to inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its relevant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods.
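As an illustration of the kind of kinematic check that can be run on orientations extracted from LiDAR point clouds, here is a minimal sketch of a Markland-type planar sliding test; it is a textbook criterion, not the thesis model itself, and the friction angle, lateral limit and example orientations are assumed values.

# Planar sliding feasibility for one cliff cell and one joint set.
# All angles in degrees; dip directions measured clockwise from north.
def planar_sliding_feasible(slope_dip, slope_dip_dir, joint_dip, joint_dip_dir,
                            friction_angle=30.0, lateral_limit=20.0):
    daylights = joint_dip < slope_dip                   # joint must daylight in the face
    frictionally_unstable = joint_dip > friction_angle  # dip must exceed the friction angle
    direction_diff = abs((joint_dip_dir - slope_dip_dir + 180) % 360 - 180)
    aligned = direction_diff <= lateral_limit           # joint dips roughly out of the face
    return daylights and frictionally_unstable and aligned

# Hypothetical cliff cell and joint set orientations
print(planar_sliding_feasible(slope_dip=75, slope_dip_dir=135, joint_dip=55, joint_dip_dir=142))  # True
print(planar_sliding_feasible(slope_dip=75, slope_dip_dir=135, joint_dip=25, joint_dip_dir=140))  # False: below friction angle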
Limit equilibrium models have been applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling. In particular, I conducted a back-analysis of the large rockfall event of 2005 (265'000 m³) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique to obtain vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of the unstable rock compartments. Finally, the following points summarize the main outputs of my research: The new model to compute failure mechanisms and rockfall susceptibility with 3D point clouds allows accurate definition of the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed measurements of the fractures into geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geologic structures. The integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
Abstract:
The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety. In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (Workshop on 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems), as they more closely resemble in-vivo conditions in the lungs and they allow for unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells. An important aspect, which is frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung. If we consider average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration (OSHA)) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵ to 5×10⁻³ μg/h per cm² of lung tissue and 2-300 particles/h per (epithelial) cell. This range can be easily matched and even exceeded by almost all currently available cell exposure systems. The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role of aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, and at the same time biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
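A back-of-the-envelope sketch of the dose comparison discussed above, converting an inhaled mass concentration into a deposited mass flux per cm² of lung tissue; the ventilation rate, deposition fraction and surface area are illustrative assumptions, not figures from the consensus document, and the result only indicates the order of magnitude.

# All parameter values below are assumptions for illustration only.
concentration_ug_per_l = 5.0        # 5 mg/m^3 (OSHA limit) = 5 ug per litre of air
ventilation_l_per_h = 1200.0        # ~20 L/min minute ventilation during light work (assumed)
alveolar_deposition_fraction = 0.3  # fraction of inhaled particles deposited in the alveoli (assumed)
alveolar_surface_cm2 = 1.4e6        # ~140 m^2 of alveolar surface (textbook estimate)

deposited_ug_per_h = concentration_ug_per_l * ventilation_l_per_h * alveolar_deposition_fraction
flux_ug_per_h_cm2 = deposited_ug_per_h / alveolar_surface_cm2
print(f"deposited mass flux: {flux_ug_per_h_cm2:.1e} ug/h/cm^2")  # ~1e-3, same order as the upper limit quoted above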
Abstract:
BACKGROUND: According to recent guidelines, patients with coronary artery disease (CAD) should undergo revascularization if significant myocardial ischemia is present. Both cardiovascular magnetic resonance (CMR) and fractional flow reserve (FFR) allow for reliable ischemia assessment, and in combination with the anatomical information provided by invasive coronary angiography (CXA), such a work-up sets the basis for a decision to revascularize or not. The cost-effectiveness ratios of these two strategies are compared. METHODS: Strategy 1: CMR to assess ischemia followed by CXA in ischemia-positive patients (CMR + CXA); Strategy 2: CXA followed by FFR in angiographically positive stenoses (CXA + FFR). The costs, evaluated from the third-party payer perspective in Switzerland, Germany, the United Kingdom (UK), and the United States (US), included public prices of the different outpatient procedures and costs induced by procedural complications and by diagnostic errors. The effectiveness criterion was the correct identification of hemodynamically significant coronary lesion(s) (= significant CAD) complemented by full anatomical information. Test performances were derived from the published literature. Cost-effectiveness ratios for both strategies were compared for hypothetical cohorts with different pretest likelihoods of significant CAD. RESULTS: CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 65% in Germany, 83% in the UK, and 82% in the US, with costs of CHF 5'794, euro 1'517, £ 2'680, and $ 2'179 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. CONCLUSIONS: The CMR + CXA strategy is more cost-effective than CXA + FFR below a CAD prevalence of 62%, 65%, 83%, and 82% for the Swiss, German, UK, and US health care systems, respectively. These findings may help to optimize resource utilization in the diagnosis of CAD.
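A simplified sketch of this comparison: expected cost per patient correctly diagnosed as a function of pretest likelihood, with a first test applied to everyone and a second, confirmatory test applied only to positives. The sensitivities, specificities and unit costs are hypothetical placeholders, and the published model also includes complication and misdiagnosis costs.

# Expected cost per correct diagnosis for a two-step testing strategy.
# Test accuracies and unit costs are hypothetical placeholders.
def cost_per_correct_diagnosis(prevalence, se, sp, cost_first_test, cost_second_test):
    # First test in all patients, second (confirmatory) test only in positives.
    p_first_positive = prevalence * se + (1 - prevalence) * (1 - sp)
    expected_cost = cost_first_test + p_first_positive * cost_second_test
    p_correct = prevalence * se + (1 - prevalence) * sp
    return expected_cost / p_correct

for prev in (0.2, 0.5, 0.8):
    cmr_cxa = cost_per_correct_diagnosis(prev, se=0.89, sp=0.87, cost_first_test=900, cost_second_test=1500)
    cxa_ffr = cost_per_correct_diagnosis(prev, se=0.95, sp=0.94, cost_first_test=1500, cost_second_test=700)
    print(f"pretest likelihood {prev:.0%}: CMR+CXA {cmr_cxa:.0f} vs CXA+FFR {cxa_ffr:.0f} per correct diagnosis")

# With these placeholder numbers, CMR+CXA is cheaper per correct diagnosis at low
# pretest likelihood and CXA+FFR at high pretest likelihood, mirroring the
# threshold behaviour described in the abstract.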