965 results for Objective measurement


Relevance: 20.00%

Abstract:

The paracaspase MALT1 is a Cys-dependent, Arg-specific protease that plays an essential role in the activation and proliferation of lymphocytes during the immune response. Oncogenic activation of MALT1 is associated with the development of specific forms of B-cell lymphomas. Through specific cleavage of its substrates, MALT1 controls various aspects of lymphocyte activation, including the activation of transcriptional pathways, the stabilization of mRNAs, and an increase in cellular adhesion. In lymphocytes, the activity of MALT1 is tightly controlled by its inducible monoubiquitination, which promotes the dimerization of MALT1. Here, we describe both in vitro and in vivo assays that have been developed to assess MALT1 activity.

Relevance: 20.00%

Abstract:

Objective: To describe the methodology of confirmatory factor analysis (CFA) for categorical items and to apply it to evaluate the factor structure and invariance of the WHO Disability Assessment Schedule (WHODAS-II) questionnaire, developed by the World Health Organization. Methods: Data for the analysis come from the European Study of Mental Disorders (ESEMeD), a cross-sectional interview of a representative sample of the general population of 6 European countries (n=8796). Respondents were administered a modified version of the WHODAS-II, which measures functional disability in the previous 30 days in 6 dimensions: Understanding and Communicating; Self-Care; Getting Around; Getting Along with Others; Life Activities; and Participation. The questionnaire includes two types of items: 22 severity items (5-point Likert scale) and 8 frequency items (continuous). An exploratory factor analysis (EFA) with promax rotation was conducted on a random 50% of the sample. The remaining half of the sample was used to perform a CFA comparing three models: (a) the model suggested by the EFA results; (b) the theoretical 6-dimension model suggested by the WHO; (c) a reduced model, equivalent to model (b), in which 4 of the frequency items are excluded. Moreover, a second-order factor was also evaluated. Finally, a CFA with covariates was estimated to evaluate measurement invariance of the items between Mediterranean and non-Mediterranean countries. Results: The EFA solution that provided the best results contained 7 factors. Two of the frequency items presented high loadings on the same factor, and one of them presented loadings smaller than 0.3 on all factors. In the CFA, the reduced model (model c) presented the best goodness of fit (CFI=0.992, TLI=0.996, RMSEA=0.024).
The second-order factor structure presented adequate goodness of fit (CFI=0.987, TLI=0.991, RMSEA=0.036). Measurement non-invariance was detected for one item of the questionnaire (FD20, embarrassment due to health problems). Conclusions: The CFA confirmed the initial hypothesis of a 6-factor structure for the WHODAS-II. The second-order factor supports the existence of a global dimension of disability. The use of 4 of the frequency items is not recommended in the scoring of the corresponding dimensions.
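For context on the fit statistics quoted above, CFI, TLI and RMSEA follow standard formulas based on the chi-square statistics of the fitted model and of a baseline (null) model. A minimal sketch, not taken from the study's code; the chi-square inputs below are hypothetical:

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation from a model chi-square."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative fit index: improvement of the model over the baseline model."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

def tli(chi2_m, df_m, chi2_b, df_b):
    """Tucker-Lewis index (non-normed fit index)."""
    return (chi2_b / df_b - chi2_m / df_m) / (chi2_b / df_b - 1.0)
```

By convention, CFI and TLI values close to 1 and RMSEA values below roughly 0.05 indicate good fit, which is why the reduced model's CFI=0.992 and RMSEA=0.024 are reported as the best results.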

Relevance: 20.00%

Abstract:

The measurement of pavement roughness has been a concern of highway engineers for more than 70 years. This roughness is referred to as "riding quality" by the traveling public. Pavement roughness evaluating devices have attempted to place either a graphical or numerical value on the public's riding comfort or discomfort. Early graphical roughness recorders had many different designs. In 1900 an instrument called the "Viagraph" was developed by an Irish engineer. The "Viagraph" consisted of a twelve-foot board, with a graphical recorder, drawn over the pavement. The "Profilometer" built in Illinois in 1922 was much more impressive. The instrument's recorder was mounted on a frame supported by 32 bicycle wheels mounted in tandem. Many other variations of profilometers with recorders were built, but most were difficult to handle and could not secure uniformly reproducible results. The Bureau of Public Roads (BPR) Road Roughness Indicator, built in 1941, is the most widely used numerical roughness recorder. The BPR Road Roughness Indicator consists of a trailer unit with carefully selected springs, a means of dampening, and a balanced wheel.

Relevance: 20.00%

Abstract:

Background: The TID ratio indirectly reflects myocardial ischemia and is correlated with cardiac prognosis. We aimed to compare the influence of three different software packages on the assessment of TID using Rb-82 cardiac PET/CT. Methods: In total, data from 30 patients were used, based on normal myocardial perfusion (SSS<3 and SRS<3) and stress myocardial blood flow (2 mL/min/g) assessed by Rb-82 cardiac PET/CT. After reconstruction using 2D OSEM (2 iterations, 28 subsets) and 3D filtering (Butterworth, order=10, ωc=0.5), data were processed automatically, and then manually to define identical basal and apical limits on both stress and rest images. TID ratios were determined with the Myometrix®, ECToolbox® and QGS® software packages. Comparisons used ANOVA, Student t-tests and the Lin concordance test (ρc). Results: All 90 processing runs were successfully performed. TID ratios were not statistically different between software packages when data were processed automatically (P=0.2) or manually (P=0.17). There was a slight but significant relative overestimation of TID with automatic compared with manual processing using ECToolbox® (1.07 ± 0.13 vs 1.00 ± 0.13, P=0.001) and Myometrix® (1.07 ± 0.15 vs 1.01 ± 0.11, P=0.003), but not using QGS® (1.02 ± 0.12 vs 1.05 ± 0.11, P=0.16). The best concordance was achieved between ECToolbox® and Myometrix® manual processing (ρc=0.67). Conclusion: In both automatic and manual mode, TID estimation was not significantly influenced by software type. With Myometrix® or ECToolbox®, TID differed significantly between automatic and manual processing, but not with QGS®. The software package should be accounted for when defining TID normal reference limits, as well as in multicenter studies. QGS® appeared to be the most operator-independent software package, while ECToolbox® and Myometrix® produced the closest results.
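The Lin concordance coefficient (ρc) used above has a closed-form sample estimator that penalizes both poor correlation and systematic offset between two methods. A minimal sketch, assuming paired measurements from two software packages; the data are hypothetical:

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two paired measurement series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # population variances
    sxy = ((x - mx) * (y - my)).mean()  # population covariance
    return 2 * sxy / (vx + vy + (mx - my) ** 2)
```

Unlike Pearson's r, ρc drops below 1 when one method is systematically shifted relative to the other, which is why it is preferred for software agreement studies.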

Relevance: 20.00%

Abstract:

OBJECTIVES: Our objective is to test the hypothesis that coronary endothelial function (CorEndoFx) does not change with repeated isometric handgrip (IHG) stress in CAD patients or healthy subjects. BACKGROUND: Coronary responses to endothelial-dependent stressors are important measures of vascular risk that can change in response to environmental stimuli or pharmacologic interventions. The evaluation of the effect of an acute intervention on endothelial response is only valid if the measurement does not change significantly in the short term under normal conditions. Using 3.0 Tesla (T) MRI, we non-invasively compared two coronary artery endothelial function measurements separated by a ten-minute interval in healthy subjects and patients with coronary artery disease (CAD). METHODS: Twenty healthy adult subjects and 12 CAD patients were studied on a commercial 3.0 T whole-body MR imaging system. Coronary cross-sectional area (CSA), peak diastolic coronary flow velocity (PDFV) and blood-flow were quantified before and during continuous IHG stress, an endothelial-dependent stressor. The IHG exercise with imaging was repeated after a 10-minute recovery period. RESULTS: In healthy adults, coronary artery CSA changes and blood-flow increases did not differ between the first and second stresses (mean % change ±SEM, first vs. second stress CSA: 14.8%±3.3% vs. 17.8%±3.6%, p = 0.24; PDFV: 27.5%±4.9% vs. 24.2%±4.5%, p = 0.54; blood-flow: 44.3%±8.3% vs. 44.8%±8.1%, p = 0.84). The coronary vasoreactive responses in the CAD patients also did not differ between the first and second stresses (mean % change ±SEM, first stress vs. second stress: CSA: -6.4%±2.0% vs. -5.0%±2.4%, p = 0.22; PDFV: -4.0%±4.6% vs. -4.2%±5.3%, p = 0.83; blood-flow: -9.7%±5.1% vs. -8.7%±6.3%, p = 0.38). CONCLUSION: MRI measures of CorEndoFx are unchanged during repeated isometric handgrip exercise tests in CAD patients and healthy adults.
These findings demonstrate the repeatability of noninvasive 3T MRI assessment of CorEndoFx and support its use in future studies designed to determine the effects of acute interventions on coronary vasoreactivity.

Relevance: 20.00%

Abstract:

Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed in order to deal with environmental variables. The majority of these models lead to diverging results. Second, the choice of input and output variables to be included in the efficiency analysis is often dictated by data availability. The choice of the variables remains an issue even when data is available. As a result, the choice of technique, model and variables is probably, and ultimately, a political judgement. Multi-criteria decision analysis methods can help decision makers select the most suitable model. The number of selection criteria should remain parsimonious and not be oriented towards the results of the models, in order to avoid opportunistic behaviour. The selection criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model allowing for an environmental adjustment. In this model, an efficiency analysis is conducted with DEA, followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, is the fact that certain schools operate on multiple sites. Results show that being located on more than one site has a negative influence on efficiency. A likely way to mitigate this negative influence would be to improve the use of ICT in school management and teaching. The planning of new schools should also consider the advantages of a single site, which allows a critical size in terms of pupils and teachers to be reached.
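In Ray's two-stage approach, the first stage computes a DEA efficiency score for each school by linear programming. As a hedged illustration (not the thesis's actual implementation), the classic input-oriented CCR efficiency score can be computed with scipy; the input/output data here are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (m inputs x n units), Y: (s outputs x n units).
    Returns theta in (0, 1]; 1 means the unit lies on the efficient frontier."""
    m, n = X.shape
    s = Y.shape[0]
    # decision variables: [theta, lambda_1..lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_in = np.c_[-X[:, [j0]], X]
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_r,j0
    A_out = np.c_[np.zeros((s, 1)), -Y]
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]
```

The second stage would then regress these scores on environmental variables (such as the multi-site indicator) to explain the residual inefficiency.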
The fact that underprivileged pupils perform worse than privileged pupils has been public knowledge since Coleman et al. (1966). As a result, underprivileged pupils have a negative influence on school efficiency. This is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies in order to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed. As a result, other actions need to be taken. In order to define these actions, one has to identify the social-class differences that explain why disadvantaged children underperform. Childrearing and literacy practices, health characteristics, housing stability and economic security influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies. For instance, they could define pre-school, family, health, housing and benefits policies in order to improve the conditions for disadvantaged children.

Relevance: 20.00%

Abstract:

Background: In haemodynamically stable patients with acute symptomatic pulmonary embolism (PE), studies have not evaluated the usefulness of combining the measurement of cardiac troponin, transthoracic echocardiogram (TTE), and lower extremity complete compression ultrasound (CCUS) testing for predicting the risk of PE-related death. Methods: The study assessed the ability of three diagnostic tests (cardiac troponin I (cTnI), echocardiogram, and CCUS) to prognosticate the primary outcome of PE-related mortality during 30 days of follow-up after a diagnosis of PE by objective testing. Results: Of 591 normotensive patients diagnosed with PE, the primary outcome occurred in 37 patients (6.3%; 95% CI 4.3% to 8.2%). Patients with right ventricular dysfunction (RVD) by TTE and concomitant deep vein thrombosis (DVT) by CCUS had a PE-related mortality of 19.6%, compared with 17.1% of patients with elevated cTnI and concomitant DVT and 15.2% of patients with elevated cTnI and RVD. The use of any two-test strategy had a higher specificity and positive predictive value compared with the use of any test by itself. A combined three-test strategy did not further improve prognostication. For a subgroup analysis of high-risk patients, according to the pulmonary embolism severity index (classes IV and V), positive predictive values of the two-test strategies for PE-related mortality were 25.0%, 24.4% and 20.7%, respectively. Conclusions: In haemodynamically stable patients with acute symptomatic PE, a combination of echocardiography (or troponin testing) and CCUS improved prognostication compared with the use of any test by itself for the identification of those at high risk of PE-related death.
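The gain in specificity and positive predictive value from requiring two tests to be positive can be illustrated with the standard Bayes formulas, under a simplifying assumption of conditional independence between the tests (the study's estimates came from observed outcomes; the numbers below are hypothetical):

```python
def ppv(sens, spec, prev):
    """Positive predictive value from sensitivity, specificity and prevalence (Bayes)."""
    tp = sens * prev
    fp = (1 - spec) * (1 - prev)
    return tp / (tp + fp)

def both_positive(sens1, spec1, sens2, spec2):
    """Sensitivity/specificity of a strategy requiring BOTH tests positive,
    assuming the two tests are conditionally independent."""
    sens = sens1 * sens2                      # sensitivity drops
    spec = 1 - (1 - spec1) * (1 - spec2)      # specificity rises
    return sens, spec
```

Requiring both tests positive trades sensitivity for specificity, which raises the PPV at a fixed prevalence; combining a third test adds little once specificity is already high, consistent with the conclusion above.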

Relevance: 20.00%

Abstract:

We analyze the constraints on the mass and mixing of a superstring-inspired E6 Z' neutral gauge boson that follow from the recent precise Z mass measurements and show that they depend very sensitively on the assumed value of the W mass and also, to a lesser extent, on the top-quark mass.

Relevance: 20.00%

Abstract:

PURPOSE: Awareness of being monitored can influence participants' habitual physical activity (PA) behavior. This reactivity effect may threaten the validity of PA assessment. Reports on reactivity when measuring the PA of children and adolescents have been inconsistent. The aim of this study was to investigate whether PA outcomes measured by accelerometer devices differ from measurement day to measurement day and whether the day of the week and the day on which measurement started influence these differences. METHODS: Accelerometer data (counts per minute [cpm]) of children and adolescents (n = 2081) pooled from eight studies in Switzerland with at least 10 h of daily valid recording were investigated for effects of measurement day, day of the week, and start day using mixed linear regression. RESULTS: The first measurement day was the most active day. Counts per minute were significantly higher than on the second to the sixth day, but not on the seventh day. Differences in the age-adjusted means between the first and consecutive days ranged from 23 to 45 cpm (3.6%-7.1%). In preschool children, the differences almost reached 10%. The start day significantly influenced PA outcome measures. CONCLUSIONS: Reactivity to accelerometer measurement of PA is likely to be present to an extent of approximately 5% on the first day and may introduce a relevant bias to accelerometer-based studies. In preschool children, the effects are larger than those in elementary and secondary schoolchildren. As the day of the week and the start day significantly influence PA estimates, researchers should plan for at least one familiarization day in school-age children and randomly assign start days.
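The first-day reactivity effect described above can be quantified, in its simplest form, as the within-child excess of day 1 over the mean of the remaining days. A minimal numpy sketch (not the study's mixed linear regression; the cpm values are hypothetical):

```python
import numpy as np

def reactivity(cpm):
    """cpm: (n_children x n_days) array of daily counts per minute.
    Returns the mean within-child excess of day 1 over the mean of the
    remaining days, in cpm and as a percentage of the later-day mean."""
    cpm = np.asarray(cpm, float)
    day1 = cpm[:, 0]
    rest = cpm[:, 1:].mean(axis=1)
    diff = day1 - rest
    return diff.mean(), 100.0 * diff.mean() / rest.mean()
```

A mixed model additionally adjusts for age, weekday and start day and accounts for the clustering of days within children, but the quantity estimated is the same day-1 excess.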

Relevance: 20.00%

Abstract:

Measurement of blood pressure by the physician remains an essential step in the evaluation of cardiovascular risk. Ambulatory measurement and self-measurement of blood pressure are ways of counteracting the "white coat" effect, the rise in blood pressure many patients experience in the presence of doctors. It is thus possible to define the cardiovascular risk of hypertension and to identify the patients with the greatest chance of benefiting from antihypertensive therapy. However, it must be realised that subjects who are normotensive during their everyday activities but hypertensive in the doctor's surgery may become hypertensive with time, irrespective of the means used to measure blood pressure. These patients should be followed up regularly even if the decision to treat has been postponed.

Relevance: 20.00%

Abstract:

Kinematic functional evaluation with body-worn sensors provides discriminative and responsive scores after shoulder surgery, but the optimal combination of movements has not yet been scientifically investigated. The aim of this study was to develop a simplified shoulder-function kinematic score that includes only essential movements. The P Score, a seven-movement kinematic score developed on 31 healthy participants and 35 patients before surgery and at 3, 6 and 12 months after shoulder surgery, served as a reference. Principal component analysis and multiple regression were used to create simplified scoring models, and the candidate models were compared to the reference score. The ROC curve for shoulder-pathology detection and correlations with clinical questionnaires were calculated. The B-B Score (hand to the Back and hand upwards as to change a Bulb) showed no difference from the P Score in the time*score interaction (P > .05), and its relation with the reference score was highly linear (R² > .97). Absolute values of correlations with clinical questionnaires ranged from 0.51 to 0.77. Sensitivity was 97% and specificity 94%. The B-B and reference scores are equivalent for the measurement of group responses. The validated simplified scoring model presents practical advantages that facilitate the objective evaluation of shoulder function in clinical practice.
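The reported linearity between the simplified and reference scores (R² > .97) corresponds to the coefficient of determination of a least-squares fit, which can be sketched as follows (hypothetical data; not the study's analysis code):

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination of the least-squares line y ~ a*x + b."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a, b = np.polyfit(x, y, 1)           # slope and intercept of the fit
    ss_res = ((y - (a * x + b)) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot
```

An R² near 1 means the simplified score is, up to an affine rescaling, interchangeable with the reference score for group-level comparisons.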

Relevance: 20.00%

Abstract:

The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and of the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected, so this risk assessment lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure with complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered the most important entrance pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protection means in industry. It consisted of targeted phone interviews with health and safety officers of Swiss companies believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed on the basis of the results of the pilot study.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted to study the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy in measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), a portable aerosol spectrometer (PAS), a device to estimate the aerodynamic diameter, as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2 and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 SUVA clients (Swiss Accident Insurance Fund).
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were shown between diffusion size classifiers and CPC/SMPS.
- The comparison between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better but, depending on the concentration, size or type of the powders measured, the differences were still of a high order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes the handling of the background concentration difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of process-related exposure.
Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use high quantities of them.
The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods for measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.

Relevance: 20.00%

Abstract:

We have designed and built an experimental device, which we call a "thermoelectric bridge." Its primary purpose is the simultaneous measurement of the relative Peltier and Seebeck coefficients. With this device the systematic errors for both coefficients are equal, and no manipulation is necessary between the measurement of one coefficient and the other. The device is therefore especially suitable for verifying the linear relation between them postulated by Lord Kelvin. Simultaneous measurement of thermal conductivity is also described in the text. A sample made up of the couple nickel-platinum was measured in the range of −20 to 60 °C, establishing the dependence of each coefficient on temperature, with nearly equal random errors of ±0.2% and systematic errors estimated at about 0.5%. The aforementioned Kelvin relation is verified in this range from these results, the behavioral deviations (about 0.3%) being contained within the ±0.5% uncertainty caused by the propagation of errors.
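The Kelvin relation that the bridge is designed to verify is, in its standard form (stated here from textbook thermoelectricity, not quoted from the paper):

```latex
\Pi_{AB}(T) = S_{AB}(T)\, T
```

where $\Pi_{AB}$ and $S_{AB}$ are the relative Peltier and Seebeck coefficients of the couple and $T$ is the absolute temperature. Verifying the relation over the measured range amounts to checking that the ratio $\Pi_{AB}/(S_{AB}T)$ equals 1 to within the propagated uncertainty, which is why equal systematic errors in the two coefficients are such a useful property of the device.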