906 results for Thrombophilia Risk Evaluation
Abstract:
Pharmacological treatment of hypertension represents a cost-effective way of preventing cardiovascular and renal complications. To benefit maximally from antihypertensive treatment, blood pressure should be brought to below 140/90 mmHg in every hypertensive patient, and even lower (< 130/80 mmHg) if diabetes or renal disease co-exists. Such targets cannot usually be reached with monotherapies. This is especially true in patients who present with a high cardiovascular risk. The co-administration of two agents acting by different mechanisms considerably increases the blood pressure control rate. Such combinations are not only efficacious but also well tolerated, and some fixed low-dose combinations even have placebo-like tolerability. This is the case for the preparation containing the angiotensin-converting enzyme inhibitor perindopril (2 mg) and the diuretic indapamide (0.625 mg), a fixed low-dose combination that has been shown in controlled trials to be more effective than monotherapies in reducing albuminuria, regressing cardiac hypertrophy and reducing the stiffness of large arteries. Using this combination to initiate antihypertensive therapy has been shown in a double-blind trial (Strategies of Treatment in Hypertension: Evaluation; STRATHE) to normalize blood pressure (< 140/90 mmHg) in significantly more patients (62%) than a sequential monotherapy approach based on atenolol, losartan and amlodipine (49%) or a stepped-care strategy based on valsartan and hydrochlorothiazide (47%), with no difference between the three treatment arms in terms of tolerability. An ongoing randomized trial (Action in Diabetes and Vascular Disease: Preterax and Diamicron Modified Release Controlled Evaluation; ADVANCE) is assessing, in a 2 x 2 factorial design, the effects of the fixed-dose perindopril-indapamide combination and of an intensive gliclazide modified-release-based glucose control regimen in type 2 diabetic patients with or without hypertension.
A total of 11,140 patients were randomly assigned. Within the first 6 weeks of treatment (run-in phase), the perindopril-indapamide combination lowered blood pressure from 145/81 ± 22/11 mmHg (mean ± SD) to 137/78 ± 20/10 mmHg. Fixed-dose combinations are becoming increasingly popular for the management of hypertension, and are even proposed by hypertension guidelines as a first-line option for treating hypertensive patients.
Abstract:
Lifting is said to be one of the major risk factors for the onset of low back pain, and several different measures have been developed to study it. Several programs are available to measure these components, to determine the ability of an individual to perform a certain job, or to discover whether the job creates dangerous positions for the worker. Reliable and valid instruments exist in these fields, but they are costly and time-consuming. We present a simplified functional capacity measurement that we use daily in practice. Method: 280 patients were evaluated on this basis. The majority were referred for multidisciplinary rehabilitation treatment. The patients had had recurrent back problems for months or years. Inclusion criteria were age between 18 and 64 years, currently off work, and no workers' compensation claim. The exclusion criterion was chronic low back pain with a specific cause. Patients underwent a one-hour functional capacity evaluation at the end of the multidisciplinary treatment period, which was compared with the PILE test performed at the beginning and at the end. Results: We included 280 subjects: 160 men and 120 women. Mean age was 43.6 years for the women and 44 years for the men. We studied floor-to-hip and hip-to-shoulder carrying, 5-m carrying, pushing and pulling, and the global weight carried during the test. We found this global value to be 696 kg for men and 422 kg for women suffering from chronic lumbar pain. An increase in this value was clearly associated with greater work ability, as a decrease was with lesser. Conclusions: We were able to develop a lifting capacity program that is easy to reproduce and inexpensive, giving us an idea of how to reorient patients according to their workplace and their capacities. It also provides information on work performance and power consumption. It should be tested further and compared with standard capacity in a healthy population.
Abstract:
An elevated risk of fatal and non-fatal cardiovascular events is associated with a high prevalence of peripheral arterial disease, which can be assessed through the ankle-brachial index (ABI). This study aimed to demonstrate that the ABI and the Edinburgh Claudication Questionnaire are tools that can be used by nurses in the prevention and/or treatment of cardiovascular disease (CVD). A cross-sectional study was carried out with patients from a cardiovascular clinic. The Edinburgh Claudication Questionnaire was applied and the ABI was calculated with the formula ABI = ankle blood pressure / brachial blood pressure. A total of 115 patients were included; most were female (57.4%), aged 60.6 ± 12.5 years. The most prevalent risk factors were hypertension (64.3%), physical inactivity (48.7%) and family history (58.3%). The study showed that an abnormal ABI was frequently found, and 42.6% of the patients with an abnormal ABI showed intermittent claudication. The ABI, combined with the Edinburgh Claudication Questionnaire, can easily be used by nurses in the clinical evaluation of asymptomatic and symptomatic CVD patients.
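The formula above is simple enough to sketch directly. The helper names below are hypothetical, and the 0.90-1.40 normal range is a commonly cited cutoff that this abstract itself does not state:

```python
def ankle_brachial_index(ankle_systolic: float, brachial_systolic: float) -> float:
    """ABI = ankle systolic blood pressure / brachial systolic blood pressure,
    as given in the abstract's formula."""
    return ankle_systolic / brachial_systolic


def is_abnormal(abi: float, low: float = 0.90, high: float = 1.40) -> bool:
    # 0.90-1.40 is a commonly cited normal range (an assumption here;
    # the study does not report which cutoffs it used).
    return abi < low or abi > high
```

For example, an ankle pressure of 108 mmHg against a brachial pressure of 120 mmHg gives an ABI of 0.90, at the lower edge of the assumed normal range.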
Abstract:
Objective: To investigate the neuromotor development of at-risk children between three and 12 months of life, administering the Brazilian version of the Harris Infant Neuromotor Test (HINT). Method: A longitudinal study with 78 children and 76 parents/guardians discharged from a neonatal intensive care unit in Fortaleza-CE, Brazil. Two instruments were administered between July 2009 and August 2010: the HINT and a socioeconomic questionnaire. Data from 55 preterm and 23 term children were analyzed. Results: The final mean scores ranged from 14.6 to 25.2 for preterm and from 11.2 to 24.7 for term children, showing that 91% of the children demonstrated good neuromotor performance; seven premature infants showed alterations, which led to the referral of three children to a specialized clinic for examination and diagnosis. Conclusion: The test allowed nurses to assess infant development, identify deviations early, and plan interventions.
Abstract:
Two main approaches are commonly used to empirically evaluate linear factor pricing models: regression and SDF methods, with centred and uncentred versions of the latter. We show that unlike standard two-step or iterated GMM procedures, single-step estimators such as continuously updated GMM yield numerically identical values for prices of risk, pricing errors, Jensen's alphas and overidentifying restrictions tests irrespective of the model validity. Therefore, there is arguably a single approach regardless of the factors being traded or not, or the use of excess or gross returns. We illustrate our results by revisiting Lustig and Verdelhan's (2007) empirical analysis of currency returns.
Abstract:
We address the question of whether growth and welfare can be higher in crisis-prone economies. First, we show that there is a robust empirical link between per-capita GDP growth and negative skewness of credit growth across countries with active financial markets. That is, countries that have experienced occasional crises have grown on average faster than countries with smooth credit conditions. We then present a two-sector endogenous growth model in which financial crises can occur, and analyze the relationship between financial fragility and growth. The underlying credit market imperfections generate borrowing constraints, bottlenecks and low growth. We show that under certain conditions endogenous real exchange rate risk arises and firms find it optimal to take on credit risk in the form of currency mismatch. Along such a risky path average growth is higher, but self-fulfilling crises occur occasionally. Furthermore, we establish conditions under which the adoption of credit risk is welfare improving and brings the allocation nearer to the Pareto optimal level. The design of the model is motivated by several features of recent crises: credit risk in the form of foreign currency denominated debt; costly crises that generate fire sales and widespread bankruptcies; and asymmetric sectoral responses, where the nontradables sector falls more than the tradables sector in the wake of crises.
Abstract:
Spatial evaluation of Culicidae (Diptera) larvae from different breeding sites: application of a geospatial method and implications for vector control. This study investigates the spatial distribution of urban Culicidae and informs entomological monitoring of species that use artificial containers as larval habitats. Collections of mosquito larvae were conducted in the São Paulo State municipality of Santa Bárbara d'Oeste between 2004 and 2006 during house-to-house visits. A total of 1,891 samples comprising nine species were collected. Species distribution was assessed using the kriging statistical method by extrapolating municipal administrative divisions. The sampling method followed the norms of the municipal health services of the Ministry of Health and can thus be adopted by public health authorities for disease control and the delimitation of risk areas. Moreover, this type of survey and analysis can be employed for entomological surveillance of urban vectors that use artificial containers as larval habitat.
Abstract:
Mining in the State of Minas Gerais, Brazil, is one of the activities with the strongest impact on the environment, in spite of its economic importance. Among mining activities, acid drainage poses a serious environmental problem because of its widespread occurrence in gold-extracting areas. It originates from metal-sulfide oxidation, which causes water acidification, increasing the risk of toxic element mobilization and water resource pollution. This research aimed to evaluate the acid drainage problem in Minas Gerais State. The study began with a bibliographic survey at FEAM (Environment Foundation of Minas Gerais State) to identify mining sites where sulfides occur. Substrate samples were collected from these sites to determine the AP (acidity potential) and NP (neutralization potential). The AP was evaluated from the total sulfide content and by hydrogen peroxide oxidation followed by acidity titration. The NP was evaluated as the calcium carbonate equivalent. Petrographic thin sections were also mounted and described, with special attention to sulfides and carbonates. Based on the chemical analyses, the acid-base accounting (ABA) was determined as the difference between AP and NP, and the acid drainage potential was obtained from the ABA value and the total volume of material at each site. The results allowed the identification of substrates with the potential to generate acid drainage in Minas Gerais State. Altogether, these activities have the potential to produce between 3.1 and 10.4 billion m³ of water at pH 2, or 31.4 to 103.7 billion m³ of water at pH 3. This, in turn, would imply costs of US$7.8 to 25.9 million to neutralize the acidity with commercial limestone. These figures are probably underestimates, because some mines were not surveyed and, in other cases, surface samples may not represent reality. A more reliable state-wide evaluation of the acid drainage potential would require further studies, including a larger number of samples.
Such investigations should consider other mining operations beyond the scope of this study, as well as the kinetics of acid generation by simulated weathering procedures.
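The acid-base accounting step described above can be sketched as follows. The sign convention (NP minus AP, with negative values meaning net acid-generating) and the per-tonne units are assumptions, since the abstract only states that ABA is the difference between AP and NP:

```python
def acid_base_accounting(ap: float, np_value: float) -> float:
    """Acid-base accounting, here taken as NP - AP (e.g. in kg CaCO3
    equivalent per tonne). A negative result is read as a net
    acid-generating potential; this sign convention is an assumption."""
    return np_value - ap


def site_drainage_potential(aba: float, material_mass_t: float) -> float:
    # Scales the per-tonne balance by the total mass of material at a site,
    # mirroring the abstract's combination of the ABA value with the
    # total volume of material.
    return aba * material_mass_t
```

A substrate with AP = 30 and NP = 10 thus has an ABA of -20, and a million tonnes of such material would carry a correspondingly large net acid-generating potential.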
Abstract:
The trabecular bone score (TBS) is a new parameter that is determined from gray-level analysis of dual-energy X-ray absorptiometry (DXA) images. It relies on the mean thickness and volume fraction of trabecular bone microarchitecture. This was a preliminary case-control study to evaluate the potential diagnostic value of TBS as a complement to bone mineral density (BMD), by comparing postmenopausal women with and without fractures. The sample consisted of 45 women with osteoporotic fractures (5 hip fractures, 20 vertebral fractures, and 20 other types of fracture) and 155 women without a fracture. Stratification was performed, taking into account each type of fracture (except hip), and women with and without fractures were matched for age and spine BMD. BMD and TBS were measured at the total spine. TBS measured at the total spine revealed a significant difference between the fracture and age- and spine BMD-matched nonfracture group, when considering all types of fractures and vertebral fractures. In these cases, the diagnostic value of the combination of BMD and TBS likely will be higher compared with that of BMD alone. TBS, as evaluated from standard DXA scans directly, potentially complements BMD in the detection of osteoporotic fractures. Prospective studies are necessary to fully evaluate the potential role of TBS as a complementary risk factor for fracture.
Abstract:
PURPOSE: To develop and assess the diagnostic performance of a three-dimensional (3D) whole-body T1-weighted magnetic resonance (MR) imaging pulse sequence at 3.0 T for bone and node staging in patients with prostate cancer. MATERIALS AND METHODS: This prospective study was approved by the institutional ethics committee; informed consent was obtained from all patients. Thirty patients with prostate cancer at high risk for metastases underwent whole-body 3D T1-weighted imaging in addition to the routine MR imaging protocol for node and/or bone metastasis screening, which included coronal two-dimensional (2D) whole-body T1-weighted MR imaging, sagittal proton-density fat-saturated (PDFS) imaging of the spine, and whole-body diffusion-weighted MR imaging. Two observers read the 2D and 3D images separately in a blinded manner for bone and node screening. Images were read in random order. The consensus review of MR images and the findings at prospective clinical and MR imaging follow-up at 6 months were used as the standard of reference. The interobserver agreement and diagnostic performance of each sequence were assessed on per-patient and per-lesion bases. RESULTS: The signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were significantly higher with whole-body 3D T1-weighted imaging than with whole-body 2D T1-weighted imaging regardless of the reference region (bone or fat) and lesion location (bone or node) (P < .003 for all).
For node metastasis, diagnostic performance (area under the receiver operating characteristic curve) was higher for whole-body 3D T1-weighted imaging (per-patient analysis; observer 1: P < .001 for 2D T1-weighted imaging vs 3D T1-weighted imaging, P = .006 for 2D T1-weighted imaging + PDFS imaging vs 3D T1-weighted imaging; observer 2: P = .006 for 2D T1-weighted imaging vs 3D T1-weighted imaging, P = .006 for 2D T1-weighted imaging + PDFS imaging vs 3D T1-weighted imaging), as was sensitivity (per-lesion analysis; observer 1: P < .001 for 2D T1-weighted imaging vs 3D T1-weighted imaging, P < .001 for 2D T1-weighted imaging + PDFS imaging vs 3D T1-weighted imaging; observer 2: P < .001 for 2D T1-weighted imaging vs 3D T1-weighted imaging, P < .001 for 2D T1-weighted imaging + PDFS imaging vs 3D T1-weighted imaging). CONCLUSION: Whole-body MR imaging is feasible with a 3D T1-weighted sequence and provides better SNR and CNR compared with 2D sequences, with a diagnostic performance that is as good or better for the detection of bone metastases and better for the detection of lymph node metastases.
Abstract:
In urban communities, there are often limited amounts of right-of-way available for establishing a large setback distance from the curb for fixed objects. Urban communities must constantly weigh the cost of purchasing additional right-of-way for clear zones against the risk of fixed object crashes. From 2004 to 2006, this type of crash on curbed roads represented 15% of all fatal crashes and 3% of all crashes in the state of Iowa. Many states have kept the current minimum AASHTO recommendations as their minimum clear zone standards; however, other states have decided that these recommendations are insufficient and have increased the required minimum clear zone distance to better suit the judgment of local designers. This report presents research on the effects of the clear zone on urban curbed streets. The research was conducted in two phases. The first phase involved a synthesis of practice that included a literature review and a survey of practices in jurisdictions with developmental and historical patterns similar to those of Iowa. The second phase involved investigating the benefits of a 10 ft clear zone, which included examining urban corridors in Iowa that meet or do not meet the 10 ft clear zone goal. The results of this study indicate that a consistent fixed object offset results in a reduction in the number of fixed object crashes, that a 5 ft clear zone is most effective when the goal is to minimize the number of fixed object crashes, and that a 3 ft clear zone is most effective when the goal is to minimize the cost of fixed object crashes.
Abstract:
Measurement of blood pressure by the physician remains an essential step in the evaluation of cardiovascular risk. Ambulatory measurement and self-measurement of blood pressure are ways of counteracting the "white coat" effect, the rise in blood pressure many patients experience in the presence of doctors. It is thus possible to define the cardiovascular risk of hypertension and identify the patients most likely to benefit from antihypertensive therapy. However, it must be realised that subjects who are normotensive during their everyday activities but hypertensive in the doctor's surgery may become sustainedly hypertensive with time, irrespective of the means used to measure blood pressure. These patients should be followed up regularly even if the decision to treat has been postponed.
Abstract:
Moisture sensitivity of Hot Mix Asphalt (HMA) mixtures, generally called stripping, is a major form of distress in asphalt concrete pavement. It is characterized by loss of the adhesive bond between the asphalt binder and the aggregate (a failure of the bonding of the binder to the aggregate) or by softening of the cohesive bonds within the asphalt binder (a failure within the binder itself), both due to the action of traffic loading in the presence of moisture. Tests for evaluating HMA moisture sensitivity fall into two categories: visual inspection tests and mechanical tests. Most of them, however, were developed under pre-Superpave mix designs. This research was undertaken to develop a protocol for evaluating the moisture sensitivity potential of HMA mixtures using the Nottingham Asphalt Tester (NAT). The mechanisms of HMA moisture sensitivity were reviewed and test protocols using the NAT were developed. Different types of blends, grouped as moisture-sensitive and non-moisture-sensitive, were used to evaluate the potential of the proposed test. The test results were analyzed with three performance-based parameters: the retained flow number based on critical permanent deformation failure (RFNP), the retained flow number based on cohesion failure (RFNC), and the energy ratio (ER). Analysis based on the energy ratio of elastic strain (EREE) at the flow number of cohesion failure (FNC) showed higher potential to evaluate HMA moisture sensitivity than the other parameters. If the measurement error in the data-acquisition process were removed, analyses based on RFNP and RFNC would also have high potential. The vacuum saturation used in AASHTO T 283 and in the proposed test risks damaging the specimen before load application.
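A retained-performance parameter of the kind named above (RFNP, RFNC) is a ratio of a moisture-conditioned value to the corresponding dry value. The sketch below illustrates that idea only; the 0.80 acceptance threshold is borrowed from the tensile strength ratio criterion of AASHTO T 283, not from this study:

```python
def retained_ratio(conditioned: float, unconditioned: float) -> float:
    """Generic retained-performance ratio: the value measured on a
    moisture-conditioned specimen divided by the value on a dry one.
    Values near 1 suggest low moisture sensitivity."""
    return conditioned / unconditioned


def passes_moisture_check(conditioned: float, unconditioned: float,
                          threshold: float = 0.80) -> bool:
    # The 0.80 threshold is an illustrative acceptance level taken from
    # AASHTO T 283's tensile strength ratio, used here only as an example.
    return retained_ratio(conditioned, unconditioned) >= threshold
```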
Abstract:
The occupational health risk involved in handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not collected systematically, so this risk assessment lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered the most important entry pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protection means in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies that were believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed on the basis of the results of the pilot study.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on methods to measure nanoparticles. Several pre-studies were conducted to study the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy in measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), a portable aerosol spectrometer (PAS, a device to estimate the aerodynamic diameter), as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (> 1,000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2 and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1,626 SUVA clients.
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. Extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1,309 workers (95% confidence interval: 1,073 to 1,545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1,027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods therefore focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bimodal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were found between diffusion size classifiers and CPC/SMPS.
- The comparison between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better, but depending on the concentration, size or type of the powders measured, the differences could still reach an order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes handling of the background concentration difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of process-related exposure.
Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use them in high quantities.
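The extrapolated percentages with confidence intervals quoted above are of the kind produced by a simple proportion estimate. Below is a sketch using a normal-approximation (Wald) interval; the thesis's exact survey-weighted estimator is not given here, so this is only illustrative:

```python
import math

def proportion_ci(count: int, total: int, z: float = 1.96):
    """Point estimate and normal-approximation 95% confidence interval
    for a proportion, e.g. exposed workers out of all surveyed workers.
    (Assumption: a simple random sample; the thesis used a survey design
    whose exact estimator is not reproduced here.)"""
    p = count / total
    se = math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - z * se), p + z * se
```

For instance, 50 exposed workers out of 1,000 surveyed gives an estimate of 5% with an interval of roughly 3.6% to 6.4%.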
The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods for measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin. Zusammenfassung (German summary): The occupational health risk of nanoparticles at the workplace is the probability that a worker suffers a health impairment when exposed to this material; it is usually calculated as the product of hazard and exposure. A thorough assessment of the possible risks of nanomaterials therefore requires information on the release of such materials into the environment on the one hand, and on the exposure of workers on the other. Much of this information is not yet collected systematically and is therefore missing for risk analyses. The aim of this thesis was to create the basis for a quantitative estimate of occupational exposure to nanoparticles and to evaluate the methods needed to measure such exposure.
The study was to investigate the extent to which nanoparticles are already used in Swiss industry, how many workers potentially come into contact with them, and whether the measurement technology for the necessary workplace exposure measurements is already adequate. It focused on exposure to airborne particles, because inhalation is regarded as the main route of entry for particles into the body. The thesis builds on three phases: a qualitative survey (pilot study), a representative Swiss survey, and several technical studies serving the specific understanding of the possibilities and limits of individual measurement devices and techniques. The qualitative telephone survey was conducted as a preliminary study for a national, representative survey of Swiss industry. It aimed at information on the occurrence of nanoparticles and on the protective measures applied. The study consisted of targeted telephone interviews with occupational health and safety specialists of Swiss companies. The companies were selected on the basis of publicly available information indicating that they handle nanoparticles. The second part of the thesis was the representative study to evaluate the prevalence of nanoparticle applications in Swiss industry. This study built on the information from the pilot study and was carried out with a representative selection of companies insured by the Swiss National Accident Insurance Fund (SUVA), thereby covering the majority of Swiss companies in the industrial sector. The third part of the thesis focused on the methodology for measuring nanoparticles. Several preliminary studies were carried out to probe the limits of frequently used nanoparticle measurement devices when they have to measure larger quantities of nanoparticle agglomerates.
This focus was chosen for two reasons: several discussions with users and with the producer of the measurement devices suggested a weak point there, raising doubts about the accuracy of the instruments, and the two survey studies had shown that such nanoparticle agglomerates occur frequently. A first preliminary study addressed the accuracy of the scanning mobility particle sizer (SMPS). In the presence of nanoparticle agglomerates, this instrument displayed an implausible bimodal particle size distribution. A series of short experiments followed, concentrating on other measurement devices and their problems when measuring nanoparticle agglomerates: the condensation particle counter (CPC), the portable aerosol spectrometer (PAS, a device for estimating the aerodynamic diameter of particles), and the diffusion size classifier were tested. Finally, some initial feasibility tests were carried out to determine the efficiency of filter-based measurement of airborne carbon nanotubes (CNT). The pilot study delivered a detailed picture of the types and quantities of nanoparticles used in Swiss companies and showed the state of knowledge of the interviewed health and safety specialists. The contacted companies declared the following types of nanoparticles as maximum quantities (> 1,000 kg per year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2 and ZnO (mainly first-generation nanoparticles). The quantities of nanoparticles in use varied widely, with a median of 100 kg per year. In the quantitative questionnaire study, 1,626 companies, all clients of the Swiss National Accident Insurance Fund (SUVA), were contacted by post. The results of the survey allowed an estimate of the number of companies and workers using nanoparticles in Switzerland.
Extrapolation to the Swiss industrial sector gave the following picture: in 586 companies (95% confidence interval: 145 to 1,027 companies), 1,309 workers are potentially exposed to nanoparticles (95% CI: 1,073 to 1,545). These figures correspond to 0.6% of Swiss companies (95% CI: 0.2% to 1.1%) and 0.08% of the workforce (95% CI: 0.06% to 0.09%). Some well-established technologies exist for measuring airborne concentrations of sub-micrometre particles. There were doubts, however, about the extent to which these technologies can also be used to measure engineered nanoparticles. For this reason, the studies preparing the workplace assessments focused on measuring powders containing nanoparticle agglomerates. They allowed the identification of the following possible sources of erroneous measurements at workplaces with elevated airborne concentrations of nanoparticle agglomerates:
- A standard SMPS showed an implausible bimodal particle size distribution when measuring larger nanoparticle agglomerates.
- Large differences, in the range of a factor of a thousand, were found between a diffusion size classifier and several CPCs (or the SMPS).
- The differences between CPC/SMPS and the PAS were smaller but, depending on the size or type of the powder measured, could still amount to a good order of magnitude.
- Specific difficulties and uncertainties in workplace measurements were identified: background particles can interact with particles released during a work process, and such interactions make it difficult to account correctly for the background particle concentration in the measurement data.
- Electric motors produce large quantities of nanoparticles and can thus interfere with the measurement of process-related exposure.
Conclusion: The surveys showed that nanoparticles are already a reality in Swiss industry and that some companies already use large quantities of them. The representative survey moderated this striking finding somewhat by showing that the number of such companies across Swiss industry as a whole is relatively small. In most branches (especially outside the chemical industry) few or no applications were found, which suggests that the introduction of this new technology is only at the beginning of its development. Even though the number of potentially exposed workers is still relatively small, the study nevertheless underscores the need for exposure measurements at these workplaces. Knowledge of the exposure, and of how to measure it correctly, is very important, above all because the possible health effects are not yet fully understood. The evaluation of several devices and methods showed, however, that there is still a need to catch up: before larger measurement studies can be carried out, the devices and methods must be validated for use with nanoparticle agglomerates.
Abstract:
In this paper we propose a highly accurate approximation procedure for ruin probabilities in the classical collective risk model, based on a quadrature/rational approximation procedure proposed in [2]. For a certain class of claim size distributions (which contains the completely monotone distributions) we give a theoretical justification for the method. We also show that, under weaker assumptions on the claim size distribution, the method may still perform reasonably well in some cases. This in particular provides an efficient alternative to a related method proposed in [3]. A number of numerical illustrations of the performance of this procedure are provided for both completely monotone and other types of random variables.
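The quantity being approximated, the ruin probability of the classical (Cramér-Lundberg) collective risk model, can be illustrated with a brute-force Monte Carlo sketch. This is not the paper's quadrature/rational approximation procedure; the claim distribution and parameters below are arbitrary examples:

```python
import random

def ruin_probability_mc(u, lam, c, claim_sampler, horizon,
                        n_paths=5000, seed=0):
    """Monte Carlo estimate of the finite-horizon ruin probability in the
    classical model U(t) = u + c*t - S(t), where claims arrive as a Poisson
    process with rate `lam` and S(t) is their running sum. Ruin can only
    occur at a claim instant, so the surplus is checked only there."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, total_claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)   # exponential inter-arrival time
            if t > horizon:
                break                   # path survived the horizon
            total_claims += claim_sampler(rng)
            if u + c * t - total_claims < 0:
                ruined += 1
                break
    return ruined / n_paths
```

For exponential claims with mean 1, claim rate lam = 1 and premium rate c = 1.5 (safety loading 0.5), the infinite-horizon ruin probability has the well-known closed form (1/(1+θ))·exp(-θu/((1+θ)μ)) with θ = 0.5 and μ = 1, against which the estimate can be sanity-checked.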