Abstract:
The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected, so this risk assessment lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered to be the most important entrance pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the protective measures applied in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies that were believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed on the basis of the results of the pilot study.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted on the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy in measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), the portable aerosol spectrometer (PAS), a device to estimate the aerodynamic diameter, as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 clients of SUVA (the Swiss National Accident Insurance Fund).
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bimodal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were shown between diffusion size classifiers and CPC/SMPS.
- The comparison between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better but, depending on the concentration, size, or type of the powders measured, the differences were still up to an order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes accounting for the background concentration difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of the process-related exposure.
Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use high quantities of them.
The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods of measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.
Summary: The occupational health risk of nanoparticles is the probability that a worker suffers a possible health effect when exposed to the substance; it is usually calculated as the product of hazard and exposure. A thorough assessment of the possible risks of nanomaterials therefore requires information on the release of such materials into the environment on the one hand and on the exposure of workers on the other. Much of this information is still not collected systematically and is therefore missing from risk analyses. The aim of this thesis was to create the basis for a quantitative estimation of occupational exposure to nanoparticles and to evaluate the methods needed to measure such exposure.
The study set out to determine to what extent nanoparticles are already used in Swiss industry, how many workers potentially come into contact with them, and whether the available measurement technology is adequate for the necessary workplace exposure measurements. It focused on exposure to airborne particles, because respiration is regarded as the main entry route for particles into the body. The thesis is built on three phases: a qualitative survey (pilot study), a representative Swiss survey, and several technical studies serving the specific understanding of the possibilities and limits of individual measurement devices and techniques. The qualitative telephone survey was conducted as a preliminary study for a national, representative survey of Swiss industry. It aimed at information on the occurrence of nanoparticles and on the protective measures applied. The study consisted of targeted telephone interviews with occupational health and safety specialists of Swiss companies. The companies were selected on the basis of publicly accessible information indicating that they handle nanoparticles. The second part of the thesis was the representative study evaluating the prevalence of nanoparticle applications in Swiss industry. The study built on information from the pilot study and was carried out with a representative selection of companies insured by the Swiss National Accident Insurance Fund (SUVA), thereby covering the majority of Swiss companies in the industrial sector. The third part of the thesis focused on the methodology for measuring nanoparticles. Several preliminary studies were conducted to explore the limits of commonly used nanoparticle measurement devices when they have to measure larger quantities of nanoparticle agglomerates.
This focus was chosen for two reasons: because several discussions with users and with the producer of the measurement devices suggested a weak point there, raising doubts about the accuracy of the devices, and because the two survey studies showed that such nanoparticle agglomerates occur frequently. A first preliminary study addressed the accuracy of the Scanning Mobility Particle Sizer (SMPS). In the presence of nanoparticle agglomerates this device displayed an implausible bimodal particle size distribution. A series of short experiments followed, concentrating on other measurement devices and their problems when measuring nanoparticle agglomerates: the condensation particle counter (CPC), the portable aerosol spectrometer (PAS), a device for estimating the aerodynamic diameter of particles, and the diffusion size classifier were tested. Finally, some first feasibility tests were carried out to determine the efficiency of filter-based measurement of airborne carbon nanotubes (CNT). The pilot study delivered a detailed picture of the types and quantities of nanoparticles used in Swiss companies and showed the state of knowledge of the interviewed health and safety specialists. The following types of nanoparticles were declared by the contacted companies as maximal quantities (> 1'000 kg per year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation nanoparticles). The quantities of nanoparticles used varied widely, with a median of 100 kg per year. In the quantitative questionnaire study, 1'626 companies, all clients of the Swiss National Accident Insurance Fund (SUVA), were contacted by mail. The results of the survey allowed an estimation of the number of companies and workers using nanoparticles in Switzerland.
The extrapolation to the Swiss industrial sector gave the following picture: in 586 companies (95% confidence interval: 145 to 1'027 companies), 1'309 workers are potentially exposed to nanoparticles (95% CI: 1'073 to 1'545). These numbers correspond to 0.6% of Swiss companies (95% CI: 0.2% to 1.1%) and 0.08% of the workforce (95% CI: 0.06% to 0.09%). Some well-established technologies exist for measuring airborne concentrations of submicrometre particles. It was doubtful, however, to what extent these technologies can also be used to measure engineered nanoparticles. For this reason, the preparatory studies for the workplace assessments focused on the measurement of powders containing nanoparticle agglomerates. They allowed the identification of the following possible sources of erroneous measurements at workplaces with elevated airborne concentrations of nanoparticle agglomerates:
- A standard SMPS showed an implausible bimodal particle size distribution when measuring larger nanoparticle agglomerates.
- Large differences, in the range of a factor of a thousand, were found between a diffusion size classifier and several CPCs (or the SMPS, respectively).
- The differences between CPC/SMPS and the PAS were smaller but, depending on the size or type of the powder measured, were still of the order of a full order of magnitude.
- Specific difficulties and uncertainties in workplace measurements were identified: background particles can interact with particles released during a work process; such interactions make it difficult to correctly account for the background particle concentration in the measurement data.
- Electric motors produce large numbers of nanoparticles and can thus confound the measurement of the process-related exposure.
Conclusion: The surveys showed that nanoparticles are already a reality in Swiss industry and that some companies already use large quantities of them. The representative survey, however, put this finding into perspective by showing that the number of companies involved across Swiss industry as a whole is relatively small. In most branches (especially outside the chemical industry) few or no applications were found, which suggests that the introduction of this new technology is only at the beginning of its development. Even though the number of potentially exposed workers is still relatively small, the study nevertheless underscores the need for exposure measurements at these workplaces. Knowledge of the exposure, and of how to measure it correctly, is very important, especially because the possible health effects are not yet fully understood. The evaluation of several devices and methods showed, however, that there is still catching up to do: before larger measurement studies can be carried out, the devices and methods must be validated for use with nanoparticle agglomerates.
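The worker-count interval above comes from a survey extrapolation. As a rough illustration of how such a confidence interval behaves, here is a normal-approximation interval for a proportion, scaled back to counts; the workforce denominator is hypothetical, and the thesis's actual interval also reflects the survey design, so this is only a sketch:

```python
import math

def count_ci(count, total, z=1.96):
    """Normal-approximation 95% CI for a proportion, scaled back to counts."""
    p = count / total
    se = math.sqrt(p * (1 - p) / total)
    return ((p - z * se) * total, (p + z * se) * total)

# Hypothetical denominator: ~1.6 million workers in the production sector.
low, high = count_ci(1309, 1_600_000)
```

With these inputs the simple binomial interval is narrower than the published 1'073 to 1'545, which is expected: extrapolating from a stratified company sample adds design variance on top of binomial variance.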
Abstract:
BACKGROUND: Deep burn assessment made by clinical evaluation has an accuracy varying between 60% and 80% and determines whether a burn injury will need tangential excision and skin grafting or whether it will be able to heal spontaneously. Laser Doppler Imaging (LDI) techniques allow an improved burn depth assessment, but their use is limited by the time-consuming image acquisition, which may take up to 6 min per image. METHODS: To evaluate the effectiveness and reliability of a newly developed full-field LDI technology, 15 consecutive patients presenting with intermediate-depth burns were assessed both clinically and by the FluxExplorer LDI technology, and the two methods of assessment were compared. RESULTS: Image acquisition was completed within 6 s. The FluxExplorer LDI technology achieved a significantly improved accuracy of burn depth assessment compared with the clinical judgement of board-certified plastic and reconstructive surgeons (P < 0.05; 93% of burn injuries correctly assessed vs. 80% for clinical assessment). CONCLUSION: Technological improvements of LDI leading to decreased image acquisition time and reliable burn depth assessment allow the routine use of such devices in the acute setting of burn care without interfering with the patient's treatment. Rapid and reliable LDI technology may assist clinicians in burn depth assessment and may limit the morbidity of burn patients through minimization of the area of surgical debridement. Future technological improvements allowing miniaturization of the device will further ease its clinical application.
Abstract:
OBJECTIVE: Reliable data about the nutrient intake of elderly noninstitutionalized women in Switzerland are lacking. The aim of this study was to assess the energy and nutrient intake in this specific population. SUBJECTS: The 401 subjects were randomly selected women with a mean age of 80.4 years (range 75-87) recruited from the Swiss SEMOF (Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk) cohort study. A validated food frequency questionnaire (FFQ) was submitted to the 401 subjects to assess dietary intake. RESULTS: The FFQ showed a mean daily energy intake of 1544 kcal (+/-447.7). Protein intake was 65.2 g (+/-19.9), that is, 1.03 g kg(-1) body weight per day. The mean daily intakes of energy, fat, carbohydrate, calcium, magnesium, and vitamins C, D and E were below the RNI, whereas protein, phosphorus, potassium, iron and vitamin B6 were above the RNI. CONCLUSION: The mean nutrient intake of these free-living Swiss elderly women was low compared with standards. Energy-dense foods rich in carbohydrate, magnesium, calcium, and vitamins D and E, as well as regular sunshine exposure, are recommended in order to optimise dietary intake.
Abstract:
Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as they did in the past. There are two main sources of variability in the process of development of the claims: the variability of the speed with which the claims are settled and the variability between the severity of the claims from different accident years. Large changes in these processes will generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator which first identifies and quantifies these two influences and second determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of stochastic models is that they provide measures of the variability of the reserve estimates. The first model (PDM) combines the conjugate Dirichlet-Multinomial family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). It was found that the second model makes it possible to express the variability of the speed of the reporting process and of the development of the claims severity as a function of two parameters of the above-mentioned distributions: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters were estimated by the method of moments and by maximum likelihood. The results were tested using chosen simulation data and then using real data originating from three lines of business: Property/Casualty, General Liability, and Accident Insurance.
These data include different developments and specificities. The outcome of the thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma distribution, the model exhibits positive correlation between past and future claims payments, which suggests the Chain-Ladder method is appropriate for the claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation implies high expectations for the future payments, resulting in high claims reserve estimates. Negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation where claims are reported rapidly and few further claims are expected subsequently. The extreme case arises when all claims are reported at the same time, leading to expectations for the future payments of zero or equal to the aggregated amount of the ultimate paid claims. For this latter case, the Chain-Ladder method is not recommended.
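The Chain-Ladder method discussed above is simple to state. A minimal sketch on a small hypothetical cumulative run-off triangle, using volume-weighted development factors; this is the classical deterministic method the thesis's stochastic PDM/NBDM models are compared against, not those models themselves:

```python
# Hypothetical cumulative claims triangle: rows = accident years,
# columns = development years; None marks future, unobserved cells.
triangle = [
    [100.0, 150.0, 165.0],
    [110.0, 160.0, None],
    [120.0, None, None],
]

def chain_ladder(tri):
    n = len(tri)
    # Volume-weighted development factors: f_j = sum C[i][j+1] / sum C[i][j]
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in tri if row[j + 1] is not None)
        den = sum(row[j] for row in tri if row[j + 1] is not None)
        factors.append(num / den)
    # Project the lower triangle by successive multiplication
    full = [row[:] for row in tri]
    for row in full:
        for j in range(n - 1):
            if row[j + 1] is None:
                row[j + 1] = row[j] * factors[j]
    # Reserve = projected ultimate minus latest observed cumulative payment
    reserves = [proj[-1] - max(c for c in obs if c is not None)
                for proj, obs in zip(full, tri)]
    return factors, reserves

factors, reserves = chain_ladder(triangle)
```

High observed cumulative payments propagate through the factors into high projected ultimates, which is exactly the positive-correlation behaviour the abstract attributes to the Chain-Ladder regime.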
Abstract:
Airway stenting is a common endoscopic procedure that is used to treat a variety of central airway lesions. Obstructions or fistulas involving the carina or nearby tracheobronchial structures require the use of specially designed stents, commonly referred to as Y-stents. Conventional methods of endobronchial Y-stent delivery are all characterized by a blind and apneic period during the procedure that carries the risk of stent misplacement or ventilation/oxygenation problems or both. Using combined suspension laryngoscopy, flexible bronchoscopy, and jet ventilation, we describe a technique that makes challenging bronchoscopic interventions--such as self-expandable Y-shaped airway stent delivery--easy, precise, and safe.
Abstract:
Background/Objective: Little is known about the precise role of parental migrant status (MS) and educational level (EL) on adiposity and various eating habits in young children. Therefore, we assessed their independent contributions in preschoolers. Subjects/Methods: Of 655 randomly selected preschoolers, 542 (5.1±0.6 years; 71% with parental MS and 37% with low parental EL) were analysed. Body composition was measured by bioelectrical impedance. Eating habits were assessed using a semiqualitative food frequency questionnaire and analysed according to five messages developed by the Swiss Society for Nutrition, based on factors implicated in childhood obesity: (1) 'Drinking water and decreasing sweetened drinks', (2) 'Eating fruit and vegetables', (3) 'Decreasing breakfast skipping', (4) 'Reducing fatty and sweet foods' and (5) 'Reducing the intake of meals and snacks in front of television'. Results: Children of migrant and low-EL parents had higher body fat, ate more meals and snacks while watching television, and had more fruit and fatty foods compared with their respective counterparts (all P<0.04). Children of low-EL parents also consumed less water and fewer vegetables compared with their counterparts (all P<0.04). In most instances, we found an independent contribution of parental MS and EL to adiposity and eating habits. A more pronounced effect was found if both parents were migrants or of low EL. Differences in adiposity and eating habits were relatively similar to the joint parental data when assessed individually for maternal and paternal MS and EL. Conclusions: Parental MS and EL are independently related to adiposity and various eating habits in preschoolers. European Journal of Clinical Nutrition advance online publication, 3 November 2010; doi:10.1038/ejcn.2010.248.
Abstract:
The safe and responsible development of engineered nanomaterials (ENM), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to the risk assessment of ENMs, which encompass the key parameters to characterise ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn: Due to the high batch variability in the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) studies or risk-assessment-based studies, widespread availability (and thus high expected volumes of use) or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data from the material as present in the products/matrices to which man and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods are only able to determine one single characteristic, and some of them can be rather expensive. Practically, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs).
It was recommended that at least two complementary techniques should be employed to determine a metric of ENMs. The first great challenge is to prioritise the metrics that are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to fully describe ENMs. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization should be initiated and that an exchange of protocols should take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with the current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
OBJECTIVE: Little is known regarding health-related quality of life and its relation with physical activity level in the general population. Our primary objective was to systematically review the data examining this relationship. METHODS: We systematically searched MEDLINE, EMBASE, CINAHL, and PsycINFO for health-related quality of life and physical-activity-related keywords in titles, abstracts, or indexing fields. RESULTS: From 1426 retrieved references, 55 citations were judged to require further evaluation. Fourteen studies were retained for data extraction and analysis; seven were cross-sectional studies, two were cohort studies, four were randomized controlled trials and one used a combined cross-sectional and longitudinal design. Thirteen different methods of physical activity assessment were used. Most health-related quality of life instruments were related to the Medical Outcome Study SF-36 questionnaire. Cross-sectional studies showed a consistently positive association between self-reported physical activity and health-related quality of life. The largest cross-sectional study reported an adjusted odds ratio of "having 14 or more unhealthy days" during the previous month of 0.40 (95% Confidence Interval 0.36-0.45) for those meeting recommended levels of physical activity compared with inactive subjects. Cohort studies and randomized controlled trials tended to show a positive effect of physical activity on health-related quality of life but, like the cross-sectional studies, had methodological limitations. CONCLUSION: Cross-sectional data showed a consistently positive association between physical activity level and health-related quality of life. Limited evidence from randomized controlled trials and cohort studies precludes a definitive statement about the nature of this association.
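The headline figure above is an adjusted odds ratio. As a sketch of the underlying statistic only, here is a crude odds ratio with its Wald confidence interval computed on a hypothetical 2×2 table; the published OR of 0.40 was adjusted for covariates via regression, which this crude calculation ignores:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio (a*d)/(b*c) with a Wald 95% CI on the log scale."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: a = active with >=14 unhealthy days, b = active without,
# c = inactive with >=14 unhealthy days, d = inactive without.
or_, lo, hi = odds_ratio_ci(120, 880, 250, 750)
```

An odds ratio below 1 with an upper confidence limit below 1, as here, is what supports the reported protective association.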
Abstract:
Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
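Neural-network prediction mapping of the kind reviewed above often uses General Regression Neural Networks, which reduce to Nadaraya-Watson kernel regression. A minimal 2-D sketch on synthetic points; the bandwidth sigma is a free parameter chosen arbitrarily here, and the data are invented for illustration:

```python
import math

def grnn_predict(train_xy, train_z, query, sigma=0.5):
    """Nadaraya-Watson / GRNN estimate: Gaussian-kernel weighted mean."""
    weights = [math.exp(-((x - query[0]) ** 2 + (y - query[1]) ** 2)
                        / (2 * sigma ** 2))
               for x, y in train_xy]
    return sum(w * z for w, z in zip(weights, train_z)) / sum(weights)

# Synthetic "measurements" at four grid points
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
vals = [1.0, 2.0, 2.0, 3.0]
estimate = grnn_predict(pts, vals, (0.5, 0.5))  # → 2.0 by symmetry
```

Unlike a fitted regression surface, this estimator interpolates directly from the data, which is why it pairs naturally with the spatial-statistics workflows described in the abstract.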
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing methods of declustering were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for extreme-value modeling through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) showed themselves to be better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to accomplish efficient indoor radon decision making.
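Two of the exploratory tools mentioned in this abstract, K Nearest Neighbors and cross-validation, combine naturally. A minimal sketch that chooses the neighbourhood size k for a 2-D KNN estimator by leave-one-out cross-validation on synthetic data; this illustrates the underlying idea, not the thesis's specific CVMF filter:

```python
def knn_predict(train, query, k):
    """Mean of the k nearest 2-D neighbours; train = [((x, y), z), ...]."""
    ranked = sorted(train,
                    key=lambda p: (p[0][0] - query[0]) ** 2
                                  + (p[0][1] - query[1]) ** 2)
    return sum(z for _, z in ranked[:k]) / k

def loo_mse(train, k):
    """Leave-one-out cross-validation mean squared error for a given k."""
    err = 0.0
    for i, (xy, z) in enumerate(train):
        rest = train[:i] + train[i + 1:]  # hold out one observation
        err += (knn_predict(rest, xy, k) - z) ** 2
    return err / len(train)

# Synthetic samples on a 5x5 grid where the value grows with x;
# the neighbourhood size k is then chosen by LOO-CV.
data = [((x, y), float(x)) for x in range(5) for y in range(5)]
best_k = min(range(1, 6), key=lambda k: loo_mse(data, k))
```

The same leave-one-out residuals that select k can also flag noisy observations, which is the intuition behind a cross-validation-based data filter.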
Abstract:
Autoantibodies are frequently determined in unclear clinical situations and in the context of an inflammatory syndrome. The aim of this article is not to review all autoantibodies in detail, but to discuss those used in clinical practice by describing their methods of detection and interpretation. Thus we will focus on antinuclear antibodies (ANA), which are typically associated with connective tissue diseases, as well as anti-neutrophil cytoplasmic antibodies (ANCA), which are useful in the diagnosis of ANCA-associated vasculitides. Due to its high sensitivity, indirect immunofluorescence is used as a screening test; when positive, ELISA is performed to search for antibodies more specifically associated with certain autoimmune diseases.
Abstract:
The coupling between synaptic activity and glucose utilization (neurometabolic coupling) is a central physiologic principle of brain function that has provided the basis for 2-deoxyglucose-based functional imaging with positron emission tomography. Approximately 10 y ago we provided experimental evidence that indicated a central role of glutamate signaling on astrocytes in neurometabolic coupling. The basic mechanism in neurometabolic coupling is the glutamate-stimulated aerobic glycolysis in astrocytes, such that the sodium-coupled reuptake of glutamate by astrocytes and the ensuing activation of the Na(+)-K(+) ATPase triggers glucose uptake and its glycolytic processing, which results in the release of lactate from astrocytes. Lactate can then contribute to the activity-dependent fueling of the neuronal energy demands associated with synaptic transmission. Analyses of this coupling have been extended in vivo and have defined the methods of coupling for inhibitory neurotransmission as well as its spatial extent in relation to the propagation of metabolic signals within the astrocytic syncytium. On the basis of a large body of experimental evidence, we proposed an operational model, "the astrocyte-neuron lactate shuttle." A series of results obtained by independent laboratories have provided further support for this model. This body of evidence provides a molecular and cellular basis for interpreting data that are obtained with functional brain imaging studies.
Abstract:
Reliable diagnoses of sepsis remain challenging in forensic pathology routine despite improved methods of sample collection and extensive biochemical and immunohistochemical investigations. Macroscopic findings may be elusive and have an infectious or non-infectious origin. Blood culture results can be difficult to interpret due to postmortem contamination or bacterial translocation. Lastly, peripheral and cardiac blood may be unavailable during autopsy. Procalcitonin, C-reactive protein, and interleukin-6 can be measured in biological fluids collected during autopsy and may be used as in clinical practice for diagnostic purposes. However, concentrations of these parameters may be increased due to etiologies other than bacterial infections, indicating that a combination of biomarkers could more effectively discriminate non-infectious from infectious inflammations. In this article, we propose a review of the literature pertaining to the diagnostic performance of classical and novel biomarkers of inflammation and bacterial infection in the forensic setting.
Abstract:
Owl pellets contain a good skeletal record of the small mammals consumed, and correspond to the undigested portions of prey which are regurgitated. These pellets are easy to find at the roosting site of owls. As it has been demonstrated that amplifiable DNA can be isolated from ancient bone remains, the possibility of using owl pellets as a source of DNA for small mammal genetics studies via the polymerase chain reaction has been investigated. The main uncertainties when isolating DNA from such a material are firstly the possibility that the extracted DNA would be too degraded during the digestion in the stomach of the owl, and secondly that extensive cross-contaminations could occur among the different prey consumed. The results obtained clearly demonstrate that cross-contamination does not occur, and that mitochondrial and nuclear DNA can be amplified using skulls of small mammals found in owl pellets as a source of DNA. The relative efficiency of two methods of DNA extraction is estimated and discussed. Thus, owl pellets represent a non-invasive sampling technique which provides a valuable source of DNA for studying population genetics of small mammals.
Abstract:
PURPOSE: To underline the absolute necessity of early diagnosis when ocular signs in children raise the possibility of intraocular tumours. METHOD: Based on our own experience of intraocular tumours in children, together with findings from the literature, diagnostic criteria and methods of treatment are presented. RESULTS: Retinoblastoma is the predominant cause of intraocular tumours in children, representing over 80% of cases under the age of 15 years. Other diseases may give rise to the same initial signs, usually leukocoria, sometimes strabismus, more rarely other atypical signs. Elements taken into account for diagnosis include age, sex, laterality, heredity, size of the globe, clinical aspect of the tumours, presence of calcifications and vitreous seeding. Full fundus examination under general anaesthetic is usually necessary. Biological examination, ultrasonography, computerized tomography and MRI enable an accurate diagnosis to be made in the majority of doubtful cases. The management of retinoblastoma is adapted for each individual case from the wide range of treatments available. Enucleation, radioactive applicators (...), brachytherapy (...), cryo- and photocoagulation represent classical measures. Primary chemotherapy, combined with other treatments such as thermotherapy, has become the treatment of choice in those cases where external beam radiotherapy has been used up to now, or in some instances before enucleation. Enucleation is usually carried out for medullo-epitheliomas, but brachytherapy may offer an alternative. CONCLUSION: Any unexplained ocular sign in children should be considered as a possible retinoblastoma, making an accurate and certain diagnosis imperative. Early treatment may save not only the life but also the vision of patients carrying this highly malignant lesion.