966 results for Separators (Machines)
Abstract:
PURPOSE: Late toxicities such as second cancer induction become more important as treatment outcome improves. Often the dose distribution calculated with a commercial treatment planning system (TPS) is used to estimate radiation carcinogenesis for the radiotherapy patient. However, for locations beyond the treatment field borders, the accuracy is not well known. The aim of this study was to perform detailed out-of-field measurements for a typical radiotherapy treatment plan administered with a CyberKnife and a TomoTherapy machine and to compare the measurements to the predictions of the TPS. MATERIALS AND METHODS: Individually calibrated thermoluminescent dosimeters were used to measure absorbed dose in an anthropomorphic phantom at 184 locations. The measured dose distributions from 6 MV intensity-modulated treatment beams for the CyberKnife and TomoTherapy machines were compared to the dose calculations of the TPS. RESULTS: Both TPSs underestimate the dose far away from the target volume. Quantitatively, the CyberKnife TPS underestimates the dose at 40 cm from the PTV border by a factor of 60, the TomoTherapy TPS by a factor of two. If a 50% dose uncertainty is accepted, the CyberKnife TPS can predict doses down to approximately 10 mGy per treatment Gy and the TomoTherapy TPS down to 0.75 mGy per treatment Gy; the CyberKnife TPS can then be used up to 10 cm from the PTV border, the TomoTherapy TPS up to 35 cm. CONCLUSIONS: We determined that the CyberKnife and TomoTherapy TPSs substantially underestimate the doses far away from the treated volume. It is recommended not to use out-of-field doses from the CyberKnife TPS for applications such as modeling of second cancer induction. The TomoTherapy TPS can be used up to 35 cm from the PTV border (for a 390 cm3 PTV).
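The underestimation factors and the 50% acceptance criterion quoted above amount to simple ratio arithmetic; a minimal sketch, using invented dose values rather than the study's measurements, might look like:

```python
# Ratio arithmetic behind the reported underestimation factors.
# All dose values here are invented for illustration.

def underestimation_factor(measured_mGy_per_Gy, predicted_mGy_per_Gy):
    """Factor by which the TPS underestimates the out-of-field dose."""
    return measured_mGy_per_Gy / predicted_mGy_per_Gy

def within_tolerance(measured, predicted, tolerance=0.5):
    """True if the TPS prediction is within the accepted relative
    dose uncertainty (50% by default) of the measurement."""
    return abs(predicted - measured) / measured <= tolerance

# A TPS predicting 0.6 mGy/Gy where 1.2 mGy/Gy was measured
# underestimates by a factor of two and misses a 25% tolerance.
print(underestimation_factor(1.2, 0.6))            # 2.0
print(within_tolerance(1.2, 0.6, tolerance=0.25))  # False
```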
Abstract:
Although the radiation doses involved in basic research radiology are relatively small, the increasing number of radiological procedures makes the associated risks increasingly significant. Quality control techniques in radiological practice must ensure an adequate system of protection for people exposed to radiation. These techniques belong to a quality assurance program for X-ray machines and are designed to correct problems related to equipment and radiological practice, to obtain radiological images of high quality, and to reduce unnecessary exposures.
Abstract:
Background: Bone health is a concern when treating early stage breast cancer patients with adjuvant aromatase inhibitors. Early detection of patients (pts) at risk of osteoporosis and fractures may be helpful for starting preventive therapies and selecting the most appropriate endocrine therapy schedule. We present statistical models describing the evolution of lumbar and hip bone mineral density (BMD) in pts treated with tamoxifen (T), letrozole (L) and sequences of T and L. Methods: Available dual-energy x-ray absorptiometry exams (DXA) of pts treated in trial BIG 1-98 were retrospectively collected from Swiss centers. Treatment arms: A) T for 5 years; B) L for 5 years; C) 2 years of T followed by 3 years of L; and D) 2 years of L followed by 3 years of T. Pts without DXA were used as a control for detecting selection biases. Patients randomized to arm A were subsequently allowed an unplanned switch from T to L. Allowing for variations between DXA machines and centres, two repeated-measures models, using a covariance structure that allows for different intervals between DXA, were used to estimate changes in hip and lumbar BMD (g/cm2) from trial randomization. Prospectively defined covariates at the time of trial randomization, considered as fixed effects in the multivariable models in an intention-to-treat analysis, were: age, height, weight, hysterectomy, race, known osteoporosis, tobacco use, prior bone fracture, prior hormone replacement therapy (HRT), bisphosphonate use and previous neo-/adjuvant chemotherapy (ChT). Similarly, the T-scores for lumbar and hip BMD measurements were modeled using a per-protocol approach (allowing for the treatment switch in arm A), specifically studying the effect of each therapy on the T-score percentage. Results: A total of 247 out of 546 pts had between 1 and 5 DXA; a total of 576 DXA were collected. The numbers of DXA measurements per arm were: arm A 133, B 137, C 141 and D 135. The median follow-up time was 5.8 years.
Significant factors positively correlated with lumbar and hip BMD in the multivariate analysis were weight, previous HRT use, neo-/adjuvant ChT, hysterectomy and height. Significant negatively correlated factors in the models were osteoporosis, treatment arm (B/C/D vs. A), time since endocrine therapy start, age and smoking (current vs. never). Modeling the T-score percentage, the differences from T to L were -4.199% (p = 0.036) and -4.907% (p = 0.025) for the hip and lumbar measurements, respectively, before any treatment switch occurred. Conclusions: Our statistical models describe the lumbar and hip BMD evolution for pts treated with L and/or T. The results at both sites confirm that, contrary to expectation, the sequential schedules do not seem less detrimental to BMD than L monotherapy. The estimated difference in BMD T-score percentage is at least 4% from T to L.
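The arm-by-time modeling described above can be caricatured with a much simpler fit. The sketch below simulates a BMD decline that is steeper in a letrozole-like arm and recovers the arm-by-time interaction with plain least squares; the arm assignment, slopes, noise level and the OLS simplification are all assumptions, whereas the trial's models used patient-level covariance structures and many more covariates.

```python
# Simulated sketch: BMD decline over time, steeper in a letrozole-like
# arm, recovered with ordinary least squares (a deliberate simplification
# of the repeated-measures models used in the trial).
import numpy as np

rng = np.random.default_rng(0)
n_pts = 60
visits = np.array([0.0, 2.0, 5.0])                         # years since randomization
arm_b = np.repeat(rng.integers(0, 2, n_pts), len(visits))  # 1 = arm B
years = np.tile(visits, n_pts)

# Simulated lumbar BMD (g/cm2): gentle decline, steeper in arm B
true_slope = np.where(arm_b == 1, -0.010, -0.004)
bmd = 1.0 + true_slope * years + rng.normal(0.0, 0.02, n_pts * len(visits))

# Design matrix for bmd ~ intercept + years + arm + years:arm
X = np.column_stack([np.ones_like(years), years, arm_b, years * arm_b])
beta, *_ = np.linalg.lstsq(X, bmd, rcond=None)
extra_decline_per_year = beta[3]    # additional yearly BMD change in arm B
print(extra_decline_per_year < 0)   # arm B loses BMD faster
```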
Abstract:
Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction has met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features accounting for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics cumbersome. The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of the predictions. The resulting maps of average wind speed find applications in renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
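A stripped-down version of this kernel-based regression idea, an SVR trained on geographic coordinates plus terrain features, can be sketched as follows; the features and wind speeds are synthetic stand-ins for the thesis's DEM-derived features and station measurements.

```python
# SVR on geographic + terrain features: a toy stand-in for kernel-based
# mapping of wind speed. Features and targets are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 300
# Assumed columns: x, y, elevation, slope, curvature (all rescaled to [0, 1])
X = rng.uniform(0.0, 1.0, size=(n, 5))
# Synthetic wind speed: nonlinear in elevation, linear in curvature, noisy
y = 5.0 + 4.0 * X[:, 2] ** 2 - 2.0 * X[:, 4] + rng.normal(0.0, 0.3, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:200], y[:200])
r2 = model.score(X[200:], y[200:])   # held-out coefficient of determination
print(r2 > 0.5)
```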
Abstract:
The aim of this study is to compare the effects of two training methods on explosive strength in 9 regional- and national-level athletes (19.6 ± 2.6 years, 1.76 ± 7 m and 68.9 ± 3.5 kg) who trained during the competitive period for 6 weeks, twice per week, following an increasing periodization. The subjects were divided into 3 groups of 3 athletes each, using different training means. The experimental group combining the vibration platform and inertial machines (1) used a vibration exposure time of 30 s at an intensity of 45 Hz and an amplitude of 5 mm, with a 1 min rest, and performed 3 sets of 8 repetitions at maximum velocity in the concentric phase, controlling the eccentric phase, with a 3 min rest on the yo-yo machines. The free-weights experimental group (2) performed 4 exercises: half squat, plyometrics (stretch-shortening cycle), horizontal multi-jumps and accelerations, following the explosive-strength training guidelines proposed by Badillo and Gorostiaga (1995). A control group (3) performed no strength training. Before and after the intervention period the following tests were carried out: squat jump (SJ) and countermovement jump (CMJ). The results indicated that groups (1), (2) and (3) significantly decreased their SJ and CMJ. It is concluded that both training methods, whether combining vibratory stimuli with inertial machines or using free weights, appear to be means that must be very carefully controlled and periodized, since during the competitive period the athlete accumulates much higher levels of both muscular and physiological fatigue than in other periods of the season.
Abstract:
There is currently a wide variety of brands and models of voltage stabilizers, although they are all designed and built for the same purpose: to deliver a stable voltage at the output of the device. The reason voltage stabilizers are manufactured is that, despite technical advances and improvements in services in the energy sector, the frequent voltage drops and surges in electrical power supply networks have not been eliminated, and these can cause malfunctions in electronic equipment. This is why most users of electronic equipment place a voltage stabilizer between the supply line and their devices. The objective of this work is to build a device with three coils on a ferromagnetic core to perform the function of an AC voltage stabilizer. By applying the saturable-core reluctance technique to this practical case, the aim is to develop a reliable and inexpensive voltage stabilizer. This work should provide a solution for working environments, among others, that use AC-powered machines that are very sensitive to mains voltage variations and, moreover, a solution that does not entail a very high cost.
Abstract:
The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are given, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample sets of data, that will allow both students and researchers to put the concepts rapidly into practice.
Abstract:
At present, cooperatives collect, select, treat and separate fruit according to its calibre (weight; maximum, mean and/or minimum diameter) so that it reaches the final consumer according to its category (calibre). To compete in a market that is ever more demanding in quality and price, automatic classification systems are required that deliver optimal results at high levels of production and productivity. For these tasks there are industrial graders that weigh the fruit using load cells and, based on the measured weight, classify the pieces by routing each one to its corresponding outlet (packing table) through a system of electromagnets. Unfortunately, grading fruit by weight alone is not at all reliable, since this process ignores skin thickness, water content, sugar content and other highly relevant factors that considerably influence the final results. The aim of this project is to upgrade existing fruit graders by installing a fast, robust industrial machine-vision system working in the infrared range (for greater reliability), providing optimal final results in the classification of fruit and vegetables. In this way, the present project offers the opportunity to improve the performance of the fruit classification line, increasing speed, reducing time losses and human error, and unquestionably improving the final product quality desired by consumers.
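The weight-based routing performed by the load-cell graders can be sketched as a simple thresholding rule; the calibre limits below are hypothetical, not an industry standard, and, as noted above, weight alone ignores skin thickness, water and sugar content.

```python
# Hypothetical weight-based calibre assignment, mimicking the routing a
# load-cell grader performs via electromagnets. Thresholds (grams) are
# illustrative only, not an industry standard.
def assign_calibre(weight_g,
                   thresholds=((120, "small"), (180, "medium"), (250, "large"))):
    """Route a fruit to an outlet (packing table) according to its weight."""
    for upper_limit, label in thresholds:
        if weight_g < upper_limit:
            return label
    return "extra-large"

batch = [95, 150, 210, 300]
print([assign_calibre(w) for w in batch])
# ['small', 'medium', 'large', 'extra-large']
```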
Abstract:
This letter presents advanced classification methods for very high resolution images. Efficient multisource information, both spectral and spatial, is exploited through the use of composite kernels in support vector machines. Weighted summations of kernels accounting for separate sources of spectral and spatial information are analyzed and compared to classical approaches such as pure spectral classification or stacked approaches using all the features in a single vector. Model selection problems are addressed, as well as the importance of the different kernels in the weighted summation.
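The composite-kernel construction, a weighted summation of a spectral kernel and a spatial kernel passed to an SVM as a precomputed kernel, can be sketched as follows. The data, the RBF choice and the mixing weight mu are assumptions for illustration; the key property is that a non-negative weighted sum of Mercer kernels is itself a valid Mercer kernel.

```python
# Weighted summation of a spectral and a spatial RBF kernel, used as a
# precomputed kernel in an SVM. Data and the weight mu are assumptions.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n = 200
X_spec = rng.normal(size=(n, 10))   # stand-in spectral features
X_spat = rng.normal(size=(n, 4))    # stand-in spatial/contextual features
y = (X_spec[:, 0] + X_spat[:, 0] > 0).astype(int)   # synthetic labels

mu = 0.6   # weight given to the spectral source
# A non-negative weighted sum of Mercer kernels is again a Mercer kernel
K = mu * rbf_kernel(X_spec, gamma=0.1) + (1 - mu) * rbf_kernel(X_spat, gamma=0.25)

clf = SVC(kernel="precomputed", C=1.0).fit(K[:150, :150], y[:150])
acc = clf.score(K[150:, :150], y[150:])   # test rows vs. training columns
print(round(acc, 2))
```

Tuning mu per class or per scene is precisely the model selection problem the letter raises.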
Abstract:
Clear Lake, Iowa's third largest natural lake, is a premier natural resource and popular recreational destination in north central Iowa. Despite the lake's already strong recreational use, water quality concerns have not allowed the lake to reach its full potential. Clear Lake is listed on Iowa's Draft 2010 303(d) Impaired Waters List for algae, bacteria, and turbidity. Many restoration practices have been implemented to treat the algae and turbidity impairment, but few practices have been installed to treat bacteria. Reducing beach bacteria levels is a priority of the lake restoration partners. Federal, State, and local partners have invested more than $20 million in lake and watershed restoration efforts to improve water clarity and quality. These partners have a strong desire to ensure high bacteria levels at public swim beaches do not undermine the other water quality improvements. Recent bacteria source tracking completed by the State Hygienic Laboratory indicates that Canada Geese are a major contributor of bacteria loading to the Clear Lake swim beaches. Other potential sources include unpermitted septic systems in the watershed. The grant request proposes to reduce bacteria levels at Clear Lake's three public swim beaches by utilizing beach cleaner machines to remove goose waste, installing goose deterrents at the swim beaches, and continuing a septic system update grant program. These practices began to be implemented in 2011 and recent bacteria samples in 2012 are showing they can be effective if the effort is continued.
Abstract:
Spatial data analysis, mapping and visualization is of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on some empirical data (measurements). A number of state-of-the-art methods can be used for the task: deterministic interpolations; methods of geostatistics, i.e. the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANN) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. All the methods mentioned above can be used for solving the problem of spatial data mapping. Environmental empirical data are always contaminated/corrupted by noise, often of unknown nature. That is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as the realization of some spatial random process. To obtain an estimate with kriging one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if there is not a sufficient number of measurements, and the variogram is sensitive to outliers and extremes. ANN is a powerful tool, but it also suffers from a number of drawbacks. ANN of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear and robust to noise in measurements, can deal with small empirical datasets and has a solid mathematical background is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression.
SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVM for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR for spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of the SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and the basic equations for the nonlinear modeling are given in section 2. Section 3 discusses the application of SVR to spatial data mapping on a real case study: soil pollution by the Cs137 radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
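The robustness to noisy data and outliers examined in section 4 stems from SVR's epsilon-insensitive loss, which grows only linearly beyond epsilon; a small synthetic illustration (not the Cs137 case study) follows.

```python
# Epsilon-insensitive SVR on noisy 1-D data with injected outliers.
# Synthetic data, not the soil-pollution case study.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
x = np.linspace(0.0, 4.0 * np.pi, 200)[:, None]
y_true = np.sin(x).ravel()
y = y_true + rng.normal(0.0, 0.1, 200)
y[::25] += 3.0            # a few gross outliers

svr = SVR(kernel="rbf", C=1.0, epsilon=0.1, gamma=0.5).fit(x, y)
mae = float(np.mean(np.abs(svr.predict(x) - y_true)))
# The linear (not squared) loss beyond epsilon bounds each outlier's pull,
# so the fit stays close to the underlying sine
print(mae < 0.5)
```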
Abstract:
The present study deals with the analysis and mapping of Swiss franc interest rates. Interest rates depend on time and maturity, defining term structure of the interest rate curves (IRC). In the present study IRC are considered in a two-dimensional feature space - time and maturity. Exploratory data analysis includes a variety of tools widely used in econophysics and geostatistics. Geostatistical models and machine learning algorithms (multilayer perceptron and Support Vector Machines) were applied to produce interest rate maps. IR maps can be used for the visualisation and pattern perception purposes, to develop and to explore economical hypotheses, to produce dynamic asset-liability simulations and for financial risk assessments. The feasibility of an application of interest rates mapping approach for the IRC forecasting is considered as well. (C) 2008 Elsevier B.V. All rights reserved.
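As a toy stand-in for the geostatistical and machine learning interpolators mentioned above, the sketch below estimates a rate in the two-dimensional (time, maturity) feature space with inverse-distance weighting; all rates and coordinates are invented.

```python
# Inverse-distance weighting in the (time, maturity) feature space, a toy
# stand-in for the kriging / MLP / SVM interest-rate maps. Invented data.
import numpy as np

def idw(points, values, query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at a query point."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d < eps):                 # query coincides with a sample
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# Samples: (time in years, maturity in years) -> rate in percent
pts = np.array([[0.0, 1.0], [0.0, 10.0], [1.0, 1.0], [1.0, 10.0]])
rates = np.array([0.8, 2.1, 1.0, 2.3])
estimate = idw(pts, rates, np.array([0.5, 5.0]))
print(rates.min() <= estimate <= rates.max())   # IDW stays within the data range
```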
Abstract:
Raman spectroscopy combined with chemometrics has recently become a widespread technique for the analysis of pharmaceutical solid forms. The application presented in this paper is the investigation of counterfeit medicines. This increasingly serious issue involves networks that are an integral part of industrialized organized crime, and efficient analytical tools are consequently required to fight against it. Quick and reliable authentication means are needed to allow the deployment of measures by the company and the authorities. For this purpose a two-step method has been implemented here. The first step enables the identification of pharmaceutical tablets and capsules and the detection of their counterfeits. A nonlinear classification method, Support Vector Machines (SVM), is computed together with a correlation with the database and the detection of Active Pharmaceutical Ingredient (API) peaks in the suspect product. If a counterfeit is detected, the second step allows its chemical profiling among former counterfeits in a forensic intelligence perspective. For this second step a classification based on Principal Component Analysis (PCA) and correlation distance measurements is applied to the Raman spectra of the counterfeits.
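The second step, PCA scores compared via correlation distance, can be sketched on synthetic "spectra"; the two counterfeit profiles, the noise level and the choice of 5 components are assumptions for illustration.

```python
# PCA + correlation-distance profiling of synthetic "spectra", sketching
# the forensic-intelligence step. Profiles A/B and all parameters are
# assumptions; real Raman data are not reproduced here.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
base_a = rng.normal(size=100)                 # profile A reference spectrum
base_b = rng.normal(size=100)                 # profile B reference spectrum
spectra = np.vstack([base_a + rng.normal(0.0, 0.2, (20, 100)),
                     base_b + rng.normal(0.0, 0.2, (20, 100))])

pca = PCA(n_components=5).fit(spectra)
scores = pca.transform(spectra)

def corr_distance(u, v):
    """1 - Pearson correlation between two score vectors."""
    return 1.0 - np.corrcoef(u, v)[0, 1]

# A new suspect drawn from profile A should match group A more closely
suspect = pca.transform((base_a + rng.normal(0.0, 0.2, 100)).reshape(1, -1))[0]
d_a = np.mean([corr_distance(suspect, s) for s in scores[:20]])
d_b = np.mean([corr_distance(suspect, s) for s in scores[20:]])
print(d_a < d_b)
```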
Abstract:
Exposures to wood dust have been associated with an elevated risk of adenocarcinomas of the nasal cavity and the paranasal sinuses (sinonasal cancer or SNC) among wood workers.
Wood dust is recognized as a human carcinogen by the International Agency for Research on Cancer. However, the specific cancer-causing agent(s) and the mechanism(s) behind wood dust related carcinogenesis remain unknown. One possible explanation is a co-exposure to wood dust and polycyclic aromatic hydrocarbons (PAH), the latter being carcinogenic. In addition, wood workers are exposed not only to natural wood but also to wood finishes and composite woods, such as wood melamine and medium density fiberboard (MDF), during manipulation with power tools. The heat produced by the use of power tools can cause the generation of PAH from wood materials. The main objectives of the present thesis are to: (1) quantify possible PAH concentrations in wood dust generated during various common woodworking operations using different wood materials; (2) quantify personal wood dust concentrations and PAH exposures among wood workers; and (3) assess genotoxic effects (i.e., DNA and chromosomal damage) of wood dust and PAH exposure in wood workers. This thesis is composed of a laboratory study (objective 1) and a field study (objectives 2 and 3). In the laboratory study we collected wood dust from different wood materials (fir, MDF, beech, mahogany, oak, and wood melamine) generated during different wood operations (e.g., sanding and sawing) in an experimental chamber under controlled conditions. In the following field study, we monitored 31 male wood workers (furniture and construction workers) exposed to wood dust during their professional activity for two consecutive work shifts. Additionally, we recruited 19 non-exposed workers as a control group. We collected blood samples, and nasal and buccal cell samples, from each participant. They answered a questionnaire including demographic and lifestyle data and occupational exposure (current and past). Personal wood dust samples were collected using a closed-face cassette.
We used gravimetric analysis to determine the personal wood dust concentrations and capillary gas chromatography-mass spectrometry to determine PAH concentrations. Genotoxicity was assessed with the micronucleus (MN) assay for nasal and buccal cells and with the comet assay for blood samples. Our results show that PAH (some of them carcinogenic) were present in dust from all six wood materials tested, yet at different concentrations depending on the material. The highest concentration was found in dust from wood melamine (7.95 ppm) and the lowest in MDF (0.24 ppm). Our results also show that workers were individually exposed to low concentrations of PAH (37.5-119.8 ng m-3) during woodworking operations, whereas the concentrations of inhalable dust were relatively high (geometric mean 2.8 mg m-3). Concerning genotoxicity, wood workers had a significantly higher MN frequency in nasal and buccal cells than the workers in the control group (odds ratio for nasal cells 3.1 (95% CI 1.8-5.1) and for buccal cells 1.8 (95% CI 1.3-2.4)). Furthermore, the comet assay showed that workers who reported being exposed to dust from wooden boards (MDF and wood melamine) had significantly higher DNA damage than both the workers exposed to natural woods (fir, spruce, beech, oak) and the workers in the control group (p < 0.01). Finally, MN frequency in nasal and buccal cells increased with increasing years of exposure to wood dust. However, there was no genotoxic dose-response relationship with the daily wood dust and PAH exposure. This study shows that PAH exposure occurred during woodworking operations. Workers exposed to wood dust, and thus to PAH, had a higher risk of genotoxicity compared to the control group. Since some of the detected PAH are potentially carcinogenic, PAH generated by operations on wood materials may be one of the causative agents for the observed increased genotoxicity in wood workers.
Since increased genotoxicity is manifested in an increased MN frequency, the MN assay in nasal and buccal cells may become a relevant biomonitoring tool in the future for early detection of SNC risk.
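Odds ratios like the 3.1 reported for nasal cells can be illustrated with the standard 2x2-table computation; the counts below are invented, and the study's actual estimates were model-based and covariate-adjusted.

```python
# Standard 2x2-table odds ratio with a Woolf log-method 95% CI. Counts
# are invented; the study's estimates came from adjusted models.
import math

def odds_ratio(a, b, c, d):
    """OR for (exposed: a with / b without effect) vs. (control: c / d)."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Approximate 95% confidence interval (Woolf's log method)."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

print(odds_ratio(18, 13, 6, 13))    # 3.0
print(or_ci95(18, 13, 6, 13))       # interval containing the point estimate
```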
Abstract:
In this paper, mixed spectral-structural kernel machines are proposed for the classification of very high resolution images. The simultaneous use of multispectral and structural features (computed using morphological filters) allows a significant increase in the classification accuracy of remote sensing images. Subsequently, weighted-summation kernel support vector machines are proposed and applied in order to take into account the multiscale nature of the scene considered. Such classifiers use the Mercer property of kernel matrices to compute a new kernel matrix accounting simultaneously for two scale parameters. Tests on a Zurich QuickBird image show the relevance of the proposed method: using the mixed spectral-structural features, the classification accuracy increases by about 5%, achieving a Kappa index of 0.97. The proposed multikernel approach provides an overall accuracy of 98.90% with a Kappa index of 0.985.
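The Kappa index quoted alongside the overall accuracy corrects the agreement for chance; it can be computed from a confusion matrix as follows (the 3-class matrix is a toy example, not the QuickBird result).

```python
# Cohen's kappa from a confusion matrix: accuracy corrected for chance
# agreement. The 3-class matrix below is a toy example.
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa for a square confusion matrix."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n                               # raw accuracy
    p_chance = float(np.sum(cm.sum(axis=0) * cm.sum(axis=1))) / n ** 2
    return (p_observed - p_chance) / (1.0 - p_chance)

cm = [[90, 5, 5],
      [4, 92, 4],
      [6, 3, 91]]
print(round(cohens_kappa(cm), 3))   # 0.865
```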