232 results for Radon exhalation


Relevance: 10.00%

Abstract:

A new algorithm for extracting features from images for object recognition is described. The algorithm uses higher order spectra to provide desirable invariance properties, to provide noise immunity, and to incorporate nonlinearity into the feature extraction procedure, thereby allowing the use of simple classifiers. An image can be reduced to a set of 1D functions via the Radon transform, or alternatively, the Fourier transform of each 1D projection can be obtained from a radial slice of the 2D Fourier transform of the image according to the Fourier slice theorem. A triple product of Fourier coefficients, referred to as the deterministic bispectrum, is computed for each 1D function and is integrated along radial lines in bifrequency space. Phases of the integrated bispectra are shown to be translation- and scale-invariant. Rotation invariance is achieved by a regrouping of these invariants at a constant radius followed by a second stage of invariant extraction. Rotation invariance is thus converted to translation invariance in the second step. Results using synthetic and actual images show that isolated, compact clusters are formed in feature space. These clusters are linearly separable, indicating that the nonlinearity required in the mapping from the input space to the classification space is incorporated well into the feature extraction stage. The use of higher order spectra results in good noise immunity, as verified with synthetic and real images. Classification of images using the higher order spectra-based algorithm compares favorably to classification using the method of moment invariants.
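The translation invariance of the bispectrum phase can be illustrated with a minimal sketch (numpy assumed; the restriction to the single radial line f1 = f2 and the function name are simplifications, not the paper's full radial integration): a cyclic shift of a 1D projection multiplies each Fourier coefficient by a linear phase, which cancels exactly in the triple product F(k)F(k)F*(2k).

```python
import numpy as np

def bispectral_invariant(p):
    """Phase of the bispectrum of a 1-D projection, summed along the
    radial line f1 = f2 in bifrequency space (toy sketch)."""
    F = np.fft.fft(p)
    k = np.arange(1, len(p) // 4)           # keep the index 2k below Nyquist
    bisp = F[k] * F[k] * np.conj(F[2 * k])  # triple product B(k, k)
    # a cyclic shift s multiplies F[k] by exp(-2j*pi*k*s/n); the exponents
    # k + k - 2k sum to zero, so the triple product is shift-invariant
    return np.angle(bisp.sum())
```

Because the linear phases cancel term by term, the returned phase is the same for a projection and any cyclic shift of it.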

Relevance: 10.00%

Abstract:

A new approach to recognition of images using invariant features based on higher-order spectra is presented. Higher-order spectra are translation invariant because translation produces linear phase shifts which cancel. Scale and amplification invariance are satisfied by the phase of the integral of a higher-order spectrum along a radial line in higher-order frequency space, because the contour of integration maps onto itself and both the real and imaginary parts are affected equally by the transformation. Rotation invariance is introduced by deriving invariants from the Radon transform of the image and using the cyclic-shift invariance property of the discrete Fourier transform magnitude. Results on synthetic and actual images show isolated, compact clusters in feature space and high classification accuracies.
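The cyclic-shift invariance that underpins the rotation-invariance step can be shown in a few lines (numpy assumed; the function name is illustrative): rotating the image cyclically shifts Radon-domain features over the projection-angle index, and the DFT magnitude is unchanged by any cyclic shift.

```python
import numpy as np

def rotation_signature(angular_features):
    """DFT magnitude over the projection-angle index.  An image rotation
    cyclically shifts features over angle; |DFT| removes that shift."""
    return np.abs(np.fft.fft(angular_features))
```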

Relevance: 10.00%

Abstract:

Water resources are known to contain radioactive materials, from either natural or anthropogenic sources. Treatment of water for drinking, domestic, agricultural and industrial purposes, including wastewater treatment, has the potential to concentrate radioactive materials. Inevitably, concentrated radioactive material is discharged to the environment as a waste product, reused for soil conditioning, or perhaps recycled as a new potable water supply. This thesis, presented as a collection of peer-reviewed scientific papers, explores a number of water/wastewater treatment applications and the subsequent nature and potential impact of radioactive residues associated with water exploitation processes. The thesis draws together research outcomes for sites predominantly throughout Queensland, Australia, where it is recognised that there is a paucity of published data on the subject. This thesis contributes to current knowledge on the monitoring, assessment and potential for radiation exposure from radioactive residues associated with the water industry.

Relevance: 10.00%

Abstract:

Highly sensitive infrared (IR) cameras provide high-resolution diagnostic images of the temperature and vascular changes of breasts. These images can be processed to emphasize hot spots that exhibit early and subtle changes owing to pathology. The resulting images show clusters that appear random in shape and spatial distribution but carry class-dependent information in shape and texture. Automated pattern recognition techniques are challenged by changes in the location, size and orientation of these clusters. Higher order spectral invariant features provide robustness to such transformations and are suited for texture- and shape-dependent information extraction from noisy images. In this work, the effectiveness of bispectral invariant features in diagnostic classification of breast thermal images into malignant, benign and normal classes is evaluated, and a phase-only variant of these features is proposed. High-resolution IR images of breasts, captured with a measurement accuracy of ±0.4% (full scale) and a temperature resolution of 0.1 °C (black-body reference), depicting malignant, benign and normal pathologies are used in this study. Breast images are registered using their lower boundaries, automatically extracted using landmark points whose locations are learned during training. Boundaries are extracted using Canny edge detection and elimination of inner edges. Breast images are then segmented using fuzzy c-means clustering and the hottest regions are selected for feature extraction. Bispectral invariant features are extracted from Radon projections of these images. An AdaBoost classifier is used to select and fuse the best features during training and then to classify unseen test images into malignant, benign and normal classes. A data set comprising 9 malignant, 12 benign and 11 normal cases is used for evaluation of performance. Malignant cases are detected with 95% accuracy.
A variant of the features using the normalized bispectrum, which discards all magnitude information, is shown to perform better for classification between benign and normal cases, with 83% accuracy compared to 66% for the original.
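The segmentation step can be sketched with a toy fuzzy c-means on pixel temperatures (numpy assumed; `fuzzy_cmeans` and its parameters are illustrative stand-ins, not the study's implementation):

```python
import numpy as np

def fuzzy_cmeans(x, c=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means on 1-D data (e.g. pixel temperatures).
    Returns cluster centers and the membership matrix U (c x n)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(x)))
    U /= U.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        w = U ** m                          # fuzzified memberships
        centers = (w @ x) / w.sum(axis=1)   # weighted cluster means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        U = d ** (-2.0 / (m - 1.0))         # standard FCM membership update
        U /= U.sum(axis=0)
    return centers, U
```

Pixels with high membership in the cluster with the largest center would then form the "hottest regions" passed on to feature extraction.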

Relevance: 10.00%

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, and this is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
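The generic pipeline described here — feature extraction, a (typically linear) randomization stage, then quantization and binary encoding — can be sketched as follows (numpy assumed; this toy uses exactly the kind of linear random projection with sign quantization whose security the dissertation critiques and whose non-linear HOS replacement it proposes):

```python
import numpy as np

def robust_hash(features, n_bits=32, seed=42):
    """Toy robust hash: linear random projection (the randomization
    stage) followed by 1-bit sign quantization and binary encoding."""
    rng = np.random.default_rng(seed)            # seed plays the role of a secret key
    P = rng.standard_normal((n_bits, len(features)))
    return (P @ features > 0).astype(np.uint8)   # one bit per projection
```

Small perturbations of the input rarely push a projected value across the quantization boundary at zero, so the Hamming distance between hashes of near-identical inputs stays small — the robustness property that cryptographic hashes deliberately lack.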

Relevance: 10.00%

Abstract:

It was Dvorak who, in 1986, postulated that 'tumours are wounds that do not heal', as they share common cellular and molecular mechanisms that are active both in wounds and in cancer tissue. Inflammation is a crucial part of the innate immune system that protects against pathogens and initiates adaptive immunity. Acute inflammation is usually a rapid and self-limiting process; however, it does not always resolve. This leads to the establishment of a chronic inflammatory state and provides the perfect environment for carcinogenesis. Inflammation and cancer have long had an association, going back as far as Virchow in 1863, when leucocytes were noted in neoplastic tissue. It has been estimated that approximately 25% of all malignancies are initiated or exacerbated by inflammation caused by infectious agents. Furthermore, inflammation is linked to all six hallmarks of cancer (evasion of apoptosis, insensitivity to anti-growth signals, unlimited replicative potential, angiogenesis, increase in survival factors, and invasion and metastasis). It is thought that inflammation may play a critical role in lung carcinogenesis, given that individuals with inflammatory lung conditions have an increased risk of lung cancer development. Cigarette smoking can also induce inflammation in the lung, and smokers are at a higher risk of developing lung cancer than non-smokers. However, exposure to a number of environmental agents, such as radon, has also been demonstrated as a causative factor in this disease. This chapter will focus on inflammation as a contributory factor in non-small cell lung cancer (NSCLC), concentrating primarily on the pathological involvement of the pro-inflammatory cytokines TNF-α and IL-1β and the CXC (ELR+) chemokine family. Targeting of inflammatory mediators will also be discussed as a therapeutic strategy in this disease. © 2013 by Nova Science Publishers, Inc. All rights reserved.

Relevance: 10.00%

Abstract:

Long-term inhalation studies in rodents have presented unequivocal evidence of experimental carcinogenicity of ethylene oxide, based on the formation of malignant tumors at multiple sites. However, despite a considerable body of epidemiological data only limited evidence has been obtained of its carcinogenicity in humans. Ethylene oxide is not only an important exogenous toxicant, but it is also formed from ethylene as a biological precursor. Ethylene is a normal body constituent; its endogenous formation is evidenced by exhalation in rats and in humans. Consequently, ethylene oxide must also be regarded as a physiological compound. The most abundant DNA adduct of ethylene oxide is 7-(2-hydroxyethyl)guanine (HOEtG). Open questions are the nature and role of tissue-specific factors in ethylene oxide carcinogenesis and the physiological and quantitative role of DNA repair mechanisms. The detection of remarkable individual differences in the susceptibility of humans has promoted research into genetic factors that influence the metabolism of ethylene oxide. With this background it appears that current PBPK models for trans-species extrapolation of ethylene oxide toxicity need to be refined further. For a cancer risk assessment at low levels of DNA damage, exposure-related adducts must be discussed in relation to background DNA damage as well as to inter- and intraindividual variability. In rats, subacute ethylene oxide exposures on the order of 1 ppm (1.83 mg/m3) cause DNA adduct levels (HOEtG) of the same magnitude as produced by endogenous ethylene oxide. Based on very recent studies the endogenous background levels of HOEtG in DNA of humans are comparable to those that are produced in rodents by repetitive exogenous ethylene oxide exposures of about 10 ppm (18.3 mg/m3). Experimentally, ethylene oxide has revealed only weak mutagenic effects in vivo, which are confined to higher doses. 
It has been concluded that long-term human occupational exposure to low airborne concentrations of ethylene oxide, at or below the current occupational exposure limit of 1 ppm (1.83 mg/m3), would not produce unacceptably increased genotoxic risks. However, critical questions remain that need further discussion, relating to the coherence of animal and human data, of experimental data in vitro vs. in vivo, and to species-specific dynamics of DNA lesions.

Relevance: 10.00%

Abstract:

Diabetic macular edema (DME) is one of the most common causes of visual loss among diabetes mellitus patients. Early detection and successive treatment may improve the visual acuity. DME is mainly graded into non-clinically significant macular edema (NCSME) and clinically significant macular edema according to the location of hard exudates in the macula region. DME can be identified by manual examination of fundus images, but this is laborious and resource-intensive. Hence, in this work, automated grading of DME is proposed using higher-order spectra (HOS) of Radon transform projections of the fundus images. We have used third-order cumulants and the bispectrum magnitude as features and compared their performance; they can capture subtle changes in the fundus image. Spectral regression discriminant analysis (SRDA) reduces the feature dimension, and the minimum redundancy maximum relevance method is used to rank the significant SRDA components. Ranked features are fed to various supervised classifiers, viz. Naive Bayes, AdaBoost and support vector machine, to discriminate the No DME, NCSME and clinically significant macular edema classes. The performance of our system is evaluated using the publicly available MESSIDOR dataset (300 images) and also verified with a local dataset (300 images). Our results show that HOS cumulants and bispectrum magnitude obtained average accuracies of 95.56% and 94.39% for the MESSIDOR dataset and 95.93% and 93.33% for the local dataset, respectively.
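A toy version of the third-order cumulant features can be computed directly from a 1-D projection (numpy assumed; the lag range and function name are illustrative — the actual system derives these from Radon projections of the fundus images):

```python
import numpy as np

def third_order_cumulant(x, max_lag=8):
    """Sample third-order cumulant c3(t1, t2) of a zero-meaned 1-D
    signal for lags in [-max_lag, max_lag] (toy HOS feature)."""
    x = x - x.mean()
    n, L = len(x), max_lag
    c3 = np.zeros((2 * L + 1, 2 * L + 1))
    for i, t1 in enumerate(range(-L, L + 1)):
        for j, t2 in enumerate(range(-L, L + 1)):
            lo = max(0, -t1, -t2)            # valid range so all three
            hi = min(n, n - t1, n - t2)      # indices stay inside [0, n)
            c3[i, j] = np.mean(x[lo:hi] * x[lo + t1:hi + t1] * x[lo + t2:hi + t2])
    return c3
```

By construction the estimate is symmetric in its two lags, c3(t1, t2) = c3(t2, t1), one of the standard symmetries of third-order cumulants.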

Relevance: 10.00%

Abstract:

Characterization of indoor air quality in school classrooms is crucial to children’s health and performance. The present study was undertaken to characterize the indoor air quality in six naturally ventilated classrooms of three schools in Cassino (Italy). Indoor particle number, mass, black carbon (BC), CO2 and radon concentrations, as well as outdoor particle number, were measured within school hours during the winter and spring seasons. The study found that indoor particle number concentrations were influenced by outdoor concentrations; the highest BC values were detected in classrooms during peak traffic times. The effect of each season's airing mode on indoor air quality was also observed. The ratio between indoor and outdoor particles was 0.85 ± 0.10 in winter, when windows were opened only for short periods, and 1.00 ± 0.15 in spring, when the windows were opened for longer periods; this was associated with a higher degree of penetration of outdoor particles. Lower CO2 levels were found in classrooms in spring (908 ppm) than in winter (2206 ppm), and a greater reduction in radon concentrations was found in spring. In addition, high PM10 levels were found in classrooms during break time due to re-suspension of coarse particles.

Keywords: classroom; Ni/Nout ratio; airing by opening windows; particle number

Relevance: 10.00%

Abstract:

Spirometry is the most widely used lung function test in the world. It is fundamental in the diagnostic and functional evaluation of various pulmonary diseases. In the studies described in this thesis, the spirometric assessment of reversibility of bronchial obstruction, its determinants, and its variation features are described in a general population sample from Helsinki, Finland. This study is part of the FinEsS study, a collaborative study of the clinical epidemiology of respiratory health between Finland (Fin), Estonia (Es) and Sweden (S). Asthma and chronic obstructive pulmonary disease (COPD) constitute the two major obstructive airways diseases. The prevalence of asthma has increased, with around 6% of the population in Helsinki reporting physician-diagnosed asthma. The main cause of COPD is smoking, with changes in the population's smoking habits affecting its prevalence with a delay. Whereas airway obstruction in asthma is by definition reversible, COPD is characterized by fixed obstruction. Cough and sputum production, the first symptoms of COPD, are often misinterpreted as smoker's cough and not recognized as the first signs of a chronic illness. COPD is therefore widely underdiagnosed. More extensive use of spirometry in primary care is advocated to focus smoking cessation interventions on populations at risk. The use of forced expiratory volume in six seconds (FEV6) instead of forced vital capacity (FVC) has been suggested to enable office spirometry to be used in earlier detection of airflow limitation. Although spirometry is a widely accepted standard method of assessing lung function, its methodology and interpretation are constantly developing.
In 2005, the ATS/ERS Task Force issued a joint statement which endorsed the 12% and 200 ml thresholds for significant change in forced expiratory volume in one second (FEV1) or FVC during bronchodilation testing, but included the notion that in cases where only FVC improves it should be verified that this is not caused by a longer exhalation time in post-bronchodilator spirometry. This elicited new interest in the assessment of forced expiratory time (FET), a spirometric variable not usually reported or used in assessment. In this population sample, we examined FET and found it to be on average 10.7 (SD 4.3) s and to increase with ageing and airflow limitation in spirometry. The intrasession repeatability of FET was the poorest of the spirometric variables assessed. Based on the intrasession repeatability, a limit for significant change of 3 s was suggested for FET during bronchodilation testing. FEV6 was found to perform as well as FVC in the population and in a subgroup of subjects with airways obstruction. In the bronchodilation test, decreases were frequently observed in FEV1 and particularly in FVC. The limit of significant increase based on the 95th percentile of the population sample was 9% for FEV1 and 6% for FEV6 and FVC; these are slightly lower than the current limits for single bronchodilation tests (ATS/ERS guidelines). FEV6 proved to be a valid alternative to FVC in the bronchodilation test as well, and would remove the need to control the duration of exhalation during the spirometric bronchodilation test.
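The ATS/ERS 2005 significance criterion quoted above can be written as a small check (the function name is illustrative; volumes in litres):

```python
def significant_bronchodilator_response(pre_l, post_l):
    """True if the post-bronchodilator increase in FEV1 or FVC is both
    >= 12% of the pre-test value and >= 0.200 L (ATS/ERS 2005)."""
    delta = post_l - pre_l
    return delta >= 0.200 and delta >= 0.12 * pre_l
```

For example, FEV1 rising from 2.00 L to 2.30 L (an increase of 0.30 L, i.e. 15%) qualifies, while 2.00 L to 2.15 L does not, because 0.15 L falls short of the 200 ml threshold.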

Relevance: 10.00%

Abstract:

Technical or contaminated ethanol products are sometimes ingested either accidentally or on purpose. Typical misused products are black-market liquor and automotive products, e.g. windshield washer fluids. In addition to less toxic solvents, these liquids may contain deadly methanol. Symptoms of even lethal solvent poisoning are often non-specific at the early stage. The present series of studies was carried out to develop a method for solvent intoxication breath diagnostics, to speed up a diagnosis procedure conventionally based on blood tests. Especially in the case of methanol ingestion, the analysis method should be sufficiently sensitive and accurate to determine the presence of even small amounts of methanol in a mixture of ethanol and other less-toxic components. In addition to the studies on the FT-IR method, the Dräger 7110 evidential breath analyzer was examined to determine its ability to reveal a coexisting toxic solvent. An industrial Fourier transform infrared analyzer was modified for breath testing. The sample cell fittings were widened and the cell size reduced in order to obtain an alveolar sample directly from a single exhalation. The performance and feasibility of the Gasmet FT-IR analyzer were tested in clinical settings and in the laboratory. Actual human breath screening studies were carried out with healthy volunteers, inebriated homeless men, emergency room patients and methanol-intoxicated patients. A number of the breath analysis results were compared to blood test results in order to approximate the blood-breath relationship. In the laboratory experiments, the analytical performance of the Gasmet FT-IR analyzer and the Dräger 7110 evidential breath analyzer was evaluated by means of artificial samples resembling exhaled breath. The investigations demonstrated that a successful breath ethanol analysis by the Dräger 7110 evidential breath analyzer could exclude any significant methanol intoxication.
In contrast, the device did not detect very high levels of acetone, 1-propanol and 2-propanol in simulated breath. The Dräger 7110 evidential breath ethanol analyzer was not equipped to recognize the interfering component. According to the studies the Gasmet FT-IR analyzer was adequately sensitive, selective and accurate for solvent intoxication diagnostics. In addition to diagnostics, the fast breath solvent analysis proved feasible for controlling the ethanol and methanol concentration during haemodialysis treatment. Because of the simplicity of the sampling and analysis procedure, non-laboratory personnel, such as police officers or social workers, could also operate the analyzer for screening purposes.

Relevance: 10.00%

Abstract:

This research has been prompted by an interest in the atmospheric processes of hydrogen. The sources and sinks of hydrogen are important to know, particularly if hydrogen becomes more common as a replacement for fossil fuel in combustion. Hydrogen deposition velocities (vd) were estimated by applying chamber measurements, a radon tracer method and a two-dimensional model. These three approaches were compared with each other to discover the factors affecting the soil uptake rate. A static-closed chamber technique was introduced to determine the hydrogen deposition velocity values in an urban park in Helsinki and at a rural site at Loppi. A three-day chamber campaign to carry out soil uptake estimation was held at a remote site at Pallas in 2007 and 2008. The atmospheric mixing ratio of molecular hydrogen has also been measured by a continuous method in Helsinki in 2007-2008 and at Pallas from 2006 onwards. The mean vd values measured in the chamber experiments in Helsinki and Loppi were between 0.0 and 0.7 mm s⁻¹. The ranges of the results with the radon tracer method and the two-dimensional model in Helsinki were 0.13-0.93 mm s⁻¹ and 0.12-0.61 mm s⁻¹, respectively. The vd values in the three-day campaign at Pallas were 0.06-0.52 mm s⁻¹ (chamber) and 0.18-0.52 mm s⁻¹ (radon tracer method and two-dimensional model). At Kumpula, the radon tracer method and the chamber measurements produced higher vd values than the two-dimensional model. The results of all three methods were close to each other between November and April, except for the chamber results from January to March, while the soil was frozen. The hydrogen deposition velocity values of all three methods were compared with one-week cumulative rain sums. Precipitation increases the soil moisture, which decreases the soil uptake rate. The measurements made in snow seasons showed that a thick snow layer also hindered gas diffusion, lowering the vd values. 
The H2 vd values were compared to the snow depth, and a decaying exponential fit was obtained as a result. During a prolonged drought in summer 2006, soil moisture values were lower than in the other summer months between 2005 and 2008; under these conditions, high chamber vd values were measured. The mixing ratio of molecular hydrogen shows seasonal variation. The lowest atmospheric mixing ratios were found in the late autumn, when high deposition velocity values were still being measured. The carbon monoxide (CO) mixing ratio was also measured. Hydrogen and carbon monoxide are highly correlated in an urban environment, due to the emissions originating from traffic. After correction for the soil deposition of H2, the slope was 0.49 ± 0.07 ppb(H2)/ppb(CO). Using the corrected hydrogen-to-carbon-monoxide ratio, the total hydrogen load emitted by Helsinki traffic in 2007 was 261 t(H2) a⁻¹. Hydrogen, methane and carbon monoxide are connected with each other through the atmospheric methane oxidation process, in which formaldehyde is produced as an important intermediate. The photochemical degradation of formaldehyde produces hydrogen and carbon monoxide as end products. Examination of back-trajectories revealed long-range transport of carbon monoxide and methane. The trajectories can be grouped by applying cluster and source analysis methods, allowing natural and anthropogenic emission sources to be separated.
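A decaying exponential relationship such as the one reported between vd and snow depth can be fitted with a short log-linear least-squares sketch (numpy assumed; the thesis' actual fitting procedure is not specified here):

```python
import numpy as np

def fit_decaying_exponential(depth, vd):
    """Least-squares fit of vd = a * exp(-b * depth) via a linear fit
    on log(vd); returns (a, b).  Toy stand-in for the reported fit."""
    slope, intercept = np.polyfit(depth, np.log(vd), 1)
    return np.exp(intercept), -slope
```

On noise-free synthetic data the parameters are recovered exactly; with measured vd values the log transform weights small values more heavily, which is one reason a non-linear fit might be preferred in practice.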

Relevance: 10.00%

Abstract:

Floating in the air that surrounds us are numerous small particles, invisible to the human eye. The mixture of air and particles, liquid or solid, is called an aerosol. Aerosols have significant effects on air quality, visibility and health, and on the Earth's climate; their effect on the Earth's climate is the least understood of these. They can scatter the incoming radiation from the Sun, or they can act as seeds onto which cloud droplets are formed. Aerosol particles are created directly, by human activity or by natural processes such as breaking ocean waves or sandstorms. They can also be created indirectly, as vapors or very small particles emitted into the atmosphere combine to form small particles that later grow to reach climatically or health-relevant sizes. The mechanisms through which those particles are formed are still under scientific discussion, even though this knowledge is crucial for making air quality or climate predictions, and for understanding how aerosols will influence and be influenced by the climate's feedback loops. One of the proposed mechanisms responsible for new particle formation is ion-induced nucleation. This mechanism is based on the idea that newly formed particles are ultimately formed around an electric charge. The amount of available charge in the atmosphere varies depending on radon concentrations in the soil and in the air, as well as on incoming ionizing radiation from outer space. In this thesis, ion-induced nucleation is investigated through long-term measurements in two different environments: the background site of Hyytiälä and the urban site of Helsinki. The main conclusion of this thesis is that ion-induced nucleation generally plays a minor role in new particle formation. The fraction of particles formed varies from day to day and from place to place. The relative importance of ion-induced nucleation, i.e. the fraction of particles formed through ion-induced nucleation, is larger in cleaner areas, where the absolute number of particles formed is smaller. Moreover, ion-induced nucleation contributes a larger fraction of particles on warmer days, when the sulfuric acid and water vapor saturation ratios are lower. This analysis will help in understanding the feedbacks associated with climate change.

Relevance: 10.00%

Abstract:

Aerosol particles affect climate, visibility, air quality and human health. However, the strength with which aerosol particles affect our everyday life is not well described or entirely understood. Therefore, investigations of different processes and phenomena are required, including e.g. primary particle sources, the initial steps of secondary particle formation and growth, the significance of charged particles in particle formation, and redistribution mechanisms in the atmosphere. In this work the sources, sinks and concentrations of air ions (charged molecules, clusters and particles) were investigated directly, by measuring air molecule ionising components (i.e. radon activity concentrations and external radiation dose rates) and charged particle size distributions, as well as through a literature review. The obtained results gave a comprehensive and valuable picture of the spatial and temporal variation of air ion sources, sinks and concentrations for use as input parameters in local- and global-scale climate models. Newly developed air ion spectrometers (Airel Ltd.) offered a possibility to investigate atmospheric (charged) particle formation and growth at sub-3 nm sizes. Therefore, new visual classification schemes for charged particle formation events were developed, and a newly developed particle growth rate method was tested on over one year of data. These data analysis methods have been widely utilised by other researchers since their introduction. This thesis revealed interesting characteristics of atmospheric particle formation and growth: e.g. particle growth may sometimes be suppressed below the detection limit (~3 nm) of traditional aerosol instruments; particle formation may take place during daytime as well as in the evening; and growth rates of sub-3 nm particles were quite constant throughout the year, while growth rates of larger particles (3-20 nm in diameter) were higher during summer than in winter. These observations were thought to be a consequence of the availability of condensing vapours. The observations of this thesis offer new understanding of particle formation in the atmosphere. However, the role of ions in particle formation, which is not well understood with current knowledge, requires further research.

Relevance: 10.00%

Abstract:

In a number of applications of computerized tomography, the ultimate goal is to detect and characterize objects within a cross section. Detection of the edges of different contrast regions yields the required information. The problem of detecting edges from projection data is addressed. It is shown that the class of linear edge detection operators used on images can be used for detection of edges directly from projection data. This not only reduces the computational burden but also avoids the difficulties of postprocessing a reconstructed image. This is accomplished by a convolution backprojection operation. For example, with the Marr-Hildreth edge detection operator, the filtering function that is to be used on the projection data is the Radon transform of the Laplacian of the 2-D Gaussian function, which is combined with the reconstruction filter. Simulation results showing the efficacy of the proposed method and a comparison with edges detected from the reconstructed image are presented.
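A minimal sketch of the approach (numpy assumed; function names are illustrative, backprojection is nearest-neighbour, and the reconstruction filter that the paper combines with the kernel is omitted): the Radon transform of the Laplacian of a 2-D Gaussian reduces to the second derivative of a 1-D Gaussian along the detector axis, so each projection is convolved with that 1-D kernel and the filtered projections are backprojected; zero crossings of the result mark edges.

```python
import numpy as np

def log_projection_filter(s, sigma):
    """Radon transform of the Laplacian of a 2-D Gaussian: the second
    derivative of a 1-D Gaussian along the detector coordinate s."""
    g = np.exp(-s**2 / (2 * sigma**2))
    return (s**2 / sigma**4 - 1 / sigma**2) * g

def filtered_backprojection_edges(sinogram, thetas, sigma=2.0):
    """Convolve each projection with the LoG-derived 1-D kernel and
    backproject (toy sketch: no reconstruction ramp filter,
    nearest-neighbour interpolation)."""
    n_det = sinogram.shape[1]
    s = np.arange(n_det) - n_det // 2
    kern = log_projection_filter(s, sigma)
    out = np.zeros((n_det, n_det))
    xs = np.arange(n_det) - n_det // 2
    X, Y = np.meshgrid(xs, xs)
    for proj, th in zip(sinogram, thetas):
        fp = np.convolve(proj, kern, mode="same")        # filter in projection domain
        t = np.rint(X * np.cos(th) + Y * np.sin(th)).astype(int) + n_det // 2
        valid = (t >= 0) & (t < n_det)                   # detector bins inside range
        out[valid] += fp[t[valid]]                       # accumulate backprojection
    return out
```

Filtering the 1-D projections before backprojection is what lets the edge operator run on far less data than a full 2-D reconstruction followed by 2-D edge detection.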