57 results for Spectral Feature Extraction


Relevance: 20.00%

Abstract:

The present study investigates the spatial and spectral discrimination potential of grassland patches in the inner Turku Archipelago using Landsat Thematic Mapper (TM) satellite imagery. The spatial discrimination potential was computed through overlay analysis using official grassland parcel data and a hypothetical 30 m resolution satellite image of the site. The study found that Landsat TM imagery's ability to retrieve pure or near-pure pixels (90% purity or more) from grassland patches smaller than 1 hectare was limited to a 13% success rate, compared to 52% when upscaling the resolution to a 10 x 10 m pixel size. Additionally, the perimeter/area patch metric is proposed as a predictor of whether the spatial resolution of the input imagery is suitable: regression analysis showed a strong negative correlation between a patch's perimeter/area ratio and its pure-pixel potential. The study goes on to characterise the spectral response and discrimination potential of the five main grassland types occurring in the study area: recreational grassland, traditional pasture, modern pasture, fodder production grassland and overgrown grassland. This was done by constructing spectral response curves, a coincident spectral plot and a contingency matrix, and by calculating the transformed divergence of the spectral signatures, all based on training samples from the TM imagery. Substantial differences in spectral discrimination potential were found between imagery from the beginning of the growing season and from midsummer, because the spectral responses of the five grassland types converge as the peak of the growing season approaches. Recreational grassland shows a consistent discrimination advantage over the other grassland types, whereas modern pasture is most easily confused. Traditional pasture, perhaps the most biologically valuable grassland type, can be spectrally discriminated from the other grassland types with satisfactory success rates provided that early growing season imagery is used.
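
A minimal sketch of the regression step described above, using hypothetical patch metrics (the thesis's actual parcel data and overlay computation are not reproduced here):

```python
import numpy as np

# Hypothetical patch metrics: perimeter (m), area (m^2), and the fraction
# of 30 m pixels falling >= 90% inside each patch ("pure pixel potential").
perimeter = np.array([400.0, 620.0, 900.0, 1500.0, 2400.0])
area      = np.array([9000.0, 12000.0, 15000.0, 20000.0, 26000.0])
pure_frac = np.array([0.45, 0.30, 0.18, 0.08, 0.02])

pa_ratio = perimeter / area  # the perimeter/area patch metric

# Simple linear regression of pure-pixel fraction on the P/A ratio.
slope, intercept = np.polyfit(pa_ratio, pure_frac, 1)
r = np.corrcoef(pa_ratio, pure_frac)[0, 1]
print(f"slope={slope:.1f}, intercept={intercept:.2f}, r={r:.2f}")
```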

Relevance: 20.00%

Abstract:

Local features are used in many computer vision tasks, including visual object categorization, content-based image retrieval and object recognition, to mention a few. Local features are points, blobs or regions in images that are extracted using a local feature detector. To make use of the extracted local features, the localized interest points are described using a local feature descriptor. A descriptor histogram vector is a compact representation of an image and can be used for searching and matching images in databases. In this thesis the performance of local feature detectors and descriptors is evaluated for the object class detection task. Features are extracted from image samples belonging to several object classes, and matching features are then searched for using random image pairs of the same class. The goal of this thesis is to find out which detector and descriptor methods are best for this task in terms of detector repeatability and descriptor matching rate.
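
To make the detect-describe-match pipeline concrete, here is a minimal OpenCV sketch; SIFT stands in for whichever detector/descriptor pair is evaluated, and the match-rate definition below is a simplifying assumption, not the thesis's exact protocol:

```python
import cv2

def match_rate(img_path1, img_path2, ratio=0.75):
    """Detect SIFT keypoints, describe them, and count ratio-test matches."""
    img1 = cv2.imread(img_path1, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img_path2, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)  # local features + descriptors
    kp2, des2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    # Lowe's ratio test: keep a match only if clearly better than the runner-up.
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < ratio * n.distance]
    return len(good) / max(1, min(len(kp1), len(kp2)))
```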

Relevance: 20.00%

Abstract:

With the increasing use of digital media, methods for multimedia protection have become extremely important. The number of solutions to the problem, from encryption to watermarking, is large and grows every year. This work considers digital image watermarking, specifically a novel method for digital watermarking of color and spectral images. An overview of existing methods for watermarking color and grayscale images is given. Methods using independent component analysis (ICA) for detection, and methods using the discrete wavelet transform (DWT) and discrete cosine transform (DCT), are considered in more detail. The novel watermarking method proposed here allows a color or spectral watermark image to be embedded into a color or spectral image, respectively, and the watermark to be successfully extracted from the resulting watermarked image. A number of experiments were performed on the quality of extraction as a function of the parameters of the embedding procedure. Another set of experiments tested the robustness of the proposed algorithm using three techniques: the median filter, the low-pass filter (LPF) and the discrete cosine transform (DCT), which are part of the widely known StirMark image watermarking robustness test. The study shows that the proposed watermarking technique is fragile, i.e. the watermark is altered by simple image processing operations. Moreover, we found that the contents of the image to be watermarked do not affect the quality of the extraction. The mixing coefficients, which determine the amounts of the key and watermark images in the result, should not exceed 1% of the original. The proposed algorithm has proven successful in the task of watermark embedding and extraction.
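
A minimal sketch of the linear-mixing idea behind such embedding, under the simplifying assumption that the key and host image are available at extraction time (the ICA/DWT/DCT machinery of the actual method is not reproduced here):

```python
import numpy as np

def embed(image, key, watermark, a=0.01, b=0.01):
    """Mix small fractions of the key and watermark images into the host.
    a and b play the role of the <=1% mixing coefficients in the abstract."""
    return image + a * key + b * watermark

def extract(watermarked, host, key, a=0.01, b=0.01):
    """Invert the linear mix when the host and key images are known."""
    return (watermarked - host - a * key) / b

rng = np.random.default_rng(0)
img, key, wm = (rng.random((64, 64)) for _ in range(3))
assert np.allclose(extract(embed(img, key, wm), img, key), wm)
```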

Relevance: 20.00%

Abstract:

This study focuses on feature selection in classification problems. The role of feature selection methods is to select important features by discarding redundant and irrelevant features in the data set; we investigated this using fuzzy entropy measures. We developed a fuzzy entropy based feature selection method using Yu's similarity and tested it with a similarity classifier, also based on Yu's similarity, on a real-world dermatology data set. Performing feature selection based on fuzzy entropy measures before classification gave very promising empirical results: the highest classification accuracy, 98.83%, was achieved when applying our similarity measure to the data set. The results were then compared with results previously obtained with other similarity classifiers, and they show better accuracy than those achieved before. The methods used helped to reduce the dimensionality of the data set and to speed up the computation time of the learning algorithm, and thereby simplified the classification task.
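
A minimal sketch of entropy-based feature ranking, using the classical De Luca-Termini fuzzy entropy as a stand-in for the measure used in the thesis; memberships are assumed to come from min-max scaling of each feature:

```python
import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    """De Luca-Termini fuzzy entropy of membership values mu in [0, 1]."""
    mu = np.clip(mu, eps, 1.0 - eps)
    return -np.mean(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))

def rank_features(X):
    """Scale each feature to [0, 1] and rank by fuzzy entropy.
    Features with the highest entropy are treated as least informative."""
    mins, maxs = X.min(axis=0), X.max(axis=0)
    M = (X - mins) / np.maximum(maxs - mins, 1e-12)
    scores = np.array([fuzzy_entropy(M[:, j]) for j in range(X.shape[1])])
    return np.argsort(scores)  # indices from lowest to highest entropy
```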

Relevance: 20.00%

Abstract:

Singular Value Decomposition (SVD), Principal Component Analysis (PCA) and Multiple Linear Regression (MLR) are some of the mathematical preliminaries discussed prior to explaining the PLS and PCR models. Both PLS and PCR are applied to real spectral data, and their differences and similarities are discussed in this thesis. The challenge lies in establishing the optimum number of components to include in either model; this is overcome by using the various diagnostic tools suggested in this thesis. Correspondence analysis (CA) and PLS were also applied to ecological data; the idea of CA was to correlate macrophyte species and lakes. The differences between the PLS model for ecological data and the PLS model for spectral data are noted and explained in this thesis.
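
A minimal scikit-learn sketch of the key diagnostic mentioned above, choosing the number of components for PLS and PCR by cross-validated prediction error (X and y are hypothetical spectra and responses):

```python
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def best_n_components(X, y, max_comp=10):
    """Pick the component count minimising cross-validated MSE
    over both the PLS and PCR models."""
    results = {}
    for n in range(1, max_comp + 1):
        pls = PLSRegression(n_components=n)
        pcr = make_pipeline(PCA(n_components=n), LinearRegression())
        results[n] = tuple(
            -cross_val_score(m, X, y, cv=5,
                             scoring="neg_mean_squared_error").mean()
            for m in (pls, pcr))
    return min(results, key=lambda n: min(results[n]))
```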

Relevance: 20.00%

Abstract:

Green IT is a term covering various tasks and concepts related to reducing the environmental impact of IT. At the enterprise level, Green IT has significant potential to generate sustainable cost savings: the total number of devices is growing and electricity prices are rising. The lifecycle of a computer can be made more environmentally sustainable using Green IT, e.g. by using energy-efficient components and by implementing device power management. The challenge in using power management at the enterprise level is how to measure and follow up the impact of power management policies. In this thesis, a power management feature was developed for a configuration management system. The feature can be used to automatically power PCs down and on according to a pre-defined schedule and to estimate the total power usage of the devices. Measurements indicate that with this feature device power consumption can be monitored quite precisely and reduced, which generates electricity cost savings and reduces the environmental impact of IT.
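
A minimal back-of-the-envelope sketch of the savings estimate such a feature can report; all power draws, fleet size and prices below are assumptions for illustration:

```python
# Rough estimate of electricity saved by a nightly power-down schedule.
ACTIVE_W, OFF_W = 80.0, 2.0       # assumed PC power draw on / off (watts)
DEVICES = 500                      # assumed fleet size
OFF_HOURS_PER_DAY = 14             # e.g. powered down 18:00-08:00
PRICE_EUR_PER_KWH = 0.15           # assumed electricity price

saved_kwh_per_year = (DEVICES * (ACTIVE_W - OFF_W)
                      * OFF_HOURS_PER_DAY * 365) / 1000.0
print(f"~{saved_kwh_per_year:,.0f} kWh/year, "
      f"~{saved_kwh_per_year * PRICE_EUR_PER_KWH:,.0f} EUR/year")
```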

Relevance: 20.00%

Abstract:

Reliable detection of intrapartum fetal acidosis is crucial for preventing morbidity. Hypoxia-related changes in fetal heart rate variability (FHRV) are controlled by the autonomic nervous system. Subtle changes in FHRV that cannot be identified by visual inspection can be detected and quantified by power spectral analysis: sympathetic activity relates to low-frequency FHRV and parasympathetic activity to both low- and high-frequency FHRV. The aim was to study whether intrapartum fetal acidosis can be detected by analyzing the spectral powers of FHRV, and whether the spectral powers associate with hypoxia-induced changes in the fetal electrocardiogram and with the pH of fetal blood samples taken intrapartum. The FHRV of 817 R-R interval recordings, collected as part of European multicenter studies, was analyzed. Acidosis was defined as cord pH ≤ 7.05 or scalp pH ≤ 7.20, and metabolic acidosis as cord pH ≤ 7.05 and base deficit ≥ 12 mmol/l. Intrapartum hypoxia increased the spectral powers of FHRV. As fetal acidosis deepened, FHRV decreased: fetuses with significant birth acidosis had, after an initial increase, a drop in spectral powers near delivery, suggesting a breakdown of fetal compensation. Furthermore, a change in excess of 30% in the low-to-high frequency ratio of FHRV was associated with fetal metabolic acidosis. The results suggest that a decrease in the spectral powers of FHRV signals concern for fetal wellbeing. A single measure alone cannot be used to reveal fetal hypoxia, since the spectral powers vary widely within individuals. With technical developments, continuous assessment of intra-individual changes in the spectral powers of FHRV might aid in the detection of fetal compromise due to hypoxia.
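
A minimal sketch of how spectral powers are typically computed from an R-R interval series; the LF/HF band limits used here are generic heart-rate-variability conventions, not necessarily the fetal-specific bands of the study:

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_powers(rr_s, fs=4.0, lf=(0.04, 0.15), hf=(0.15, 0.4)):
    """Resample an R-R tachogram evenly, then integrate Welch band powers."""
    t = np.cumsum(rr_s)                      # beat times (s)
    grid = np.arange(t[0], t[-1], 1.0 / fs)  # uniform time grid
    rr_even = interp1d(t, rr_s, kind="cubic")(grid)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    band = lambda lo, hi: np.trapz(pxx[(f >= lo) & (f < hi)],
                                   f[(f >= lo) & (f < hi)])
    lf_p, hf_p = band(*lf), band(*hf)
    return lf_p, hf_p, lf_p / hf_p  # low, high, and low-to-high ratio
```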

Relevance: 20.00%

Abstract:

Energy efficiency is one of the major objectives that must be achieved in order to use the world's limited energy resources in a sustainable way. Since radiative heat transfer is the dominant heat transfer mechanism in most fossil fuel combustion systems, more accurate insight and models can improve the energy efficiency of newly designed combustion systems. The radiative properties of combustion gases are highly wavelength dependent, and better models for calculating them are needed in the modeling of large-scale industrial combustion systems. With detailed knowledge of the spectral radiative properties of gases, the modeling of combustion processes in different applications can be made more accurate. In order to propose a new method for effective non-gray modeling of radiative heat transfer in combustion systems, different models for the spectral properties of gases, including the SNBM, EWBM and WSGGM, were studied in this research. Building on this detailed analysis of the different approaches, the thesis presents new methods for gray and non-gray radiative heat transfer modeling in homogeneous and inhomogeneous H2O–CO2 mixtures at atmospheric pressure. The proposed method is able to support the modeling of a wide range of combustion systems, including the oxy-fired combustion scenario. The new methods are based on pre-obtained correlations for the total emissivity and band absorption coefficient of H2O–CO2 mixtures at different temperatures, gas compositions and optical path lengths. They can easily be used within any commercial CFD software for radiative heat transfer modeling, resulting in more accurate, simple and fast calculations. The new methods were successfully used in CFD modeling by applying them to an industrial-scale backpass channel under oxy-fired conditions. The developed approaches are more accurate than other methods; moreover, they can provide a complete explanation and detailed analysis of radiative heat transfer in different systems under different combustion conditions. The methods were verified against several benchmarks and showed a good level of accuracy and computational speed compared to other methods. Furthermore, implementing the suggested banded approach in CFD software is easy and straightforward.
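
The WSGGM mentioned above expresses the total gas emissivity as a weighted sum over a few gray gases, eps = sum_i a_i(T) * (1 - exp(-k_i * pL)). A minimal sketch with placeholder coefficients (real a_i(T) polynomials and absorption coefficients k_i must come from fitted correlations such as those developed in the thesis):

```python
import numpy as np

def wsgg_emissivity(T, pL, a_coeffs, k):
    """Total emissivity eps = sum_i a_i(T) * (1 - exp(-k_i * pL)).

    a_coeffs: per-gas polynomial coefficients in T for the weights a_i(T)
    k:        per-gas pressure absorption coefficients (1/(atm*m))
    pL:       partial-pressure path length (atm*m)
    """
    a = np.array([np.polyval(c, T) for c in a_coeffs])
    return float(np.sum(a * (1.0 - np.exp(-np.array(k) * pL))))

# Placeholder 3-gray-gas example (coefficients are illustrative only).
eps = wsgg_emissivity(T=1200.0, pL=0.5,
                      a_coeffs=[[1e-4, 0.2], [5e-5, 0.3], [-1e-4, 0.4]],
                      k=[0.5, 5.0, 50.0])
```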

Relevance: 20.00%

Abstract:

Separation of carboxylic acids from aqueous streams is an important part of their manufacturing process. The aqueous solutions are usually dilute, containing less than 10% acids. Separation by distillation is difficult because the boiling points of the acids are only marginally higher than that of water; it is therefore not only difficult but also expensive, owing to the evaporation of large amounts of water. Carboxylic acids have traditionally been precipitated as calcium salts, but the yields of these processes are usually relatively low and the chemical costs high. In particular, the decomposition of the calcium salts with sulfuric acid produces large amounts of calcium sulfate sludge. Solvent extraction has been studied as an alternative method for the recovery of carboxylic acids. Solvent extraction is based on mixing two immiscible liquids and transferring the desired components from one liquid to the other, driven by the equilibrium difference. In the case of carboxylic acids, the acids are transferred from the aqueous phase to the organic solvent through physical and chemical interactions: the acids and the extractant form complexes that are soluble in the organic phase. The extraction efficiency is affected by many factors, for instance the initial acid concentration, the type and concentration of the extractant, pH, temperature and extraction time. In this work, the effects of initial acid concentration, type of extractant and temperature on extraction efficiency were studied. As carboxylic acids are usually the products of the process, they need to be recovered, so the acids have to be removed from the organic phase after the extraction. Removing the acids from the organic phase also regenerates the extractant, which can then be recycled in the process. The regeneration of the extractant was studied by back-extracting, i.e. stripping, the acids from the organic solution into dilute sodium hydroxide solution. In the solvent regeneration, the regenerability of different extractants and the effects of initial acid concentration and temperature were studied.
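
A minimal sketch of the equilibrium arithmetic behind such extraction experiments, reducing the complexation chemistry to a single assumed distribution ratio D:

```python
def extraction_efficiency(D, v_org, v_aq, stages=1):
    """Fraction of acid transferred to the organic phase after
    `stages` cross-current contacts with fresh solvent, for a
    distribution ratio D = [acid]_org / [acid]_aq at equilibrium."""
    remaining_aq = 1.0
    for _ in range(stages):
        # Mass balance for one equilibrium stage with fresh solvent.
        remaining_aq *= 1.0 / (1.0 + D * v_org / v_aq)
    return 1.0 - remaining_aq

# E.g. three contacts at equal phase volumes with an assumed D of 2:
print(extraction_efficiency(D=2.0, v_org=1.0, v_aq=1.0, stages=3))
```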

Relevance: 20.00%

Abstract:

Changes in the electroencephalography (EEG) signal have been used to study the effects of anesthetic agents on brain function. Several commercial EEG-based anesthesia depth monitors have been developed to measure the level of the hypnotic component of anesthesia. Specific anesthetic-related changes can be seen in the EEG, but it still remains difficult to determine whether the subject is conscious or not during anesthesia; EEG reactivity to external stimuli may be seen in unconscious subjects, in anesthesia or even in coma. Changes in regional cerebral blood flow, which can be measured with positron emission tomography (PET), can be used as a surrogate for changes in neuronal activity. The aim of this study was to investigate the effects of dexmedetomidine, propofol, sevoflurane and xenon on the EEG and on the behavior of two commercial anesthesia depth monitors, Bispectral Index (BIS) and Entropy. Slowly escalating drug concentrations were used with dexmedetomidine, propofol and sevoflurane. EEG reactivity at a clinically determined similar level of consciousness was studied, and the performance of BIS and Entropy in differentiating consciousness from unconsciousness was evaluated. Changes in brain activity during emergence from dexmedetomidine- and propofol-induced unconsciousness were studied using PET imaging. Additionally, the effects on the EEG of normobaric hyperoxia, induced during denitrogenation prior to xenon anesthesia induction, were studied. Dexmedetomidine and propofol increased low-frequency, high-amplitude (delta 0.5-4 Hz and theta 4.1-8 Hz) EEG activity during stepwise increases in drug concentration from the awake state to unconsciousness. With sevoflurane, an increase in delta activity was also seen, and an increase in alpha-slow beta (8.1-15 Hz) band power was seen with both propofol and sevoflurane. EEG reactivity to a verbal command in the unconscious state was best retained with propofol and almost disappeared with sevoflurane. The ability of BIS and Entropy to differentiate consciousness from unconsciousness was poor. At emergence from dexmedetomidine- and propofol-induced unconsciousness, activation was detected in deep brain structures but not within the cortex. In xenon anesthesia, EEG band powers increased in the delta, theta and alpha (8-12 Hz) frequencies. In steady-state xenon anesthesia, BIS and Entropy indices were low, and these monitors seemed to work well. Normobaric hyperoxia alone did not cause changes in the EEG. All of these results are based on studies in healthy volunteers, and their application to clinical practice should be considered carefully.
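
A minimal sketch of the band-power computation underlying such analyses, using the band limits quoted in the abstract (the commercial BIS and Entropy algorithms are proprietary and not reproduced here):

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4.0), "theta": (4.1, 8.0), "alpha_slow_beta": (8.1, 15.0)}

def relative_band_powers(eeg, fs=256.0):
    """Relative power in each EEG band from a Welch spectrum."""
    f, pxx = welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4 s segments
    total = np.trapz(pxx[(f >= 0.5) & (f <= 30.0)],
                     f[(f >= 0.5) & (f <= 30.0)])
    return {name: np.trapz(pxx[(f >= lo) & (f <= hi)],
                           f[(f >= lo) & (f <= hi)]) / total
            for name, (lo, hi) in BANDS.items()}
```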

Relevance: 20.00%

Abstract:

Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction – created by R.-J. Back – is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can be easily reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. This thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion covers various aspects of software development that relate to Stepwise Feature Introduction; more specifically, we evaluate the paradigm against the common practices of object-oriented programming and design and of agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
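
A minimal sketch of the layered structure that Stepwise Feature Introduction produces, under the common reading that each layer extends the previous one while preserving its behavior; the editor example is hypothetical:

```python
class BasicEditor:
    """Layer 1: the minimal feature -- holding and inserting text."""
    def __init__(self):
        self.text = ""

    def insert(self, s):
        self.text += s

class UndoEditor(BasicEditor):
    """Layer 2: adds undo on top of layer 1 without breaking it."""
    def __init__(self):
        super().__init__()
        self._history = []

    def insert(self, s):
        self._history.append(self.text)  # remember state, then refine
        super().insert(s)

    def undo(self):
        if self._history:
            self.text = self._history.pop()
```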

Relevance: 20.00%

Abstract:

The major type of non-cellulosic polysaccharides (hemicelluloses) in softwoods, the partly acetylated galactoglucomannans (GGMs), which comprise about 15% of spruce wood, have attracted growing interest because of their potential to become high-value products with applications in many areas. The main objective of this work was to explore the possibilities of extracting galactoglucomannans in native, polymeric form and in high yield from spruce wood with pressurised hot water, and to obtain a deeper understanding of the process chemistry involved. Spruce (Picea abies) chips and ground wood particles were extracted using an accelerated solvent extractor (ASE) in the temperature range 160 – 180°C. Detailed chemical analyses were done on both the water extracts and the wood residues. As much as 80 – 90% of the GGMs in spruce wood, i.e. about 13% based on the original wood, could be extracted from ground spruce wood with pure water at 170 – 180°C with an extraction time of 60 min. GGMs comprised about 75% of the extracted carbohydrates and about 60% of the total dissolved solids. Other substances in the water extracts were xylans, arabinogalactans, pectins, lignin and acetic acid. The yields from chips were only about 60% of those from ground wood. Both the GGMs and the other non-cellulosic polysaccharides were extensively hydrolysed under severe extraction conditions, when the pH dropped to about 3.5. Addition of sodium bicarbonate increased the yields of polymeric GGMs at low additions, 2.5 – 5 mM, where the end pH remained around 3.9; at higher addition levels, however, the yields decreased, mainly because the acetyl groups in the GGMs were split off, leading to low solubility of the GGMs. Extraction with buffered water in the pH range 3.8 – 4.4 gave similar total yields as plain water, but a higher yield of polymeric GGMs. Moreover, at these pH levels the hydrolysis of the acetyl groups in GGMs was significantly inhibited. It was concluded that hot-water extraction of polymeric GGMs in good yields (up to 8% of wood) demands appropriate control of pH within a narrow range around 4. These results were supported by a study of the hydrolysis of GGM at constant pH in the range 3.8 – 4.2, in which a kinetic model for the degradation of GGM was developed. The influence of wood particle size on hot-water extraction was studied with particles in the range 0.1 – 2 mm. The smallest particles (< 0.1 mm) gave a 20 – 40% higher total yield than the coarsest particles (1.25 – 2 mm); the difference was greatest at short extraction times. The results indicated that the extraction of GGMs and other polysaccharides is limited mainly by mass transfer in the fibre wall, and for coarse wood particles also in the wood matrix. Spruce sapwood, heartwood and thermomechanical pulp were also compared, but only small differences in the yields and compositions of the extracts were found. Two methods for the isolation and purification of polymeric GGMs, membrane filtration and precipitation in ethanol-water, were compared. Filtration through a series of membranes with different pore sizes separated GGMs of different molar masses, from polymers to oligomers. Polysaccharides with a molar mass higher than 4 kDa were precipitated in ethanol-water; GGMs comprised about 80% of the precipitated polysaccharides, the others being mainly arabinoglucuronoxylans and pectins. The ethanol-precipitated GGMs were verified by 13C NMR spectroscopy to be very similar to GGMs extracted from spruce wood in low yield at a much lower temperature, 90°C.
The large body of experimental data obtained could be utilised for further kinetic and economic calculations to optimise the technical hot-water extraction of softwoods.
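
A minimal sketch of a constant-pH degradation model of the kind mentioned above, assuming first-order hydrolysis with an acid-catalysed, Arrhenius-type rate constant; the rate law and all parameter values are illustrative, not the thesis's fitted model:

```python
import numpy as np

def ggm_remaining(t_min, T_K, pH, A=1e16, Ea=120e3, R=8.314):
    """Fraction of polymeric GGM remaining after t_min minutes, for
    first-order hydrolysis dC/dt = -k C with an illustrative rate law
    k = A * [H+] * exp(-Ea / (R T))."""
    k = A * 10.0 ** (-pH) * np.exp(-Ea / (R * T_K))  # 1/min
    return np.exp(-k * t_min)

# E.g. 60 min at 170 C (443 K): pH 4.0 vs pH 3.5.
print(ggm_remaining(60, 443.15, 4.0), ggm_remaining(60, 443.15, 3.5))
```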

Relevance: 20.00%

Abstract:

Existing spectral difference measures do not correspond sufficiently to the CIEDE2000 colour difference. The aim of this work was to implement a method that computes the difference between colour spectra such that the result corresponds to the CIEDE2000 colour difference. The development work resulted in a method based on precomputed differences between known spectra and on computation parameters derived from them. The method can compute spectral differences only between spectra that are obtained by mixing the known spectra. Computing the parameters is a laborious process, so the computation was distributed across several computers. The method was made to correspond well to CIEDE2000 for most spectra, with only a few exceptions; the problems stem from a mathematical property of the model. The spectral difference measure gives a non-zero value for metameric spectra even though CIEDE2000 gives zero, which demonstrates that the spectral difference measure behaves more correctly than the CIEDE2000 colour difference.
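
A minimal sketch of the precomputed-differences idea described above, under strong simplifying assumptions: mixing weights are recovered by least squares, and the difference of two mixtures is approximated by a hypothetical energy-distance style surrogate over the precomputed pairwise CIEDE2000 matrix (not the thesis's fitted computation parameters):

```python
import numpy as np

def mixture_weights(spectrum, basis):
    """Least-squares weights expressing a spectrum as a mix of known spectra.
    basis: (n_known, n_wavelengths) array; weights should sum to ~1."""
    w, *_ = np.linalg.lstsq(basis.T, spectrum, rcond=None)
    return w

def approx_ciede2000(s1, s2, basis, D):
    """Hypothetical surrogate built from the precomputed pairwise matrix
    D[i, j] = CIEDE2000(basis_i, basis_j); zero when the mixes coincide."""
    w1, w2 = mixture_weights(s1, basis), mixture_weights(s2, basis)
    return float(2 * (w1 @ D @ w2) - w1 @ D @ w1 - w2 @ D @ w2)
```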

Relevance: 20.00%

Abstract:

This thesis reviews the theoretical foundations of support vector machines and investigates the effect of different parameters on the classification of spectral data.
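
A minimal scikit-learn sketch of such a parameter study: a cross-validated search over the RBF-kernel SVM's C and gamma on spectral data (X and y are hypothetical):

```python
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def tune_svm(X, y):
    """Cross-validated search over the two RBF-SVM parameters whose
    effect on spectral classification the thesis investigates."""
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    grid = {"svc__C": [0.1, 1, 10, 100],
            "svc__gamma": [1e-4, 1e-3, 1e-2, 1e-1]}
    search = GridSearchCV(pipe, grid, cv=5)
    return search.fit(X, y)
```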