29 results for Metadata Extraction
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Summary: Estimating the degree of phosphorus saturation of aluminium and iron oxides in Finnish cultivated soils
Abstract:
Summary
Abstract:
This master's thesis investigates techniques for embedding a watermark into a spectral image, and methods for identifying and detecting watermarks in spectral images. The spectral dimensionality of the original images was reduced using the PCA (Principal Component Analysis) algorithm. The watermark was embedded into the spectral image in the transform space. In the proposed model, a component of the transform space was replaced with a linear combination of the watermark and another transform-space component. The set of parameters used in the embedding was studied. The quality of the watermarked images was measured and analysed, and recommendations for watermark embedding were given. Several methods were used for watermark identification, and the identification results were analysed. The robustness of the watermarks against various attacks was tested. A series of detection experiments was carried out, taking into account the parameters used in watermark embedding. The ICA (Independent Component Analysis) method is considered one possible alternative for watermark detection.
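The embedding model described above, in which one transform-space component is replaced by a linear combination of the watermark and another component, can be sketched as follows. The band count, component indices and blending weight are illustrative assumptions, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
bands, pixels = 16, 64 * 64
cube = rng.normal(size=(bands, pixels))        # toy spectral image: bands x pixels

mean = cube.mean(axis=1, keepdims=True)
centered = cube - mean
# Principal components of the spectral dimension via SVD
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U.T @ centered                        # component images in the transform space

watermark = rng.choice([-1.0, 1.0], size=pixels)   # toy binary watermark
alpha = 0.1                                        # embedding strength (assumed)
k, m = 5, 4                                        # replaced / reference components (assumed)
# Replace component k with a linear combination of the watermark
# and component m, as in the proposed embedding model
scores[k] = alpha * watermark + (1 - alpha) * scores[m]

watermarked = U @ scores + mean                # back to the spectral domain
print(watermarked.shape)                       # (16, 4096)
```

Because the PCA basis is orthogonal, projecting the watermarked cube back into the transform space recovers the embedded combination exactly, which is what detection can exploit.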
Abstract:
Perceiving the world visually is a basic act for humans, but for computers it is still an unsolved problem. The variability present in natural environments is an obstacle for effective computer vision. The goal of invariant object recognition is to recognise objects in a digital image despite variations in, for example, pose, lighting or occlusion. In this study, invariant object recognition is considered from the viewpoint of feature extraction. The differences between local and global features are studied with emphasis on Hough transform and Gabor filtering based feature extraction. The methods are examined with respect to four capabilities: generality, invariance, stability, and efficiency. Invariant features are presented using both Hough transform and Gabor filtering. A modified Hough transform technique is also presented where the distortion tolerance is increased by incorporating local information. In addition, methods for decreasing the computational costs of the Hough transform employing parallel processing and local information are introduced.
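As a concrete reference point for the Hough-based feature extraction discussed above, a minimal standard Hough transform for line detection can be sketched as follows; the image content, accumulator resolutions and voting scheme are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def hough_lines(edges, n_theta=180, n_rho=200):
    """Accumulate votes for lines rho = x*cos(theta) + y*sin(theta)."""
    h, w = edges.shape
    diag = np.hypot(h, w)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-diag, diag, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=np.int64)
    ys, xs = np.nonzero(edges)                  # each edge pixel votes
    for x, y in zip(xs, ys):
        r = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((r + diag) / (2 * diag) * (n_rho - 1)).astype(int)
        acc[idx, np.arange(n_theta)] += 1
    return acc, thetas, rhos

edges = np.zeros((50, 50), dtype=bool)
edges[25, :] = True                             # a horizontal line at y = 25
acc, thetas, rhos = hough_lines(edges)
i, j = np.unravel_index(acc.argmax(), acc.shape)
print(rhos[i], np.degrees(thetas[j]))           # peak near rho = 25, theta = 90 deg
```

The global character of the transform is visible here: every edge pixel votes in every theta column, which is also why the computational-cost reductions mentioned in the abstract matter.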
Abstract:
In this study we used market settlement prices of European call options on stock index futures to extract the implied probability distribution function (PDF). The method used produces a PDF of the returns of the underlying asset at the expiration date from the implied volatility smile. With this method, the assumption of a lognormal distribution (Black-Scholes model) is tested. The market view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach for implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A deviant subjective view of the PDF can be used to form a strategy, as discussed in the last section.
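The Breeden and Litzenberger (1978) relation used above states that the risk-neutral density is the discounted second derivative of the call price with respect to the strike, f(K) = e^{rT} d²C/dK². A minimal numerical sketch, using toy Black-Scholes prices in place of a smoothed market smile and assumed parameter values:

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    # Black-Scholes call price, used here only to generate a smooth C(K)
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d1 - sigma * sqrt(T))

S, r, sigma, T = 100.0, 0.02, 0.2, 0.5          # toy parameters (assumed)
K = np.linspace(60.0, 160.0, 401)               # strike grid
C = np.array([bs_call(S, k, r, sigma, T) for k in K])

dK = K[1] - K[0]
# Breeden-Litzenberger: f(K) = exp(r*T) * d2C/dK2 via finite differences
pdf = np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)
print(pdf.sum() * dK)                           # close to 1: a valid density
```

With flat (Black-Scholes) volatilities the recovered density is lognormal; applied to a real smile, the same second-derivative step exposes exactly the deviations from lognormality that the study reports for S&P 500 options.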
Abstract:
The amphiphilic nature of metal extractants causes the formation of micelles and other microscopic aggregates when in contact with water and an organic diluent. These phenomena and their effects on metal extraction were studied using carboxylic acid (Versatic 10) and organophosphorus acid (Cyanex 272) based extractants. Special emphasis was laid on the study of phase behaviour in a pre-neutralisation stage, in which the extractant is transformed into a sodium or ammonium salt form. The pre-neutralised extractants were used to extract nickel and to separate cobalt and nickel. Phase diagrams corresponding to the pre-neutralisation stage in a metal extraction process were determined. The maximal solubilisation of the components in the system water(NH3)/extractant/isooctane takes place when the molar ratio between the ammonium salt form and the free form of the extractant is 0.5 for the carboxylic acid and 1 for the organophosphorus acid extractant. These values correspond to the complex stoichiometries of NH4A·HA and NH4A, respectively. When such a solution is contacted with water, a microemulsion is formed. If the aqueous phase also contains metal ions (e.g. Ni²+), complexation will take place at the microscopic interface of the micellar aggregates. Experimental evidence was obtained showing that the initial stage of nickel extraction with pre-neutralised Versatic 10 is a fast pseudohomogeneous reaction. About 90% of the metal was extracted in the first 15 s after the initial contact. For nickel extraction with pre-neutralised Versatic 10 it was found that the highest metal loading and the lowest residual ammonia and water contents in the organic phase are achieved when the feeds are balanced so that the stoichiometry is 2NH4+(org) = Ni²+(aq).
In the case of Co/Ni separation using pre-neutralised Cyanex 272, the highest separation is achieved when the Co/extractant molar ratio in the feeds is 1:4 and, at the same time, the degree of neutralisation of the Cyanex 272 is about 50%. The adsorption of the extractants on solid surfaces may cause accumulation of fine solid particles at the interface between the aqueous and organic phases in metal extraction processes. Copper extraction processes are known to suffer from this problem. Experiments were carried out using model silica and mica particles. It was found that high copper loading, aromaticity of the diluent, modification agents and the presence of an aqueous phase decrease the adsorption of the hydroxyoxime on silica surfaces.
Abstract:
Liquid-liquid extraction is a mass transfer process for recovering desired components from a liquid stream by contacting it with an immiscible liquid solvent. The literature part of this thesis deals with the theory of liquid-liquid extraction and the main steps of extraction process design. The experimental part investigates the extraction of organic acids from aqueous solution. The aim was to find the optimal solvent for recovering the organic acids from aqueous solutions. The other objective was to test the selected solvent at pilot scale with a packed column and to compare the effectiveness of structured and random packing, the effect of dispersed phase selection, and the effect of the wettability properties of the packing material. Experiments showed that the selected solvent works well with dilute organic acid solutions. The random packing proved to be more efficient than the structured packing due to the higher hold-up of the dispersed phase. Dispersing the phase that is present in the larger volume proved to be more efficient. With the random packing, the material that was wetted by the dispersed phase was more efficient due to the higher hold-up of the dispersed phase. According to the literature, the behaviour is usually the opposite.
Abstract:
This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data, and, in this thesis, master data means reference data about customers, products etc. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company's information systems. That should bring benefits to the whole organization through improved information quality, especially in cross-system data consistency, and through more efficient and effective data management processes. As the result of this thesis, the requirements for the metadata management solution in the case company were compiled, and the success of the new information system and the implementation project was evaluated.
Abstract:
Metadata in increasing levels of sophistication has been the most powerful concept used in the management of unstructured information ever since the first librarian used the Dewey decimal system for library classification. It remains to be seen, however, what the best approach is to implementing metadata to manage huge volumes of unstructured information in a large organization. Also, once implemented, how is it possible to track whether it is adding value to the company, and whether the implementation has been successful? Existing literature on metadata seems either to focus too much on technical and quality aspects or to describe issues with respect to adoption for general information management initiatives. This research, therefore, strives to fill these gaps: to give a consolidated framework for understanding the value added by implementing metadata. The basic methodology used is that of a case study, which incorporates aspects of design science, surveys, and interviews in order to provide a holistic approach to quantitative and qualitative analysis of the case. The research identifies the various approaches to implementing metadata, particularly studying the one followed by the unit of analysis of the case study, a large company in the oil and gas sector. Of the three approaches identified, the selected company already follows an approach that appears to be superior. The researcher further explores its shortcomings and proposes a slightly modified approach that can handle them. The research categorically and thoroughly (in context) identifies the top effectiveness criteria, and the corresponding key performance indicators (KPIs) that can be measured to understand the level of advancement of the metadata management initiative in the company. In an effort to contrast and provide a basis of comparison for the findings, the research also includes views from information managers dealing with core structured data stored in ERPs and other databases.
In addition, the results include the basic criteria that can be used to evaluate metrics, in order to classify a metric as a KPI.
Abstract:
Francesca Morselli's presentation at the Europeana workshop in Helsinki, 20.11.2012.
Abstract:
Separation of carboxylic acids from aqueous streams is an important part of their manufacturing process. The aqueous solutions are usually dilute, containing less than 10% acids. Separation by distillation is difficult because the boiling points of the acids are only marginally higher than that of water. Distillation is therefore not only difficult but also expensive, due to the evaporation of large amounts of water. Carboxylic acids have traditionally been precipitated as calcium salts. The yields of these processes are usually relatively low and the chemical costs high. In particular, the decomposition of the calcium salts with sulfuric acid produces large amounts of calcium sulfate sludge. Solvent extraction has been studied as an alternative method for the recovery of carboxylic acids. Solvent extraction is based on the mixing of two immiscible liquids and the transfer of the wanted components from one liquid to the other due to an equilibrium difference. In the case of carboxylic acids, the acids are transferred from the aqueous phase to the organic solvent due to physical and chemical interactions. The acids and the extractant form complexes which are soluble in the organic phase. The extraction efficiency is affected by many factors, for instance the initial acid concentration, the type and concentration of the extractant, pH, temperature and extraction time. In this paper, the effects of initial acid concentration, type of extractant and temperature on extraction efficiency were studied. As carboxylic acids are usually the products of the processes, they need to be recovered. Hence the acids have to be removed from the organic phase after the extraction. The removal of the acids from the organic phase also regenerates the extractant, which can then be recycled in the process. The regeneration of the extractant was studied by back-extracting, i.e. stripping, the acids from the organic solution into dilute sodium hydroxide solution.
In the solvent regeneration, the regenerability of different extractants and the effect of initial acid concentration and temperature were studied.
Abstract:
The major type of non-cellulosic polysaccharides (hemicelluloses) in softwoods, the partly acetylated galactoglucomannans (GGMs), which comprise about 15% of spruce wood, have attracted growing interest because of their potential to become high-value products with applications in many areas. The main objective of this work was to explore the possibilities to extract galactoglucomannans in native, polymeric form in high yield from spruce wood with pressurised hot water, and to obtain a deeper understanding of the process chemistry involved. Spruce (Picea abies) chips and ground wood particles were extracted using an accelerated solvent extractor (ASE) in the temperature range 160 – 180°C. Detailed chemical analyses were done on both the water extracts and the wood residues. As much as 80 – 90% of the GGMs in spruce wood, i.e. about 13% based on the original wood, could be extracted from ground spruce wood with pure water at 170 – 180°C with an extraction time of 60 min. GGMs comprised about 75% of the extracted carbohydrates and about 60% of the total dissolved solids. Other substances in the water extracts were xylans, arabinogalactans, pectins, lignin and acetic acid. The yields from chips were only about 60% of those from ground wood. Both the GGMs and other non-cellulosic polysaccharides were extensively hydrolysed at severe extraction conditions, when the pH dropped to the level of 3.5. Addition of sodium bicarbonate increased the yields of polymeric GGMs at low additions, 2.5 – 5 mM, where the end pH remained around 3.9. However, at higher addition levels the yields decreased, mainly because the acetyl groups in GGMs were split off, leading to a low solubility of GGMs. Extraction with buffered water in the pH range 3.8 – 4.4 gave similar total yields to plain water, but a higher yield of polymeric GGMs. Moreover, at these pH levels the hydrolysis of acetyl groups in GGMs was significantly inhibited.
It was concluded that hot-water extraction of polymeric GGMs in good yields (up to 8% of wood) demands appropriate control of pH, in a narrow range around 4. These results were supported by a study of the hydrolysis of GGM at constant pH in the range 3.8 – 4.2, in which a kinetic model for the degradation of GGM was developed. The influence of wood particle size on hot-water extraction was studied with particles in the range 0.1 – 2 mm. The smallest particles (< 0.1 mm) gave a 20 – 40% higher total yield than the coarsest particles (1.25 – 2 mm). The difference was greatest at short extraction times. The results indicated that the extraction of GGMs and other polysaccharides is limited mainly by mass transfer in the fibre wall, and for coarse wood particles also in the wood matrix. Spruce sapwood, heartwood and thermomechanical pulp were also compared, but only small differences in the yields and composition of the extracts were found. Two methods for the isolation and purification of polymeric GGMs, i.e. membrane filtration and precipitation in ethanol-water, were compared. Filtration through a series of membranes with different pore sizes separated GGMs of different molar masses, from polymers to oligomers. Polysaccharides with molar masses higher than 4 kDa were precipitated in ethanol-water. GGMs comprised about 80% of the precipitated polysaccharides. The other polysaccharides were mainly arabinoglucuronoxylans and pectins. The ethanol-precipitated GGMs were verified by ¹³C NMR spectroscopy to be very similar to GGMs extracted from spruce wood in low yield at a much lower temperature, 90°C. The large body of experimental data obtained could be utilised for further kinetic and economic calculations to optimise the technical hot-water extraction of softwoods.
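The abstract does not give the fitted kinetic model itself; a generic first-order degradation sketch of the kind commonly used for polysaccharide hydrolysis, with an assumed rate constant, would look like:

```python
import numpy as np

def degrade(c0, k, t):
    """Concentration after time t under first-order decay dC/dt = -k*C."""
    return c0 * np.exp(-k * t)

k = 0.02                               # 1/min, illustrative rate constant (assumed)
t = np.array([0.0, 15.0, 30.0, 60.0])  # min, illustrative time points
c = degrade(1.0, k, t)                 # fraction of polymeric GGM remaining
half_life = np.log(2) / k              # about 35 min for this assumed k
print(c)
```

In the actual thesis the rate constant would be fitted separately at each constant pH in the 3.8 – 4.2 range; the values above are placeholders only.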
Abstract:
Kristiina Hormia-Poutanen's presentation at the CBUC conference in Barcelona, 12.4.2013.