30 results for Parameter extraction
Abstract:
Parameter estimation remains a challenge in many important applications, and there is a need for methods that exploit the growing capabilities of modern computational systems. For this reason, various kinds of Evolutionary Algorithms are becoming a particularly promising field of research. The main aim of this thesis is to explore theoretical aspects of a specific member of the Evolutionary Algorithms class, the Differential Evolution (DE) method, and to implement the algorithm as code capable of solving a large range of problems. Matlab, a numerical computing environment provided by MathWorks Inc., has been used for this purpose. Our implementation empirically demonstrates the benefits of stochastic optimizers over deterministic optimizers in the case of stochastic and chaotic problems. Furthermore, the advanced features of Differential Evolution are discussed and taken into account in the Matlab implementation. Toy-case test examples are presented to show the advantages and disadvantages introduced by extensions of the basic algorithm. Another aim of this thesis is to apply the DE approach to the parameter estimation problem for systems exhibiting chaotic behavior, where the well-known Lorenz system with a specific set of parameter values is taken as an example. Finally, the DE approach for estimating chaotic dynamics is compared to the Ensemble prediction and parameter estimation system (EPPES) approach, which was recently proposed as a possible solution for similar problems.
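The basic DE scheme referred to above can be summarized in a short sketch. The following is a minimal, illustrative DE/rand/1/bin implementation (not the thesis's Matlab code), applied to a toy least-squares parameter estimation problem; the population size, F and CR are arbitrary textbook choices.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin: mutation, binomial crossover, greedy selection."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    cost = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             size=3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)      # mutation
            cross = rng.random(dim) < CR                   # binomial crossover
            cross[rng.integers(dim)] = True                # keep >= 1 mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= cost[i]:                         # greedy selection
                pop[i], cost[i] = trial, f_trial
    best = int(np.argmin(cost))
    return pop[best], cost[best]

# Toy parameter estimation: recover (a, b) of y = a*x + b by least squares
xs = np.linspace(0.0, 1.0, 20)
ys = 2.5 * xs - 1.0
sse = lambda p: float(np.sum((p[0] * xs + p[1] - ys) ** 2))
p_best, c_best = differential_evolution(sse, [(-10, 10), (-10, 10)])
```

For smooth deterministic problems like this one a gradient method would be faster; the appeal of DE, as discussed in the thesis, is that the same code works unchanged on noisy and chaotic objective functions.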
Abstract:
To obtain the desired accuracy of a robot, two techniques are available. The first option is to make the robot match the nominal mathematical model; in other words, the manufacturing and assembly tolerances of every part are made extremely tight so that all of the various parameters match the “design” or “nominal” values as closely as possible. This method can satisfy most accuracy requirements, but the cost increases dramatically as the accuracy requirement increases. Alternatively, a more cost-effective solution is to build a manipulator with relaxed manufacturing and assembly tolerances and to compensate the actual errors of the robot by modifying the mathematical model in the controller. This is the essence of robot calibration. Simply put, robot calibration is the process of defining an appropriate error model and then identifying the various parameter errors that make the error model match the robot as closely as possible. This work focuses on kinematic calibration of a 10 degree-of-freedom (DOF) redundant serial-parallel hybrid robot. The robot consists of a 4-DOF serial mechanism and a 6-DOF hexapod parallel manipulator. The redundant 4-DOF serial structure is used to enlarge the workspace, and the 6-DOF hexapod manipulator provides high load capacity and stiffness for the whole structure. The main objective of the study is to develop a suitable calibration method to improve the accuracy of the redundant serial-parallel hybrid robot. To this end, a Denavit–Hartenberg (DH) hybrid error model and a Product-of-Exponentials (POE) error model are developed for error modeling of the proposed robot. Furthermore, two kinds of global identification methods, i.e. the differential-evolution (DE) algorithm and the Markov chain Monte Carlo (MCMC) algorithm, are employed to identify the parameter errors of the derived error model.
A measurement method based on a 3-2-1 wire-based pose estimation system is proposed and implemented in a SolidWorks environment to simulate the real experimental validations. Numerical simulations and SolidWorks prototype-model validations are carried out on the hybrid robot to verify the effectiveness, accuracy and robustness of the calibration algorithms.
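The identification step, fitting parameter errors so that the error model matches measurements, can be illustrated on a much simpler mechanism than the 10-DOF hybrid robot. The sketch below is hypothetical: a planar 2R arm whose two link-length errors are identified from simulated end-effector measurements by Gauss-Newton least squares (the thesis itself uses DE and MCMC on DH/POE error models); the link lengths and error values are made up.

```python
import numpy as np

def fk(q, lengths):
    """Forward kinematics of a planar 2R arm: end-effector (x, y)."""
    l1, l2 = lengths
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

rng = np.random.default_rng(1)
nominal = np.array([1.0, 0.8])                 # "design" link lengths
actual = nominal + np.array([0.01, -0.005])    # unknown manufacturing errors

# Simulated end-effector measurements at 30 random joint configurations
qs = rng.uniform(-np.pi, np.pi, size=(30, 2))
def residual(dl):
    return np.concatenate([fk(q, nominal + dl) - fk(q, actual) for q in qs])

# Gauss-Newton identification of the length errors dl (finite-difference Jacobian)
dl = np.zeros(2)
for _ in range(5):
    r = residual(dl)
    J = np.stack([(residual(dl + e) - r) / 1e-6
                  for e in np.eye(2) * 1e-6], axis=1)
    dl = dl - np.linalg.lstsq(J, r, rcond=None)[0]
```

The identified `dl` reproduces the injected errors; in the real calibration problem the error model has many more parameters and the measurements are noisy, which motivates the global DE and MCMC methods.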
Abstract:
Separation of carboxylic acids from aqueous streams is an important part of their manufacturing process. The aqueous solutions are usually dilute, containing less than 10% acids. Separation by distillation is difficult because the boiling points of the acids are only marginally higher than that of water; distillation is therefore not only difficult but also expensive, owing to the evaporation of large amounts of water. Carboxylic acids have traditionally been precipitated as calcium salts. The yields of these processes are usually relatively low and the chemical costs high; in particular, the decomposition of calcium salts with sulfuric acid produces large amounts of calcium sulfate sludge. Solvent extraction has been studied as an alternative method for the recovery of carboxylic acids. Solvent extraction is based on mixing two immiscible liquids so that the desired components transfer from one liquid to the other due to an equilibrium difference. In the case of carboxylic acids, the acids are transferred from the aqueous phase to an organic solvent through physical and chemical interactions: the acids and the extractant form complexes which are soluble in the organic phase. The extraction efficiency is affected by many factors, for instance the initial acid concentration, the type and concentration of the extractant, pH, temperature and extraction time. In this work, the effects of initial acid concentration, type of extractant and temperature on extraction efficiency were studied. As carboxylic acids are usually the desired products of the processes, they need to be recovered, so the acids have to be removed from the organic phase after the extraction. Removing the acids from the organic phase also regenerates the extractant, which can then be recycled in the process. The regeneration of the extractant was studied by back-extracting, i.e. stripping, the acids from the organic solution into a dilute sodium hydroxide solution.
In the solvent-regeneration experiments, the regenerability of different extractants and the effects of initial acid concentration and temperature were studied.
Abstract:
The major type of non-cellulosic polysaccharides (hemicelluloses) in softwoods, the partly acetylated galactoglucomannans (GGMs), which comprise about 15% of spruce wood, have attracted growing interest because of their potential to become high-value products with applications in many areas. The main objective of this work was to explore the possibilities of extracting galactoglucomannans in native, polymeric form and in high yield from spruce wood with pressurised hot water, and to obtain a deeper understanding of the process chemistry involved. Spruce (Picea abies) chips and ground wood particles were extracted using an accelerated solvent extractor (ASE) in the temperature range 160 – 180°C. Detailed chemical analyses were done on both the water extracts and the wood residues. As much as 80 – 90% of the GGMs in spruce wood, i.e. about 13% based on the original wood, could be extracted from ground spruce wood with pure water at 170 – 180°C with an extraction time of 60 min. GGMs comprised about 75% of the extracted carbohydrates and about 60% of the total dissolved solids. Other substances in the water extracts were xylans, arabinogalactans, pectins, lignin and acetic acid. The yields from chips were only about 60% of those from ground wood. Both the GGMs and other non-cellulosic polysaccharides were extensively hydrolysed under severe extraction conditions, when the pH dropped to about 3.5. At low addition levels (2.5 – 5 mM), where the end pH remained around 3.9, addition of sodium bicarbonate increased the yields of polymeric GGMs. However, at higher addition levels the yields decreased, mainly because the acetyl groups in GGMs were split off, leading to a low solubility of GGMs. Extraction with buffered water in the pH range 3.8 – 4.4 gave yields similar to those with plain water, but a higher yield of polymeric GGMs. Moreover, at these pH levels the hydrolysis of acetyl groups in GGMs was significantly inhibited.
It was concluded that hot-water extraction of polymeric GGMs in good yields (up to 8% of wood) demands appropriate control of pH within a narrow range around 4. These results were supported by a study of GGM hydrolysis at constant pH in the range 3.8 – 4.2, from which a kinetic model for the degradation of GGM was developed. The influence of wood particle size on hot-water extraction was studied with particles in the range 0.1 – 2 mm. The smallest particles (< 0.1 mm) gave a 20 – 40% higher total yield than the coarsest particles (1.25 – 2 mm). The difference was greatest at short extraction times. The results indicated that extraction of GGMs and other polysaccharides is limited mainly by mass transfer in the fibre wall, and for coarse wood particles also in the wood matrix. Spruce sapwood, heartwood and thermomechanical pulp were also compared, but only small differences in the yields and composition of the extracts were found. Two methods for isolation and purification of polymeric GGMs, membrane filtration and precipitation in ethanol-water, were compared. Filtration through a series of membranes with different pore sizes separated GGMs of different molar masses, from polymers to oligomers. Polysaccharides with molar mass higher than 4 kDa were precipitated in ethanol-water. GGMs comprised about 80% of the precipitated polysaccharides; the others were mainly arabinoglucuronoxylans and pectins. The ethanol-precipitated GGMs were verified by 13C NMR spectroscopy to be very similar to GGMs extracted from spruce wood in low yield at a much lower temperature, 90°C. The large body of experimental data obtained could be utilised for further kinetic and economic calculations to optimise technical hot-water extraction of softwoods.
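The role of pH in the degradation kinetics can be illustrated with a toy calculation. The sketch below assumes simple first-order, acid-catalysed degradation with a rate proportional to the hydronium-ion concentration; the rate constant `k0` is an arbitrary illustrative value, not a parameter fitted in this work.

```python
import numpy as np

# Hypothetical first-order degradation of polymeric GGM; the rate is
# assumed proportional to [H3O+] = 10**(-pH). k0 is illustrative only.
def ggm_remaining(t_min, pH, k0=10.0):
    k = k0 * 10.0 ** (-pH)          # effective rate constant, 1/min
    return np.exp(-k * t_min)       # fraction of polymeric GGM left

# Within the studied pH window 3.8 - 4.2, a higher pH slows degradation
frac_38 = ggm_remaining(60, 3.8)
frac_42 = ggm_remaining(60, 4.2)
```

Even this crude model shows why a narrow pH window matters: a 0.4-unit pH drop more than doubles the acid-catalysed degradation rate.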
Abstract:
State-of-the-art predictions of atmospheric states rely on large-scale numerical models of chaotic systems. This dissertation studies numerical methods for state and parameter estimation in such systems. The motivation comes from weather and climate models, and a methodological perspective is adopted. The dissertation comprises three parts: state estimation, parameter estimation, and chemical data assimilation with real atmospheric satellite data. In the state estimation part, a new filtering technique based on a combination of ensemble and variational Kalman filtering approaches is presented, tested and discussed. This new filter is developed for large-scale Kalman filtering applications. In the parameter estimation part, three different techniques for parameter estimation in chaotic systems are considered. The methods are studied using the parameterized Lorenz 95 system, which is a benchmark model for data assimilation. In addition, a dilemma related to the uniqueness of weather and climate model closure parameters is discussed. In the data-oriented part, data from the Global Ozone Monitoring by Occultation of Stars (GOMOS) satellite instrument are considered, and an alternative algorithm to retrieve atmospheric parameters from the measurements is presented. The validation study presents the first global comparisons between two unique satellite-borne datasets of vertical profiles of nitrogen trioxide (NO3), retrieved using the GOMOS and Stratospheric Aerosol and Gas Experiment III (SAGE III) satellite instruments. The GOMOS NO3 observations are also considered in a chemical state estimation study in order to retrieve stratospheric temperature profiles. The main result of this dissertation is the use of likelihood calculations based on Kalman filtering outputs. The concept has previously been used together with stochastic differential equations and in time series analysis.
In this work, the concept is applied to chaotic dynamical systems and used together with Markov chain Monte Carlo (MCMC) methods for statistical analysis. In particular, this methodology is advocated for use in numerical weather prediction (NWP) and climate model applications. In addition, the concept is shown to be useful in estimating the filter-specific parameters related, e.g., to model error covariance matrix parameters.
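The central idea, evaluating a model likelihood from Kalman filter outputs so that it can be used inside MCMC, can be sketched for a linear-Gaussian state-space model: the log-likelihood is the sum of the Gaussian log-densities of the filter innovations. This is a generic textbook sketch, not the dissertation's implementation; the model matrices and data below are made up.

```python
import numpy as np

def kf_loglik(ys, A, C, Q, R, x0, P0):
    """Log-likelihood of a linear-Gaussian state-space model via the
    Kalman filter: sum the Gaussian log-densities of the innovations."""
    x, P, ll = x0, P0, 0.0
    for y in ys:
        x, P = A @ x, A @ P @ A.T + Q              # predict
        v = y - C @ x                              # innovation
        S = C @ P @ C.T + R                        # innovation covariance
        ll += -0.5 * (np.log(np.linalg.det(2 * np.pi * S))
                      + v @ np.linalg.solve(S, v))
        K = P @ C.T @ np.linalg.inv(S)             # update
        x = x + K @ v
        P = (np.eye(len(x)) - K @ C) @ P
    return float(ll)

# Made-up scalar AR(1) data: x_t = 0.9 x_{t-1} + w_t, y_t = x_t + e_t
rng = np.random.default_rng(0)
state, ys = 0.0, []
for _ in range(200):
    state = 0.9 * state + rng.normal()
    ys.append(np.array([state + rng.normal(0.0, 0.5)]))

C = np.array([[1.0]]); Q = np.array([[1.0]]); R = np.array([[0.25]])
ll_true = kf_loglik(ys, np.array([[0.9]]), C, Q, R, np.zeros(1), np.eye(1))
ll_off = kf_loglik(ys, np.array([[0.5]]), C, Q, R, np.zeros(1), np.eye(1))
```

An MCMC sampler would simply call `kf_loglik` with candidate parameter values; here the true model parameter yields a higher likelihood than a mis-specified one, which is what drives the posterior toward the correct value.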
Abstract:
The power rating of wind turbines is constantly increasing; however, keeping the voltage rating at the low-voltage level results in high kilo-ampere currents. An alternative for increasing the power levels without raising the voltage level is provided by multiphase machines. Multiphase machines are used for instance in ship propulsion systems, aerospace applications, electric vehicles, and in other high-power applications including wind energy conversion systems. A machine model in an appropriate reference frame is required in order to design an efficient control for the electric drive. Modeling of multiphase machines poses a challenge because of the mutual couplings between the phases. Mutual couplings degrade the drive performance unless they are properly considered. In certain multiphase machines there is also a problem of high current harmonics, which are easily generated because of the small current path impedance of the harmonic components. However, multiphase machines provide special characteristics compared with the three-phase counterparts: Multiphase machines have a better fault tolerance, and are thus more robust. In addition, the controlled power can be divided among more inverter legs by increasing the number of phases. Moreover, the torque pulsation can be decreased and the harmonic frequency of the torque ripple increased by an appropriate multiphase configuration. By increasing the number of phases it is also possible to obtain more torque per RMS ampere for the same volume, and thus, increase the power density. In this doctoral thesis, a decoupled d–q model of double-star permanent-magnet (PM) synchronous machines is derived based on the inductance matrix diagonalization. The double-star machine is a special type of multiphase machines. Its armature consists of two three-phase winding sets, which are commonly displaced by 30 electrical degrees. In this study, the displacement angle between the sets is considered a parameter. 
The diagonalization of the inductance matrix results in a simplified model structure, in which the mutual couplings between the reference frames are eliminated. Moreover, the current harmonics are mapped into a reference frame, in which they can be easily controlled. The work also presents methods to determine the machine inductances by a finite-element analysis and by voltage-source inverters on-site. The derived model is validated by experimental results obtained with an example double-star interior PM (IPM) synchronous machine having the sets displaced by 30 electrical degrees. The derived transformation, and consequently, the decoupled d–q machine model, are shown to model the behavior of an actual machine with an acceptable accuracy. Thus, the proposed model is suitable to be used for the model-based control design of electric drives consisting of double-star IPM synchronous machines.
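The key step, removing mutual couplings by diagonalizing the inductance matrix, can be illustrated with a small symmetric example. The numbers below are made up for the sketch (they are not data from the studied double-star machine), and a plain orthogonal eigendecomposition stands in for the machine-specific transformation derived in the thesis.

```python
import numpy as np

# Illustrative 6x6 symmetric inductance matrix for two three-phase winding
# sets: S holds the couplings within each set, X the couplings between the
# sets. All inductance values are made up for the sketch.
Ls, Mi = 1.0, -0.3            # self inductance, intra-set mutual
Mx = 0.4                      # inter-set mutual
S = Ls * np.eye(3) + Mi * (np.ones((3, 3)) - np.eye(3))
X = Mx * np.ones((3, 3))
L = np.block([[S, X], [X, S]])

# Orthogonal eigendecomposition: in the transformed frame T.T @ L @ T the
# mutual coupling terms vanish, which is what enables decoupled control.
eigvals, T = np.linalg.eigh(L)
L_diag = T.T @ L @ T
```

Because `L` is symmetric, `T` is orthogonal and the transformed matrix is exactly diagonal; the physical transformation in the thesis additionally maps the decoupled coordinates onto d-q reference frames with an engineering interpretation.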
Abstract:
Biomedical natural language processing (BioNLP) is a subfield of natural language processing, an area of computational linguistics concerned with developing programs that work with natural language: written texts and speech. Biomedical relation extraction concerns the detection of semantic relations such as protein-protein interactions (PPI) from scientific texts. The aim is to enhance information retrieval by detecting relations between concepts, not just individual concepts as with a keyword search. In recent years, events have been proposed as a more detailed alternative for simple pairwise PPI relations. Events provide a systematic, structural representation for annotating the content of natural language texts. Events are characterized by annotated trigger words, directed and typed arguments and the ability to nest other events. For example, the sentence “Protein A causes protein B to bind protein C” can be annotated with the nested event structure CAUSE(A, BIND(B, C)). Converted to such formal representations, the information of natural language texts can be used by computational applications. Biomedical event annotations were introduced by the BioInfer and GENIA corpora, and event extraction was popularized by the BioNLP'09 Shared Task on Event Extraction. In this thesis we present a method for automated event extraction, implemented as the Turku Event Extraction System (TEES). A unified graph format is defined for representing event annotations and the problem of extracting complex event structures is decomposed into a number of independent classification tasks. These classification tasks are solved using SVM and RLS classifiers, utilizing rich feature representations built from full dependency parsing. Building on earlier work on pairwise relation extraction and using a generalized graph representation, the resulting TEES system is capable of detecting binary relations as well as complex event structures. 
We show that this event extraction system performs well, achieving first place in the BioNLP'09 Shared Task on Event Extraction. Subsequently, TEES has achieved several first ranks in the BioNLP'11 and BioNLP'13 Shared Tasks, and has shown competitive performance in the binary-relation Drug-Drug Interaction Extraction 2011 and 2013 shared tasks. The Turku Event Extraction System is published as a freely available open-source project, documenting the research in detail and making the method available for practical applications. In particular, this thesis describes the application of the event extraction method to PubMed-scale text mining, showing that the developed approach not only performs well but is also generalizable and applicable to large-scale real-world text mining projects. Finally, we discuss related literature, summarize the contributions of the work and present some thoughts on future directions for biomedical event extraction. This thesis includes and builds on six original research publications. The first of these introduces the analysis of dependency parses that led to the development of TEES. The entries in the three BioNLP Shared Tasks, as well as in the DDIExtraction 2011 task, are covered in four publications, and the sixth demonstrates the application of the system to PubMed-scale text mining.
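The decomposition of event extraction into independent steps over a graph can be illustrated with a toy example on the sentence quoted above. Tiny hand-written rules stand in for the SVM/RLS classifiers and the dependency-parse features used by TEES; the pipeline shape (node detection, edge detection, event assembly) is the point of the sketch.

```python
sentence = "Protein A causes protein B to bind protein C".split()

TRIGGERS = {"causes": "CAUSE", "bind": "BIND"}
ENTITIES = {"A", "B", "C"}

# Step 1: node detection - label trigger and entity tokens
nodes = []
for i, w in enumerate(sentence):
    if w in TRIGGERS:
        nodes.append((i, TRIGGERS[w], w))
    elif w in ENTITIES:
        nodes.append((i, "ENTITY", w))

# Step 2: edge (argument) detection - nearest entity on the left, and the
# nearest trigger on the right if one exists, else the nearest entity
def arguments(idx):
    left = max(n for n in nodes if n[0] < idx and n[1] == "ENTITY")
    right = min((n for n in nodes if n[0] > idx and n[1] != "ENTITY"),
                default=None)
    if right is None:
        right = min(n for n in nodes if n[0] > idx and n[1] == "ENTITY")
    return left, right

# Step 3: recursive assembly of the nested event structure
def build(node):
    idx, label, word = node
    if label == "ENTITY":
        return word
    left, right = arguments(idx)
    return f"{label}({build(left)}, {build(right)})"

cause = next(n for n in nodes if n[1] == "CAUSE")
event = build(cause)
```

The output is the nested structure CAUSE(A, BIND(B, C)) from the example; in TEES each of the three steps is a learned classification task rather than a rule, which is what makes the approach generalize across corpora.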
Abstract:
Effective processes for fractionating the main compounds in biomass, such as wood, are a prerequisite for an effective biorefinery. Water is environmentally friendly and widely used in industry, which makes it a potential solvent also for forest biomass. At temperatures above 100 °C, water can readily hydrolyse and dissolve hemicelluloses from biomass. In this work, birch sawdust was extracted using pressurized hot-water extraction (PHWE) flow-through systems. The hypothesis of the work was that it is possible to obtain polymeric, water-soluble hemicelluloses from birch sawdust using flow-through PHW extractions at both laboratory and large scale. Different extraction temperatures in the range 140–200 °C were evaluated to determine the effect of temperature on the xylan yield. The extracts were analysed for sugar ratios, the amount of acetyl groups, furfurals and the xylan yields. Higher extraction temperatures increased the xylan yield but decreased the molar mass of the dissolved xylan. As the extraction temperature increased, more acetic acid was released from the hemicelluloses, further decreasing the pH of the extract. Only trace amounts of furfurals were present after the extractions, indicating that the treatment was mild enough not to degrade the sugars further. The sawdust extraction density was increased by packing more sawdust into the laboratory-scale extraction vessel, with the aim of obtaining extracts of higher concentration than at typical extraction densities. The extraction times and water flow rates were kept constant during these extractions. The higher sawdust packing degree decreased the water use in the extractions, and the extracts had higher hemicellulose concentrations than in extractions with lower packing degrees. The molar masses of the hemicelluloses at higher packing degrees were similar to those at the packing degrees used in typical PHWE flow-through extractions.
The structure of extracted sawdust was investigated using small-angle (SAXS) and wide-angle (WAXS) X-ray scattering. The cell wall topography of birch sawdust and extracted sawdust was compared using X-ray tomography. The results showed that the structure of the cell walls of extracted birch sawdust was preserved, but the cell walls were thinner after the extractions. Larger pores were opened inside the fibres and the cellulose microfibrils were more tightly packed after the extraction. Acetate buffers were used to control the pH of the extracts during the extractions. The pH control prevented excessive xylan hydrolysis and increased the molar masses of the extracted xylans. The yields of buffered extractions were lower than those of plain-water extractions at 160–170 °C, but at 180 °C the yields with buffers were similar to those with plain water. The pH can thus be controlled during extraction with acetate buffer to obtain xylan with a higher molar mass than that obtainable using plain water. Birch sawdust was extracted at both laboratory and pilot scale. The performance of the PHWE flow-through system was evaluated at laboratory and pilot scale using vessels of the same shape but different volumes, with the same relative water flow through the sawdust bed and at the same extraction temperature. Pre-steaming improved the extraction efficiency and the water flow through the sawdust bed. The extracted birch sawdust and the extracted xylan were similar at both laboratory and pilot scale. The PHWE system was successfully scaled up by a factor of 6000 from laboratory to pilot scale, and the extractions performed equally well at both scales. The results show that a flow-through system can be further scaled up and used to extract water-soluble xylans from birch sawdust. The extracted xylans can be concentrated, purified, and then used e.g. in films and barriers, or as building blocks for novel material applications.
Abstract:
Time series analysis can be categorized into three different approaches: classical, Box-Jenkins, and state-space. The classical approach provides the foundation for the analysis; the Box-Jenkins approach improves on the classical approach and deals with stationary time series. The state-space approach allows time-varying factors and covers a broader area of time series analysis. This thesis focuses on the parameter identifiability of different parameter estimation methods, such as LSQ, Yule-Walker and MLE, which are used in the above time series analysis approaches. In addition, the Kalman filter and smoothing techniques are integrated with the state-space approach and the MLE method to estimate parameters that are allowed to change over time. Parameter estimation is carried out repeatedly and integrated with MCMC to inspect how well the different estimation methods can identify the optimal model parameters. Identification is performed in both a probabilistic and a general sense, and the results are compared in order to study and represent identifiability in a more informative way.
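As a concrete instance of one of the estimation methods mentioned, the sketch below implements Yule-Walker estimation of AR(p) coefficients from sample autocovariances and checks it on a simulated AR(2) series; the model order and coefficients are made up for illustration.

```python
import numpy as np

def yule_walker(x, order):
    """Yule-Walker estimates of AR(p) coefficients from sample autocovariances."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # r[k] = sample autocovariance at lag k
    r = np.array([x[:n - k] @ x[k:] / n for k in range(order + 1)])
    # Toeplitz system R @ phi = r[1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# Simulated AR(2) series: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
rng = np.random.default_rng(2)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
phi = yule_walker(x, 2)
```

With 5000 observations the estimates land close to the true coefficients; studying how sharply such estimates concentrate around the true values is exactly the identifiability question the thesis addresses.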
Abstract:
Solvent extraction of calcium and magnesium impurities from a lithium-rich brine (Ca ~ 2,000 ppm, Mg ~ 50 ppm, Li ~ 30,000 ppm) was investigated using a continuous counter-current solvent extraction mixer-settler set-up. The literature review gives a general overview of the resources, demand and production methods of Li, followed by the basics of solvent extraction. The experimental section includes batch experiments investigating the pH isotherms of three extractants, D2EHPA, Versatic 10 and LIX 984, with concentrations of 0.52, 0.53 and 0.50 M in kerosene, respectively. Based on the pH isotherms, LIX 984 showed no affinity for extraction of Mg and Ca at pH ≤ 8, while D2EHPA and Versatic 10 were effective in extracting Ca and Mg. Based on the constructed pH isotherms, the loading isotherms of D2EHPA (at pH 3.5 and 3.9) and Versatic 10 (at pH 7 and 8) were further investigated. Furthermore, based on the McCabe-Thiele method, two extraction stages and one stripping stage (using HCl with a concentration of 2 M for Versatic 10 and 3 M for D2EHPA) were applied in continuous runs. The merits of Versatic 10 compared with D2EHPA are higher selectivity for Ca and Mg, faster phase disengagement, no detrimental change in viscosity due to the sheer amount of metal extracted, and lower acidity in stripping. On the other hand, D2EHPA has lower aqueous solubility and is capable of removing Mg and Ca simultaneously even at higher Ca loading (A/O in continuous runs > 1). In general, a shorter residence time (~2 min), lower temperature (~23 °C), lower pH values (6.5-7.0 for Versatic 10 and 3.5-3.7 for D2EHPA) and a moderately low A/O value (< 1:1) gave removal of 100% of the Ca and nearly 100% of the Mg while keeping the Li loss below 4%, much lower than in conventional precipitation, in which 20% of the Li is lost.
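The staged counter-current arrangement sized with the McCabe-Thiele method can be complemented by a quick analytical check. Under the simplifying assumptions of a constant distribution ratio D and a constant phase ratio, the Kremser-type expression below gives the fraction of solute left in the aqueous raffinate after N stages; the numbers are illustrative, not data from this work.

```python
def raffinate_fraction(D, AO, stages):
    """Fraction of solute remaining in the aqueous phase after `stages`
    counter-current stages, assuming a constant distribution ratio D
    (organic/aqueous) and aqueous-to-organic phase ratio AO.
    Kremser-type result with extraction factor E = D / AO."""
    E = D / AO
    if abs(E - 1.0) < 1e-12:
        return 1.0 / (stages + 1)
    return (E - 1.0) / (E ** (stages + 1) - 1.0)

# Illustrative case: D = 10, A/O = 1 (extraction factor E = 10)
one = raffinate_fraction(10.0, 1.0, 1)   # single stage: 1 / (1 + E)
two = raffinate_fraction(10.0, 1.0, 2)   # adding a stage cuts the loss ~tenfold
```

This kind of estimate shows why two extraction stages sufficed here: with a favourable extraction factor, each additional counter-current stage reduces the remaining impurity by roughly a factor of E.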
Abstract:
The climatic impacts of energy-peat extraction are of increasing concern due to EU emissions trading requirements. A new excavation-drier peat extraction method has been developed to reduce the climatic impact and increase the efficiency of peat extraction. To quantify and compare the soil GHG fluxes of the excavation-drier and the traditional milling methods, as well as of the areas from which energy peat is planned to be extracted in the future (extraction reserve area types), soil CO2, CH4 and N2O fluxes were measured during 2006–2007 at three sites in Finland. Within each site, fluxes were measured from drained extraction reserve areas, extraction fields and stockpiles of both methods, and additionally from the biomass driers of the excavation-drier method. Life Cycle Assessment (LCA), described at a principal level in ISO Standards 14040:2006 and 14044:2006, was used to assess the long-term (100-year) climatic impact of peatland utilisation with respect to land use and energy production chains in which the utilisation of coal is replaced with peat. Coal was used as the reference since in many cases peat and coal can replace each other in the same power plants. According to this study, the peat extraction method used was of lesser significance for the climatic impact than the extraction reserve area type. However, the excavation-drier method seems to cause a slightly smaller climatic impact than the prevailing milling method.
Abstract:
People often make versatile use of chemistry in their everyday lives without thinking about the details. Nowadays a growing number of products can be made from renewable raw materials, and one of the most versatile renewable raw materials in the Nordic countries is coniferous wood. The atmospheric campfire or hearth fire, wooden furniture and paper are an essential part of everyday life. Foods and pharmaceuticals may also contain compounds derived from wood. Wood as a raw material consists of three main components: cellulose, a long-chained, unbranched polymer built from glucose molecules; lignin, which holds the fibres of the wood material together like a glue; and hemicelluloses, branched polymers usually built from several different sugars. Consequently, about 70% of the wood material is sugar. In this work we have concentrated on hemicellulose and its extraction from spruce, as well as the characterisation of its properties. The ultimate goal of the research was to create new products from spruce. Investigating the secrets of extraction, that is, how hemicellulose can be extracted efficiently and in the desired form, requires new kinds of experimental equipment and experiments as well as mathematical modelling. Long-chained hemicellulose is suitable for use e.g. in protective films or in foodstuffs. Medium- and small-molecular hemicellulose can be used as a starting material for producing fuels, lubricants, sugar acids and sugar alcohols, of which xylitol is the best known because of its health-promoting effects. From the point of view of both the environment and the energy economy, it is of prime importance to strive by all available means to make the use of our country's most valuable and largest natural resource, the forest, more efficient and responsible. The results of this research will considerably benefit the new and growing forest-based biorefining industry, which manufactures new high-end products and creates new jobs.
Abstract:
Torrefaction is a moderate thermal treatment (~200-300 °C) of biomass in an inert atmosphere. The torrefied fuel offers advantages over traditional biomass, such as a higher heating value, reduced hydrophilicity, increased resistance to biological decay, and improved grindability. These factors could, for instance, lead to better handling and storage of biomass and increased use of biomass in pulverized combustors. In this work, we look at several aspects of the changes in biomass during torrefaction. We investigate the fate of carboxylic groups during torrefaction and its relation to the equilibrium moisture content. The changes in the wood components, including carbohydrates, lignin, extractable materials and ash-forming matter, are also studied. Finally, the effect of K on torrefaction is investigated and then modeled. In biomass, carboxylic sites are partially responsible for its hydrophilic character. These sites are degraded to varying extents during torrefaction. In this work, methylene blue sorption and potentiometric titration were applied to measure the concentration of carboxylic groups in torrefied spruce wood. Both methods proved applicable and their values agreed well. A decrease in the equilibrium moisture content at different humidities was also measured for the torrefied wood samples, in good agreement with the decrease in carboxylic group content. Thus, both methods offer a means of directly measuring the decomposition of carboxylic groups in biomass during torrefaction as a valuable parameter in evaluating the extent of torrefaction, providing new information on the chemical changes occurring during torrefaction. The effect of torrefaction temperature on the chemistry of birch wood was investigated. The samples came from a pilot plant at the Energy research Centre of the Netherlands (ECN) and were thus representative of industrially produced samples.
Sugar analysis was applied to determine the hemicellulose and cellulose content during torrefaction. The results show significant degradation of hemicellulose already at 240 °C, while cellulose degradation becomes significant above 270 °C. Several methods, including the Klason lignin method, solid-state NMR and Py-GC-MS, were applied to measure the changes in lignin during torrefaction. The changes in the ratio of phenyl, guaiacyl and syringyl units show that lignin degrades to a small extent already at 240 °C. To investigate the changes in the acetone-extractable material during torrefaction, a gravimetric method, HP-SEC and GC-FID followed by GC-MS were applied. The content of acetone-extractable material increases already at 240 °C through the degradation of carbohydrates and lignin. The molecular weight of the acetone-extractable material decreases with increasing torrefaction temperature. The formation of some valuable compounds such as syringaresinol and vanillin is also observed, which is important from a biorefinery perspective. To investigate the change in the chemical association of ash-forming elements in birch wood during torrefaction, chemical fractionation was performed on the original and torrefied birch samples. These results give a first understanding of the changes in the association of ash-forming elements during torrefaction. The most significant changes can be seen in the distribution of calcium, magnesium and manganese, with some change in the water solubility of potassium. These changes may in part be due to the destruction of carboxylic groups. In addition to some changes in the water and acid solubility of phosphorus, a clear decrease in the concentration of both chlorine and sulfur was observed. This would be a significant additional benefit for the combustion of torrefied biomass.
Another objective of this work was to study the impact of organically bound K, Na, Ca and Mn on the mass loss of biomass during torrefaction. These elements were of interest because they have been shown to be catalytically active in solid fuels during pyrolysis and/or gasification. The biomasses were first acid-washed to remove the ash-forming matter, and the organic sites were then doped with K, Na, Ca or Mn. The results show that K and Na bound to organic sites can significantly increase the mass loss during torrefaction. Mn bound to organic sites also increases the mass loss, whereas Ca addition does not influence the mass loss rate during torrefaction. This increase in mass loss with alkali addition is unlike what has been found for pyrolysis, where alkali addition resulted in reduced mass loss. These results are important for the future operation of torrefaction plants, which will likely be designed to handle various biomasses with significantly different K contents; they imply that shorter retention times are possible for biomasses with a high K content. The mass loss of spruce wood with different K contents was modeled using a two-step reaction model based on four kinetic rate constants. The results show that it is possible to model the mass loss of spruce wood doped with different levels of K using the same activation energies but different pre-exponential factors for the rate constants. Three of the pre-exponential factors increased linearly with increasing K content, while one decreased. A new torrefaction model was therefore formulated using the hemicellulose, cellulose and K contents. The new model was validated against the mass loss during torrefaction of aspen, miscanthus, straw and bark. The agreement between the model and the experimental data is good for all of these biomasses except bark.
For bark, the mass loss of acetone-extractable material also needs to be taken into account. The new model can describe the kinetics of mass loss during torrefaction of different types of biomass, which is important for fuel flexibility in torrefaction plants.
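The two-step reaction scheme with four rate constants referred to above can be sketched as follows. This is a minimal illustration of the common two-step torrefaction kinetics (solid A converts to an intermediate solid B and then to char C, with two competing volatile-release reactions), not the thesis's fitted model: the Arrhenius parameter values below are illustrative placeholders, and the linear dependence of the pre-exponential factors on K content described above is omitted.

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)

def arrhenius(a_pre, ea, temp):
    """Rate constant k = A * exp(-Ea / (R T))."""
    return a_pre * math.exp(-ea / (R_GAS * temp))

def torrefaction_mass_loss(temp, t_end, dt=1.0, params=None):
    """Integrate (forward Euler) the two-step scheme
         A --k1--> B --k2--> C      (solids)
         A --kV1--> V1, B --kV2--> V2  (volatiles)
    and return the remaining solid mass fraction A + B + C.
    The (pre-exponential 1/s, activation energy J/mol) pairs below
    are placeholder values for illustration only."""
    if params is None:
        params = {"k1": (2.0e4, 8.0e4), "kV1": (1.0e6, 1.0e5),
                  "k2": (5.0e2, 7.5e4), "kV2": (3.0e3, 9.0e4)}
    k1 = arrhenius(*params["k1"], temp)
    kv1 = arrhenius(*params["kV1"], temp)
    k2 = arrhenius(*params["k2"], temp)
    kv2 = arrhenius(*params["kV2"], temp)
    a, b, c = 1.0, 0.0, 0.0  # start with pure raw biomass
    t = 0.0
    while t < t_end:
        da = -(k1 + kv1) * a
        db = k1 * a - (k2 + kv2) * b
        dc = k2 * b
        a += da * dt
        b += db * dt
        c += dc * dt
        t += dt
    return a + b + c
```

With these placeholder parameters, one hour of treatment at a higher temperature gives a lower solid yield than at a lower temperature, reproducing the qualitative trend; doping with K would be modeled by scaling the pre-exponential factors.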
Resumo:
Since its discovery, chaos has been a fascinating and challenging topic of research, and many great minds have spent their careers trying to impose rules on it. Nowadays, thanks to the research of the last century and the advent of computers, it is possible to predict chaotic natural phenomena for a limited amount of time. The aim of this study is to present a recently introduced method for parameter estimation of chaotic dynamical system models via the correlation integral likelihood, to give some hints for a more optimized use of it, and to outline a possible industrial application. The main part of our study concerns two chaotic attractors with different general behaviour, chosen in order to capture possible differences in the results. In the various simulations that we performed, the initial conditions were varied quite exhaustively. The results show that, under certain conditions, the method works very well in all cases. In particular, it emerged that the most important aspect is to be very careful when creating the training set and the empirical likelihood, since a lack of information in this part of the procedure leads to low-quality results.
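The core idea of the correlation integral likelihood can be sketched schematically as follows, with the one-dimensional logistic map standing in for the chaotic systems studied (the thesis's actual attractors, radii and sample sizes are not given here, so every numerical choice below is an assumption for illustration): correlation sums of training trajectories at several radii define an empirical Gaussian likelihood, which then scores candidate parameter values.

```python
import math
import random

def trajectory(r, x0, n=300, burn=100):
    """Post-transient orbit of the logistic map x -> r*x*(1 - x)."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

def correlation_sums(orbit, radii):
    """Correlation integral C(R): fraction of point pairs closer than R."""
    n = len(orbit)
    counts = [0] * len(radii)
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(orbit[i] - orbit[j])
            for k, radius in enumerate(radii):
                if d < radius:
                    counts[k] += 1
    pairs = n * (n - 1) / 2
    return [c / pairs for c in counts]

random.seed(0)
RADII = [0.02, 0.05, 0.1, 0.2, 0.4]  # illustrative radii
TRUE_R = 3.9                         # chaotic regime of the logistic map

# Training set: feature vectors of correlation sums from many orbits
# started at random initial conditions under the "true" parameter.
features = [correlation_sums(trajectory(TRUE_R, random.uniform(0.1, 0.9)), RADII)
            for _ in range(30)]
mean = [sum(col) / len(col) for col in zip(*features)]
std = [math.sqrt(sum((v - m) ** 2 for v in col) / len(col))
       for col, m in zip(zip(*features), mean)]

def log_likelihood(r_candidate):
    """Gaussian log-likelihood (up to a constant) of a candidate parameter,
    scored against the empirical spread of the training features."""
    feat = correlation_sums(
        trajectory(r_candidate, random.uniform(0.1, 0.9)), RADII)
    return -0.5 * sum(((f - m) / (s + 1e-9)) ** 2
                      for f, m, s in zip(feat, mean, std))
```

A candidate near the true value should score far higher than one in a qualitatively different (here periodic) regime, e.g. `log_likelihood(3.9) > log_likelihood(3.5)`. The full method additionally accounts for correlations between radii through an empirical covariance matrix rather than per-radius variances; this diagonal simplification is for brevity.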