709 results for Scaled semivariogram


Relevance:

10.00%

Publisher:

Abstract:

Information about rainfall erosivity is important for soil and water conservation planning. The spatial variability of rainfall erosivity in the state of Mato Grosso do Sul was therefore analyzed using ordinary kriging interpolation. Three pluviograph stations were used to obtain regression equations between the erosivity index (EI30) and the rainfall coefficient. The equations were then applied to 109 pluviometric stations, yielding EI30 values. These values were analyzed with geostatistical techniques: descriptive statistics, semivariogram fitting, cross-validation, and ordinary kriging to generate the erosivity map. The highest erosivity values were found in the central and northeast regions of the state, while the lowest values were observed in the southern region. In addition, high annual precipitation values do not necessarily produce higher erosivity values.
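The semivariogram fitting mentioned above starts from an empirical semivariogram estimated from the station data. A minimal sketch of the classical (Matheron) estimator, assuming plain coordinate and EI30 value arrays; the function name, lag list, and tolerance are illustrative, not taken from the study:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Matheron's classical estimator: for each lag h, average
    (z_i - z_j)^2 / 2 over pairs whose distance is within tol of h."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    # all pairwise distances and squared value differences (upper triangle)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    sqdz = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(values.size, k=1)
    dist, sqdz = dist[iu], sqdz[iu]
    gamma = []
    for h in lags:
        in_bin = np.abs(dist - h) <= tol
        gamma.append(sqdz[in_bin].mean() / 2.0 if in_bin.any() else np.nan)
    return np.array(gamma)
```

A theoretical model (spherical, exponential, etc.) would then be fitted to these empirical points before kriging.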


The air dry-bulb temperature (t_db) and the black globe humidity index (BGHI) strongly influence the development of broiler chickens during the heating phase. The aim of this study was therefore to analyze the structure and magnitude of the spatial variability of t_db and BGHI, using geostatistical tools such as semivariogram analysis and kriging maps. The experiment was conducted in 2010, in the western mesoregion of the state of Minas Gerais, in a commercial broiler house whose heating system consisted of two furnaces that heat the air indirectly during the first 14 days of the birds' life. Data were recorded at five-minute intervals between 8 a.m. and 10 a.m. The variables were evaluated with variograms fitted by residual maximum likelihood (REML), testing spherical and exponential models, and kriging maps were generated from the best-fitting model. It was possible to characterize the variability of t_db and BGHI and to observe their spatial dependence using geostatistical techniques. In addition, the distribution maps made it possible to identify problems in the heating system in regions of the broiler house that may harm the development of the chicks.
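For reference, the two variogram models tested above have simple closed forms. A sketch with hypothetical nugget, sill, and range parameters (the REML fitting procedure itself is not shown):

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical variogram model: reaches the sill exactly at range a,
    then stays flat."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def exponential(h, nugget, sill, a):
    """Exponential variogram model: approaches the sill asymptotically;
    the practical range is where ~95% of the sill is reached."""
    h = np.asarray(h, dtype=float)
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / a))
```

Model selection then amounts to comparing how well each curve fits the empirical semivariogram values.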


Combating climate change is one of the key tasks of humanity in the 21st century. One of its leading causes is carbon dioxide emitted by burning fossil fuels, so renewable energy sources should be used instead of oil, gas, and coal. In Finland a significant share of energy is produced from wood, and the use of wood chips is expected to grow substantially in the future, by over 60%. The aim of this research is to improve understanding of the costs of wood chip supply chains, using simulation as the main research method. The simulation model combines agent-based modelling and discrete-event simulation to imitate the wood chip supply chain. This thesis concentrates on the use of simulation-based decision support systems in strategic decision-making: the simulation model is part of a decision support system that connects it to databases and provides a graphical user interface for the decision-maker. The main analysis conducted with the decision support system compares a traditional supply chain to one using specialized containers. According to the analysis, the container supply chain achieves lower costs than the traditional supply chain, and it can also be scaled up more easily because of faster emptying operations. Initially the container operations would supply only part of a power plant's fuel needs and would complement the current supply chain. The model can be expanded to include intermodal supply chains, since with increased future demand there will not be enough wood chips located close to current and future power plants.
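To give a flavor of the discrete-event part of such a model, here is a deliberately tiny sketch: trucks repeatedly haul loads, and a shorter unloading time (the advantage the analysis attributes to container handling) shortens the overall delivery schedule. All figures are hypothetical, not taken from the thesis.

```python
import heapq

def simulate_deliveries(n_trucks, loads_needed, cycle_h, unload_h):
    """Tiny discrete-event loop: each truck hauls one load per cycle.
    Returns the simulated time (hours) to deliver `loads_needed` loads."""
    free_at = [0.0] * n_trucks      # event list: when each truck is next free
    heapq.heapify(free_at)
    finish = 0.0
    for _ in range(loads_needed):
        start = heapq.heappop(free_at)   # earliest available truck
        finish = start + cycle_h + unload_h
        heapq.heappush(free_at, finish)
    return finish

# hypothetical figures: container handling cuts unloading from 1.0 h to 0.2 h
traditional = simulate_deliveries(3, 60, cycle_h=4.0, unload_h=1.0)
container = simulate_deliveries(3, 60, cycle_h=4.0, unload_h=0.2)
```

The real model would of course add stochastic travel times, chipping operations, storage buffers, and agent behavior on top of this event loop.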


The aim of this study was to identify the factors that affect control over work and the work environment in IT-intensive work: control of the work itself, faster completion of work, work content, work results, and opportunities to influence one's work. These factors were examined in relation to the employee's IT skills and age. The frame of reference of the study is the ongoing global change in working life and the development of the information society. Earlier studies point to general lines of development but also to contradictory results, so in this study the questions were broken down into very concrete sub-questions, as observed in the work community, and related to the respondents' background information. The study covered employees of different ages, some of whom have experienced the arrival of information technology in working life and its growing use over recent decades. During this period society as a whole has developed significantly, raising broader needs for developing IT competence and skills and for lifelong learning. The study spans those decades, and although it concerns effects experienced within a single organization, it is clear that broader societal development is also reflected in the results. The study was carried out with a quantitative research approach, and the data were collected with an electronic questionnaire. According to this study, employees with good IT skills felt they had better control over their work and work environment than those with weak IT skills. They also felt that information technology had made their work easier and had increased both their opportunities for self-development and the independence of their work.
Employees with weak IT skills, in turn, felt that their control over work and the work environment had decreased, the demands of the work had increased, breaks in the work had decreased, the social environment had deteriorated, the mental strain of the work had increased, the esteem of their own profession had declined, and their opportunities to influence their work had diminished. Based on the results, all employees felt that information technology had improved the results of their work, but on the other hand that it had impoverished the content of the work; older employees in particular felt this way. The study also examined factors that support or threaten control over work and the work environment. According to this study, fear of information technology and inadequate support with IT problems threaten control over work and the work environment.


Polymeric materials that conduct electricity are highly interesting for fundamental studies and beneficial for modern applications in, e.g., solar cells, organic field-effect transistors (OFETs), and chemical and bio-sensing. It is therefore important to characterize this class of materials with a wide variety of methods. This work summarizes the use of electrochemistry, also in combination with spectroscopic methods, in the synthesis and characterization of electrically conducting polymers and other π-conjugated systems. The materials studied are intended for organic electronic devices and chemical sensors. An important part of the work also concerns rational approaches to the development of water-based inks containing conducting particles. The electrochemical synthesis and electroactivity of conducting polymers can be greatly enhanced in room-temperature ionic liquids (RTILs) in comparison to conventional electrolytes. Therefore, poly(para-phenylene) (PPP) was electrochemically synthesized in two representative RTILs, bmimPF6 and bmiTf2N (imidazolium- and pyrrolidinium-based salts, respectively). It was found that the electrochemical synthesis of PPP was significantly enhanced in bmimPF6. Additionally, doping studies of PPP films indicate improved electroactivity in bmimPF6 during oxidation (p-doping) and in bmiTf2N during reduction (n-doping). These findings were supported by in situ infrared spectroscopy. Conducting poly(benzimidazobenzophenanthroline) (BBL) can provide relatively high field-effect mobility of charge carriers in OFET devices; the main disadvantage of this n-type semiconductor is its limited processability. Therefore, BBL was functionalized with poly(ethylene oxide) (PEO), varying the length of the side chains to enable water dispersions of the polymer.
It was found that the functionalization did not disrupt the electrochemical activity of the BBL backbone, while the processability was significantly improved in comparison to conventional BBL. Another objective was to study highly processable poly(3,4-ethylenedioxythiophene) poly(styrenesulfonate) (PEDOT:PSS) water-based inks for controlled patterning scaled down to nearly the nanodomain, with the intention of fabricating various chemical sensors. The developed PEDOT:PSS inks greatly improved the printing of nanoarrays, and further modification with quaternary ammonium cations enabled the fabrication of PEDOT:PSS-based chemical sensors for lead(II) ions with enhanced adhesion and stability in aqueous environments. This opens new possibilities for the development of PEDOT:PSS films for bio-related applications. Polycyclic aromatic hydrocarbons (PAHs) are a broad group of π-conjugated materials consisting of aromatic rings, ranging from naphthalene to molecules with even a hundred rings. Research on this type of material is intriguing because of their interesting optical properties and their resemblance to graphene. The objective was to use electrochemical synthesis to yield relatively large PAHs and to fabricate electroactive films that could be used as template material in chemical sensors. Spectroscopic, electrochemical, and electrical investigations evidence the formation of highly stable films with fast redox response, consisting of molecules with 40 to 60 carbon atoms. Additionally, this synthesis approach, starting from relatively small PAH molecules, was successfully used in a chemical sensor for lead(II).


In this work, the feasibility of floating-gate technology for analog computing platforms in a scaled-down general-purpose CMOS technology is considered. When the technology is scaled down, the performance of analog circuits tends to degrade, because the process parameters are optimized for digital transistors and scaling involves reduced supply voltages. Generally, the challenge in analog circuit design is that all salient design metrics, such as power, area, bandwidth, and accuracy, are interrelated. Furthermore, poor flexibility, i.e. the lack of reconfigurability, reuse of IP, etc., can be considered the most severe weakness of analog hardware. On this account, digital calibration schemes are often required for improved performance or yield enhancement, whereas high flexibility and reconfigurability cannot be easily achieved. Here, it is discussed whether these obstacles can be worked around by using floating-gate transistors (FGTs), and the problems associated with practical implementation are analyzed. FGT technology is attractive because it is electrically programmable and features a charge-based built-in non-volatile memory. Apart from being ideal for canceling circuit non-idealities due to process variations, FGTs can also be used as computational or adaptive elements in analog circuits. The nominal gate oxide in deep sub-micron (DSM) processes is too thin to support robust charge retention, so the FGT becomes leaky. In principle, non-leaky FGTs can be implemented in a scaled-down process without special masks by using "double"-oxide transistors, which are intended for devices that operate at higher supply voltages than general-purpose devices. In practice, however, technology scaling poses several challenges, which are addressed in this thesis.
To provide a sufficiently wide-ranging survey, six prototype chips of varying complexity were implemented in four different DSM process nodes and investigated from this perspective. The focus is on non-leaky FGTs, but the presented autozeroing floating-gate amplifier (AFGA) demonstrates that leaky FGTs may also find a use. The simplest test structures contain only a few transistors, whereas the most complex experimental chip is an implementation of a spiking neural network (SNN) comprising thousands of active and passive devices. More precisely, it is a fully connected (256 FGT synapses) two-layer spiking neural network that takes advantage of the adaptive properties of the FGT. A compact realization of spike-timing-dependent plasticity (STDP) within the SNN is one of the key contributions of this thesis. Finally, the considerations in this thesis extend beyond CMOS to emerging nanodevices. To this end, one promising emerging nanoscale circuit element, the memristor, is reviewed and its applicability to analog processing is considered. Furthermore, it is discussed how FGT technology can be used to prototype computation paradigms compatible with these emerging two-terminal nanoscale devices in a mature and widely available CMOS technology.
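The STDP mechanism realized in the SNN follows, in spirit, the standard pair-based rule: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise, both decaying exponentially with the spike-time difference. A sketch of that textbook rule (parameter values are illustrative, and the analog FGT realization in the thesis is of course not captured by this):

```python
import math

def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Pair-based STDP weight change for a spike-time difference
    dt = t_post - t_pre (milliseconds): potentiation for dt > 0,
    depression for dt < 0, both decaying exponentially."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)
    if dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0
```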


This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to computing the posterior probability density function. Except for a very restricted class of models, this density function cannot be computed in closed form, so approximation methods are needed. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, the extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states from the available measurements. Among these, particle filters are numerical methods that approximate the filtering distributions of non-linear, non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; an inappropriate choice can even make the particle filter algorithm fail to converge. In this thesis, we analyze the theoretical Lᵖ convergence of the particle filter with general importance distributions, where p ≥ 2 is an integer. The parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, parameter estimation can be carried out with Markov chain Monte Carlo (MCMC) methods, which require the unnormalized posterior distribution of the parameters and a proposal distribution.
In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, where the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends heavily on the chosen proposal distribution. A commonly used proposal is Gaussian, in which case the covariance matrix must be well tuned; adaptive MCMC methods can be used for this. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
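A minimal sketch of the adaptive random-walk Metropolis idea discussed here, using the common history-based (Haario-style) covariance update as a stand-in for the variational Bayesian adaptive Kalman filter update proposed in the thesis; the function name, tuning constants, and toy target are illustrative:

```python
import numpy as np

def adaptive_metropolis(log_post, x0, n_iter=3000, eps=1e-6, seed=0):
    """Random-walk Metropolis with a Gaussian proposal whose covariance
    is re-estimated from the chain history (Haario-style adaptation)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    chain = np.empty((n_iter, d))
    lp = log_post(x)
    cov = np.eye(d)                       # proposal covariance before adaptation
    for i in range(n_iter):
        prop = rng.multivariate_normal(x, (2.38 ** 2 / d) * cov)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
        if i >= 100:                      # adapt after a short initial phase
            cov = np.cov(chain[: i + 1].T) + eps * np.eye(d)
    return chain

# toy target: standard bivariate normal
chain = adaptive_metropolis(lambda z: -0.5 * float(np.sum(z ** 2)),
                            x0=[3.0, -3.0])
```

The small eps * I term keeps the adapted covariance positive definite even when the early chain history is nearly degenerate.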


The aim of this research is to examine pricing anomalies in the U.S. market from 1986 to 2011. The sample of stocks is divided into decile portfolios based on seven individual valuation ratios (E/P, B/P, S/P, EBIT/EV, EBITDA/EV, D/P, and CE/P) and price momentum, to investigate the efficiency of individual valuation ratios and their combinations as portfolio formation criteria. This is the first time in the financial literature that CE/P has been employed as a constituent of a composite value measure. The combinations are based on median-scaled composite value measures and the TOPSIS method. During the sample period, value portfolios significantly outperform both the market portfolio and comparable glamour portfolios. The highest return is found for the value portfolio based on the combination of the S/P and CE/P ratios. The outcome of this research increases the understanding of the suitability of different methodologies for portfolio selection and can help managers exploit the results of different methodologies to earn returns above the market.
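The median-scaled composite value measure can be sketched simply: each ratio is divided by its cross-sectional median, and the scaled ratios are averaged per stock. The numbers below are hypothetical, not taken from the study.

```python
import numpy as np

def median_scaled_composite(ratios):
    """Average of the ratios after scaling each one by its cross-sectional
    median, so 1.0 marks the median stock on every individual ratio.
    Higher scores flag 'value' stocks for yield-type ratios like E/P."""
    r = np.asarray(ratios, dtype=float)   # rows: stocks, columns: ratios
    return (r / np.median(r, axis=0)).mean(axis=1)

# hypothetical E/P and S/P values for four stocks
scores = median_scaled_composite([[0.05, 0.8],
                                  [0.10, 1.2],
                                  [0.02, 0.4],
                                  [0.08, 2.0]])
```

Decile portfolios would then be formed by sorting stocks on these composite scores.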


This thesis introduced Android as a hardware and application platform and described how the user interface of an Android game can be kept consistent across different displays by means of scaling factors and anchoring. The second part of the work covered simple ways to improve the performance of game applications; of these, a low-resolution drawing buffer and the culling of off-screen objects were selected for closer measurement. In the measurements, the selected methods improved the performance of the demo application considerably. The work was limited to Android programming in Java without external libraries, so the results can easily be applied in as many different use cases as possible.
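The scaling-and-anchoring idea can be sketched in a few lines (shown in Python for brevity, although the thesis works in Java; the function names and margin values are illustrative):

```python
def scale_factor(ref_w, ref_h, screen_w, screen_h):
    """Uniform scale that fits a reference-resolution layout on the actual
    screen without stretching (the smaller of the two axis ratios)."""
    return min(screen_w / ref_w, screen_h / ref_h)

def anchor_bottom_right(elem_w, elem_h, screen_w, screen_h, margin, s):
    """Top-left position that keeps an element a scaled margin away from
    the bottom-right corner on any screen size."""
    return (screen_w - (elem_w + margin) * s,
            screen_h - (elem_h + margin) * s)

# a 1280x720 reference layout rendered on a 1920x1080 screen
s = scale_factor(1280, 720, 1920, 1080)
pos = anchor_bottom_right(100, 50, 1920, 1080, margin=16, s=s)
```

Anchoring each UI element to a screen edge or corner, rather than to absolute coordinates, is what keeps the layout consistent across aspect ratios.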


Effective processes to fractionate the main compounds in biomass, such as wood, are a prerequisite for an effective biorefinery. Water is environmentally friendly and widely used in industry, which makes it a potential solvent also for forest biomass. At temperatures above 100 °C, water readily hydrolyses and dissolves hemicelluloses from biomass. In this work, birch sawdust was extracted using pressurized hot water extraction (PHWE) flow-through systems. The hypothesis was that polymeric, water-soluble hemicelluloses can be obtained from birch sawdust using flow-through PHW extractions at both laboratory and large scale. Extraction temperatures in the range 140–200 °C were evaluated to determine the effect of temperature on the xylan yield. The extracted hemicelluloses were analysed for sugar ratios, acetyl group content, furfurals, and xylan yields. Higher extraction temperatures increased the xylan yield but decreased the molar mass of the dissolved xylan. As the extraction temperature increased, more acetic acid was released from the hemicelluloses, further decreasing the pH of the extract. Only trace amounts of furfurals were present after the extractions, indicating that the treatment was mild enough not to degrade the sugars further. The extraction density was increased by packing more sawdust into the laboratory-scale extraction vessel, with the aim of obtaining extracts of higher concentration than at typical extraction densities; the extraction times and water flow rates were kept constant. The higher packing degree decreased water use, and the extracts had higher hemicellulose concentrations than those from extractions with lower packing degrees. The molar masses of the hemicelluloses at the higher packing degrees were similar to those at the packing degrees used in typical PHWE flow-through extractions.
The structure of the extracted sawdust was investigated using small-angle (SAXS) and wide-angle (WAXS) X-ray scattering, and the cell wall topography of birch sawdust before and after extraction was compared using X-ray tomography. The results showed that the structure of the cell walls of extracted birch sawdust was preserved, but the cell walls were thinner after the extractions; larger pores were opened inside the fibres, and the cellulose microfibrils were more tightly packed after extraction. Acetate buffers were used to control the pH of the extracts during the extractions. The pH control prevented excessive xylan hydrolysis and increased the molar masses of the extracted xylans. The yields of buffered extractions were lower than those of plain water extractions at 160–170 °C, but at 180 °C the yields were similar for plain water and pH buffers. The pH can thus be controlled with an acetate buffer during extraction to obtain xylan of higher molar mass than is obtainable with plain water. Birch sawdust was extracted at both laboratory and pilot scale. The performance of the PHWE flow-through system was evaluated at both scales using vessels of the same shape but different volumes, with the same relative water flow through the sawdust bed and the same extraction temperature. Pre-steaming improved the extraction efficiency and the water flow through the sawdust bed. The extracted birch sawdust and the extracted xylan were similar at laboratory and pilot scale. The PHWE system was successfully scaled up by a factor of 6000 from laboratory to pilot scale, and the extractions performed equally well at both scales. The results show that a flow-through system can be scaled up further and used to extract water-soluble xylans from birch sawdust. The extracted xylans can be concentrated, purified, and then used in, e.g., films and barriers, or as building blocks for novel material applications.


The decrease in microbial contacts in affluent societies is considered to lie behind the rise in allergic and other chronic inflammatory diseases during recent decades. Indeed, deviations in intestinal microbiota composition and diversity have been associated with several diseases, such as atopic eczema. However, there is not yet a consensus on what constitutes a beneficial or harmful microbiota. The aim of this thesis was to study microbiota development in healthy infants and to characterize the intestinal microbiota signatures associated with disease status and severity in infants with atopic eczema. The methodological aim was to compare and optimize methods for DNA extraction from fecal samples for use in high-throughput microbiota analyses. It was confirmed that the most critical step in successful microbial DNA extraction from fecal samples is the mechanical cell lysis procedure. Based on this finding, an efficient semi-automated extraction process was developed that can be scaled for use in high-throughput platforms such as the phylogenetic microarray used in this series of studies. By following a longitudinal mother-child cohort for 3 years, it was observed that microbiota development is a gradual process in which some bacterial groups reach an adult-type pattern earlier than others. During the breast-feeding period the microbiota appeared relatively simple, while major diversification started during weaning. By the age of 3 years, the child's microbiota composition had started to resemble that of an adult, but the bacterial diversity had still not reached full adult diversity, indicating that microbiota maturation extends beyond this age.
In addition, at three years of age the child's microbiota was more similar to the mother's microbiota than to the microbiota of non-related women. In infants with atopic eczema, a high total microbiota diversity and an abundance of butyrate-producing bacteria were found to correlate with mild symptoms at 6 months. At 18 months, infants with mild eczema had significantly higher microbiota diversity and an aberrant microbiota composition compared to healthy controls of the same age. In conclusion, the comprehensive phylogenetic microarray analysis of early-life microbiota shows the synergistic effect of vertical transmission and a shared environment on intestinal microbiota development. By the age of three years, the compositional development of the intestinal microbiota is close to the adult level, but diversification continues beyond this age. In addition, specific microbiota signatures are associated with the existence and severity of atopic eczema, and the intestinal microbiota seems to have a role in alleviating the symptoms of this disease.
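Microbiota diversity in studies like this is commonly summarized with an index such as Shannon's; a minimal sketch over taxon abundance counts (this does not reproduce the thesis's phylogenetic microarray pipeline):

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum p_i ln p_i over taxon abundances;
    higher values indicate a more diverse community."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)
```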


The purpose of this work is to screen the essential process parameters in the manufacture of electrode composites for supercapacitors that affect capacitor quality, and to study the parameters by which the process could be optimized. A further purpose is to study the mixing step of the component-material manufacturing stage itself, through the measured and calculated power input transferred to the mixture. The literature part presents the structure, operating mechanism, and properties of supercapacitors as electrical energy storage devices. It also reviews the most common materials used in capacitors, especially carbon nanotubes and cellulose fibres. Of the mixing processes, the mixing devices used in the experimental part and their operating mechanisms in component mixing are examined. In the experimental part, the research questions were the effects of different mixing parameters (the amount and quality of material, and the mixing times) on the specific capacitances of supercapacitor electrode sheets. The tests were carried out in the LUT process laboratory, using a rotor-stator and an ultrasonic mixer to mix the slurries. In addition, slurry samples mixed with scaled-up equipment were studied with a view to process scale-up. To assess the adequacy of the mixing process, the power consumption of the mixing devices was studied experimentally, and a flow simulation of the rotor-stator was made with computational software to determine the local power consumption. The tests showed the effects of the studied parameters, but actual optimization could not be carried out on the basis of the results. The results nevertheless indicate a direction in which the process can be developed towards optimization. A large amount of power was also found to be transferred into the mixture with the studied devices, which can be considered potentially sufficient for mixing the components used.


The goal of this research is to critically analyze current theories and methods of intangible asset valuation and to develop and test a new methodology based on practical examples in the IT industry. With this goal in mind, the main research questions are: What are the advantages and disadvantages of current practices for measuring intellectual capital or valuing intangible assets? How should intellectual capital be measured in IT? The resulting method represents a new approach to intellectual capital measurement with a potentially even wider field of application. Although this research focuses on IT (the software and Internet services cluster, to be exact), the logic behind the method is applicable in any industry, since the method is designed to be fully compliant with measurement theory and can therefore be scaled to any application. Building a new method is a difficult and iterative process: in its current iteration the method is a theoretical concept rather than a business tool, but even the current concept fulfills its purpose as a benchmarking tool for measuring intellectual capital in the IT industry.


A method to synthesize ethyl β-ᴅ-glucopyranoside (BEG) was sought, and the feasibility of different ion exchange resins for purifying the product from a synthetic binary solution of BEG and glucose was examined. The target was to produce at least 50 grams of 99 % pure BEG with a scaled-up process; a further target was to transfer the batch process into a steady-state recycle chromatography (SSR) process. BEG was synthesized enzymatically by reverse hydrolysis, using β-glucosidase as the catalyst; 65 % of the glucose reacted with ethanol to form BEG during the synthesis. Different ion-exchange resins were examined for separating BEG from glucose. Based on batch chromatography experiments, the best adsorbent was chosen from among styrene-based strong acid cation exchange resins (SAC) and acrylic-based weak acid cation exchange resins (WAC); the CA10GC WAC resin in Na+ form was selected for the further separation studies. To produce larger amounts of product, the batch process was scaled up. The adsorption isotherms of the components were linear, and the target purity could be reached in batch mode even without recycling, provided the flow rate and injection size were small enough; 99 % pure product was produced with the scaled-up batch process. The batch process was then transferred to an SSR process using data from design pulse chromatograms and Matlab simulations, and the optimal operating conditions for the system were determined. When the batch and SSR separations were compared, the SSR process gave 98 % pure products with 40 % higher productivity and 40 % lower eluent consumption than a batch process producing equally pure products.


For historical reasons, two voltage levels, 10 kV and 20 kV, are used in the medium-voltage network of the Kotka distribution area. In addition to the different voltage levels, the 110/20 kV main transformers in the Kotka area use the YNyn0 vector group, whereas the YNd11 vector group is used elsewhere in the network. Because of these voltage levels and vector groups, interconnecting the Kotka distribution network with the surrounding network is challenging. The aim of this work is to determine the current state of the medium-voltage network of the city of Kotka and its availability during faults at the worst possible time, and to find potential problem areas. The current state of the network was examined with a network information system, using computational results that were scaled to correspond to the load of a colder winter. The scaling was done against the situation of several years earlier, when the general economic situation was better and the network load higher, so that the network will not become undersized when the economy improves. For the future, load forecasts for the area were prepared using historical data and future outlooks. The central content of the work became determining the need to retain several operating voltages and to maintain different vector groups, as well as solving the touch-voltage problem. Several alternatives for developing the area's electricity network were drawn up and compared with each other on a life-cycle cost basis. Based on the comparison, investment strategy proposals were obtained, on the basis of which the network company can make decisions for the future.