996 results for Drying methods
Abstract:
In this work we study whether the intrinsic value of Tubacex between 1994 and 2013 matches its long-term stock market trend, drawing on part of the theory advanced by Shiller. We also examine the possible undervaluation of the Tubacex share as of 31/12/13. In the first part we explain the main company valuation methods, and in the second part we analyse the sector in which Tubacex operates (stainless steel) and calculate the value of the Tubacex share using three valuation methods (Free Cash Flow, Cash Flow and Book Value). We apply these three valuation methods to verify whether at least one of them matches the long-term stock market trend.
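As an illustration of the Free Cash Flow approach mentioned in this abstract, the sketch below (Python) discounts projected free cash flows plus a Gordon-growth terminal value to an equity value per share. All inputs (cash flow forecast, WACC, growth rate, net debt, share count) are hypothetical placeholders, not Tubacex figures.

```python
# Minimal sketch of a discounted Free Cash Flow valuation.
# All inputs are hypothetical placeholders, not Tubacex data.

def dcf_value_per_share(fcf_forecast, wacc, terminal_growth, net_debt, shares):
    """Discount forecast free cash flows plus a Gordon-growth terminal value."""
    pv_fcf = sum(fcf / (1 + wacc) ** t
                 for t, fcf in enumerate(fcf_forecast, start=1))
    terminal = fcf_forecast[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    pv_terminal = terminal / (1 + wacc) ** len(fcf_forecast)
    enterprise_value = pv_fcf + pv_terminal
    equity_value = enterprise_value - net_debt
    return equity_value / shares

# Example with made-up numbers (millions of EUR, millions of shares).
print(dcf_value_per_share([30, 33, 36, 40, 44],
                          wacc=0.09, terminal_growth=0.02,
                          net_debt=250, shares=130))
```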
Abstract:
This doctoral study examined the application of infrared spectroscopy and multivariate data analysis methods to the monitoring of crystallization processes and to the analysis of crystalline products. Worldwide, crystallization research is currently investigating intensively the application of various measurement techniques to the continuous measurement of crystallization phenomena, both in the liquid phase and in the crystals being formed. In addition, product characterization is necessary to ensure product quality. Particularly in pharmaceutical manufacturing, interest in this type of research is driven by the U.S. Food and Drug Administration (FDA) guidance on process analytical technologies (PAT), which broadly defines the requirements for the measurements needed in drug manufacturing and product characterization in order to guarantee safe manufacturing processes. Cooling crystallization is a separation method widely used, especially in the pharmaceutical industry, for purifying a solid crude product. In this method, the solid raw material to be purified is dissolved in a suitable solvent at a relatively high temperature. The solubility of the substance in the solvent decreases as the temperature decreases, so when the system is cooled, the concentration of the dissolved substance exceeds the solubility concentration. In such a supersaturated system, new crystals tend to form or existing crystals grow. Supersaturation is one of the most important factors affecting the quality of the crystalline product. In cooling crystallization, the properties of the product can be influenced by, among other things, the choice of solvent, the cooling profile and the mixing. In addition, the start-up phase of the crystallization process, i.e. the moment at which the first crystals form, affects the product properties. The quality of a crystalline product is defined in terms of the average crystal size, the size and shape distributions, and the purity. In the pharmaceutical industry it is often required that the product represents a specific polymorphic form; polymorphism refers to the ability of molecules to arrange themselves in the crystal lattice in several different ways. The properties mentioned above affect the further processing of the product, such as its filterability, grindability and tabletability. The polymorphic form also affects many performance properties of the product, such as the dissolution rate of a drug substance in the body. In this doctoral work, the cooling crystallization of sulfathiazole was studied using several different solvent mixtures and cooling profiles, and the effects of these factors on the quality attributes of the product were examined. Infrared spectroscopy is a method widely applied in chemical research. It measures the spectral changes in the IR region caused by the vibrations of the molecules of the sample under study. In this study, the in-process measurements were carried out with an immersion probe placed in situ in the reactor, using Fourier transform infrared (FTIR) spectroscopy based on attenuated total reflection (ATR). Powder samples were measured off-line with FTIR spectroscopy based on diffuse reflectance (DRIFT). With multivariate methods (chemometrics), spectral data comprising several hundreds or even thousands of variables can be refined into qualitative or quantitative information describing the process. The doctoral work examined extensively the application of different multivariate methods in order to obtain the most versatile possible process-describing information from the measured spectral data.
As a result of the doctoral work, a calibration routine is proposed for measuring the solute concentration, and thus the level of supersaturation, during the crystallization process. The development of the calibration routine covered methods for assessing data quality, data pre-processing methods, the actual calibration modelling and model validation. This provides real-time information on the driving force of the crystallization process, which further improves the understanding and controllability of the process. The effects of the supersaturation level on the quality of the resulting crystalline product were followed in several crystallization experiments. The work also presents a method based on multivariate statistical process monitoring with which the moment of spontaneous primary nucleation can be predicted from the measured spectral data and with which the polymorphic form produced at nucleation can possibly be inferred. Using the proposed method, it is possible not only to anticipate the formation of crystal nuclei but also to detect possible disturbances during the early stages of the crystallization process. By predicting the forming polymorph, the nucleation of an undesired polymorph can be detected, and the control of the crystallization can possibly be adjusted to obtain the desired polymorphic form. Multivariate methods were also applied to determining the batch-to-batch variation of crystallization batches from the measured spectral data. The information obtained from this type of analysis can be used in the design and optimization of crystallization processes. The doctoral work also tested the suitability of IR spectroscopy and different multivariate methods for the rapid determination of the polymorphic composition of a crystalline product. Powder samples could be classified according to the polymorphs they contain using suitable multivariate classification methods. This offers a rapid method for a coarse assessment of the polymorphic composition of a powder sample, i.e. which single polymorph the sample mainly contains. The actual quantitative analysis, i.e. determining how much of each polymorph the sample contains, for example as weight percentages, requires a physical calibration set covering all the polymorphs, which can be difficult because of the poor availability of the pure polymorphs.
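A calibration routine of the kind proposed here is commonly built with partial least squares (PLS) regression; the following Python sketch, using scikit-learn and purely synthetic spectra, shows the general workflow (pre-processing, model fitting, cross-validation). It is an illustrative sketch under these assumptions, not a reproduction of the thesis' models.

```python
# Sketch of a PLS calibration relating ATR-FTIR spectra to solute concentration.
# Synthetic data only; the thesis' actual pre-processing and models are not reproduced here.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_spectra, n_wavenumbers = 60, 300
concentration = rng.uniform(0.05, 0.30, n_spectra)            # g solute / g solvent (made up)
peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 10) ** 2)
spectra = concentration[:, None] * peak + 0.01 * rng.standard_normal((n_spectra, n_wavenumbers))

# Simple pre-processing: mean centring (derivatives or scatter correction are also common).
X = spectra - spectra.mean(axis=0)

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, X, concentration, cv=5).ravel()
print("RMSECV:", mean_squared_error(concentration, predicted) ** 0.5)
```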
Abstract:
Superheater corrosion causes vast annual losses for power companies. With a reliable corrosion prediction method, plants can be designed accordingly, and knowledge of fuel selection and determination of process conditions may be used to minimize superheater corrosion. The growing interest in using recycled fuels creates additional demands for the prediction of corrosion potential. Models depending on corrosion theories will fail if the relations between the inputs and the output are poorly known. A prediction model based on fuzzy logic and an artificial neural network is able to improve its performance as the amount of data increases. The corrosion rate of a superheater material can most reliably be determined with a test done in a test combustor or in a commercial boiler. The steel samples can be located in a special, temperature-controlled probe and exposed to the corrosive environment for a desired time. These tests give information about the average corrosion potential in that environment. Samples may also be cut from superheaters during shutdowns. The analysis of samples taken from probes or superheaters after exposure to a corrosive environment is a demanding task: if the corrosive contaminants can be reliably analyzed, the corrosion chemistry can be determined and an estimate of the material lifetime can be given. In cases where the reason for corrosion is not clear, determining the corrosion chemistry and estimating the lifetime is more demanding. In order to provide a laboratory tool for the analysis and prediction, a new approach was chosen. During this study, the following tools were generated: (1) a model for the prediction of superheater fireside corrosion, based on fuzzy logic and an artificial neural network and built upon a corrosion database developed from fuel and bed material analyses and measured corrosion data; the developed model predicts superheater corrosion with high accuracy at the early stages of a project. (2) An adaptive corrosion analysis tool based on image analysis, constructed as an expert system; this system supports the implementation of user-defined algorithms, which allows the development of an artificially intelligent system for the task. According to the results of the analyses, several new rules were developed for the determination of the degree and type of corrosion. By combining these two tools, a user-friendly expert system for the prediction and analysis of superheater fireside corrosion was developed. This tool may also be used to minimize corrosion risks in the design of fluidized bed boilers.
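To illustrate the neural-network side of such a prediction model, the sketch below trains a small regressor on synthetic fuel and process data. The input variables, their ranges and the target relation are invented for the example; the fuzzy-logic component and the actual corrosion database are not represented.

```python
# Sketch of the neural-network part of a corrosion-rate prediction model.
# Inputs (fuel chlorine and alkali content, bed temperature, a material grade index)
# and the training data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 200
X = rng.uniform([0.0, 0.0, 700.0, 0.0], [1.0, 2.0, 950.0, 3.0], size=(n, 4))
# Made-up relation: corrosion grows with chlorine, alkali and temperature.
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.002 * (X[:, 2] - 700) + 0.1 * rng.standard_normal(n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=1))
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```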
Establishing intercompany relationships: Motives and methods for successful collaborative engagement
Abstract:
This study explores the early phases of intercompany relationship building, which is a very important topic for purchasing and business development practitioners as well as for companies' upper management. There is a lot of evidence that proper engagement with markets increases a company's potential for achieving business success. Taking full advantage of market possibilities requires, however, a holistic view of managing the related decision-making chain. Most of the literature, as well as companies' business processes, lacks this holism: typically they observe the process from the perspective of individual stages and thus lead to discontinuity and sub-optimization. This study contains a comprehensive introduction to and evaluation of the literature related to the various steps of the decision-making process. The process is studied from a holistic perspective of determining a company's vertical integration position within its demand/supply network context; translating the vertical integration objectives into feasible strategies and objectives; and operationalizing the decisions made through engagement in collaborative intercompany relationships. The empirical part of the research was conducted in two sections. First, the phenomenon of intercompany engagement was studied using two complementary case studies. Secondly, a survey was conducted among the purchasing and business development managers of several electronics manufacturing companies to analyze the processes, decision-making criteria and success factors of engagement for collaboration. The aim has been to identify the reasons why companies and their management act the way they do. As a combination of theoretical and empirical research, an analysis has been produced of what would be an ideal way of engaging with markets. Based on the respective findings, the study concludes by proposing a holistic framework for successful engagement. The evidence presented throughout the study demonstrates clear gaps, discontinuities and limitations in both current research and practical purchasing decision-making chains. The most significant discontinuity is the identified disconnection between the supplier selection process and related criteria, and the relationship success factors.
Abstract:
This thesis considers nondestructive optical methods for metal surface and subsurface inspection. The main purpose of the thesis was to study some optical methods in order to determine their applicability to industrial measurements. In laboratory testing, the simplest light scattering approach, measurement of specular reflectance, was used for surface roughness evaluation. Surface roughness, curvature and the finishing process of metal sheets were determined by specular reflectance measurements. Using a fixed angle of incidence, the specular reflectance method could be automated for industrial inspection. For defect detection, holographic interferometry and thermography were compared. Using either holographic interferometry or thermography, relatively small defects in metal plates could be revealed. Holographic techniques have some limitations for industrial measurements. In contrast, thermography has excellent prospects for on-line inspection, especially with scanning techniques.
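Roughness evaluation from specular reflectance typically rests on the classical Beckmann-type relation between specular reflectance and RMS roughness; the sketch below inverts that relation for illustrative, made-up measurement values (the abstract does not state which relation the thesis actually used).

```python
# Sketch: estimating RMS roughness from specular reflectance using the classical
# relation R_s = R_0 * exp(-(4*pi*sigma*cos(theta)/lambda)^2).
# The numbers below are illustrative, not measurements from the thesis.
import math

def rms_roughness(specular, total, wavelength_nm, incidence_deg):
    """Invert the specular-reflectance relation for the RMS roughness sigma (nm)."""
    ratio = specular / total
    return (wavelength_nm / (4 * math.pi * math.cos(math.radians(incidence_deg)))) * \
        math.sqrt(math.log(1.0 / ratio))

# Example: He-Ne laser (633 nm), 20 degree incidence, 60 % of light reflected specularly.
print(rms_roughness(specular=0.6, total=1.0, wavelength_nm=633, incidence_deg=20))
```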
Abstract:
Canopy characterization is a key factor in improving pesticide application methods in tree crops and vineyards. Developing quick, easy and efficient methods to determine the fundamental parameters used to characterize canopy structure is thus an important need. In this research, the use of ultrasonic and LIDAR sensors was compared with the traditional manual and destructive canopy measurement procedure. For both methods, the values of key parameters such as crop height, crop width, crop volume and leaf area were compared. The results obtained indicate that an ultrasonic sensor is an appropriate tool to determine average canopy characteristics, while a LIDAR sensor provides more accurate and detailed information about the canopy. Good correlations were obtained between crop volume (CVU) values measured with ultrasonic sensors and leaf area index, LAI (R² = 0.51). A good correlation was also obtained between the canopy volume measured with ultrasonic and LIDAR sensors (R² = 0.52). Laser measurements of crop height (CHL) allow one to accurately predict the canopy volume. The proposed new technologies seem very appropriate as complementary tools to improve the efficiency of pesticide applications, although further improvements are still needed.
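The R² values quoted above come from simple regressions between sensor-derived and manually measured canopy parameters; the following sketch shows such a regression between ultrasonic crop volume (CVU) and LAI using invented sample values.

```python
# Sketch: linear regression between ultrasonic crop volume (CVU) and leaf area index (LAI),
# of the kind used to obtain the R^2 values quoted in the abstract. Data are made up.
import numpy as np
from scipy import stats

cvu = np.array([0.8, 1.1, 1.4, 1.6, 2.0, 2.3, 2.7, 3.1])   # hypothetical crop volumes
lai = np.array([0.9, 1.0, 1.5, 1.4, 1.9, 1.8, 2.4, 2.6])   # hypothetical LAI values

result = stats.linregress(cvu, lai)
print(f"LAI = {result.slope:.2f} * CVU + {result.intercept:.2f},  R^2 = {result.rvalue**2:.2f}")
```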
Abstract:
OBJECTIVES: Many nanomaterials (materials with structures smaller than 100 nm) have chemical, physical and bioactive characteristics of interest for novel applications, and considerable research efforts have been launched in this field. This study aimed to characterize exposure scenarios commonly encountered in research settings. METHODS: We studied one of the leading Swiss universities and first identified all research units dealing with nanomaterials. After a preliminary evaluation of the quantities and process types used, a detailed analysis was conducted in units where more than a few micrograms were used per week. RESULTS: In the investigated laboratories, background levels were usually low, in the range of a few thousand particles per cubic centimeter. Powder applications resulted in concentrations of 10,000 to 100,000 particles/cm³ when measured inside fume hoods, but there were no or mostly minimal increases in the breathing zone of researchers. Mostly low exposures were observed for activities involving liquid applications. However, centrifugation and lyophilization of nanoparticle-containing solutions resulted in high particle number levels (up to 300,000 particles/cm³) in work spaces where researchers did not always wear respiratory protection. No significant increases were found for processes involving nanoparticles bound to surfaces, nor in laboratories visualizing the properties and structure of small amounts of nanomaterials. CONCLUSIONS: Research activities in modern laboratories equipped with control techniques were associated with minimal releases of nanomaterials into the working space. However, the focus should not only be on processes involving nanopowders but also on processes involving nanoparticle-containing liquids, especially if the work involves physical agitation, aerosolization or drying of the liquids.
Abstract:
The present study aims to compare the yield and quality of pequi pulp oil obtained by two distinct processes: in the first, pulp drying in a tray dryer at 60 °C was combined with enzymatic treatment and pressing for oil extraction; in the second, a simpler process combined sun-drying of the pulp and pressing. In this study, raw pequi fruits were collected in Mato Grosso State, Brazil. The fruits were autoclaved at 121 °C and stored under refrigeration. An enzymatic extract with pectinase and CMCase activities was used for hydrolysis of the pequi pulp prior to oil extraction. Oil extractions were carried out by hydraulic pressing, with or without enzymatic incubation. The oil content of the pequi pulp (45% w/w) and the physicochemical characteristics of the oil were determined according to standard analytical methods. Free fatty acid content, peroxide value, and iodine and saponification indices were 1.46 mg KOH/g, 2.98 meq/kg, 49.13 and 189.40, respectively. The acidity and peroxide values were lower than those obtained for commercial oil samples (2.48 mg KOH/g and 5.22 meq/kg, respectively). Aqueous extraction showed lower efficiency and higher oxidation of unsaturated fatty acids. On the other hand, pressing the pequi pulp at room temperature produced better quality oil; however, its efficiency is still lower than that of the combined enzymatic treatment and pressing process. This combined process promotes cell wall hydrolysis and pulp viscosity reduction, contributing to an oil yield increase of at least 20% by pressing.
Abstract:
For more than a decade scientists have tried to develop methods for dating ink by monitoring the loss of phenoxyethanol (PE) over time. While many methods have been proposed in the literature, few have actually been used to solve practical cases, and they still raise much concern within the scientific community. In fact, due to the complexity of ink drying processes, it is particularly difficult to find a reliable ageing parameter with which to reproducibly follow ink ageing. Moreover, systematic experiments are required in order to evaluate how different factors actually influence the results over time. Therefore, this work aimed at evaluating the capacity of four different ageing parameters to reliably follow ink ageing over time: (1) the quantity of the solvent PE in an ink line; (2) the relative peak area (RPA), normalising the PE results using stable volatile compounds present in the ink formulation; (3) the solvent loss ratio (R%), calculated from PE results obtained by the analysis of naturally and artificially aged samples; and (4) a modified solvent loss ratio (R%*), calculated from RPA results. After determining the limits of reliable measurement of the analytical method, the repeatability of the different ageing parameters was evaluated over time, as well as the influence of ink composition, writing pressure and storage conditions on the results. Surprisingly, our results showed that R% was not the most reliable parameter, as it showed the highest standard deviation. Discussion of the results from an ink dating perspective suggests that other proposed parameters, such as RPA values, may be more adequate for following ink ageing over time.
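For illustration, the sketch below computes the four ageing parameters listed above from hypothetical peak areas. The formulas follow common ink-dating conventions (RPA as the PE area normalised by stable volatiles, R% as the relative loss between naturally and artificially aged results) and are not taken verbatim from the study.

```python
# Sketch of the four ageing parameters discussed above, computed from hypothetical
# GC/MS peak areas. Formulas follow common ink-dating conventions, not the study itself.

def pe_quantity(peak_area, calibration_factor):
    """(1) Quantity of phenoxyethanol (PE) in the ink line."""
    return peak_area * calibration_factor

def rpa(pe_area, stable_areas):
    """(2) Relative peak area: PE normalised with stable volatile compounds."""
    return pe_area / (pe_area + sum(stable_areas))

def solvent_loss_ratio(value_natural, value_aged):
    """(3) R% from natural vs. artificially aged results; (4) R%* when RPA values are used."""
    return 100.0 * (value_natural - value_aged) / value_natural

pe_nat, pe_aged = 820.0, 560.0                 # hypothetical peak areas
stable = [300.0, 150.0]                        # hypothetical stable-compound areas
print("R%  :", solvent_loss_ratio(pe_nat, pe_aged))
print("R%* :", solvent_loss_ratio(rpa(pe_nat, stable), rpa(pe_aged, stable)))
```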
Abstract:
Agile software development methods attempt to answer the software development industry's need for lighter-weight, more agile processes that offer the possibility to react to changes during the software development process. The objective of this thesis is to analyze and experiment with the possibility of using agile methods or practices also in small software projects, even in projects with only one developer. In the practical part of the thesis, a small software project was executed with some of the agile methods and practices that the theoretical part found applicable to the project. In the project, a Bluetooth proxy application that runs on the S60 smartphone platform and on a PC was developed further to contain some new features. As a result, it was found that certain agile practices can be useful even in very small projects. The selection of suitable practices depends on the project and the size of the project team.
Abstract:
Headspace gas chromatography with flame-ionization detection (HS-GC-FID) and purge-and-trap gas chromatography-mass spectrometry (P&T-GC-MS) have been used to determine methyl tert-butyl ether (MTBE) and benzene, toluene, and the xylenes (BTEX) in groundwater. In the work discussed in this paper, measures of quality, e.g. recovery (94-111%), precision (4.6-12.2%), limits of detection (0.3-5.7 µg L⁻¹ for HS and 0.001 µg L⁻¹ for P&T), and robustness, were compared for both methods. In addition, for purposes of comparison, groundwater samples from areas suffering from odor problems because of fuel spillage and tank leakage were analyzed using both techniques. For high concentration levels there was good correlation between the results from the two methods.
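The quality measures compared above are straightforward to compute from replicate analyses; the sketch below does so for one analyte with made-up data, using a common 3-sigma definition of the detection limit (the paper's own calculation conventions are not specified in the abstract).

```python
# Sketch of the quality measures compared above (recovery, precision, detection limit),
# computed from made-up replicate data for one analyte and one method.
import statistics

spiked_conc = 10.0                                   # µg/L added to the sample (hypothetical)
measured = [9.6, 10.4, 9.9, 10.8, 9.4, 10.2]         # hypothetical replicate results, µg/L

recovery = statistics.mean(measured) / spiked_conc * 100                        # %
precision_rsd = statistics.stdev(measured) / statistics.mean(measured) * 100    # % RSD

blank_sd, slope = 0.05, 1.0                          # hypothetical blank SD and calibration slope
lod = 3 * blank_sd / slope                           # one common definition of the detection limit

print(f"recovery {recovery:.1f} %, RSD {precision_rsd:.1f} %, LOD {lod:.2f} µg/L")
```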
Abstract:
OBJECTIVE: Routinely collected health data, gathered for administrative and clinical purposes without specific a priori research questions, are increasingly used for observational, comparative effectiveness and health services research, and for clinical trials. The rapid evolution and availability of routinely collected data for research has brought to light specific issues not addressed by existing reporting guidelines. The aim of the present project was to determine the priorities of stakeholders in order to guide the development of the REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement. METHODS: Two modified electronic Delphi surveys were sent to stakeholders. The first determined themes deemed important to include in the RECORD statement and was analyzed using qualitative methods. The second determined quantitative prioritization of the themes based on categorization of manuscript headings. The surveys were followed by a meeting of the RECORD working committee and re-engagement with stakeholders via an online commentary period. RESULTS: The qualitative survey (76 responses to 123 surveys sent) generated 10 overarching themes and 13 themes derived from existing STROBE categories. The highest-rated items for inclusion were: disease/exposure identification algorithms; characteristics of the population included in databases; and characteristics of the data. In the quantitative survey (71 responses to 135 sent), the importance assigned to each of the compiled themes varied depending on the manuscript section to which they were assigned. Following the working committee meeting, online ranking by stakeholders provided feedback and resulted in revision of the final checklist. CONCLUSIONS: The RECORD statement incorporated the suggestions provided by a large, diverse group of stakeholders to create a reporting checklist specific to observational research using routinely collected health data. Our findings point to unique aspects of studies conducted with routinely collected health data and the perceived need for better reporting of methodological issues.
Abstract:
This thesis gives an overview of the use of level set methods in the field of image science. The related fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves and surfaces. The level set method does not define how and why a boundary is advancing the way it is; it simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions, which gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power; especially the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, such as the narrow band level set method, or with a programmable hardware implementation. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in a quite broad range of imaging applications, such as computer vision and graphics, scientific visualization, and solving problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
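The principal idea, i.e. tracking an N-dimensional boundary as the zero level of a function in N+1 dimensions, can be shown in a few lines. The sketch below expands a 2-D circle with constant normal speed using a first-order upwind scheme in plain NumPy; it illustrates the basic method only, not any specific application from the thesis.

```python
# Sketch of the basic level set idea: a 2-D circle, represented implicitly as the zero
# level of a signed distance function phi, is expanded with constant normal speed F
# by integrating phi_t + F*|grad(phi)| = 0 with a first-order upwind scheme.
import numpy as np

n, h, F, dt, steps = 101, 0.01, 1.0, 0.004, 50
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt((X - 0.5) ** 2 + (Y - 0.5) ** 2) - 0.2    # signed distance to a circle, radius 0.2

for _ in range(steps):
    # One-sided (forward/backward) differences; np.roll gives periodic boundary handling,
    # which is adequate here because the front stays well inside the domain.
    dxm = (phi - np.roll(phi, 1, axis=0)) / h
    dxp = (np.roll(phi, -1, axis=0) - phi) / h
    dym = (phi - np.roll(phi, 1, axis=1)) / h
    dyp = (np.roll(phi, -1, axis=1) - phi) / h
    # Upwind (Godunov-type) gradient magnitude for outward motion (F > 0).
    grad = np.sqrt(np.maximum(dxm, 0) ** 2 + np.minimum(dxp, 0) ** 2 +
                   np.maximum(dym, 0) ** 2 + np.minimum(dyp, 0) ** 2)
    phi = phi - dt * F * grad

# The zero level set has grown: the area fraction with phi < 0 increases from ~pi*0.2^2.
print("area inside the front:", (phi < 0).mean())
```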
Abstract:
This work was carried out as part of the environmental cluster research programme "Material flows and energy use in a forest industry integrate and the related operating strategies from an environmental impact perspective". Juha Räsänen produced a baseline study of the forest industry's secondary material flows in the project, and that study forms the basis of this work. The objective of the work was to assess, for four forest industry integrates in eastern Finland, the technical and economic suitability of alternative sludge treatment methods compared with the current treatment. The study drew on earlier research results and on the experience of various equipment manufacturers and forest industry integrates. The assessments presented in the work can also be used at the sector level. Of the sludges from forest industry wastewater treatment plants, biosludge is the most demanding to handle; as its share grows, conventional mechanical pressing and incineration are becoming significantly more difficult at several mills, both today and in the coming years. The problems in incineration, and the occasional windrow composting they necessitate, can favour either thermal or biothermal drying of the mixed sludge before incineration. Another way to solve the sludge problem is to treat the biosludge and primary sludge separately. In liquor-line treatment of biosludge, the sludge is centrifuged, treated with black liquor, evaporated and burned in the recovery boiler. The biosludge can also be digested and then treated by conventional mechanical pressing. All the methods considered are technically feasible, provided that certain process boundary conditions are met. The alternative treatment methods reduce the waste character of the sludge, but the costs often increase significantly. The total costs of treating the integrate's treatment-plant sludge with each method were calculated with the spreadsheet application included in the work, using budget quotations from equipment suppliers. Incineration of biosludge in the recovery boiler offers the most attractive solution in terms of cost. The calculation procedure included in the work can, in principle, be applied to any forest industry integrate.
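The kind of cost comparison implemented in the spreadsheet application can be sketched as an annualised investment plus an operating cost per treatment option; the figures and option names below are hypothetical and do not come from the supplier budget quotations used in the work.

```python
# Sketch of a spreadsheet-style cost comparison: annualised investment (annuity)
# plus operating cost per tonne of sludge. All figures are hypothetical placeholders.

def annual_cost(investment_eur, opex_eur_per_t, sludge_t_per_a, interest=0.05, years=15):
    """Total annual cost = capital recovery annuity + operating cost."""
    annuity_factor = interest / (1 - (1 + interest) ** -years)
    return investment_eur * annuity_factor + opex_eur_per_t * sludge_t_per_a

options = {
    "mechanical pressing + incineration": annual_cost(2.0e6, 45, 20_000),
    "thermal drying before incineration": annual_cost(4.5e6, 60, 20_000),
    "biosludge to recovery boiler": annual_cost(1.5e6, 30, 20_000),
}
for name, cost in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name}: {cost / 1e6:.2f} MEUR/a")
```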