127 results for Gas measurement
Abstract:
Home measurement of blood pressure − epidemiology and clinical use. Elevated blood pressure, globally the most significant risk factor for premature death, cannot be identified or treated without accurate and practical blood pressure measurement methods. Home blood pressure measurement has become very popular among patients. Physicians, however, have not yet fully accepted home measurement, as sufficient evidence of its validity and benefits has been lacking. The aim of this study was to show that blood pressure measured at home (home pressure) is more accurate than conventional blood pressure measured at the office (office pressure), and that it is also effective in clinical use. We studied the use of home pressure in the diagnosis and treatment of hypertension. In addition, we examined the association of home pressure with hypertensive target-organ damage. The first study population, a representative sample of the Finnish adult population, consisted of 2,120 subjects aged 45−74 years. The subjects measured their home pressure for one week and took part in a health examination that included, in addition to a clinical examination and an interview, an electrocardiogram and office blood pressure measurement. In 758 subjects, carotid intima-media thickness (a measure of atherosclerosis) was also measured, and in 237 subjects arterial pulse wave velocity (a measure of arterial stiffness). In the second study population, consisting of 98 hypertensive patients, treatment was guided, depending on randomization, either by ambulatory (24-hour) blood pressure or by home pressure. Office pressure was significantly higher than home pressure (mean systolic/diastolic difference 8/3 mmHg), and agreement between the two methods in the diagnosis of hypertension was at best moderate (75%). Of the 593 subjects with elevated office blood pressure, 38% had normal blood pressure at home, i.e. so-called white-coat hypertension. Hypertension can thus be overdiagnosed in every third patient in a screening setting. White-coat hypertension was associated with mildly elevated blood pressure, low body mass index and non-smoking, but not with psychiatric morbidity. White-coat hypertension does not, however, appear to be an entirely harmless phenomenon and may predict future hypertension, as the cardiovascular risk factor profile of those affected fell between the profiles of normotensive and truly hypertensive subjects. Home pressure showed a stronger association than office pressure with hypertensive target-organ damage (intima-media thickness, pulse wave velocity, and left ventricular hypertrophy on the electrocardiogram). Home pressure was an effective guide for the treatment of hypertension, as drug treatment guided by home pressure achieved blood pressure control as good as treatment guided by ambulatory pressure, which has been considered the "gold standard" of blood pressure measurement. Based on these and earlier results, home blood pressure measurement is a clear improvement over conventional office measurement. It is a practical, accurate and widely available method that may even become the first-line option in the diagnosis and treatment of hypertension. Blood pressure measurement practice needs to change, as evidence-based medicine suggests that office blood pressure measurement should be used for screening purposes only.
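A minimal sketch (not the study's code) of the diagnostic classification discussed above, assuming the conventional thresholds of 140/90 mmHg for office readings and 135/85 mmHg for home readings; the subject data below are invented for illustration.

```python
# Illustrative classification of paired office and home blood pressure
# averages; thresholds are the conventional 140/90 (office) and 135/85
# (home) mmHg limits, assumed here rather than quoted from the study.

def is_hypertensive(sys_bp, dia_bp, sys_limit, dia_limit):
    """A reading is hypertensive if either pressure reaches its limit."""
    return sys_bp >= sys_limit or dia_bp >= dia_limit

def classify(office, home):
    """office/home are (systolic, diastolic) averages for one subject."""
    office_high = is_hypertensive(*office, 140, 90)
    home_high = is_hypertensive(*home, 135, 85)
    if office_high and not home_high:
        return "white-coat"
    if office_high and home_high:
        return "sustained"
    if home_high:
        return "masked"
    return "normotensive"

subjects = [((152, 94), (128, 82)),   # elevated in the office, normal at home
            ((148, 96), (142, 88)),   # elevated in both settings
            ((124, 78), (120, 76))]   # normal in both settings
labels = [classify(o, h) for o, h in subjects]
```

With these rules, a subject whose office average is elevated but whose home average is normal is flagged as white-coat hypertension, the group that comprised 38% of the elevated office readings in the study.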
Abstract:
This work gives the reader basic knowledge of mineralogy and mineral processing. The main focus is on the flotation process and on pulp electrochemistry during flotation. In the experimental part, three different sulphide-poor ores were examined. Platinum and palladium were the noble metals contained in the studied ores. The electrochemistry of PGE mineral flotation from sulphide-poor ores has so far been examined only slightly. Bench-scale flotation tests were used in this study. Chalcopyrite, nickel-pentlandite, pyrite, platinum and pH electrodes were used to investigate pulp electrochemistry during the flotation tests. The effects of the grinding media, of a carbon dioxide atmosphere during grinding, and of a mixture of carbon dioxide and air as the flotation gas on PGE flotation and on the electrochemistry of flotation were studied. Stainless steel grinding media created a more oxidising pulp environment for flotation than mild steel grinding media. Concentrate quality also improved with stainless steel grinding media, but the recovery was remarkably poorer than with mild steel grinding media. A carbon dioxide atmosphere during grinding created a very reducing pulp environment, which yielded very good concentrate quality, but the recovery was again poorer than with normal mild steel grinding media. A mixture of carbon dioxide and air as the flotation gas improved PGE recovery with some ores, but not always. The effect of carbon dioxide on pulp electrochemistry was detected mainly via the pH value.
Abstract:
In this Master's thesis, steam and gas turbine models were developed for the Balas process simulation program. Balas is a simulation program developed by VTT Technical Research Centre of Finland, intended especially for the steady-state simulation of pulp and paper industry processes. The goal of the work was to develop simulation models for a steam turbine and a gas turbine, and to examine their validity by comparing simulations against measurement and design data. Mathematical models were formulated for the steam turbine, for its control stage, and for the off-design calculation of the steam turbine. For the gas turbine, performance curves were constructed, which are used to examine its operation at off-design conditions. During the thesis work, the components were modelled in the Matlab environment, from which they will be transferred to Balas in a separate work phase. Particular attention was paid to the ease of use and versatility of the models. The steam turbine models were tested by simulating the back-pressure turbine, with its control stage, of a power plant operating in connection with a paper mill, and by comparing the simulation results with mill measurement data. The gas turbine model was tested by comparing the design data of a GE Power MS 7001 gas turbine with a case simulated using the corresponding parameters.
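The abstract does not give the turbine equations; a common building block for steam turbine off-design calculation is Stodola's ellipse law, sketched below with made-up design-point values (this is a generic illustration, not the thesis model).

```python
import math

# Hedged sketch: Stodola's "ellipse law" relates steam mass flow to the
# inlet and outlet pressures of a turbine section at off-design
# conditions: m ~ sqrt((p_in^2 - p_out^2) / T_in), scaled by the design
# point. Design values below are invented for illustration.

def stodola_mass_flow(p_in, p_out, T_in, m_des, p_in_des, p_out_des, T_in_des):
    """Off-design mass flow (kg/s) from the ellipse law."""
    num = (p_in**2 - p_out**2) / T_in
    den = (p_in_des**2 - p_out_des**2) / T_in_des
    return m_des * math.sqrt(num / den)

# Illustrative design point: 80 bar in, 4 bar out, 773 K, 50 kg/s.
# Off-design case: inlet pressure throttled down to 60 bar.
m = stodola_mass_flow(p_in=60e5, p_out=4e5, T_in=773,
                      m_des=50, p_in_des=80e5, p_out_des=4e5, T_in_des=773)
```

At the design point the function returns the design mass flow exactly; lowering the inlet pressure reduces the swallowed flow along the ellipse.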
Abstract:
A novel cantilever pressure sensor was developed in the Department of Physics at the University of Turku in order to solve the sensitivity problems that are encountered when condenser microphones are used in photoacoustic spectroscopy. The cantilever pressure sensor, combined with a laser interferometer for measuring the cantilever movements, proved to be highly sensitive. The original aim of this work was to integrate the sensor into a photoacoustic gas detector working in a differential measurement scheme. The integration was made successfully in three prototypes. In addition, the cantilever was also integrated into the photoacoustic FTIR measurement schemes of gas-, liquid-, and solid-phase samples. A theoretical model for the signal generation in each measurement scheme was created, and the optimal cell design is discussed. The sensitivity and selectivity of the differential method were evaluated with a blackbody radiator and a mechanical chopper using CO2, CH4, CO, and C2H4 gases. The detection limits were at the sub-ppm level for all four gases with only a 1.3-second integration time, and the cross-interference was well below one percent for all gas combinations other than those between hydrocarbons. Sensitivity with other infrared sources was compared using ethylene as an example gas: the electrically modulated blackbody radiator gave a 35 times higher, and the CO2 laser a 100 times lower, detection limit than the blackbody radiator with a mechanical chopper. In conclusion, the differential system is well suited to rapid single-gas measurements. Gas-phase photoacoustic FTIR spectroscopy gives the best performance when several components have to be analyzed simultaneously from multicomponent samples. Multicomponent measurements were demonstrated with a sample that contained different concentrations of CO2, H2O, CO, and four different hydrocarbons.
It required an approximately 10 times longer measurement time to achieve the same detection limit for a single gas as with the differential system. The properties of photoacoustic FTIR spectroscopy were also compared to conventional transmission FTIR spectroscopy by simulations. Solid- and liquid-phase photoacoustic FTIR spectroscopy has several advantages over other techniques and therefore also a great variety of applications. The signal-to-noise ratios of photoacoustic cells with a cantilever microphone and a condenser microphone were compared using standard carbon black, polyethene, and sunflower oil samples. The cell with the cantilever microphone proved to have a 5-10 times higher signal-to-noise ratio than the reference detector, depending on the sample. Cantilever-enhanced photoacoustics will be an effective tool for gas detection and for the analysis of solid- and liquid-phase samples. The preliminary prototypes gave good results in all three measurement schemes studied. According to the simulations, there are possibilities for further enhancement of the sensitivity, as well as other properties, of each system.
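The trade-off between detection limit and measurement time follows a standard white-noise scaling: the detection limit of an averaged measurement improves as 1/√t, so matching a less sensitive detector costs quadratically more time. A hedged sketch of that arithmetic (the numbers are illustrative assumptions, not values from the thesis):

```python
import math

# White-noise averaging: detection limit scales as 1/sqrt(integration
# time). Matching a detector that is k times less sensitive at equal
# time therefore takes k^2 times longer. Values below are illustrative.

def detection_limit(dl_ref, t_ref, t):
    """Detection limit at integration time t, given a reference point."""
    return dl_ref * math.sqrt(t_ref / t)

def time_for_limit(dl_ref, t_ref, dl_target):
    """Integration time needed to reach a target detection limit."""
    return t_ref * (dl_ref / dl_target) ** 2

# E.g. 0.5 ppm at 1.3 s: halving the detection limit takes 4x the time.
t_needed = time_for_limit(0.5, 1.3, 0.25)
```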
Abstract:
Induction motors are widely used in industry, and they are generally considered very reliable. They often have a critical role in industrial processes, and their failure can lead to significant losses as a result of shutdown times. Typical failures of induction motors can be classified into stator, rotor, and bearing failures. One of the causes of bearing damage, and eventually bearing failure, is bearing currents. Bearing currents in induction motors can be divided into two main categories: classical bearing currents and inverter-induced bearing currents. Bearing damage caused by bearing currents results, for instance, from electrical discharges that take place through the lubricant film between the raceways of the inner and outer rings and the rolling elements of a bearing. This phenomenon can be considered similar to that of electrical discharge machining, where material is removed by a series of rapidly recurring electrical arcing discharges between an electrode and a workpiece. This thesis concentrates on bearing currents with special reference to bearing current detection in induction motors. A bearing current detection method based on radio frequency impulse reception and detection is studied. The thesis describes how a motor can work as a "spark gap" transmitter and discusses a discharge in a bearing as a source of radio frequency impulses. It is shown that a discharge occurring due to bearing currents can be detected at a distance of several meters from the motor. The issues of interference, detection, and location techniques are discussed. The applicability of the method is shown with a series of measurements on a specially constructed test motor and an unmodified frequency-converter-driven motor. The radio frequency method studied provides a nonintrusive way to detect harmful bearing currents in the drive system.
If bearing current mitigation techniques are applied, their effectiveness can be immediately verified with the proposed method. The method also gives a tool to estimate the harmfulness of the bearing currents by making it possible to detect and locate individual discharges inside the bearings of electric motors.
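The thesis does not publish its detection algorithm; purely as an illustration of the idea, a minimal detector over a sampled RF envelope could flag the onsets of impulses that rise above a noise-floor threshold (all names and values below are hypothetical):

```python
# Illustrative only: flag the start of each run where a sampled RF
# envelope exceeds a threshold above the noise floor. This is an
# assumed toy detector, not the method implemented in the thesis.

def detect_impulses(envelope, threshold):
    """Return the start indices of runs where envelope > threshold."""
    events = []
    above = False
    for i, v in enumerate(envelope):
        if v > threshold and not above:
            events.append(i)   # rising edge: a new impulse begins here
            above = True
        elif v <= threshold:
            above = False
    return events

# Invented envelope samples: two impulses buried in low-level noise.
signal = [0.1, 0.2, 0.1, 3.5, 2.8, 0.2, 0.1, 4.1, 0.3]
hits = detect_impulses(signal, threshold=1.0)
```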
Abstract:
Neste Oil has introduced plant oils and animal fats for the production of NExBTL renewable diesel, and these raw materials differ from conventional mineral-based oils. One subject of the new raw materials study is the thermal degradation, or pyrolysis, of these organic oils and fats. The aim of this Master's thesis is to increase knowledge of the thermal degradation of these new raw materials and to identify possible gaseous harmful thermal degradation compounds. Another aim is to determine the health and environmental hazards of the identified compounds. One objective is also to examine the formation possibilities of hazardous compounds in the production of NExBTL diesel. Plant oils and animal fats consist mostly of triglycerides. Pyrolysis of triglycerides is a complex phenomenon, and many degradation products can be formed. Based on the literature studies, 13 hazardous degradation products were identified, one of which was acrolein. This compound is very toxic and dangerous to the environment. Our own pyrolysis experiments were carried out with rapeseed and palm oils, and with a mixture of palm oil and animal fat. At least 12 hazardous compounds, including acrolein, were analysed from the gas phase. According to the experiments, the factors that influence acrolein formation are the duration of the experiment, the atmosphere (air/hydrogen) in which the experiment is carried out, and the characteristics of the oil used. The production of NExBTL diesel is not based on pyrolysis, which is why thermal degradation is possible only under abnormal process conditions.
Abstract:
One of the main industries forming the basis of the Russian economic structure is oil and gas. This industry also plays a significant role for the CIS countries. The oil and gas industry is developing intensively and attracting foreign investment, which supports the sustainable development of machinery production for hazardous areas. Operating in oil and gas areas is always associated with the occurrence of explosive gas atmospheres. Machines for hazardous areas must therefore be furnished with additional protection of different types. Explosion protection is regulated by standards according to which the equipment must be manufactured. In Russia and the CIS countries, explosion-proof equipment must be constructed in compliance with GOST standards. To confirm that equipment is manufactured according to the standards' requirements and is safe and reliable, it must undergo an approval procedure. Certification in Russia is governed by federal laws and legislation. Each CIS country has its own approval certificates and permissions for operating in hazardous areas.
Abstract:
Background: Measurement of serum cotinine, a major metabolite of nicotine, provides a valid marker for quantifying exposure to tobacco smoke. Exposure to tobacco smoke causes vascular damage by multiple mechanisms, and it has been acknowledged as a risk factor for atherosclerosis. Multifactorial atherosclerosis begins in childhood, but the relationship between exposure to tobacco smoke and the arterial changes related to early atherosclerosis has not been studied in children. Aims: The aim of the present study was to evaluate exposure to tobacco smoke with a biomarker, serum cotinine concentration, and its associations with markers of subclinical atherosclerosis and the lipid profile in school-aged children and adolescents. Subjects and Methods: Serum cotinine concentration was measured using a gas chromatographic method annually between the ages of 8 and 13 years in 538-625 children participating since infancy in STRIP (Special Turku coronary Risk factor Intervention Project), a randomized, prospective atherosclerosis prevention trial. Conventional atherosclerosis risk factors were measured repeatedly. Vascular ultrasound studies were performed in 402 healthy 11-year-old children and in 494 adolescents aged 13 years. Results: According to the serum cotinine measurements, a notable number of the school-aged children and adolescents were exposed to tobacco smoke, but the exposure levels were only moderate. Exposure to tobacco smoke was associated with decreased endothelial function as measured with flow-mediated dilation of the brachial artery, decreased elasticity of the aorta, and increased carotid and aortic intima-media thickness. Longitudinal exposure to tobacco smoke was also associated with increased apolipoprotein B and triglyceride levels in 13-year-old adolescents whose body mass index and nutrient intakes did not differ.
Conclusions: These findings suggest that exposure to tobacco smoke in childhood may play a significant role in the development of early atherosclerosis. Key Words: arterial elasticity, atherosclerosis, children, cotinine, endothelial function, environmental tobacco smoke, intima-media thickness, risk factors, ultrasound
Abstract:
The purpose of this thesis was to investigate creating and improving category purchasing visibility for corporate procurement by utilizing financial information. The thesis was part of the global category-driven spend analysis project of Konecranes Plc. While building a general understanding of category-driven corporate spend visibility, the IT architecture and the purchasing parameters needed for spend analysis were described. In the case part of the study, three manufacturing plants of the Konecranes Standard Lifting, Heavy Lifting, and Services business areas were examined. This included investigating the operative IT system architecture and the processes needed for building corporate spend visibility. The key finding of this study was the identification of the processes needed for gathering purchasing data elements when creating corporate spend visibility in a fragmented source system environment. As an outcome of the study, a roadmap presenting further development areas was introduced for Konecranes.
Abstract:
The research around performance measurement and management has focused mainly on the design, implementation, and use of performance measurement systems. However, there is little evidence about the actual impacts of performance measurement on the different levels of business and operations of organisations, or about the underlying factors that lead to a positive impact of performance measurement. The study thus focuses on this research gap, which can be considered both important and challenging to cover. The first objective of the study was to examine the impacts of performance measurement on different aspects of management, leadership, and the quality of working life, after which the factors that facilitate and improve performance and performance measurement at the operative level of an organisation were examined. The second objective was to study how these factors operate in practice. The third objective focused on the construction of a framework for successful operative-level performance measurement and the utilisation of the factors in organisations. The research objectives have been studied through six research papers utilising empirical data from three separate studies, comprising two sets of interview data and one set of quantitative data. The study applies mainly the hermeneutical research approach. As a contribution of the study, a framework for successful operative-level performance measurement was formed by matching the findings of the current study with performance measurement theory. The study extends the prior research regarding the impacts of performance measurement and the factors that have a positive effect on operative-level performance and performance measurement. The results indicate that, under suitable circumstances, performance measurement has positive impacts on different aspects of management, leadership, and the quality of working life.
The results reveal, for example, that the perceptions of the employees and of the management concerning the impacts of performance measurement on leadership style differ considerably. Furthermore, the fragmented literature has been reorganised into six factors that facilitate and improve the performance of the operations and employees, and the use of performance measurement at the operative level of an organisation. Regarding the managerial implications of the study, managers who work with performance measurement can utilise the framework, for example, by putting its different phases into practice.
Abstract:
Centrifugal compressors are widely used, for example, in refrigeration processes, the oil and gas industry, superchargers, and waste water treatment. In this work, five different vaneless diffusers and six different vaned diffusers are investigated numerically. The vaneless diffusers vary only in their diffuser width, so that four of the geometries have a pinch implemented in them; pinch means a decrease in the diffuser width. Four of the vaned diffusers have the same vane turning angle and a different number of vanes, and two have different vane turning angles. The flow solver used to solve the flow fields is Finflo, a Navier-Stokes solver. All the cases are modeled with Chien's k-ε turbulence model, and selected cases also with the k-ω SST turbulence model. All five vaneless diffusers and three vaned diffusers are also investigated experimentally. For each construction, the compressor operating map is measured according to relevant standards. In addition, the flow fields before and after the diffuser are characterized with static and total pressure, flow angle, and total temperature measurements. When comparing the computational results to the measured results, it is evident that the k-ω SST turbulence model predicts the flow fields better. The simulation results indicate that it is possible to improve the efficiency with the pinch, and according to the numerical results, the two best geometries are the ones with the most pinch at the shroud. These geometries have approximately 4 percentage points higher efficiency than the unpinched vaneless diffuser. The hub pinch does not seem to have any major benefits. In general, the pinches make the flow fields before and after the diffuser more uniform. The pinch also seems to improve the impeller efficiency, for two reasons. The major reason is that the pinch decreases the size of the slow-flow and possible backflow region located near the shroud after the impeller.
Secondly, the pinches decrease the flow velocity in the tip clearance, leading to a smaller tip leakage flow and therefore slightly better impeller efficiency. Some of the vaned diffusers also improve the efficiency, the increment being 1-3 percentage points when compared to the vaneless unpinched geometry. The measurement results confirm that the pinch is beneficial to the performance of the compressor. The flow fields are more uniform in the pinched cases, and the slow-flow regions are smaller. The peak efficiency is approximately 2 percentage points and the design point efficiency approximately 4 percentage points higher with the pinched geometries than with the unpinched geometry. According to the measurements, the two best geometries are the ones with the most pinch at the shroud, the case with the pinch only at the shroud being slightly the better of the two. The vaned diffusers also have better efficiency than the vaneless unpinched geometries; however, the pinched cases have even better efficiencies. The vaned diffusers narrow the operating range considerably, whilst the pinch has no significant effect on the operating range.
Abstract:
This thesis was produced for the Technology Marketing unit at the Nokia Research Center. Technology marketing was a new function at the Nokia Research Center and needed an established framework with the capacity to take multiple aspects into account when measuring team performance. Technology marketing functions existed in other parts of Nokia, yet no single method had been agreed upon for measuring their performance. The purpose of this study was to develop a performance measurement system for Nokia Research Center Technology Marketing. The target was that Nokia Research Center Technology Marketing would have a framework of separate metrics, including benchmarking of the starting level and target values for future planning (numeric values were kept confidential within the company). As a result of this research, the Balanced Scorecard model of Kaplan and Norton was chosen as the performance measurement system for Nokia Research Center Technology Marketing. The research selected the indicators that were utilized in the chosen performance measurement system. Furthermore, the performance measurement system was defined to guide the Head of Marketing in managing the Nokia Research Center Technology Marketing team. During the research process, the team mission, vision, strategy, and critical success factors were outlined.
Abstract:
This thesis presents the calibration and comparison of two systems: a machine vision system that uses 3-channel RGB images, and a line-scanning spectral system. Calibration is the process of checking and adjusting the accuracy of a measuring instrument by comparing it with standards. For the RGB system, self-calibrating methods for finding various parameters of the imaging device were developed. Color calibration was performed, and the colors produced by the system were compared to the known color values of the target. Software drivers for the Sony Robot were also developed, and a mechanical part to connect a camera to the robot was designed. For the line-scanning spectral system, methods were developed for calibrating the alignment of the system and for measuring the dimensions of the line scanned by the system. Color calibration of the spectral system is also presented.
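The thesis does not spell out its calibration mathematics here; as an illustrative assumption only, a per-channel linear color calibration can be fitted by least squares so that measured channel values map onto the known target values (all numbers below are invented):

```python
# Hedged sketch: fit target = gain * measured + offset for one color
# channel by ordinary least squares. This is a generic illustration of
# color calibration against known target values, not the thesis code.

def fit_gain_offset(measured, target):
    """Least-squares fit of target = gain * measured + offset."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(target) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, target))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Measured red-channel values vs. known chart values (made-up data in
# which the true relation is exactly target = 2 * measured):
measured = [10, 50, 90, 130]
target = [20, 100, 180, 260]
gain, offset = fit_gain_offset(measured, target)
```

The same fit, repeated per channel (or generalized to a 3x3 matrix), corrects the system's colors toward the known target values.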
Abstract:
The environmental impact of landfills is a growing concern in waste management practice; assessing the effectiveness of the solutions implemented to mitigate the issue is therefore important. The objectives of the study were to provide an insight into the advantages of landfills and to establish the importance of landfill gas among alternative fuels. Finally, a case study examining the performance of energy production from a land disposal site at Ylivieska was carried out to ascertain the viability of a waste-to-energy project. Both qualitative and quantitative methods were applied. The study was conducted in two parts; the first was a review of the literature focused on landfill gas developments. Specific considerations were the mechanisms governing the variability of gas production and the investigation of mathematical models often used in landfill gas modeling. Furthermore, an analysis of the two main distributed generation technologies used to generate energy from landfills was carried out. The literature review revealed a high influence of waste segregation and of a high moisture content on the waste stabilization process. It was found that the accuracy of forecasting the gas generation rate can be enhanced with both mathematical modeling and field test measurements. The results of the case study mainly indicated the close dependence of the power output on the landfill gas quality and the fuel inlet pressure.
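The mathematical models mentioned in the review are typically first-order decay models; the sketch below follows the widely used LandGEM-style form, where each year's deposited waste contributes gas that decays exponentially with age. All parameter values here are invented for illustration, not taken from the case study.

```python
import math

# First-order decay (LandGEM-style) landfill gas model:
#   Q(t) = sum over deposits i of  k * L0 * M_i * exp(-k * age_i)
# k  = decay constant (1/yr), L0 = methane generation potential
# (m^3 per tonne), M_i = tonnes deposited in year i. Values below are
# illustrative assumptions only.

def gas_generation(masses, k, L0, year):
    """Methane generation rate (m^3/yr) in a given year.
    masses: dict mapping deposit year -> tonnes of waste."""
    total = 0.0
    for y, m in masses.items():
        age = year - y
        if age >= 0:           # waste contributes only after deposit
            total += k * L0 * m * math.exp(-k * age)
    return total

waste = {2000: 10_000, 2001: 12_000}   # tonnes per year (illustrative)
q = gas_generation(waste, k=0.05, L0=100, year=2005)
```

The exponential decay is what produces the characteristic rise and slow tail of landfill gas output that field measurements are used to calibrate.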
Abstract:
Aims: This study was carried out to evaluate the feasibility of two different methods of determining free flap perfusion in cancer patients undergoing major reconstructive surgery. The hypothesis was that low perfusion in the flap is associated with flap complications. Patients and methods: Between August 2002 and June 2008, at the Department of Otorhinolaryngology - Head and Neck Surgery, the Department of Surgery, and the PET Centre, Turku, 30 consecutive patients with 32 free flaps were included in this study. The perfusion of the free microvascular flaps was assessed with positron emission tomography (PET) and radioactive water ([15O]H2O) in 40 radiowater injections in 33 PET studies. Furthermore, 24 free flaps were monitored with continuous tissue oxygen measurement using flexible polarographic catheters for an average of three postoperative days. Results: Of the 17 patients operated on for head and neck (HN) cancer and reconstructed with 18 free flaps, three re-operations were carried out due to poor tissue oxygenation as indicated by ptiO2 monitoring, and three other patients were reoperated on for postoperative hematomas in the operated area. Blood perfusion assessed with PET (BFPET) was above 2.0 mL/min/100 g in all flaps, and a low flap-to-muscle BFPET ratio appeared to correlate with poor survival of the flap. Survival in this group of HN cancer patients was 9.0 months (median; range 2.4-34.2) after a median follow-up of 11.9 months (range 1.0-61.0 months). Seven HN patients of this group are alive without any sign of recurrence, and one patient has died of other causes. All of the 13 breast reconstruction patients included in the study are alive and free of disease at a median follow-up time of 27.4 months (range 13.9-35.7 months). Re-explorations were carried out in three patients due to data provided by ptiO2 monitoring, and one re-exploration was avoided on the basis of adequate blood perfusion assessed with PET. Two patients had donor-site morbidity and three patients had partial flap necrosis or fat necrosis. There were no total flap losses. Conclusions: PtiO2 monitoring is a feasible method of free flap monitoring when the flap temperature is monitored and maintained close to the core temperature. When other monitoring methods give controversial results or are unavailable, the [15O]H2O PET technique is feasible for evaluating the perfusion of newly reconstructed free flaps.