18 results for Cohesive And Adhesive Failure

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

Coronary artery disease is an atherosclerotic disease, which leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, a process known as “adverse remodelling”. This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study the coronary artery disease model had both induced diabetes and hypercholesterolemia. In the second study myocardial ischaemia and infarction were caused by a surgical method and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods for the measurement of myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, a hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. Coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. Large animal models were also used in the testing of novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake.
In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling and decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labelled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies concerning disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.

Relevance:

100.00%

Publisher:

Abstract:

The development of a load-bearing osseous implant with the desired mechanical and surface properties, in order to promote incorporation with bone and to eliminate the risk of bone resorption and implant failure, is a very challenging task. Bone formation and resorption processes depend on the mechanical environment: certain stress/strain conditions are required to promote new bone growth and to prevent bone mass loss. Conventional metallic implants with high stiffness carry most of the load, and the surrounding bone becomes virtually unloaded and inactive. Fibre-reinforced composites offer an interesting alternative to metallic implants, because their mechanical properties can be tailored to be equal to those of bone by careful selection of the matrix polymer and of the type, volume fraction, orientation and length of the fibres. Successful load transfer at the bone-implant interface requires proper fixation between the bone and the implant. One promising method to promote fixation is to prepare implants with a porous surface. Bone ingrowth into the porous surface structure stabilises the system and improves the clinical success of the implant. The experimental part of this work was focused on polymethyl methacrylate (PMMA) -based composites with a dense load-bearing core and a porous surface. Three-dimensionally randomly orientated chopped glass fibres were used to reinforce the composite. A method to fabricate these composites by a solvent treatment technique was developed, and the functionality of the surface structure was characterised in vitro and in vivo. Scanning electron microscope observations revealed that the pore size and interconnected porous architecture of the surface layer of the fibre-reinforced composite (FRC) could be optimal for bone ingrowth. Microhardness measurements showed that the solvent treatment did not affect the mechanical properties of the load-bearing core.
A push-out test, using dental stone as a bone model material, revealed that the short glass fibre-reinforced porous surface layer is strong enough to carry load. Unreacted monomers can cause chemical necrosis of the tissue, but the levels of leachable residual monomers were considerably lower than those found in chemically cured fibre-reinforced dentures and in modified acrylic bone cements. Animal experiments proved that a surface-porous FRC implant can enhance fixation between bone and FRC. New bone ingrowth into the pores was detected, and strong interlocking between the bone and the implant was achieved.

Relevance:

100.00%

Publisher:

Abstract:

This doctoral dissertation investigates the adult education policy of the European Union (EU) in the framework of the Lisbon agenda 2000-2010, with a particular focus on the changes of policy orientation that occurred during this reference decade. The year 2006 can in fact be considered a turning point for EU policy-making in the adult learning sector: a radical shift from a wide-ranging and comprehensive conception of educating adults towards a vocationally oriented understanding of this field and policy area has been observed, in particular in the second half of the so-called ‘Lisbon decade’. In this light, one of the principal objectives of the mainstream policy set by the Lisbon Strategy, that of fostering all forms of participation of adults in lifelong learning paths, appears to have changed its political background and vision in a very short period of time, reflecting an underlying polarisation and progressive transformation of European policy orientations. Hence, by means of content analysis and process tracing, it is shown that the target of EU adult education policy has, in this framework, shifted from citizens to workers, and that the competence development model, borrowed from the corporate sector, has been established as the reference for the new policy road maps. This study draws on the theory of governance architectures and applies a post-ontological perspective to discuss whether the above trends are intrinsically due to the nature of the Lisbon Strategy, which encompasses education policies, and to what extent supranational actors and phenomena such as globalisation influence European governance and decision-making.
Moreover, it is shown that the way in which the EU is shaping the upgrading of skills and competences of adult learners is modelled around the needs of the ‘knowledge economy’, thus according a great deal of importance to ‘new skills for new jobs’ and perhaps not enough to life skills in the broader sense, which include, for example, social and civic competences: these are often promoted but rarely implemented in depth in EU policy documents. In this framework, it is conveyed how different EU policy areas are intertwined and interrelated with global phenomena, and it is emphasised how far the building of EU education systems should play a crucial role in the formation of critical thinking, civic competences and skills for a sustainable democratic citizenship, on which a truly cohesive and inclusive society fundamentally depends; a model of environmental and cosmopolitan adult education is proposed in order to address the challenges of the new millennium. In conclusion, an appraisal of the EU’s public policy is outlined, along with some personal thoughts on how progress might be pursued and actualised.

Relevance:

100.00%

Publisher:

Abstract:

Panel at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

In the new age of information technology, big data has grown to be a prominent phenomenon. As information technology evolves, organizations have begun to adopt big data and apply it as a tool throughout their decision-making processes. Research on big data has grown in recent years, however mainly from a technical stance, and there is a void in business-related cases. This thesis fills that gap by addressing big data challenges and failure cases. The Technology-Organization-Environment framework was applied to carry out a literature review on trends in Business Intelligence and Knowledge Management information system failures. A review of extant literature was carried out using a collection of leading information system journals. Academic papers and articles on big data, Business Intelligence, Decision Support Systems, and Knowledge Management systems were studied from both failure and success aspects in order to build a model for big data failure. I then delineate the contribution of the Information System failure literature, as it is the principal dynamic behind the Technology-Organization-Environment framework. The gathered literature was categorised and a failure model was developed from the identified critical failure points. The failure constructs were further categorized, defined, and tabulated into a contextual diagram. The developed model and table were designed to act as a comprehensive starting point and as general guidance for academics, CIOs or other system stakeholders, to facilitate decision-making in the big data adoption process by measuring the effect of technological, organizational, and environmental variables against perceived benefits, dissatisfaction and discontinued use.


Relevance:

100.00%

Publisher:

Abstract:

Infective endocarditis in adult patients treated at a university central hospital in 1980-2004. Background: Infective endocarditis is still a serious disease. Although its diagnostics and treatment have advanced, it remains associated with significant morbidity and mortality. The clinical picture of endocarditis has changed in many countries in recent years. Aims: To study the clinical picture and prognosis of endocarditis in adult patients treated for the disease at a Finnish university hospital in 1980-2004. Material: In Study I, the probability of endocarditis was analysed in 222 patients treated for suspected endocarditis in 1980-1995, using both the Duke and the von Reyn diagnostic criteria. In Study II, the neurological complications of endocarditis were studied in 218 definite or possible episodes of endocarditis. In Study III, the usefulness of serum C-reactive protein (CRP) in assessing treatment response was studied in 134 episodes classified as definite endocarditis. In Study IV, the usefulness of broad-range bacterial PCR in etiological diagnostics was studied in 56 patients operated on for suspected endocarditis. In Studies V and VI, all 303 endocarditis patients treated in 1980-2004 were analysed with respect to short-term and one-year prognosis, and the changes in the clinical picture of endocarditis in our hospital were studied. Results: The Duke criteria proved more sensitive than the von Reyn criteria in the diagnosis of endocarditis: of the 243 episodes studied, 114 were classified as definite endocarditis by the Duke criteria, whereas only 64 were classified as definite by the von Reyn criteria (p<0.001). Moreover, the diagnosis of endocarditis was rejected in as many as 115 episodes by the von Reyn criteria, but in only 37 episodes by the Duke criteria (p<0.001).
A neurological complication appeared before the start of antimicrobial therapy in 76% of the episodes, and was the first symptom in 47%. Death was significantly associated with neurological complications. In the follow-up of treatment response, the decline of serum CRP was significantly faster in patients who recovered without complications than in those who developed complications or died of the disease. PCR analysis of the excised valve was the only method to yield an etiological diagnosis in four cases (two staphylococcal species, one Streptococcus bovis, one Bartonella quintana), in all of which antimicrobial therapy had been in use before the samples were taken. In the whole material, infection of two valves or the development of neurological complications, peripheral emboli or heart failure predicted both in-hospital and one-year mortality, while age ≥65 years and a vegetation found on echocardiography, or a major criterion of the Duke classification, predicted death within one year. A high CRP level on admission predicted both in-hospital and one-year mortality. Endocarditis among intravenous drug users increased significantly during the study period (p<0.001). Conclusions: This work confirms the usefulness of the Duke criteria in the diagnostics of endocarditis. It also reinforced the view that rapid diagnosis and early initiation of antimicrobial therapy are the best means of preventing neurological complications and improving the prognosis of endocarditis patients. Normalisation of CRP is a sign of good prognosis in endocarditis patients. PCR analysis performed directly on valve tissue is useful when the causative organism is fastidious or the patient has received antimicrobial therapy before culture samples were taken. Several markers of poor prognosis identified in earlier studies predicted a poor outcome in the patients of this study as well. As a new finding, a high CRP value on admission indicated both a poor short-term and a poor long-term prognosis.
The emergence of endocarditis among intravenous drug users is the most important epidemiological change during the 25-year study period.

Relevance:

100.00%

Publisher:

Abstract:

This Master's thesis examines how operating experience data, and fault data in particular, can be analysed and used to develop dependability and maintenance. The aim of the work is to find a suitable operating model for recording and analysing fault data in the target organisation. The theoretical part of the thesis reviews factors related to maintenance and dependability in general, as well as models for developing and optimising maintenance and dependability, with particular attention to the recording and analysis of operating experience data. As an example, the utilisation of operating experience data in the oil and gas industry is briefly reviewed. The empirical part surveys and assesses the current state of operating experience data recording and analysis in the target organisation, including a review of the features of the ERP and reporting systems in use from the viewpoint of data utilisation. As a result, the work recommends development actions concerning data recording and analysis practices and the tools of the ERP system.

Relevance:

100.00%

Publisher:

Abstract:

Woven monofilament, multifilament, and spun yarn filter media have long been the standard media in liquid filtration equipment. While the energy for a solid-liquid separation process is determined by the engineering work, it is the interface between the slurry and the equipment, the filter medium, that greatly affects the performance characteristics of the unit operation. Those skilled in the art are well aware that a poorly designed filter medium may endanger the whole operation, whereas well-performing filter media can make the operation smooth and economical. As mineral and pulp producers seek to produce ever finer and more refined fractions of their products, it is becoming increasingly important to be able to dewater slurries with average particle sizes around 1 µm using conventional, high-capacity filtration equipment. Furthermore, the surface properties of the media must not allow sticky and adhesive particles to adhere to the media. The aim of this thesis was to test how the dirt-repellency, electrical resistance and high-pressure filtration performance of selected woven filter media can be improved by modifying the fabric or yarn with coating, chemical treatment and calendering. The results achieved by chemical surface treatments clearly show that the surface properties of woven media can be modified to achieve lower electrical resistance and improved dirt-repellency. The main challenge with the chemical treatments is abrasion resistance and, while the experimental results indicate that the treatment is sufficiently permanent to resist standard weathering conditions, it may still prove to be inadequately strong in actual use. From the pressure filtration studies in this work, it seems obvious that conventional woven multifilament fabrics still perform surprisingly well against the coated media in terms of filtrate clarity and cake build-up.
Especially in cases where the feed slurry concentration was low and the pressures moderate, the conventional media seemed to outperform the coated media. In cases where the feed slurry concentration was high, the tightly woven media performed well against the monofilament reference fabrics, but seemed to do worse than some of the coated media. This result is somewhat surprising in that the high initial specific resistance of the coated media would suggest that they blind more easily than the plain woven media. The results indicate, however, that it is actually the woven media that gradually clog during the course of filtration. In conclusion, it seems obvious that there is a pressure limit above which the woven media lose their capacity to keep the solid particles from penetrating the structure. This finding suggests that for extreme pressures the only foreseeable solution is coated fabrics supported by a woven fabric strong enough to hold the structure together. That said, the high-pressure filtration process seems to follow somewhat different laws than the more conventional processes. Based on the results, it may well be that the role of the cloth is above all to support the cake, and the main performance-determining factor is a long lifetime. Measuring the pore size distribution with a commercially available porometer gives a fairly accurate picture of the pore size distribution of a fabric, but fails to give insight into which of the pore sizes is the most important in determining the flow through the fabric. Historically, air, and sometimes water, permeability measurements have been the standard in evaluating media filtration performance, including particle retention. Permeability, however, is a function of a multitude of variables and does not directly allow the estimation of the effective pore size. In this study a new method for estimating the effective pore size and open pore area in a densely woven multifilament fabric was developed.
The method combines a simplified equation of the electrical resistance of fabric with the Hagen-Poiseuille flow equation to estimate the effective pore size of a fabric and the total open area of pores. The results are validated by comparison to the measured values of the largest pore size (Bubble point) and the average pore size. The results show good correlation with measured values. However, the measured and estimated values tend to diverge in high weft density fabrics. This phenomenon is thought to be a result of a more tortuous flow path of denser fabrics, and could most probably be cured by using another value for the tortuosity factor.
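The Hagen-Poiseuille step of such an estimate can be sketched numerically. The following is only a minimal illustration of that one relation, not the thesis's combined electrical-resistance method, and every numerical value (flow, pore count, thickness) is hypothetical:

```python
import math

def effective_pore_radius(q_total, n_pores, dp, length, viscosity):
    """Effective radius of n identical parallel cylindrical pores that
    would pass the measured flow, from the Hagen-Poiseuille equation:
    Q_pore = pi * r**4 * dp / (8 * mu * L).
    """
    q_pore = q_total / n_pores
    return (8.0 * viscosity * length * q_pore / (math.pi * dp)) ** 0.25

# Illustrative (made-up) values: water at room temperature through a
# thin, densely woven fabric.
r_eff = effective_pore_radius(
    q_total=1e-6,    # m^3/s total filtrate flow
    n_pores=1e6,     # assumed number of open pores in the sample
    dp=1e5,          # Pa pressure drop across the fabric
    length=0.5e-3,   # m fabric thickness taken as the pore length
    viscosity=1e-3,  # Pa*s dynamic viscosity of water
)
# r_eff comes out in the micrometre range for these inputs
```

If the pore count is not known directly, an independent measurement (such as the electrical resistance of the liquid-filled fabric) is needed to close the system, which is essentially the role the resistance equation plays in the method described above.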

Relevance:

100.00%

Publisher:

Abstract:

The ultimate goal of any research in the mechanism/kinematic/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology to facilitate the design analysis and optimisation of complex mechanisms, mechanical components and systems. As a part of the systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables, and the solutions are obtained by adopting closed-form classical or modern algebraic solution methods or numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are: (ia) limitations on the number of design specifications and (iia) failure in handling design constraints, especially inequality constraints. The main drawbacks of approximate synthesis formulations are: (ib) it is difficult to choose a proper initial linkage and (iib) it is hard to find more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, providing several solutions, but they cannot handle inequality constraints.
Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through the literature research it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric algebraic systems of n equations and at least n+1 variables (parametric in the mathematical sense that all parameter values, including the degenerate cases, for which the system is solvable are considered). By adopting the developed solution method to solve the dyadic equations in direct polynomial form for two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be solved. The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design.
Modern mechanism optimisation at the system level demands the integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed kinematic design method is based on combinations of the two-precision-point formulation and on optimisation (with mathematical programming techniques or with optimisation methods based on probability and statistics) of substructures, using criteria calculated from the system-level response of multidegree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, the drawbacks (ia)-(iib) are cancelled. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with the mechanical system simulation techniques.
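As a toy illustration of the exact-synthesis idea, solving n equations in n unknowns, the sketch below applies Newton's method with a finite-difference Jacobian to a two-equation position constraint (a generic circle-intersection step; the thesis's dyadic equations are not reproduced here, and the pivot coordinates are invented):

```python
def newton2(f, x0, tol=1e-10, max_iter=50):
    """Newton's method for a system of two nonlinear equations in two
    unknowns, with a finite-difference Jacobian."""
    x, y = x0
    h = 1e-7
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) + abs(f2) < tol:
            break
        # finite-difference Jacobian
        j11 = (f(x + h, y)[0] - f1) / h
        j12 = (f(x, y + h)[0] - f1) / h
        j21 = (f(x + h, y)[1] - f2) / h
        j22 = (f(x, y + h)[1] - f2) / h
        det = j11 * j22 - j12 * j21
        # solve J * d = -F by Cramer's rule and take the Newton step
        x += (-f1 * j22 + f2 * j12) / det
        y += (f1 * j21 - f2 * j11) / det
    return x, y

# Example: locate a point at unit distance from two fixed pivots,
# the circle-intersection step underlying planar position synthesis.
def constraints(x, y):
    return (x**2 + y**2 - 1.0,            # distance 1 from pivot (0, 0)
            (x - 1.0)**2 + y**2 - 1.0)    # distance 1 from pivot (1, 0)

sol = newton2(constraints, (0.4, 0.8))    # converges near (0.5, 0.866)
```

Starting from (0.4, -0.8) the same iteration converges to the mirror solution (0.5, -0.866), which hints at why exact formulations generally produce several solution branches and why finding all of them is the hard part of the problem.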

Relevance:

100.00%

Publisher:

Abstract:

The starting point of this Master's thesis is to study the prerequisites for, and possibilities of, fixed condition monitoring and preventive maintenance in the pressure groundwood plant of the Anjala paper mill. The aim of the work is to improve the cost-effectiveness of maintenance and to reduce downtime, and thereby to raise the productivity of the whole production process. The work first examines the current sources of disturbance in the process equipment and the possibilities for maintenance to influence the repair of failures and their costs. Based on this examination, it was decided to increase the amount of preventive maintenance in the pressure groundwood plant and to commission a fixed condition monitoring system on line H4. According to the study and the theory, the benefits of continuous condition monitoring are a substantial improvement in equipment dependability and a reduction in maintenance costs. Based on the experimental part of the work, maintenance costs fell by about 24% and, owing to improved availability, equipment failures fell by 50%. The results were achieved by carrying out planned preventive maintenance on critical equipment. Fixed condition monitoring gave maintenance information on the correct service intervals of the equipment. The benefits of the investment were fully realised during the first year of operation of the system. Raising the competence of the personnel will further increase the benefits mentioned above.

Relevance:

100.00%

Publisher:

Abstract:

Safety is paramount in nuclear energy production. Probabilistic risk assessment can be used to evaluate whether safety requirements are met in different situations. This Master's thesis examines the use of probabilistic risk assessment in evaluating the effects of cable fires at a nuclear power plant. The purpose of the work is to contribute to improving the cable fire safety of nuclear power plants. The thesis presents the principles of probabilistic risk assessment and probabilistic fire analysis, as well as current cable fire analysis methods. Based on the existing methods, a method was developed for assessing the cable fire safety of the Olkiluoto 1 and 2 plant units. The thesis also reviews cable fires that have occurred around the world, as well as software developed for fire simulation at nuclear power plants. The cable fire analysis developed in this work is divided into two main phases: circuit failure analysis and probability analysis of circuit failures. Circuit failure analysis comprises determining the failure modes of cables, the failure categories of circuits, and the effects of the failures. In the probability analysis of circuit failures, failure probabilities are determined on the basis of cable fire test results. The developed analysis method was applied, as an example, to two rooms of the Olkiluoto 1 and 2 plant units. The results were failure models for the circuits of safety-significant systems and their probabilities. Based on the results, it can be stated that the cable fire analysis method developed in the work performed well. In the future, the method is intended to be used in assessing the cable fire safety of the Olkiluoto 1 and 2 plant units.

Relevance:

100.00%

Publisher:

Relevance:

100.00%

Publisher:

Abstract:

Induction motors are widely used in industry, and they are generally considered very reliable. They often have a critical role in industrial processes, and their failure can lead to significant losses as a result of shutdown times. Typical failures of induction motors can be classified into stator, rotor, and bearing failures. One of the causes of bearing damage, and eventually bearing failure, is bearing currents. Bearing currents in induction motors can be divided into two main categories: classical bearing currents and inverter-induced bearing currents. Bearing damage caused by bearing currents results, for instance, from electrical discharges that take place through the lubricant film between the raceways of the inner and the outer ring and the rolling elements of a bearing. This phenomenon can be considered similar to that of electrical discharge machining, where material is removed by a series of rapidly recurring electrical arcing discharges between an electrode and a workpiece. This thesis concentrates on bearing currents with a special reference to bearing current detection in induction motors. A bearing current detection method based on radio frequency impulse reception and detection is studied. The thesis describes how a motor can work as a “spark gap” transmitter and discusses a discharge in a bearing as a source of radio frequency impulses. It is shown that a discharge occurring due to bearing currents can be detected at a distance of several meters from the motor. The issues of interference, detection, and location techniques are discussed. The applicability of the method is shown with a series of measurements with a specially constructed test motor and an unmodified frequency-converter-driven motor. The radio frequency method studied provides a nonintrusive way to detect harmful bearing currents in the drive system.
If bearing current mitigation techniques are applied, their effectiveness can be immediately verified with the proposed method. The method also gives a tool to estimate the harmfulness of the bearing currents by making it possible to detect and locate individual discharges inside the bearings of electric motors.
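The counting of individual discharges can be conveyed with a simple threshold detector on a sampled received signal. This is only a schematic sketch with synthetic data, not the radio frequency measurement chain of the thesis; the sampling rate, threshold, and burst shape are all invented:

```python
import math
import random

def detect_impulses(signal, sample_rate, threshold, dead_time):
    """Return the times (s) at which the rectified signal first exceeds
    the threshold, merging crossings closer together than dead_time so
    that one decaying discharge burst is counted as a single event."""
    events = []
    last = -dead_time
    for i, v in enumerate(signal):
        t = i / sample_rate
        if abs(v) > threshold and t - last >= dead_time:
            events.append(t)
            last = t
    return events

# Synthetic record: background noise plus two short decaying bursts
# standing in for bearing discharge impulses.
random.seed(1)
fs = 1_000_000                                 # 1 MHz sampling (hypothetical)
sig = [random.gauss(0.0, 0.01) for _ in range(10_000)]
for start in (2_000, 7_500):                   # burst start samples
    for k in range(20):
        sig[start + k] += math.exp(-k / 5.0)   # decaying impulse

hits = detect_impulses(sig, fs, threshold=0.2, dead_time=1e-4)
# two events, at t = 2 ms and t = 7.5 ms
```

The dead-time parameter plays the same role as pulse blanking in a real detector: without it, every sample of one decaying burst would be counted as a separate discharge.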