914 results for Empirical Mode Decomposition, vibration-based analysis, damage detection, signal decomposition


Relevance: 100.00%

Abstract:

Over recent years, alongside technological progress, the difficulty of visual inspection and growing awareness of the effects of poor inspection, sensitivity to the importance of structural monitoring has increased, especially for large civil engineering infrastructures. Structural monitoring systems allow the continuous tracking of the behaviour of a given structure so that, with the data obtained, changes in its behaviour can be assessed. Accordingly, structural damage identification strategies have been developed and implemented in order to increase structural reliability and to prevent, at an early stage, changes in the condition of the structure from evolving into more severe situations. In this context, the first part of this dissertation is an introduction to structural monitoring and structural damage detection. Regarding monitoring, its objectives and the principles of its application are set out; the main sensors are presented and described, and the functions of a data acquisition system are explained. The second topic addresses the importance of damage detection and introduces the methods studied in this work: the influence-line method, the mode-shape curvature method and the wavelet-transform method. The second part of the dissertation presents two case studies. The first comprises a numerical and an experimental component: a beam model subjected to several damage scenarios is studied, and the ability of the influence-line method to detect and localise these anomalies is validated. The second consists of the numerical modelling of a real bridge, the subsequent simulation of damage scenarios, and a comparative analysis of the effectiveness of each of the three damage detection methods in identifying and localising the simulated damage. Finally, the main conclusions of this work are presented and some topics for future work are suggested.
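
The mode-shape curvature method mentioned in this abstract is straightforward to sketch. The following is a minimal illustration, not taken from the dissertation: it assumes a synthetic first bending mode of a simply supported beam, with an artificial local dip standing in for damage, and approximates curvature by central differences.

```python
import numpy as np

def mode_shape_curvature(phi: np.ndarray, h: float) -> np.ndarray:
    """Approximate the curvature of a mode shape by central differences."""
    kappa = np.zeros_like(phi)
    kappa[1:-1] = (phi[2:] - 2.0 * phi[1:-1] + phi[:-2]) / h**2
    return kappa

# Synthetic illustration: first bending mode of a simply supported beam,
# with a local stiffness loss emulated as a small dip near x = 0.3 L.
L, n = 10.0, 201
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
phi_undamaged = np.sin(np.pi * x / L)
phi_damaged = phi_undamaged - 0.002 * np.exp(-((x - 0.3 * L) ** 2) / 0.05)

# Damage index: absolute change in curvature; it peaks near the damage site.
delta = np.abs(mode_shape_curvature(phi_damaged, h)
               - mode_shape_curvature(phi_undamaged, h))
print(f"suspected damage location: x = {x[np.argmax(delta[1:-1]) + 1]:.2f} m")
```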

Relevance: 100.00%

Abstract:

6th Graduate Student Symposium on Molecular Imprinting

Relevance: 100.00%

Abstract:

Complex industrial plants exhibit multiple interactions among their smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries and cause a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years the data reveal double PL behavior, while for more recent periods a single PL fits the data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, together with multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. Both the classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
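
As a rough illustration of the entropy and Kullback–Leibler measures this paper applies (not the paper's own code), the sketch below computes Shannon entropy and a smoothed KL divergence for two hypothetical heavy-tailed fatality samples binned on a common grid; the Pareto samples and all parameter values are assumptions.

```python
import numpy as np

def shannon_entropy(p: np.ndarray) -> float:
    """Shannon entropy of a discrete probability distribution (nats)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def kl_divergence(p: np.ndarray, q: np.ndarray, eps: float = 1e-12) -> float:
    """Kullback-Leibler divergence D(p || q) with light smoothing."""
    p = (p + eps) / np.sum(p + eps)
    q = (q + eps) / np.sum(q + eps)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical fatality samples for two time periods, binned on a common
# logarithmic grid and normalised to empirical distributions.
rng = np.random.default_rng(0)
early = rng.pareto(1.5, 500)   # heavier tail, double-PL-like regime
late = rng.pareto(2.5, 500)    # lighter tail
bins = np.logspace(0, 3, 30)
p, _ = np.histogram(early, bins=bins)
q, _ = np.histogram(late, bins=bins)
p = p / p.sum()
q = q / q.sum()

print(f"H(early) = {shannon_entropy(p):.3f} nats")
print(f"D(early || late) = {kl_divergence(p, q):.3f} nats")
```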

Relevance: 100.00%

Abstract:

Gas chromatography (GC) is a very useful analytical tool for investigating the composition of gaseous mixtures. The different gases are separated by specific columns but, if hydrogen (H2) is present in the sample, its detection can be performed by a thermal conductivity detector or a helium ionization detector. Indeed, coupled to GC, no other detector can perform this detection except the expensive atomic emission detector. Based on the detection and analysis of H2 isotopes by low-pressure chemical ionization mass spectrometry (MS), a new method for H2 detection by GC coupled to MS with an electron ionization ion source and a quadrupole analyser is presented. The presence of H2 in a gaseous mixture can easily be demonstrated by monitoring the molecular ion of the protonated carrier gas. Copyright © 2013 John Wiley & Sons, Ltd.

Relevance: 100.00%

Abstract:

Nanomotors are nanoscale devices capable of converting energy into movement and forces. Among them, self-propelled nanomotors offer considerable promise for developing new and novel bioanalytical and biosensing strategies based on the direct isolation of target biomolecules or on changes in their movement in the presence of target analytes. The main achievement of this project is the development of receptor-functionalized nanomotors that offer direct and rapid target detection, isolation and transport from raw biological samples without preparatory and washing steps. For example, microtube engines functionalized with aptamer, antibody, lectin and enzyme receptors were used for the direct isolation of analytes of biomedical interest, including proteins and whole cells, among others. A target protein was also isolated from a complex sample using an antigen-functionalized microengine navigating in the reservoirs of a lab-on-a-chip device. The new nanomotor-based biomarker detection strategy not only offers a highly sensitive, rapid, simple and low-cost alternative for the isolation and transport of target molecules, but also represents a new dimension of analytical information based on motion. The recognition events can easily be visualized by optical microscopy (without any sophisticated analytical instrument) to reveal the target's presence and concentration. The use of artificial nanomachines has proven useful not only for (bio)recognition and (bio)transport but also for the detection and remediation of environmental contamination. In this context, micromotors modified with a superhydrophobic layer effectively interacted with, captured, transported and removed oil droplets from oil-contaminated samples. Finally, a unique micromotor-based strategy for water-quality testing that mimics live-fish water-quality testing, based on changes in the propulsion behavior of artificial biocatalytic microswimmers in the presence of aquatic pollutants, was also developed. The attractive features of the new micromachine-based target isolation and signal transduction protocols developed in this project offer numerous potential applications in biomedical diagnostics, environmental monitoring, and forensic analysis.

Relevance: 100.00%

Abstract:

A procedure is described that allows the simple identification and sorting of live human cells that actively transcribe HIV, based on the detection of GFP fluorescence. Using adenoviral vectors for gene transfer, an expression cassette comprising the HIV-1 LTR driving the reporter gene GFP was introduced into cells that stably expressed either the Tat transcriptional activator or an inactive mutant of Tat. Both northern blot and fluorescence-activated cell sorting (FACS) analyses indicate that cells containing the functional Tat protein presented levels of GFP mRNA and GFP fluorescence several orders of magnitude higher than control cells. Correspondingly, cells infected with HIV-1 showed similarly enhanced reporter gene activation. HIV-1-infected cells of the lymphocytic line Jurkat were easily identified by FACS, as they displayed much higher green fluorescence after transduction with the reporter adenoviral vector. The procedure could also be applied to primary human cells: blood monocyte-derived macrophages exposed to the adenoviral LTR-GFP reporter presented much higher fluorescence when infected with HIV-1 compared with uninfected cells. The vector described has the advantages of labelling cells independently of their proliferation status and of allowing analysis on intact cells, which can subsequently be isolated by FACS for further culture. This work suggests that adenoviral vectors in which a virus-specific transcriptional control element drives the expression of a fluorescent protein will be useful for identifying and isolating cells actively transcribing the viral template, and for drug screening and susceptibility assays.

Relevance: 100.00%

Abstract:

One major methodological problem in the analysis of sequence data is the determination of the costs from which distances between sequences are derived. Although this problem is currently not optimally dealt with in the social sciences, it resembles problems that have been solved in bioinformatics over the past three decades. In this article, the authors propose an optimization of substitution and deletion/insertion costs based on computational methods. The authors provide an empirical way of determining costs for cases, frequent in the social sciences, in which theory does not clearly favor one cost scheme over another. Using three distinct data sets, the authors tested the distances and cluster solutions produced by the new cost scheme against solutions based on cost schemes associated with other research strategies. The proposed method performs well compared with other cost-setting strategies, while alleviating the problem of justifying cost schemes.
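
The cost-based sequence distances discussed here are computed with the standard optimal-matching dynamic program. The sketch below is a generic implementation, not the authors' optimized cost scheme; the example sequences and the substitution-cost matrix are hypothetical.

```python
import numpy as np

def optimal_matching(a, b, sub_cost, indel):
    """Edit distance between two state sequences, with a substitution-cost
    matrix and a constant insertion/deletion cost (dynamic programming)."""
    n, m = len(a), len(b)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = indel * np.arange(n + 1)
    d[0, :] = indel * np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i, j] = min(
                d[i - 1, j] + indel,                             # deletion
                d[i, j - 1] + indel,                             # insertion
                d[i - 1, j - 1] + sub_cost[a[i - 1], b[j - 1]],  # substitution
            )
    return d[n, m]

# Hypothetical example: employment-state sequences (0 = school, 1 = work,
# 2 = unemployed) with an assumed substitution-cost matrix.
sub = np.array([[0.0, 1.0, 2.0],
                [1.0, 0.0, 1.5],
                [2.0, 1.5, 0.0]])
s1 = [0, 0, 1, 1, 1, 2]
s2 = [0, 1, 1, 2, 2, 2]
print(optimal_matching(s1, s2, sub, indel=1.0))
```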

Relevance: 100.00%

Abstract:

The Wigner higher order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher order moment spectra domains. A general class of time-frequency higher order moment spectra is also defined in terms of arbitrary higher order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented in which the Wigner bispectrum (WB), a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD through simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
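
For orientation, the sketch below computes the ordinary discrete Wigner-Ville distribution, the second-order member of the family this paper generalizes, using an FFT over the lag variable. The chirp test signal and the pseudo-WVD simplifications (no smoothing window, no edge correction) are choices made here, not the paper's DTF-WHOS algorithms.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x: np.ndarray) -> np.ndarray:
    """Discrete (pseudo) Wigner-Ville distribution of a real signal,
    computed column by column with an FFT over the lag variable."""
    z = hilbert(x)            # analytic signal reduces cross-terms
    n = len(z)
    wvd = np.zeros((n, n))
    for t in range(n):
        tau_max = min(t, n - 1 - t)
        tau = np.arange(-tau_max, tau_max + 1)
        kernel = np.zeros(n, dtype=complex)
        kernel[tau % n] = z[t + tau] * np.conj(z[t - tau])
        wvd[:, t] = np.real(np.fft.fft(kernel))
    return wvd

# Example: linear chirp; the WVD concentrates energy along the
# instantaneous-frequency line.
fs = 256
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * (20 * t + 40 * t ** 2))
w = wigner_ville(x)
print(w.shape)  # (256, 256) time-frequency matrix
```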

Relevance: 100.00%

Abstract:

Activity decreases, or deactivations, of midline and parietal cortical brain regions are routinely observed in human functional neuroimaging studies that compare periods of task-based cognitive performance with passive states, such as rest. It is now widely held that such task-induced deactivations index a highly organized "default-mode network" (DMN): a large-scale brain system whose discovery has had broad implications in the study of human brain function and behavior. In this work, we show that common task-induced deactivations from rest also occur outside of the DMN as a function of increased task demand. Fifty healthy adult subjects performed two distinct functional magnetic resonance imaging tasks that were designed to reliably map deactivations from a resting baseline. As primary findings, increases in task demand consistently modulated the regional anatomy of DMN deactivation. At high levels of task demand, robust deactivation was observed in non-DMN regions, most notably, the posterior insular cortex. Deactivation of this region was directly implicated in a performance-based analysis of experienced task difficulty. Together, these findings suggest that task-induced deactivations from rest are not limited to the DMN and extend to brain regions typically associated with integrative sensory and interoceptive processes.

Relevance: 100.00%

Abstract:

The purpose of this work was to collect dependability data on the flue gas lines of two Finnish pulp mills, from their commissioning up to the present day. Dependability data consists of reliability data and maintenance data. With the collected data it is possible to describe the dependability of the plant accurately using the following indicators: the number of unplanned failures and their repair times, equipment downtime, the probability of failures, and corrective maintenance costs relative to the total corrective maintenance costs of the flue gas line. The method used to collect the dependability data is presented. The method used to determine the critical equipment of the flue gas line is a combination of a questionnaire survey and a modified failure mode, effects and criticality analysis. The criteria for selecting equipment for the final criticality analysis were decided on the basis of the dependability data and the questionnaire survey. The purpose of determining the critical equipment is to find those items in the flue gas line whose unexpected failure has the most severe consequences for the reliability, production, safety, emissions and costs of the flue gas line. With this information, limited maintenance resources can be allocated correctly. As a result of the criticality analysis, the three most critical items of equipment in the flue gas line, common to both pulp mills, are the flue gas fans, the drag conveyors and the chain conveyors. The dependability data shows that equipment reliability is mill-specific, but in principle the same main trends can be seen in the figures showing the probability of unplanned failures. The costs, expressed as the ratio of an item's unplanned maintenance costs to the total costs of the flue gas line, follow very closely the reliability curve calculated as the ratio of the item's downtime to its operating hours. Dependability data collection combined with criticality analysis makes it possible to target and schedule preventive maintenance correctly over the lifetime of the equipment so that reliability and cost-effectiveness requirements are met.
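
A minimal sketch of the kind of dependability indicators listed above (failure counts, repair times, downtime and cost shares). The data class, function names and the example fan history are illustrative assumptions, not the thesis's actual records.

```python
from dataclasses import dataclass

@dataclass
class FailureRecord:
    """One unplanned failure: hours spent on repair and the repair cost."""
    repair_hours: float
    cost: float

def dependability_summary(records, operating_hours, line_total_cost):
    """Basic dependability indicators of the kind listed in the abstract:
    failure count, mean time to repair, downtime-based unavailability and
    the item's share of the line's corrective-maintenance cost."""
    downtime = sum(r.repair_hours for r in records)
    cost = sum(r.cost for r in records)
    return {
        "failures": len(records),
        "mttr_h": downtime / len(records) if records else 0.0,
        "unavailability": downtime / operating_hours,
        "cost_share": cost / line_total_cost,
    }

# Hypothetical flue gas fan history over one year of operation.
fan = [FailureRecord(6.0, 4200.0), FailureRecord(14.0, 9800.0)]
print(dependability_summary(fan, operating_hours=8760.0,
                            line_total_cost=120000.0))
```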

Relevance: 100.00%

Abstract:

This Master's thesis was written for the Stora Enso Flexible Packaging Papers business unit. In its North American mills, the business unit has developed a range of new flexible packaging paper grades. The thesis investigates opportunities for sales of these new flexible packaging papers in selected Western European markets. The study consists of a theoretical and an empirical part. The theoretical part presents the supply chain of flexible packaging, the discovery of customer requirements, the concept of an offering, general market analysis, customer analysis and the basis for sales planning. The empirical part includes a preliminary market analysis based on secondary sources, the results of lead-user interviews, and conclusions and recommendations. Potential customers' technical and commercial requirements were identified and compared with the current Stora Enso Flexible Packaging Papers offering. A list of potential new customers was also compiled, and sales actions were suggested in order to gain new accounts.

Relevance: 100.00%

Abstract:

This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of spatial, temporal and spatio-temporal clusters of environmental point data. The clustering methods developed were applied both to simulated datasets and to real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can normally be modelled as stochastic point processes in which each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additional information, such as burned area, ignition causes, land use, and topographic, climatic and meteorological features, can also be used to characterise the studied phenomenon. The space-time pattern characterisation thereby represents a powerful tool for understanding the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global analysis (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local analysis (e.g. scan statistics). Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider complex spatial constraints, high variability or the multivariate nature of the events. We therefore propose a statistical framework that takes into account the complexities of the geographical space in which phenomena take place, by introducing the Validity Domain concept and by carrying out clustering analyses on data with differently constrained geographical spaces, hence assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructures, and one for predicting fire ignition susceptibility. In this regard, the main objective of this thesis was to carry out basic statistical/geospatial research with a strong applied component, in order to analyse and describe complex phenomena and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. The thesis thus responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in five scientific journals. National and international collaborations were also established and successfully accomplished.
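
Among the global measures listed above, Ripley's K-function is easy to illustrate. The sketch below is a naive estimate without edge correction or a Validity Domain, so it is only a toy version of the constrained analyses the thesis develops; the clustered point pattern is synthetic.

```python
import numpy as np

def ripley_k(points: np.ndarray, r: np.ndarray, area: float) -> np.ndarray:
    """Naive Ripley's K-function estimate for a 2-D point pattern
    (no edge correction)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)       # exclude self-pairs
    lam = n / area                    # intensity of the process
    return np.array([(d < ri).sum() / (n * lam) for ri in r])

# Synthetic example: clustered ignition points versus complete spatial
# randomness (CSR), for which K(r) = pi * r^2.
rng = np.random.default_rng(1)
centres = rng.uniform(0, 10, (5, 2))
cluster = centres[rng.integers(0, 5, 200)] + rng.normal(0, 0.3, (200, 2))
radii = np.linspace(0.1, 2.0, 10)
k_obs = ripley_k(cluster, radii, area=100.0)
print(np.all(k_obs > np.pi * radii ** 2))  # True indicates clustering
```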

Relevance: 100.00%

Abstract:

The purpose of this study is to examine the ability of macroeconomic indicators and technical analysis to signal market crashes. The macroeconomic indicators examined were the yield spread, the Purchasing Managers' Index and the Consumer Confidence Index; the technical analysis indicators were the moving average, Moving Average Convergence-Divergence (MACD) and the Relative Strength Index (RSI). We studied whether commonly used macroeconomic indicators can also serve as a warning system for stock market crashes. The hypothesis is that signals of recession can be used as signals of a stock market crash, and thus as the basis for a hedging strategy. The data are collected from the U.S. markets for the years 1983-2010. The empirical results show that the macroeconomic indicators were able to explain future GDP development in the U.S. over the research period and were statistically significant. A hedging strategy that combined the signals of the yield spread and the Consumer Confidence Index gave the most useful results in the selected time period: it was able to outperform the buy-and-hold strategy as well as all of the technical-indicator-based hedging strategies.
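
A hedged sketch of the combined-signal idea described above: hedge when the yield spread inverts while the Consumer Confidence Index sits below its trailing mean. The rule, the window length and the toy series are assumptions for illustration; the study's exact signal definitions may differ.

```python
import numpy as np

def hedge_signal(yield_spread, confidence, window=3):
    """Hypothetical crash-warning rule: flag a period when the yield
    spread is negative (inverted) and the Consumer Confidence Index is
    below its trailing mean over `window` periods."""
    signal = np.zeros(len(yield_spread), dtype=bool)
    for t in range(window, len(yield_spread)):
        cci_trend = np.mean(confidence[t - window:t])
        signal[t] = yield_spread[t] < 0.0 and confidence[t] < cci_trend
    return signal

# Toy monthly series: the spread inverts while confidence deteriorates.
spread = np.array([1.2, 0.8, 0.3, -0.1, -0.4, -0.2, 0.5])
cci = np.array([105.0, 102.0, 98.0, 93.0, 88.0, 90.0, 95.0])
print(hedge_signal(spread, cci))  # True where both warnings coincide
```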

Relevance: 100.00%

Abstract:

It has been known since the 1970s that the laser beam is suitable for processing paper materials. In this thesis, the term paper materials covers all wood-fibre based materials, such as dried pulp, copy paper, newspaper, cardboard, corrugated board, tissue paper, etc. Accordingly, laser processing in this thesis means all laser treatments resulting in material removal, such as cutting, partial cutting, marking, creasing, perforation, etc., that can be used to process paper materials. Laser technology provides many advantages for the processing of paper materials: it is a non-contact method, it allows freedom of processing geometry, and it is a reliable technology for non-stop production. The packaging industry in particular is a very promising area for laser processing applications. However, there were only a few industrial laser processing applications worldwide even at the beginning of the 2010s. One reason for the small-scale use of lasers in paper material manufacturing is the shortage of published research and scientific articles. Another problem restraining the use of lasers for processing paper materials is the colouration of the material, i.e. the yellowish and/or greyish colour of the cut edge appearing during or after cutting. These are the main reasons why the topic of this thesis concerns the characterisation of the interaction of a laser beam with paper materials. The study was carried out in the Laboratory of Laser Processing at Lappeenranta University of Technology (Finland). The laser equipment used was a TRUMPF TLF 2700 carbon dioxide laser producing a beam with a wavelength of 10.6 μm and a power range of 190-2500 W (laser power on the workpiece). The interaction of the laser beam with the paper material was studied by treating dried kraft pulp (grammage of 67 g m-2) with different laser power levels, focal plane position settings and interaction times. The interaction between the laser beam and the dried kraft pulp was monitored with several devices: a spectrometer, a pyrometer and an active illumination imaging system. In this way it was possible to create an input and output parameter diagram and to study the effects of the input and output parameters. When the interaction phenomena are understood, process development can be carried out and even new innovations developed. Filling the gap in knowledge of the interaction phenomena can pave the way for wider use of laser technology in the paper making and converting industry. It is concluded in this thesis that the interaction of a laser beam with paper material involves two mechanisms that depend on the focal plane position range. The assumed interaction mechanism B appears at average focal plane positions between 3.4 mm and 2.4 mm, and the assumed interaction mechanism A at average focal plane positions between 0.4 mm and -0.6 mm, both in the experimental set-up used. A focal plane position of 1.4 mm represents the midzone between these two mechanisms. Holes form gradually during the interaction: first a small hole is formed in the interaction area at the centre of the laser beam cross-section, and after that the hole expands as a function of interaction time until the interaction between the laser beam and the dried kraft pulp ends. Image analysis shows that at the beginning of the interaction small holes of very good quality are formed. It is evident that black colour and a heat-affected zone appear as a function of interaction time. This reveals that there are still distinct interaction phases within interaction mechanisms A and B.
These interaction phases appear as a function of time and also as a function of the peak intensity of the laser beam. The limit peak intensity is the value that divides interaction mechanisms A and B from one-phase interaction into dual-phase interaction. All peak intensity values below the limit peak intensity belong to MAOM (interaction mechanism A, one-phase mode) or MBOM (interaction mechanism B, one-phase mode), and values above it belong to MADM (interaction mechanism A, dual-phase mode) or MBDM (interaction mechanism B, dual-phase mode). The decomposition of cellulose proceeds by evolution of hydrocarbons when the temperature is between 380-500°C: the long cellulose molecule is split into smaller volatile hydrocarbons in this temperature range. As the temperature increases, the decomposition process changes. In the range of 700-900°C the cellulose molecule is mainly decomposed into H2 gas, which is why this range is called the evolution of hydrogen. Interaction in this range starts (as in the MAOM and MBOM ranges) when a small, good-quality hole is formed. This is due to "direct evaporation" of the pulp via the hydrogen-evolution decomposition process, and it can be seen in the spectrometer as a high-intensity peak of yellow light (in the range of 588-589 nm), which corresponds to a temperature of ~1750°C. The pyrometer does not detect this high-intensity peak, since it cannot detect the physical phase change from solid kraft pulp to gaseous compounds. As the interaction between the laser beam and the dried kraft pulp continues, the hypothesis is that three auto-ignition processes occur. The auto-ignition temperature of a substance is the lowest temperature at which it spontaneously ignites in a normal atmosphere without an external source of ignition, such as a flame or spark. Three auto-ignition processes appear in the MADM and MBDM ranges, namely: 1. the auto-ignition temperature of hydrogen (H2) is 500°C, 2. the auto-ignition temperature of carbon monoxide (CO) is 609°C, and 3. the auto-ignition temperature of carbon (C) is 700°C. These three auto-ignition processes lead to the formation of a plasma plume with strong emission of radiation in the visible-light range. The formation of this plasma plume can be seen as an increase of intensity in the wavelength range of ~475-652 nm. The pyrometer shows the maximum temperature just after this ignition. The plasma plume is assumed to scatter the laser beam so that it interacts with a larger area of the dried kraft pulp than the actual area of the beam cross-section. This assumed scattering also reduces the peak intensity, so the presumably scattered light, with its low peak intensity, interacts with a large area of the hole edges, and this interaction happens at low temperature. The interaction between the laser beam and the dried kraft pulp thus turns from evolution of hydrogen to evolution of hydrocarbons, which leads to the black colour of the hole edges.
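
The mode taxonomy above (MAOM/MBOM/MADM/MBDM) can be expressed as a small classifier. The sketch below hard-codes the focal-plane ranges reported in the thesis; the limit peak intensity is equipment-specific, so it is passed in as an assumed parameter.

```python
def interaction_mode(focal_pos_mm: float, peak_intensity: float,
                     limit_intensity: float) -> str:
    """Classify the laser/dried-kraft-pulp interaction mode using the
    ranges reported in the thesis: mechanism B for focal plane positions
    of 2.4-3.4 mm, mechanism A for -0.6-0.4 mm, and a limit peak
    intensity separating one-phase from dual-phase interaction.
    The limit intensity value is an assumption supplied by the caller."""
    if 2.4 <= focal_pos_mm <= 3.4:
        mech = "B"
    elif -0.6 <= focal_pos_mm <= 0.4:
        mech = "A"
    else:
        return "midzone / outside characterised ranges"
    phase = "OM" if peak_intensity < limit_intensity else "DM"
    return f"M{mech}{phase}"  # e.g. MAOM, MBDM

print(interaction_mode(3.0, 1.0e6, limit_intensity=2.0e6))  # MBOM
```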

Relevance: 100.00%

Abstract:

Crack formation and growth in steel bridge structural elements may be caused by load oscillations. Welded elements are liable to internal discontinuities along welded joints and are sensitive to stress variations. Evaluation of the remaining life of a bridge is needed to make cost-effective decisions regarding inspection, repair, rehabilitation, and replacement. A steel beam model is proposed to simulate crack opening due to cyclic loads. Two alternatives are considered for modelling crack propagation: the initial phase is based on linear fracture mechanics, and the model is then extended to take elastoplastic fracture mechanics concepts into account. Changes in natural frequency are directly related to the variation of the moment of inertia and consequently to a reduction in the flexural stiffness of a steel beam. It is therefore possible to adopt a nondestructive technique during steel bridge inspection that quantifies the variation of the structure's eigenvalues, which is then used to localize the grown fracture. A damage detection algorithm is developed for the proposed model, and the numerical results are compared with the solutions obtained using another well-known computer code.
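
The frequency-stiffness relationship underlying this detection approach can be illustrated briefly: for an Euler-Bernoulli beam the natural frequencies scale with the square root of the flexural stiffness EI, so a measured frequency drop maps to a stiffness ratio. The beam properties below are hypothetical, and the ratio is a global (not localized) indicator.

```python
import numpy as np

def natural_frequency(n: int, L: float, EI: float, rho_A: float) -> float:
    """n-th bending natural frequency (Hz) of a simply supported
    Euler-Bernoulli beam: omega_n = (n*pi/L)^2 * sqrt(EI / (rho*A))."""
    return (n * np.pi / L) ** 2 * np.sqrt(EI / rho_A) / (2.0 * np.pi)

def stiffness_ratio(f_damaged: float, f_undamaged: float) -> float:
    """Since f is proportional to sqrt(EI), a measured frequency drop
    maps to a flexural-stiffness ratio EI_d / EI_u = (f_d / f_u)^2."""
    return (f_damaged / f_undamaged) ** 2

# Hypothetical steel beam: a crack reduces the first frequency by 3%.
f1 = natural_frequency(1, L=20.0, EI=2.1e8, rho_A=300.0)
print(f"f1 = {f1:.2f} Hz, EI ratio = {stiffness_ratio(0.97 * f1, f1):.3f}")
```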