914 results for Empirical Mode Decomposition, vibration-based analysis, damage detection, signal decomposition
Abstract:
There is currently great interest in detecting structural damage in systems using non-destructive tests. Once a failure such as a crack is detected, corrective measures can be taken. Several different approaches can provide information about the existence, location, and extent of a fault through non-destructive testing, among them optimization techniques such as classical methods, genetic algorithms, and neural networks. Most of these techniques, which are based on element-by-element adjustment of a finite element (FE) model, exploit the dynamic behavior of the model. In practical situations, however, it is usually almost impossible to obtain an accurate model. This paper proposes an experimental technique for damage location based on the H∞ norm. The dynamic properties of the structure were identified from experimental data using the eigensystem realization algorithm (ERA). The experimental test was carried out on a beam structure by varying the mass of an element. A piezoelectric sensor was used for the output signal, and the sine-wave input signal was generated with the SignalCalc® software.
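A minimal NumPy sketch of the ERA identification step mentioned above: block-Hankel matrices are built from impulse-response (Markov) parameters, a truncated SVD yields a discrete-time state matrix, and its eigenvalues carry the natural frequencies and damping ratios. The Hankel dimensions, model order, and sampling interval below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def era(markov, order, rows=20, cols=20, dt=1e-3):
    """Identify modal properties from single-input/single-output
    impulse-response samples (Markov parameters)."""
    # Block-Hankel matrices built from successive response samples.
    H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
    H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])

    U, s, Vt = np.linalg.svd(H0, full_matrices=False)
    Un = U[:, :order]
    Sn = np.diag(np.sqrt(s[:order]))
    Vn = Vt[:order, :].T

    # Discrete-time state matrix; its eigenvalues hold the modal content.
    A = np.linalg.inv(Sn) @ Un.T @ H1 @ Vn @ np.linalg.inv(Sn)
    poles = np.log(np.linalg.eigvals(A).astype(complex)) / dt  # continuous poles
    freqs_hz = np.abs(poles) / (2.0 * np.pi)                   # natural frequencies
    damping = -poles.real / np.abs(poles)                      # damping ratios
    return freqs_hz, damping
```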
Abstract:
This paper presents a multilayer perceptron neural network combined with the Nelder-Mead simplex method to detect damage in multiple-support beams. The input parameters are based on natural frequencies and modal flexibility. It was assumed that only a limited number of modes were available and that only vertical degrees of freedom were measured. The reliability of the proposed methodology is assessed through the generation of random damage scenarios and the definition of three types of errors that can occur during the damage identification process. Results show that the methodology can reliably determine the damage scenarios. However, its application to large beams may be limited by the high computational cost of training the neural network.
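A hedged sketch of the two-stage idea described above: a multilayer perceptron maps modal features (natural frequencies and flexibility terms) to a damage estimate, and the Nelder-Mead simplex refines that estimate. The forward model `modal_features`, the feature sizes, and the training data are placeholders, not the paper's actual setup.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def modal_features(damage):
    # Hypothetical forward model: damage vector -> natural frequencies
    # plus modal-flexibility terms. A real study would use an FE model.
    base = np.linspace(10.0, 50.0, 5)
    return np.concatenate([base * (1.0 - 0.3 * damage.mean()),
                           0.01 * (1.0 + damage)])

# Stage 1: train an MLP on randomly generated damage scenarios.
damages = rng.uniform(0.0, 1.0, size=(500, 3))
features = np.array([modal_features(d) for d in damages])
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(features, damages)

# Stage 2: refine the network's first guess with the Nelder-Mead simplex,
# minimizing the mismatch between measured and modeled modal features.
measured = modal_features(np.array([0.4, 0.1, 0.0]))
guess = net.predict(measured.reshape(1, -1))[0]
res = minimize(lambda d: np.sum((modal_features(d) - measured) ** 2),
               guess, method="Nelder-Mead")
print("refined damage estimate:", res.x)
```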
Abstract:
Background: A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyze gene regulatory interactions using the Boolean network model and time-series data. The Boolean networks considered are restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full-search approach. The problem is modeled as a Constraint Satisfaction Problem (CSP) and solved with CSP techniques. Results: We applied the proposed algorithm to two data sets. First, we used an artificial data set obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed on HeLa cells. The results show that some interactions can be fully or at least partially determined under the Boolean model considered. Conclusions: The proposed algorithm can be used as a first step in the detection of gene/protein interactions. It infers gene relationships from time-series gene expression data, and this inference process can be aided by available a priori knowledge.
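A small consistency-check sketch in the spirit of the approach above: for each target gene, only (regulator set, Boolean function) pairs that reproduce every observed state transition are kept. The restricted function class used here (AND/OR of possibly negated literals) and the generate-and-test search are illustrative stand-ins for the paper's CSP formulation.

```python
from itertools import combinations, product

def consistent_rules(series, target, max_inputs=2):
    """series: list of binary tuples (successive network states)."""
    n = len(series[0])
    transitions = list(zip(series[:-1], series[1:]))
    rules = []
    for k in range(1, max_inputs + 1):
        for regs in combinations(range(n), k):
            for signs in product([1, -1], repeat=k):     # activator / inhibitor
                for op_name, op in (("AND", all), ("OR", any)):
                    def f(state, regs=regs, signs=signs, op=op):
                        lits = [(state[r] if s == 1 else 1 - state[r])
                                for r, s in zip(regs, signs)]
                        return int(op(lits))
                    # Keep the rule only if it explains every transition.
                    if all(f(prev) == nxt[target] for prev, nxt in transitions):
                        rules.append((regs, signs, op_name))
    return rules

# Toy time series for 3 genes; gene 2 follows gene 0 AND (NOT gene 1).
series = [(1, 0, 0), (1, 0, 1), (1, 1, 1), (1, 1, 0), (1, 0, 0), (1, 0, 1)]
print(consistent_rules(series, target=2))
```

As in the abstract, several rules may survive the filter, in which case the interaction is only partially determined and a priori knowledge can prune the remainder.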
Abstract:
A serological follow-up study was carried out on 27 children (1–12 years old) with visceral and/or ocular toxocariasis after treatment with thiabendazole. A total of 159 serum samples were collected over a period ranging from 22 to 116 months. Enzyme-linked immunosorbent assays (IgG, IgA, and IgE ELISA) were standardized using excretory–secretory antigens obtained from second-stage larvae of a Toxocara canis culture. The sensitivity of the IgG, IgA, and IgE ELISA, as determined in visceral toxocariasis patients, was 100%, 47.8%, and 78.3%, respectively. Approximately 84% of the patients presented single or multiple parasitoses, as diagnosed by stool examination, yet these variables did not appear to affect the anti-Toxocara immune response. Titers of specific IgE antibody decreased significantly during the first year after treatment, followed by a decrease in IgA titers in the second year and in IgG titers from the fourth year onwards. Sera from all patients presented high-avidity IgG antibodies, indicating that they were in the chronic phase of the disease. Moreover, one year after treatment, the levels of leukocytes, eosinophils, and anti-A isohemagglutinin in patients decreased significantly. The present data suggest that IgE antibodies combined with eosinophil counts are helpful parameters for patient follow-up after chemotherapy.
Abstract:
This work focuses on the analysis of sea-level change over the last century, based mainly on instrumental observations. Individual components of sea-level change during this period are investigated at both global and regional scales. Some of the geophysical processes responsible for current sea-level change, such as glacial isostatic adjustment and the present-day melting of terrestrial ice sources, have been modeled and compared with observations. A new value of global mean sea-level change based on tide gauge observations has been independently assessed at 1.5 mm/year, using corrections for glacial isostatic adjustment obtained with different models as a criterion for tide gauge selection. The long-wavelength spatial variability of the main components of sea-level change has been investigated by means of traditional and new spectral methods. Complex non-linear trends and abrupt sea-level variations shown by tide gauge records have been addressed by applying different approaches to regional case studies. The Ensemble Empirical Mode Decomposition technique has been used to analyze tide gauge records from the Adriatic Sea to ascertain the existence of cyclic sea-level variations. An early-warning approach has been adopted to detect tipping points in sea-level records of the North East Pacific and their relationship with oceanic modes. Global sea-level projections to the year 2100 have been obtained by a semi-empirical approach based on an artificial neural network. In addition, a model-based approach has been applied to the Mediterranean Sea, obtaining sea-level projections to the year 2050.
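A hedged sketch of Ensemble Empirical Mode Decomposition applied to a tide-gauge-like record, using the third-party PyEMD package. The synthetic monthly record, ensemble size, and interpretation below are illustrative assumptions, not the thesis' actual data or settings.

```python
import numpy as np
from PyEMD import EEMD  # pip install EMD-signal

# Synthetic "sea level" record: linear trend + annual cycle + noise (monthly).
t = np.arange(0, 600) / 12.0                        # 50 years, in years
signal = (1.5 * t                                   # ~1.5 mm/yr trend
          + 40.0 * np.sin(2 * np.pi * t)            # annual cycle
          + 10.0 * np.random.randn(t.size))         # measurement noise

eemd = EEMD(trials=100)          # ensemble of EMD runs over noisy copies
imfs = eemd.eemd(signal, t)      # intrinsic mode functions, fast to slow

# Low-index IMFs capture fast oscillations, intermediate IMFs expose
# cyclic variations, and the final residual approximates the trend.
for i, imf in enumerate(imfs):
    print(f"IMF {i}: peak amplitude {np.abs(imf).max():.1f} mm")
```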
Abstract:
In this paper we propose an innovative approach to traffic sign detection using a computer vision algorithm under real-time operating constraints, establishing intelligent strategies to simplify the algorithm as much as possible and to speed up the process. First, a set of candidates is generated by a color segmentation stage, followed by a region analysis strategy in which the spatial characteristics of previously detected objects are taken into account. Finally, temporal coherence is introduced by means of a tracking scheme, implemented with a Kalman filter for each potential candidate. Given the time constraints, efficiency is achieved in two ways. On the one hand, a multi-resolution strategy is adopted for segmentation, in which global operations are applied only to low-resolution images, and the resolution is increased to the maximum only when a potential road sign is being tracked. On the other hand, we take advantage of the expected spacing between traffic signs: tracking objects of interest allows the generation of inhibition areas, regions where no new traffic signs are expected to appear because a sign already exists in the neighborhood. The proposed solution has been tested on real sequences in both urban areas and highways and proved to achieve high computational efficiency, especially as a result of the multi-resolution approach.
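A minimal constant-velocity Kalman filter of the kind used above to give each traffic-sign candidate temporal coherence; the state is (x, y, vx, vy) in image coordinates, and the noise levels are illustrative assumptions rather than the paper's tuned values.

```python
import numpy as np

class SignTracker:
    def __init__(self, x, y, dt=1.0):
        self.s = np.array([x, y, 0.0, 0.0])             # state estimate
        self.P = np.eye(4) * 100.0                      # state covariance
        self.F = np.array([[1, 0, dt, 0],               # constant-velocity model
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)  # position measured only
        self.Q = np.eye(4) * 0.1                        # process noise
        self.R = np.eye(2) * 4.0                        # measurement noise

    def predict(self):
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]                               # predicted sign position

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.s
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# The predicted position also defines an inhibition area: a neighbourhood
# where the detector need not search for new signs in the next frame.
trk = SignTracker(320, 240)
trk.predict()
trk.update((324, 238))
```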
Abstract:
In this paper we propose an innovative method for the automatic detection and tracking of road traffic signs using an onboard stereo camera. It combines monocular and stereo analysis strategies to increase the reliability of the detections, so that it can boost the performance of any traffic sign recognition scheme. First, adaptive color- and appearance-based detection is applied at the single-camera level to generate a set of traffic sign hypotheses. Stereo information then allows sparse 3D reconstruction of potential traffic signs through a SURF-based matching strategy: the plane that best fits the cloud of 3D points traced back from feature matches is estimated using a RANSAC-based approach to improve robustness to outliers. Temporal consistency of the 3D information is ensured through a Kalman-based tracking stage. This also allows the generation of a predicted 3D traffic sign model, which is in turn used to enhance the aforementioned color-based detector through a feedback loop, thus improving detection accuracy. The proposed solution has been tested on real sequences under several illumination conditions, in both urban areas and highways, achieving very high detection rates in challenging environments, including rapid motion and significant perspective distortion.
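A hedged sketch of the RANSAC plane-fitting step described above: given 3D points traced back from feature matches, the plane supported by the most inliers is retained. The iteration count and inlier threshold are assumptions, not the paper's parameters.

```python
import numpy as np

def ransac_plane(points, n_iters=200, threshold=0.05, rng=None):
    """points: (N, 3) array of reconstructed 3D points (metres)."""
    rng = rng or np.random.default_rng(0)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        # Hypothesize a plane from a random minimal sample of three points.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p0                      # plane: n . x + d = 0
        # Score the hypothesis by its inlier set.
        dist = np.abs(points @ normal + d)
        inliers = dist < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers
```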
Abstract:
A novel GPU-based nonparametric moving-object detection strategy is proposed for computer vision tools requiring real-time processing. An alternative, efficient Bayesian classifier that combines nonparametric background and foreground models increases correct detections while avoiding false detections. Additionally, an efficient region-of-interest analysis significantly reduces the computational cost of detection.
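A compact sketch of the idea above: a per-pixel kernel density estimate of background intensity is combined with a (here uniform) foreground model in a Bayesian classifier. The bandwidth, history length, and prior are illustrative assumptions, and the GPU implementation is omitted.

```python
import numpy as np

def classify_pixel(history, value, bandwidth=10.0, prior_fg=0.1):
    """history: recent background samples for one pixel (grayscale)."""
    # Nonparametric background likelihood: Gaussian-kernel density estimate.
    k = np.exp(-0.5 * ((value - np.asarray(history)) / bandwidth) ** 2)
    p_bg = k.mean() / (bandwidth * np.sqrt(2 * np.pi))
    p_fg = 1.0 / 256.0                        # uninformative foreground model
    # Posterior probability that the pixel belongs to the foreground.
    post = prior_fg * p_fg / (prior_fg * p_fg + (1 - prior_fg) * p_bg)
    return post > 0.5

print(classify_pixel([100, 102, 98, 101], 180))   # likely a moving object
print(classify_pixel([100, 102, 98, 101], 101))   # consistent with background
```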
Abstract:
Owing to the existence of global modes and local modes of neighbouring members, detecting damage in a frame structure is more challenging than in isolated beams. This paper studies the detection of an artificial circumferential crack at a joint in a frame-like welded structure using coupled response measurements. The test frame was fabricated to remain similar to real engineering structures: both the chords and the branch members have hollow sections, with the branch members of smaller size. The crack was created with a hacksaw at a joint where a branch meets the chord. The methodology is first demonstrated on a single hollow-section beam, and the test results are then presented for the damaged and undamaged frame. The existence of the damage is clearly observable in the experimental results. This approach offers the potential to detect damage in welded structures such as cranes, mining equipment, steel-frame bridges, and naval and offshore structures.
Abstract:
Phytophthora diseases cause major losses to agricultural and horticultural production in Australia and worldwide. Most Phytophthora diseases are soilborne and difficult to control, making disease prevention an important component of many disease management strategies. Detection and identification of the causal agent are therefore essential for effective disease management. This paper describes the development and validation of a DNA-based diagnostic assay that can detect and identify 27 different Phytophthora species. We have designed PCR primers that are specific to the genus Phytophthora. The resulting amplicon is digested with restriction enzymes to yield a restriction pattern, or fingerprint, unique to each species. The restriction patterns are compared with a key comprising the patterns of type specimens or representative isolates of the 27 species. A number of fundamental issues underpinning the development and validation of DNA-based diagnostic assays, such as genetic diversity within and among species, are also addressed in this paper.
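An in-silico sketch of the fingerprinting logic described above: digest an amplicon with a small panel of restriction enzymes and compare the fragment-length pattern against a key of reference patterns. The recognition sites listed are real, but cutting at the start of each site is a simplification, and the sequences and key entries are toy data, not the assay's actual references.

```python
def digest(seq, site):
    """Return fragment lengths after cutting at every recognition site."""
    cuts, start = [], 0
    while (i := seq.find(site, start)) != -1:
        cuts.append(i)
        start = i + 1
    bounds = [0] + cuts + [len(seq)]
    return tuple(bounds[j + 1] - bounds[j] for j in range(len(bounds) - 1))

ENZYMES = {"EcoRI": "GAATTC", "TaqI": "TCGA", "MspI": "CCGG"}

def fingerprint(amplicon):
    return {name: digest(amplicon, site) for name, site in ENZYMES.items()}

# Toy key of reference patterns for two species (illustrative only).
KEY = {
    "P. cinnamomi": {"EcoRI": (120, 380)},
    "P. infestans": {"EcoRI": (500,)},
}

def identify(amplicon):
    fp = fingerprint(amplicon)
    return [species for species, ref in KEY.items()
            if all(fp.get(enz) == pattern for enz, pattern in ref.items())]
```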
Abstract:
A simple protein-DNA interaction analysis has been developed using a high-affinity, high-specificity zinc finger protein. In essence, purified protein samples are immobilized directly onto the surface of microplate wells, and fluorescently labeled DNA is added in solution. After incubation and washing, bound DNA is detected in a standard microplate reader. The minimum sensitivity of the assay is approximately 0.2 nM DNA. Since the detection of bound DNA is noninvasive and the protein-DNA interaction is not disrupted during detection, iterative readings may be taken from the same well after successive alterations in interaction conditions, if required. In this respect the assay may be considered real-time, and it permits appropriate interaction conditions to be determined quantitatively. The assay format is ideally suited to investigating the interactions of purified, unlabeled DNA-binding proteins in a high-throughput format.
Abstract:
This thesis considers two basic aspects of impact damage in composite materials, namely damage severity discrimination and impact damage location, using acoustic emission (AE) and artificial neural networks (ANNs). The experimental work comprises a study of factors such as the application of AE as a non-destructive testing (NDT) technique and the evaluation of ANN modelling, with ANNs playing a central role in the implementation. In the first part of the study, different impact energies were used to produce different levels of damage in two composite materials (T300/914 and T800/5245). The impacts were detected through their acoustic emissions. The AE waveform signals were analysed and modelled using a back-propagation (BP) neural network, and the mean square error (MSE) of the output was used as a damage indicator in the damage severity discrimination study. To evaluate the ANN model, the correlation coefficients of different parameters, such as MSE, AE energy, and AE counts, were compared; MSE produced an outstanding result, showing the best correlation. In the second part, a new artificial neural network model was developed to locate impact damage on a quasi-isotropic composite panel. It was successfully trained to locate impact sites by correlating the arrival-time differences of AE signals at transducers located on the panel with the impact site coordinates. The performance of the ANN model, evaluated by calculating the distance deviation between the model output and the real location coordinates, supports the application of ANNs as impact damage location identifiers. In the study, the accuracy of location prediction decreased when approaching the central area of the panel. Further investigation indicated that this is due to the small arrival-time differences there, which degrade the performance of the ANN prediction; increasing the number of processing neurons in the ANN is suggested as a practical solution.
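A hedged sketch of the location stage described above: an MLP learns the mapping from AE arrival-time differences at fixed transducers to impact coordinates. The panel size, sensor layout, and wave speed are assumptions, not the thesis' actual configuration; note how impacts near the panel centre, where all arrival times nearly coincide, yield the small time differences that the thesis identifies as the source of reduced accuracy.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

SENSORS = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5], [0.5, 0.5]])  # metres
WAVE_SPEED = 1500.0                                                    # m/s, assumed

def arrival_time_diffs(xy):
    t = np.linalg.norm(SENSORS - xy, axis=1) / WAVE_SPEED
    return t[1:] - t[0]            # differences relative to sensor 0

rng = np.random.default_rng(1)
sites = rng.uniform(0.05, 0.45, size=(2000, 2))                # training impacts
dts = np.array([arrival_time_diffs(p) for p in sites]) * 1e6   # microseconds

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=1)
net.fit(dts, sites)

true = np.array([0.30, 0.20])
pred = net.predict((arrival_time_diffs(true) * 1e6).reshape(1, -1))[0]
print("deviation (mm):", 1000 * np.linalg.norm(pred - true))
```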
Abstract:
Purpose: In today's competitive scenario, effective supply chain management increasingly depends on the capabilities and performance of third-party logistics (3PL) companies. The dissemination of information technology (IT) has contributed to changing the supply chain role of 3PL companies, and IT is considered an important element influencing the performance of modern logistics companies. The purpose of this paper is therefore to explore the relationship between IT and 3PL performance, assuming that logistics capabilities play a mediating role in this relationship. Design/methodology/approach: Empirical evidence from a questionnaire survey of logistics service companies operating in the Italian market was used to test a conceptual resource-based view (RBV) framework linking IT adoption, logistics capabilities, and firm performance. Factor analysis and ordinary least squares (OLS) regression were used to test the hypotheses. The paper is multidisciplinary in nature, combining approaches from management of information systems, strategy, logistics, and supply chain management. Findings: The results indicate strong relationships among data-gathering technologies, transactional capabilities, and firm performance, in terms of both efficiency and effectiveness. Moreover, a positive correlation between enterprise information technologies and 3PL financial performance was found. Originality/value: The paper successfully uses the concept of logistics capabilities as a mediating factor between IT adoption and firm performance. Objective measures are proposed for IT adoption and logistics capabilities, and direct and indirect relationships among the variables are successfully tested.
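A hedged sketch of the kind of OLS mediation test implied above (IT adoption -> logistics capabilities -> performance), using statsmodels. The variable names and synthetic data are placeholders for the survey constructs, not the paper's measures.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 150
it_adoption = rng.normal(size=n)                         # IT adoption score
capability = 0.6 * it_adoption + rng.normal(scale=0.8, size=n)
performance = (0.5 * capability + 0.1 * it_adoption
               + rng.normal(scale=0.8, size=n))

# Step 1: total effect of IT adoption on performance.
m1 = sm.OLS(performance, sm.add_constant(it_adoption)).fit()

# Step 2: regress performance on IT and the mediator together; a shrinking
# IT coefficient alongside a significant mediator suggests mediation.
X = sm.add_constant(np.column_stack([it_adoption, capability]))
m2 = sm.OLS(performance, X).fit()
print(m1.params, m2.params)
```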
Abstract:
In the last decade, a large number of social media services have emerged and become widely used in people's daily lives as important tools for sharing and acquiring information. With a substantial amount of user-contributed text data on social media, it has become necessary to develop methods and tools for analyzing this emerging type of text, in order to better exploit it and deliver meaningful information to users. Previous work on text analytics over the last several decades has mainly focused on traditional types of text such as emails, news, and academic literature, and several issues critical to social media text have not been well explored: 1) how to detect sentiment in social media text; 2) how to exploit social media's real-time nature; and 3) how to address information overload for flexible information needs. This dissertation focuses on these three problems. First, to detect sentiment in social media text, we propose a dual active supervision method based on non-negative matrix tri-factorization (tri-NMF) to minimize the human labeling effort required for this new type of data. Second, to exploit social media's real-time nature, we propose approaches to detect events from social media text streams. Third, to address information overload for flexible information needs, we propose two summarization frameworks: a dominating-set-based framework and a learning-to-rank-based framework. The dominating-set-based framework can be applied to different types of summarization problems, while the learning-to-rank-based framework uses existing training data to guide new summarization tasks. In addition, we integrate these techniques in an application study of event summarization for sports games, as an example of how to better utilize social media data.
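A small sketch in the spirit of the dominating-set framework mentioned above: build a sentence similarity graph and greedily pick sentences until every sentence is covered (selected or adjacent to a selected one). The Jaccard similarity measure and the threshold are illustrative assumptions, not the dissertation's actual components.

```python
def summarize(sentences, sim, threshold=0.3):
    n = len(sentences)
    # Similarity graph: edges between sufficiently similar sentences.
    adj = [{j for j in range(n)
            if j != i and sim(sentences[i], sentences[j]) >= threshold}
           for i in range(n)]
    covered, summary = set(), []
    while len(covered) < n:
        # Greedy choice: the sentence covering the most uncovered nodes.
        best = max(range(n), key=lambda i: len(({i} | adj[i]) - covered))
        summary.append(sentences[best])
        covered |= {best} | adj[best]
    return summary

def jaccard(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

docs = ["the team won the game", "a great game for the team",
        "rain is expected tomorrow", "tomorrow brings heavy rain"]
print(summarize(docs, jaccard))   # one representative per topic cluster
```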
Abstract:
Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular approach is misuse-based detection, but it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present an integrity-based detection scheme for kernel-level malware that does not rely on specific malware features. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. Our system focuses on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity in Linux kernel 2.4.32 and the Windows Research Kernel (WRK), with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, the Invariant Monitor detects ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned requests, we build KQguard to protect KQs: at runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to WRK and the Linux kernel, and extensive experimental evaluation shows that it is efficient (up to 5.6% overhead) and effective (zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, the Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel; by monitoring these integrity properties, we can detect malware through its violation of them during execution.
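A user-space analogue of the invariant-monitoring idea above: declare invariants over a data structure and periodically verify them, flagging any violation as a potential manipulation. The "kernel" objects here are simulated stand-ins; a real monitor would inspect kernel memory rather than Python objects.

```python
class Kernel:
    """Simulated kernel state with a redundant counter, as kernels often keep."""
    def __init__(self):
        self.tasks = ["init", "sshd", "bash"]
        self.task_count = len(self.tasks)

# Data invariants: properties that must hold at every observation point.
INVARIANTS = [
    ("task_count matches task list", lambda k: k.task_count == len(k.tasks)),
    ("init is always present",       lambda k: "init" in k.tasks),
]

def monitor(kernel):
    """Return the names of all violated invariants."""
    return [name for name, check in INVARIANTS if not check(kernel)]

k = Kernel()
print(monitor(k))                # [] -- all invariants hold

# A rootkit-style manipulation: hide a process without fixing the counter.
k.tasks.remove("sshd")
print(monitor(k))                # ['task_count matches task list']
```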