994 results for Simulator FTE DATCOM damage tolerance inspections algorithms


Relevance:

30.00%

Publisher:

Abstract:

Solid organ transplantation (SOT) is considered the treatment of choice for many end-stage organ diseases. Short-term results are excellent, with patient survival rates greater than 90% one year post-surgery, but long-term graft acceptance and the chronic use of immunosuppressive drugs remain problematic. Hematopoietic stem cell transplantation (HSCT) is the infusion of hematopoietic stem cells to treat acquired and congenital disorders of the hematopoietic system. Its main side effect is graft-versus-host disease (GvHD), in which donor T cells damage host tissues. Patients with acute or chronic GvHD receive an immunosuppressive regimen that itself causes several side effects. The use of immunosuppressive drugs in SOT and GvHD has markedly reduced the incidence of acute rejection and of tissue damage in GvHD; however, their numerous adverse effects drive the development of alternative strategies to improve the long-term outcome. To this end, the use of CD4+CD25+FOXP3+ regulatory T cells (Treg) as a cellular therapy is an attractive approach for autoimmune disease, GvHD and the limitation of immune responses to the allograft after transplantation. Treg play a pivotal role in maintaining peripheral immunological tolerance by preventing autoimmunity and chronic inflammation. The results of this thesis provide the characterization and cell processing of Tregs from healthy controls and from patients on the waiting list for liver transplantation, followed by the development of an efficient expansion protocol and an investigation of the impact of the main immunosuppressive drugs on the viability, proliferative capacity and function of the expanded cells. The conclusion is that ex vivo expansion is necessary to infuse a high Treg dose and that, although many other in vivo factors can contribute to the success of Treg therapy, infusing Tregs while the highest doses of immunosuppressants are being administered should be carefully considered.

Relevance:

30.00%

Publisher:

Abstract:

A set of algorithms that allows a computer to determine the responses of simulated patients during pure-tone and speech audiometry is presented. Based on these algorithms, a computer program for training in audiometry was written and found to be useful for teaching purposes.
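The core of such a training simulator is a response model: given a stimulus frequency and presentation level, the program decides whether the virtual patient signals that the tone was heard. The sketch below illustrates one plausible way to do this (it is not the published algorithm): a per-frequency threshold with a small uncertainty band, exercised by a crude "down 10 / up 5" search. All thresholds and names are illustrative assumptions.

```python
import random

# Hypothetical illustration: decide whether a virtual patient reports hearing a
# pure tone, given per-frequency hearing thresholds (dB HL) and a small response
# uncertainty around threshold. The response model is an assumption, not the
# published algorithms.

THRESHOLDS_DB_HL = {250: 15, 500: 20, 1000: 25, 2000: 40, 4000: 55, 8000: 60}

def patient_responds(frequency_hz: int, level_db_hl: float, spread_db: float = 5.0) -> bool:
    """Return True if the simulated patient signals that the tone was heard."""
    threshold = THRESHOLDS_DB_HL[frequency_hz]
    if level_db_hl >= threshold + spread_db:
        return True                 # clearly above threshold: always heard
    if level_db_hl <= threshold - spread_db:
        return False                # clearly below threshold: never heard
    # near threshold: response probability rises linearly across the uncertainty band
    p = (level_db_hl - (threshold - spread_db)) / (2 * spread_db)
    return random.random() < p

def threshold_search(frequency_hz: int, start_db: float = 40.0) -> float:
    """Crude 'down 10 / up 5' threshold search run against the simulated patient."""
    level = start_db
    heard_counts = {}
    for _ in range(50):             # safety bound on the number of presentations
        if patient_responds(frequency_hz, level):
            heard_counts[level] = heard_counts.get(level, 0) + 1
            if heard_counts[level] >= 2:   # heard twice at this level: accept it
                return level
            level -= 10
        else:
            level += 5
    return level

if __name__ == "__main__":
    for f in sorted(THRESHOLDS_DB_HL):
        print(f, "Hz ->", threshold_search(f), "dB HL")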

Relevance:

30.00%

Publisher:

Abstract:

Post-Soviet countries are in the process of transformation from a totalitarian order to a democratic one, a transformation which is impossible without a profound shift in people's way of thinking. The group set themselves the task of determining the essence of this shift. Using a multidisciplinary approach, they looked at concrete ways of overcoming the totalitarian mentality and forming the mentality necessary for an open democratic society. They studied contemporary conceptions of tolerance and critical thinking and looked for new foundations of criticism, especially in hermeneutics. They then sought to substantiate the complementary relation between tolerance and criticism in the democratic way of thinking and to prepare a syllabus for teaching on the subject in Ukrainian higher education. In a philosophical exploration of tolerance they began with religious tolerance as its first and most important form. Political and social interests often lay at the foundations of religious intolerance, and this implicitly contained the transition to religious tolerance when conditions changed. Early polytheism was more or less indifferent to dogmatic deviations, but monotheism is intolerant of heresies. The damage wrought by the religious wars of the Reformation transformed tolerance into a value: the wars did not create religious tolerance, but they forced its recognition as a positive phenomenon. With the weakening of religious institutions in the modern era, the purely political nature of many conflicts became evident, and this stimulated the extrapolation of tolerance into secular life. Each historical era has certain acts and operations which may be interpreted as tolerant, and these can be classified according to whether or not they are based on the conscious following of the principle of tolerance. This criterion requires the separation of the phenomenon of tolerance from its concept and from tolerance as a value. Only the conjunction of a concept of tolerance with a recognition of its value can transform it into a principle dictating a norm of conscious behaviour. The analysis of the contemporary conception of tolerance focused on the diversity of the concept and concluded that the notions used cannot be combined in the framework of a single, more or less simple classification, as the distinctions between them are driven by the complexity of the reality considered and the variety of its manifestations. Notions considered in relation to tolerance included pluralism, respect, and the particular versus the universal. The rationale of tolerance was also investigated, and the group felt that any substantiation of the principle of tolerance must take into account human beings' desire for knowledge. Before respecting or being tolerant of another person different from myself, I should first know where the difference lies, so knowledge is a necessary condition of tolerance. The traditional division of truth into the scientific (objective and unique) and the religious, moral or political (subjective and so multiple) intensifies the problem of the relationship between truth and tolerance. Science was long seen as a field of "natural" intolerance, whereas the validity of tolerance was accepted in other intellectual fields. As tolerance emerges where there is difference and opposition, it is essentially linked with rivalry, and there is a growing recognition today that unlimited rivalry is able neither to direct the process of development nor to act as a creative force.
Social and economic reality has led to rivalry being regulated by the state, and a natural requirement of this is to associate tolerance with a special "purified" form of rivalry: an acceptance of the activity of different subjects and a specification of the norms of their competition. Tolerance and rivalry should therefore be subordinate to a degree of discipline, and the group point out that discipline, including self-discipline, is a regulator of the balance between them. Two problematic aspects of tolerance were identified: why something traditionally supposed to have no positive content has become a human activity today, and whether tolerance has full-scale cultural significance. The resolution of these questions requires a revision of the phenomenon and conception of tolerance to clarify its immanent positive content. This involved an investigation of the contemporary concept of tolerance and of the epistemological foundations of the negative solution of tolerance in Greek thought. An original solution to the problem of the extrapolation of tolerance to scientific knowledge was proposed, based on the Duhem-Quine thesis and the conception of background knowledge. In this way tolerance as a principle of mutual relations between different scientific positions gains an essential epistemological rationale, and so an important argument for its own universal status. The group then went on to consider the ontological foundations for a positive solution of this problem, beginning with the work of Poincaré and Reichenbach. The next aspect considered was the conceptual foundations of critical thinking, looking at the ideas of Karl Popper and St. Augustine and at the problem of the demarcation line between reasonable criticism and apologetic reasoning. Dogmatic and critical thinking in a political context were also considered, followed by an investigation of the foundations of critical thinking. As logic is essential to critical thinking, the state of this discipline in Ukrainian and Russian higher education was assessed, together with the limits of formal-logical grounds for criticism, the role of informal logic as a basis for critical thinking today, dialectical logic as a foundation for critical thinking, and the universality of the contemporary demand for criticism. The search for new foundations of critical thinking covered deconstructivism and critical hermeneutics, including the problem of the author. The relationship between tolerance and criticism was traced from the ancient world, both Eastern and Greek, through the transitional community of the Renaissance to the industrial community (Locke and Mill), and on to the evolution of this relationship today, when the two are viewed not as moral virtues but as ordinary norms. Tolerance and criticism were discussed as complementary manifestations of human freedom: if the completeness of freedom were accepted, it would be impossible to avoid recognizing the natural and legal nature of these manifestations, and the group argue that critical tolerance is able to avoid dismissing such negative phenomena as the degradation of taste and manners, pornography, etc. On the basis of their work, the group drew up the syllabus of a course on "Logic with Elements of Critical Thinking" and of a special course on the "Problem of Tolerance".

Relevance:

30.00%

Publisher:

Abstract:

In the dual ex vivo perfusion of an isolated human placental cotyledon, it takes on average 20-30 min to set up stable perfusion circuits for the maternal and fetal vascular compartments. In vivo, placental tissue of all species maintains a highly active metabolism, and it continues to puzzle investigators how this tissue can survive 30 min of ischemia with more or less complete anoxia following expulsion of the organ from the uterus, and do so without severe damage. There seem to be parallels between the "depressed metabolism" seen in the fetus and the immature neonate in the peripartum period and the survival strategies described in animals with increased tolerance of severe hypoxia, such as hibernating mammals in the state of torpor or deep-sea diving turtles. Increased tolerance of hypoxia in both is explained by "partial metabolic arrest" in the sense of a temporary suspension of Kleiber's rule. Furthermore, the fetus can react to major changes in surrounding oxygen tension by decreasing or increasing the rate of specific basal metabolism, providing protection against severe hypoxia as well as oxidative stress. There is some evidence that the adaptive mechanisms allowing increased tolerance of severe hypoxia in the fetus or immature neonate can also be found in placental tissue, of which at least the villous portion is of fetal origin. A better understanding of the molecular details of the reprogramming of fetal and placental tissues in late pregnancy may be of clinical relevance for an improved risk assessment of the individual fetus during the critical transition from intrauterine life to the outside, and for the development of potential prophylactic measures against severe ante- or intrapartum hypoxia. Responses of the tissue to reperfusion deserve intensive study, since they may provide a rational basis for preventive measures against reperfusion injury and related oxidative stress. Modification of the handling of placental tissue during postpartum ischemia, and adaptation of the artificial reperfusion, may lead to an improvement of the ex vivo perfusion technique.

Relevance:

30.00%

Publisher:

Abstract:

Background: Cardiac arrests are handled by teams rather than by individual health-care workers. Recent investigations demonstrate that adherence to CPR guidelines can be less than optimal, that deviations from treatment algorithms are associated with lower survival rates, and that deficits in performance are associated with shortcomings in the process of team-building. The aim of this study was to explore and quantify the effects of ad-hoc team-building on adherence to the algorithms of CPR among two types of physicians that play an important role as first responders during CPR: general practitioners and hospital physicians. Methods: To unmask the effects of team-building, this prospective randomised study compared the performance of preformed teams, i.e. teams that had undergone their process of team-building prior to the onset of a cardiac arrest, with that of teams that had to form ad hoc during the cardiac arrest. Fifty teams consisting of three general practitioners each and 50 teams consisting of three hospital physicians each were randomised to two different versions of a simulated witnessed cardiac arrest: the arrest occurred either in the presence of only one physician while the remaining two physicians were summoned to help ("ad hoc"), or in the presence of all three physicians ("preformed"). All scenarios were videotaped and performance was analysed post hoc by two independent observers. Results: Compared to preformed teams, teams forming ad hoc had less hands-on time during the first 180 seconds of the arrest (93 ± 37 vs. 124 ± 33 sec, P < 0.0001), delayed their first defibrillation (67 ± 42 vs. 107 ± 46 sec, P < 0.0001), and made fewer leadership statements (15 ± 5 vs. 21 ± 6, P < 0.0001). Conclusion: Hands-on time and time to defibrillation, two performance markers of CPR with proven relevance for medical outcome, are negatively affected by shortcomings in the process of ad-hoc team-building, and particularly by deficits in leadership. Team-building thus has to be regarded as an additional task imposed on teams forming ad hoc during CPR. All physicians should be aware that early structuring of their own team is a prerequisite for the timely and effective execution of CPR.
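As a quick plausibility check (not part of the study), the reported group difference in hands-on time can be reproduced from the published summary statistics with a two-sample t-test; the sketch below assumes, for illustration only, an even split of 50 teams per arm.

```python
from scipy.stats import ttest_ind_from_stats

# Hedged sketch: reproduce the hands-on-time comparison (93 +/- 37 vs. 124 +/- 33 s)
# with Welch's two-sample t-test, assuming 50 teams per arm. equal_var=False avoids
# assuming identical variances in the two groups.
t, p = ttest_ind_from_stats(mean1=93, std1=37, nobs1=50,
                            mean2=124, std2=33, nobs2=50,
                            equal_var=False)
print(f"t = {t:.2f}, p = {p:.1e}")   # p falls well below 0.0001, consistent with the abstract
```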

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation or importation of very simple or detailed cellular structures in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment that mimics cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding, which may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are defined generically, so the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet the tool is powerful enough to design complex 3D cellular architecture. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and its design and implementation, provides an overview of the types of features available, and highlights the utility of those features in demonstrations.
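To make the event-driven idea concrete, the following minimal sketch (a generic illustration, not the CDS code) keeps a priority queue of per-molecule events ordered by their predicted times, executes the earliest valid event, and discards events invalidated by earlier state changes.

```python
import heapq
import itertools

# Minimal sketch of an event-driven simulation loop: each molecule's next event
# (for example a predicted collision or a unimolecular reaction) is pushed onto a
# priority queue keyed by its time; the earliest event is popped and executed, and
# events invalidated by earlier state changes are skipped via a per-molecule
# version tag. All names here are illustrative.

_tiebreak = itertools.count()

class Molecule:
    def __init__(self, name):
        self.name = name
        self.version = 0                  # bumped whenever the molecule's state changes

def schedule(queue, time, molecule, action):
    heapq.heappush(queue, (time, next(_tiebreak), molecule.version, molecule, action))

def run(queue, t_end):
    t = 0.0
    while queue and t < t_end:
        time, _, version, mol, action = heapq.heappop(queue)
        if version != mol.version:
            continue                      # stale event: the molecule changed after scheduling
        t = time
        mol.version += 1                  # executing the event changes the molecule's state
        action(t, mol)                    # the action may schedule follow-up events
    return t

if __name__ == "__main__":
    q = []
    a = Molecule("A")
    schedule(q, 0.7, a, lambda t, m: print(f"t={t:.2f}: {m.name} reacts"))
    schedule(q, 0.3, a, lambda t, m: print(f"t={t:.2f}: {m.name} collides"))
    run(q, t_end=10.0)                    # the later 'reacts' event is discarded as stale
```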

Relevance:

30.00%

Publisher:

Abstract:

Cloud computing has evolved into an enabler for delivering access to large-scale distributed applications running on managed, network-connected computing systems. This makes it possible to host Distributed Enterprise Information Systems (dEISs) in cloud environments while enforcing strict performance and quality-of-service requirements defined in Service Level Agreements (SLAs). SLAs define the performance boundaries of distributed applications and are enforced by a cloud management system (CMS) that dynamically allocates the available computing resources to the cloud services. We present two novel VM-scaling algorithms focused on dEIS systems, which detect the most appropriate scaling conditions using performance models of distributed applications derived from constant-workload benchmarks, together with SLA-specified performance constraints. We simulate the VM-scaling algorithms in a cloud simulator and compare them against trace-based performance models of dEISs. In total, we compare three SLA-based VM-scaling algorithms (one using prediction mechanisms) on a real-world application scenario involving a large, variable number of users. Our results show that it is beneficial to use autoregressive, predictive SLA-driven scaling algorithms in cloud management systems for guaranteeing performance invariants of distributed cloud applications, as opposed to using only reactive SLA-based VM-scaling algorithms.
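The contrast between reactive and predictive SLA-driven scaling can be illustrated with a small sketch (an assumption-laden illustration of the general idea, not the algorithms evaluated here): the reactive policy scales out only once the measured response time violates the SLA limit, whereas the predictive policy fits a simple autoregressive model to recent measurements and scales out as soon as the forecast violates it.

```python
# Sketch of the difference between reactive and predictive SLA-driven VM scaling.
# A reactive policy scales when the *measured* response time breaks the SLA limit;
# a predictive policy fits a simple AR(1) model to recent measurements and scales
# when the *forecast* would break it. Thresholds and data are illustrative.

def ar1_forecast(history, horizon=1):
    """One-step AR(1) forecast fitted by least squares on consecutive samples."""
    x, y = history[:-1], history[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x) or 1e-9
    phi = cov / var
    c = my - phi * mx
    value = history[-1]
    for _ in range(horizon):
        value = c + phi * value
    return value

def scaling_decision(response_times_ms, sla_limit_ms, predictive=True):
    if not predictive:
        return "scale-out" if response_times_ms[-1] > sla_limit_ms else "hold"
    return "scale-out" if ar1_forecast(response_times_ms) > sla_limit_ms else "hold"

if __name__ == "__main__":
    history = [120, 135, 150, 170, 195, 225]   # ms, trending towards the SLA limit
    print("reactive  :", scaling_decision(history, sla_limit_ms=250, predictive=False))
    print("predictive:", scaling_decision(history, sla_limit_ms=250, predictive=True))
```

With the illustrative trace above, the reactive policy still holds (225 ms < 250 ms) while the predictive policy already scales out, which is the qualitative behaviour the abstract attributes to the autoregressive approach.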

Relevance:

30.00%

Publisher:

Abstract:

It is known that the techniques grouped under the topic of Soft Computing have a strong capability for learning and cognition, as well as a good tolerance of uncertainty and imprecision. Thanks to these properties they can be applied successfully to Intelligent Transportation Systems (ITS), a broad range of technologies and techniques that hold answers to many transportation problems. The unmanned control of the steering wheel of a vehicle is one of the most important challenges facing researchers in this area. This paper presents a method to automatically adjust a fuzzy controller that manages the steering wheel of a mass-produced vehicle: information about the car state while a human driver is handling the car is recorded and used to adjust an appropriate fuzzy controller via iterative genetic algorithms. To evaluate the obtained controllers, the performance in the track-following task is considered, as well as the smoothness of the driving.
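A minimal sketch of this tuning loop is given below, assuming a toy fuzzy steering controller with a two-parameter genome and synthetic "human driver" samples; the controller structure, parameter ranges and fitness definition are illustrative assumptions, not the paper's implementation.

```python
import random

# Hedged sketch: a tiny genetic algorithm adjusts the parameters of a simple
# fuzzy steering controller so that its output matches steering commands
# recorded from a "human driver". Everything here is illustrative.

def fuzzy_steering(lateral_error_m, params):
    """Three triangular sets (left / centre / right) defuzzified by a weighted mean."""
    span, gain = params                        # genome: set width (m) and output gain
    span = max(span, 1e-3)                     # guard against degenerate genomes
    left = max(0.0, min(1.0, -lateral_error_m / span))
    right = max(0.0, min(1.0, lateral_error_m / span))
    centre = max(0.0, 1.0 - left - right)
    # rule consequents: steer left -> -gain, keep straight -> 0, steer right -> +gain
    return (left * -gain + centre * 0.0 + right * gain) / (left + centre + right)

def fitness(params, samples):
    """Mean squared error against the recorded steering commands."""
    return sum((fuzzy_steering(e, params) - s) ** 2 for e, s in samples) / len(samples)

def genetic_tuning(samples, pop_size=30, generations=60):
    pop = [(random.uniform(0.5, 5.0), random.uniform(0.1, 1.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, samples))        # elitist selection
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)               # crossover: average two parents
            children.append(tuple((x + y) / 2 + random.gauss(0, 0.05) for x, y in zip(a, b)))
        pop = parents + children
    return min(pop, key=lambda p: fitness(p, samples))

if __name__ == "__main__":
    # synthetic "driver" data: steering roughly proportional to the lateral error
    samples = [(e / 10.0, 0.3 * e / 10.0) for e in range(-20, 21)]
    best = genetic_tuning(samples)
    print("tuned (span, gain):", best, " mse:", fitness(best, samples))
```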

Relevance:

30.00%

Publisher:

Abstract:

Both autonomous mobile robots and remotely operated mobile robots are currently used successfully in a large number of fields, some as diverse as home cleaning, the movement of goods in warehouses or space exploration. However, as in other areas of computing, it is difficult to guarantee the absence of defects in the programs that control these devices. There are different ways of measuring the quality of a system in performing the functions for which it was designed; one of them is reliability. In most physical systems, reliability degrades as the system ages, generally because of wear. In software systems this does not usually happen: their defects are generally not acquired over time but are inserted during development. Focusing on the coding stage of software development, one can set up a study to determine the reliability of different algorithms, all valid for the same task, according to the defects that programmers may introduce. Such a basic study has several applications, for example choosing the algorithm least sensitive to defects for the development of a critical system, or establishing more demanding verification and validation procedures when an algorithm with a high sensitivity to defects must be used. This thesis studies the influence of certain types of software defects on the reliability of three multivariable speed controllers (PID, Fuzzy and LQR) acting on a specific mobile robot. The hypothesis is that the controllers studied offer different reliability when affected by similar defect patterns, and this has been confirmed by the results. From the viewpoint of experimental planning, the necessary tests were first carried out to determine whether controllers of the same family (PID, Fuzzy or LQR) offered similar reliability under the same experimental conditions. Once this was confirmed, a class representative of each controller family was chosen at random for a more exhaustive battery of tests, in order to obtain data that would allow a fuller comparison of the reliability of the controllers under study. The impossibility of performing a large number of tests with a real robot, and the need to avoid damaging a device that generally has a significant cost, made it necessary to build a multicomputer simulator of the robot. This simulator was used both to obtain well-adjusted controllers and to carry out the different runs required for the reliability experiment.
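The overall experimental idea can be sketched as mutation-style fault injection: perturb a controller parameter according to a defect pattern, run the faulty controller on a simulated task, and estimate reliability as the fraction of faulty runs that still meet the task requirement. The snippet below is such a sketch for a toy PID speed loop; the plant, defect patterns and tolerance are assumptions, not the thesis's actual simulator.

```python
import random

# Hedged sketch of the fault-injection idea: a defect pattern perturbs one
# controller gain, the faulty controller runs a toy speed-tracking task, and
# reliability is estimated as the fraction of faulty runs whose final tracking
# error stays within a tolerance. Plant, defect patterns and tolerance are
# illustrative assumptions.

def pid_speed_run(kp, ki, kd, target=1.0, steps=200, dt=0.05):
    """Toy first-order speed plant driven by a PID controller; returns the final error."""
    speed, integral, prev_err = 0.0, 0.0, target
    for _ in range(steps):
        err = target - speed
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv
        speed += dt * (-speed + u)               # first-order plant: ds/dt = -s + u
    return abs(target - speed)

def inject_fault(gains, severity=0.5):
    """Defect pattern: scale one randomly chosen gain (e.g. a wrong constant in the code)."""
    faulty = dict(gains)
    key = random.choice(list(faulty))
    faulty[key] *= random.choice([0.0, 1.0 - severity, 1.0 + severity])
    return faulty

def estimate_reliability(gains, runs=500, tolerance=0.1):
    ok = sum(pid_speed_run(**inject_fault(gains)) < tolerance for _ in range(runs))
    return ok / runs

if __name__ == "__main__":
    nominal = {"kp": 2.0, "ki": 1.0, "kd": 0.05}
    print("estimated reliability of the PID controller:", estimate_reliability(nominal))
```

Repeating the same loop for each controller family with the same defect patterns is what allows their reliabilities to be compared on equal terms.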

Relevance:

30.00%

Publisher:

Abstract:

Kinetic Monte Carlo (KMC) is a widely used technique to simulate the evolution of radiation damage inside solids. Despite the fact that this technique was developed several decades ago, there is no established, easily accessible simulation tool for researchers interested in this field, unlike in the case of molecular dynamics or density functional theory calculations. In fact, scientists must develop their own tools or use unmaintained ones in order to perform these types of simulations. To fulfil this need, we have developed MMonCa, the Modular Monte Carlo simulator. MMonCa has been developed using professional C++ programming techniques and has been built on top of an interpreted language, resulting in a powerful yet flexible, robust but customizable and easily accessible modern simulator. Both non-lattice and lattice KMC modules have been developed. We present in this conference, for the first time, the MMonCa simulator. Along with other (more detailed) contributions at this meeting, the versatility of MMonCa for studying a number of problems in different materials (particularly Fe and W) subject to a wide range of conditions will be shown. Regarding KMC simulations, we have studied neutron-generated cascade evolution in Fe (as a model material). Starting with a Frenkel pair distribution, we have followed the defect evolution up to 450 K. Comparison with previous simulations and experiments shows excellent agreement. Furthermore, we have studied a more complex system (He-irradiated W:C) using a previous parametrization [1]. He irradiation at 4 K followed by isochronal annealing steps up to 500 K has been simulated with MMonCa. The He energy was 400 eV or 3 keV. In the first case, no damage is associated with the He implantation, whereas in the second a significant Frenkel pair concentration (evolving into complex clusters) is associated with the He ions. We have been able to explain He desorption both in the absence and in the presence of Frenkel pairs, and we have also applied MMonCa to high He doses and fluxes at elevated temperatures. He migration and trapping dominate the kinetics of He desorption. These processes will be discussed and compared to experimental results. [1] C. S. Becquart et al., J. Nucl. Mater. 403 (2010) 75.
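For readers unfamiliar with the method, the core of a rejection-free (residence-time) KMC step can be sketched in a few lines, as below; this is a generic illustration with illustrative Arrhenius parameters, not MMonCa code.

```python
import math
import random

# Minimal sketch of the standard rejection-free (residence-time) KMC step:
# each possible event carries an Arrhenius rate, one event is selected with
# probability proportional to its rate, and the simulated time advances by an
# exponentially distributed increment. Prefactors and migration barriers below
# are purely illustrative.

KB_EV = 8.617333262e-5        # Boltzmann constant in eV/K

def arrhenius(prefactor_hz, barrier_ev, temperature_k):
    return prefactor_hz * math.exp(-barrier_ev / (KB_EV * temperature_k))

def kmc_step(events, temperature_k):
    """events: list of (name, prefactor_hz, barrier_ev). Returns (chosen name, dt)."""
    rates = [arrhenius(nu, eb, temperature_k) for _, nu, eb in events]
    total = sum(rates)
    r = random.random() * total
    chosen = events[-1][0]
    acc = 0.0
    for (name, _, _), rate in zip(events, rates):
        acc += rate
        if r <= acc:
            chosen = name
            break
    dt = -math.log(1.0 - random.random()) / total    # exponential waiting time
    return chosen, dt

if __name__ == "__main__":
    events = [("vacancy_jump", 1e13, 0.67), ("interstitial_jump", 1e13, 0.34)]
    t = 0.0
    for _ in range(5):
        name, dt = kmc_step(events, temperature_k=450.0)
        t += dt
        print(f"t = {t:.3e} s : {name}")
```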

Relevance:

30.00%

Publisher:

Abstract:

In many engineering fields, the integrity and reliability of structures are extremely important and are controlled through adequate knowledge of existing damage. Typically, achieving the level of knowledge necessary to characterize structural integrity involves the use of non-destructive testing techniques, which are often expensive and time-consuming. Nowadays, many industries seek to increase the reliability of the structures they use. Leading-edge techniques make it possible to monitor these structures and, in some cases, to detect incipient damage that could trigger catastrophic failures. Unfortunately, as the complexity of structures, components and systems increases, the risk of damage and failure also increases, while the detection of such failures and defects becomes more difficult. In recent years, the aerospace industry has made great efforts to integrate sensors within structures and to develop algorithms for determining structural integrity in real time. This philosophy has been called Structural Health Monitoring, and such structures have been called smart structures. These new types of structures integrate materials, sensors, actuators and algorithms to detect, quantify and locate damage within themselves. A novel methodology for damage detection in structures is proposed in this work. The methodology is based on strain measurements and consists in the development of strain-field pattern recognition techniques based on PCA (Principal Component Analysis) and other dimensionality reduction techniques. The use of fiber Bragg gratings and distributed sensing as strain sensors is proposed. The methodology has been validated through laboratory-scale tests and full-scale tests on complex structures. The effects of variable load conditions were studied, and several experiments were performed for static and dynamic load conditions, demonstrating that the methodology is robust under unknown load conditions.
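A minimal sketch of a PCA-based strain-field damage index is given below, assuming a baseline PCA model built from healthy-state strain snapshots and a squared-prediction-error (Q) score for new measurements; the synthetic data are illustrative, not the thesis's implementation.

```python
import numpy as np

# Hedged sketch of a PCA-based strain-field damage index: a PCA model is built
# from baseline (healthy) strain snapshots, and each new measurement is scored
# with the squared prediction error (Q statistic) of its residual outside the
# retained principal subspace. The synthetic data below are illustrative.

def fit_pca_baseline(strains_healthy, n_components=3):
    """strains_healthy: (n_samples, n_sensors) strain snapshots of the healthy state."""
    mean = strains_healthy.mean(axis=0)
    std = strains_healthy.std(axis=0) + 1e-12
    scaled = (strains_healthy - mean) / std
    _, _, vt = np.linalg.svd(scaled, full_matrices=False)
    return {"mean": mean, "std": std, "components": vt[:n_components]}

def q_statistic(model, strain_snapshot):
    """Squared prediction error of one snapshot with respect to the baseline model."""
    x = (strain_snapshot - model["mean"]) / model["std"]
    projection = model["components"].T @ (model["components"] @ x)
    residual = x - projection
    return float(residual @ residual)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    loading = rng.normal(size=(2, 10))                       # 10 strain sensors, 2 load patterns
    healthy = rng.normal(size=(200, 2)) @ loading + 0.1 * rng.normal(size=(200, 10))
    model = fit_pca_baseline(healthy)
    new_ok = rng.normal(size=2) @ loading + 0.1 * rng.normal(size=10)
    new_damaged = new_ok + 5.0 * np.eye(10)[4]               # local strain anomaly at sensor 4
    print("Q (healthy snapshot):", q_statistic(model, new_ok))
    print("Q (damaged snapshot):", q_statistic(model, new_damaged))
```

Because the baseline model captures the strain patterns produced by normal load variations, the Q score stays low for unknown but healthy load cases and rises when a local anomaly appears, which is the robustness property claimed above.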

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary algorithms are suitable for solving damage identification problems in a multiobjective context. However, the performance of these methods can deteriorate quickly with increasing noise intensity, which gives rise to numerous uncertainties. In this paper, a statistical structural damage detection method formulated in a multiobjective context is proposed. The statistical analysis takes into account the uncertainties existing in the structural model and in the measured structural modal parameters. The presented method is verified on a number of simulated damage scenarios, and the effects of noise and damage levels on damage detection are investigated.
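The multiobjective formulation can be illustrated with a small sketch (a generic illustration on a toy 3-DOF model, not the paper's method): a candidate damage pattern scales element stiffnesses, and separate natural-frequency and mode-shape residuals against noisy "measurements" serve as the objectives over which Pareto dominance is evaluated.

```python
import numpy as np

# Hedged sketch of a multiobjective damage-identification formulation on a toy
# 3-DOF spring-mass chain. A candidate damage vector scales element stiffnesses;
# frequency and mode-shape residuals against noisy "measurements" are the two
# objectives an evolutionary algorithm would minimise. All values are illustrative.

def modal_properties(stiffness_factors, k=1000.0, m=1.0):
    """Natural frequencies (rad/s) and mode shapes of a 3-DOF spring-mass chain."""
    k1, k2, k3 = k * np.asarray(stiffness_factors, dtype=float)
    K = np.array([[k1 + k2, -k2, 0.0],
                  [-k2, k2 + k3, -k3],
                  [0.0, -k3, k3]])
    eigvals, eigvecs = np.linalg.eigh(K / m)
    return np.sqrt(np.abs(eigvals)), eigvecs

def objectives(candidate, measured_freqs, measured_modes):
    freqs, modes = modal_properties(candidate)
    freq_residual = float(np.linalg.norm((freqs - measured_freqs) / measured_freqs))
    mac = [(modes[:, i] @ measured_modes[:, i]) ** 2 /
           ((modes[:, i] @ modes[:, i]) * (measured_modes[:, i] @ measured_modes[:, i]))
           for i in range(modes.shape[1])]
    mode_residual = float(np.sum(1.0 - np.array(mac)))       # 1 - MAC summed over modes
    return freq_residual, mode_residual

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

if __name__ == "__main__":
    true_damage = [1.0, 0.7, 1.0]                            # 30% stiffness loss in element 2
    f_meas, v_meas = modal_properties(true_damage)
    f_meas = f_meas * (1.0 + 0.01 * np.random.default_rng(1).normal(size=3))  # measurement noise
    o_undamaged = objectives([1.0, 1.0, 1.0], f_meas, v_meas)
    o_true = objectives(true_damage, f_meas, v_meas)
    print("objectives, undamaged guess:", o_undamaged)
    print("objectives, true damage    :", o_true)
    print("true damage dominates undamaged guess:", dominates(o_true, o_undamaged))
```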

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary algorithms are suitable for solving damage identification problems in a multiobjective context. However, the performance of these methods can deteriorate quickly with increasing noise intensity, which gives rise to numerous uncertainties. In this work, a statistical structural damage detection method formulated in a multiobjective context is proposed, taking the existing uncertainties into account. The presented method is verified on a number of simulated damage scenarios, and the effects of noise on damage detection are investigated.

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary algorithms are suitable for solving damage identification problems in a multi-objective context. However, the performance of these methods can deteriorate quickly with increasing noise intensity, which gives rise to numerous uncertainties. In this paper, a statistical structural damage detection method formulated in a multi-objective context is proposed. The statistical analysis takes into account the uncertainties existing in the structural model and in the measured structural modal parameters. The presented method is verified on a number of simulated damage scenarios, and the effects of noise and damage levels on damage detection are investigated.

Relevance:

30.00%

Publisher:

Abstract:

The PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM™ System is an integrated, embedded system based on ultrasonic guided waves, consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. Therefore, PAMELA devices, in addition to performing tests and transmitting the collected data to the controller, are capable of local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing decreases the data traffic over the network and allows the CPU load of the external computer to be reduced. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows the developer to download his or her own algorithm code and to add the new data processing algorithm to the device. The development of the SMA is done in a virtual machine with an Ubuntu Linux distribution that includes all the software tools necessary to carry out the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the necessary steps to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using the delay-and-sum algorithm is provided.
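As an illustration of the kind of processing such an algorithm performs, the sketch below implements a generic delay-and-sum damage map from baseline-subtracted signals (it is not the PAMELA code); the transducer layout, wave speed and sampling rate are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a generic delay-and-sum damage map: for each pixel of a grid,
# the actuator-to-pixel-to-sensor travel time of a guided wave is computed, the
# residual signals (damaged minus baseline) of all transducer pairs are sampled
# at those delays, and the summed amplitude forms the damage image.

def delay_and_sum(residuals, transducers, grid, wave_speed, fs):
    """
    residuals:   dict {(i, j): 1-D numpy array} residual signal for actuator i, sensor j
    transducers: (n, 2) array of transducer coordinates in metres
    grid:        (npix, 2) array of pixel coordinates in metres
    wave_speed:  group velocity of the guided-wave mode in m/s
    fs:          sampling frequency in Hz
    """
    image = np.zeros(len(grid))
    for (i, j), signal in residuals.items():
        d = (np.linalg.norm(grid - transducers[i], axis=1) +
             np.linalg.norm(grid - transducers[j], axis=1))   # actuator -> pixel -> sensor
        samples = np.clip(np.round(d / wave_speed * fs).astype(int), 0, len(signal) - 1)
        image += np.abs(signal[samples])
    return image

if __name__ == "__main__":
    fs, c = 1.0e6, 5000.0                                     # 1 MS/s, 5000 m/s (illustrative)
    transducers = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5], [0.0, 0.5]])
    xs, ys = np.meshgrid(np.linspace(0, 0.5, 51), np.linspace(0, 0.5, 51))
    grid = np.column_stack([xs.ravel(), ys.ravel()])
    # synthetic residuals: a short echo from a scatterer at (0.3, 0.2) for each pair
    defect = np.array([0.3, 0.2])
    residuals = {}
    for i in range(4):
        for j in range(4):
            if i == j:
                continue
            tof = (np.linalg.norm(defect - transducers[i]) +
                   np.linalg.norm(defect - transducers[j])) / c
            sig = np.zeros(2000)
            idx = int(tof * fs)
            sig[idx - 2: idx + 3] = 1.0
            residuals[(i, j)] = sig
    image = delay_and_sum(residuals, transducers, grid, c, fs)
    print("estimated defect location:", grid[np.argmax(image)])   # should be near (0.3, 0.2)
```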