858 results for Electrodynamic Shaker Control Loop Adaptive Filtering Inverse Modeling Algorithm
Robust controller design of a mobile wheelchair via an LMI approach to SPR systems with output feedback
Abstract:
This article discusses the design of a robust controller, applied to a mobile wheelchair, via Linear Matrix Inequalities (LMIs) to obtain Strictly Positive Real (SPR) systems. The contributions of this work are: the choice of a mathematical model for the mobile wheelchair with uncertainty in the position of the center of gravity (CG), the decoupling of the kinematic and dynamic systems, the linearization of the models, the construction of the parametric uncertainty matrices, and the proposal of the control loop and of a control law with a specified decay rate.
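The decay-rate specification mentioned in this abstract can be checked numerically: a closed-loop matrix A has decay rate α when A + αI is Hurwitz, which can be verified through a Lyapunov equation. A minimal sketch in Python (the matrix `A_cl` is a hypothetical example, not the wheelchair model from the article):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def has_decay_rate(A, alpha):
    """Check whether all eigenvalues of A lie left of -alpha, i.e. the
    closed loop decays at least as fast as exp(-alpha*t), by solving the
    shifted Lyapunov equation (A + alpha*I)^T P + P (A + alpha*I) = -Q."""
    n = A.shape[0]
    As = A + alpha * np.eye(n)
    Q = np.eye(n)
    P = solve_continuous_lyapunov(As.T, -Q)   # As^T P + P As = -Q
    # The decay rate holds iff P is symmetric positive definite.
    return np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)

# Hypothetical closed-loop matrix (illustrative, not from the paper):
A_cl = np.array([[0.0, 1.0], [-4.0, -2.0]])
print(has_decay_rate(A_cl, 0.5))  # eigenvalues -1 ± j*sqrt(3) → True
```

In an LMI design the matrix P would instead be a decision variable found by a semidefinite solver; the check above only verifies a given closed loop.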
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Ocean Island Basalts (OIB) provide important information on the chemical and physical characteristics of their mantle sources. However, the geochemical composition of a generated magma is significantly affected by partial melting and/or subsequent fractional crystallization processes. In addition, the isotopic composition of an ascending magma may be modified during transport through the oceanic crust. The influence of these different processes on the chemical and isotopic composition of OIB from two localities, Hawaii and Tubuai in the Pacific Ocean, is investigated here. In a first chapter, the Os-isotope variations in suites of lavas from Kohala Volcano, Hawaii, are examined to constrain the role of melt/crust interactions in the evolution of these lavas. Because the sensitivity of 187Os/188Os to any radiogenic contaminant strongly depends on the Os content of the melt, Os and other platinum-group element (PGE) variations are investigated first. This study reveals that the behavior of Os and the other PGEs changes during Hawaiian magma differentiation. While PGE concentrations are relatively constant in lavas with relatively primitive compositions, all PGE contents decrease strongly in the melt once it evolves below ~8% MgO. This likely reflects sulfur saturation of the Hawaiian magma and the onset of sulfide fractionation at around 8% MgO. Kohala tholeiites with more than 8% MgO and high Os contents have homogeneous 187Os/188Os values that likely represent the mantle signature of Kohala lavas. However, Os-isotope ratios become more radiogenic with decreasing MgO and Os contents in the lavas, which reflects assimilation of local crustal material during fractional crystallization. Less than 8% assimilation of upper oceanic crust could have produced the most radiogenic Os-isotope ratios recorded in the shield lavas.
However, these small amounts of upper-crust assimilation have only negligible effects on Sr and Nd isotopic ratios and are therefore not responsible for the Sr and Nd isotopic heterogeneities observed in Kohala lavas. In a second chapter, fractional crystallization and partial melting processes are constrained using major and trace element variations in the same suites of lavas from Kohala Volcano, Hawaii. This inverse modeling approach allows most of the trace element composition of the Hawaiian mantle source to be estimated. The calculated initial trace element pattern shows a slight depletion in concentration from the LREE towards the most incompatible elements, which indicates that the incompatible element enrichments described by the Hawaiian melt patterns are entirely produced by partial melting. The "Kea trend" signature of lavas from Kohala Volcano is also confirmed, with Kohala lavas having lower Sr/Nd and La/Th ratios than lavas from Mauna Loa Volcano. Finally, the magmatic evolution of Tubuai Island is investigated in a last chapter using trace element and Sr, Nd, and Hf isotopic variations in mafic lava suites. The Sr, Nd, and Hf isotopic data are homogeneous, typical of HIMU-type OIB, and confirm the cogenetic nature of the different mafic lavas from Tubuai Island. The trace element patterns show a progressive enrichment in incompatible trace elements with increasing alkali content in the lavas, which reflects a progressive decrease in the degree of partial melting towards the later volcanic events. In addition, this enrichment in incompatible trace elements is associated with relative depletions in Rb, Ba, K, Nb, Ta, and Ti, which require the presence of small amounts of residual phlogopite and of a Ti-bearing phase (ilmenite or rutile) during formation of the younger analcitic and nephelinitic magmas.
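The inverse-modeling step described above rests on forward relations such as the standard batch-melting equation, C_l = C_0 / (D + F(1 − D)), which can be inverted for the source concentration. A minimal sketch with illustrative values (not the thesis' data):

```python
# Batch-melting forward model: C_l = C_0 / (D + F*(1 - D)), where C_0 is
# the source concentration, D the bulk partition coefficient, and F the
# melt fraction. Inverting it recovers the source concentration.
def batch_melt(C0, D, F):
    return C0 / (D + F * (1 - D))

def source_conc(Cl, D, F):
    return Cl * (D + F * (1 - D))

# Illustrative round trip with D = 0.01, F = 0.08:
print(source_conc(batch_melt(10.0, 0.01, 0.08), 0.01, 0.08))  # ≈ 10.0
```

A real inversion would fit D, F, and C_0 jointly across many elements and samples; this sketch only shows the algebraic core.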
Abstract:
The present thesis focuses on the problem of robust output regulation for minimum-phase nonlinear systems by means of identification techniques. Given a controlled plant and an exosystem (an autonomous system that generates the references or disturbances), the control goal is to design a regulator able to process the only available measurement, i.e. the error/output variable, so as to make it vanish asymptotically. In this context, such a regulator can be designed following the well-known "internal model principle", which states that the regulation objective can be achieved by embedding a replica of the exosystem model in the controller structure. The main problem arises when the exosystem model is affected by parametric or structural uncertainties: in this case the exact behavior of the exogenous system cannot be reproduced in the regulator, and the control goal cannot be achieved. In this work, the idea is to address this problem by developing a general framework in which a standard regulator coexists with an estimator able to guarantee (when possible) the best estimate of all the uncertainties present in the exosystem, in order to make the overall control loop robust.
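The internal model principle can be illustrated with the simplest exosystem, w' = 0 (constant references and disturbances), whose replica inside the controller is an integrator. A minimal simulation sketch, with a hypothetical first-order plant and illustrative gains (not the thesis' regulator):

```python
# Plant: x' = -x + u + d, measured error e = x - r.
# Exosystem: constant reference r and constant disturbance d (w' = 0).
# Internal model principle: embed an integrator (replica of w' = 0) in
# the controller so that e -> 0 despite r and d being unknown to it.
dt, T = 0.01, 10.0
r, d = 1.0, -0.3          # generated by the exosystem, unknown to controller
x, z = 0.0, 0.0           # plant state, internal-model (integrator) state
kp, ki = 5.0, 10.0        # illustrative stabilizing gains
for _ in range(int(T / dt)):
    e = x - r
    z += dt * e           # internal model of the constant exosystem
    u = -kp * e - ki * z
    x += dt * (-x + u + d)  # forward-Euler step of the plant
print(abs(x - r) < 1e-3)  # True: output regulated without knowing r or d
```

The integrator "learns" the constant input needed at steady state; for a sinusoidal exosystem the replica would be an oscillator at the exosystem frequency instead.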
Abstract:
We noninvasively detected the characteristics and location of a regional fault in an area of poor bedrock exposure complicated by karst weathering features in the subsurface. Because this regional fault is associated with sinkhole formation, its location is important for hazard avoidance. The bedrock lithologies on either side of the fault trace are similar; hence, we chose an approach that capitalized on the complementary strengths of very low frequency (VLF) electromagnetic, resistivity, and gravity methods. VLF proved most useful as a first-order reconnaissance tool, allowing us to define a narrow target area for further geophysical exploration. Fault-related epikarst was delineated using resistivity. Ultimately, a high-resolution gravity survey and subsequent inverse modeling using the results of the resistivity survey helped to further constrain the location and approximate orientation of the fault. The combined results indicated that the location of the fault trace needed to be adjusted 53 m south of the currently published location and was consistent with a north-dipping thrust fault. Additionally, a gravity low south of the fault trace agreed with the location of conductive material identified in the resistivity and VLF surveys. We interpreted these anomalies as representing enhanced epikarst in the fault footwall. We found that a staged approach involving a progression of methods, beginning with a reconnaissance VLF survey followed by high-resolution gravity and electrical resistivity surveys, can be used to characterize a fault and fault-related karst in an area of poor bedrock exposure.
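Gravity inverse modeling of the kind described above can be sketched, in highly simplified form, as a nonlinear least-squares fit of an anomaly profile. The point-mass model and values below are purely illustrative, not the authors' survey geometry:

```python
import numpy as np
from scipy.optimize import curve_fit

# Point-mass (buried sphere) gravity anomaly along a profile:
#   g(x) = k * z / (x**2 + z**2)**1.5
# where z is source depth and k lumps G, density contrast, and volume.
def anomaly(x, k, z):
    return k * z / (x**2 + z**2) ** 1.5

x = np.linspace(-100.0, 100.0, 81)        # profile positions (m)
g_true = anomaly(x, 5.0e4, 30.0)          # synthetic "observed" anomaly
g_obs = g_true + np.random.default_rng(0).normal(0.0, 1e-3, x.size)

# Invert for k and z from the noisy profile:
(k_fit, z_fit), _ = curve_fit(anomaly, x, g_obs, p0=[1.0e4, 10.0])
print(z_fit)  # recovered depth, close to the true 30.0 m
```

Real surveys invert 2-D/3-D density distributions, often with the resistivity model as a constraint; the sketch only shows the fitting principle.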
Abstract:
Introduction Reconstitution of peripheral blood (PB) B cells after therapeutic depletion with the chimeric anti-CD20 antibody rituximab (RTX) mimics lymphatic ontogeny. In this situation, the repletion kinetics and migratory properties of distinct developmental B-cell stages and their correlation to disease activity might facilitate our understanding of innate and adaptive B-cell functions in rheumatoid arthritis (RA). Methods Thirty-five 'RTX-naïve' RA patients with active arthritis were treated, after failure of tumour necrosis factor blockade, in an open-label study with two infusions of 1,000 mg RTX. The prednisone dose was tapered according to clinical improvement from a median of 10 mg at baseline to 5 mg at 9 and 12 months. Conventional disease-modifying antirheumatic drugs were kept stable. Subsets of CD19+ B cells were assessed by flow cytometry according to their IgD and CD27 surface expression. Their absolute number and relative frequency in PB were followed every 3 months and were determined in parallel in synovial tissue (n = 3) or synovial fluid (n = 3) in the case of florid arthritis. Results Six of 35 patients fulfilled the European League Against Rheumatism criteria for a moderate clinical response, and 19 others for a good clinical response. All PB B-cell fractions decreased significantly in number (P < 0.001) after the first infusion. Disease activity developed independently of the total B-cell number. B-cell repopulation was dominated in quantity by CD27-IgD+ 'naïve' B cells. The low number of CD27+IgD- class-switched memory B cells (MemB) in the blood, together with a sustained reduction of rheumatoid factor serum concentrations, correlated with a good clinical response. Class-switched MemB were found to accumulate in flaring joints. Conclusions The present data support the hypothesis that control of adaptive immune processes involving germinal centre-derived, antigen- and T-cell-dependently matured B cells is essential for successful RTX treatment.
Abstract:
Directly imaged exoplanets are unexplored laboratories for the application of the spectral and temperature retrieval method, in which the chemistry and composition of their atmospheres are inferred from inverse modeling of the available data. As a pilot study, we focus on the extrasolar gas giant HR 8799b, for which more than 50 data points are available. We upgrade our non-linear optimal estimation retrieval method to include a phenomenological model of clouds that requires the cloud optical depth and a monodisperse particle size to be specified. Previous studies have focused on forward models with assumed values of the exoplanetary properties; there is no consensus on the best-fit values of the radius, mass, surface gravity, and effective temperature of HR 8799b. We show that cloud-free models produce reasonable fits to the data if the atmosphere has super-solar metallicity and non-solar elemental abundances. Intermediate cloudy models with moderate values of the cloud optical depth and micron-sized particles provide an equally reasonable fit to the data and require a lower mean molecular weight. We report our best-fit values for the radius, mass, surface gravity, and effective temperature of HR 8799b. The mean molecular weight is about 3.8, while the carbon-to-oxygen ratio is about unity due to the prevalence of carbon monoxide. Our study emphasizes that robust claims about the nature of an exoplanetary atmosphere need to be based on analyses involving both photometry and spectroscopy, rather than the few photometric data points typically reported for hot Jupiters.
Abstract:
Seismic velocities in rocks are influenced by the properties of the solid, the pore fluid, and the pore space. Cracks dramatically affect seismic velocities in rocks; their influence on the effective elastic moduli of rocks depends on their shape and concentration. Thin cracks (or fractures) substantially lower the moduli of a rock relative to the effect of spherical voids (or vesicles), and lower moduli are reflected by lower P- and S-wave velocities. The objective of this research is to determine the types and concentrations of cracks and their influence on the seismic properties of subaerially erupted basalts drilled from Hole 990A on the Southeast Greenland margin during Ocean Drilling Program Leg 163. Ellipsoidal cracks are used to model the voids in the rocks. The elastic moduli of the solid (grains) are also free parameters in the inverse modeling procedure. The apparent grain moduli reflect a weighted average of the moduli of the constituent minerals (e.g., plagioclase, augite, and clay minerals). The results indicate that (1) there is a strong relationship between P-wave velocity and porosity, suggesting a similarity of pore shape distributions, (2) the distribution of crack types within the massive, central region of aa flows from Hole 990A is independent of total porosity, (3) thin cracks are the first to be effectively sealed by alteration products, and (4) grain densities (an alteration index) and apparent grain moduli of the basalt samples are directly related.
Abstract:
The grain size of deep-sea sediments provides an apparently simple proxy for current speed. However, grain-size-based proxies may be ambiguous when the size distribution reflects a combination of processes, with current sorting only one of them. In particular, such sediment mixing hinders reconstruction of deep circulation changes associated with ice-rafting events in the glacial North Atlantic, because variable ice-rafted detritus (IRD) input may falsely suggest current-speed changes. Inverse modeling has been suggested as a way to overcome this problem. However, this approach requires high-precision size measurements that register small changes in the size distribution. Here we show that such data can be obtained using both electrosensing and laser diffraction techniques, despite previously raised concerns about the low precision of electrosensing methods and potential grain-shape effects on laser diffraction. Down-core size patterns obtained from a North Atlantic sediment core are similar for both techniques, confirming that they yield comparable results. However, IRD input leads to a coarsening that spuriously suggests faster currents. We show that this IRD influence can be accounted for using inverse modeling as long as wide size spectra are taken into account, yielding current-speed variations that agree with other proxies. Our experiments thus show that, for current-speed reconstruction, the choice of instrument is subordinate to a proper recognition of the various processes that determine the size distribution, and that with inverse modeling meaningful current-speed reconstructions can be obtained from mixed sediments.
Abstract:
The runtime management of the infrastructure providing service-based systems is a complex task, to the point where manual operation struggles to be cost-effective. As the functionality is provided by a set of dynamically composed distributed services, achieving a management objective requires multiple operations to be applied over the distributed elements of the managed infrastructure. Moreover, the manager must cope with the highly heterogeneous characteristics and management interfaces of the runtime resources. With this in mind, this paper proposes to support the configuration and deployment of services with an automated closed control loop. The automation is enabled by the definition of a generic information model, which captures all the information relevant to the management of the services with the same abstractions, describing the runtime elements, service dependencies, and business objectives. On top of that, a technique based on satisfiability is described which automatically diagnoses the state of the managed environment and obtains the changes required to correct it (e.g., installation, service binding, update, or configuration). The results from a set of case studies extracted from the banking domain are provided to validate the feasibility of this proposal.
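A satisfiability-based diagnosis of this kind can be sketched, in toy form, as a search for the satisfying configuration closest to the current state; the services, dependency, and business objective below are hypothetical, not from the paper:

```python
from itertools import product

# Toy managed environment: three boolean "running" states.
services = ["db", "auth", "payments"]
current = {"db": True, "auth": False, "payments": False}

def satisfies(state):
    # Business objective: "payments" must be running.
    objective = state["payments"]
    # Dependency constraint: payments -> (db and auth).
    dependency = (not state["payments"]) or (state["db"] and state["auth"])
    return objective and dependency

# Diagnose: among all satisfying assignments, pick the one requiring the
# fewest changes to the current state; the difference is the change plan.
best = min(
    (dict(zip(services, bits))
     for bits in product([False, True], repeat=len(services))
     if satisfies(dict(zip(services, bits)))),
    key=lambda s: sum(s[k] != current[k] for k in services),
)
changes = {k: best[k] for k in services if best[k] != current[k]}
print(changes)  # {'auth': True, 'payments': True}
```

A real implementation would hand the constraints to a SAT/SMT solver rather than enumerate assignments, but the diagnosis-as-satisfiability idea is the same.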
Abstract:
Reducing accidents in highway and urban environments is the main motivation for the research and development of driver assistance systems, also called ADAS (Advanced Driver Assistance Systems). In recent years many of these systems have appeared in commercial vehicles: ABS, Cruise Control (CC), parking assistance, and warning systems (including GPS), among others. However, driver assistance systems acting on the steering wheel are less common because of their complexity and sensitivity. This paper focuses on the development, testing, and implementation of a driver assistance system for controlling the steering wheel in curves. The system is divided into two levels: an inner control loop that executes the position and speed commands, smoothing the action on the steering wheel, and an outer control loop (based on fuzzy logic) that sends the reference to the inner loop according to the environment and vehicle conditions. Tests were carried out on different curves and at different speeds, and the system was validated in a commercial vehicle with satisfactory results.
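An outer fuzzy loop of this kind can be sketched with triangular membership functions and centroid defuzzification; the rule base, ranges, and gains below are illustrative stand-ins, not the paper's controller:

```python
# Toy fuzzy outer loop: maps lateral error (m) to a steering-wheel
# reference (deg). Rules and numbers are hypothetical.
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def steering_ref(error):
    # Rule base: error left / small / right -> steer right / straight / left.
    rules = [
        (tri(error, -3.0, -1.5, 0.0), 30.0),   # error left  -> steer right
        (tri(error, -1.5, 0.0, 1.5), 0.0),     # error small -> keep straight
        (tri(error, 0.0, 1.5, 3.0), -30.0),    # error right -> steer left
    ]
    # Centroid (weighted average) defuzzification:
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0

print(steering_ref(0.0), steering_ref(-1.5))  # 0.0 30.0
```

The inner loop would then track `steering_ref` with a conventional position/speed controller on the steering actuator.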
Abstract:
The project, "LMS adaptive filtering applications to improve the response of accelerometers", was carried out to remove unwanted components from the information signal produced by accelerometers in automotive applications, using LMS adaptive filtering algorithms. The work comprised three stages, executed from the first to the last day of the project.
In the first stage we designed low-pass, high-pass, band-pass, and band-stop filters of the Butterworth, Chebyshev (types I and II), and elliptic families. The goal of this part was to become familiar with (or, in our case, revisit) the Matlab environment and its built-in filter-design functions, and to study the characteristics of these filters before testing them on the DSP.
In the second stage, we focused on the design of our LMS adaptive filter, experimenting first in Matlab in order to understand its behavior. Once this was clear, we loaded the code onto the DSP, compiling and debugging it with Visual DSP. During this stage the system inputs were driven with signals generated in Cool Edit Pro and, to study how the LMS adaptive filter behaved with a phase shift between the two input signals, with signals from a function generator; Cool Edit Pro could also produce phase-shifted signals, but since it could not be used in the third stage, those tests were run with the function generator.
Finally, in the third stage, after verifying the desired behavior of our adaptive filter on the DSP with simulated inputs, we moved to a laboratory, where we used signals from a 4000A accelerometer together with the function generator, which supplied the reference signal used to cancel one of the frequencies emitted by the accelerometer. The LMS adaptive filter behaved as expected. In tests with phase-shifted input signals, we observed that the residual of the frequency to be cancelled grew as the phase shift between the signals increased; this was solved by increasing the filter order.
We conclude that, although the digital filters tested in the first stage are useful, obtaining a response as close to ideal as possible requires a very high filter order so that frequencies near the cutoff are not attenuated. With LMS adaptive filters, by contrast, to remove one signal out of three it is enough to feed the frequency to be removed into one of the filter inputs, namely the reference signal; that component is then cancelled while the other two remain unaffected.
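The two-input LMS noise-cancellation scheme described above (a reference input carrying the frequency to be removed, so that the other components pass unaffected) can be sketched in Python with NumPy; the frequencies, filter order, and step size are illustrative, not those used on the DSP:

```python
import numpy as np

# Two-input LMS noise canceller: the primary input carries signal +
# interference; the reference input is a tone correlated only with the
# interference. The filter adapts so that the error output keeps the
# signal and cancels the interfering tone.
n = np.arange(20000)
signal = np.sin(2 * np.pi * 0.01 * n)                # wanted component
interf = 0.8 * np.sin(2 * np.pi * 0.05 * n + 0.7)    # tone to cancel
primary = signal + interf
reference = np.sin(2 * np.pi * 0.05 * n)             # out-of-phase reference

order, mu = 16, 0.01
w = np.zeros(order)
out = np.zeros_like(primary)
for k in range(order, len(n)):
    x = reference[k - order + 1:k + 1][::-1]  # tap-delay line, newest first
    y = w @ x                                 # interference estimate
    e = primary[k] - y                        # error = cleaned signal
    w += 2 * mu * e * x                       # LMS weight update
    out[k] = e

# Amplitude of the interference tone in the last 4000 output samples:
tail = np.arange(16000, 20000)
amp = np.abs(np.exp(-2j * np.pi * 0.05 * tail) @ out[tail]) * 2 / tail.size
print(amp)  # small: the tone is strongly attenuated, the signal remains
```

The adaptive filter only needs the reference to contain the frequency to be removed; it learns the amplitude and phase shift itself, which is exactly the behavior described in the experiments above.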
Abstract:
This paper presents the complete development of the Simbiosis Smart Walker. The device is equipped with a set of sensor subsystems to acquire user-machine interaction forces and the temporal evolution of the user's feet during gait. The authors present an adaptive filtering technique used for the identification and separation of the different components found in the human-machine interaction forces. This technique made it possible to isolate the components related to the navigational commands and to develop a fuzzy logic controller to guide the device. The Smart Walker was clinically validated at the Spinal Cord Injury Hospital of Toledo, Spain, and was well accepted by spinal cord injury patients and clinical staff.
Abstract:
Diabetes mellitus is a disorder of carbohydrate metabolism resulting from an insufficient or absent production of insulin by the beta cells of the pancreas, or from a reduced sensitivity of the metabolic system to insulin. Type 1 diabetes is characterized by the absence of insulin production due to the destruction of the pancreatic beta cells. Without insulin in the bloodstream, glucose cannot be absorbed by the cells, producing a state of hyperglycemia in the patient which, if left untreated, can in the medium and long term cause severe illnesses known as diabetes syndromes. Type 1 diabetes is an incurable but controllable disease. Therapy consists of exogenous insulin administration with the goal of keeping the blood glucose level within normal limits. Among the many ways of administering insulin, this project uses an infusion pump which, together with a subcutaneous glucose sensor, makes it possible to build an autonomous control loop that regulates the optimal amount of insulin delivered at each moment. When the control algorithm runs on a digital system together with the subcutaneous sensor and the subcutaneous infusion pump, the result is known as an ambulatory artificial endocrine pancreas (PAE), still in the research phase today. These metabolic control algorithms must be evaluated in simulation to guarantee the physical safety of patients, so a simulation system is needed to ensure the reliability of the PAE. Such a simulation system connects the algorithms to mathematical metabolic models to obtain a preview of their behavior.
In this context DIABSIM was designed, a tool developed in LabVIEW and later ported to MATLAB, based on the compartmental mathematical model proposed by Hovorka, with which different types of therapies and closed-loop regulators can be simulated and evaluated. Once these therapies and regulators have been simulated and evaluated, verifying that they work requires real experimentation through a real clinical trial protocol, as a step prior to the ambulatory PAE. To manage this real clinical trial protocol for the verification of control algorithms, a user interface was created from a set of MATLAB functions for simulating and evaluating insulin therapies (GUI: Graphical User Interface), known as the Artificial Pancreas Environment with Clinical Interface (EPIC). EPIC has already been used in 10 clinical trials, from which possible improvements, extensions, and changes have been proposed. This project proposes an improved version of the EPIC user interface developed in a previous project, to manage a real clinical trial protocol for the verification of control algorithms in a tightly controlled hospital environment, and also studies the feasibility of connecting the GUI to Simulink (Matlab's graphical system-simulation environment) so that it can interface with a new patient simulator approved by the JDRF (Juvenile Diabetes Research Foundation).
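The closed control loop described above (subcutaneous sensor, control algorithm, infusion pump) can be illustrated with a deliberately toy simulation; the one-compartment linear model and gains below are illustrative stand-ins, not the Hovorka model or any clinically validated controller:

```python
# Toy closed-loop glucose regulation sketch (NOT the Hovorka model, NOT
# clinical advice): a one-compartment linear glucose/insulin model with a
# proportional-integral controller standing in for the control algorithm.
dt = 1.0                       # minutes
G, I = 180.0, 0.0              # glucose (mg/dL), insulin action state
target = 100.0                 # sensor setpoint
kp, ki, integ = 0.02, 0.0005, 0.0
for _ in range(int(24 * 60 / dt)):         # simulate 24 h
    err = G - target                       # sensor reading vs. target
    integ += err * dt
    u = max(0.0, kp * err + ki * integ)    # pump infusion, non-negative
    I += dt * (-0.05 * I + u)              # insulin dynamics
    G += dt * (2.0 - 0.01 * G - 0.1 * I)   # production, clearance, action
print(G)  # settles near the 100 mg/dL target
```

Real evaluation would replace this toy plant with a validated metabolic model (e.g., Hovorka's, as in DIABSIM) and the PI law with the candidate control algorithm under test.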
Abstract:
In this paper, we introduce the B2DI model, which extends the BDI model to perform Bayesian inference under uncertainty. For scalability and flexibility, Multiply Sectioned Bayesian Network (MSBN) technology has been selected and adapted to BDI agent reasoning. A belief update mechanism has been defined for agents whose belief models are connected by public shared beliefs, with the certainty of these beliefs updated through the MSBN. The classical BDI agent architecture has been extended to manage uncertainty using Bayesian reasoning. The resulting extended model, called B2DI, proposes a new control loop. The proposed B2DI model has been evaluated in a network fault diagnosis scenario, comparing it against two previously developed agent models on a real testbed diagnosis scenario using JADEX. As a result, the proposed model exhibits significant improvements in the cost and time required to carry out a reliable diagnosis.
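The Bayesian belief update at the core of such an agent can be illustrated in its simplest single-variable form (a toy fault-diagnosis belief, not an MSBN; the states and probabilities are hypothetical):

```python
# Single-belief Bayesian update: posterior over a network link's state
# after observing a failed ping. Numbers are illustrative.
prior = {"link_down": 0.1, "link_up": 0.9}
likelihood = {"link_down": 0.95, "link_up": 0.05}   # P(ping fails | state)

evidence = sum(prior[s] * likelihood[s] for s in prior)
posterior = {s: prior[s] * likelihood[s] / evidence for s in prior}
print(posterior["link_down"])  # ≈ 0.679: belief revised upward
```

An MSBN distributes this computation across agents, each holding a subnet, with consistency maintained over the shared (public) beliefs at the subnet interfaces.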