700 results for Multilayer Perceptron
Abstract:
This work addresses the development of a non-viral, efficient transfection agent with a core-shell structure on a size scale of up to 100 nm. To this end, magnetic, negatively charged iron oxide nanoparticles with a diameter of 17 nm are synthesized by thermolysis and transferred into water. These nanoparticles form the core of the genetic-material carrier and are coated by the layer-by-layer (LbL) technique with charged polymers, the biodegradable macromolecules poly-L-lysine and heparin. For this purpose, a suitable apparatus is first constructed. It is used to prepare core-shell structures with five polyelectrolyte layers and yields particles with a hydrodynamic diameter of 58 nm that are free of aggregates in the absence of low-molecular-weight salt. The system is stabilized against salt by modifying the last poly-L-lysine layer with polyethylene glycol. The resulting multishell particles show no aggregation in PBS buffer or in human serum. Aggregate formation is monitored by angle-dependent dynamic light scattering, while ζ-potential measurements allow the surface charge to be controlled. Since siRNA, owing to its negatively charged phosphate backbone, is also a polyelectrolyte, it is deposited aggregate-free onto the positively charged PLL nanoparticles. The siRNA used is dye-labeled to allow detection in vitro. However, the resulting complexes cannot be detected by fluorescence correlation spectroscopy (FCS). The fluorescent labeling of the PEGylated outer shell by copper-free click chemistry is likewise not visible in FCS, so fluorescence quenching of the dyes within the multishell particles is suspected.
Abstract:
Pericyte loss and capillary regression are characteristic of incipient diabetic retinopathy. Pericyte recruitment is involved in vessel maturation, and ligand-receptor systems contributing to pericyte recruitment are survival factors for endothelial cells in pericyte-free in vitro systems. We studied pericyte recruitment in relation to the susceptibility toward hyperoxia-induced vascular remodeling using the pericyte reporter X-LacZ mouse and the mouse model of retinopathy of prematurity (ROP). Pericytes were found in close proximity to vessels during formation of both the superficial and the deep capillary layers. When exposure of mice to the ROP model was delayed by 24 h, i.e., after the deep retinal layer had formed [at postnatal (p) day 8], preretinal neovascularizations were substantially diminished at p18. Mice with a delayed ROP exposure had 50% reduced avascular zones. Formation of the deep capillary layers at p8 was associated with a combined up-regulation of angiopoietin-1 and PDGF-B, while VEGF was almost unchanged during the transition from a susceptible to a resistant capillary network. Inhibition of Tie-2 function, either by soluble Tie-2 or by a sulindac analog, an inhibitor of Tie-2 phosphorylation, resensitized retinal vessels to neovascularization by reducing the deep capillary network. Inhibition of Tie-2 function had no effect on pericyte recruitment. Our data indicate that the final maturation of the retinal vasculature and its resistance to regressive signals such as hyperoxia depend on the completion of the multilayer structure, in particular the deep capillary layers, and are independent of coverage by pericytes.
Abstract:
The goal of this research is to provide a framework for the vibro-acoustic analysis and design of a multiple-layer constrained damping structure. Existing research on damping and the viscoelastic damping mechanism is limited to four mainstream approaches: modeling techniques for damping treatments/materials; control through the electromechanical effect using a piezoelectric layer; optimization by adjusting the parameters of the structure to meet design requirements; and identification of the damping material's properties through the response of the structure. This research proposes a systematic design methodology for the multiple-layer constrained damping beam giving consideration to vibro-acoustics. A modeling technique to study the vibro-acoustics of multiple-layered viscoelastic laminated beams using the Biot damping model is presented using a hybrid numerical model. The boundary element method (BEM) is used to model the acoustic cavity, whereas the finite element method (FEM) is the basis for the vibration analysis of the multiple-layered beam structure. Through the proposed procedure, the analysis can easily be extended to other complex geometries with arbitrary boundary conditions. The nonlinear behavior of viscoelastic damping materials is represented by the Biot damping model, taking into account the effects of frequency, temperature, and different damping materials for individual layers. A curve-fitting procedure used to obtain the Biot constants of different damping materials at each temperature is explained. The results from the structural vibration analysis of selected beams agree with published closed-form results, and the radiated noise for a sample beam structure obtained using commercial BEM software is compared with the acoustic results for the same beam using the Biot damping model.
The extension of the Biot damping model is demonstrated on the MDOF (Multiple Degrees of Freedom) dynamics equations of a discrete system in order to introduce different types of viscoelastic damping materials. The mechanical properties of viscoelastic damping materials, such as the shear modulus and loss factor, change with ambient temperature and frequency. The application of a multiple-layer treatment significantly increases the damping of the structure and thus helps to attenuate vibration and noise over a broad range of frequencies and temperatures. The main contributions of this dissertation include the following three major tasks: 1) Study of the viscoelastic damping mechanism and the dynamics equation of a multilayer damped system incorporating the Biot damping model. 2) Building the Finite Element Method (FEM) model of the multiple-layer constrained viscoelastic damping beam and conducting the vibration analysis. 3) Extending the vibration problem to the Boundary Element Method (BEM) based acoustical problem and comparing the results with commercial simulation software.
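The frequency and temperature dependence that the Biot model captures can be sketched numerically. A minimal illustration, assuming the common mini-oscillator form G(s) = G∞(1 + Σₖ aₖ·s/(s + bₖ)) for the complex shear modulus; the constants below are invented for illustration, not fitted values from this dissertation:

```python
import numpy as np

# Biot damping model sketch: the complex shear modulus is represented by a
# series of mini-oscillator terms, G(s) = G_inf * (1 + sum_k a_k * s/(s+b_k)).
# G_inf, a_k and b_k below are illustrative assumptions, not fitted constants.

G_inf = 0.5e6                  # Pa, equilibrium shear modulus (assumed)
a = np.array([1.2, 2.5])       # dimensionless Biot constants (assumed)
b = np.array([500.0, 8000.0])  # rad/s relaxation parameters (assumed)

def shear_modulus(omega):
    """Complex shear modulus G(i*omega) under the mini-oscillator form."""
    s = 1j * omega
    return G_inf * (1 + np.sum(a * s / (s + b)))

for f in (10, 100, 1000):
    G = shear_modulus(2 * np.pi * f)
    eta = G.imag / G.real       # material loss factor at this frequency
    print(f"{f:5d} Hz: |G| = {abs(G)/1e6:.3f} MPa, loss factor = {eta:.3f}")
```

A curve-fitting step, as the abstract describes, would determine a separate (aₖ, bₖ) set per material and per temperature.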
Abstract:
One-dimensional magnetic photonic crystals (1D-MPC) are promising structures for integrated optical isolator applications. Rare-earth-substituted garnet thin films with the proper Faraday rotation are required to fabricate planar 1D-MPCs. In this thesis, a flat-top-response 1D-MPC was proposed, and its spectral response and Faraday rotation were modeled. Bismuth-substituted iron garnet films were fabricated by RF magnetron sputtering, and their structure, composition, birefringence and magnetooptical properties were studied. Double-layer structures for single-mode propagation were also fabricated by sputtering for the first time. Multilayer stacks with multiple defects (phase shifts) composed of Ce-YIG and GGG quarter-wave plates were simulated by the transfer matrix method. The transmission and Faraday rotation characteristics were studied theoretically. It is found that a flat-top response, with 100% transmission and near 45° rotation, is achievable by adjusting the inter-defect spacing, for film structures as thin as 30 to 35 μm. This is a better than three-fold reduction in length compared to the best Ce-YIG films for comparable rotations, and thus allows a considerable reduction in the size of manufactured optical isolators. Transmission bands as wide as 7 nm were predicted, a considerable improvement over the two-defect structure. The effect of the repetition number and ratio factor on the transmission and Faraday rotation ripple factors for the three- and four-defect structures has been discussed. Diffraction across the structure corresponds to a longer optical path length; thus the use of guided optics is required to minimize insertion losses in integrated devices. This part is discussed in chapter 2 of this thesis. Bismuth-substituted iron garnet thin films were prepared by RF magnetron sputtering. We investigated the optimization of the deposition parameters and measured the crystallinity, surface morphology, composition, and magnetic and magnetooptical properties.
A garnet film of very high crystalline quality with a smooth surface has been grown heteroepitaxially on a (111) GGG substrate for films thinner than 1 μm. Dual-layer structures with two distinct XRD peaks (within a single sputtered film) start to develop when films exceed this thickness. The development of the dual-layer structure was explained by a compositional gradient across the film thickness, rather than by the strain gradient proposed by other authors. A lower DC self-bias or a higher substrate temperature is found to delay the appearance of the second layer. The deposited films show in-plane magnetization, which is advantageous for waveguide device applications. The propagation losses of the fabricated waveguides can be decreased from 25 dB/cm to 10 dB/cm by annealing in an oxygen atmosphere. The Faraday rotation at λ = 1.55 μm was also measured for the waveguides; it is small (10° for a 3 mm long waveguide) due to the presence of linear birefringence. This part is covered in chapter 4. We also investigated the elimination of linear birefringence by a thickness-tuning method for our sputtered films. We examined compressively and tensilely strained films and analyzed the photoelastic response of the sputter-deposited garnet films. It has been found that the net birefringence can be eliminated under planar compressive strain conditions by sputtering. A bilayer of GGG on garnet thin film yields a reduced birefringence. Temperature control during the sputter deposition of the GGG cover layer is critical and strongly influences the magnetization and birefringence level in the waveguide. High-temperature deposition lowers the magnetization and increases the linear birefringence in the garnet films. Double-layer single-mode structures fabricated by sputtering were also studied. The double layer, which shows an in-plane magnetization, has an increased RMS roughness upon upper-layer deposition. The single-mode characteristic was confirmed by prism coupler measurements. This part is discussed in chapter 5.
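The transfer-matrix simulation of quarter-wave multilayer stacks mentioned above can be sketched for the scalar (transmission-only) case at normal incidence; modeling the Faraday rotation would require 4×4 matrices for the magnetooptic layers. The refractive indices below (Ce-YIG ≈ 2.2, GGG ≈ 1.95) are illustrative assumptions:

```python
import numpy as np

# Scalar transfer-matrix sketch for a dielectric multilayer at normal
# incidence. Each layer contributes a 2x2 characteristic matrix; the product
# over the stack gives the overall transmission.

def layer_matrix(n, d, lam):
    """Characteristic matrix of one homogeneous layer of index n, thickness d."""
    delta = 2 * np.pi * n * d / lam            # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, lam, n_in=1.0, n_out=1.0):
    """Power transmission of a stack given as a list of (index, thickness)."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, lam)
    t = 2 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                    + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

lam0 = 1.55e-6                                 # design wavelength (m)
n_hi, n_lo = 2.2, 1.95                         # Ce-YIG / GGG indices (assumed)
qw = lambda n: (n, lam0 / (4 * n))             # quarter-wave layer of index n

# 8-period quarter-wave Bragg stack: transmission is reduced at lam0.
mirror = [qw(n) for _ in range(8) for n in (n_hi, n_lo)]
print(f"T(stack, lam0) = {transmittance(mirror, lam0):.4f}")
```

Inserting "defect" (phase-shift) layers into such a stack, as the thesis does, opens narrow transmission windows within the stop band.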
Abstract:
Given the complex structure of the brain, how can synaptic plasticity explain the learning and forgetting of associations when these are continuously changing? We address this question by studying different reinforcement learning rules in a multilayer network in order to reproduce monkey behavior in a visuomotor association task. Our model can only reproduce the learning performance of the monkey if the synaptic modifications depend on the pre- and postsynaptic activity, and if the intrinsic level of stochasticity is low. This favored learning rule is based on reward modulated Hebbian synaptic plasticity and shows the interesting feature that the learning performance does not substantially degrade when adding layers to the network, even for a complex problem.
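A reward-modulated Hebbian update of the general kind favored by the model can be sketched on a toy visuomotor association task; the rule, network size and parameters below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

# Toy visuomotor association: map 4 stimulus patterns to 2 actions with a
# reward-modulated Hebbian rule, dW = eta * reward * (post - mean) * pre.
# This is a generic sketch (assumed form), with low intrinsic stochasticity.

rng = np.random.default_rng(0)
n_stim, n_act = 4, 2
targets = np.array([0, 1, 0, 1])           # desired action per stimulus
W = np.zeros((n_act, n_stim))
eta, noise = 0.5, 0.1                      # learning rate, low noise level

def trial(s):
    x = np.zeros(n_stim); x[s] = 1.0       # one-hot presynaptic activity
    drive = W @ x + noise * rng.standard_normal(n_act)
    a = int(np.argmax(drive))              # chosen action (postsynaptic winner)
    r = 1.0 if a == targets[s] else -1.0   # binary reward signal
    post = np.zeros(n_act); post[a] = 1.0
    W[:] += eta * r * np.outer(post - post.mean(), x)  # Hebbian, reward-gated
    return r > 0

# Learning curve: fraction of correct trials per block of 20
for block in range(5):
    correct = sum(trial(rng.integers(n_stim)) for _ in range(20))
    print(f"block {block}: {correct}/20 correct")
```

Because the update pushes the rewarded action up and the alternative down for each stimulus, performance rises quickly once the weight margins exceed the noise level.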
Abstract:
Highly reflective materials in the microwave region play a very important role in the realization of antenna reflectors for a broad range of applications, including radiometry. These reflectors have a characteristic emissivity which needs to be characterized accurately in order to perform a correct radiometric calibration of the instrument. Such a characterization can be performed by using open resonators, waveguide cavities, or radiometric measurements. The latter consists of comparative radiometric observations of absorbers, reference mirrors and the sample under test, or uses the cold-sky radiation as a direct reference source. While the first two techniques are suitable for the characterization of metal plates and mirrors, the latter has the advantage of also being applicable to soft materials. This paper describes how, through these radiometric techniques, it is possible to characterize the emissivity of the sample relative to a reference mirror, and how to characterize the absolute emissivity of the latter by performing measurements at different incidence angles. The results presented in this paper are based on our investigations of the emissivity of a multilayer insulation (MLI) material for space missions, at frequencies of 22 and 90 GHz.
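The comparative cold-sky technique reduces to a simple brightness-temperature balance: the observed brightness mixes the sample's physical temperature with the reflected sky temperature, weighted by the emissivity. A sketch, with all temperatures and emissivities invented for illustration:

```python
# Radiometric emissivity sketch: for a sample viewed against the cold sky,
#   T_B = e * T_phys + (1 - e) * T_sky
# so comparing the sample with a reference mirror at the same physical
# temperature yields their emissivity difference. Values below are assumed.

def brightness_temperature(emissivity, t_phys, t_sky):
    """Observed brightness temperature of a reflective sample."""
    return emissivity * t_phys + (1 - emissivity) * t_sky

def emissivity_relative(t_sample, t_mirror, t_phys, t_sky):
    """Emissivity difference (sample - mirror) from the two observations."""
    return (t_sample - t_mirror) / (t_phys - t_sky)

t_phys, t_sky = 295.0, 10.0   # K, ambient and cold-sky temperatures (assumed)
t_mirror = brightness_temperature(0.005, t_phys, t_sky)  # reference mirror
t_mli = brightness_temperature(0.020, t_phys, t_sky)     # MLI sample
delta_e = emissivity_relative(t_mli, t_mirror, t_phys, t_sky)
print(f"sample - mirror emissivity difference: {delta_e:.4f}")
```

Repeating the mirror observation at several incidence angles, as the paper describes, lets its absolute emissivity be separated from the relative measurement.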
Abstract:
It is well known that gases adsorb on many surfaces, in particular metal surfaces. There are two main forms responsible for these effects: (i) physisorption and (ii) chemisorption. Physisorption is associated with lower binding energies, on the order of 1–10 kJ mol⁻¹, compared to chemisorption, which ranges from 100 to 1000 kJ mol⁻¹. Furthermore, chemisorption only forms monolayers, in contrast to physisorption, which can form multilayer adsorption. The reverse process is called desorption and follows similar mathematical laws; however, it can be influenced by hysteresis effects. In the present experiment, we investigated the adsorption/desorption phenomena on three steel and three aluminium cylinders containing compressed air, in our laboratory and under controlled conditions in a climate chamber, respectively. Our observations from completely decanting one steel and two aluminium cylinders are in agreement with the pressure dependence of physisorption for CO₂, CH₄, and H₂O. The CO₂ results for both cylinder types are in excellent agreement with the pressure dependence of a monolayer adsorption model. However, mole fraction changes due to adsorption on aluminium (< 0.05 and 0 ppm for CO₂ and H₂O) were significantly lower than on steel (< 0.41 ppm and about < 2.5 ppm, respectively). The amount of CO₂ adsorbed (5.8 × 10¹⁹ molecules) corresponds to about five times monolayer coverage, indicating that the effective surface exposed for adsorption is significantly larger than the geometric surface area. Adsorption/desorption effects were minimal for CH₄ and CO but require further attention, since they were only studied on one aluminium cylinder with a very low mole fraction. In the climate chamber, the cylinders were exposed to temperatures between −10 and +50 °C to determine the corresponding temperature coefficients of adsorption.
Again, we found distinctly different values for CO₂, ranging from 0.0014 to 0.0184 ppm °C⁻¹ for steel cylinders and −0.0002 to −0.0003 ppm °C⁻¹ for aluminium cylinders. The reversed temperature dependence for aluminium cylinders points to significantly lower desorption energies than for steel cylinders, and due to the small values they might at least partly be influenced by temperature, permeation from/to sealing materials, and gas-consumption-induced pressure changes. Temperature coefficients for CH₄, CO, and H₂O adsorption were, within their error bands, insignificant. These results indicate the need for careful selection and usage of gas cylinders for high-precision calibration purposes, such as required in trace gas applications.
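The pressure dependence of a monolayer (Langmuir-type) adsorption model, and the resulting mole-fraction enrichment while a cylinder is decanted, can be sketched as follows. The site count and affinity constant are illustrative assumptions, not the study's fitted values:

```python
# Langmuir (monolayer) adsorption sketch: surface coverage vs. pressure, and
# the CO2 mole-fraction enrichment seen when a cylinder is decanted. As the
# pressure drops, adsorbed molecules desorb into a shrinking gas reservoir.
# n_sites and K are assumed values for illustration only.

R = 8.314        # J mol-1 K-1, gas constant
T = 293.15       # K, cylinder temperature
V = 0.05         # m^3, 50 L cylinder volume
n_sites = 1e-5   # mol of wall adsorption sites (assumed)
K = 5e-8         # Pa-1, Langmuir affinity constant (assumed)

def coverage(p):
    """Fractional occupancy of surface sites at pressure p (Langmuir isotherm)."""
    return K * p / (1 + K * p)

def co2_enrichment_ppm(p_start, p_end):
    """Mole-fraction increase after decanting from p_start to p_end:
    moles desorbed divided by moles of gas remaining."""
    desorbed = n_sites * (coverage(p_start) - coverage(p_end))
    n_gas = p_end * V / (R * T)
    return desorbed / n_gas * 1e6

for p_end in (100e5, 50e5, 10e5, 2e5):
    print(f"{p_end/1e5:5.0f} bar: +{co2_enrichment_ppm(200e5, p_end):.3f} ppm")
```

The sharp rise at low residual pressure is why the final decanting stages dominate the observed mole-fraction change.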
Abstract:
The aim is to obtain computationally more powerful, neurophysiologically founded artificial neurons and neural nets. Artificial Neural Nets (ANN) of the Perceptron type evolved from the original proposal in the classical paper by McCulloch and Pitts [1]. Essentially, they keep the computing structure of a linear machine followed by a non-linear operation. The McCulloch-Pitts formal neuron (which the authors never considered to be a model of real neurons) consists of the simplest case: a linear combination of the inputs followed by a threshold. Networks of one layer cannot compute any logical function of the inputs, but only those which are linearly separable. Thus, the simple exclusive OR (contrast detector) function of two inputs requires two layers of formal neurons.
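The linear-separability limitation can be made concrete: no single threshold unit realizes XOR, but the two-layer arrangement the text describes does. A minimal sketch with hand-set weights (OR and NAND hidden units feeding an AND):

```python
# Two-layer network of McCulloch-Pitts threshold units computing XOR.
# A single threshold unit cannot realize XOR because the function is not
# linearly separable; composing OR and NAND through an AND unit can.

def threshold_unit(weights, threshold, inputs):
    """Fire (1) iff the weighted sum of the inputs reaches the threshold."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

def xor(x1, x2):
    h_or = threshold_unit([1, 1], 1, [x1, x2])      # fires if x1 + x2 >= 1
    h_nand = threshold_unit([-1, -1], -1, [x1, x2]) # fires unless both are 1
    return threshold_unit([1, 1], 2, [h_or, h_nand])  # AND of the hidden units

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```

The hidden layer carves the input space with two lines; the output unit intersects the two half-planes, which no single line can do.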
Abstract:
We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, measuring vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA involves a shift in the reflectivity curve, related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are turned into a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model are in good correlation with experimental data, reducing the computation time from one day to 15 minutes and thus giving a fast but accurate tool to optimize the design, maximize the sensitivity, and analyze the influence of the different variables (diameter, height and lattice parameter). The sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both a fabrication (limited by the aspect ratio and proximity of the pillars) and a fluidic point of view. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
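The idea of replacing the pillar lattice by a uniform slab with an effective index can be illustrated crudely. The paper obtains the index with a BPM algorithm; this sketch instead uses a simple volume average of the permittivities (a much coarser effective-medium assumption), with assumed values for the SU8 and cover indices and a square unit cell:

```python
import math

# Crude effective-index sketch for the 2D simplified model: the 420 nm pillar
# layer is replaced by a slab whose permittivity is the area-weighted average
# of pillar and cover permittivities. n_su8, n_cover and the square-cell
# approximation are assumptions for illustration; the paper uses BPM instead.

a = 800e-9     # lattice parameter (m)
d = 200e-9     # pillar diameter (m)
n_su8 = 1.58   # SU8 refractive index (assumed)
n_cover = 1.33 # aqueous cover medium index (assumed)

fill = math.pi * (d / 2) ** 2 / a ** 2   # pillar area fraction per unit cell
eps_eff = fill * n_su8 ** 2 + (1 - fill) * n_cover ** 2
n_eff = math.sqrt(eps_eff)
print(f"fill factor = {fill:.3f}, n_eff = {n_eff:.3f}")
```

With a fill factor of only a few percent, the effective index sits just above the cover medium's, which is why small antiBSA-induced index changes shift the reflectivity curve measurably.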
Abstract:
In this work, we report the magnetic properties of sputtered Permalloy (Py: Ni80Fe20)/molybdenum (Mo) multilayer thin films. We show that it is possible to maintain a low coercivity and a high permeability in thick sputtered Py films by reducing the out-of-plane component of the anisotropy through the insertion of thin-film spacers of a non-magnetic material such as Mo. For this kind of multilayer, we have found coercivities close to those of single-layer films with no out-of-plane anisotropy. The coercivity also depends on the number of layers, exhibiting a minimum value when each single Py layer has a thickness close to the transition thickness between Néel and Bloch domain walls.
Abstract:
The six-port network is an interesting radiofrequency architecture with multiple possibilities. Since it was first introduced in the seventies as an alternative network analyzer, the six-port network has been used for many applications, such as homodyne receivers, radar systems, direction-of-arrival estimation, UWB (Ultra-Wide-Band), and MIMO (Multiple Input Multiple Output) systems. Currently, it is considered one of the best candidates to implement a Software Defined Radio (SDR). This thesis comprises an exhaustive study of this promising architecture, including its fundamentals and the state of the art. In addition, the design and development of an SDR 0.3-6 GHz six-port receiver prototype, implemented in conventional technology, is presented. The system is experimentally characterized and validated for RF signal demodulation with good performance. The analysis of the six-port architecture is complemented by a theoretical and experimental comparison with other radiofrequency architectures suitable for SDR. Some novel contributions are introduced in the present thesis. These novelties address highly topical issues in the six-port technique: the development and optimization of real-time I-Q regeneration techniques for multiport networks, and the search for new techniques and technologies contributing to the miniaturization of the six-port architecture. In particular, the novel contributions of this thesis can be summarized as: - Introduction of a new real-time auto-calibration method for multiport receivers, particularly suitable for broadband designs and high-data-rate applications. - Introduction of a new direct baseband I-Q regeneration technique for five-port receivers. - Contribution to the miniaturization of six-port receivers by the use of multilayer LTCC (Low Temperature Cofired Ceramic) technology, with the implementation of a compact (30 × 30 × 1.25 mm) broadband (0.3-6 GHz) six-port receiver in LTCC technology.
The results and conclusions derived from this thesis have been satisfactory, and quite fruitful in terms of publications. A total of fourteen works have been published in international journals and conferences and in national conferences. Additionally, a paper has been submitted to an internationally recognized journal and is currently under review.
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major areas, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognizers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
However, linguistic annotation tools still have some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other lower-level annotation tools and their outputs to generate their own. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance. Otherwise, these errors and inaccuracies would be transferred to (and even magnified in) the annotations of the high-level annotation tool.
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
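Point (i) is often approached by making several tools annotate the same level and combining their outputs. A minimal majority-vote sketch over invented POS-tagger outputs (the tags and sentences are illustrative, not from any particular tagger):

```python
from collections import Counter

# Majority-vote combination of POS-tagger outputs: errors of individual
# lower-level tools can be reduced by letting several tools annotate the same
# tokens and taking, per token, the most frequent tag.

def combine_annotations(taggings):
    """Per-token majority vote over the outputs of several POS taggers."""
    combined = []
    for token_tags in zip(*taggings):
        tag, _count = Counter(token_tags).most_common(1)[0]
        combined.append(tag)
    return combined

# Three hypothetical taggers annotating "the cat chased the mouse"
tagger_a = ["DET", "NOUN", "VERB", "DET", "NOUN"]
tagger_b = ["DET", "NOUN", "NOUN", "DET", "NOUN"]   # one error
tagger_c = ["DET", "ADJ",  "VERB", "DET", "NOUN"]   # a different error
print(combine_annotations([tagger_a, tagger_b, tagger_c]))
```

For the vote to be possible at all, the tools' annotation schemas must first be unified, which is exactly the interoperability problem of point (ii).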
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. There again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should also help solve the aforementioned problems and limitations of linguistic annotation tools.
Thus, to summarise, the main aim of the present work was to somehow combine these separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based
Abstract:
The World Health Organization (WHO) foresees that by the year 2020, Acquired Brain Injury (ABI) will be among the 10 most common causes of disability. Given their physical, sensory, cognitive, emotional and socioeconomic consequences, these injuries dramatically change the lives of patients and their families. New early-intervention techniques and the development of intensive care in the treatment of ABI have notably improved the probability of survival. However, at present, brain injuries have no surgical treatment aimed at restoring the lost functionality; instead, rehabilitation therapies are directed at compensating for the resulting deficits. One of the main objectives of neurorehabilitation is therefore to give the patient the capacity needed to perform the Activities of Daily Living (ADLs) required for an independent life, with those directly involving the Upper Limb (UL) being fundamental, given its great importance in the manipulation of objects. By incorporating new technological solutions into the neurorehabilitation process, the aim is to reach a new paradigm centered on offering personalized, monitored and ubiquitous practice, with continuous assessment of the efficacy and efficiency of the procedures and with the capacity to generate knowledge that drives a break with the current paradigm. The new objectives will consist of minimizing the impact of diseases that affect people's functional capacity, reducing the duration of disability and allowing a more efficient management of resources.
These clinical objectives, of great socioeconomic impact, can only be achieved through a decisive commitment to new technologies, methodologies and algorithms capable of producing the technological breakthrough needed to overcome the barriers that have so far prevented the universal penetration of technology into the field of rehabilitation. The work and results achieved in this thesis are the following: 1. Modeling of ADLs: as a step prior to incorporating technological aids into the rehabilitation process, a first phase of modeling and formalizing the knowledge associated with the execution of the activities performed as part of the therapy is necessary. In particular, the most complex tasks, and at the same time those with the greatest therapeutic impact, are the ADLs, whose formalization will provide healthy movement models to act as a reference for future technological developments aimed at people with ABI. Following a methodology based on UML state-chart diagrams, the ADLs 'pouring water from a jug' and 'picking up a bottle' have been modeled. 2. Ubiquitous monitoring of UL movement: a motion acquisition system based on inertial technology has been designed, developed and validated that overcomes the limitations of current commercial devices (very high cost and inability to work in uncontrolled environments); the high correlation coefficients and low error levels obtained in the co-registrations carried out with the commercial BTS SMART-D system demonstrate the high precision of the system. Exploratory research has also been carried out on a very low-cost motion capture system based on stereoscopic vision, identifying the key points where technological effort is needed for its incorporation into a real environment. 3.
Solving the Inverse Kinematics (IK) problem: a solution to the IK problem for the case in which the manipulator corresponds to a human UL has been designed, developed and validated, studying two possible alternatives, one based on a Multilayer Perceptron (MLP) and the other based on Adaptive Neuro-Fuzzy Inference Systems (ANFIS). The validation, carried out using information from the available ADL models, indicates that a solution based on an MLP with 3 neurons in the input layer, a hidden layer also of 3 neurons, and an output layer with as many neurons as the UL model has Degrees of Freedom (DoFs) provides results, in terms of both precision and computation time, that make it well suited for systems with real-time requirements. 4. Assisted-as-needed intelligent control: an assisted-as-needed control algorithm with anticipatory actuation capabilities has been designed, developed and validated for a robotic orthosis of which a prototype is currently implemented. The results obtained demonstrate how the system is able to adapt to the patient's dysfunctional profile, activating the assistance at instants prior to the occurrence of incorrect movements. This strategy implies an increase in the patient's participation and, therefore, in muscular activity, fostering the brain-plasticity processes responsible for motor relearning or readaptation. 5. Robotic simulators for planning: the use of an assisted-as-needed robotic simulator is proposed as a tool for planning personalized rehabilitation sessions, with a defined clinical objective, in which a robotic orthosis is involved.
The results obtained show how, after running certain simple algorithms, it is possible to automatically select a configuration for the assisted-as-needed control algorithm that makes the orthosis adapt to the clinically established criteria for the patient under study. These results invite further work on more advanced parameter-selection algorithms based on batteries of simulations. This work has corroborated the research hypotheses stated at the outset and has also opened new lines of research. Summary: The World Health Organization (WHO) predicts that by the year 2020, Acquired Brain Injury (ABI) will be among the ten most common causes of disability. These injuries dramatically change the life of the patients and their families due to their physical, sensory, cognitive, emotional and socio-economic consequences. New techniques of early intervention and the development of intensive ABI care have noticeably improved the survival rate. However, in spite of these advances, brain injuries still have no surgical or pharmacological treatment to re-establish the lost functions. Neurorehabilitation therapies address this problem by restoring, minimizing or compensating the functional alterations in a person disabled because of a nervous system injury. One of the main objectives of Neurorehabilitation is to provide patients with the capacity to perform specific Activities of Daily Living (ADL) required for an independent life, especially those in which the Upper Limb (UL) is directly involved due to its great importance in manipulating objects within the patients' environment.
The incorporation of new technological aids into the neurorehabilitation process aims to reach a new paradigm focused on offering a personalized, monitored and ubiquitous practice with continuous assessment of both the efficacy and the efficiency of the procedures and with the capacity of generating new knowledge. New targets will be to minimize the impact of the illnesses affecting the functional capabilities of the subjects, to decrease the duration of the physical handicap and to allow more efficient resource handling. These targets, of great socio-economic impact, can only be achieved by means of new technologies and algorithms able to bring about the technological breakthrough needed to overcome the barriers that are preventing the universal penetration of technology in the field of rehabilitation. In this way, this PhD Thesis has achieved the following results: 1. ADL modelling: as a step prior to the incorporation of technological aids into the neurorehabilitation process, a first phase of modelling and formalizing the knowledge associated with the execution of the activities performed as part of the therapy is necessary. In particular, the most complex and therapeutically relevant tasks are the ADLs, whose formalization produces healthy motion models to be used as a reference for future technological developments. Following a methodology based on UML state-chart diagrams, the ADLs 'serving water from a jar' and 'picking up a bottle' have been modelled. 2. Ubiquitous monitoring of UL movement: a motion acquisition system based on inertial technology has been designed, developed and validated that overcomes the limitations of current devices (high monetary cost and inability to work within uncontrolled environments); the high correlation coefficients and the low error levels obtained throughout several co-registration sessions with the commercial system BTS SMART-D show the high precision of the system.
Besides, an exploration of a very low cost stereoscopic vision-based motion capture system has been carried out, and the key points where further technological work is needed have been identified. 3. Inverse Kinematics (IK) problem solving: a solution to the IK problem has been proposed for a manipulator that corresponds to a human UL. This solution has been approached by means of two different alternatives, one based on a Multilayer Perceptron (MLP) and another based on Artificial Neuro-Fuzzy Inference Systems (ANFIS). The validation of these solutions, carried out using the information from the previously generated motion models, indicates that an MLP-based solution, with an architecture consisting of 3 neurons in the input layer, one hidden layer of 3 neurons and an output layer with as many neurons as the number of Degrees of Freedom (DoFs) of the UL model, provides the best results both in terms of precision and processing time, making it suitable for integration within a system with real-time restrictions. 4. Assisted-as-needed intelligent control: an assisted-as-needed control algorithm with anticipatory actuation capabilities has been designed, developed and validated for a robotic orthosis of which a prototype has already been implemented. The results obtained demonstrate that the control system is able to adapt to the dysfunctional profile of the patient by triggering the assistance just before an incorrect movement takes place. This strategy implies an increase in the participation of the patients and in their muscle activity, encouraging the neural plasticity processes in charge of motor learning. 5. Planning with a robotic simulator: in this work a robotic simulator is proposed as a planning tool for personalized rehabilitation sessions under a certain clinical criterion.
The results obtained indicate that, after the execution of simple parameter selection algorithms, it is possible to automatically choose a specific configuration that makes the assisted-as-needed control algorithm adapt both to the clinical criteria and to the patient. These results invite researchers to work on the development of more complex parameter selection algorithms based on batteries of simulations. The results obtained have been useful to corroborate the hypotheses set out at the beginning of this PhD Thesis. Besides, they have allowed the opening of new research lines in all the studied application fields.
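The 3-3-n MLP architecture described above for inverse kinematics can be sketched as a simple forward pass. This is an illustrative reconstruction, not the thesis code: the weights are untrained, the tanh hidden activation is assumed, and the choice of a 7-DoF upper-limb model is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_hid, n_dof = 3, 3, 7   # 3-3-n architecture; 7 DoFs is an assumption

# Untrained illustrative weights; in the thesis the net would be trained
# on the previously generated healthy motion models of the ADLs.
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_hid, n_dof))
b2 = np.zeros(n_dof)

def ik_mlp(target_xyz):
    """Map a 3-D end-effector target to n_dof joint-angle estimates."""
    h = np.tanh(target_xyz @ W1 + b1)   # hidden layer (activation assumed)
    return h @ W2 + b2                  # one output neuron per DoF

angles = ik_mlp(np.array([0.30, 0.10, 0.25]))  # a hypothetical target point
```

A single forward pass like this is only a handful of small matrix products, which is consistent with the summary's point that the MLP solution is cheap enough for real-time use.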
Resumo:
Monolithic series connection of silicon thin-film solar cell modules performed by laser scribing plays a very important role in the production of these devices. In the current laser interconnection process, the last two steps are developed for a module configuration in which glass is essential as the transparent substrate. In addition, a change of wavelength in the laser sources employed is sometimes enforced by the nature of the different materials of the multilayer structure that makes up the device. The aim of this work is to characterize the laser patterning involved in the monolithic interconnection process in processing configurations different from those usually performed with visible laser sources. To carry out this study, we use nanosecond and picosecond laser sources working at a wavelength of 355 nm in order to achieve selective ablation of the material from the film side. EDX (energy-dispersive X-ray) analysis has been used to assess this selective removal of material.
Resumo:
This paper presents some ideas about a new neural network architecture that can be compared to a Taylor analysis when dealing with patterns. Such an architecture is based on linear activation functions with an axo-axonic architecture. A biological axo-axonic connection between two neurons is defined such that the weight of a connection is given by the output of a third neuron. This idea can be implemented in the so-called Enhanced Neural Networks, in which two Multilayer Perceptrons are used; the first one outputs the weights that the second MLP uses to compute the desired output. This kind of neural network has universal approximation properties even with linear activation functions. There exists a clear difference between cooperative and competitive strategies. The former are based on swarm colonies, in which all individuals share their knowledge about the goal in order to pass such information to other individuals and reach an optimum solution. The latter are based on genetic models, that is, individuals can die and new individuals are created by combining information from living ones; or are based on molecular/cellular behaviour, passing information from one structure to another. A swarm-based model is applied to obtain the Neural Network, training the net with a Particle Swarm algorithm.
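The axo-axonic Enhanced Neural Network described above (a first net emitting the weights of a second, purely linear net), trained with a minimal Particle Swarm loop, can be sketched as follows. All layer sizes, PSO coefficients and the toy target are illustrative assumptions rather than the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Main (second) MLP: n_in -> n_hid -> n_out, purely linear activations.
n_in, n_hid, n_out = 1, 3, 1
n_w = n_in * n_hid + n_hid * n_out   # number of weights the first net emits
dim = (n_in + 1) * n_w               # parameters of the first net (W1, b1)

def enn_forward(params, X):
    """Axo-axonic pass: the first net maps x to the second net's weights."""
    W1 = params[: n_in * n_w].reshape(n_in, n_w)
    b1 = params[n_in * n_w :]
    Y = np.empty((len(X), n_out))
    for i, x in enumerate(X):
        w = x @ W1 + b1                             # input-dependent weights
        Wa = w[: n_in * n_hid].reshape(n_in, n_hid)
        Wb = w[n_in * n_hid :].reshape(n_hid, n_out)
        Y[i] = (x @ Wa) @ Wb                        # linear main net
    return Y

# Toy target y = x^2: a fixed linear net cannot fit it, but the ENN's
# input-dependent weights yield polynomial (Taylor-like) behaviour.
X = np.linspace(-1.0, 1.0, 21).reshape(-1, 1)
T = X ** 2

def mse(params):
    return float(np.mean((enn_forward(params, X) - T) ** 2))

# Minimal particle swarm optimization over the first net's parameters.
n_particles, iters = 30, 200
pos = rng.normal(scale=0.5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
f0 = float(pbest_f.min())                           # starting best fitness

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()
```

With these sizes the effective output is a cubic polynomial in the input (each effective weight is itself linear in x), which is why the architecture behaves like a truncated Taylor expansion even though every activation function is linear.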