916 results for Measure of adaptability


Relevance: 90.00%

Abstract:

Photothermal spectroscopy is a group of high-sensitivity methods used to measure the optical absorption and thermal characteristics of a sample. The basis of photothermal spectroscopy is a photo-induced change in the thermal state of the sample. Light energy that is absorbed and not lost by subsequent emission results in sample heating. This heating produces a temperature change as well as changes in those thermodynamic parameters of the sample that are related to temperature. Measurements of the temperature, pressure, or density changes that occur due to optical absorption are ultimately the basis of the photothermal spectroscopic methods. This is a more direct measure of optical absorption than optical transmission-based spectroscopies. Sample heating is a direct consequence of optical absorption, so photothermal spectroscopy signals depend directly on light absorption. Scattering and reflection losses do not produce photothermal signals. Consequently, photothermal spectroscopy measures optical absorption more accurately in scattering solutions, in solids, and at interfaces. This aspect makes it particularly attractive for surface and solid absorption studies and for studies in scattering media.
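As a point of reference (standard textbook relations, not results quoted from this work), the photothermal signal chain can be sketched by the Beer-Lambert absorbed power and the first-order temperature rise it produces in a sample of mass $m$ and specific heat capacity $C_p$, neglecting heat losses:

$$P_{\mathrm{abs}} = P_0\left(1 - e^{-\alpha \ell}\right), \qquad \Delta T(t) \approx \frac{P_{\mathrm{abs}}\, t}{m\, C_p},$$

where $P_0$ is the incident optical power, $\alpha$ the absorption coefficient and $\ell$ the optical path length. Because $\Delta T$ depends only on the absorbed power, scattering and reflection losses do not enter the signal.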

Relevance: 90.00%

Abstract:

The overall focus of the thesis is the systematics, germplasm evaluation and pattern of distribution and abundance of the freshwater fishes of Kerala (India). Biodiversity is the measure of the variety of life. With the signing of the Convention on Biological Diversity, countries have acquired absolute rights and the responsibility to conserve and utilize their diverse resources for the betterment of mankind in a sustainable way. South-east Asia, along with Africa and South America, is considered to be among the most biodiversity-rich areas in the world. The tremendous potential associated with the sustainable utilization of the fish germplasm resources of the various river systems of Kerala for food, aquaculture and ornamental purposes has to be fully tapped for the economic upliftment of the fishing community and for the equitable sharing of benefits among mankind, without compromising the conservation of rare and unique fish germplasm resources for future generations. The study was carried out from April 2000 to December 2004, during which 25 major river systems of Kerala were surveyed to delineate the pattern of distribution and abundance of fishes, both seasonally and geographically. The results of the germplasm inventory and evaluation of fish species are presented both for the state as a whole and river-wise. The evaluation of fish species for commercial utilization revealed that, of the 145 species recorded, 76 are ornamental, 47 are food fishes and 22 are cultivable; 21 species are strictly endemic to Kerala rivers. The revalidation of the biodiversity status of the fishes, assessed according to IUCN criteria, is alarming: a high percentage of fishes (59 spp.) belong to the threatened category, comprising 8 critically endangered (CR), 36 endangered (EN) and 15 vulnerable (VU) species. The results of the present study indicate the existence of several new fish species in the streams and rivulets located in remote forest areas; therefore, new dedicated surveys are required to bring to light fish species new to science, new distributional records, etc., for these river systems. The germplasm evaluation also revealed that many potentially valuable endemic ornamental and cultivable fishes exist in Kerala. It is imperative to utilize these species sustainably to improve the aquaculture production and aquarium trade of the country, which would fetch more income and generate employment.

Relevance: 90.00%

Abstract:

The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining the technical specifications decided by the organization. Functional quality refers to the manner in which the service is delivered to the customer, which can be assessed from customer feedback. A field survey was conducted based on the management tool SERVQUAL, by designing 28 constructs under 7 dimensions of service quality. Stratified sampling techniques were used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analyzed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data of base transceiver stations were collected. Statistical and exploratory techniques were used to model the network performance. The failure patterns were modeled as competing-risk models, and the probability distributions of service outages and restorations were parameterized. Since the availability of the network is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up their service-level agreements on availability should be aware of the variability of these elements and of the effects of their interactions. The availability variations were studied by designing a discrete-event simulation model with probabilistic input parameters. The probability distribution parameters obtained from the live data analysis were used to design experiments that define the availability domain of the network under consideration. The availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed which incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of the mobile number portability facility. It also makes possible a relative measure of the effectiveness of different service providers.
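A minimal sketch of the SERVQUAL gap-score idea described above (hypothetical responses and dimension names, not the survey data used in the thesis): the gap for each dimension is the mean perception score minus the mean expectation score, and the most negative gap marks the weakest dimension.

```python
# Minimal sketch (hypothetical data): SERVQUAL gap = perception - expectation per dimension.
import statistics

# responses[dimension] -> list of (expectation, perception) pairs on a 1-7 Likert scale
responses = {
    "reliability":    [(6, 5), (7, 5), (6, 6)],
    "responsiveness": [(6, 4), (5, 4), (7, 5)],
    "assurance":      [(6, 6), (6, 5), (5, 5)],
}

def gap_scores(responses):
    """Return the mean perception-minus-expectation gap for each dimension."""
    return {
        dim: statistics.mean(p - e for e, p in pairs)
        for dim, pairs in responses.items()
    }

scores = gap_scores(responses)
weakest = min(scores, key=scores.get)   # most negative gap = weakest dimension
print(scores, "weakest:", weakest)
```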

Relevance: 90.00%

Abstract:

The present work emphasizes the use of chirality as an efficient tool to synthesize new types of second-order nonlinear materials. Second-harmonic generation (SHG) efficiency is used as the measure of the second-order nonlinear response. The nonlinear optical properties of the polymers have been studied both theoretically and experimentally. The polymers were designed theoretically by ab initio and semiempirical calculations, and all the polymeric systems were synthesized by condensation polymerization. The second-harmonic generation efficiency of the synthesized systems was measured experimentally by the Kurtz-Perry powder method.
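For context (a standard textbook relation rather than a result of this work), the second-order response probed by SHG comes from the quadratic term of the induced polarization,

$$P(t) = \varepsilon_0\left[\chi^{(1)}E(t) + \chi^{(2)}E^{2}(t) + \chi^{(3)}E^{3}(t) + \cdots\right],$$

since for $E(t) = E_0\cos\omega t$ the $\chi^{(2)}E^{2}$ term contains a component oscillating at $2\omega$. Because $\chi^{(2)}$ vanishes in centrosymmetric media, building chirality (and hence noncentrosymmetry) into the polymers is what enables a nonzero SHG response.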

Relevance: 90.00%

Abstract:

A total of eighty-one Escherichia coli isolates belonging to forty-three different serotypes, including several pathogenic strains such as enterotoxigenic E. coli (ETEC), enterohaemorrhagic E. coli (EHEC), enteropathogenic E. coli (EPEC) and uropathogenic E. coli (UPEC), isolated from the Cochin estuary between November 2001 and October 2002, were tested against twelve antibiotics to determine the prevalence of multiple antibiotic resistance (MAR) and to use the antimicrobial resistance profiles as a measure of high-risk sources of contamination. The results revealed that more than 95% of the isolates were multiple antibiotic resistant (resistant to more than three antibiotics). The MAR indexing of the isolates showed that all these strains originated from high-risk sources of contamination. The incidence of multiple antibiotic resistant E. coli, especially the pathogenic strains, in natural waters poses a serious threat to the human population.
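MAR indexing, as conventionally defined, is the ratio of the number of antibiotics to which an isolate is resistant to the number of antibiotics tested; the sketch below is an illustrative implementation of that conventional index, not the study's own code or data.

```python
# Illustrative sketch of conventional MAR indexing (not the study's code or data).
def mar_index(resistant_to: int, antibiotics_tested: int) -> float:
    """MAR index = antibiotics the isolate is resistant to / antibiotics tested."""
    return resistant_to / antibiotics_tested

# Example: an isolate resistant to 5 of the 12 antibiotics in the test panel.
print(round(mar_index(5, 12), 2))   # 0.42; values above ~0.2 are usually read as
                                    # indicating a high-risk source of contamination
```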

Relevance: 90.00%

Abstract:

Several centrality measures have been introduced and studied for real-world networks. They capture different vertex characteristics that permit the vertices to be ranked in order of importance in the network. Betweenness centrality is a measure of the influence of a vertex over the flow of information between every pair of vertices, under the assumption that information primarily flows over the shortest paths between them. In this paper we present the betweenness centrality of some important classes of graphs.
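For reference, the standard (Freeman) definition underlying this measure is

$$C_B(v) = \sum_{s \neq v \neq t} \frac{\sigma_{st}(v)}{\sigma_{st}},$$

where $\sigma_{st}$ is the number of shortest paths between vertices $s$ and $t$, and $\sigma_{st}(v)$ is the number of those paths that pass through $v$.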

Relevance: 90.00%

Abstract:

In this article, we study reliability measures such as the geometric vitality function and the conditional Shannon measures of uncertainty, proposed by Ebrahimi (1996) and Sankaran and Gupta (1999), respectively, for doubly (interval) truncated random variables. In survival analysis and reliability engineering, these measures play a significant role in studying the various characteristics of a system or component when it fails between two time points. The interrelationships among these uncertainty measures are derived for various distributions, and characterization theorems arising out of them are proved.
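As a point of reference (a commonly used form; the paper's own notation may differ), the Shannon uncertainty of a lifetime $X$ with density $f$ and distribution function $F$, restricted to the interval $(t_1, t_2)$, can be written as

$$H(t_1, t_2) = -\int_{t_1}^{t_2} \frac{f(x)}{F(t_2) - F(t_1)} \log\frac{f(x)}{F(t_2) - F(t_1)}\, dx,$$

while the geometric vitality function of the doubly truncated variable satisfies $\log G(t_1, t_2) = E[\log X \mid t_1 < X < t_2]$.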

Relevance: 90.00%

Abstract:

Magnetism and magnetic materials have been playing a lead role in improving the quality of life. They are increasingly being used in a wide variety of applications ranging from compasses to modern technological devices. Metallic glasses occupy an important position among magnetic materials. They assume importance both from a scientific and an application point of view, since they represent an amorphous form of condensed matter with significant deviation from thermodynamic equilibrium. Metallic glasses having good soft magnetic properties are widely used in tape-recorder heads, cores of high-power transformers and metallic shields. Superconducting metallic glasses are being used to produce high magnetic fields and the magnetic levitation effect. Upon heat treatment, they undergo structural relaxation leading to subtle rearrangements of the constituent atoms. This leads to densification of the amorphous phase and subsequent nanocrystallisation. The short-range structural relaxation phenomenon gives rise to significant variations in physical, mechanical and magnetic properties. Magnetic amorphous alloys of Co-Fe exhibit excellent soft magnetic properties, which make them promising candidates for applications as transformer cores, sensors and actuators. With the advent of microminiaturization and nanotechnology, thin-film forms of these alloys are sought after as soft underlayers for perpendicular recording media. The thin-film forms of these alloys can also be used for the fabrication of magnetic micro-electro-mechanical systems (magnetic MEMS). In bulk, they are drawn in the form of ribbons, often by melt spinning. The main constituents of these alloys are Co, Fe, Ni, Si, Mo and B. Mo acts as the grain-growth inhibitor, and Si and B facilitate the amorphous nature of the alloy structure. The ferromagnetic phases such as Co-Fe and Fe-Ni in the alloy composition determine the soft magnetic properties. The grain correlation length, a measure of the grain size, often determines the soft magnetic properties of these alloys. Amorphous alloys can be restructured into their nanocrystalline counterparts by different techniques. The structure of a nanocrystalline material consists of nanosized ferromagnetic crystallites embedded in an amorphous matrix. When the amorphous phase is ferromagnetic, it facilitates exchange coupling between the nanocrystallites. This exchange coupling results in the vanishing of magnetocrystalline anisotropy, which improves the soft magnetic properties. From a fundamental perspective, the exchange correlation length and the grain size are the deciding factors that determine the magnetic properties of these nanocrystalline materials. In thin films, surfaces and interfaces predominantly decide the bulk properties, and hence tailoring the surface roughness and morphology of the film can result in modified magnetic properties. Surface modifications can be achieved by thermal annealing at various temperatures. Ion irradiation is an alternative tool to modify the surface/structural properties. The surface evolution of a thin film under swift heavy ion (SHI) irradiation is the outcome of different competing mechanisms: sputtering induced by SHI, followed by a surface-roughening process, and a material-transport-induced smoothening process.
The impingement of ions with different fluences on the alloy is bound to produce systematic microstructural changes, and this can effectively be used for tailoring magnetic parameters, namely the coercivity, saturation magnetization, magnetic permeability and remanence of these materials. Swift heavy ion irradiation is a novel and ingenious tool for surface modification which eventually leads to changes in the bulk as well as the surface magnetic properties. SHI has been widely used as a method for the creation of latent tracks in thin films. The bombardment of SHI modifies the surfaces or interfaces or creates defects, which induce strain in the film. These changes have a profound influence on the magnetic anisotropy and the magnetisation of the specimen. Thus, inducing structural and morphological changes by thermal annealing and swift heavy ion irradiation, which in turn induce changes in the magnetic properties of these alloys, is one of the motivations of this study. Multiferroics and magnetoelectrics are a class of functional materials with wide application potential and are of great interest to materials scientists and engineers. Magnetoelectric materials combine both magnetic and ferroelectric properties in a single specimen. The dielectric properties of such materials can be controlled by the application of an external magnetic field, and the magnetic properties by an electric field. Composites with magnetic and piezo/ferroelectric individual phases are found to have a strong magnetoelectric (ME) response at room temperature and hence are preferred to single-phase multiferroic materials. Current research in this class of materials is directed towards optimization of the ME coupling by tailoring the piezoelectric and magnetostrictive properties of the two individual components of the ME composites. The magnetoelectric coupling constant (MECC), α_ME, is the parameter that decides the extent of interdependence of the magnetic and electric responses of the composite structure. Extensive investigations have been carried out on bulk composites exhibiting giant ME coupling. These materials are fabricated either by gluing the individual components to each other or by mixing the magnetic material into a piezoelectric matrix. The most extensively investigated material combinations use Lead Zirconate Titanate (PZT) or Lead Magnesium Niobate-Lead Titanate (PMN-PT) as the piezoelectric phase and Terfenol-D as the magnetostrictive phase, and the coupling is measured in different configurations such as transverse, longitudinal and in-plane longitudinal. The fabrication of a lead-free multiferroic composite with a strong ME response is the need of the hour from a device application point of view. The multilayer structure is expected to be far superior to bulk composites in terms of ME coupling, since the piezoelectric (PE) layer can easily be poled electrically to enhance the piezoelectricity and hence the ME effect. The giant magnetostriction reported in Co-Fe thin films makes them an ideal candidate for the ferromagnetic component, and BaTiO3, a well-known ferroelectric material with good piezoelectric properties, is chosen as the ferroelectric component. The multilayer structure of BaTiO3-CoFe-BaTiO3 is an ideal system to understand the underlying fundamental physics behind the ME coupling mechanism. A giant magnetoelectric coupling coefficient is anticipated for these multilayer structures of BaTiO3-CoFe-BaTiO3, which makes them ideal candidates for cantilever applications in magnetic MEMS/NEMS devices.
SrTiO3 is an incipient ferroelectric material which remains paraelectric down to 0 K in its pure, unstressed form. Recently, a few studies have shown that ferroelectricity can be induced by the application of stress or by chemical/isotopic substitution. The search for room-temperature magnetoelectric coupling in SrTiO3-CoFe-SrTiO3 multilayer structures is therefore of fundamental interest. Yet another motivation of the present work is to fabricate multilayer structures consisting of CoFe/BaTiO3 and CoFe/SrTiO3 for possible giant ME coupling coefficient (MECC) values. These are lead-free and hence promising candidates for MEMS applications. The elucidation of the mechanism behind the giant MECC will also be part of the objective of this investigation.
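For reference (the conventional definition used for such composites; the exact measurement configuration in the thesis may differ), the ME voltage coupling coefficient is obtained from the voltage $\delta V$ developed across a piezoelectric layer of thickness $t$ in response to a small ac magnetic field $\delta H$:

$$\alpha_{ME} = \frac{\partial E}{\partial H} \approx \frac{\delta V}{t\,\delta H},$$

usually quoted in mV cm$^{-1}$ Oe$^{-1}$.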

Relevance: 90.00%

Abstract:

The restarting automaton is a restricted model of computation that was introduced by Jancar et al. to model the so-called analysis by reduction, which is a technique used in linguistics to analyze sentences of natural languages. The most general models of restarting automata make use of auxiliary symbols in their rewrite operations, although this ability does not directly correspond to any aspect of the analysis by reduction. Here we put restrictions on the way in which restarting automata use auxiliary symbols, and we investigate the influence of these restrictions on their expressive power. In fact, we consider two types of restrictions. First, we consider the number of auxiliary symbols in the tape alphabet of a restarting automaton as a measure of its descriptional complexity. Secondly, we consider the number of occurrences of auxiliary symbols on the tape as a dynamic complexity measure. We establish some lower and upper bounds with respect to these complexity measures concerning the ability of restarting automata to recognize the (deterministic) context-free languages and some of their subclasses.

Relevance: 90.00%

Abstract:

The main objective of this ex post facto study is to compare the differences in cognitive functions, and their relation to schizotypal personality traits, between a group of unaffected parents of schizophrenic patients and a control group. A total of 52 unaffected biological parents of schizophrenic patients and 52 unaffected parents of unaffected subjects were assessed with measures of attention (Continuous Performance Test - Identical Pairs version, CPT-IP), memory and verbal learning (California Verbal Learning Test, CVLT), as well as schizotypal personality traits (Oxford-Liverpool Inventory of Feelings and Experiences, O-LIFE). The parents of the patients with schizophrenia differ from the parents of the control group in omission errors on the Continuous Performance Test - Identical Pairs, on a measure of recall and on two contrast measures of the California Verbal Learning Test. The associations between the neuropsychological variables and schizotypal traits are of low magnitude. There is no defined pattern in the relationship between cognitive measures and schizotypal traits.

Relevance: 90.00%

Abstract:

Optimization and harmonization are key factors for good performance in the chemical industry. BASF has developed a project called Accelerator, whose objective has been the harmonization and integration of supply chain processes worldwide. The basic inventory management process was left out of the project and had to be analyzed. The inventory management department at BASF SE has been developing its own strategy for the definition of global manufacturing processes. This work presents a report on the phases of the formulation of that strategy and establishes some guidelines for the implementation phase taking place in 2012 and 2013.

Relevance: 90.00%

Abstract:

This article seeks empirical evidence on the determinants of health, as a measurement of health capital in a developing country after a profound reform of the health sector. It follows the Grossman (1972) model and considers institutional factors in addition to individual and socioeconomic variables. The 1997 and 2000 surveys, in which respondents report subjectively on their health status and their type of affiliation to the health system, were used. The estimation procedure is an ordered probit. The results show an important connection between individual, institutional and socioeconomic variables and health status. The effect of the type of access to the health system puts pressure on health inequities.
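As an illustrative sketch only (simulated data and hypothetical variable names, not the 1997/2000 survey variables; assumes statsmodels >= 0.12), an ordered probit of a self-reported ordinal health status on individual covariates could look like this:

```python
# Hypothetical ordered probit sketch on simulated data (not the study's estimation code).
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 1000
X = pd.DataFrame({
    "age": rng.uniform(18, 80, n),
    "income": rng.lognormal(3, 0.5, n),
    "insured": rng.integers(0, 2, n),   # stand-in for type of affiliation to the health system
})
# Latent health index plus noise, cut into ordered self-reported categories
latent = -0.02 * X["age"] + 0.3 * np.log(X["income"]) + 0.4 * X["insured"] + rng.normal(size=n)
health = pd.cut(latent, bins=[-np.inf, -0.5, 0.5, 1.5, np.inf],
                labels=["poor", "fair", "good", "very good"])

model = OrderedModel(health, X, distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```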

Relevance: 90.00%

Abstract:

The work developed in this thesis presents an in-depth study and provides innovative solutions in the field of recommender systems. The methods used by these systems to produce recommendations, such as Content-Based Filtering (CBF), Collaborative Filtering (CF) and Knowledge-Based Filtering (KBF), require information about the users in order to predict their preferences for certain products. This information may be demographic (gender, age, address, etc.), ratings given to some product they bought in the past, or information about their interests. There are two ways to obtain this information: users provide it explicitly, or the system acquires the implicit information available in users' transactions or search history. For example, the movie recommender system MovieLens (http://movielens.umn.edu/login) asks users to rate at least 15 movies on a scale from * to ***** (awful, ..., must be seen). The system generates recommendations on the basis of these ratings. When users are not registered in the system and it has no information about them, some systems make recommendations taking the browsing history into account. Amazon.com (http://www.amazon.com) makes recommendations based on the searches a user has made, or recommends the best-selling product. However, these systems suffer from a certain lack of information. This problem is generally solved by acquiring additional information: users are asked about their interests, or the information is sought in additional sources. The solution proposed in this thesis is to look for this information in several sources, specifically those that contain implicit information about users' preferences. These sources may be structured, such as databases with purchase information, or unstructured, such as web pages where users leave their opinion about some product they bought or own. We identify three fundamental problems in achieving this objective: 1. The identification of sources with information suitable for recommender systems. 2. The definition of criteria that allow the comparison and selection of the most suitable sources. 3. The retrieval of information from unstructured sources. Accordingly, in this thesis we have developed: 1. A methodology that allows the identification and selection of the most suitable sources. Criteria based on the characteristics of the sources and a trust measure have been used to solve the problem of source identification and selection. 2. A mechanism to retrieve the unstructured user information available on the web. Text-mining techniques and ontologies have been used to extract the information and structure it appropriately so that recommenders can use it. The contributions of the work developed in this doctoral thesis are: 1. Definition of a set of characteristics to classify sources relevant to recommender systems. 2. Development of a source relevance measure computed on the basis of the defined characteristics. 3. Application of a trust measure to obtain the most reliable sources. Trust is defined from the perspective of recommendation improvement: a reliable source is one that allows the recommendations to be improved. 4. Development of an algorithm to select, from a set of possible sources, the most relevant and reliable ones using the measures mentioned in the previous points. 5. Definition of an ontology to structure the information about users' preferences that is available on the Internet. 6. Creation of a mapping process that automatically extracts information about users' preferences available on the web and places this information into the ontology. These contributions make it possible to achieve two important objectives: 1. Improving recommendations by using alternative information sources that are relevant and reliable. 2. Obtaining implicit user information available on the Internet.
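A minimal sketch of the kind of source scoring described above (the characteristic names, weights and combination rule are hypothetical illustrations, not the measures actually defined in the thesis): candidate sources are ranked by a characteristics-based relevance score weighted by a trust estimate.

```python
# Hypothetical illustration of ranking candidate information sources by a
# characteristics-based relevance score weighted by a trust/confidence estimate.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    coverage: float      # fraction of target users/items the source mentions (0-1)
    structure: float     # 1.0 structured database ... 0.0 free-text web pages
    freshness: float     # recency of the information (0-1)
    trust: float         # e.g. observed improvement in recommendation accuracy (0-1)

WEIGHTS = {"coverage": 0.5, "structure": 0.2, "freshness": 0.3}

def relevance(src: Source) -> float:
    """Weighted sum of source characteristics."""
    return (WEIGHTS["coverage"] * src.coverage
            + WEIGHTS["structure"] * src.structure
            + WEIGHTS["freshness"] * src.freshness)

def score(src: Source) -> float:
    """Relevance discounted by how much the source is trusted."""
    return relevance(src) * src.trust

sources = [
    Source("purchase_db", 0.7, 1.0, 0.6, 0.9),
    Source("review_pages", 0.9, 0.1, 0.8, 0.6),
]
for s in sorted(sources, key=score, reverse=True):
    print(f"{s.name}: {score(s):.2f}")
```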

Relevance: 90.00%

Abstract:

The aim of this thesis is to narrow the gap between two different control techniques: continuous control and discrete event systems (DES) control. This gap can be reduced by the study of Hybrid systems, and by interpreting the majority of large-scale systems as Hybrid systems. In particular, when looking deeply into a process, it is often possible to identify interaction between discrete and continuous signals. Hybrid systems are systems that have both continuous and discrete signals. Continuous signals are generally assumed to be continuous and differentiable in time, while discrete signals are neither continuous nor differentiable in time because of their abrupt changes. Continuous signals often represent the measure of natural physical magnitudes such as temperature, pressure, etc. Discrete signals are normally artificial signals produced by human artefacts, such as switched current, voltage, light, etc. Typical processes modelled as Hybrid systems are production systems, chemical processes, or continuous production in which time and continuous measures interact with the transport and stock inventory system. Complex systems such as manufacturing lines are hybrid in a global sense; they can be decomposed into several subsystems and their links. Another motivation for the study of Hybrid systems is the set of tools developed in other research domains. These tools use temporal logic for the analysis of several properties of Hybrid system models, and use it to design systems and controllers that satisfy physical or imposed restrictions. This thesis focuses on particular types of systems with discrete and continuous signals in interaction that can model hard non-linearities, such as hysteresis, jumps in the state, limit cycles, etc., and whose possible non-deterministic future behaviour is expressed by an interpretable model description. The Hybrid systems treated in this work are systems with several discrete states, always fewer than thirty (beyond which the problem can become NP-hard), and continuous dynamics evolving according to an expression with Ki ∈ Rn constant vectors or matrices for the state vector X; in several of the states the continuous evolution can have Ki = 0. In this formulation, the mathematics can express time-invariant linear systems. By using this expression for a local part, several local linear models can be combined to represent non-linear systems, and through the interaction with the discrete events of the system the model can compose non-linear Hybrid systems. Multistage processes with high continuous dynamics in particular are well represented by the proposed methodology. State vectors with more than two components, such as third-order models or higher, are well approximated by the proposed approximation. Flexible belt transmissions, chemical reactions with an initial start-up and mobile robots with significant friction are some physical systems that profit from the benefits (accuracy) of the proposed methodology. The motivation of this thesis is to obtain a solution that can control and drive Hybrid systems from the origin or starting point to the goal. How to obtain this solution, and which is the best solution in terms of a cost function subject to the physical restrictions and control actions, is analysed. Hybrid systems that have several possible states, different ways to drive the system to the goal and different continuous control signals are the problems that motivate this research.
The requirements of the system on which we work are: a model that can represent the behaviour of non-linear systems and that makes possible the prediction of the model's possible future behaviour, in order to apply a supervisor that decides the optimal and safe action to drive the system toward the goal. Specific problems that can be addressed by the use of this kind of hybrid models are: the unity of order; control of the system along a reachable path; control of the system along a safe path; optimisation of the cost function; and modularity of control. The proposed model solves the specified problems concerning model switching, the calculation of initial conditions and the unity of the order of the models. Continuous and discrete phenomena are represented in linear hybrid models, defined by an eight-tuple of parameters to model different types of hybrid phenomena. Applying a transformation over the state vector, for an LTI system we obtain from a two-dimensional state space a single parameter, alpha, which still maintains the dynamical information. Combining this parameter with the system output, a complete description of the system is obtained in the form of a graph in polar representation. A Takagi-Sugeno type-III fuzzy model, which includes a linear time-invariant (LTI) model for each local model, is used; the fuzzification of the different LTI local models results in a non-linear time-invariant model. In our case the output and the alpha measure govern the membership functions. Hybrid systems control is a huge task: the processes need to be guided from the starting point to the desired end point, passing through different specific states and points along the trajectory. The system can be structured in different levels of abstraction, and the control of Hybrid systems in three layers, from planning the process to producing the actions: the planning, process and control layers. In this case the algorithms will be applied to robotics, a domain where improvements are well accepted and where simple repetitive processes are expected, for which the extra effort in complexity can be compensated by some cost reductions. It may also be interesting to apply some control optimisation to processes such as fuel injection, DC-DC converters, etc. In order to apply the RW theory of discrete event systems to a Hybrid system, we must abstract the continuous signals and project the events generated by these signals, to obtain new sets of observable and controllable events. Ramadge and Wonham's theory, along with the TCT software, gives a controllable sublanguage of the legal language generated for a Discrete Event System (DES). Continuous abstraction transforms predicates over continuous variables into controllable or uncontrollable events, and modifies the sets of controllable, uncontrollable, observable and unobservable events. Continuous signals produce virtual events in the system when they cross the bound limits. If such an event is deterministic, it can be projected. It is necessary to determine the controllability of the event in order to assign it to the corresponding set of controllable, uncontrollable, observable or unobservable events. Finding optimal trajectories that minimise some cost function is the goal of the modelling procedure. A mathematical model of the system allows the user to apply mathematical techniques over this expression: to minimise a specific cost function, to obtain optimal controllers and to approximate a specific trajectory.
The combination of Dynamic Programming with Bellman's principle of optimality gives us the procedure to solve the minimum-time trajectory for Hybrid systems. The problem is greater when there is interaction between adjacent states. In Hybrid systems the problem is to determine the partial set points to be applied to the local models. An optimal controller can be implemented in each local model in order to assure the minimisation of the local costs. The solution of this problem has to give us the trajectory the system should follow, a trajectory marked by a set of set points the system is forced to pass through. Several ways are possible to drive the system from the starting point Xi to the end point Xf. Different ways are interesting in terms of the dynamics, the minimum number of states, the approximation at set points, etc. These ways need to be safe, viable and reachable (RchW), and only one of them must be applied, normally the best one, which minimises the proposed cost function. A reachable way, meaning a controllable and safe way, will be evaluated in order to determine which one minimises the cost function. The contribution of this work is a complete framework to work with the majority of Hybrid systems; the procedures to model, control and supervise are defined and explained, and their use is demonstrated. Also explained is the procedure to model the systems to be analysed for automatic verification. Great improvements were obtained by using this methodology in comparison with other piecewise-linear approximations, and it is demonstrated that in particular cases this methodology can provide the best approximation. The most important contribution of this work is the Alpha approximation for non-linear systems with high dynamics. While this kind of process is not typical, in such cases the Alpha approximation is the best linear approximation to use and gives a compact representation.
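A minimal sketch of the kind of hybrid model discussed above: a few discrete modes, each with a (possibly zero) constant continuous dynamics dX/dt = Ki, simulated with forward Euler and a simple guard-based mode switch. The mode names, guards and dynamics here are hypothetical illustrations, not the thesis's case studies.

```python
# Hypothetical sketch: piecewise-constant hybrid dynamics with guard-based switching.
import numpy as np

# Constant continuous dynamics per discrete mode (Ki can be zero in some modes).
K = {
    "fill": np.array([1.0, 0.0]),   # dX/dt while filling
    "heat": np.array([0.0, 2.0]),   # dX/dt while heating
    "idle": np.array([0.0, 0.0]),   # no continuous evolution
}

def guard(mode, X):
    """Return the next discrete mode when a threshold is crossed, else stay."""
    level, temp = X
    if mode == "fill" and level >= 5.0:
        return "heat"
    if mode == "heat" and temp >= 8.0:
        return "idle"
    return mode

def simulate(X0, mode0="fill", dt=0.01, t_end=10.0):
    X, mode, traj = np.array(X0, dtype=float), mode0, []
    for _ in np.arange(0.0, t_end, dt):
        X = X + dt * K[mode]          # forward Euler step on dX/dt = Ki
        mode = guard(mode, X)
        traj.append((mode, X.copy()))
    return traj

trajectory = simulate([0.0, 0.0])
print(trajectory[-1])                  # final mode and state
```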

Relevance: 90.00%

Abstract:

This paper discusses lipreading and the development of a standardized measure of lipreading skill.