947 results for Specifications
Abstract:
This work compares Residue Number System (RNS) based Finite Impulse Response (FIR) digital filters with traditional FIR filters. This research is motivated by the importance of an efficient filter implementation for digital signal processing. The comparison is made in terms of speed and area requirements for various filter specifications. When the number of filter taps exceeds 32, RNS based FIR filters operate more than three times faster and consume only about 60% of the area of the traditional filter. The area of the RNS filter also grows at a lower rate than that of the traditional filter, resulting in lower power consumption. RNS is a non-weighted number system with no carry propagation between different residue digits. This enables simultaneous parallel processing of all the digits, resulting in high-speed addition and multiplication in the RNS domain.
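The speed claim rests on the carry-free property named in the last sentence: each residue digit is computed independently modulo its own base, so all digits of a sum or product can be processed in parallel. A minimal sketch, assuming the illustrative moduli set (3, 5, 7) rather than the thesis's actual moduli:

```python
# RNS arithmetic sketch; moduli must be pairwise coprime.
from math import prod

MODULI = (3, 5, 7)          # dynamic range M = 3*5*7 = 105

def to_rns(x):
    """Encode an integer as a tuple of residues, one per modulus."""
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    """Digit-wise addition: each residue digit is processed
    independently, with no carry propagation between digits."""
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def rns_mul(a, b):
    """Digit-wise multiplication, likewise carry-free."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """Decode via the Chinese Remainder Theorem."""
    M = prod(MODULI)
    total = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        total += ri * Mi * pow(Mi, -1, mi)  # modular inverse of Mi mod mi
    return total % M

a, b = to_rns(17), to_rns(23)
assert from_rns(rns_add(a, b)) == (17 + 23) % 105
assert from_rns(rns_mul(a, b)) == (17 * 23) % 105
```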
Abstract:
The current research investigates the possibility of using unmodified and modified nanokaolin, multiwalled carbon nanotube (MWCNT) and graphene as fillers to impart enhanced mechanical, thermal, and electrical properties to elastomers. Taking advantage of the latex blending method, nanoclay, MWCNT and graphene dispersions prepared by ultrasound sonication are dispersed in polymer latices. The improvement in material properties indicated better interaction between the filler and the polymer. MWCNT and graphene imparted electrical conductivity with a simultaneous improvement in mechanical properties. Layered silicates prepared by the microwave method also significantly improve the mechanical properties of the nanocomposites. The thesis entitled ‘Studies on the use of Nanokaolin, MWCNT and Graphene in NBR and SBR’ consists of ten chapters. The first chapter is a concise introduction to nanocomposites, nanofillers, elastomeric matrices and applications of polymer nanocomposites. The state-of-the-art research in elastomer based nanocomposites is also presented. At the end of this chapter the main objectives of the work are stated. Chapter 2 outlines the specifications of the various materials used and the details of the experimental techniques employed for preparing and characterizing the nanocomposites. Chapter 3 covers the characterization of the nanofillers, the optimisation of the cure time of latex based composites, and the methods used for the preparation of latex based and dry rubber based nanocomposites. Chapter 4 presents the reinforcing effect of the nanofillers in XNBR latex and the characterization of the nanocomposites. Chapter 5 examines the effect of nanofillers on the properties of SBR latex and their characterization. Chapter 6 deals with the cure characteristics, mechanical and thermal properties, and characterization of NBR based nanocomposites. Chapter 7 presents the microwave studies of MWCNT and graphene filled elastomeric nanocomposites. Chapter 8 gives details of the preparation of layered silicates, their characterization and their use in different elastomeric matrices. Chapter 9 studies the mechanical properties of nanoclay incorporated nitrile gloves. Chapter 10 presents the summary and conclusions of the investigation.
Abstract:
Futures trading in commodities has three specific economic functions, viz. price discovery, hedging and reduction in volatility. Natural rubber possesses all the specifications required for futures trading. Commodity futures trading in India gained momentum after the start of national level commodity exchanges in 2003. The success of futures trading depends upon effective price risk management, price discovery and reduced volatility, which in turn depend upon the volume of trading. In the case of the rubber futures market, the volume of trading depends upon the extent of participation by market players like growers, dealers, manufacturers, rubber marketing co-operative societies and Rubber Producer’s Societies (RPS). The extent of participation by market players has a direct bearing on their awareness level and their perception of futures trading. In the light of the above facts and the review of the literature available on the rubber futures market, a study on the rubber futures market is necessary to fill the research gap, with specific focus on (1) the awareness and perception of rubber futures market participants, viz. (i) rubber growers, (ii) dealers, (iii) rubber product manufacturers, and (iv) rubber marketing co-operative societies and Rubber Producer’s Societies (RPS), about futures trading, and (2) whether the rubber futures market is fulfilling the economic functions of a futures market, viz. hedging, reduction in volatility and price discovery. The study is confined to growers, dealers, rubber goods manufacturers, rubber marketing co-operative societies and RPS in Kerala. In order to achieve the stated objectives, the study utilized secondary data for the period from 2003 to 2013 from different published sources like bulletins, newsletters and circulars from NMCE, the Reserve Bank of India (RBI), the Warehousing Corporation and traders. The primary data required for this study were collected from rubber growers, rubber dealers, RPS and rubber marketing co-operative societies, and rubber goods manufacturers in Kerala. Data pertaining to the awareness and perception of futures trading, participation in futures trading, use of spot and futures prices, and sources of price information by dealers, farmers, manufacturers and co-operative societies were also collected. Statistical tools used for the analysis include percentages, standard deviation, the Chi-square test, the Mann-Whitney U test, the Kruskal-Wallis test, the Augmented Dickey-Fuller test statistic, the t-statistic, the Granger causality test, the F-statistic, the Johansen co-integration test, the Trace statistic and the Max-Eigen statistic. The study found that 71.5 per cent of the total hedges are effective and 28.5 per cent are ineffective for the period under study, implying that the futures market in rubber reduced the impact of price risks by approximately 71.5 per cent. Further, it is observed that on 54.4 per cent of occasions the futures market exercised a stabilizing effect on the spot market, and on 45.6 per cent of occasions futures trading exercised a destabilizing effect on the spot market, implying that the elasticity of expectation of the futures market in rubber has a predominantly stabilizing effect on spot prices. The market, as a whole, exhibits a bias in favour of long hedges. The spot price volatility of rubber during the futures suspension period is greater than that of the pre-suspension and post-suspension periods. There is bi-directional (pair-wise) causality between the spot price and the futures price of rubber.
From the results on hedging efficiency, spot price volatility and price discovery, it can be concluded that the rubber futures market fulfils all the economic functions expected of a commodity futures market. Thus, in India, the future of rubber futures is bright.
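To illustrate the stationarity and causality tests listed above, a hedged sketch using Python's statsmodels; the data file and column names are hypothetical, not the study's actual series:

```python
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

prices = pd.read_csv("rubber_prices.csv")   # hypothetical spot/futures series
spot, fut = prices["spot"], prices["futures"]

# Augmented Dickey-Fuller test on each series; non-stationary series
# are differenced before the causality test.
for name, series in (("spot", spot), ("futures", fut)):
    stat, pvalue, *_ = adfuller(series.dropna())
    print(f"ADF {name}: statistic={stat:.3f}, p={pvalue:.3f}")

d = prices[["spot", "futures"]].diff().dropna()

# Granger causality in both directions; bi-directional (pair-wise)
# causality means both orderings reject the null of "no causality".
grangercausalitytests(d[["spot", "futures"]], maxlag=4)  # futures -> spot
grangercausalitytests(d[["futures", "spot"]], maxlag=4)  # spot -> futures
```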
Abstract:
Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. In recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed with respect to its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process in which the solution candidates are distributed programs. The objective functions rate how closely these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. This way, a description of the global behavior of a distributed system is translated automatically into programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways of representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations called Rule-based Genetic Programming (RBGP, eRBGP) designed by us. We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches were developed especially to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and was, in most cases, superior to the other representations.
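A minimal sketch of the evolutionary loop described above, with the four domain-specific operations left as hypothetical callables (random_program, simulate, mutate, crossover); lower scores are assumed to mean closer agreement with the specified global behavior:

```python
import random

POP_SIZE, GENERATIONS, TOURNAMENT = 100, 200, 4

def evolve(random_program, simulate, mutate, crossover):
    """Breed a distributed program against a behavioral specification.
    simulate(p) is assumed to run program p on every node of one
    randomized network simulation and return an objective value."""
    population = [random_program() for _ in range(POP_SIZE)]
    scored = []
    for _ in range(GENERATIONS):
        # Rate each candidate over several randomized simulations to
        # reduce the noise of any single network topology.
        scored = [(sum(simulate(p) for _ in range(5)) / 5, p)
                  for p in population]

        def select():
            # Tournament selection: best of a small random sample.
            return min(random.sample(scored, TOURNAMENT),
                       key=lambda t: t[0])[1]

        # Next generation: crossover of selected parents plus mutation.
        population = [mutate(crossover(select(), select()))
                      for _ in range(POP_SIZE)]
    # Best program of the last rated generation.
    return min(scored, key=lambda t: t[0])[1]
```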
Abstract:
This work presents a joint optimization of the hybrid operating strategy and the behavior of the combustion engine. Transferring the function modules used in the electronic control unit into the simulation environment for vehicle longitudinal dynamics provides an efficient way of applying the original parameterization. At the same time, the behavior of the combustion engine must be reproduced in such a way that both its stationary and its dynamic behavior, including all relevant influencing factors, can be represented. The tool developed for transferring the control unit functions defined in Ascet into the Simulink simulation environment not only enables the simulation of the relevant function modules but also offers further important properties: increased flexibility with respect to changes in data versions and function revisions, as well as the parameterizability of the function modules, are improvements worth mentioning here. Artificial neural networks are used to model the stationary system behavior of the combustion engine. The optimal number of neurons is selected by examining the sum of squared errors (SSE) for the training and verification data. Where necessary, the interpolation behavior is improved by adding Gaussian process models to ensure the desired model quality; the Gaussian process models generate additional support points, which are included in the modeling with reduced priority. Linear transfer functions are used to model the dynamic system behavior. When minimizing the deviation between the model output and the measurement results, the 2σ interval of the relative error distribution is considered in addition to the SSE. Implementing the control unit function modules and the derived actuator-sensor plant models in the simulation environment for vehicle longitudinal dynamics increases the simulation time and enlarges the parameter space. The method of objective vector optimization (Gütevektoroptimierung), known from control engineering, contributes decisively to a systematic treatment and optimization of the target quantities; the result of the method is characterized by the optimum of the Pareto front of the individual design specifications. The growing simulation times penalize minimum-search methods that require a large number of iterations. To avoid the use of a random variable, which contributes significantly to the number of iterations, while still enabling a global search of the parameter space, the newly developed DelaunaySearch method is employed. In contrast to known algorithms such as particle swarm optimization or evolutionary algorithms, the new method bases its search for the minimum of a cost function on a systematic analysis of the simulation results obtained so far. Using the information gained from this analysis, regions with the best prospects of containing a minimum are identified; the iterative procedure thus dispenses with a random variable when determining the next iteration step. The result of the computation is a well-chosen starting point for a local optimization.
Building on the simulation of the vehicle longitudinal dynamics, the control unit functions and the actuator-sensor plant models within one simulation environment, the hybrid operating strategy is optimized together with the control of the combustion engine. With the development and implementation of a new function, the link between the operating strategy and the engine control is further extended. The tools presented here enabled not only a test of the new functionality but also an estimate of the potential improvements in fuel consumption and exhaust emissions. Overall, an efficient test environment for the joint optimization of the operating strategy and the combustion engine behavior of a hybrid vehicle was realized.
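A hedged sketch of the neuron-count selection described above, comparing the SSE on training and verification data for each candidate hidden-layer size; scikit-learn stands in here for the thesis's unstated toolchain:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def choose_neuron_count(X_train, y_train, X_verif, y_verif,
                        candidates=range(2, 21)):
    """Train one single-hidden-layer network per candidate size and
    pick the size with the lowest verification SSE."""
    best = None
    for n in candidates:
        net = MLPRegressor(hidden_layer_sizes=(n,), max_iter=5000,
                           random_state=0).fit(X_train, y_train)
        sse_train = np.sum((net.predict(X_train) - y_train) ** 2)
        sse_verif = np.sum((net.predict(X_verif) - y_verif) ** 2)
        # Selecting on verification SSE rejects oversized networks
        # that merely memorize the training data.
        if best is None or sse_verif < best[0]:
            best = (sse_verif, sse_train, n)
    return best[2]
```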
Abstract:
This thesis describes a methodology, a representation, and an implemented program for troubleshooting digital circuit boards at roughly the level of expertise one might expect in a human novice. Existing methods for model-based troubleshooting have not scaled up to deal with complex circuits, in part because traditional circuit models do not explicitly represent aspects of the device that troubleshooters would consider important. For complex devices the model of the target device should be constructed with the goal of troubleshooting explicitly in mind. Given that methodology, the principal contributions of the thesis are ways of representing complex circuits to help make troubleshooting feasible. Temporally coarse behavior descriptions are a particularly powerful simplification. Instantiating this idea for the circuit domain produces a vocabulary for describing digital signals. The vocabulary has a level of temporal detail sufficient to make useful predictions about the response of the circuit while remaining coarse enough to make those predictions computationally tractable. Other contributions are principles for using these representations. Although not embodied in a program, these principles are sufficiently concrete that models can be constructed manually from existing circuit descriptions such as schematics, part specifications, and state diagrams. One such principle is that if there are components with particularly likely failure modes, or failure modes in which their behavior is drastically simplified, this knowledge should be incorporated into the model. Further contributions include the solution of technical problems resulting from the use of explicit temporal representations and design descriptions with tangled hierarchies.
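A minimal sketch of a temporally coarse signal vocabulary of the kind described above, assuming a deliberately tiny three-symbol classification; the thesis's actual vocabulary is richer than this:

```python
def coarse_description(samples):
    """Collapse a sampled digital signal into one coarse symbol,
    discarding exact edge timing."""
    if all(s == 0 for s in samples):
        return "LOW"        # constant low over the whole window
    if all(s == 1 for s in samples):
        return "HIGH"       # constant high over the whole window
    return "CHANGING"       # edges occurred; timing is abstracted away

# Predictions are then made over symbols rather than sample-accurate
# waveforms, e.g. "CHANGING" on a clock input and "LOW" on a reset line.
assert coarse_description([0, 0, 0]) == "LOW"
assert coarse_description([0, 1, 0, 1]) == "CHANGING"
```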
Abstract:
This thesis presents the ideas underlying a computer program that takes as input a schematic of a mechanical or hydraulic power transmission system, plus specifications and a utility function, and returns catalog numbers from predefined catalogs for the optimal selection of components implementing the design. Unlike programs for designing single components or systems, the program provides the designer with a high level "language" in which to compose new designs. It then performs some of the detailed design process. The process of "compilation" is based on a formalization of quantitative inferences about hierarchically organized sets of artifacts and operating conditions. This allows the design compilation without the exhaustive enumeration of alternatives.
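A hedged sketch of the kind of quantitative inference over hierarchically organized artifacts described above: interval bounds attached to a family of catalog components let whole subtrees be pruned without enumerating their members. The names and bounds are illustrative, not the program's actual catalogs:

```python
from dataclasses import dataclass, field

@dataclass
class Family:
    """A set of catalog artifacts with parameter bounds that hold
    for every member (here, a torque rating in N*m)."""
    name: str
    torque_min: float
    torque_max: float
    members: list = field(default_factory=list)  # catalog numbers or subfamilies

def candidates(family, required_torque):
    """Yield catalog numbers that can meet the requirement, skipping
    whole families whose upper bound already fails."""
    if family.torque_max < required_torque:
        return                    # prune: no member can satisfy the spec
    for m in family.members:
        if isinstance(m, Family):
            yield from candidates(m, required_torque)
        else:
            yield m               # a concrete catalog number

gearboxes = Family("gearboxes", 5.0, 500.0, [
    Family("light", 5.0, 50.0, ["GB-101", "GB-102"]),
    Family("heavy", 100.0, 500.0, ["GB-900"]),
])
print(list(candidates(gearboxes, 120.0)))  # ['GB-900']; 'light' pruned whole
```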
Abstract:
Testing constraints for real-time systems are usually verified through the satisfiability of propositional formulae. In this paper, we propose an alternative in which the verification of timing constraints is done by counting the number of truth assignments instead of testing Boolean satisfiability. This number can also tell us how “far away” a given specification is from satisfying its safety assertion. Furthermore, specifications and safety assertions are often modified incrementally, with problematic bugs fixed one at a time. To support this style of development, we propose an incremental algorithm for counting satisfiability. Our proposed incremental algorithm is optimal in that no unnecessary nodes are created during each counting. This works for the class of path RTL. To illustrate the application, we show how incremental satisfiability counting can be applied to a well-known rail-road crossing example, particularly when its specification is still being refined.
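A minimal sketch of satisfiability counting by explicit enumeration, using a toy version of the rail-road crossing assertion mentioned above; practical counters avoid enumeration, and the paper's incremental algorithm additionally avoids recounting from scratch after each change:

```python
from itertools import product

def count_models(variables, formula):
    """Count the truth assignments under which `formula` holds (#SAT)."""
    return sum(
        1
        for values in product([False, True], repeat=len(variables))
        if formula(dict(zip(variables, values)))
    )

# Toy safety check over (gate_down, train_near, alarm): count the
# assignments satisfying "train_near implies gate_down".
n = count_models(
    ["gate_down", "train_near", "alarm"],
    lambda v: (not v["train_near"]) or v["gate_down"],
)
print(n, "of 8 assignments satisfy the assertion")  # 6 of 8
```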
Abstract:
Initially integrated into the gvSIG Mobile pilot, the libLocation library aims to provide the gvSIG Desktop and gvSIG Mobile projects with transparent access to location sources. The library is based on the JSR-179 (Location API for J2ME) and JSR-293 (Location API for J2ME v2.0) specifications, providing a uniform interface to different location sources through high-level functions. The functionality of these APIs is also extended to allow the management of data specific to each type of location source and the adjustment of low-level parameters, as well as to incorporate additional location methods, such as the application of corrections via the NTRIP protocol. The libLocation library is currently under development and will be published and released together with the final version of gvSIG Mobile. Alongside libLocation, extensions are being developed that provide access to this library from gvSIG Desktop and gvSIG Mobile.
Abstract:
Interaction effects are usually modeled by means of moderated regression analysis. Structural equation models with non-linear constraints make it possible to estimate interaction effects while correcting for measurement error. Among the various specifications, Jöreskog and Yang's (1996, 1998), likely the most parsimonious, has been chosen and further simplified. Up to now, only direct effects have been specified, thus wasting much of the capability of the structural equation approach. This paper presents and discusses an extension of Jöreskog and Yang's specification that can handle direct, indirect and interaction effects simultaneously. The model is illustrated by a study of the effects of an interactive style of use of budgets on both company innovation and performance.
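For contrast with the SEM approach, a hedged sketch of plain moderated regression (the baseline named in the first sentence), with the interaction entered as a product term; the variable names and generated data are illustrative, and unlike the SEM specification this does not correct for measurement error:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)          # predictor (e.g. interactive budget use)
z = rng.normal(size=n)          # moderator
# Simulated outcome with a true interaction coefficient of 0.4.
y = 1.0 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(size=n)

# Moderated regression: regress y on x, z, and the product term x*z.
X = sm.add_constant(np.column_stack([x, z, x * z]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # estimates of intercept, x, z, and the interaction
```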
Abstract:
This paper presents a first approach to an Evaluation Engine Architecture (EEA) proposed to support adaptive integral assessment in the context of a virtual learning environment. The goal of our research is to design an evaluation engine tool to assist in the whole assessment process within the A2UN@ project, linking that tool with the other key elements of a learning design (learning tasks, learning resources and learning support). The teachers would define the relation between knowledge, competencies, activities, resources and type of assessment, as sketched below. Given this relation, it is possible to obtain more accurate estimations of a student's knowledge for adaptive evaluations and future recommendations. The process is supported by the use of educational standards and specifications and by an integral user model.
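A minimal sketch of the relation the teachers would define; all class and field names are hypothetical, and the A2UN@ model is standards-based and far richer than this:

```python
from dataclasses import dataclass

@dataclass
class AssessmentRule:
    """One teacher-defined link between the elements of a learning design."""
    knowledge_item: str        # concept being assessed
    competency: str            # competency it evidences
    activity: str              # learning task that exercises it
    resource: str              # learning resource it draws on
    assessment_type: str       # e.g. "quiz", "rubric", "peer review"

rules = [
    AssessmentRule("recursion", "problem decomposition",
                   "coding exercise 3", "unit-2 notes", "quiz"),
]
# An evaluation engine can traverse such rules to estimate, per knowledge
# item, how well a student's activity results support each competency.
```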
Abstract:
Learning content adaptation has been a subject of interest in research on adaptive hypermedia systems. Defining which variables and which standards can be used to model adaptive content delivery processes is one of the main challenges in pedagogical design for e-learning environments. This paper describes specifications, architectures and technologies that can be used in content adaptation processes that take characteristics of the context into account, and presents a proposal to integrate some of these characteristics into the design of units of learning, using adaptation conditions in an IMS-Learning Design (IMS-LD) structure. The key contribution of this work is the generation of instructional designs that consider the context, which can be used in Learning Management Systems (LMSs) and on diverse mobile devices.
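A hedged sketch of a context-driven adaptation condition of the kind IMS-LD expresses declaratively with its condition elements; the context properties and delivery variants here are illustrative, not taken from the paper:

```python
def select_content_variant(context):
    """Pick a delivery variant of a unit of learning from context properties."""
    if context.get("device") == "mobile" and context.get("bandwidth_kbps", 0) < 256:
        return "text-only variant"        # low-bandwidth mobile delivery
    if context.get("device") == "mobile":
        return "compressed-video variant"
    return "full multimedia variant"      # default for desktop LMS access

print(select_content_variant({"device": "mobile", "bandwidth_kbps": 128}))
```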
Abstract:
Catadioptric sensors are combinations of mirrors and lenses designed to obtain a wide field of view. In this paper we propose a new sensor that has omnidirectional viewing ability and also provides depth information about the nearby surroundings. The sensor is based on a conventional camera coupled with a laser emitter and two hyperbolic mirrors. The mathematical formulation and precise specifications of the intrinsic and extrinsic parameters of the sensor are discussed. Our approach overcomes limitations of existing omnidirectional sensors and can eventually lead to reduced production costs.
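As standard background on hyperbolic catadioptric geometry (textbook material, not this paper's own formulation): placing the camera's optical center at one focus of a hyperboloidal mirror yields a single effective viewpoint at the other focus.

```latex
% One sheet of the hyperboloid of two sheets, centered at height c:
\[
  \frac{(z - c)^2}{a^2} \;-\; \frac{x^2 + y^2}{b^2} \;=\; 1,
  \qquad c = \sqrt{a^2 + b^2},
\]
% with foci at (0,0,0) and (0,0,2c): a scene ray aimed at the focus
% (0,0,2c) reflects off the mirror and passes through the camera
% center at the origin, giving a single effective viewpoint.
```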
Abstract:
Aula de música is an e-learning tool for developing music learning in children aged between 6 and 12, the ages corresponding to pupils in the Primary Education stage. The tool makes notable use of standards and specifications such as LOM, IMS, etc., which facilitate the reuse of the included documentation for knowledge sharing. The process of producing the content was fundamental, and regarding the working environment it should be noted that priority was given to building a GUI that serves for learning and motivates pupils to learn music in a different way, as opposed to a merely aesthetic design incapable of adapting to the capabilities of each type of user; to this end, usability and accessibility criteria (WAI) were taken into account.
Abstract:
This document outlines the material covered by the main UK exam board specifications at A-level in chemistry. This is for the A-level taught up until and including June 2009 (i.e. relevant to undergraduates arriving at university in October 2009).