938 results for front end studies


Relevance:

80.00%

Publisher:

Abstract:

A novel approach to the modelling of passive intermodulation (PIM) generation in passive components with distributed weak nonlinearities is outlined. Based upon the formalism of X-parameters, it provides a unified framework for the co-design of antenna beamforming networks, filters, combiners, phase shifters and other passive and active devices containing nonlinearities at the RF front-end. The effects of discontinuities and complex circuit layouts can be evaluated efficiently with the aid of equivalent networks of the canonical nonlinear elements. The main concepts are illustrated by examples of numerical simulations of PIM generation in transmission lines and by comparison with measurement results.
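To make the mechanism concrete (this sketch is illustrative and not taken from the paper), a weak memoryless cubic nonlinearity acting on a two-carrier signal already reproduces the hallmark of PIM: third-order products at 2f1 - f2 and 2f2 - f1, close to the carriers. All numerical values below are assumed.

import numpy as np

fs = 1e6                        # sample rate, Hz (assumed)
f1, f2 = 100e3, 110e3           # two carrier tones, Hz (assumed)
a3 = 1e-3                       # weak third-order coefficient (assumed)
t = np.arange(0, 0.01, 1 / fs)

x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
y = x + a3 * x**3               # weak memoryless nonlinearity

spectrum = np.abs(np.fft.rfft(y)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

for f in (2 * f1 - f2, 2 * f2 - f1):   # PIM3 products near the carriers
    k = np.argmin(np.abs(freqs - f))
    print(f"PIM3 at {freqs[k] / 1e3:.0f} kHz: level {spectrum[k]:.2e}")

A distributed nonlinearity, as treated in the paper, can be pictured as many such weak elements cascaded along a transmission line; the X-parameter formalism generalizes this single-element picture to full multi-port networks.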

Relevance:

80.00%

Publisher:

Abstract:

X-parameter based nonlinear modelling tools have been adopted as the foundation for an advanced methodology for the experimental characterisation and design of passive nonlinear devices. Based upon the formalism of X-parameters, the methodology provides a unified framework for the co-design of antenna beamforming networks, filters, phase shifters and other passive and active devices of the RF front-end, taking into account the effect of their nonlinearities. The equivalent circuits of the canonical elements are readily incorporated in the models, thus enabling evaluation of the effect of PIM on the performance of individual devices and their assemblies. An important advantage of the presented methodology is its compatibility with the established, industry-standard commercial RF circuit simulator Agilent ADS.

The major challenge in the practical implementation of the proposed approach concerns the experimental retrieval of the X-parameters for canonical passive circuit elements. To the best of our knowledge, commercial PIM testers and practical laboratory test instruments are inherently narrowband and do not allow simultaneous vector measurements at the PIM and harmonic frequencies. Alternatively, existing nonlinear vector network analysers (NVNA) support X-parameter measurements over broad frequency bands with a range of stimuli, but their dynamic range is insufficient for PIM characterisation in practical circuits. Further opportunities for adapting the X-parameters methodology to the PIM characterisation of passive devices using existing test instruments are explored.
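A quick numerical illustration of the measurement challenge (the carrier frequencies below are assumed, not from the paper): the mixing products needed for X-parameter retrieval of a passive element spread over several octaves, far beyond the receive band of a typical narrowband PIM tester.

MHz = 1e6
f1, f2 = 935 * MHz, 960 * MHz          # assumed two-tone carriers

products = {
    "PIM3, low side (2f1 - f2)": 2 * f1 - f2,
    "PIM3, high side (2f2 - f1)": 2 * f2 - f1,
    "PIM5 (3f1 - 2f2)": 3 * f1 - 2 * f2,
    "2nd harmonic of f1": 2 * f1,
    "3rd harmonic of f2": 3 * f2,
}
for name, f in products.items():
    print(f"{name}: {f / MHz:.0f} MHz")   # spans roughly 0.9 to 2.9 GHz

A two-tone PIM tester measures only the in-band PIM3 power, whereas X-parameter extraction needs vector (magnitude and phase) data at the harmonics as well, hence the bandwidth versus dynamic-range trade-off discussed above.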

Relevance:

80.00%

Publisher:

Abstract:

This paper presents initial results of evaluating the suitability of the conventional two-tone CW passive intermodulation (PIM) test for characterizing the distortion of modulated signals by passive nonlinearities in base station antennas and the RF front-end. A comprehensive analysis of analog and digitally modulated waveforms in transmission lines with weak distributed nonlinearity has been performed using harmonic balance analysis and X-parameters in the Advanced Design System (ADS) simulator. The nonlinear distortion metrics used in the conventional two-tone CW PIM test have been compared with the respective spectral metrics applied to the modulated waveforms, such as adjacent channel power ratio (ACPR) and error vector magnitude (EVM). It is shown that the results of two-tone CW PIM tests are consistent with the metrics used to assess the signal integrity of both analog and digitally modulated waveforms.
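The two modulated-waveform metrics named above have compact textbook definitions; the sketch below (generic, not the paper's ADS setup) shows one common way to compute them.

import numpy as np

def evm_rms(ref, meas):
    """RMS error vector magnitude, in percent of reference power."""
    err = meas - ref
    return 100 * np.sqrt(np.mean(np.abs(err)**2) / np.mean(np.abs(ref)**2))

def acpr(x, fs, f_main, f_adj, bw):
    """Adjacent channel power ratio, in dB, from a raw periodogram."""
    psd = np.abs(np.fft.fft(x))**2
    freqs = np.fft.fftfreq(len(x), 1 / fs)
    def band_power(fc):
        return psd[np.abs(freqs - fc) <= bw / 2].sum()
    return 10 * np.log10(band_power(f_adj) / band_power(f_main))

With these definitions, the paper's comparison amounts to checking that a rise in two-tone PIM level tracks a corresponding rise in ACPR and EVM of a modulated waveform passed through the same nonlinearity.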

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we carry out a detailed performance analysis of a blind source separation based I/Q corrector operating at baseband. The performance of the digital I/Q corrector is evaluated not only under time-varying phase and gain errors but also in the presence of multipath and Rayleigh fading channels. Performance at low SNR and for different modulation formats and constellation sizes is also evaluated, and the BER improvement after correction is illustrated. The results indicate that the adaptive algorithm offers adequate performance for most communication applications, thus reducing the matching requirements of the analog front-end and enabling higher levels of integration.
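The blind source separation algorithm itself is the paper's subject and is not reproduced here; as a point of reference, the sketch below implements the classic blind Gram-Schmidt I/Q correction, which likewise needs no training signal: it estimates the gain and phase errors from sample statistics of the two rails. The imbalance model and the parameter values in the demo are assumed.

import numpy as np

def iq_correct(i, q):
    """Blind gain/phase imbalance correction via Gram-Schmidt."""
    rho = np.mean(i * q) / np.mean(i * i)   # residual I/Q correlation
    q_orth = q - rho * i                    # orthogonalize Q against I
    gain = np.sqrt(np.mean(i * i) / np.mean(q_orth * q_orth))
    return i, q_orth * gain                 # equalize rail powers

# Demo: QPSK rails with 1 dB gain error and 10 deg phase error (assumed)
rng = np.random.default_rng(0)
s = (rng.integers(0, 2, (2, 10000)) * 2 - 1).astype(float)
g, phi = 10 ** (1 / 20), np.deg2rad(10)
i_rx = s[0]
q_rx = g * (s[1] * np.cos(phi) + s[0] * np.sin(phi))
i_c, q_c = iq_correct(i_rx, q_rx)           # restored orthogonal rails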

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a system developed to promote the rational use of electric energy among consumers and thus increase energy efficiency. The goal is to provide energy consumers with an application that displays energy consumption/production profiles, sets consumption ceilings, defines automatic alerts and alarms, anonymously compares consumers with identical energy usage profiles by region and, in the case of non-residential installations, predicts expected consumption/production values. The resulting distributed system is organized in two main blocks: front-end and back-end. The front-end comprises user interface applications for Android mobile devices and Web browsers. The back-end provides data storage and processing functionality and is installed on a cloud computing platform, the Google App Engine, which provides a standard Web service interface. This option ensures the interoperability, scalability and robustness of the system.
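The abstract describes the back-end only at the architectural level; the sketch below illustrates the kind of REST interface such a back-end could expose. The routes, field names and the simple ceiling-alert rule are all assumptions for illustration, not taken from the paper (the actual system runs on Google App Engine).

from flask import Flask, jsonify, request

app = Flask(__name__)
CEILINGS = {}   # consumer id -> consumption ceiling in kWh (assumed model)
READINGS = {}   # consumer id -> list of metered kWh values

@app.post("/consumers/<cid>/ceiling")
def set_ceiling(cid):
    # Consumer-defined consumption ceiling, as described in the abstract
    CEILINGS[cid] = float(request.json["kwh"])
    return jsonify(ok=True)

@app.post("/consumers/<cid>/readings")
def add_reading(cid):
    READINGS.setdefault(cid, []).append(float(request.json["kwh"]))
    # Automatic alert once accumulated consumption exceeds the ceiling
    alert = sum(READINGS[cid]) > CEILINGS.get(cid, float("inf"))
    return jsonify(ok=True, alert=alert)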

Relevance:

80.00%

Publisher:

Abstract:

"Mémoire présenté à la faculté des études supérieures en vue de l'obtention du grade de Maître en droit (LL.M.)"

Relevance:

80.00%

Publisher:

Abstract:

The literature contains abundant information on design approaches involving users. Although researchers point out numerous advantages of these approaches, little is known about what designers working in industry think of them. The aim of this project is to learn how product designers perceive participatory design tools, to identify the opportunities and limitations they raise on the subject and, finally, to suggest tools that would ease the introduction of participatory design into an existing design process. After an overview of the field of participatory design and its tools, six cases are studied by means of semi-structured interviews with product designers. The data are analysed using cognitive maps. Regarding participatory design tools, the designers interviewed perceive direct access to users' needs and the possibility of minimizing errors early in the process, thereby avoiding the costly modifications those errors would have entailed. The obstacles they perceive relate mainly to resistance to change, to the fear of letting users create or decide, and to the team's lack of time and resources. Finally, on the basis of the information collected, we suggest four participatory design tools that appear most promising: contextual inquiry, probes, prototype testing and the "lead user" approach. As a follow-up to this work, it would be interesting to develop a more exhaustive research protocol to broaden the scope of the results, or to apply participatory design within a company in order to explore people's satisfaction with the products designed, the collateral effects on the teams involved, the evolution of prototypes or the conduct of the workshops.

Relevance:

80.00%

Publisher:

Abstract:

The dynamics of a plasma plume formed by the laser blow-off of a multicomponent LiF-C thin film has been studied under ambient pressures ranging from high vacuum to an argon pressure of 3 Torr, using a fast imaging technique. In vacuum, the plume has an ellipsoidal shape. As the ambient pressure increases, a sharp plume boundary develops, showing a focusing-like lateral confinement at the plume front end that persists for long times. At higher ambient pressures (> 10⁻¹ Torr), structures develop in the plasma plume due to hydrodynamic instabilities and turbulence.

Relevance:

80.00%

Publisher:

Abstract:

Motivation for the speaker recognition work is presented in the first part of the thesis, together with an exhaustive survey of past work in the field. A low-cost system avoiding complex computation was chosen for implementation, and towards this end a PC-based system was designed and developed. A front-end 12-bit analog-to-digital converter (ADC) was built and interfaced to a PC, and software was developed to control the ADC and to perform various analytical functions, including feature vector evaluation. It is shown that a fixed set of phrases incorporating evenly balanced phonemes is aptly suited to the speaker recognition task at hand, and a set of phrases is chosen for recognition. Two new methods are adopted for feature evaluation: new measurements involving a symmetry-check method for pitch period detection, and ACE, are used as features. Arguments are provided to show the need for a new model of speech production. Starting from heuristics, a knowledge-based (KB) speech production model is presented, in which a KB provides impulses to a voice-producing mechanism and constant correction is applied via a feedback path; it is this correction that differs from speaker to speaker. Methods of defining measurable parameters for use as features are described, and algorithms for speaker recognition are developed and implemented. Two methods are presented. The first is based on the postulated model: the entropy of the utterance of a phoneme is evaluated, and the transitions of voiced regions are used as speaker-dependent features. The second method uses features found in other works, but evaluated differently; a knock-out scheme is used to provide weighting values for the selection of features. Implementation results are presented which show, on average, 80% recognition. It is also shown that performance deteriorates when there are long gaps between sessions, in a speaker-dependent manner. Cross-recognition percentages are also presented, rising to 30% in the worst case, with a best case of 0%. Suggestions for further work are given in the concluding chapter.
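The abstract does not define the entropy measure used; as a generic illustration of an entropy feature computed on a phoneme utterance, the sketch below evaluates the spectral entropy of a speech frame.

import numpy as np

def spectral_entropy(frame):
    """Shannon entropy of a frame's normalized power spectrum (bits)."""
    psd = np.abs(np.fft.rfft(frame)) ** 2
    p = psd / psd.sum()       # treat the spectrum as a distribution
    p = p[p > 0]              # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

Low values indicate energy concentrated in a few harmonics (voiced speech); a trajectory of such values across an utterance is one plausible speaker-dependent feature of the kind the thesis describes.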

Relevance:

80.00%

Publisher:

Abstract:

Speech processing and the consequent recognition are important areas of digital signal processing, since speech allows people to communicate more naturally and efficiently. In this work, a speech recognition system is developed for recognizing digits in Malayalam. Features must be extracted from the speech signal, so the feature extraction method plays an important role in speech recognition. Here, front-end processing for extracting the features is performed using two wavelet-based methods, namely Discrete Wavelet Transforms (DWT) and Wavelet Packet Decomposition (WPD), with a Naive Bayes classifier used for classification. With the Naive Bayes classifier, DWT produced a recognition accuracy of 83.5% and WPD an accuracy of 80.7%. This paper sets out to devise a new feature extraction method that improves the recognition accuracy: a method called Discrete Wavelet Packet Decomposition (DWPD) is introduced, which utilizes the hybrid features of both DWT and WPD. The performance of this new approach is evaluated, and it produced an improved recognition accuracy of 86.2% with the Naive Bayes classifier.
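As a rough illustration of the pipeline described (wavelet front end plus Naive Bayes), the sketch below uses DWT subband log-energies as features; the wavelet, decomposition level and energy features are assumptions, since the paper's exact parameters are not stated in the abstract.

import numpy as np
import pywt
from sklearn.naive_bayes import GaussianNB

def dwt_features(signal, wavelet="db4", level=5):
    """Log-energy of each DWT subband as a fixed-length feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.log(np.sum(c ** 2) + 1e-12) for c in coeffs])

# Placeholder data; real digit utterances would be loaded here instead
rng = np.random.default_rng(0)
X = [rng.standard_normal(8000) for _ in range(100)]
y = np.arange(100) % 10                      # ten digit classes

clf = GaussianNB().fit([dwt_features(x) for x in X], y)
print(clf.predict([dwt_features(X[0])]))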

Relevance:

80.00%

Publisher:

Abstract:

An important step in residue number system (RNS) based signal processing is the conversion of the signal into the residue domain. Many implementations of this conversion have been proposed for various goals, one of them being direct conversion from an analogue input. A novel approach to analogue-to-residue conversion is proposed in this research using the most popular Sigma-Delta analogue-to-digital converter (SD-ADC). In this approach, the front end is the same as in a traditional SD-ADC, using a Sigma-Delta (SD) modulator with appropriate dynamic range, but the filtering is done by a filter implemented using RNS arithmetic. Hence, the natural output of the filter is an RNS representation of the input signal. The resolution, conversion speed, hardware complexity and implementation cost of the proposed SD-based analogue-to-residue converter are compared with those of existing analogue-to-residue converters based on Nyquist-rate ADCs.
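To make the residue-domain representation concrete, the sketch below converts an integer sample to residues and reconstructs it via the Chinese Remainder Theorem; the modulus set is a common textbook choice, not the paper's.

from math import prod

MODULI = (7, 15, 16)                 # pairwise coprime; range 0..1679

def to_rns(x):
    """Residue representation: x modulo each modulus."""
    return tuple(x % m for m in MODULI)

def from_rns(residues):
    """Reconstruction by the Chinese Remainder Theorem."""
    M = prod(MODULI)
    x = sum(r * (M // m) * pow(M // m, -1, m)
            for r, m in zip(residues, MODULI))
    return x % M

assert from_rns(to_rns(1234)) == 1234
# In the proposed converter the decimation filter itself computes in
# this representation, so residues come out of the front end directly.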

Relevance:

80.00%

Publisher:

Abstract:

The growing dynamism of SDIs (spatial data infrastructures) generates demand for the construction of geoportals, and hence for tools that, besides facilitating their construction, configuration and deployment, offer the possibility of contracting professional technical support. OpenGeo Suite is a professional, integrated free-software package that covers everything from the storage of geographic data to its publication using OGC standards and the implementation of web GIS solutions with open-source JavaScript libraries. OpenGeo Suite can be deployed on multiple platforms (Linux, Windows and OS X), with four tightly integrated free-software components based on the use of OGC standards. The server-side components are aimed at the storage, configuration and publication of data by GIS technicians: PostgreSQL with the PostGIS spatial extension handles the storage of geographic information and supports spatial analysis functions; pgAdmin serves as the database administration tool, easing data import and updates; GeoServer handles the publication of geographic information from different data sources (PostGIS, SHP, Oracle Spatial, GeoTIFF, etc.), supporting most OGC standards for publishing geographic information (WMS, WFS, WCS) and the GML, KML, GeoJSON and SLD formats, and also supports tile caching through GeoWebCache. OpenGeo Suite offers two applications, GeoExplorer and GeoEditor, which allow a technician to build a geoportal with geometry-editing capabilities, and an administration console (Dashboard) that eases the configuration of the administration components. On the client side, the components are JavaScript development libraries aimed at web GIS application developers: OpenLayers, with support for raster and vector layers, styles, projections, tiling, editing tools, etc., and finally GeoExt, for building the front-end of geoportals, based on ExtJS and tightly coupled to OpenLayers.
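As a small usage illustration of the publication chain described above, the request below fetches a map image from a GeoServer WMS endpoint using the standard OGC WMS 1.1.1 GetMap parameters; the host and layer name are hypothetical.

import requests

params = {
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "workspace:parcels",    # hypothetical published layer
    "styles": "", "srs": "EPSG:4326",
    "bbox": "-9.5,36.0,3.3,43.8",     # lon/lat bounding box (assumed)
    "width": 768, "height": 512, "format": "image/png",
}
r = requests.get("http://localhost:8080/geoserver/wms", params=params)
with open("map.png", "wb") as f:
    f.write(r.content)                # the rendered map image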

Relevance:

80.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale grid resources such as the UK National Grid Service.
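Since G-Rex exposes a REST interface usable from any basic HTTP client, an interaction like the following is possible in principle; the endpoint paths and JSON fields below are invented for illustration and do not reflect G-Rex's actual API.

import time
import requests

BASE = "http://cluster.example.org/grex/nemo"    # hypothetical service

# Submit a run (in practice GRexRun does this from the workflow script)
with open("namelist", "rb") as f:
    job = requests.post(BASE + "/jobs", files={"input": f})
job_url = job.headers["Location"]                # assumed REST convention

while requests.get(job_url + "/status").json()["state"] == "RUNNING":
    # Output is streamed back during the run and deleted remotely,
    # mirroring G-Rex's key feature (1) above
    for name in requests.get(job_url + "/outputs").json()["files"]:
        requests.get(f"{job_url}/outputs/{name}")  # fetch partial output
    time.sleep(60)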

Relevance:

80.00%

Publisher:

Abstract:

In this paper we present an architecture for network and applications management which is based on the Active Networks paradigm and demonstrates the advantages of network programmability. The stimulus to develop this architecture arose from an actual need to manage a cluster of active nodes, where it is often necessary to redeploy network assets and modify node connectivity. In our architecture, a remote front-end of the managing entity allows the operator to design new network topologies, check the status of the nodes and configure them. Moreover, the proposed framework allows the operator to explore an active network, monitor the active applications, query each node and install programmable traps. In order to take advantage of Active Networks technology, we introduce active SNMP-like MIBs and agents, which are dynamic and programmable. The programmable management agents make tracing distributed applications a feasible task. We propose a general framework that can inter-operate with any active execution environment. In this framework, both the manager and the monitor front-ends communicate with an active node (the Active Network Access Point) through the XML language. A gateway service translates the queries from XML into an active packet language and injects the code into the network. We demonstrate the implementation of an active network gateway for PLAN (Packet Language for Active Networks) in a testbed of forty active nodes. Finally, we discuss an application of the active management architecture to detecting the causes of network failures by tracing network events in time.
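The XML exchanged with the Active Network Access Point is central to the design; the sketch below builds such a management query with Python's standard library. The element and attribute names are invented, as the paper's actual schema is not given here; the gateway would translate the XML into PLAN code and inject it into the network.

import xml.etree.ElementTree as ET

query = ET.Element("query")                        # hypothetical schema
ET.SubElement(query, "target", node="active-node-17")
ET.SubElement(query, "mib", oid="active.apps.running")
ET.SubElement(query, "trap", condition="cpu_load > 0.9")

payload = ET.tostring(query, encoding="unicode")
print(payload)   # sent to the Active Network Access Point over HTTP/XML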

Relevance:

80.00%

Publisher:

Abstract:

The complexity of construction projects and the fragmentation of the construction industry undertaking those projects has effectively resulted in linear, uncoordinated and highly variable project processes in the UK construction sector. Research undertaken at the University of Salford resulted in the development of an improved project process, the Process Protocol, which considers the whole lifecycle of a construction project whilst integrating its participants under a common framework. The Process Protocol identifies the various phases of a construction project with particular emphasis on what is described in the manufacturing industry as the ‘fuzzy front end’. The participants in the process are described in terms of the activities that need to be undertaken in order to achieve a successful project and process execution. In addition, the decision-making mechanisms, from a client perspective, are illustrated and the foundations for a learning organization/industry are facilitated within a consistent Process Protocol.