928 results for Weak Compact Generating
Abstract:
In this paper, the design of compact genetic microstrip antennas for mobile applications is investigated. Antennas designed using Genetic Algorithms (GA) have an arbitrary shape and occupy less area (i.e., are more compact) than a traditionally designed antenna for the same frequency, but at the cost of poorer performance. An attempt has been made to improve the performance of the genetic microstrip antenna by optimizing the ground plane (GP) into a fishbone-like structure. The genetic antenna with the optimized GP outperforms both the traditional antenna and the plain genetic antenna.
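The GA-based design loop behind such antennas can be sketched generically. The snippet below is a minimal binary-genome genetic algorithm, not the authors' code: in antenna design each bit would typically switch one metal cell of the patch or ground plane on or off, and the fitness would come from an electromagnetic solver (e.g., return loss at the target frequency). Here a toy fitness stands in for the solver.

```python
import random

def evolve(fitness, n_bits=32, pop_size=20, generations=60,
           crossover_p=0.9, mutation_p=0.02, seed=0):
    """Minimal genetic algorithm over binary genomes.

    `fitness` maps a list of 0/1 genes to a score to maximize. In GA antenna
    design this score would be computed by an EM solver for the geometry
    encoded by the genome.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                      # elitism: keep the 2 best
        while len(next_pop) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(scored, 3), key=fitness)
            p2 = max(rng.sample(scored, 3), key=fitness)
            if rng.random() < crossover_p:         # one-point crossover
                cut = rng.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # per-bit mutation flips a gene with probability mutation_p
            child = [b ^ (rng.random() < mutation_p) for b in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy stand-in for an EM-solver objective: maximize the number of "on" cells.
best = evolve(fitness=sum)
print(sum(best))   # typically close to n_bits
```

In practice, most of the runtime is spent in the fitness evaluations (full-wave simulations), which is why small populations with elitism and tournament selection are common choices in GA antenna work.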
Abstract:
Earlier studies on the measurement of customer satisfaction are based on either transaction-specific or overall approaches. The transaction-specific approach evaluates customer satisfaction with single components of the purchase process, whereas overall satisfaction is based on all of the customer's encounters and experiences throughout the purchase process. Consumers comment on particular events of their purchase process when asked about transaction-specific satisfaction, and on their overall impression and general experiences when asked about overall satisfaction (Bitner & Hubbert, 1994). Through a critical review of the literature, a new approach to customer satisfaction has been identified, namely the cumulative approach, which can be more useful than the overall and transaction-specific approaches for strategic decision making (Fornell et al., 1996). The cumulative approach was not studied earlier owing to the difficulty of operationalizing the concept. But because the influencers of customer satisfaction are context-specific and the prevailing models do not reveal the sources of variation in satisfaction, the importance of the cumulative approach has emerged, pointing to new research. The current study focuses on exploring the influencers of overall customer satisfaction in order to form the individual elements that can be used to identify cumulative customer satisfaction.
Abstract:
The main objective of this thesis is to develop a compact chipless RFID tag with high data-encoding capacity. The design and development of chipless RFID tags based on multiresonator and multiscatterer methods are presented first. An RFID tag using SIR capable of encoding 79 bits is proposed. The thesis also deals with some of the properties of SIR, such as harmonic separation, independent control of the resonant modes, and the ability to change the electrical length. A chipless RFID reader working in the 2.36 GHz to 2.54 GHz frequency band has been designed to show the feasibility of the RFID system. For a practical system, a new approach based on UWB Impulse Radar (UWB IR) technology is employed, and methods for decoding the noisy backscattered signal are successfully demonstrated. The thesis also proposes a simple calibration procedure, which is able to decode the backscattered signal up to a distance of 80 cm with 1 mW output power.
Abstract:
Division of Electronics Engineering
Abstract:
The aim of this paper is the investigation of the error which results from the method of approximate approximations applied to functions defined on compact intervals only. This method, which is based on an approximate partition of unity, was introduced by V. Maz'ya in 1991 and has mainly been used for functions defined on the whole space up to now. For the treatment of differential equations and boundary integral equations, however, an efficient approximation procedure on compact intervals is needed. In the present paper we apply the method of approximate approximations to functions which are defined on compact intervals. In contrast to the whole-space case, here a truncation error has to be controlled in addition. For the resulting total error, pointwise estimates and L1-estimates are given, where all the constants are determined explicitly.
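For orientation, the quasi-interpolant underlying approximate approximations can be sketched. The one-dimensional form below, with a Gaussian generating function, is a standard choice in Maz'ya's scheme (the symbol D for the shape parameter is an assumption of this sketch, not notation from the abstract); restricting the sum to nodes inside the interval is what produces the additional truncation error mentioned above.

```latex
% Approximate quasi-interpolant with Gaussian generating function (1D):
M_h u(x) \;=\; \frac{1}{\sqrt{\pi \mathcal{D}}}
  \sum_{m \in \mathbb{Z}} u(mh)\, e^{-(x - mh)^2 / (\mathcal{D} h^2)} .
% On a compact interval [a, b] the sum runs only over the nodes mh \in [a, b],
% adding a truncation error to the saturation error controlled by \mathcal{D}.
```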
Abstract:
In this paper, we solve the duplication problem P_n(ax) = sum_{m=0}^{n} C_m(n,a) P_m(x), where {P_n}_{n>=0} belongs to a wide class of polynomials, including the classical orthogonal polynomials (Hermite, Laguerre, Jacobi) as well as the classical discrete orthogonal polynomials (Charlier, Meixner, Krawtchouk) for the specific case a = −1. We give closed-form expressions as well as recurrence relations satisfied by the duplication coefficients.
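As a quick numerical illustration (not the paper's closed-form method), the duplication coefficients C_m(n,a) can be recovered by collocation: evaluate P_n(ax) at n+1 points and solve a linear system in the basis values. The sketch below assumes the physicists' Hermite family via `numpy.polynomial.hermite`; the helper names are illustrative.

```python
import numpy as np
from numpy.polynomial import hermite as H

def duplication_coeffs(n, a, basis_val):
    """Coefficients C_m(n, a) with P_n(a x) = sum_m C_m(n, a) P_m(x),
    found by collocation: match both sides at n+1 distinct points."""
    x = np.linspace(-1.0, 1.0, n + 1)
    # rows: sample points; columns: P_0(x) .. P_n(x)
    V = np.column_stack([basis_val(m, x) for m in range(n + 1)])
    rhs = basis_val(n, a * x)
    return np.linalg.solve(V, rhs)

def hermite_val(m, x):
    # physicists' Hermite polynomial H_m evaluated at x
    c = np.zeros(m + 1)
    c[m] = 1.0
    return H.hermval(x, c)

# Sanity check for a = -1: H_3 is odd, so H_3(-x) = -H_3(x),
# i.e. C_3(3, -1) = -1 and all other coefficients vanish.
c = duplication_coeffs(3, -1.0, hermite_val)
print(np.round(c, 10))
```

For a = −1 the parity of the classical families makes the answer easy to verify; for general a the same collocation recovers the dense coefficient vectors that the paper's closed-form expressions describe.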
Abstract:
The motion of a viscous incompressible fluid flow in bounded domains with a smooth boundary can be described by the nonlinear Navier-Stokes equations. This description corresponds to the so-called Eulerian approach. We develop a new approximation method for the Navier-Stokes equations in both the stationary and the non-stationary case by a suitable coupling of the Eulerian and the Lagrangian representation of the flow, where the latter is defined by the trajectories of the particles of the fluid. The method leads to a sequence of uniquely determined approximate solutions with a high degree of regularity containing a convergent subsequence with limit function v such that v is a weak solution of the Navier-Stokes equations.
Abstract:
Ultrafast laser pulses have become an integral part of the toolbox of countless laboratories doing physics, chemistry, and biology research. The work presented here is situated in the ever-growing, interdisciplinary research effort towards understanding the fundamental workings of light-matter interactions. Specifically, attosecond pulses can be useful tools to obtain the desired insight. However, access to, and the utility of, such pulses depends on the generation of intense, few-cycle, carrier-envelope-phase-stabilized laser pulses. The presented work can be thought of as a roadmap towards the latter. From the oscillator, which provides the broadband seed, to amplification methods, the integral pieces necessary for the generation of attosecond pulses are discussed. A range of topics, from the fundamentals to design challenges, is presented, paving the way towards the practical implementation of an intense few-cycle carrier-envelope-phase-stabilized laser source.
Abstract:
Time-resolved diffraction with femtosecond electron pulses has become a promising technique to directly provide insights into photo-induced primary dynamics at the atomic level in molecules and solids. Ultrashort pulse duration as well as extensive spatial coherence are desired; however, space-charge effects complicate the bunching of multiple electrons in a single pulse. We experimentally investigate the interplay between spatial and temporal aspects of resolution limits in ultrafast electron diffraction (UED) on our highly compact transmission electron diffractometer. To that end, the initial source size and charge density of the electron bunches are systematically manipulated, and the resulting bunch properties at the sample position are fully characterized in terms of lateral coherence, temporal width, and diffracted intensity. We obtain a so-far-unreported measured overall temporal resolution of 130 fs (full width at half maximum), corresponding to 60 fs (root mean square), and transverse coherence lengths up to 20 nm. Instrumental impacts on the effective signal yield in diffraction and on electron pulse brightness are discussed as well. The performance of our compact UED setup at selected electron pulse conditions is finally demonstrated in a time-resolved study of lattice heating in multilayer graphene after optical excitation.
Abstract:
This study analyzes the linear relationship between climate variables and milk components in Iran, applying bootstrapping to include and assess the uncertainty. The climate parameters, Temperature Humidity Index (THI) and Equivalent Temperature Index (ETI), are computed from the NASA Modern-Era Retrospective Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh-matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique, which is applied to the original time series to generate statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting pattern in the relationships between milk components and the climate parameters is visible. During spring, only a weak dependency of milk yield on climate variations is apparent, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature and THI. We suggest this methodology for studies of the impacts of climate change on agriculture, as well as on environment and food, with short-term data.
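The bootstrap-based confidence intervals described above can be sketched as follows. This is a generic percentile bootstrap for the slope of a simple linear regression, not the authors' code, and the data below are synthetic stand-ins (the variable names `thi` and `fat` and the slope of −0.01 are illustrative assumptions, not values from the study).

```python
import numpy as np

def bootstrap_slope_ci(x, y, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the slope of a
    simple linear regression y = b0 + b1 * x (case resampling)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    slopes = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)        # resample (x, y) pairs with replacement
        b1, b0 = np.polyfit(x[idx], y[idx], 1)
        slopes[i] = b1
    lo, hi = np.quantile(slopes, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Synthetic stand-in for, e.g., THI vs. fat concentration over one season.
rng = np.random.default_rng(1)
thi = np.linspace(55, 80, 120)
fat = 4.2 - 0.01 * thi + rng.normal(0, 0.05, thi.size)
lo, hi = bootstrap_slope_ci(thi, fat)
print(lo, hi)
```

Case resampling treats each (climate, milk) record as an exchangeable unit; with strongly autocorrelated series, a block bootstrap would be the more careful choice.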
Abstract:
The flexibility of a robot is the key to its success as a viable aid to production. The flexibility of a robot can be increased in two directions. The first is to increase the physical generality of the robot such that it can be easily reconfigured to handle a wide variety of tasks. The second is to increase the ability of the robot to interact with its environment such that tasks can still be successfully completed in the presence of uncertainties. Articulated hands are capable of adapting to a wide variety of grasp shapes, hence reducing the need for special tooling. The availability of low-mass, high-bandwidth points close to the manipulated object also offers significant improvements in the control of fine motions. This thesis provides a framework for using articulated hands to perform local manipulation of objects. In particular, it addresses the issues in effecting compliant motions of objects in Cartesian space. The Stanford/JPL hand is used as an example to illustrate a number of concepts. The examples provide a unified methodology for controlling articulated hands grasping with point contacts. We also present a high-level hand programming system based on the methodologies developed in this thesis. Compliant motion of grasped objects and dexterous manipulations can be easily described in the LISP-based hand programming language.
Abstract:
In a distributed model of intelligence, peer components need to communicate with one another. I present a system which enables two agents connected by a thick twisted bundle of wires to bootstrap a simple communication system from observations of a shared environment. The agents learn a large vocabulary of symbols, as well as inflections on those symbols which allow thematic role-frames to be transmitted. Language acquisition time is rapid and linear in the number of symbols and inflections. The final communication system is robust and performance degrades gradually in the face of problems.
Abstract:
We develop efficient techniques for the non-rigid registration of medical images by using representations that adapt to the anatomy found in such images. Images of anatomical structures typically have uniform intensity interiors and smooth boundaries. We create methods to represent such regions compactly using tetrahedra. Unlike voxel-based representations, tetrahedra can accurately describe the expected smooth surfaces of medical objects. Furthermore, the interior of such objects can be represented using a small number of tetrahedra. Rather than describing a medical object using tens of thousands of voxels, our representations generally contain only a few thousand elements. Tetrahedra facilitate the creation of efficient non-rigid registration algorithms based on finite element methods (FEM). We create a fast, FEM-based method to non-rigidly register segmented anatomical structures from two subjects. Using our compact tetrahedral representations, this method generally requires less than one minute of processing time on a desktop PC. We also create a novel method for the non-rigid registration of gray scale images. To facilitate a fast method, we create a tetrahedral representation of a displacement field that automatically adapts to both the anatomy in an image and to the displacement field. The resulting algorithm has a computational cost that is dominated by the number of nodes in the mesh (about 10,000), rather than the number of voxels in an image (nearly 10,000,000). For many non-rigid registration problems, we can find a transformation from one image to another in five minutes. This speed is important as it allows use of the algorithm during surgery. We apply our algorithms to find correlations between the shape of anatomical structures and the presence of schizophrenia. We show that a study based on our representations outperforms studies based on other representations. 
We also use the results of our non-rigid registration algorithm as the basis of a segmentation algorithm. That algorithm also outperforms other methods in our tests, producing smoother segmentations and more accurately reproducing manual segmentations.
Abstract:
We present a new approach to model and classify breast parenchymal tissue. Given a mammogram, we first discover the distribution of the different tissue densities in an unsupervised manner, and second, we use this tissue distribution to perform the classification. We achieve this using a classifier based on local descriptors and probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature. We studied the influence of different descriptors, such as texture and SIFT features, at the classification stage, showing that textons outperform SIFT in all cases. Moreover, we demonstrate that pLSA automatically extracts meaningful latent aspects, generating a compact tissue representation based on their densities, which is useful for mammogram classification. We show the results of tissue classification on the MIAS and DDSM datasets. We compare our method with approaches that have classified these same datasets, showing the better performance of our proposal.
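The pLSA decomposition used above can be sketched with plain EM. This is a generic textbook implementation, not the authors' code: in their setting a "document" is a mammogram, a "word" is a quantized local descriptor (e.g., a texton), and the latent topics play the role of tissue-density aspects. The toy count matrix below is an illustrative assumption.

```python
import numpy as np

def plsa(counts, n_topics, n_iter=50, seed=0):
    """Plain EM for probabilistic Latent Semantic Analysis.

    counts: (n_docs, n_words) term-count matrix.
    Returns P(w|z) of shape (n_topics, n_words) and P(z|d) of (n_docs, n_topics).
    """
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape
    p_w_z = rng.random((n_topics, n_words)); p_w_z /= p_w_z.sum(1, keepdims=True)
    p_z_d = rng.random((n_docs, n_topics)); p_z_d /= p_z_d.sum(1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibilities P(z|d,w), shape (n_docs, n_topics, n_words)
        joint = p_z_d[:, :, None] * p_w_z[None, :, :]
        resp = joint / joint.sum(1, keepdims=True).clip(1e-12)
        weighted = counts[:, None, :] * resp          # n(d,w) * P(z|d,w)
        # M-step: re-estimate the two conditional distributions
        p_w_z = weighted.sum(0); p_w_z /= p_w_z.sum(1, keepdims=True).clip(1e-12)
        p_z_d = weighted.sum(2); p_z_d /= p_z_d.sum(1, keepdims=True).clip(1e-12)
    return p_w_z, p_z_d

# Four toy "images": the first two use words 0-1, the last two words 2-3.
counts = np.array([[5, 5, 0, 0], [4, 6, 0, 0],
                   [0, 0, 5, 5], [0, 0, 6, 4]], dtype=float)
p_w_z, p_z_d = plsa(counts, n_topics=2)
print(np.round(p_z_d, 2))
```

The per-image topic mixture P(z|d) is exactly the "compact tissue representation" idea: a short vector of aspect weights that can be fed to a downstream classifier instead of the raw descriptor histogram.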
Abstract:
This undergraduate project aims to evaluate the production performance of the company Plaspucol, located in the city of Bogotá, identifying the deficiencies in its process and generating improvement mechanisms through a prior analysis. To do so, it is necessary to start from a theoretical framework of plastics: their history, evolution, classification, and their positioning and economic influence worldwide, viewed in turn from the national perspective. To analyze these situations, tools learned throughout our professional training were used, such as travel diagrams, flowcharts, man-machine diagrams, line-balancing diagrams, and sampling, together with a modern simulator called Promodel, with which the company is diagnosed, identifying weak points and bottlenecks in production in the current situation, and a future situation with improvement proposals is created using this simulator.