159 results for Source wavelet estimation
Abstract:
This document aims to identify how business intelligence has evolved over the last decade, in order to determine where it stands and where it is heading, and to identify the development alternatives open to it, leveraging Information Technologies and, more specifically, packaged Open Source BI tools. The research also identifies the most suitable practices for systematizing the selection and adoption of business intelligence in organizations, analysing them against a real, successful case of implementing business intelligence on an Open Source BI tool.
Abstract:
This case study introduces our continuing work to enhance the virtual classroom in order to provide faculty and students with an environment that is open to their needs, compliant with learning standards (and therefore compatible with other e-learning environments), and based on open source software. The result is a modular, sustainable and interoperable learning environment that can be adapted to different teaching and learning situations by incorporating the LMS's integrated tools as well as wikis, blogs, forums and Moodle activities, among others.
Abstract:
Interaction effects are usually modeled by means of moderated regression analysis. Structural equation models with non-linear constraints make it possible to estimate interaction effects while correcting for measurement error. Of the various specifications, Jöreskog and Yang's (1996, 1998), likely the most parsimonious, has been chosen and further simplified. Up to now, only direct effects have been specified, thus wasting much of the capability of the structural equation approach. This paper presents and discusses an extension of Jöreskog and Yang's specification that can handle direct, indirect and interaction effects simultaneously. The model is illustrated by a study of the effects of an interactive style of budget use on both company innovation and performance.
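A minimal sketch of the moderated regression baseline the abstract mentions (not the Jöreskog and Yang SEM specification itself), using statsmodels on synthetic data; all variable names and effect sizes are hypothetical:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical variables: x = interactive budget use, m = innovation,
    # y = performance, with a built-in x*m interaction effect.
    x = rng.normal(size=n)
    m = rng.normal(size=n)
    y = 0.5 * x + 0.3 * m + 0.4 * x * m + rng.normal(scale=0.5, size=n)
    df = pd.DataFrame({"x": x, "m": m, "y": y})

    # Moderated regression: the x:m coefficient carries the interaction.
    # Unlike the SEM specification in the paper, this does not correct
    # for measurement error in x and m.
    model = smf.ols("y ~ x + m + x:m", data=df).fit()
    print(model.params)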
Abstract:
The estimation of camera egomotion is a well-established problem in computer vision. Many approaches have been proposed based on both the discrete and the differential epipolar constraint. The discrete case is mainly used in self-calibrated stereoscopic systems, whereas the differential case deals with a single moving camera. The article surveys several methods for mobile robot egomotion estimation, covering more than 0.5 million synthetic data samples. Results from real data are also given.
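As an illustration of the discrete epipolar constraint the survey covers, a self-contained OpenCV sketch that recovers relative camera motion from synthetic point matches; the intrinsics and the simulated motion are assumptions, not values from the article:

    import cv2
    import numpy as np

    rng = np.random.default_rng(0)
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # assumed intrinsics

    # Synthetic scene: project random 3D points before and after a known motion.
    P = rng.uniform([-2, -2, 4], [2, 2, 8], size=(200, 3))
    a = np.deg2rad(5.0)
    R_true = np.array([[np.cos(a), 0, np.sin(a)],
                       [0, 1, 0],
                       [-np.sin(a), 0, np.cos(a)]])
    t_true = np.array([0.2, 0.0, 0.05])

    def project(P, K):
        p = (K @ P.T).T
        return p[:, :2] / p[:, 2:]

    pts1 = project(P, K)
    pts2 = project((R_true @ P.T).T + t_true, K)

    # Discrete epipolar constraint: essential matrix, then pose recovery.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    print("recovered rotation:\n", R, "\ntranslation direction:\n", t.ravel())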
Abstract:
We present a computer vision system that couples omnidirectional vision with structured light with the aim of obtaining depth information for a 360-degree field of view. The approach proposed in this article combines an omnidirectional camera with a panoramic laser projector. The article shows how the sensor is modelled, and its accuracy is proved by means of experimental results. The proposed sensor provides useful information for robot navigation applications, pipe inspection, 3D scene modelling, etc.
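A toy triangulation sketch of the underlying structured-light idea, under strong simplifying assumptions (a calibrated viewing ray and a horizontal laser plane at known height); this is illustrative geometry, not the sensor model from the article:

    import numpy as np

    def depth_from_laser_plane(theta, h=0.30):
        """Range to a lit point for a camera at the origin and a horizontal
        laser plane at height h (metres) above it.  theta is the elevation
        angle (radians) of the viewing ray toward the lit point.  A real
        omnidirectional sensor needs a full mirror/lens calibration."""
        r = h / np.tan(theta)      # horizontal range where the ray meets the plane
        return np.hypot(r, h)      # Euclidean distance from camera to point

    print(depth_from_laser_plane(np.deg2rad(15.0)))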
Abstract:
Epipolar geometry is a key concept in computer vision, and estimating the fundamental matrix is the only way to compute it. This article surveys several methods of fundamental matrix estimation, which have been classified into linear methods, iterative methods and robust methods. All of these methods have been programmed and their accuracy analysed using real images. A summary, accompanied by experimental results, is given.
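Two of the three families the survey classifies (linear and robust; iterative refinements are not built into OpenCV) can be tried directly with cv2.findFundamentalMat. A small sketch on synthetic rectified-stereo matches; the point geometry is an assumption for illustration:

    import cv2
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic matches under a rectified-stereo geometry: same row,
    # per-point horizontal disparity.  Real matches come from a matcher.
    pts1 = rng.uniform(0, 640, size=(100, 2))
    pts2 = pts1 + np.column_stack([rng.uniform(5, 40, 100), np.zeros(100)])

    for name, flag in [("linear 8-point", cv2.FM_8POINT),
                       ("robust LMedS", cv2.FM_LMEDS),
                       ("robust RANSAC", cv2.FM_RANSAC)]:
        F, mask = cv2.findFundamentalMat(pts1, pts2, flag)
        print(name, "->\n", F)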
Abstract:
In networks with small buffers, such as networks based on optical packet switching, the convolution approach is presented as one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. With heterogeneous traffic and bufferless networks, the enhanced convolution approach is a good solution. However, both methods (CA and ECA) present a high computational cost for a large number of connections. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with a small stochastic error in the probability estimation.
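A minimal sketch of the generic Monte Carlo idea behind such proposals (the exact UMCA/ISCA algorithms are not reproduced here): estimate the overflow probability of aggregated heterogeneous on-off connections on a bufferless link by sampling instead of convolving their rate distributions; the traffic mix and capacity are hypothetical:

    import numpy as np

    rng = np.random.default_rng(0)

    # Heterogeneous on-off connections: (peak_rate, activity_probability)
    connections = [(1.0, 0.2)] * 40 + [(2.5, 0.05)] * 15   # hypothetical mix
    capacity = 20.0                                        # link capacity
    N = 100_000                                            # Monte Carlo samples

    peaks = np.array([p for p, _ in connections])
    probs = np.array([a for _, a in connections])

    # Sample the instantaneous aggregate rate; a bufferless link overflows
    # whenever the aggregate exceeds capacity.
    active = rng.random((N, len(connections))) < probs
    load = active @ peaks
    p_overflow = np.mean(load > capacity)
    err = np.sqrt(p_overflow * (1 - p_overflow) / N)   # stochastic error
    print(f"P(overflow) ~ {p_overflow:.2e} +/- {err:.1e}")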
Abstract:
In this paper we extend the reuse of paths to shooting from a moving light source. In the classical algorithm, new paths have to be cast from each new position of the light source. We show that we can reuse all paths for all positions, obtaining in this way a theoretical maximum speed-up equal to the average length of the shooting paths.
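A schematic toy of the generic sample-reuse idea underlying the method (not the paper's estimator): samples drawn once are re-evaluated for every light position, so the sampling work is amortized over all positions; the integrand f is hypothetical:

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy setting: the quantity arriving from a light at position s is the
    # integral of f(x, s) over x in [0, 1].  f is an illustrative stand-in.
    def f(x, s):
        return np.exp(-(x - s) ** 2 * 8.0)

    positions = np.linspace(0.2, 0.8, 5)   # positions of the moving light

    # Classical: draw fresh samples for every light position.
    # Reuse: draw one shared sample set and evaluate f for all positions.
    x = rng.random(10_000)                 # uniform pdf, valid for every s
    for s in positions:
        print(f"s = {s:.2f}: integral ~ {f(x, s).mean():.4f}")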
Abstract:
In this work, a new method is proposed for estimating the final product quality of batch processes in real time. The method reduces the time needed to obtain the quality results of laboratory analyses. A principal component analysis (PCA) model, built from historical data gathered under normal operating conditions, is used to decide whether a finished batch is normal or not. A fault signature is computed for abnormal batches and passed through a classification model for estimation. The study proposes a method for exploiting the information in contribution plots based on fault signatures, where the indicators represent the behaviour of the variables throughout the different stages of the process. A data set composed of the fault signatures of historical abnormal batches is built in order to find patterns and train the classification models that estimate the outcome of future batches. The proposed methodology has been applied to a sequencing batch reactor (SBR). Several classification algorithms are tested to demonstrate the possibilities of the proposed methodology.
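A compact scikit-learn sketch of the pipeline the abstract describes (PCA on normal batches, fault signatures for abnormal ones, a classifier over historical signatures); the data, the control limit and the signature definition are simplified assumptions:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    # Hypothetical unfolded batch data: rows = finished batches, columns =
    # process variables collected over the batch stages.
    X_normal = rng.normal(size=(200, 30))
    X_abnormal = rng.normal(size=(60, 30)) + rng.choice([3.0, -3.0], size=(60, 1))
    y_fault = rng.integers(0, 2, size=60)          # known outcomes of past faults

    pca = PCA(n_components=5).fit(X_normal)        # model of normal operation

    def residual(X):
        return X - pca.inverse_transform(pca.transform(X))

    spe = lambda X: np.sum(residual(X) ** 2, axis=1)   # Q statistic
    threshold = np.percentile(spe(X_normal), 99)       # simple control limit

    # Fault signature: per-variable contributions to the residual (a
    # simplified stand-in for the contribution-plot-based signatures).
    signature = lambda X: residual(X) ** 2
    clf = RandomForestClassifier(random_state=0).fit(signature(X_abnormal), y_fault)

    x_new = rng.normal(size=(1, 30)) + 3.0             # a suspicious new batch
    if spe(x_new)[0] > threshold:
        print("abnormal batch; estimated outcome:", clf.predict(signature(x_new))[0])
    else:
        print("normal batch")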
Abstract:
This project consists of an introduction to cloud computing and an in-depth study of the tools OpenNebula, within the IaaS (Infrastructure as a Service) model, and Hadoop, within the PaaS (Platform as a Service) model. The work also includes the installation, integration, configuration and start-up of a cloud computing platform using OpenNebula and Hadoop, with the aim of applying the theoretical concepts to a real solution in a laboratory environment that can be extrapolated to a production installation.
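As a flavour of the PaaS side of such a platform, a classic Hadoop Streaming word-count pair in Python; the jar path and HDFS paths in the comment are illustrative, and OpenNebula provisioning is not shown:

    #!/usr/bin/env python3
    # wordcount.py -- mapper and reducer for Hadoop Streaming.
    # Illustrative invocation (jar path depends on the installation):
    #   hadoop jar hadoop-streaming.jar -input /data/in -output /data/out \
    #       -mapper "wordcount.py map" -reducer "wordcount.py reduce"
    import sys

    def mapper():
        # Emit one (word, 1) pair per word on stdin.
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        # Input arrives sorted by key, so counts accumulate per word.
        current, count = None, 0
        for line in sys.stdin:
            word, n = line.rsplit("\t", 1)
            if word != current and current is not None:
                print(f"{current}\t{count}")
                count = 0
            current = word
            count += int(n)
        if current is not None:
            print(f"{current}\t{count}")

    if __name__ == "__main__":
        mapper() if sys.argv[1:2] == ["map"] else reducer()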
Abstract:
The emergence of open source software in recent years has become a common topic of study in different fields, from its most technical characteristics to its economic aspects. This paper reviews the current literature on the economics of open source and explores the uses, infrastructure and expectations that the retail businesses and institutions of the town of Igualada have regarding it. This qualitative case study finds that the current equipment and level of use of ICTs are low, and that the town's stores are receptive to a potential introduction of open source software.
Abstract:
With this final master's thesis we are going to contribute to the Asterisk open source project. Asterisk is an open source project that started with the main objective of developing an IP telephony platform based entirely on software (and thus not hardware-dependent) and released under an open licence such as the GPL. The project was started in 1999 by the software engineer Mark Spencer at Digium. The main motivation behind the project was that the telecommunications sector lacks open solutions: most of the available solutions are based on proprietary standards, which are closed and incompatible with one another. Behind the Asterisk project there is a company, Digium, which has led the project since it originated in its laboratories. This company has some of its employees fully dedicated to contributing to Asterisk, and it also provides the whole infrastructure required by the open source project. The business of Digium is not based on licensing products, given the open source nature of Asterisk, but on offering services around Asterisk and on designing and selling hardware components to be used with it. The Asterisk project has grown considerably since its birth, offering in its latest versions advanced call-management functionality and compatibility with hardware that was previously exclusive to proprietary solutions. Because of that, Asterisk is becoming a serious alternative to those proprietary solutions: it has reached a level of maturity that makes it very stable and, being open source, it can be fully customized to a given requirement, which may be impossible with proprietary solutions. Given the scale the project is reaching, every day more companies that develop value-added software for telephony platforms are seriously evaluating the option of making their software fully compatible with Asterisk platforms. All these factors make Asterisk a consolidated project in constant evolution, striving to offer all the functionality provided by proprietary solutions. This thesis is divided into two complementary blocks. In the first block we analyse Asterisk as an open source project and as a telephony platform (PBX). As a result of this analysis we produce a document, written in English because that is the Asterisk project's official language, which could be used by future contributors as a starting point for joining Asterisk. In the second block we make a development contribution to the Asterisk project. There are several forms the contribution could take, such as fixing bugs, developing new functionality or starting an Asterisk satellite project; the type of contribution will depend on the needs of the project at that moment.
Abstract:
A method for estimating an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double-transformation kernel estimation. We derive optimal bandwidth selection methods that yield a direct expression for the smoothing parameter. The bandwidth can adapt to the given quantile level. The procedure is useful for large data sets and improves quantile estimation for heavy-tailed distributions compared with other methods. Implementation is straightforward and R programs are available.
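A minimal sketch of the transformed kernel cdf idea on a heavy-tailed sample. It uses T = log as an illustrative single transformation and a rough rule-of-thumb bandwidth; the paper's double transformation and its closed-form optimal bandwidth are not reproduced here:

    import numpy as np
    from scipy.stats import norm

    def tk_quantile(x, q):
        """Extreme quantile via transformed kernel estimation of the cdf:
        transform the data, kernel-smooth the cdf of T(x) with a Gaussian
        kernel, invert numerically, then undo the transformation."""
        y = np.log(x)
        h = 1.06 * y.std() * len(y) ** -0.2          # rough plug-in bandwidth
        grid = np.linspace(y.min() - 4 * h, y.max() + 4 * h, 2000)
        cdf = norm.cdf((grid[:, None] - y[None, :]) / h).mean(axis=1)
        return np.exp(np.interp(q, cdf, grid))       # invert cdf, undo T

    rng = np.random.default_rng(1)
    x = rng.pareto(2.0, size=2000) + 1.0             # P(X > t) = t**-2
    print(tk_quantile(x, 0.999))                     # true value ~ 31.6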
Abstract:
This paper presents and compares two approaches for estimating the origin (upstream or downstream) of voltage sags registered in distribution substations. The first approach is based on applying a single rule to features extracted from the impedances during the fault, whereas the second exploits the variability of the waveforms from a statistical point of view. Both approaches have been tested with voltage sags registered in distribution substations, and their advantages, drawbacks and comparative results are presented.
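A schematic version of the first, impedance-rule approach. The specific rule shown (a downstream fault pulls the apparent impedance seen by the monitor down, an upstream fault does not) is a common heuristic from the sag-direction literature and an assumption here, not necessarily the paper's exact feature:

    import numpy as np

    def sag_origin(v_pre, i_pre, v_sag, i_sag):
        """Classify a voltage sag as upstream or downstream of the monitor
        from pre-fault and during-fault phasors (complex volts/amps).
        Illustrative single-rule classifier only."""
        z_pre, z_sag = v_pre / i_pre, v_sag / i_sag
        return "downstream" if abs(z_sag) < abs(z_pre) else "upstream"

    # Hypothetical phasors:
    print(sag_origin(230 + 0j, 10 + 2j, 120 + 5j, 40 + 30j))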
Abstract:
This paper focuses on the problem of locating single-phase faults in mixed distribution electric systems, with overhead lines and underground cables, using voltage and current measurements at the sending end and the sequence model of the network. Since calculating the series impedance of underground cables is not as simple as in the case of overhead lines, the paper proposes a methodology for estimating the zero-sequence impedance of underground cables from previous single-phase faults that occurred in the system, in which an electric arc appeared at the fault location. For this reason, the signal is first pre-processed to eliminate its voltage peaks, so that the analysis can work with a signal as close to a sine wave as possible.
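A back-of-the-envelope sketch of the reactance relation that underlies one-end single-phase fault location (uniform line, fundamental phasors, no fault-resistance effects). The per-km impedances and phasors are hypothetical, and the paper's handling of mixed overhead/cable sections and arc filtering is far more involved:

    import numpy as np

    def fault_distance(Va, Ia, Ires, z1, z0):
        """Simple one-end reactance method for a phase-to-ground fault.
        Va, Ia: faulted-phase voltage/current phasors at the sending end;
        Ires = 3*I0 residual current; z1, z0: positive/zero-sequence line
        impedance per km."""
        k0 = (z0 - z1) / (3 * z1)            # zero-sequence compensation
        z_apparent = Va / (Ia + k0 * Ires)
        return np.imag(z_apparent) / np.imag(z1)   # distance in km

    # Synthesize a consistent measurement for a fault at ~7 km:
    z1, z0 = 0.12 + 0.40j, 0.35 + 1.20j
    d_true, Ia, Ires = 7.0, 800 + 0j, 800 + 0j
    Va = d_true * z1 * (Ia + (z0 - z1) / (3 * z1) * Ires)
    print(fault_distance(Va, Ia, Ires, z1, z0))    # ~7.0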