61 results for exclusive dealing
Abstract:
Final degree project consisting of the construction and exploitation of a data warehouse of film-related information.
Abstract:
Final degree project (TFC) report on the creation and exploitation of a data warehouse of film festivals.
Abstract:
Oxford University's learning technologies group offers a model for effective practice in creating and using OER in research-led teaching environments, where academic practice includes dissemination of research that aids or supplements teaching but is not primarily designed as a teaching resource. The University is perceived by many people to be an exclusive institution. It is certainly unique and complex, with characteristics and traditions established over 900 years. An Oxford education offers an exciting combination of privilege and open-mindedness. The role and sustainability of open education technologies in this environment is subtle. Any strategy to effectively encourage the uptake of OERs must be informed by original thinking and reflection about the culture of the organisation. The OpenSpires project was a successful initiative to establish a sustainable set of policies and workflows that would allow departments from across the University of Oxford to regularly publish high-quality open content for global reuse.
Abstract:
R, from http://www.r-project.org/, is 'GNU S': a language and environment for statistical computing and graphics. Many classical and modern statistical techniques have been implemented in the base environment, and many more are supplied as packages. There are 8 standard packages, and many more are available through the CRAN family of Internet sites, http://cran.r-project.org. We started to develop a library of functions in R to support the analysis of mixtures, and our goal is a MixeR package for compositional data analysis that provides support for: operations on compositions (perturbation and power multiplication, subcomposition with or without residuals, centering of the data, computing Aitchison, Euclidean and Bhattacharyya distances, compositional Kullback-Leibler divergence, etc.); graphical presentation of compositions in ternary diagrams and tetrahedrons with additional features (barycenter, geometric mean of the data set, percentile lines, marking and coloring of subsets of the data set and their geometric means, annotation of individual data points in the set, …); dealing with zeros and missing values in compositional data sets, with R procedures for simple and multiplicative replacement strategies; and time series analysis of compositional data. We will present the current status of MixeR development and illustrate its use on selected data sets.
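As a rough illustration of the kind of operations listed above (not the MixeR package itself, which is an R library), the sketch below implements closure, perturbation, power multiplication and the Aitchison distance in plain NumPy; the example compositions and all function names are our own assumptions.

```python
# Minimal sketch of basic compositional operations (not MixeR).
import numpy as np

def closure(x):
    """Rescale a vector of positive parts so that they sum to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturbation(x, y):
    """Aitchison perturbation: component-wise product, re-closed."""
    return closure(closure(x) * closure(y))

def power(x, a):
    """Power multiplication: component-wise power, re-closed."""
    return closure(closure(x) ** a)

def aitchison_distance(x, y):
    """Euclidean distance between the clr transforms of two compositions."""
    lx, ly = np.log(closure(x)), np.log(closure(y))
    return np.linalg.norm((lx - lx.mean()) - (ly - ly.mean()))

x, y = [0.1, 0.3, 0.6], [0.2, 0.2, 0.6]
print(perturbation(x, y))          # a new composition summing to 1
print(power(x, 2.0))               # powered composition, also re-closed
print(aitchison_distance(x, y))    # a non-negative scalar
```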
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable tracking task. The learning system is characterized by using a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robots. In order to speed up the process, the learning phase has been carried out in a simulated environment and, in a second step, the policy has been transferred to and tested successfully on a real robot. Future work plans to continue the learning process on-line on the real robot while it performs the mentioned task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
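To make "direct policy search" concrete, here is a minimal REINFORCE-style sketch on a toy one-dimensional tracking simulation. It is not the controller used on the ICTINEU AUV; the linear-Gaussian policy, the toy dynamics and every constant are assumptions chosen only to illustrate learning in simulation before any transfer to hardware.

```python
# Toy direct policy search (REINFORCE) in a simulated 1-D tracking task.
import numpy as np

rng = np.random.default_rng(0)
theta = np.zeros(2)          # hypothetical linear policy parameters over [offset, 1]
SIGMA = 0.2                  # fixed exploration noise

def act(state):
    """Gaussian policy: action drawn around a linear function of the state."""
    feats = np.array([state, 1.0])
    mean = float(theta @ feats)
    return mean + rng.normal(scale=SIGMA), feats, mean

def simulate_episode(steps=50):
    """Toy simulator: the vehicle should drive its lateral offset to zero."""
    state, grads, rewards = float(rng.normal()), [], []
    for _ in range(steps):
        action, feats, mean = act(state)
        state = state + 0.5 * action + rng.normal(scale=0.05)
        rewards.append(-state ** 2)                          # reward: stay over the cable
        grads.append((action - mean) / SIGMA ** 2 * feats)   # gradient of log-policy
    return grads, rewards

alpha = 0.01
for _ in range(300):                                         # learning phase in simulation
    grads, rewards = simulate_episode()
    returns = np.cumsum(rewards[::-1])[::-1]                 # reward-to-go
    returns = (returns - returns.mean()) / (returns.std() + 1e-8)
    for g, G in zip(grads, returns):
        theta += alpha * g * G                               # policy-gradient update

print("learned policy parameters:", theta)
```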
Abstract:
A major obstacle to processing images of the ocean floor comes from the absorption and scattering effects of light in the aquatic environment. Due to the absorption of natural light, underwater vehicles often require artificial light sources attached to them to provide adequate illumination. Unfortunately, these flashlights tend to illuminate the scene in a nonuniform fashion and, as the vehicle moves, induce shadows in the scene. For this reason, the first step towards applying standard computer vision techniques to underwater imaging is to deal with these lighting problems. This paper analyses and compares existing methodologies to deal with low-contrast, nonuniform illumination in underwater image sequences. The reviewed techniques include: (i) study of the illumination-reflectance model, (ii) local histogram equalization, (iii) homomorphic filtering, and (iv) subtraction of the illumination field. Several experiments on real data have been conducted to compare the different approaches.
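As a rough sketch of two of the reviewed corrections (homomorphic filtering and subtraction of an estimated illumination field), the snippet below applies them to a synthetic, nonuniformly lit grayscale image. The Gaussian smoothing scale and the synthetic scene are made up; the paper's exact pipeline may differ.

```python
# Illustrative illumination-correction sketches for a float image in [0, 1].
import numpy as np
from scipy.ndimage import gaussian_filter

def homomorphic_filter(img, sigma=30):
    """Suppress the slowly varying illumination component in the log domain."""
    log_img = np.log1p(img)                         # image = illumination * reflectance
    illumination = gaussian_filter(log_img, sigma)  # low-pass estimate of illumination
    reflectance = log_img - illumination            # high-pass part (scene detail)
    out = np.expm1(reflectance - reflectance.min())
    return out / out.max()

def subtract_illumination_field(img, sigma=30):
    """Subtract a smoothed illumination estimate and renormalize the result."""
    field = gaussian_filter(img, sigma)
    out = img - field
    return (out - out.min()) / (out.max() - out.min() + 1e-8)

# Example on a synthetic scene lit by a nonuniform "flashlight" spot.
yy, xx = np.mgrid[0:256, 0:256]
spot = np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / (2 * 60.0 ** 2))
scene = (np.sin(xx / 8.0) + 1) / 2
corrected = homomorphic_filter(scene * spot)
```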
Abstract:
It is well known that image processing requires a huge amount of computation, mainly at the low-level processing stage, where the algorithms deal with a great number of pixels. One of the solutions to estimate motion involves detecting the correspondences between two images. For the normalised correlation criterion, previous experiments showed that the result is not altered in the presence of nonuniform illumination. Usually, hardware for motion estimation has been limited to simple correlation criteria. The main goal of this paper is to propose a VLSI architecture for motion estimation using a matching criterion more complex than the Sum of Absolute Differences (SAD). Today's hardware devices provide many facilities for the integration of more and more complex designs, as well as the possibility to communicate easily with general-purpose processors.
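To illustrate why a criterion more complex than SAD can be worthwhile, the following small Python sketch (ours, not the proposed VLSI design) compares SAD with zero-mean normalised correlation on a block whose illumination has been scaled and offset; block size and intensity range are arbitrary.

```python
# SAD vs. zero-mean normalised correlation under an illumination change.
import numpy as np

def sad(block_a, block_b):
    """Sum of Absolute Differences: lower means a better match."""
    return np.abs(block_a.astype(float) - block_b.astype(float)).sum()

def ncc(block_a, block_b):
    """Zero-mean normalised correlation: values near 1 mean a good match,
    regardless of uniform brightness or contrast changes."""
    a = block_a.astype(float) - block_a.mean()
    b = block_b.astype(float) - block_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return (a * b).sum() / denom

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (16, 16))
brighter = block * 1.5 + 20          # same texture under brighter, scaled illumination

print(sad(block, brighter))          # large: SAD penalises the lighting change
print(ncc(block, brighter))          # ~1.0: the normalised criterion does not
```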
Abstract:
This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at low-level processing, where the algorithms deal with a great amount of data. In a motion estimation algorithm, correspondences between two images have to be solved at the low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reduce the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach based on the parallel organisation of every processor in the architecture is proposed.
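The data parallelism behind such an architecture can be pictured in software: the correspondence search for each reference block is independent, so blocks can be handed to separate workers. The sketch below is only an analogy to the hardware scheme described in the abstract; the images, block sizes and use of a process pool are illustrative assumptions.

```python
# Software analogy of block-parallel correspondence search with normalised correlation.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12)

def best_match(args):
    """Exhaustively search a window of the second image for one reference block."""
    ref_block, search_window, block_size = args
    best, best_pos = -2.0, (0, 0)
    h, w = search_window.shape
    for y in range(h - block_size + 1):
        for x in range(w - block_size + 1):
            cand = search_window[y:y + block_size, x:x + block_size]
            score = ncc(ref_block, cand)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.random((64, 64))
    # Reference blocks cut from the image itself, so each should match its own position.
    tasks = [(img[y:y + 8, x:x + 8], img, 8) for y in (0, 16, 32) for x in (0, 16, 32)]
    with ProcessPoolExecutor() as pool:
        motions = list(pool.map(best_match, tasks))   # one block per worker task
    print(motions)
```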
Abstract:
In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources, but also the light interreflections. This kind of algorithm produces very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to such algorithms in order to make it possible to compute highly realistic images in a reasonable time. We introduce here a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, in which we have taken advantage of the hierarchical nature of such an algorithm.
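For context, the classical radiosity system that such algorithms accelerate can be written as B = E + rho * (F B) and solved by gathering iterations. The tiny NumPy sketch below shows that iteration on a made-up three-patch scene; it does not reproduce the paper's hierarchical, speculation-based parallel scheme, and the form-factor values are invented for illustration.

```python
# Jacobi/gathering iteration of the radiosity equation B = E + rho * (F @ B).
import numpy as np

E   = np.array([1.0, 0.0, 0.0])            # emitted radiosity (one light patch)
rho = np.array([0.0, 0.7, 0.5])            # diffuse reflectivity per patch
F   = np.array([[0.0, 0.4, 0.3],           # form factors F[i, j]: fraction of energy
                [0.4, 0.0, 0.3],           # leaving patch i that reaches patch j
                [0.3, 0.3, 0.0]])

B = E.copy()
for _ in range(100):
    B_new = E + rho * (F @ B)               # gather light arriving from all patches
    if np.abs(B_new - B).max() < 1e-9:
        break
    B = B_new

print(B)    # total radiosity per patch, including interreflections
```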
Abstract:
The R package "compositions" is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, now containing facilities to work with and represent ilr bases built from balances, and an elaborate subsystem for dealing with several kinds of irregular data: (rounded or structural) zeros, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a "regular" subcomposition (where all parts are actually observed and the datum behaves typically) and a "problematic" subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterization. Furthermore, the package provides special plots helping to understand the nature of outliers in the dataset. Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
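As a plain-NumPy illustration of an ilr coordinate built from a balance (not the R "compositions" package itself), the sketch below computes the scaled log-ratio between the geometric means of two groups of parts; the example composition and the chosen grouping are arbitrary assumptions.

```python
# One ilr "balance" coordinate between two groups of parts of a composition.
import numpy as np

def geometric_mean(x):
    return np.exp(np.log(x).mean())

def balance(x, num_idx, den_idx):
    """ilr balance between a numerator group and a denominator group of parts."""
    x = np.asarray(x, dtype=float)
    r, s = len(num_idx), len(den_idx)
    coef = np.sqrt(r * s / (r + s))
    return coef * np.log(geometric_mean(x[num_idx]) / geometric_mean(x[den_idx]))

comp = np.array([0.1, 0.3, 0.6])
# Balance of part 0 against parts {1, 2}; the value is invariant to closure,
# so balance(10 * comp, [0], [1, 2]) gives the same result.
print(balance(comp, [0], [1, 2]))
```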
Abstract:
The aim of this study is to analyze the different educational software tools for groups with special educational needs and disabilities that exist on the market in exclusive (proprietary) form, with their associated consulting and licensing costs, and to compare them with the tools in the repositories of the Open Source Software community, without loss of performance or efficiency. Our engineering knowledge should always help those most in need.
Abstract:
Objective: to test whether, with respect to the onset of wheezing, breastfeeding acts as a protective factor and formula feeding as an inducing factor. Material and methods: controlled, randomized, double-blind clinical trial with a control group and 8 years of follow-up, of the Spanish subsample, in its 5th year of follow-up, of the European multicentre study EU CHILDHOOD OBESITY PROGRAMME (QLK1-2001-00389). The population was divided into 3 groups: infants fed low-protein formula, infants fed high-protein formula, and a control group of breastfed infants. To assess the onset of wheezing and its evolution over time, parents were interviewed as the population reached 6 years of age, with questions referring to ages 3 and 6, and interviews were to be carried out at 8 years of age with questions referring to that same age. To assess the repercussion on lung function and evaluate the atopic basis, it was planned to perform, at 8 years of age, spirometry, prick tests with aeroallergens, determination of total serum IgE and quantification of eosinophils in peripheral blood. Possible confounding factors were assessed, such as family history of allergy-based diseases, family socioeconomic level and epidemiological environment, and other associated morbidity was studied, such as episodes of fever, vomiting, diarrhoea, atopic dermatitis, upper respiratory tract colds and medical prescription of antibiotics. Results: only 20.8% received breastfeeding. No statistically significant differences were found between the history of wheezing episodes and the type of feeding received. Nor were statistically significant differences found between the feeding received and the history of atopic dermatitis. Formula feeding was associated, with statistical significance, with a higher prescription of antibiotics and a higher incidence of diarrhoea and, without statistical significance, with an increased risk of upper respiratory tract colds. Breastfeeding was associated, with statistical significance, with a lower prescription of antibiotics. The presence of older siblings and a low level of maternal education contributed to increased morbidity during the first year of life. Alcohol consumption during pregnancy was associated with more episodes of vomiting, and tobacco consumption with more episodes of diarrhoea. Conclusions: formula feeding does not predispose to more episodes of wheezing or atopic dermatitis. Exclusive breastfeeding for at least 3 months reduces the risk of diarrhoea in the first 6 months of life and delays the onset of apparently bacterial infections requiring antibiotic treatment. Exclusive breastfeeding for a minimum of three months does not entail a substantial reduction in morbidity during the first 12 months of life.
Abstract:
Viri is a system for the automatic distribution and execution of Python code on remote machines. This is especially useful when dealing with a large group of hosts. With Viri, sysadmins can write their own scripts and easily distribute and execute them on any number of remote machines. Depending on the number of computers to administer, Viri can save thousands of hours that sysadmins would spend transferring files, logging into remote hosts, and waiting for the scripts to finish. Viri automates the whole process. Viri can also be useful for remotely managing host settings. It should work together with an application where the information about hosts is maintained. This information can include cron tasks, firewall rules, backup settings,... After a simple integration of this application with your Viri infrastructure, you can change any settings in the application and see them applied on the target hosts automatically.
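The workflow Viri automates can be pictured with a generic, deliberately naive sketch that copies a script to each host over scp and runs it over ssh. This is not Viri's own API or mechanism; the host names and the admin script are hypothetical placeholders.

```python
# Generic illustration of "distribute a script and run it on many hosts".
import subprocess

HOSTS = ["web01.example.org", "web02.example.org"]   # hypothetical inventory
SCRIPT = "cleanup_tmp.py"                            # hypothetical admin script

def run_on_host(host, script):
    """Copy the script to the host and execute it, returning its output."""
    subprocess.run(["scp", script, f"{host}:/tmp/{script}"], check=True)
    result = subprocess.run(
        ["ssh", host, f"python3 /tmp/{script}"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

for host in HOSTS:
    print(host, run_on_host(host, SCRIPT))
```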
Abstract:
The emergence of open source software in recent years has become a common topic of study in different fields, from its most technical characteristics to its economic aspects. This paper examines the current status of the literature dealing with the economics of open source and explores the uses, infrastructure and expectations of retail businesses and institutions of the town of Igualada regarding it. This qualitative case study finds that the current equipment and level of use of ICTs are low, and that the town's stores are receptive to a potential introduction of open source software.
Abstract:
With this final master thesis we are going to contribute to the Asterisk open source project. Asterisk is an open source project that started with the main objective of developing an IP telephony platform, completely software-based (so not hardware dependent) and under an open license such as the GPL. The project was started in 1999 by the software engineer Mark Spencer at Digium. The main motivation of the open source project was that the telecommunications sector lacked open solutions, and most of the available solutions were based on proprietary standards, which are closed and not compatible with each other. Behind the Asterisk project there is a company, Digium, which has led the project since it originated in its laboratories. This company has some of its employees fully dedicated to contributing to the Asterisk project, and it also provides the whole infrastructure required by the open source project. But Digium's business is not based on licensing products, due to the open source nature of Asterisk; it is based on offering services around Asterisk and on designing and selling hardware components to be used with Asterisk. The Asterisk project has grown a lot since its birth, offering in its latest versions advanced functionalities for managing calls and compatibility with hardware that was previously exclusive to proprietary solutions. Because of that, Asterisk is becoming a serious alternative to these proprietary solutions, since it has reached a level of maturity that makes it very stable. In addition, as it is open source, it can be fully customized to a given requirement, which could be impossible with proprietary solutions. Due to the size the project is reaching, every day more companies that develop value-added software for telephony platforms are seriously evaluating the option of making their software fully compatible with Asterisk platforms. All these factors make Asterisk a consolidated project, but one in constant evolution, trying to offer all the functionalities offered by proprietary solutions. This final master thesis will be divided into two complementary blocks. In the first block we will analyze Asterisk as an open source project and Asterisk as a telephony platform (PBX). As a result of this analysis we will produce a document, written in English because it is the Asterisk project's official language, which could be used by future contributors as a starting point when joining Asterisk. In the second block we will proceed with a development contribution to the Asterisk project. We will have several options for the form of the contribution, such as solving bugs, developing new functionalities or starting an Asterisk satellite project. The type of contribution will depend on the needs of the project at that moment.