998 results for "dados agregados" (aggregated data)
Abstract:
This Master of Science thesis applies DEA (Data Envelopment Analysis) to the academic performance evaluation of graduate programs in Brazil, using 2001-2003 data from Mechanical and Production Engineering programs. The data come from the national assessment carried out by CAPES, the governmental body in charge of graduate program assessment and certification. The output-oriented CCR DEA model, the output-oriented CCR model with Assurance Region, and Window Analysis are used. The main findings are, first, that the CCR model suffers from the problem of zero output weights, which is inappropriate in the sense that a graduate program can obtain the highest efficiency score while zeroing out some output (e.g., the number of academic papers published). Second, the Assurance Region method proved useful in addressing this issue. Third, Window Analysis shed light on the consistency of performance over the time frame analysed. The analysis also indicates that Mechanical and Production Engineering programs should not be assessed jointly, as currently done by CAPES, but rather each field should be assessed separately. Finally, the DEA results showed serious inconsistencies with the CAPES method: graduate programs considered excellent obtained low performance scores and vice versa. This thesis provides a strong argument for using DEA at least as a complementary methodology for graduate program performance evaluation in Brazil.
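For illustration only, a minimal sketch of one standard formulation of the output-oriented CCR model (the multiplier form), written in Python with scipy.optimize.linprog. The data, the eps lower bound on weights and the variable names are placeholders, not the CAPES dataset used in the thesis; a zero value appearing in the returned output weights u is exactly the concern raised above, and an Assurance Region would add extra ratio constraints on u and v.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_output_oriented(X, Y, j0, eps=0.0):
    """Output-oriented CCR (multiplier form) for DMU j0.

    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix.
    Solves  min v.x0  s.t.  u.y0 = 1,  u.yj - v.xj <= 0 for all j,  u, v >= eps.
    Returns the efficiency score 1 / (v.x0)* and the optimal weights (u, v)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([np.zeros(s), X[j0]])            # minimize v . x0
    A_eq = np.concatenate([Y[j0], np.zeros(m)])[None, :]
    b_eq = [1.0]                                        # normalization u . y0 = 1
    A_ub = np.hstack([Y, -X])                           # u . yj - v . xj <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(eps, None)] * (s + m))
    u, v = res.x[:s], res.x[s:]
    return 1.0 / res.fun, u, v                          # efficiency in (0, 1]

# Hypothetical example: 4 programs, 2 inputs (faculty, funding), 2 outputs (papers, theses)
X = np.array([[10, 5], [12, 6], [8, 4], [15, 9]], dtype=float)
Y = np.array([[30, 12], [28, 10], [25, 14], [40, 11]], dtype=float)
for j in range(len(X)):
    score, u, v = ccr_output_oriented(X, Y, j)
    print(f"DMU {j}: efficiency = {score:.3f}, output weights u = {np.round(u, 4)}")
```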
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
This thesis proposes a method for a mobile robot to build a hybrid map of an indoor, semi-structured environment. The topological part of this map deals with spatial relationships among rooms and corridors. It is a topology-based map in which the edges of the graph are rooms or corridors, and each link between two distinct edges represents a door. The metric part of the map consists of a set of parameters describing a geometric figure that adapts to the free space of the local environment. This figure is computed from a set of points that sample the boundaries of the local free space; these points are obtained with range sensors and with knowledge of the robot's pose. A method based on the generalized Hough transform is applied to this set of points in order to obtain the geometric figure. The building of the hybrid map is an incremental procedure, accomplished while the robot explores the environment. Each room is associated with a local metric map and, consequently, with an edge of the topological map. During the mapping procedure, the robot may use recent metric information about the environment to improve its global or relative pose.
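As a rough illustration, a hypothetical Python sketch of the kind of hybrid structure described above: a topological graph of places linked by doors, where each place also carries the parameters of its local metric figure. The class names, fields and the rectangle parameterization are assumptions; the thesis fits the figure with a generalized-Hough-based method, which is not reproduced here.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Place:
    """A room or corridor with the parameters of its local metric figure."""
    name: str
    figure_params: Tuple[float, ...]                  # e.g. centre, orientation, extents
    doors: List[str] = field(default_factory=list)    # names of neighbouring places

class HybridMap:
    """Topological graph of places; each place carries a local metric description."""
    def __init__(self):
        self.places: Dict[str, Place] = {}

    def add_place(self, name, figure_params):
        self.places[name] = Place(name, tuple(figure_params))

    def add_door(self, a, b):
        # a door links two distinct places in the topological part of the map
        self.places[a].doors.append(b)
        self.places[b].doors.append(a)

# Hypothetical incremental use while exploring (figure parameters given directly here)
m = HybridMap()
m.add_place("room_1", (0.0, 0.0, 0.0, 4.0, 3.0))       # x, y, theta, width, depth
m.add_place("corridor_1", (5.0, 0.0, 0.0, 10.0, 1.5))
m.add_door("room_1", "corridor_1")
print(m.places["room_1"].doors)                         # ['corridor_1']
```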
Abstract:
This graduate thesis proposes a model for asynchronously replicating heterogeneous databases. The model combines, in a systematic way and in a single project, different concepts, techniques and paradigms related to database replication and to the management of heterogeneous databases. One of the main advantages of replication is to allow applications to keep processing information during the intervals when they are off the network, and to trigger database synchronization as soon as the network connection is reestablished. Accordingly, the model introduces a communication and update protocol that takes into account the asynchronous characteristics of the environment. As part of the work, a tool was developed in Java, based on the model's premises, in order to exercise, test, simulate and validate the proposed model.
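A minimal sketch of the offline-queue / sync-on-reconnect behaviour described above. The class and method names are hypothetical, conflict resolution and the actual wire protocol are out of scope, and the thesis tool itself was written in Java; Python is used here only for brevity.

```python
import queue
import time

class AsyncReplicator:
    """Queue local updates while offline; flush them when the connection returns."""
    def __init__(self, send_fn, is_online_fn):
        self._pending = queue.Queue()
        self._send = send_fn            # pushes one update to the remote database
        self._is_online = is_online_fn  # returns True when the network is available

    def record_update(self, table, row_id, values):
        # every local write is logged with a timestamp for later ordering
        self._pending.put({"table": table, "id": row_id,
                           "values": values, "ts": time.time()})

    def synchronize(self):
        # called when the connection is reestablished
        sent = 0
        while self._is_online() and not self._pending.empty():
            self._send(self._pending.get())
            sent += 1
        return sent

# Hypothetical use: replicate to a remote store once connectivity is back
rep = AsyncReplicator(send_fn=print, is_online_fn=lambda: True)
rep.record_update("customers", 42, {"name": "Ana"})
print(rep.synchronize(), "update(s) replicated")
```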
Abstract:
In recent decades, changes in the telecommunications industry, allied to the competition driven by privatization and concession policies, have stimulated the world market and brought about a new reality. The effects in Brazil have become evident through significant growth rates: in 2012 the sector reached a net operating income of 128 billion dollars, placing the country among the five largest powers in mobile communications in the world. In this context, an issue of increasing importance to the financial health of companies is their ability to retain their customers and to turn them into loyal customers. Customer churn has been generating monthly disconnection rates of about two to four percent, making retention one of the biggest management challenges, since acquiring a new customer costs more than five times as much as retaining an existing one. To address this, models based on structural equation modeling have been developed to identify the relationships among the various determinants of customer loyalty in the context of services. The original contribution of this thesis is to develop a loyalty model based on the identification of relationships among determinants of satisfaction (latent variables) and on the inclusion of attributes that determine perceived service quality in the mobile communications industry, such as quality, satisfaction, value, trust, expectation and loyalty. The research will be conducted with customers of the operators through simple random sampling, using structured questionnaires. As a result, the proposed model and the statistical evaluations should allow operators to conclude that customer loyalty is directly influenced by the technical and operational quality of the services offered, as well as provide a satisfaction index for the mobile communication segment.
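As an illustration only, a hypothetical specification of such a structural model in lavaan-style syntax, fitted here with the semopy Python package. The constructs, the indicator names q1..q11 and the paths are assumptions for the sketch, not the thesis' validated model or questionnaire.

```python
import pandas as pd
import semopy  # pip install semopy

# Hypothetical measurement model (latent =~ indicators) and structural paths (~)
MODEL_DESC = """
Quality      =~ q1 + q2 + q3
Expectation  =~ q4 + q5
Satisfaction =~ q6 + q7
Trust        =~ q8 + q9
Loyalty      =~ q10 + q11

Satisfaction ~ Quality + Expectation
Trust        ~ Satisfaction
Loyalty      ~ Satisfaction + Trust
"""

def fit_loyalty_model(survey: pd.DataFrame) -> pd.DataFrame:
    """survey: one column per questionnaire item q1..q11, one row per respondent."""
    model = semopy.Model(MODEL_DESC)
    model.fit(survey)
    return model.inspect()   # estimated path coefficients and their significance
```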
Abstract:
This work presents simulation results for an identification platform compatible with the INPE Brazilian Data Collection System, modeled with SystemC-AMS. SystemC-AMS is a library of C++ classes dedicated to the simulation of heterogeneous systems, offering powerful resources to describe models in the digital, analog and RF domains, as well as mechanical and optical ones. The designed model was divided into four parts. The first block takes into account the satellite's orbit, which is necessary to correctly model the propagation channel, including the Doppler effect, attenuation and thermal noise. The identification block detects the presence of the satellite; it is composed of a low-noise amplifier, a band-pass filter, a power detector and a logic comparator. The controller block is responsible for enabling the RF transmitter when the presence of the satellite is detected; it was modeled as a Petri net, due to the asynchronous nature of the system. The fourth block is the RF transmitter unit, which performs the modulation of the information in BPSK ±60°; it is composed of an oscillator, a mixer, an adder and an amplifier. The whole system was simulated simultaneously. The results are being used to specify system components and to elaborate testbenches for design verification.
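A small numpy sketch of the ±60° phase modulation performed by the transmitter block. The carrier frequency, bit rate and sample rate below are arbitrary toy values for demonstration; this is not the SystemC-AMS model or the platform's RF specification.

```python
import numpy as np

def bpsk_60deg(bits, fc=10e3, rb=1e3, fs=200e3, dphi=np.deg2rad(60)):
    """Phase-modulate a bit stream with +/- 60 degree deviation around a carrier.

    Toy parameters (carrier, bit rate, sample rate) chosen only for illustration."""
    sps = int(fs / rb)                                    # samples per bit
    symbols = np.repeat(2 * np.asarray(bits) - 1, sps)    # bits {0,1} -> {-1,+1}
    t = np.arange(symbols.size) / fs
    return np.cos(2 * np.pi * fc * t + dphi * symbols)

signal = bpsk_60deg([1, 0, 1, 1, 0])
print(signal.shape, signal[:5])
```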
Abstract:
We propose a new approach to the reduction and abstraction of visual information for robot vision applications. Basically, we propose to use a multi-resolution representation in combination with a moving fovea to reduce the amount of information extracted from an image. We introduce the mathematical formalization of the moving fovea approach and the mapping functions that support this model. Two indexes (resolution and cost) are proposed that can be useful for choosing the model's variables. With this theoretical approach, it is possible to apply several filters, to calculate disparity and to perform motion analysis in real time (less than 33 ms to process an image pair on a notebook with an AMD Turion Dual Core at 2 GHz). As the main result, most of the time the moving fovea allows the robot to keep a region of interest visible in both images without physically moving its robotic devices. We validate the proposed model with experimental results.
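For illustration, a toy numpy sketch of a multi-resolution structure centred on a moving fovea: small windows at full resolution near the fovea, wider windows decimated to the same pixel budget further out. The number of levels, the window growth factor and the decimation scheme here are arbitrary, not the thesis' formalization or its mapping functions.

```python
import numpy as np

def foveated_levels(image, fovea, n_levels=4, base=32):
    """Return patches around `fovea` (row, col): level 0 is a small full-resolution
    window; each deeper level covers a window twice as wide but is decimated so
    that every level stores roughly the same number of pixels."""
    r, c = fovea
    levels = []
    for k in range(n_levels):
        half = base * (2 ** k) // 2                     # window half-size doubles per level
        r0, r1 = max(r - half, 0), min(r + half, image.shape[0])
        c0, c1 = max(c - half, 0), min(c + half, image.shape[1])
        patch = image[r0:r1, c0:c1]
        levels.append(patch[:: 2 ** k, :: 2 ** k])      # decimate by 2^k
    return levels

img = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
for k, lvl in enumerate(foveated_levels(img, fovea=(240, 320))):
    print(f"level {k}: covers a {32 * 2**k}px window, stored as {lvl.shape}")
```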
Abstract:
Soil compaction has been the subject of intensive research in recent years; nevertheless, the mechanisms involved in the compaction process of agricultural soils remain poorly understood. The contribution of soil aggregate size, as well as the effect of water content and of the applied normal pressure on soil compaction and precompression stress, was investigated in a eutrophic Red Nitosol (Nitossolo Vermelho eutrófico). Disturbed soil samples, composed of aggregates smaller than 2.5 mm and of aggregates from 9.3 to 19.4 mm, were submitted to drained uniaxial compression tests. The void ratio and the precompression stress were evaluated. The results show that aggregate size had an effect on the soil compaction process. The change in soil compaction can be predicted as a function of the initial state of the soil, the applied pressure and the water content.
Abstract:
Due to the large amount of television content that emerged with Digital TV, viewers face a new challenge: how to find interesting content intuitively and efficiently. Personalized Electronic Programming Guides (pEPG) arise as an answer to this complex challenge. We propose TrendTV, a layered architecture that allows the formation of social networks among viewers of Interactive Digital TV based on online microblogging. Associated with a pEPG, this social network allows the viewer to filter content on a particular subject based on the indications made by the other viewers in his network, to create his own indications for a particular piece of content while it is displayed, and to analyze the importance of a particular program online, based on these indications. Any user can thus filter content and generate or exchange information with other users in a flexible and transparent way, using several different devices (TVs, smartphones, tablets or PCs). Moreover, the architecture defines a mechanism for automatic channel switching based on the best program currently showing, suggesting new components to be added to the middleware of the Brazilian Digital TV System (Ginga). The result is a dynamic database containing the classification of several TV programs, as well as an application that automatically switches to the best channel of the moment.
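A minimal, hypothetical sketch of the "best channel of the moment" selection based on indications coming from the viewer's own network. The scoring rule (one vote per indication) and the data layout are assumptions, not the TrendTV architecture or the Ginga middleware components.

```python
from collections import Counter

def best_channel(now_showing, indications, my_network):
    """now_showing: {channel: program}; indications: list of (viewer, program);
    my_network: set of followed viewers. Picks the channel whose current program
    received the most indications from the viewer's network."""
    votes = Counter(program for viewer, program in indications if viewer in my_network)
    return max(now_showing, key=lambda ch: votes.get(now_showing[ch], 0))

now = {"ch2": "news", "ch5": "football", "ch9": "series"}
inds = [("ana", "football"), ("bob", "football"), ("ana", "series")]
print(best_channel(now, inds, my_network={"ana", "bob"}))   # -> ch5
```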
Abstract:
Digital signal processing (DSP) aims to extract specific information from digital signals. Digital signals are, by definition, physical quantities represented by sequences of discrete values, and from these sequences it is possible to extract and analyze the desired information. Unevenly sampled data, however, cannot be properly analyzed using standard DSP techniques. This work adapts a DSP technique, multiresolution analysis, to the analysis of unevenly sampled data, in order to support the studies of the CoRoT laboratory at UFRN. The process is based on re-indexing the wavelet transform so that it handles unevenly sampled data properly. The method was effective, presenting satisfactory results.
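A sketch of the re-indexing idea applied before a discrete wavelet transform, using PyWavelets: the transform is applied over the sample index rather than over physical time, while the original time stamps are kept so coefficients can still be placed in time. This is only an illustrative reading of the approach described above, not the exact adaptation developed in the work.

```python
import numpy as np
import pywt  # PyWavelets

def reindexed_wavedec(times, values, wavelet="db4", level=3):
    """Order the samples by time, apply the DWT over the sample index, and keep
    the original time stamps for mapping coefficients back to time."""
    order = np.argsort(times)
    t, v = np.asarray(times)[order], np.asarray(values)[order]
    coeffs = pywt.wavedec(v, wavelet, level=level)   # DWT over index, not time
    return t, coeffs

# Unevenly sampled toy light curve
t = np.sort(np.random.uniform(0, 10, 256))
y = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)
t_sorted, coeffs = reindexed_wavedec(t, y)
print([c.size for c in coeffs])
```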
Abstract:
Recent years have seen an increase in the acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of massively parallel processing (MPP) environments and of distributed computing. A common point between distributed systems and MPP architectures is the notion of message passing, which allows communication between processes. A message-passing environment consists basically of a communication library that acts as an extension of programming languages such as C, C++ and Fortran, allowing the development of parallel applications. A fundamental aspect of developing parallel applications is analyzing their performance. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to the number of processors or to the size of the problem instance. Establishing models or mechanisms for this analysis can be quite complicated, given the parameters and degrees of freedom involved in implementing a parallel application. One alternative is the use of tools for collecting and visualizing performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. Efficient visualization requires identifying and collecting data about the execution of the application, a stage called instrumentation. This work first presents a study of the main techniques used to collect performance data, followed by a detailed analysis of the main available tools that can be used on Beowulf-type clusters running Linux on the x86 platform with communication libraries based on MPI (Message Passing Interface), such as LAM and MPICH. The analysis is validated on parallel applications that train perceptron-type neural networks using backpropagation. The conclusions show the potential and the ease of use of the analyzed tools.
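For illustration, a minimal hand-instrumentation of a message-passing program with mpi4py timers, the kind of raw per-process data that the collection and visualization tools discussed above gather automatically. The workload and the output format are placeholders.

```python
# run with: mpiexec -n 4 python timing_demo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

data = np.random.rand(1_000_000)

t0 = MPI.Wtime()
local_sum = data.sum()                 # "computation" phase
t1 = MPI.Wtime()
total = comm.allreduce(local_sum)      # "communication" phase
t2 = MPI.Wtime()

# each rank reports its own timings; a visualization tool would plot these per process
print(f"rank {rank}/{size}: compute={t1 - t0:.6f}s  communicate={t2 - t1:.6f}s  sum={total:.3f}")
```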
Abstract:
Self-organizing maps (SOM) are artificial neural networks widely used in the data mining field, mainly because they constitute a dimensionality reduction technique through the fixed grid of neurons associated with the network. In order to properly partition and visualize the SOM network, the various methods available in the literature must be applied in a post-processing stage, which consists of inferring, through the neurons, relevant characteristics of the data set. In general, applying such processing to the network neurons, instead of to the entire database, reduces the computational cost thanks to vector quantization. This work proposes a post-processing of the SOM neurons in the input and output spaces, combining visualization techniques with algorithms based on gravitational forces and on the search for the shortest path with the greatest reward. These methods take into account the connection strength between neighbouring neurons and characteristics of pattern density and of the distances among neurons, both associated with the positions the neurons occupy in the data space after training the network. The goal is thus to define more clearly the arrangement of the clusters present in the data. Experiments were carried out to evaluate the proposed methods using various artificially generated data sets, as well as real-world data sets, and the results obtained were compared with those of a number of well-known methods from the literature.
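Not the gravitational or shortest-path methods proposed in the work, but a baseline U-matrix-style computation of the distances between neighbouring SOM neurons, the kind of neuron-level post-processing the text refers to. The grid shape and the random codebook below are placeholders for a trained network.

```python
import numpy as np

def umatrix(weights):
    """weights: (rows, cols, dim) SOM codebook. Returns a (rows, cols) map where
    each cell is the mean distance between a neuron and its 4-connected neighbours;
    high values suggest cluster borders in the input space."""
    rows, cols, _ = weights.shape
    u = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    dists.append(np.linalg.norm(weights[i, j] - weights[ni, nj]))
            u[i, j] = np.mean(dists)
    return u

codebook = np.random.rand(10, 10, 4)      # placeholder for a trained 10x10 SOM
print(umatrix(codebook).round(2))
```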
Abstract:
The control of industrial processes has become increasingly complex due to the variety of factory devices, quality requirements and market competition. Such complexity requires a large amount of data to be handled by the three levels of process control: field devices, control systems and management software. Using data effectively at each of these levels is extremely important to industry. Many of today's industrial computer systems consist of distributed software systems written in a wide variety of programming languages and developed for specific platforms, so more and more companies make significant investments to maintain or even rewrite their systems for different platforms. Furthermore, it is rare for a software system to work in complete isolation; in industrial automation it is common for software to have to interact with other systems, on different machines and even written in different languages. Thus, interoperability is not just a long-term challenge but a current requirement of industrial software production. This work proposes a middleware solution for communication over web services and presents a use case applying the developed solution to an integrated system for industrial data capture, allowing such data to be made available in a simplified, platform-independent way across the network.
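A minimal sketch of a web-service endpoint that exposes captured plant-floor data in a platform-independent format (JSON over HTTP), written with Flask. The endpoint names and the in-memory store are assumptions for the sketch; the actual middleware described above may use a different web-service stack.

```python
# pip install flask
from flask import Flask, jsonify, request

app = Flask(__name__)
readings = []   # in-memory stand-in for the industrial data capture store

@app.route("/readings", methods=["POST"])
def add_reading():
    # field devices (or their gateways) push measurements as JSON
    readings.append(request.get_json(force=True))
    return jsonify(status="stored", count=len(readings)), 201

@app.route("/readings", methods=["GET"])
def list_readings():
    # management software on any platform reads the same data back
    return jsonify(readings)

if __name__ == "__main__":
    app.run(port=8080)
```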
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In order to evaluate the effect of soil management on the aggregate stability of a dystroferric Red Nitosol (Nitossolo Vermelho distroférrico) located at the Lageado Experimental Farm of FCA/UNESP, in Botucatu - SP, three soil management systems were sampled in October 2001: (i) forest (MA), (ii) conventional tillage for 10 years followed by no-tillage for 12 years (PC/SD), and (iii) conventional tillage for 22 years (PC), in four layers: 0.0-0.10, 0.10-0.20, 0.20-0.30 and 0.30-0.40 m. The experimental design was completely randomized, with three replications. Sampling was carried out after the maize crop (2000-2001 season). The samples were submitted to physical and chemical analyses, and the means were compared by the Tukey test. The mean weight diameter of the aggregates (DMP), the aggregate stability index (IEA) and the percentage of aggregates per mean diameter class were obtained from the wet-sieving results. The mean weight diameter and the aggregate stability index were lower under conventional tillage. The three management systems presented a higher percentage of aggregates with diameters between 7.93 and 2.00 mm. Replacing conventional tillage with no-tillage favored soil aggregate stability. The mean weight diameter, the aggregate stability index and the percentage of aggregates per mean diameter class showed differences among the soil management systems.
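By way of illustration, the usual wet-sieving computation of the mean weight diameter (DMP), assuming the standard definition MWD = sum over sieve classes of (class mid-point diameter x mass fraction retained). The sieve classes and retained masses below are placeholders, not the study's data, and the aggregate stability index (IEA) is not reproduced here.

```python
def mean_weight_diameter(class_limits_mm, class_masses_g):
    """MWD = sum of (mid-point diameter of each sieve class * mass fraction retained).
    class_limits_mm: (upper, lower) limits of each class; class_masses_g: dry mass of
    stable aggregates retained in each class after wet sieving."""
    total = sum(class_masses_g)
    mids = [(hi + lo) / 2 for hi, lo in class_limits_mm]
    return sum(x * (m / total) for x, m in zip(mids, class_masses_g))

# Placeholder sieve classes (mm) and retained masses (g)
classes = [(7.93, 2.00), (2.00, 1.00), (1.00, 0.50), (0.50, 0.25), (0.25, 0.053)]
masses = [18.0, 4.0, 2.0, 1.0, 0.5]
print(f"DMP = {mean_weight_diameter(classes, masses):.2f} mm")
```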