135 results for Ciência da informação - Pós-graduação


Relevância:

100.00%

Publicador:

Resumo:

This research, carried out in the field of metaphysics, gathers presuppositions for an ontological grounding of information technology based on the philosophy of Martin Heidegger, primarily on the existential analytic of Dasein in Being and Time (Être et Temps). Starting from a reflection on what is today, it investigates on which foundations the new technology was erected in such a way that we are engaged in the project of digitization of beings which, at the same time as it destines man to the forgetting of Being, offers him the possibility of transformation. The relation between the question of Being and the question of technology is analyzed as crossing paths, and at this crossroads it becomes possible to think what technology is, what information is for Heidegger, and in what way the existential modes of Dasein are suited to characterizing man within information technology. Through this appropriation, it remains to think how the opening of a perspective of leading man back to the truth of Being is possible. Finally, the structuring of the foundations makes the general discursive reflection possible: what we occupy ourselves with, how we are, and in which direction we are heading, the respective general themes of the three chapters. The points of investigation of the first chapter are: a) the precise characterization of Dasein, supported by considerations from Benedito Nunes, Hans-Georg Gadamer, Jacques Derrida and Rüdiger Safranski; b) the concept of technology and its essence in Heidegger; c) the distinction between technique and technology, supported by the thought of J. Ellul, Michel Séris, Otto Pöggeler, Michel Haar and Dominique Janicaud; d) the concept of cybernetics in Heidegger and in Norbert Wiener; e) the preliminary characterization of information, its etymological and philosophical analysis, Heidegger's view and the theories of Rafael Capurro; f) the analysis of the phenomenon of the digitization of beings, considerations from Virilio, and the analysis of a concept of the virtual with Henri Bergson and Gilles Deleuze. The second chapter analyzes the existentials of Dasein toward a summary of the basic foundations for characterizing information technology as a philosophical problem. Finally, after presenting the introductory concepts that delimit the questioning, followed by the ontological indications and presuppositions found in Being and Time, the third chapter discusses the danger, that which saves, and releasement, the three key words of Heidegger's thinking on technology that allow a concluding approach to the question

Relevância:

100.00%

Publicador:

Resumo:

In The Gay Science (A gaia ciência), Nietzsche approaches science, specifically its purpose, from a critical perspective, pointing to the need for scientific knowledge that is more human and less mechanistic, a science closer to art which, instead of discovering truths, occupies itself with creating new values, visions and perspectives. Throughout the work, the philosopher's intention becomes evident: not only to criticize science, but also to conceive a new method, a new science, a gaya scienza, in allusion to the art of the medieval troubadours. The aim is thereby to unite life and knowledge as constituent parts of a single process, making the pursuit of knowledge not merely a pastime or profession but, above all, a way of life. Nietzsche longs for a science that does not fit into the categories of true or false, for its value differs from the value of truth: its highest value is the maximum potentiality of life in all its aspects

Relevância:

100.00%

Publicador:

Resumo:

This work explores the impact that Thomas Kuhn's philosophical work had on the philosophy of science, especially on the received idea of scientific rationality, and seeks to make clear the author's position on what it is to be rational in science. To achieve this goal, we begin with a panoramic view of the philosophical-scientific scene of the first half of the twentieth century, in order to bring out the main features of the concept of rationality most widely accepted in Kuhn's time. We then show how Kuhn's ideas contrast with that concept, which gave rise to a series of accusations of irrationalism. Lastly, we show how Kuhn circumvents these accusations by pointing to a new concept of rationality, through which his philosophy can be reconciled with a description of the rational development of science

Relevância:

100.00%

Publicador:

Resumo:

Among the numerous policy changes the world has experienced in recent years, the quest for greater transparency in public agencies occupies a prominent place. Transparency has become an important accountability tool, promoting greater participation by society through the provision of information that was previously restricted to the public agencies themselves. Following this trend, Brazil enacted the Access to Information Act in May 2012, which seeks to disclose the actions of the State at all levels and in all public administration agencies. On the day the law came into force, society was provided with a website enabling citizens to submit information requests to government agencies. At that time, however, the Federal University of Rio Grande do Norte had no tool to help it manage this demand. This project describes, builds and implements a solution to this problem, using Design Science Research as its methodology. As a result, the solution built in this research became a new module of the institution's ERP system, capable of controlling the entire process, and it will also be helpful to the other partner institutions that use the same ERP system

Relevância:

100.00%

Publicador:

Resumo:

Information technology (IT) has, over the years, gained prominence as a strategic element and competitive advantage in organizations, both public and private. In the judiciary, with the implementation of actions related to the Judiciário Eletrônico (electronic judiciary), IT definitively earned its status as a strategic element and significantly raised the level of dependence of these bodies on its services and products. Increasingly, the quality of the services provided by IT has a direct impact on the quality of the services provided by the agency as a whole. At the Ministério Público do Estado do Rio Grande do Norte (MPRN), the deployment of electronic government initiatives, together with an administrative reform, caused a large increase in institutional demand for the products and services provided by the Diretoria de Tecnologia da Informação (DTI), the sector responsible for IT services. Taking as a starting point the strategic goal set by the MPRN of reaching an 85% level of user satisfaction within four years, we propose a method that helps meet this goal while respecting the capacity constraints of the IT sector. To achieve the proposed objective, the work was conducted in two distinct and complementary stages. In the first stage we conducted a case study at the MPRN in which, through an internal and external diagnosis of the DTI, carried out via an internal consulting action and a user-satisfaction survey, we sought to identify opportunities for change aimed at raising the perceived quality of the services provided by the DTI from the viewpoint of its customers. The situational report drawn from the collected data fostered changes in the DTI, which were then evaluated with the managers. 
In the second stage, building on the results of the initial process, empirical observation, the evaluation of parallel quality-improvement projects in the sector, and validation of the initial model with the managers, we developed an improved process that goes beyond identifying service gaps to include a strategy for selecting best management practices and deploying them in an incremental and adaptive way, allowing the process to be applied in organizations with few staff allocated to the provision of information technology services

Relevância:

100.00%

Publicador:

Resumo:

Introducing content on individual health care is important from school education onward. In this direction, the present study aimed to assess the effect of an oral health education intervention on the oral hygiene and on the information level of schoolchildren in the 4th and 5th years of basic education. The study comprised two randomly chosen groups: a control group (n = 115) and an experimental group (n = 132), for a total sample of 247 public-school children. The experimental group participated in educational activities on oral health, delivered by a dental surgeon twice a month over a period of four months. Both groups underwent a clinical examination for prior assessment of the Plaque Index (PI) and the Löe-Silness Gingival Index (GI). A questionnaire with closed questions on oral health was applied before and after the intervention to measure the children's rate of correct answers. After the intervention, the final data, represented by the PI, the GI and the correct-answer rate, were collected and analyzed statistically with the chi-square test at a 95% confidence level, using the SPSS 10.0 software. The PI and GI were categorized as high or low relative to the mean of the initial indices, while the correct-answer rate was categorized as inadequate (< 50%) or adequate (≥ 50%). The PI (p = 0.014; CI 0.24-0.86) and the GI (p = 0.013; CI 0.28-0.84) showed statistically significant differences after the educational activities, favoring the experimental group when compared with the control group. The experimental group also achieved a higher rate of correct answers, a highly significant difference (p < 0.0001; CI 3.73-26.81). No association was observed between the oral hygiene indicators and the children's information level. 
Given these results, it can be concluded that educational activities embedded in the school routine produced positive effects on oral hygiene control and on the level of information about oral health; however, the individual with more information is not necessarily the one who practices better oral hygiene. It is nevertheless necessary that oral health education take place in a permanent way, integrated with the other actors in the school, so that the positive effects are not lost over the student's life in the long term
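The group comparisons above rest on the chi-square test of independence for 2x2 tables (group vs. high/low index). A minimal sketch of that statistic in Python; the counts below are illustrative, not the study's actual data.

```python
# Chi-square statistic for a 2x2 contingency table [[a, b], [c, d]],
# using the closed-form expression n(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).

def chi_square_2x2(a, b, c, d):
    """Return the chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: experimental / control group; columns: low / high plaque index.
# Invented counts for illustration (row totals match the study's group sizes).
stat = chi_square_2x2(80, 52, 55, 60)
# Compare against the critical value 3.841 (chi-square, 1 df, alpha = 0.05).
print(f"chi2 = {stat:.3f}, significant at 5%: {stat > 3.841}")
```

A statistic above 3.841 corresponds to p < 0.05 with one degree of freedom, the decision rule implied by the 95% confidence level reported in the abstract.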

Relevância:

100.00%

Publicador:

Resumo:

This study deals with cognitive competences and abilities relevant to selection and education in Information Technology (IT): competences related to problem solving, decision making and practical intelligence, which involve the mobilization of school and extracurricular knowledge. The research aimed to contribute to the improvement of a selection instrument, consisting of five skill matrices (covering objectives and prospection), and to the development and comprehension of the skills involved in IT education. This is done through an analysis of the selection instrument used in the first selection process of the Instituto Metrópole Digital, an institute of the Federal University of Rio Grande do Norte in Brazil, evaluated with respect to its IT training program (basic training with emphases in Web programming and electronics). The methodology was quantitative, based on performance scores throughout the training. An ANOVA was carried out together with descriptive analyses of socioeconomic data; no meaningful relation was observed between parental education and student performance in the course. These analyses pointed out the importance of and need for vacancy-reservation policies on behalf of public-school students. A Spearman correlation analysis was performed between performance on the selection instrument and performance in the training course; the instrument proved to be a moderately significant predictor of good performance in the course as a whole. Cluster and regression analyses were also carried out: the cluster analysis identified performance groups ranging from average to inferior, and the regression analysis pointed out associations between the criterion variables (average performance in the basic and advanced modules) and the explanatory variables (the five matrices). 
The regression analysis indicated matrices 1 and 3 as the strongest predictors. In all of the above analyses, the correlation between the instrument and the course was moderate, which can be related to aspects of the course itself, such as the emphasis of its evaluation on technical content and practical (educational) skills, as opposed to the competences and skills measured at selection. It is known that the mediation of technological artifacts in a cultural context can foster the development of skills and abilities relevant to IT training. This study provides input for reflecting on the adoption of the selection instrument and on IT training at the Institute, offering means for an interdisciplinary discussion that enriches areas such as Psychology and Information Technology with regard to the competences and skills relevant to IT training
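The predictor relation mentioned above is quantified by the Spearman rank correlation: the Pearson correlation computed over ranks rather than raw scores. A pure-Python sketch with invented scores; the study's real data are not reproduced here.

```python
# Spearman rank correlation: rank both variables (averaging ranks over ties),
# then compute the Pearson correlation of the ranks.

def ranks(values):
    """Rank values from 1 upward, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

selection = [55, 70, 62, 80, 47, 90]        # hypothetical instrument scores
course = [6.0, 7.5, 5.8, 7.9, 6.5, 8.6]     # hypothetical course grades
print(f"rho = {spearman(selection, course):.3f}")
```

A rho around 0.5-0.8 is the kind of "moderate predictor" relation the abstract describes; rho = 1 would mean the selection ranking perfectly predicted the course ranking.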

Relevância:

100.00%

Publicador:

Resumo:

In this dissertation we present some generalizations of the concept of distance that use more general value spaces, such as fuzzy metrics, probabilistic metrics and generalized metrics. We show how such generalizations may be useful, since the distance between two objects can then carry more information about the objects than when the distance is represented by a single real number. We also propose another generalization of distance, which encompasses the notion of interval metric and generates a topology in a natural way. Several properties of this generalization are investigated, as well as its links with other existing generalizations
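One way to see how a richer value space carries more information: let the "distance" between two intervals be itself an interval, the range of all realizable point distances. This is only an illustration of the idea, not the dissertation's exact definition.

```python
# Interval-valued distance between two real intervals a = (a1, a2) and
# b = (b1, b2): the interval [d_min, d_max] of possible values of |x - y|
# for x in a and y in b. A single real number would lose the spread.

def interval_distance(a, b):
    """Interval of possible |x - y| for x in a, y in b."""
    a1, a2 = a
    b1, b2 = b
    d_min = max(0.0, max(a1, b1) - min(a2, b2))  # 0 when the intervals overlap
    d_max = max(abs(a1 - b2), abs(a2 - b1))
    return (d_min, d_max)

print(interval_distance((0, 2), (1, 5)))  # overlapping -> (0.0, 5)
print(interval_distance((0, 1), (3, 4)))  # disjoint    -> (2, 4)
```

The lower endpoint is the classical gap between the sets, the upper endpoint the worst-case separation; collapsing the pair to either endpoint alone discards information about the objects.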

Relevância:

100.00%

Publicador:

Resumo:

The Quadratic Minimum Spanning Tree Problem (QMST) is a version of the Minimum Spanning Tree Problem in which, besides the traditional linear costs, there is a quadratic cost structure modeling interaction effects between pairs of edges. Linear and quadratic costs are added up to constitute the total cost of the spanning tree, which must be minimized. When these interactions are restricted to adjacent edges, the problem is called the Adjacent Only Quadratic Minimum Spanning Tree Problem (AQMST). AQMST and QMST are NP-hard problems that model several problems of transport and distribution network design, and AQMST generally arises as the more suitable model for real problems. Although in the literature the linear and quadratic costs are added together, in real applications they may be conflicting, in which case it may be interesting to consider these costs separately. In this sense, Multiobjective Optimization provides a more realistic model for QMST and AQMST. A review of the state of the art found no papers addressing these problems from a biobjective point of view. Thus, the objective of this thesis is the development of exact and heuristic algorithms for the Biobjective Adjacent Only Quadratic Spanning Tree Problem (bi-AQST). As theoretical foundation, other NP-hard problems directly related to bi-AQST are discussed: QMST and AQMST. Backtracking and branch-and-bound exact algorithms are proposed for the target problem of this investigation. The heuristic algorithms developed are: Pareto Local Search, Tabu Search with ejection chains, a Transgenetic Algorithm, NSGA-II and a hybridization of the last two, called NSTA. The proposed algorithms are compared with each other through performance analyses based on computational experiments with instances adapted from the QMST literature. For the exact algorithms, the analysis considers, in particular, the execution time. 
For the heuristic algorithms, besides execution time, the quality of the generated approximation sets is evaluated, using quality indicators to assess this information and appropriate statistical tools to measure performance. Considering the set of instances adopted and the criteria of execution time and approximation-set quality, the experiments showed that the Tabu Search with ejection chains obtained the best results and the transgenetic algorithm ranked second. The PLS algorithm obtained good-quality solutions, but at a very high computational cost compared to the other (meta)heuristics, placing third. The NSTA and NSGA-II algorithms took the last positions
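When linear and quadratic costs are kept as separate objectives, candidate trees are compared by Pareto dominance, and the algorithms above return approximation sets of mutually nondominated solutions. A minimal sketch of the dominance test and front filter for minimization, with invented (linear, quadratic) cost pairs:

```python
# Pareto dominance for biobjective minimization: p dominates q when p is
# no worse in both objectives and strictly better in at least one.

def dominates(p, q):
    """True if p dominates q (<= in every objective, < in at least one)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def nondominated(points):
    """Keep only the points not dominated by any other (the Pareto front)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Each pair is (linear cost, quadratic cost) of a hypothetical spanning tree.
costs = [(10, 40), (12, 30), (15, 25), (11, 42), (15, 45)]
print(nondominated(costs))  # -> [(10, 40), (12, 30), (15, 25)]
```

The quality indicators mentioned in the abstract (e.g. hypervolume, epsilon indicator) compare exactly such fronts produced by the different heuristics.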

Relevância:

100.00%

Publicador:

Resumo:

Context-aware applications are typically dynamic and use services provided by several sources with different quality levels. Context information quality is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment and resolution, while service quality is expressed via Quality of Service (QoS) metadata such as response time, availability and error rate. In order to ensure that an application is using services and context information that meet its requirements, it is essential to continuously monitor this metadata. For this purpose, a QoS and QoC monitoring mechanism is needed that meets the following requirements: (i) it supports measurement and monitoring of QoS and QoC metadata; (ii) it supports synchronous and asynchronous operation, enabling the application both to periodically gather the monitored metadata and to be asynchronously notified whenever a given metadata item becomes available; (iii) it uses ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for QoS and QoC metadata monitoring that meets the above requirements, and discusses its architecture and implementation. To support asynchronous communication QoMonitor uses two protocols: JMS and Light-PubSubHubbub. To illustrate the use of QoMonitor in the development of ubiquitous applications, it was integrated into OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context-provision middleware systems. To validate QoMonitor we used two applications as proof of concept: an oil and gas monitoring application and a healthcare application. This work also presents a validation of QoMonitor in terms of performance for both synchronous and asynchronous requests
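The two access styles required of the monitor, synchronous polling and asynchronous notification, can be sketched as a tiny publish/subscribe registry. The class and method names below are invented for illustration; QoMonitor's real interface (built on JMS and Light-PubSubHubbub) is not reproduced here.

```python
# Sketch of a QoS/QoC metadata monitor supporting both access styles:
# poll() for synchronous reads of the latest value, subscribe() for
# asynchronous callbacks on every update.

class MetadataMonitor:
    def __init__(self):
        self._latest = {}       # metric name -> last observed value
        self._subscribers = {}  # metric name -> list of callbacks

    def publish(self, metric, value):
        """Record a new QoS/QoC measurement and notify subscribers."""
        self._latest[metric] = value
        for callback in self._subscribers.get(metric, []):
            callback(metric, value)

    def poll(self, metric):
        """Synchronous access: return the last observed value (or None)."""
        return self._latest.get(metric)

    def subscribe(self, metric, callback):
        """Asynchronous access: invoke callback on every update of metric."""
        self._subscribers.setdefault(metric, []).append(callback)

monitor = MetadataMonitor()
events = []
monitor.subscribe("response_time_ms", lambda m, v: events.append((m, v)))
monitor.publish("response_time_ms", 120)
print(monitor.poll("response_time_ms"))  # -> 120
print(events)                            # -> [('response_time_ms', 120)]
```

In the real module the callback path would cross a messaging protocol rather than an in-process call, but the synchronous/asynchronous split is the same.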

Relevância:

100.00%

Publicador:

Resumo:

In this work we propose a technique that uses uncontrolled small-format aerial images (SFAIs) and stereophotogrammetry techniques to construct georeferenced mosaics. Images are obtained using a simple digital camera coupled to a radio-controlled (RC) helicopter. Techniques for removing common distortions are applied, and the relative orientation of the models is recovered using projective geometry. Ground-truth points are used to obtain the absolute orientation, along with a definition of scale and a coordinate system that relates image measures to the ground. The mosaic is read into a GIS system, providing useful information to different types of users, such as researchers, governmental agencies and their employees, fishermen and tourism enterprises. Results are reported, illustrating the applicability of the system. The main contribution is the generation of georeferenced mosaics using SFAIs, which have not yet been broadly explored in cartography projects. The proposed architecture is a viable and much less expensive solution compared to systems using controlled pictures
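Absolute orientation ties image coordinates to ground coordinates via control points. A deliberately toy version of that idea, assuming no rotation, so two ground control points fix a per-axis scale and translation from pixel (col, row) to ground (x, y); all coordinates are invented. The real pipeline solves a full projective/photogrammetric model.

```python
# Map pixel coordinates to ground coordinates from two control points,
# under the simplifying assumption of an axis-aligned (no-rotation) fit.

def fit_axis(p0, p1, g0, g1):
    """Scale and offset mapping pixel coordinate p to ground coordinate g."""
    scale = (g1 - g0) / (p1 - p0)
    return scale, g0 - scale * p0

def georeference(pix_a, pix_b, gnd_a, gnd_b):
    """Return a function (col, row) -> (x, y) fitted to two control points."""
    sx, tx = fit_axis(pix_a[0], pix_b[0], gnd_a[0], gnd_b[0])
    sy, ty = fit_axis(pix_a[1], pix_b[1], gnd_a[1], gnd_b[1])
    return lambda col, row: (sx * col + tx, sy * row + ty)

# Control points: pixel (100, 200) lies at ground (5000, 9000) and pixel
# (900, 600) at ground (5400, 8800); ground y decreases as image rows grow.
to_ground = georeference((100, 200), (900, 600), (5000, 9000), (5400, 8800))
print(to_ground(500, 400))  # -> (5200.0, 8900.0)
```

With more control points the same fit would be done by least squares over a similarity or projective transform, which is what makes the mosaic usable inside a GIS.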

Relevância:

100.00%

Publicador:

Resumo:

Interval arithmetic, well known as Moore arithmetic, does not possess the same properties as the real numbers, and for this reason it is confronted with a problem of an operative nature when we want to solve interval equations as extensions of real equations using the usual equality and interval arithmetic: it has no additive inverse, and the distributivity of multiplication over addition does not hold for every triple of intervals. The lack of these properties prevents the use of equational logic, both for solving an interval equation and for representing a real equation, as well as for the algebraic verification of properties of a computational system whose data are real numbers represented by intervals. However, with the notions of information order and approximation on intervals introduced by Acióly [6] in 1991, the idea appears of an interval equation representing a real equation satisfactorily, since the terms of the interval equation carry the information about the solution of the real equation. In 1999, Santiago proposed the notion of simple equality and, later, of local equality for intervals [8], [33]. Based on that idea, this dissertation extends Santiago's local groups to local algebras, following the idea of Σ-algebras according to (Hennessy [31], 1988) and (Santiago [7], 1995). One of the contributions of this dissertation is Theorem 5.1.3.2, which guarantees that, when a local Σ-equation between terms t and t′ is deduced in the proposed system SDedLoc(E), the interpretations of t and t′ will be locally equal in any local Σ-algebra A that satisfies the fixed set of local equations E, whenever t and t′ have meaning in A. This assures a kind of soundness between the local equational logic and the local algebras
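The two failures mentioned above are easy to exhibit concretely. A minimal sketch of Moore's interval operations, showing that X - X is not [0, 0] (no additive inverse) and that multiplication only subdistributes over addition:

```python
# Moore interval arithmetic on pairs (lo, hi).

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def sub(x, y):
    return (x[0] - y[1], x[1] - y[0])

def mul(x, y):
    p = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return (min(p), max(p))

X = (1, 2)
print(sub(X, X))  # -> (-1, 1), not (0, 0): no additive inverse

A, B, C = (1, 2), (-1, 1), (2, 3)
lhs = mul(A, add(B, C))                 # A * (B + C)
rhs = add(mul(A, B), mul(A, C))        # A*B + A*C
print(lhs, rhs)  # -> (1, 8) (0, 8): lhs is strictly contained in rhs
```

Because X - X only contains 0 rather than being 0, and A(B + C) ⊆ AB + AC holds only as an inclusion, the usual equational manipulations of real algebra are unsound over intervals, which is precisely what motivates the local equality approach.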

Relevância:

100.00%

Publicador:

Resumo:

Nowadays several electronic devices support digital video: cellphones, digital cameras, video cameras and digital televisions are some examples. However, raw video represents a huge amount of data, millions of bits, when kept in the form in which it was captured; storing it in this primary form would require enormous disk space, and transmitting it would require enormous bandwidth. Video compression therefore becomes essential to make the storage and transmission of this information possible. Motion estimation is a technique used in the video coder that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion-estimation module for high-resolution video according to the H.264/AVC standard, the most advanced video coding standard, whose many new features allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the data-reuse scheme adopted reduces the bandwidth required to execute motion estimation. Since motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, this module is essential for the final video coder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System
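At its core, motion estimation is block matching: for each block of the current frame, search the reference frame for the displacement that minimizes a cost, typically the sum of absolute differences (SAD). A tiny full-search sketch on invented grayscale frames; real H.264/AVC encoders work on 16x16 macroblocks and their partitions, with sub-pixel refinement, which is what the hardware architecture accelerates.

```python
# Full-search block matching with the SAD criterion.

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
                          for a, b in zip(ra, rb))

def block(frame, top, left, size):
    return [row[left:left + size] for row in frame[top:top + size]]

def best_motion_vector(ref, cur, top, left, size, search):
    """Exhaustively search ref for the block of cur at (top, left)."""
    target = block(cur, top, left, size)
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if 0 <= y <= len(ref) - size and 0 <= x <= len(ref[0]) - size:
                cost = sad(block(ref, y, x, size), target)
                if best is None or cost < best[0]:
                    best = (cost, (dy, dx))
    return best  # (minimal SAD, motion vector)

# Reference frame with a bright 2x2 patch; in the current frame the patch
# has moved one pixel right and one pixel down.
ref = [[0] * 6 for _ in range(6)]
ref[1][1] = ref[1][2] = ref[2][1] = ref[2][2] = 200
cur = [[0] * 6 for _ in range(6)]
cur[2][2] = cur[2][3] = cur[3][2] = cur[3][3] = 200
print(best_motion_vector(ref, cur, 2, 2, 2, 2))  # -> (0, (-1, -1))
```

The nested search over (dy, dx) is exactly the memory-traffic hot spot: neighboring candidate blocks overlap heavily, which is why a data-reuse scheme can cut the required bandwidth so sharply.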

Relevância:

100.00%

Publicador:

Resumo:

The World Wide Web has been consolidated over the last years as a standard platform for providing software systems on the Internet. Nowadays, a great variety of user applications are available on the Web, from corporate and banking applications to electronic commerce and government services. Given the quantity of information available and the number of users dealing with these services, many Web systems have sought to present usage recommendations as part of their functionality, so that users can make better use of the available services based on their profile, navigation history and system use. In this context, this dissertation proposes the development of an agent-based framework that offers recommendations to users of Web systems. The work involves the conception, design and implementation of an object-oriented framework whose agents can be plugged into or unplugged from existing Web applications in a non-invasive way, using aspect-oriented techniques. The framework is evaluated through its instantiation in three different Web systems
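The kind of recommendation such agents could produce can be sketched as content-based scoring: rank items by overlap between their tags and the interests implied by the user's profile and navigation history. Data and scoring are purely illustrative; the framework itself leaves the recommendation strategy to the plugged-in agents.

```python
# Content-based recommendation: score unvisited items by tag overlap with
# the user's explicit profile plus the tags of already-visited items.

def recommend(profile, history, catalog, top_n=2):
    interests = set(profile)
    for item in history:                     # fold visited items' tags in
        interests |= catalog.get(item, set())
    scored = [(len(interests & tags), item)
              for item, tags in catalog.items() if item not in history]
    scored.sort(key=lambda s: (-s[0], s[1]))  # best score first, then name
    return [item for score, item in scored[:top_n] if score > 0]

catalog = {
    "loan-simulator": {"banking", "finance"},
    "tax-guide":      {"government", "finance"},
    "store-locator":  {"commerce"},
}
print(recommend(profile={"finance"}, history={"tax-guide"}, catalog=catalog))
# -> ['loan-simulator']
```

In the framework, such logic would live inside an agent woven into the host application by an aspect, so the application code itself never calls the recommender directly.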

Relevância:

100.00%

Publicador:

Resumo:

The segmentation of an image aims to subdivide it into constituent regions or objects that have some relevant semantic content. This subdivision can also be applied to videos, where the objects appear across the various frames that compose them. The task of segmenting an image becomes more complex when it is composed of objects defined by textural features, for which color information alone is not a good descriptor. Fuzzy Segmentation is a region-growing segmentation algorithm that uses affinity functions to assign to each element of an image a grade of membership in each object (between 0 and 1). This work presents a modification of the Fuzzy Segmentation algorithm aimed at improving its time and space complexity. The algorithm was adapted to segment color videos, treating them as 3D volumes; to perform segmentation in videos, either a conventional color model or a hybrid model obtained by a method for choosing the best channels was used. The Fuzzy Segmentation algorithm was also applied to texture segmentation by using adaptive affinity functions defined for each object texture. Two types of affinity function were used, one defined using the normal (Gaussian) probability distribution and the other using the Skew Divergence, a variation of the Kullback-Leibler divergence that measures the difference between two probability distributions. Finally, the algorithm was tested on some videos and on texture mosaic images built from images of the Brodatz album
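The first of the two affinity functions mentioned can be sketched directly: a Gaussian affinity maps how close a pixel's value is to an object's typical value into a membership grade in [0, 1], which is what the region-growing step propagates. The per-object parameters (mean, sigma) below are invented for illustration.

```python
# Gaussian affinity: membership grade of a pixel value for an object
# modeled by a mean value and a spread sigma. Grade 1 at the mean,
# falling toward 0 as the value moves away.

import math

def gaussian_affinity(value, mean, sigma):
    """Membership grade in [0, 1] of value for the object N(mean, sigma)."""
    return math.exp(-((value - mean) ** 2) / (2 * sigma ** 2))

# Two hypothetical texture objects characterized by (mean, sigma).
objects = {"dark texture": (40, 15), "bright texture": (200, 20)}

pixel = 55
grades = {name: round(gaussian_affinity(pixel, m, s), 3)
          for name, (m, s) in objects.items()}
print(grades)  # clearly higher grade for the dark-texture object
```

The Skew Divergence variant replaces this value-distance with a smoothed KL divergence between the local intensity distribution and each object's distribution, which is better suited to textures than single pixel values.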