97 results for Internet (Computer Networks) - Business

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance: 100.00%

Abstract:

The last decade, characterized by the vertiginous growth of the worldwide computer network, brought radical changes to the use of information and communication. The use of the Internet in the business world has been widely studied; however, little research exists on the academic use of this technology, especially in institutions of technological education. In this context, this research analyzed the use of the Internet in a Brazilian institution of technological education, the Centro Federal de Educação Tecnológica do Rio Grande do Norte (CEFET/RN), characterizing the pattern of use of this Information Technology (IT) tool and, at the same time, studying the determinant factors of this use. To reach the proposed objectives, a survey was carried out, with data collected through a questionnaire applied to 150 teachers, who answered a set of closed and scaled questions. The quantitative data were analyzed qualitatively, yielding significant results regarding the pattern of use and the factors that influence the use of these Internet technologies: age range, level of exposure to the computer, area of academic graduation, area of knowledge in which the teacher works, and academic title all exert significant influence on the academic use of the Internet among the professors.

Relevance: 100.00%

Abstract:

In the last two decades of the past century, following the consolidation of the Internet as the worldwide computer network, applications generating more robust data flows started to appear. The increasing use of videoconferencing stimulated the creation of a new form of point-to-multipoint transmission called IP Multicast. Companies working on software and hardware development for network videoconferencing have adjusted their products and developed new solutions for the use of multicast. However, configuring such different solutions is not easy, especially when changes in the operating system are also required. Besides, the existing free tools have limited functions, and the current commercial solutions are heavily dependent on specific platforms. With the maturity of IP Multicast technology and its inclusion in all current operating systems, object-oriented programming languages developed classes able to handle multicast traffic. Thus, with the help of Java APIs for networking, databases and hypertext, it became possible to develop an Integrated Environment able to handle multicast traffic, which is the main objective of this work. This document describes the implementation of that environment, which provides many functions to use and manage multicast traffic, functions that previously existed only in a limited way and in a few, mostly commercial, tools. The environment is useful to different kinds of users: common users who want to join multimedia Internet sessions, as well as more advanced users, such as engineers and network administrators, who may need to monitor and handle multicast traffic.
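
As an illustration of the kind of multicast session handling such an environment automates, the sketch below joins an IPv4 multicast group and prints incoming datagrams using only the Python standard library; the group address and port are hypothetical examples, not values taken from the work.

```python
import socket
import struct

GROUP = "239.1.1.1"   # hypothetical multicast group address
PORT = 5004           # hypothetical UDP port for the session

# Open a UDP socket, allow address reuse and bind to the session port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Ask the kernel (via IGMP) to join the multicast group on the default interface.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

# Receive a few datagrams from the multicast session.
for _ in range(5):
    data, sender = sock.recvfrom(2048)
    print(f"{len(data)} bytes from {sender[0]}")
```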

Relevance: 100.00%

Abstract:

New versions of the SCTP protocol allow the implementation of handover procedures in the transport layer, as well as the supply of a partially reliable communication service. A communication architecture is proposed herein, integrating SCTP with the Session Initiation Protocol (SIP), besides additional protocols. This architecture is intended to handle voice applications over IP networks with mobility requirements. User localization procedures are also specified in the application layer, using SIP, as an alternative means to the mechanisms used by traditional protocols that support mobility in the network layer. The SDL formal specification language is used to specify the operation of a control module, which coordinates the operation of the system's component protocols. This formal specification is intended to prevent ambiguities and inconsistencies in the definition of this module, assisting in the correct implementation of the elements of this architecture.
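
As a minimal sketch of how a signalling endpoint could open an SCTP association from user space, the snippet below assumes a Linux host whose Python build exposes socket.IPPROTO_SCTP and whose kernel has SCTP support; the peer address and port are hypothetical, and the partially reliable extension (PR-SCTP) would require additional, platform-specific socket options that are not shown.

```python
import socket

HOST, PORT = "192.0.2.10", 5060   # hypothetical peer address and SIP-style port

# SCTP is exposed through the ordinary socket API on Linux (IPPROTO_SCTP).
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
try:
    # Connecting performs SCTP's four-way association handshake.
    sock.connect((HOST, PORT))
    # SIP messages can be carried over the association (RFC 4168).
    sock.sendall(b"OPTIONS sip:user@example.org SIP/2.0\r\n\r\n")
finally:
    sock.close()
```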

Relevance: 40.00%

Abstract:

In "The Paradoxical Happiness", Gilles Lipovetsky elects five major paradigmatic models that govern pleasure and happiness in our societies. Starting from the paradigmatic models of penia (which emphasizes the existential dissatisfaction fed by consumption and in which advertising has a special place, bombarding consumers, creating consumer needs and selling a lifestyle rather than the products themselves) and narcissus (a model built on self-exaltation and abdication of the social and political), this study intends to examine the relationship between the consumption exercised by young people and the advertising displayed on social networking sites, focusing on Facebook and observing the fan pages of the following brands: Coca-Cola, Pepsi, BlackBerry, Nokia, Riachuelo and C&A, and their relationships with their consumers.

Relevance: 40.00%

Abstract:

We present a system implemented on Linux® intended to protect networks containing Windows® workstations against malicious agents. The system, named LIV - Linux® Integrated Viruswall, combines features found in other solutions and adds new functionality. One of the implemented features is the ability to detect infected workstations based on network traffic analysis. Another is the use of a technique called trap share to identify malicious agents propagating through the local network. Once an infection focus is detected, LIV is able to isolate it from the network, containing the spread of the malicious agent. Results obtained by LIV in the protection of a corporate network demonstrate the effectiveness of network traffic analysis as an instrument for detecting malicious agents, especially when compared with traditional detection mechanisms based exclusively on digital signatures of malicious code.
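
A minimal sketch of the trap-share idea, under the assumption that no legitimate host is configured to use the bait resource: a decoy listener accepts connections on an SMB-like port and flags every source address as a suspected infected workstation. The port number and the reporting are illustrative, not the LIV implementation.

```python
import socket

BAIT_PORT = 445          # hypothetical decoy port mimicking a Windows file share
suspects = set()

# Any host that actively probes the bait share is treated as potentially infected,
# since legitimate workstations have no reason to contact it.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", BAIT_PORT))
listener.listen(16)

while True:
    conn, (src_ip, _) = listener.accept()
    conn.close()
    if src_ip not in suspects:
        suspects.add(src_ip)
        print(f"possible malicious agent on {src_ip}: candidate for isolation")
```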

Relevance: 40.00%

Abstract:

The current Internet has been suffering from several problems in terms of scalability, performance, mobility and so on, due to the vertiginous increase in the number of users and the emergence of new services with new demands, thus giving rise to the Future Internet. New proposals on content-oriented networks, such as the Entity Title Architecture (ETArch), provide new services for this kind of scenario, implemented over the software-defined networking paradigm. However, ETArch's transport model is equivalent to the best-effort model of the current Internet, which limits the reliability of its communications. In this work, ETArch is redesigned following the resource over-provisioning paradigm to achieve advanced resource allocation integrated with OpenFlow. As a result, the SMART framework (Suporte de Sessões Móveis com Alta Demanda de Recursos de Transporte, i.e. support for mobile sessions with high demand for transport resources) allows the network to semantically define the qualitative requirements of each session in order to manage Quality of Service control, aiming to maintain the best possible Quality of Experience. The evaluation of the data and control planes took place on the testbed island of the OFELIA project, showing support for mobile multimedia applications with high demand for transport resources, with QoS and QoE guaranteed through a restricted signalling scheme in comparison with legacy ETArch.
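
As a minimal sketch (not the SMART implementation) of the admission-control idea behind resource over-provisioning: each session declares its bandwidth requirement and the controller only admits it if every link on the chosen path still has capacity left to reserve. Link names and capacities are hypothetical.

```python
# Hypothetical link capacities in Mbit/s along a controller-chosen path.
link_capacity = {("s1", "s2"): 100.0, ("s2", "s3"): 100.0}
link_reserved = {link: 0.0 for link in link_capacity}

def admit_session(path, required_mbps):
    """Admit a session only if every link on its path can reserve the requested bandwidth."""
    links = list(zip(path, path[1:]))
    if any(link_reserved[l] + required_mbps > link_capacity[l] for l in links):
        return False                       # admitting it would violate the QoS guarantee
    for l in links:
        link_reserved[l] += required_mbps  # in a real controller this would become OpenFlow queue/meter rules
    return True

print(admit_session(["s1", "s2", "s3"], 40.0))   # True: resources reserved
print(admit_session(["s1", "s2", "s3"], 80.0))   # False: over-provisioned path is protected
```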

Relevance: 40.00%

Abstract:

Through numerous technological advances in recent years, along with the popularization of computing devices, society is moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things. This concept enables the sharing of data between computing machines and everyday objects. One of the areas encompassed by the Internet of Things is Vehicular Networks. However, the information generated individually by a vehicle is small in volume and does not contribute to improving traffic as long as it remains isolated. This proposal presents the Infostructure, a system intended to facilitate the effort and reduce the cost of developing context-aware, high-level semantic applications for Internet of Things scenarios, allowing data to be managed, stored and combined in order to generate broader context. To this end we present a reference architecture, which shows the major components of the Infostructure. A prototype is then presented and used to validate that the work reaches the desired high-level semantic contextualization, together with a performance evaluation of the subsystem responsible for managing contextual information over a large amount of data. A statistical analysis is then performed with the results obtained in the evaluation. Finally, the conclusions of the work are presented, along with open problems, such as the lack of assurance regarding the integrity of the sensory data reaching the Infostructure, and future work that includes the implementation of other modules so that tests can be conducted in real environments.
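
A small sketch of what "combining data to generate broader context" could look like for vehicular data: hypothetical per-vehicle speed reports are aggregated into a per-road-segment view with a simple congestion flag. Field names, segment identifiers and the threshold are illustrative only, not the Infostructure's data model.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical low-level context: individual speed reports (km/h) per road segment.
reports = [
    {"segment": "BR-101:km12", "vehicle": "A", "speed": 22.0},
    {"segment": "BR-101:km12", "vehicle": "B", "speed": 18.5},
    {"segment": "BR-101:km13", "vehicle": "C", "speed": 74.0},
]

def broader_context(reports, congestion_threshold=30.0):
    """Combine isolated vehicle readings into higher-level, per-segment context."""
    by_segment = defaultdict(list)
    for r in reports:
        by_segment[r["segment"]].append(r["speed"])
    return {
        seg: {"avg_speed": mean(speeds), "congested": mean(speeds) < congestion_threshold}
        for seg, speeds in by_segment.items()
    }

print(broader_context(reports))
```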

Relevance: 40.00%

Abstract:

The increasing demand for Internet data traffic in wireless broadband access networks requires both the development of efficient, novel wireless broadband access technologies and the allocation of new spectrum bands for that purpose. The introduction of a great number of small cells in cellular networks, allied to the complementary adoption of Wireless Local Area Network (WLAN) technologies in unlicensed spectrum, is one of the most promising concepts to meet this demand. One alternative is the aggregation of Industrial, Scientific and Medical (ISM) unlicensed spectrum to licensed bands, using wireless networks defined by the Institute of Electrical and Electronics Engineers (IEEE) and the Third Generation Partnership Project (3GPP). While IEEE 802.11 (Wi-Fi) networks are aggregated to Long Term Evolution (LTE) small cells via LTE/WLAN Aggregation (LWA), in proposals like Unlicensed LTE (LTE-U) and License Assisted Access (LAA) the LTE air interface itself is used for transmission on the unlicensed band. Wi-Fi technology is widespread and operates in the same 5 GHz ISM spectrum bands as the LTE proposals, which may degrade performance due to the coexistence of both technologies in the same spectrum bands. Besides, there is a need to improve Wi-Fi operation to support scenarios with a large number of neighboring Overlapping Basic Service Set (OBSS) networks, with a large number of Wi-Fi nodes (i.e., dense deployments). It has long been known that overall Wi-Fi performance falls sharply as the number of Wi-Fi nodes sharing the channel increases, therefore mechanisms to increase its spectral efficiency are needed. This work is dedicated to the study of coexistence between different wireless broadband access systems operating in the same unlicensed spectrum bands, and of how to solve the coexistence problems via distributed coordination mechanisms. The problem of coexistence between different networks (i.e., LTE and Wi-Fi) and the problem of coexistence between different networks of the same technology (i.e., multiple Wi-Fi OBSSs) are analyzed both qualitatively and quantitatively via system-level simulations, and the main issues to be faced are identified from these results. From that, distributed coordination mechanisms are proposed and evaluated via system-level simulations, both for the inter-technology and the intra-technology coexistence problem. Results indicate that the proposed solutions provide significant gains when compared to the situation without distributed coordination.
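
As a toy sketch of one possible distributed coordination rule for dense unlicensed deployments, the snippet below has each access point periodically re-select the channel that its overlapping neighbours use least. This is an illustrative local rule, not the mechanism proposed in the work, and the channel list is a hypothetical subset of the 5 GHz band.

```python
import random

CHANNELS = [36, 40, 44, 48]          # hypothetical 5 GHz channels

class AccessPoint:
    def __init__(self, name):
        self.name = name
        self.channel = random.choice(CHANNELS)

    def coordinate(self, neighbours):
        """Move to the channel least used by overlapping neighbours (purely local decision)."""
        load = {c: sum(1 for n in neighbours if n.channel == c) for c in CHANNELS}
        self.channel = min(CHANNELS, key=lambda c: load[c])

aps = [AccessPoint(f"ap{i}") for i in range(8)]
for _ in range(5):                    # a few coordination rounds
    for ap in aps:
        ap.coordinate([n for n in aps if n is not ap])
print({ap.name: ap.channel for ap in aps})
```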

Relevance: 30.00%

Abstract:

The Internet is a technology that has revolutionized the world, creating new forms of interaction among people, organizations and businesses. The hotel sector is a segment that has greatly benefited from services supported by the Internet. The objective of this study is to identify the different factors that influence Internet use along three dimensions: individual, organizational and environmental. A conceptual model was postulated containing nine independent variables and two dependent variables related to the pattern of Internet use. Data were collected from 52 hotels located along the coast of Recife, PE, Brazil. Inferential analysis of the data showed a differentiated pattern of Internet use in small, medium and large hotels, and how the factors described above can be better exploited in order to achieve an efficient pattern of use, improving their competitive positions. Based on the analysis and results obtained from the study, some recommendations and implications for future research are outlined.

Relevance: 30.00%

Abstract:

Forecasting is the basis for making strategic, tactical and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. Thus, there are several methods to assist in the task of time series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies on more advanced prediction methods. Among these, Artificial Neural Networks (ANN) are a relatively new and promising method for business forecasting, a technique that has attracted much interest in the financial environment and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to statistical ARIMA-GARCH models. In this context, this study examined whether ANNs are a more appropriate method for predicting the behavior of capital market indices than the traditional methods of time series analysis. For this purpose, a quantitative study was developed from financial economic indices, and two supervised-learning feedforward ANN models were built, whose structures consisted of 20 inputs, 90 neurons in a single hidden layer and one output (the Ibovespa). These models used backpropagation, a sigmoid tangent activation function in the hidden layer and a linear output function. Since the aim was to analyze the suitability of Artificial Neural Networks for forecasting the Ibovespa, this analysis was performed by comparing their results with those of a GARCH(1,1) time series predictive model. Once both methods (ANN and GARCH) were applied, the results were analyzed by comparing the forecasts with the historical data and by studying the forecast errors through MSE, RMSE, MAE, standard deviation, Theil's U and forecast encompassing tests. It was found that the models developed by means of ANNs had lower MSE, RMSE and MAE than the GARCH(1,1) model, and the Theil's U test indicated that the three models have smaller errors than a naïve forecast. Although the ANN based on returns has lower precision indicator values than the ANN based on prices, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide a more appropriate Ibovespa forecast than the traditional time series models, represented by the GARCH model.
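
As a sketch of a feedforward network with the topology described in the abstract (20 inputs, 90 hidden neurons with a sigmoid-tangent activation, one linear output), the snippet below uses scikit-learn's MLPRegressor on synthetic data; the training configuration and data of the original models are not reproduced, and the synthetic series is purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for 20 lagged index values per sample and the next value to predict.
X = rng.normal(size=(500, 20))
y = X @ rng.normal(size=20) + 0.1 * rng.normal(size=500)

# 20 inputs -> 90 tanh hidden neurons -> 1 linear output, trained by backpropagation.
model = MLPRegressor(hidden_layer_sizes=(90,), activation="tanh",
                     solver="adam", max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])

pred = model.predict(X[400:])
mse = np.mean((pred - y[400:]) ** 2)
print(f"out-of-sample MSE: {mse:.4f}, RMSE: {np.sqrt(mse):.4f}")
```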

Relevance: 30.00%

Abstract:

The present study investigates the Internet as technological interaction in the school environment and as a resource in the teaching-learning process. It aims to discuss the lack of synchronicity between proposals of educational access for Internet use and the types of access and interaction practiced by youngsters. For the development of this research, I resorted to a qualitative, descriptive and explanatory study focused on a group whose subjects are youngsters from eleven to fifteen years of age in a Catholic school that belongs to a group of private schools in the city of Natal, Rio Grande do Norte state. As a methodological option, it relies on focus groups and participant observation, discourse analysis and ethnography, considering facts and data of the pedagogical practice concerning the theme in focus, besides the attempt to know the youngsters' everyday life at school and their relationship with juvenile cultures. It recognizes two moments of the focus group: the first related to Internet use as technological interaction; the second concerning the way the Internet is problematized as technological interaction in classroom learning. In the contact with the youngsters, the study discusses the concepts of Media Environments, Culture, Identity, Network, Consumption and Citizenship. It recognizes that it is relevant for the school to consider the Internet a pedagogical tool, directed not just at research, but mostly as a learning environment and a means of collaborative knowledge construction. It points out the need for approximation between school and the media environment, re-evaluating the pedagogical practice and offering a new evaluation proposal (self-evaluation). It suggests a renewal of the teacher's pedagogical practice in the classroom and in the use of the Internet, valuing the connection between technological interaction and communication as motivating elements of students' learning construction and their effective participation in decisions involving citizenship. It gives priority to educational work directed at the establishment of a dialogic relationship between codes, learning and contents, leading to the mastery of new findings in the media environment and enabling the development of abilities and performances directed at the recognition and consumption of information based on a critical reading of the media.

Relevance: 30.00%

Abstract:

This study discusses the use of information technologies for knowledge management in franchise networks in Rio Grande do Norte, Brazil, whose management and operation are complex activities, characterized by the geographic spread of their network units, creating barriers to communication and information sharing between franchisors, franchisees and final customers. In view of this, the following hypotheses were formulated: knowledge management can be a positive alternative for improving communication between units; and information technology can eliminate many problems related mainly to capturing and sharing knowledge. In general, the study aims to investigate, in qualitative and quantitative terms, how information technology can support knowledge management in franchise networks. Specifically, it aims to record the existence of managerial practices related to knowledge management in enterprises of the franchising sector; to verify whether they have technological resources with the potential to facilitate the sharing of information; to identify which information and communication technologies are used in the organizational environment; and to suggest measures to facilitate the process of organizational learning, using information and communication technology as a tool. It concludes that knowledge management is a positive alternative, especially in strengthening the bonds of communication and knowledge sharing among the franchises. In this regard, information technology must provide all the corporation's services in order to facilitate communication between franchisor and franchisees, through a single, integrated system. However, the networks studied still prove unprepared for more sophisticated technology platforms.

Relevance: 30.00%

Abstract:

The progress of the Internet and telecommunications has been changing the concepts of Information Technology (IT), especially with regard to outsourced services, through which organizations seek cost reduction and a better focus on their core business. Along with the development of such outsourcing, a new model named Cloud Computing (CC) evolved, proposing to migrate both data processing and information storage to the Internet. Among the key points of Cloud Computing are cost reduction, benefits, risks and changes in IT paradigms. Nonetheless, the adoption of this model brings some difficulties to decision-making by IT managers, mainly with regard to which solutions may go to the cloud and which service providers are more appropriate to the organization's reality. The overall aim of this research is to apply the AHP (Analytic Hierarchy Process) method to decision-making in Cloud Computing. To that end, an exploratory methodology was used, with a case study applied to a nationwide organization (the Federation of Industries of RN). Data collection was performed through two structured questionnaires answered electronically by IT technicians and by the company's Board of Directors. The analysis of the data was carried out in a qualitative and comparative way, using the AHP software Web-HIPRE. The results obtained confirmed the importance of applying the AHP method to decision-making on the adoption of Cloud Computing, mainly because, at the time the research was carried out, the studied company already showed interest in, and need for, adopting CC, considering the internal problems with infrastructure and availability of information that the company faces nowadays. The organization sought to adopt CC but had doubts regarding the cloud model and which service provider would better meet its real needs. The application of the AHP thus worked as a guiding tool for choosing the best alternative, which points to the Hybrid Cloud as the ideal choice to start off in Cloud Computing, considering the following aspects: the Infrastructure as a Service (IaaS) layer (processing and storage) should stay partly in the Public Cloud and partly in the Private Cloud; the Platform as a Service (PaaS) layer (software development and testing) showed preference for the Private Cloud; and the Software as a Service (SaaS) layer was divided, with e-mail going to the Public Cloud and applications to the Private Cloud. The research also identified the important factors for hiring a Cloud Computing provider.
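
To make the AHP step concrete, the sketch below computes priority weights and a consistency ratio from a hypothetical 3x3 pairwise comparison matrix of criteria (for example cost, security and availability); the matrix values are illustrative and are not the judgments collected in the study.

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty's 1-9 scale) for three criteria.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priorities = principal eigenvector of the comparison matrix, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.10 is usually considered acceptable); RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```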

Relevance: 30.00%

Abstract:

Post-dispatch analysis of signals obtained from digital disturbance recorders provides important information to identify and classify disturbances in power systems, aiming at a more efficient management of the supply. In order to enhance the task of identifying and classifying disturbances, providing an automatic assessment, digital signal processing techniques can be helpful. The Wavelet Transform has become a very efficient tool for the analysis of voltage or current signals obtained immediately after the occurrence of disturbances in the network. This work presents a methodology based on the Discrete Wavelet Transform to implement this process. It compares the energy distribution curves of signals with and without disturbance, for different resolution levels of their decomposition, in order to obtain descriptors that permit their classification using artificial neural networks.
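
A minimal sketch of the energy-per-decomposition-level descriptor, assuming the third-party PyWavelets package is available: a synthetic signal with a step disturbance is decomposed with a discrete wavelet and the energy of each level is computed. The wavelet family, number of levels and test signal are illustrative choices, not those of the work.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

# Synthetic voltage-like signal with an abrupt disturbance halfway through.
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 60 * t)
signal[512:] += 0.5                      # step change standing in for a disturbance

# Multilevel discrete wavelet decomposition (approximation + detail coefficients).
coeffs = pywt.wavedec(signal, "db4", level=5)

# Energy of each level's coefficients; comparing this distribution against the
# disturbance-free case yields the descriptors fed to the neural classifier.
energies = [float(np.sum(c ** 2)) for c in coeffs]
total = sum(energies)
for i, e in enumerate(energies):
    label = "A5" if i == 0 else f"D{6 - i}"
    print(f"{label}: {100 * e / total:.1f}% of total energy")
```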

Relevance: 30.00%

Abstract:

The usual programs for load flow calculation were, in general, developed to simulate electric energy transmission, subtransmission and distribution systems. However, the mathematical methods and algorithms used in those formulations were based mostly on the characteristics of transmission systems, which were the main concern of engineers and researchers. The physical characteristics of these systems are quite different from those of distribution systems. In transmission systems, the voltage levels are high and the lines are generally very long. Because of this, the capacitive and inductive effects that appear in the system have a considerable influence on the values of the quantities of interest, which is why they should be taken into consideration. Also in transmission systems, the loads have a macro nature, such as cities, neighborhoods or big industries. These loads are generally almost balanced, which reduces the need for a three-phase methodology in load flow calculation. Distribution systems, on the other hand, present different characteristics: the voltage levels are low in comparison with transmission ones, which practically annuls the capacitive effects of the lines. The loads are, in this case, transformers to whose secondaries small consumers are connected, in many cases single-phase ones, so the probability of finding an unbalanced circuit is high. Thus, the use of three-phase methodologies assumes an important dimension. Besides, equipment such as voltage regulators, which simultaneously use the concepts of phase and line voltage in their operation, need a three-phase methodology in order to allow the simulation of their real behavior. For these reasons, a method for three-phase load flow calculation was initially developed in the scope of this work, in order to simulate the steady-state behavior of distribution systems. To achieve this goal, the Power Summation Algorithm was used as the basis for developing the three-phase method. This algorithm had already been widely tested and approved by researchers and engineers in the simulation of radial electric energy distribution systems, mainly in single-phase representation. In our formulation, lines are modeled as three-phase circuits, considering the magnetic coupling between the phases; the earth effect is considered through the Carson reduction. It is important to point out that, although loads are normally connected to the transformers' secondaries, the hypothesis of star- or delta-connected loads on the primary circuit was also considered. To simulate voltage regulators, a new model was used, allowing the simulation of various configurations according to their real operation. Finally, the representation of switches with current measurement at various points of the feeder was considered. The loads are adjusted during the iterative process in order to match the current in each switch, converging to the measured value specified in the input data. In a second stage of the work, sensitivity parameters were derived from the described load flow, with the objective of supporting further optimization processes. These parameters are found by calculating the partial derivatives of one variable with respect to another, in general voltages, losses and reactive powers.
After describing the calculation of the sensitivity parameters, the Gradient Method is presented, using these parameters to optimize an objective function defined for each type of study. The first refers to the reduction of technical losses in a medium voltage feeder through the installation of capacitor banks; the second refers to the problem of voltage profile correction through the installation of capacitor banks or voltage regulators. For loss reduction, the objective function is the sum of the losses in all parts of the system. For voltage profile correction, the objective function is the sum of the squared voltage deviations at each node with respect to the rated voltage. At the end of the work, results of the application of the described methods to some feeders are presented, in order to give insight into their performance and accuracy.
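
As a single-phase, illustrative sketch of the backward/forward sweep idea behind the Power Summation Algorithm for a small radial feeder: the backward sweep accumulates downstream load power on each branch and the forward sweep updates node voltages from the substation outwards. The three-phase coupling, branch losses, regulators and switches described in the work are omitted, and the feeder data are hypothetical.

```python
# Radial feeder: node 0 is the substation; each entry is (parent, R_ohm, X_ohm, P_kW, Q_kvar).
feeder = {1: (0, 0.30, 0.20, 100.0, 50.0),
          2: (1, 0.25, 0.15, 150.0, 60.0),
          3: (1, 0.40, 0.25, 80.0, 30.0)}
V_SUB = 13800.0 / 3 ** 0.5                     # substation phase voltage (V)
V = {n: complex(V_SUB, 0.0) for n in [0, *feeder]}

for _ in range(20):                            # repeat sweeps until voltages settle
    # Backward sweep: accumulate the downstream load power flowing into each branch
    # (branch losses are neglected in this simplified version).
    S = {n: complex(p, q) * 1e3 for n, (_, _, _, p, q) in feeder.items()}
    for n in sorted(feeder, reverse=True):
        parent = feeder[n][0]
        if parent in S:
            S[parent] += S[n]
    # Forward sweep: update each node voltage from its parent using the branch current.
    for n in sorted(feeder):
        parent, r, x, _, _ = feeder[n]
        current = (S[n] / V[n]).conjugate()    # I = (S / V)* for the power entering the branch
        V[n] = V[parent] - complex(r, x) * current

print({n: round(abs(v), 1) for n, v in V.items()})
```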