858 results for Electrical engineering | Computer science
Abstract:
Semi-supervised learning is one of the important topics in machine learning, concerned with pattern classification where only a small subset of the data is labeled. In this paper, a new network-based (or graph-based) semi-supervised classification model is proposed. It employs a combined random-greedy walk of particles, with competition and cooperation mechanisms, to propagate class labels to the whole network. Due to the competition mechanism, the proposed model spreads labels in a local fashion, i.e., each particle only visits the portion of nodes potentially belonging to it, and it is not allowed to visit nodes definitely occupied by particles of other classes. In this way, a "divide-and-conquer" effect is naturally embedded in the model. As a result, the proposed model achieves a good classification rate while exhibiting a low computational complexity order in comparison to other network-based semi-supervised algorithms. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance.
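To make the random-greedy walk with competition concrete, here is a deliberately simplified, hypothetical Python sketch; the toy graph, the constants, and the domination-level update rule are illustrative assumptions, not the authors' model.

```python
# Hypothetical, highly simplified sketch of particle competition label
# spreading on a graph; not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy graph as an adjacency list; two labeled seeds (node 0 -> class 0,
# node 5 -> class 1). All values here are made up for illustration.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
seeds = {0: 0, 5: 1}
n_nodes, n_classes = len(adj), 2

# Domination levels: rows = nodes, cols = classes.
dom = np.full((n_nodes, n_classes), 1.0 / n_classes)
for node, cls in seeds.items():
    dom[node] = np.eye(n_classes)[cls]

particles = [{"pos": node, "cls": cls} for node, cls in seeds.items()]

for step in range(1000):
    for p in particles:
        neighbors = adj[p["pos"]]
        if rng.random() < 0.6:  # greedy move: prefer nodes this class dominates
            weights = dom[neighbors, p["cls"]] + 1e-9 * rng.random(len(neighbors))
            nxt = neighbors[int(np.argmax(weights))]
        else:                   # random move: explore uniformly
            nxt = neighbors[rng.integers(len(neighbors))]
        if nxt not in seeds:    # seeds keep their labels fixed
            dom[nxt, p["cls"]] += 0.1    # reinforce the particle's own class
            dom[nxt] /= dom[nxt].sum()   # renormalize: competition suppresses others
        p["pos"] = nxt

print(dom.argmax(axis=1))  # predicted class per node
```

Note how the competition term confines each particle to its own neighborhood, which is the source of the "divide-and-conquer" effect described above.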
Abstract:
This paper presents a theoretical model for estimating the power, the optical signal to noise ratio, and the number of generated carriers in a comb generator, taking as reference the minimum optical signal to noise ratio at the receiver input for a given fiber link. Based on the recirculating frequency shifting technique, the generator relies on coherent and orthogonal multi-carriers (Coherent-WDM) and uses a single laser source (seed) to feed high-capacity (above 100 Gb/s) systems. The theoretical model has been validated by an experimental demonstration, in which 23 comb lines with an optical signal to noise ratio ranging from 25 to 33 dB, in a spectral window of approximately 3.5 nm, are obtained.
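For orientation only (this is the standard amplified-link OSNR budget, not the specific model derived in the paper), the OSNR after N identical amplified spans is commonly estimated, in a 0.1 nm reference bandwidth at 1550 nm, as

$$ \mathrm{OSNR_{dB}} \approx 58 + P_{ch} - L_{span} - NF - 10\log_{10} N, $$

where P_ch is the per-carrier launch power in dBm, L_span the span loss in dB, and NF the amplifier noise figure in dB; the minimum acceptable receiver-input OSNR then constrains how much power each comb line must carry.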
Abstract:
This paper discusses some aspects of Wireless Sensor Networks over the IEEE 802.15.4 standard and proposes, for the first time, a mesh network topology with geographic routing integrated into the open Freescale protocol (SMAC - Simple Medium Access Control), in the form of an SMAC routing protocol. Before this work, the SMAC protocol supported one-hop communication only; with the mechanisms developed here, multi-hop communication becomes possible. Performance results from the implemented protocol are presented and analyzed in order to address important requirements for wireless sensor networks, such as robustness, the self-healing property, and low latency.
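As an illustration of the greedy step at the heart of geographic routing, here is a generic Python sketch; the names and structures are hypothetical and are not taken from the SMAC implementation.

```python
# Generic greedy geographic forwarding: relay to the neighbor that is
# geographically closest to the destination.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(current, neighbors, dest):
    """Return the neighbor closest to dest, or None if no neighbor improves
    on the current node (a local minimum requiring a recovery strategy)."""
    best = min(neighbors, key=lambda n: distance(n["pos"], dest), default=None)
    if best is None or distance(best["pos"], dest) >= distance(current["pos"], dest):
        return None
    return best

node = {"id": 1, "pos": (0.0, 0.0)}
neighbors = [{"id": 2, "pos": (1.0, 0.5)}, {"id": 3, "pos": (0.5, 2.0)}]
print(next_hop(node, neighbors, dest=(3.0, 1.0)))  # forwards toward node 2
```

In a real 802.15.4 deployment the position table would be maintained from beacon messages, and the None case would trigger the self-healing behavior mentioned above.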
Abstract:
Metadata is data that describes other data and the domains they represent, allowing users to decide on the best possible use of those data. It makes it possible to learn of the existence of data sets relevant to specific needs. The purpose of metadata is to document and organize organizational data in a structured way, minimizing the duplication of effort required to locate the data and facilitating their maintenance. It also supports the administration of large amounts of data and provides discovery, retrieval, and editing features. The global use of metadata is regulated by technical groups or task forces composed of several segments, such as industries, universities, and research firms. Agriculture in particular is a good domain for typical metadata applications: the integration of systems and equipment enables the techniques used in precision agriculture, and the integration of different computer systems, via web services or other solutions, requires the integration of structured data. The purpose of this paper is to present an overview of consolidated metadata standards in areas such as agriculture.
Abstract:
There are several variants of the widely used Fuzzy C-Means (FCM) algorithm that support clustering data distributed across different sites. These methods have been studied under different names, such as collaborative and parallel fuzzy clustering. In this study, we offer some augmentations of two FCM-based clustering algorithms used to cluster distributed data, arriving at constructive ways of determining essential parameters of the algorithms (including the number of clusters) and forming a set of systematically structured guidelines, such as the selection of a specific algorithm depending on the nature of the data environment and the assumptions made about the number of clusters. A thorough complexity analysis, covering space, time, and communication aspects, is reported. A series of detailed numerical experiments is used to illustrate the main ideas discussed in the study.
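For readers unfamiliar with the underlying machinery, below is a minimal Python sketch of the standard, centralized FCM updates that the distributed variants build on; it is a toy implementation, not the collaborative or parallel algorithms studied in the paper.

```python
# Standard Fuzzy C-Means: alternate prototype and membership updates.
import numpy as np

def fcm(X, c, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                            # memberships: columns sum to 1
    for _ in range(n_iter):
        W = U ** m
        V = (W @ X) / W.sum(axis=1, keepdims=True)           # prototype update
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        U = 1.0 / (D ** (2.0 / (m - 1.0)))                   # membership update
        U /= U.sum(axis=0)
    return U, V

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
U, V = fcm(X, c=2)
print(V)  # two prototypes, near (0, 0) and (5, 5)
```

The distributed variants exchange summaries of these quantities (e.g., prototypes) between sites instead of raw data, which is where the communication cost analyzed in the paper arises.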
Abstract:
Access control is a key component of security in any computer system. In the last two decades, research on Role-Based Access Control models has been intense. One of the most important components of a role-based model is the role-permission relationship. In this paper, the technique of systematic mapping is used to identify, extract, and analyze the many approaches applied to establish the role-permission relationship. The main goal of this mapping is to point out directions for significant research in the area of Role-Based Access Control models.
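At its core, the role-permission relationship is a many-to-many mapping; the following hypothetical Python sketch (not drawn from any surveyed model) illustrates the basic check.

```python
# RBAC core: users -> roles -> permissions, both mappings many-to-many.
roles = {
    "nurse":  {"read:chart", "update:vitals"},
    "doctor": {"read:chart", "update:vitals", "write:prescription"},
}
user_roles = {"alice": {"doctor"}, "bob": {"nurse"}}

def has_permission(user, permission):
    # A user holds a permission iff at least one assigned role grants it.
    return any(permission in roles[r] for r in user_roles.get(user, ()))

print(has_permission("alice", "write:prescription"))  # True
print(has_permission("bob", "write:prescription"))    # False
```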
Abstract:
This paper presents an optimum user-steered boundary tracking approach for image segmentation, which simulates the behavior of water flowing through a riverbed. The riverbed approach was devised using the image foresting transform with a never-exploited connectivity function. We analyze its properties in the derived image graphs and discuss its theoretical relation with other popular methods such as live wire and graph cuts. Several experiments show that riverbed can significantly reduce the number of user interactions (anchor points), as compared to live wire for objects with complex shapes. This paper also includes a discussion about how to combine different methods in order to take advantage of their complementary strengths.
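For context, the image foresting transform underlying riverbed computes optimum paths from seed pixels with a Dijkstra-like sweep. The Python sketch below uses the classic f_max connectivity function on a 4-neighbor grid purely for illustration; the paper's riverbed connectivity function is different.

```python
# Image foresting transform core: propagate a path-cost function from seeds.
import heapq
import numpy as np

def ift(weights, seeds):
    """weights: 2D array of arc costs; seeds: list of (row, col) anchors."""
    cost = np.full(weights.shape, np.inf)
    heap = []
    for s in seeds:
        cost[s] = 0.0
        heapq.heappush(heap, (0.0, s))
    while heap:
        c, (i, j) = heapq.heappop(heap)
        if c > cost[i, j]:
            continue  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < weights.shape[0] and 0 <= nj < weights.shape[1]:
                nc = max(c, weights[ni, nj])   # f_max connectivity function
                if nc < cost[ni, nj]:
                    cost[ni, nj] = nc
                    heapq.heappush(heap, (nc, (ni, nj)))
    return cost

grad = np.array([[1, 5, 1], [1, 9, 1], [1, 5, 1]], dtype=float)
print(ift(grad, seeds=[(0, 0)]))  # optimum-path costs from the anchor point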
Abstract:
Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrödinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrödinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated by taking previously published results as a benchmark and is then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward calculation method for the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing running time.
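One building block of such a solver is shown here in the simplest possible form: a uniform-grid finite-difference eigensolve of the 1D Schrödinger equation in Python. The paper itself uses a non-uniform grid, the split-operator method, and a 2D FTCS Poisson relaxation; this is only an illustrative analogue.

```python
# 1D Schrodinger eigensolve on a uniform grid with hard-wall boundaries.
import numpy as np

hbar2_2m = 0.0380998  # hbar^2 / (2 m_e) in eV*nm^2 (free-electron mass)

def schrodinger_1d(V, dx):
    """Eigenstates of -hbar^2/(2m) d2/dx2 + V(x)."""
    n = len(V)
    lap = (np.diag(-2.0 * np.ones(n)) +
           np.diag(np.ones(n - 1), 1) +
           np.diag(np.ones(n - 1), -1)) / dx**2
    H = -hbar2_2m * lap + np.diag(V)
    E, psi = np.linalg.eigh(H)    # energies (eV) and wavefunction columns
    return E, psi

# 10 nm infinite well: ground state ~ 0.0038 eV for the free-electron mass.
x = np.linspace(0, 10, 200)
E, psi = schrodinger_1d(np.zeros_like(x), dx=x[1] - x[0])
print(E[:3])
```

In a full self-consistent loop, the resulting |psi|^2 would feed the carrier density into the Poisson solve, whose potential is then fed back here until convergence.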
Abstract:
Telecommunications have been in constant evolution during the past decades. Among the technological innovations, the use of digital technologies is particularly relevant. Digital communication systems have proven their efficiency and brought a new element into the chain of signal transmission and reception: the digital processor. This device gives new radio equipment the flexibility of a programmable system. Nowadays, the behavior of a communication system can be modified by simply changing its software. This gave rise to a new radio model called Software Defined Radio (SDR). In this model, the task of defining radio behavior moves to software, leaving to hardware only the implementation of the RF front-end. Thus, the radio is no longer static, defined by its circuits; it becomes a dynamic element whose operating characteristics, such as bandwidth, modulation, and coding rate, can be changed, even at runtime, according to the software configuration. This article presents the use of the GNU Radio software, an open-source solution for SDR applications, as a tool for developing configurable digital radios.
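To give a flavor of the workflow, here is a minimal flowgraph written against the GNU Radio 3.x Python API; the blocks used exist in the library, while the specific signal chain and parameters are illustrative.

```python
# Minimal GNU Radio flowgraph: a complex tone source feeding a file sink.
from gnuradio import gr, analog, blocks

class ToneToFile(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self, "tone_to_file")
        samp_rate = 32000
        # 1 kHz cosine, complex baseband samples
        src = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, 1000, 1.0)
        head = blocks.head(gr.sizeof_gr_complex, samp_rate)  # stop after 1 s
        sink = blocks.file_sink(gr.sizeof_gr_complex, "tone.iq")
        self.connect(src, head, sink)

if __name__ == "__main__":
    tb = ToneToFile()
    tb.run()
```

Changing the radio's behavior (waveform, rate, modulation chain) means editing this Python description rather than touching hardware, which is precisely the SDR flexibility discussed above.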
Abstract:
This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill, which manufactures paper for cardboard out of produced pulp. During the cooking of wood chips in the digester, two by-products are produced: the pulp itself (virgin fibers) and a waste stream known as black liquor. The former is mixed with recycled fibers and processed in a paper machine; here, due to significant sequence-dependent setups in paper-type changeovers, the sizing and sequencing of lots have to be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers, and counter-pressure turbines. The planning challenge is to synchronize the material flow as it moves through the pulp mill, the paper mill, and the energy plant, meeting as much customer demand as possible (backlogging is allowed) while minimizing operating costs. Due to the capital-intensive nature of P&P production, the output of the digester must be maximized. As the production bottleneck is not fixed, we propose a new model that, for the first time, integrates the critical production units associated with the pulp and paper mills and the energy plant. Simple stochastic mixed integer programming based local search heuristics are developed to obtain good feasible solutions for the problem. The benefits of integrating the three stages are discussed, and the proposed approaches are tested on real-world data. Our work may help P&P companies increase their competitiveness and responsiveness in dealing with demand-pattern oscillations.
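To indicate the kind of formulation such heuristics operate on, the sketch below states a bare-bones capacitated lot-sizing core in Python using the PuLP library; it has no sequencing and no energy plant, and all data are made up.

```python
# Capacitated lot-sizing: choose production x, inventory s, and setups y.
import pulp

T, demand, cap, setup_cost, hold_cost = 3, [40, 60, 30], 80, 100, 1

m = pulp.LpProblem("clsp", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{t}", lowBound=0) for t in range(T)]    # production
s = [pulp.LpVariable(f"s{t}", lowBound=0) for t in range(T)]    # inventory
y = [pulp.LpVariable(f"y{t}", cat="Binary") for t in range(T)]  # setup

m += pulp.lpSum(setup_cost * y[t] + hold_cost * s[t] for t in range(T))
for t in range(T):
    prev = s[t - 1] if t > 0 else 0
    m += prev + x[t] == demand[t] + s[t]   # inventory flow balance
    m += x[t] <= cap * y[t]                # produce only when set up
m.solve(pulp.PULP_CBC_CMD(msg=False))
print([v.value() for v in x])
```

The integrated model in the paper couples several such stages (digester, paper machine with sequence-dependent setups, energy plant), and the local search explores the binary setup decisions.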
Abstract:
Increasing the loading capacity of transmission lines in the traditional way, by replacing or reinforcing structures and foundations along routes that cross areas of permanent environmental preservation, may require additional works that alter the environment. Current, rigorous environmental legislation makes such changes and substitutions unfeasible. One way to increase the capacity of these lines is to use new conductor technologies. The aim of this paper is to discuss the requirements for upgrading a transmission line while minimizing or eliminating damage to the environment through the use of special conductors. Because aluminum conductor composite reinforced technology is relatively new, and considering the lack of information on its effective performance in practical systems, the behavior of these conductors needs to be verified through monitoring procedures.
Abstract:
The attributes describing a data set may often be arranged in meaningful subsets, each of which corresponds to a different aspect of the data. An unsupervised algorithm (SCAD) that simultaneously performs fuzzy clustering and aspects weighting was proposed in the literature. However, SCAD may fail and halt given certain conditions. To fix this problem, its steps are modified and then reordered to reduce the number of parameters required to be set by the user. In this paper we prove that each step of the resulting algorithm, named ASCAD, globally minimizes its cost-function with respect to the argument being optimized. The asymptotic analysis of ASCAD leads to a time complexity which is the same as that of fuzzy c-means. A hard version of the algorithm and a novel validity criterion that considers aspect weights in order to estimate the number of clusters are also described. The proposed method is assessed over several artificial and real data sets.
Abstract:
The ALRED construction is a lightweight strategy for constructing message authentication algorithms from an underlying iterated block cipher. Even though the construction's original analyses show that it is secure against some attacks, the absence of formal security proofs in a strong security model still brings uncertainty about its robustness. In this paper, aiming to give a better understanding of the security level provided by different authentication algorithms based on this design strategy, we formally analyze two ALRED variants, the MARVIN message authentication code and the LETTERSOUP authenticated-encryption scheme, bounding their security as a function of the attacker's resources and of the underlying cipher's characteristics.
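For orientation only (this is the generic shape of birthday-type bounds for block-cipher-based MACs, not the specific bounds proved in the paper), such results typically take the form

$$ \mathbf{Adv}^{\mathrm{mac}}_{F}(q,\ell) \le \mathbf{Adv}^{\mathrm{prp}}_{E}(q\ell) + O\!\left(\frac{q^2\ell^2}{2^n}\right), $$

where q is the number of attacker queries, \ell the maximal message length in blocks, and n the block size of the underlying cipher E.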
Abstract:
There is a wide range of telecommunications services that transmit voice, video, and data through complex transmission networks, and in some cases the service does not reach an acceptable quality level for the end user. In this sense, methods for assessing video and voice quality play a very important role. This paper presents a classification scheme, based on different criteria, for the methods and metrics that have been studied in recent years. It then shows how video quality is affected by degradation in the transmission channel for two kinds of services: Digital TV (ISDB-TB), subject to fading in the air interface, and a video streaming service on an IP network, subject to packet loss. For the digital TV tests, a scenario was set up in which the digital TV transmitter is connected to an RF channel emulator, different fading models are inserted, and the resulting videos are saved on a mobile device. The streaming-video tests were performed on an isolated IP network in which several network conditions were scheduled, resulting in different received video qualities. Video quality is assessed using objective methods: PSNR, SSIM, and VQM. The results show how losses in the transmission channel affect the quality of the end-user experience in both services studied.
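As a concrete example of the simplest of the three cited metrics, the Python sketch below computes PSNR between a reference frame and a degraded frame; it is a generic implementation of the textbook definition, not the paper's measurement setup.

```python
# PSNR between two 8-bit frames: 10 * log10(peak^2 / MSE).
import numpy as np

def psnr(reference, degraded, peak=255.0):
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical frames
    return 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (480, 640)).astype(np.uint8)
deg = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(f"{psnr(ref, deg):.1f} dB")    # stronger channel impairments -> lower PSNR
```

SSIM and VQM are perceptually weighted rather than purely pixel-wise, which is why the paper compares all three against the channel conditions.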
Abstract:
Ubiquitous computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a multitude of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. To overcome such obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation, the implementation of static Web interfaces, with dynamic adaptation, the alteration of static interfaces at execution time so as to adapt them to different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies. Along these lines, we designed and implemented UbiCon, a framework over which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces, and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.