73 results for COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS
Abstract:
Metadata are data that fully describe a dataset and the domain it represents, allowing users to decide on its use in the best possible way. They report the existence of data sets linked to specific needs. Metadata are used to document and organize an organization's structured data, in order to minimize duplicated effort in locating them and to facilitate their maintenance. They also support the administration of large amounts of data, providing discovery, retrieval and editing features. The global use of metadata is regulated by technical groups or task forces composed of several segments, such as industry, universities and research firms. Agriculture is a good example: a typical application of metadata is the integration of systems and equipment, enabling the techniques used in precision agriculture, since the integration of different computer systems via web services or other solutions requires the integration of structured data. The purpose of this paper is to present an overview of the consolidated metadata standards for the agricultural domain.
Abstract:
Synchronous telecommunication networks, distributed control systems and integrated circuits depend, for their operational accuracy, on a reliable time-base signal extracted from the line data stream and available to each node. In this sense, the existence of a sub-network (inside the main network) dedicated to the distribution of clock signals is crucially important. There are different solutions for the architecture of the time distribution sub-network, and choosing one of them depends on cost, precision, reliability and operational security. In this work we present: (i) the possible time distribution networks and their usual topologies and arrangements; (ii) how parameters of the network nodes can affect the reachability and stability of the synchronous state of a network; (iii) optimization methods for synchronous networks which can provide low-cost architectures with operational precision, reliability and security. (C) 2011 Elsevier B.V. All rights reserved.
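As a toy illustration of how node parameters affect whether the synchronous state is reached (not a model taken from the paper), the following Python sketch iterates a discrete-time phase model of mutually coupled clocks on a ring and reports the residual phase spread for different coupling gains; the ring topology, gains and frequency offsets are assumptions made only for this example.

```python
import numpy as np

# Illustrative sketch (not from the paper): a discrete-time phase model of a
# mutually coupled clock-distribution network. Node parameters (free-running
# frequency offsets, coupling gain) determine how well a synchronous state is
# reached. The ring topology and all numerical values are arbitrary assumptions.

def simulate(n_nodes=8, gain=0.3, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, n_nodes)      # initial node phases
    freq_offset = rng.normal(0, 1e-3, n_nodes)      # free-running offsets (rad/step)
    for _ in range(steps):
        # each node compares its phase with its two ring neighbours
        left = np.roll(phase, 1)
        right = np.roll(phase, -1)
        error = 0.5 * (np.sin(left - phase) + np.sin(right - phase))
        phase = phase + freq_offset + gain * error
    # spread of wrapped phase differences: close to 0 means the network is synchronized
    return np.ptp(np.angle(np.exp(1j * (phase - phase[0]))))

for g in (0.01, 0.1, 0.5):
    print(f"coupling gain {g}: residual phase spread = {simulate(gain=g):.4f} rad")
```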
Abstract:
Telecommunications have been in constant evolution during the past decades. Among the technological innovations, the use of digital technologies is particularly relevant. Digital communication systems have proven their efficiency and brought a new element into the chain of signal transmission and reception: the digital processor. This device gives new radio equipment the flexibility of a programmable system. Nowadays, the behavior of a communication system can be modified by simply changing its software. This gave rise to a new radio model called Software Defined Radio (SDR). In this new model, the task of defining the radio's behavior is moved to software, leaving to hardware only the implementation of the RF front end. Thus, the radio is no longer static, defined by its circuits; it becomes a dynamic element whose operating characteristics, such as bandwidth, modulation and coding rate, may be modified even at runtime according to the software configuration. This article presents the use of GNU Radio, an open-source solution for SDR-specific applications, as a tool for developing configurable digital radios.
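As a hedged illustration of the SDR idea, the sketch below builds a minimal GNU Radio flowgraph in Python (assuming a recent GNU Radio release with Python bindings installed); the block choices and the runtime retuning are only an example of defining and changing radio behavior purely in software, not an application from the article.

```python
# Minimal GNU Radio flowgraph sketch (assumes GNU Radio 3.8+ with Python
# bindings). It only illustrates the SDR idea: the "radio" behaviour (here,
# the tone frequency) is set and changed in software, while hardware would
# only provide the RF front end.
from gnuradio import gr, analog, blocks

class ToneRadio(gr.top_block):
    def __init__(self, samp_rate=32000, freq=1000):
        gr.top_block.__init__(self, "tone_radio")
        self.src = analog.sig_source_f(samp_rate, analog.GR_COS_WAVE, freq, 0.5)
        self.throttle = blocks.throttle(gr.sizeof_float, samp_rate)
        self.sink = blocks.null_sink(gr.sizeof_float)  # replace with an RF sink on real hardware
        self.connect(self.src, self.throttle, self.sink)

    def retune(self, new_freq):
        # operating characteristics changed at runtime, purely in software
        self.src.set_frequency(new_freq)

if __name__ == "__main__":
    tb = ToneRadio()
    tb.start()
    tb.retune(2500)   # "reconfigure the radio" while it is running
    tb.stop()
    tb.wait()
```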
Abstract:
An optimal control strategy for the highly active antiretroviral therapy associated with the acquired immunodeficiency syndrome should be designed based on a comprehensive analysis of drug chemotherapy behavior in the host tissues, from major viral replication sites to viral sanctuary compartments. Such an approach is critical in order to efficiently explore synergistic, competitive and prohibitive relationships among drugs and, hence, to minimize therapy costs and side effects. In this paper, a novel mathematical model for HIV-1 drug chemotherapy dynamics in distinct host anatomic compartments is proposed and theoretically evaluated on fifteen conventional antiretroviral drugs. Rather than depending on the drug type alone, simulated results suggest that a drug's concentration profile is strongly correlated with the host tissue under consideration. Furthermore, the drug accumulation dynamics are drastically affected by low patient compliance with pharmacotherapy, even when a single dose is missed. (C) 2012 Elsevier Inc. All rights reserved.
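The following sketch is not the paper's fifteen-drug, multi-compartment model; it is a generic two-compartment, first-order pharmacokinetic toy (all rate constants and doses are arbitrary assumptions) used only to illustrate how skipping a single dose changes the accumulated drug level in a slowly equilibrating compartment.

```python
import numpy as np

# Illustrative two-compartment pharmacokinetic toy (not the paper's model):
# a unit bolus dose enters the "plasma" compartment every 24 h and exchanges
# with a slowly equilibrating "tissue" compartment. All rates are arbitrary.

def simulate(days=10, skip_dose_on_day=None, dt=0.01):
    k_elim, k_in, k_out = 0.20, 0.05, 0.02       # 1/h: elimination, plasma->tissue, tissue->plasma
    steps_per_day = int(24 / dt)
    plasma, tissue = 0.0, 0.0
    tissue_hist = []
    for step in range(days * steps_per_day):
        day, in_day = divmod(step, steps_per_day)
        if in_day == 0 and day != skip_dose_on_day:
            plasma += 1.0                        # unit bolus dose at the start of each day
        d_plasma = -(k_elim + k_in) * plasma + k_out * tissue
        d_tissue = k_in * plasma - k_out * tissue
        plasma += d_plasma * dt                  # explicit Euler step
        tissue += d_tissue * dt
        tissue_hist.append(tissue)
    return np.array(tissue_hist)

full = simulate()
missed = simulate(skip_dose_on_day=5)
print(f"final tissue level, full compliance : {full[-1]:.3f}")
print(f"final tissue level, one missed dose : {missed[-1]:.3f}")
```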
Abstract:
We review recent visualization techniques aimed at supporting tasks that require the analysis of text documents, from approaches targeted at visually summarizing the relevant content of a single document to those aimed at assisting exploratory investigation of whole collections of documents. Techniques are organized considering their target input material (either single texts or collections of texts) and their focus, which may be on displaying content, emphasizing relevant relationships, highlighting the temporal evolution of a document or collection, or helping users to handle results from a query posed to a search engine. We describe the approaches adopted by distinct techniques and briefly review the strategies they employ to obtain meaningful text models; we discuss how they extract the information required to produce representative visualizations, the tasks they intend to support, the interaction issues involved, and their strengths and limitations. Finally, we show a summary of the techniques, highlighting their goals and distinguishing characteristics. We also briefly discuss some open problems and research directions in the fields of visual text mining and text analytics.
Abstract:
Turbulence is one of the key problems of classical physics, and it has been the object of intense research in recent decades in a large spectrum of problems involving fluids, plasmas and waves. In order to review some advances in theoretical and experimental investigations on turbulence, a mini-symposium on this subject was organized at the Dynamics Days South America 2010 Conference. The main goal of this mini-symposium was to present recent developments in both fundamental aspects and dynamical analysis of turbulence in nonlinear waves and fusion plasmas. In this paper we present a summary of the works presented at this mini-symposium. Among the questions addressed were the onset and control of turbulence and spatio-temporal chaos. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
We introduce a five-parameter continuous model, called the McDonald inverted beta distribution, to extend the two-parameter inverted beta distribution and provide new four- and three-parameter sub-models. We give a mathematical treatment of the new distribution, including expansions for the density function, moments, generating and quantile functions, mean deviations, entropy and reliability. The model parameters are estimated by maximum likelihood and the observed information matrix is derived. An application of the new model to real data shows that it can consistently give a better fit than other important lifetime models. (C) 2012 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
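As a sketch of the construction (assuming the usual McDonald-G form with a beta-prime, i.e. inverted beta, baseline; the paper's exact parameterization may differ), the code below evaluates a five-parameter density of this kind and checks numerically that it integrates to one.

```python
import numpy as np
from scipy import integrate, special, stats

# Sketch of a McDonald-G construction applied to a beta-prime ("inverted beta")
# baseline. This parameterization is an assumption for illustration only; it
# shows the five-parameter idea (a, b, c plus the two baseline shapes) and
# verifies numerically that the resulting density integrates to 1.

def mcdonald_inverted_beta_pdf(x, a, b, c, alpha, beta):
    g = stats.betaprime.pdf(x, alpha, beta)      # baseline density
    G = stats.betaprime.cdf(x, alpha, beta)      # baseline distribution function
    return c / special.beta(a, b) * g * G**(a * c - 1) * (1.0 - G**c)**(b - 1)

params = dict(a=1.5, b=2.0, c=0.8, alpha=2.0, beta=3.0)
total, _ = integrate.quad(mcdonald_inverted_beta_pdf, 0, np.inf, args=tuple(params.values()))
print(f"integral of the density over (0, inf) ~= {total:.4f}")   # should be close to 1
```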
Abstract:
Recently, many chaos-based communication systems have been proposed. They can present many of the interesting properties of spread-spectrum modulations and can represent a low-cost increase in security. However, their major drawback is that their overall Bit Error Rate (BER) performance is worse than that of their conventional counterparts. In this paper, we review some innovative techniques that can be used to make chaos-based communication systems attain lower levels of BER in non-ideal environments. In particular, we succinctly describe techniques to counter the effects of finite bandwidth, additive noise and delay in the communication channel. Although much research is still necessary for chaos-based communications to compete with conventional techniques, the presented results are auspicious. (C) 2011 Elsevier B.V. All rights reserved.
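As a toy version of the kind of BER experiment discussed, the sketch below simulates a coherent chaos-shift-keying link over an AWGN channel with a perfectly synchronized correlator receiver; it is not one of the reviewed techniques, and all parameters are illustrative.

```python
import numpy as np

# Toy coherent chaos-shift-keying (CSK) link: each bit modulates a segment of
# a chaotic spreading sequence, AWGN is added, and a correlator that knows the
# chaotic chips exactly (perfect synchronization) decides the bit.

rng = np.random.default_rng(1)

def logistic_chips(n, x0=0.37):
    x, out = x0, np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)                  # logistic map in its chaotic regime
        out[i] = x
    return 2.0 * out - 1.0                       # center the chips around zero

def bit_error_rate(snr_db, n_bits=2000, chips_per_bit=8):
    bits = rng.integers(0, 2, n_bits) * 2 - 1    # antipodal bits in {-1, +1}
    chips = logistic_chips(n_bits * chips_per_bit).reshape(n_bits, chips_per_bit)
    tx = bits[:, None] * chips                   # each bit modulates a chaotic segment
    noise_power = np.mean(tx**2) / 10**(snr_db / 10)
    rx = tx + rng.normal(0.0, np.sqrt(noise_power), tx.shape)
    decisions = np.sign(np.sum(rx * chips, axis=1))   # correlate with the known chips
    return np.mean(decisions != bits)

for snr in (-10, -5, 0):
    print(f"chip SNR {snr:3d} dB -> BER ~ {bit_error_rate(snr):.3f}")
```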
Abstract:
Since the mid-1980s the Atomic Force Microscope has been one of the most powerful tools for surface investigation, and since 1995 non-contact AFM has achieved true atomic resolution. The Frequency-Modulated Atomic Force Microscope (FM-AFM) operates in the dynamic mode, which means that the control system of the FM-AFM must force the microcantilever to oscillate with constant amplitude and frequency. However, tip-sample interaction forces cause modulations in the microcantilever motion. A phase-locked loop (PLL) is used to demodulate the tip-sample interaction forces from the microcantilever motion. The demodulated signal is used as the feedback signal to the control system and to generate both topographic and dissipation images. As a consequence, a proper design of the PLL is vital to the FM-AFM performance. In this work, using bifurcation analysis, the lock-in range of the PLL is determined as a function of the frequency shift (Q) of the microcantilever and of the other design parameters, providing a technique to properly design the PLL in the FM-AFM system. (C) 2011 Elsevier B.V. All rights reserved.
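The sketch below is not the paper's PLL design; it is a minimal digital phase-locked loop (arctangent phase detector plus a PI loop filter, with arbitrary gains) that tracks a frequency-modulated oscillation standing in for the microcantilever motion and outputs the demodulated frequency shift.

```python
import numpy as np

# Toy digital PLL frequency demodulator. Sample rate, loop gains and the
# modulation values are arbitrary assumptions made only for this example.

fs = 100_000.0                                   # sample rate (Hz)
f0 = 10_000.0                                    # nominal oscillation frequency (Hz)
t = np.arange(0, 0.05, 1 / fs)
true_shift = 50.0 * np.sin(2 * np.pi * 100 * t)  # slow frequency modulation (Hz)
phase_in = 2 * np.pi * np.cumsum(f0 + true_shift) / fs
x = np.exp(1j * phase_in)                        # analytic (quadrature) input signal

kp, ki = 0.18, 0.008                             # proportional / integral loop gains
nco_phase, integ = 0.0, 0.0
est_shift = np.empty_like(t)
for n, sample in enumerate(x):
    err = np.angle(sample * np.exp(-1j * nco_phase))     # phase detector
    integ += ki * err
    freq = 2 * np.pi * f0 / fs + integ + kp * err        # NCO frequency (rad/sample)
    nco_phase += freq
    est_shift[n] = integ * fs / (2 * np.pi)              # demodulated shift (Hz)

tail = est_shift[len(t) // 2:]
print(f"tracked shift range: {tail.min():+.1f} .. {tail.max():+.1f} Hz (true: -50 .. +50)")
```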
Abstract:
We review symplectic nontwist maps that we have introduced to describe Lagrangian transport properties in magnetically confined plasmas in tokamaks. These nontwist maps are suitable to describe the formation and destruction of transport barriers in the shearless region (i.e., near the curve where the twist condition does not hold). The maps can be used to investigate two kinds of problems in plasmas with non-monotonic field profiles: the first is chaotic magnetic field line transport in plasmas with external resonant perturbations; the second is chaotic particle drift motion caused by electrostatic drift waves. The presented analytical maps, derived from plasma models with equilibrium field profiles and control parameters that are commonly measured in plasma discharges, can be used to investigate long-term transport properties. (C) 2011 Elsevier B.V. All rights reserved.
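For reference, the standard nontwist map is a common analytic example of this class (the paper's specific maps may differ); the sketch below iterates it for a few perturbation amplitudes and reports how far orbits launched above the shearless region penetrate downward, a crude probe of transport-barrier destruction. Parameter values are illustrative only.

```python
import numpy as np

# Standard nontwist map: y_{n+1} = y_n - b*sin(2*pi*x_n),
#                        x_{n+1} = x_n + a*(1 - y_{n+1}^2)  (mod 1).
# Orbits are launched above the shearless region; the deepest y they reach
# hints at whether the transport barrier blocks them. Values are illustrative.

def deepest_excursion(a, b, n_orbits=20, n_iter=5000):
    rng = np.random.default_rng(2)
    x = rng.uniform(0.0, 1.0, n_orbits)
    y = np.full(n_orbits, 0.8)                 # start all orbits above the shearless curve
    y_min = y.copy()
    for _ in range(n_iter):
        y = y - b * np.sin(2 * np.pi * x)
        x = (x + a * (1.0 - y**2)) % 1.0
        y_min = np.minimum(y_min, y)
    return y_min.min()

a = 0.615
for b in (0.2, 0.5, 0.8):
    print(f"a={a}, b={b}: deepest y reached = {deepest_excursion(a, b):+.3f}")
```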
Abstract:
With the globalization of financial markets, foreign investments have become vital for economies, mainly in emerging countries. In the last decades, the Brazilian exchange rate has appeared as a good indicator to measure either investors' confidence or risk aversion. Here, some events of global or national financial crisis are analyzed in an attempt to understand how they influenced the evolution of the dollar-real exchange rate. The theoretical tool used is the Lopez-Mancini-Calbet (LMC) complexity measure which, applied to real exchange-rate data, has shown good correspondence between critical events and measured patterns. (C) 2011 Elsevier B.V. All rights reserved.
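A minimal implementation of the LMC measure, computed as the product of the normalized Shannon entropy and the disequilibrium of a discretized series, is sketched below; the input series are synthetic placeholders, not actual exchange-rate data.

```python
import numpy as np

# LMC (Lopez-Ruiz-Mancini-Calbet) statistical complexity, C = H * D, where H
# is the normalized Shannon entropy of a discretized series and D is the
# "disequilibrium" (squared distance from the uniform distribution).

def lmc_complexity(series, n_bins=30):
    counts, _ = np.histogram(series, bins=n_bins)
    p = counts / counts.sum()
    p_nz = p[p > 0]
    H = -np.sum(p_nz * np.log(p_nz)) / np.log(n_bins)   # normalized entropy in [0, 1]
    D = np.sum((p - 1.0 / n_bins) ** 2)                 # disequilibrium
    return H * D

# Synthetic placeholder series (not real exchange-rate returns).
rng = np.random.default_rng(0)
gaussian_returns = rng.normal(0, 1, 10_000)
heavy_tailed_returns = rng.standard_t(df=2, size=10_000)
print(f"LMC, Gaussian placeholder series    : {lmc_complexity(gaussian_returns):.4f}")
print(f"LMC, heavy-tailed placeholder series: {lmc_complexity(heavy_tailed_returns):.4f}")
```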
Abstract:
Ubiquitous Computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, and using any device. In this scenario, new challenges arise for the practice of software development: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a plethora of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. In order to overcome such obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation (the implementation of static Web interfaces) and dynamic adaptation (the alteration, at execution time, of static interfaces so as to adapt them to different contexts of use). In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies, static and dynamic. In this line, we designed and implemented UbiCon, a framework over which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces, and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.
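Purely as a hypothetical sketch of the hybrid idea (the abstract does not describe the UbiCon API, so the names and adaptation rules below are invented for illustration), a statically authored interface description can be pruned at run time according to the usage context.

```python
# Hypothetical sketch only: not the actual UbiCon framework. A static
# interface description (authored once) is adapted dynamically to the
# current device and network context.

STATIC_INTERFACE = {                      # authored once, device-independent
    "widgets": ["header", "photo_gallery", "comment_box", "video_player"],
}

def adapt(interface, context):
    """Dynamic step: prune widgets that do not fit the current context."""
    widgets = list(interface["widgets"])
    if context.get("screen") == "small" and "photo_gallery" in widgets:
        widgets.remove("photo_gallery")            # too heavy for small screens
    if context.get("bandwidth_kbps", 10_000) < 500:
        widgets = [w for w in widgets if w != "video_player"]
    return {"widgets": widgets}

print(adapt(STATIC_INTERFACE, {"screen": "small", "bandwidth_kbps": 300}))
print(adapt(STATIC_INTERFACE, {"screen": "large"}))
```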
Abstract:
Consider a communication system in which a transmitter sends fixed-size packets of data at a uniform rate to a receiver. Consider also that these devices are connected by a packet-switched network, which introduces a random delay to each packet. Here we propose an adaptive clock recovery scheme capable of synchronizing the frequencies and the phases of these devices within specified limits of precision. This scheme for achieving frequency and phase synchronization is based on measurements of the packet arrival times at the receiver, which are used to control the dynamics of a digital phase-locked loop. The scheme performance is evaluated via numerical simulations performed using realistic parameter values. (C) 2011 Elsevier B.V. All rights reserved.
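In the spirit of the scheme described above, though not its exact algorithm, the sketch below lets a receiver observe jittered packet arrival times and drive a software PI loop that estimates the transmitter's packet period; the jitter level, gains and true period are arbitrary illustrative values.

```python
import numpy as np

# Toy adaptive clock recovery from packet arrival times: the receiver predicts
# the next arrival from its current period estimate and corrects period and
# phase with a PI controller driven by the prediction error.

rng = np.random.default_rng(3)
true_period = 1.000e-3                       # transmitter sends one packet per ms
n_packets = 5000
send_times = np.arange(n_packets) * true_period
arrivals = send_times + rng.uniform(0, 0.2e-3, n_packets)   # random network delay

period_est = 0.95e-3                         # receiver's initial (wrong) period estimate
next_expected = arrivals[0]
kp, ki = 0.05, 0.001                         # proportional / integral gains
for a in arrivals:
    err = a - next_expected                  # timing error seen by the loop
    period_est += ki * err                   # integral branch: adjust the period
    next_expected += period_est + kp * err   # proportional branch: adjust the phase

print(f"true period      : {true_period * 1e3:.6f} ms")
print(f"estimated period : {period_est * 1e3:.6f} ms")
```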
Abstract:
The hero's journey is a narrative structure identified by several authors in comparative studies on folklore and mythology. This storytelling template presents the stages of inner metamorphosis undergone by the protagonist after being called to an adventure. In a simplified version, this journey is divided into three acts separated by two crucial moments. Here we propose a discrete-time dynamical system for representing the protagonist's evolution. The suffering along the journey is taken as the control parameter of this system. The bifurcation diagram exhibits stationary, periodic and chaotic behaviors. In this diagram, there are transitions from fixed point to chaos and from limit cycle to fixed point. We found that the values of the control parameter corresponding to these two transitions are in quantitative agreement with the two critical moments of the three-act hero's journey identified in 10 movies appearing in the list of the 200 worldwide highest-grossing films. (C) 2011 Elsevier B.V. All rights reserved.
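The abstract does not give the proposed map, so the sketch below uses the logistic map as a stand-in to show the same qualitative repertoire (fixed point, periodic cycle, chaos) appearing as a single control parameter is swept, which is what a bifurcation diagram summarizes.

```python
# Stand-in example: the logistic map x -> r*x*(1-x), not the paper's system.
# Sweeping the control parameter r reveals fixed-point, periodic and chaotic
# regimes, the behaviours a bifurcation diagram would display.

def attractor_size(r, n_transient=2000, n_sample=200):
    x = 0.5
    for _ in range(n_transient):           # discard the transient
        x = r * x * (1.0 - x)
    samples = []
    for _ in range(n_sample):
        x = r * x * (1.0 - x)
        samples.append(round(x, 4))
    return len(set(samples))               # number of distinct attractor points (approx.)

for r in (2.8, 3.2, 3.5, 3.9):
    k = attractor_size(r)
    label = "fixed point" if k == 1 else ("chaotic" if k > 16 else f"period-{k} cycle")
    print(f"r = {r}: {label}")
```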
Abstract:
XML similarity evaluation has become a central issue in the database and information communities, its applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree related structural and semantic similarities, which are not sufficiently addressed while comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework to deal with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees), and to allow the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operations costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
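As a simplified illustration of tree-based edit operation costs (in the spirit of Selkow's top-down ordered-tree edit distance, not the framework proposed in the paper), the sketch below computes a distance in which nodes are inserted or deleted only together with their whole subtree and relabeling a node costs one unit; XML elements are modeled as (label, children) tuples.

```python
# Simplified ordered-tree edit distance (Selkow-style, top-down): subtrees are
# inserted or deleted as a whole, relabeling a node costs 1. This is only an
# illustration of tree edit operation costs, not the paper's comparison framework.

def size(tree):
    label, children = tree
    return 1 + sum(size(c) for c in children)

def distance(t1, t2):
    (l1, c1), (l2, c2) = t1, t2
    cost = 0 if l1 == l2 else 1                    # relabel the two roots if needed
    m, n = len(c1), len(c2)
    # edit-distance DP over the two child sequences; deleting or inserting a
    # child means removing or adding its entire subtree
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = dp[i - 1][0] + size(c1[i - 1])
    for j in range(1, n + 1):
        dp[0][j] = dp[0][j - 1] + size(c2[j - 1])
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            dp[i][j] = min(dp[i - 1][j] + size(c1[i - 1]),        # delete subtree
                           dp[i][j - 1] + size(c2[j - 1]),        # insert subtree
                           dp[i - 1][j - 1] + distance(c1[i - 1], c2[j - 1]))
    return cost + dp[m][n]

doc_a = ("article", [("title", []), ("authors", [("author", []), ("author", [])])])
doc_b = ("article", [("title", []), ("authors", [("author", [])]), ("year", [])])
print("edit distance:", distance(doc_a, doc_b))   # one author deleted, one year inserted -> 2
```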