818 results for LDPC, CUDA, GPGPU, computing, GPU, DVB, S2, SDR
Abstract:
Non-conventional database management systems are used to achieve better performance when dealing with complex data. One fundamental concept of these systems is object identity (OID). Two techniques can be used to implement OIDs: physical or logical. A logical implementation of OIDs, based on an Indirection Table, is used by NuGeM, a multimedia data manager kernel described in this paper. NuGeM's Indirection Table allows the relocation of any page in a database. The proposed strategy modifies the workings of this table so as to considerably reduce the number of I/O operations during the request and release of pages containing objects and their OIDs. Tests show a reduction of 84% in read operations and of 67% in write operations when pages are requested. Although no change was observed in write operations during the release of pages, a 100% reduction in read operations was obtained. © 2012 IEEE.
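The core of a logical OID scheme of the kind described can be sketched in a few lines; the class and method names below are illustrative, not NuGeM's actual interface:

```python
# Hypothetical sketch of logical OIDs resolved through an indirection
# table (names are illustrative, not NuGeM's API). The point of the
# logical scheme: relocating a page only rewrites table entries, so
# OIDs held elsewhere remain valid.

class IndirectionTable:
    def __init__(self):
        self._next_oid = 0
        self._map = {}  # OID -> (page_id, slot)

    def new_oid(self, page_id, slot):
        oid = self._next_oid
        self._next_oid += 1
        self._map[oid] = (page_id, slot)
        return oid

    def resolve(self, oid):
        return self._map[oid]

    def relocate_page(self, old_page, new_page):
        # Only the table changes; no object references are touched.
        for oid, (page, slot) in self._map.items():
            if page == old_page:
                self._map[oid] = (new_page, slot)

table = IndirectionTable()
oid = table.new_oid(page_id=7, slot=0)
table.relocate_page(old_page=7, new_page=42)
print(table.resolve(oid))  # -> (42, 0): the OID survives the relocation
```

With a physical OID (an encoded page address), the relocation above would instead require updating every stored reference to the moved objects.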
Abstract:
Transactional memory (TM) is a new synchronization mechanism devised to simplify parallel programming, thereby helping programmers to unleash the power of current multicore processors. Although software implementations of TM (STM) have been extensively analyzed in terms of runtime performance, little attention has been paid to an equally important constraint faced by nearly all computer systems: energy consumption. In this work we conduct a comprehensive study of energy and runtime tradeoffs in software transactional memory systems. We characterize the behavior of three state-of-the-art lock-based STM algorithms, along with three different conflict resolution schemes. As a result of this characterization, we propose a DVFS-based technique that can be integrated into the resolution policies so as to improve the energy-delay product (EDP). Experimental results show that our DVFS-enhanced policies are indeed beneficial for applications with high contention levels. Improvements of up to 59% in EDP can be observed in this scenario, with an average EDP reduction of 16% across the STAMP workloads. © 2012 IEEE.
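The intuition behind combining DVFS with conflict resolution can be shown with a toy EDP model (this is a made-up illustration, not the paper's implementation): wasted work from aborted transactions inflates both energy and delay, and slowing contended cores can pay off if it reduces the abort rate enough.

```python
# Toy energy-delay-product model. Assumes dynamic power scales roughly
# as f^3 (P ~ C * V^2 * f with V ~ f); the abort rates below are
# invented numbers, standing in for measured contention behavior.

def edp(cycles, freq, abort_rate):
    # Aborted/retried transactions inflate the useful work.
    effective_cycles = cycles * (1.0 + abort_rate)
    delay = effective_cycles / freq
    power = freq ** 3            # normalized dynamic power model
    energy = power * delay
    return energy * delay

high = edp(cycles=1e6, freq=2.0, abort_rate=4.0)  # contended, full speed
low = edp(cycles=1e6, freq=1.0, abort_rate=1.0)   # slower cores, fewer conflicts
print(low < high)                                 # -> True in this toy setting
```

The benefit hinges on the assumption (stated in the comments) that reduced frequency lowers the conflict rate; for low-contention workloads the slowdown dominates and full speed wins.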
Abstract:
The main concern in Wireless Sensor Network (WSN) algorithms and protocols is energy consumption. Thus, WSN lifetime is one of the most important metrics used to measure the performance of WSN approaches. Another important metric is spatial coverage, where the main goal is to obtain sensed data in a uniform way. This paper proposes an approach called the (m,k)-Gur Game, which aims at a trade-off between quality of service and increased spatial coverage diversity. Simulation results show the effectiveness of this approach. © 2012 IEEE.
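For context, the classic Gur Game on which the approach builds can be sketched as follows (the (m,k) extension, which adds coverage constraints on top, is not reproduced here; the target fraction and state depth are illustrative):

```python
import random

# Minimal sketch of the classic Gur Game for WSN duty-cycling.
# Each node runs a finite automaton with states -K..-1 ("sleep") and
# 1..K ("sense"). The base station rewards every node with probability
# r(f), a function that peaks at the desired fraction f* of active
# nodes; nodes need no knowledge of f* themselves.

def reward_probability(f, target=0.35):
    return max(0.0, 1.0 - abs(f - target) / max(target, 1 - target))

def step(states, K=3, rng=random):
    f = sum(1 for s in states if s > 0) / len(states)
    p = reward_probability(f)
    new_states = []
    for s in states:
        if rng.random() < p:   # reward: move deeper on the same side
            s = min(s + 1, K) if s > 0 else max(s - 1, -K)
        else:                  # penalty: move toward, and across, zero
            s = s - 1 if s > 0 else s + 1
            if s == 0:
                s = 1 if rng.random() < 0.5 else -1
        new_states.append(s)
    return new_states

states = [random.choice([-1, 1]) for _ in range(50)]
for _ in range(200):
    states = step(states)
print(sum(1 for s in states if s > 0) / len(states))  # active fraction
```

Over time the active fraction tends toward the maximum of the reward curve, which is how the scheme trades energy (few active nodes) against service quality.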
Abstract:
Constrained intervals, intervals as a mapping from [0, 1] to polynomials of degree one (linear functions) with non-negative slopes, and arithmetic on constrained intervals generate a space that turns out to be a cancellative abelian monoid, albeit with a richer set of properties than the usual (standard) space of interval arithmetic. This means that we have not only the classical embedding, as developed by H. Radström and S. Markov and extended by E. Kaucher, but also the properties of these polynomials. We study the geometry of the embedding of intervals into a quasilinear space and some of the properties of the mapping of constrained intervals into a space of polynomials. It is assumed that the reader is familiar with the basic notions of interval arithmetic and interval analysis. © 2013 Springer-Verlag Berlin Heidelberg.
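A minimal sketch of the construction described above: an interval [a, b] is represented as the linear function p(x) = a + (b − a)x on [0, 1], and operations act pointwise on the same parameter x. This is enough to exhibit the cancellation property that standard interval arithmetic lacks.

```python
# Constrained-interval sketch: [a, b] is p(x) = a + (b - a) x on [0, 1]
# (non-negative slope), and arithmetic is pointwise in x.

class CInterval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.a, self.slope = lo, hi - lo   # p(x) = a + slope * x

    def __call__(self, x):
        return self.a + self.slope * x

    def __sub__(self, other):
        # The pointwise difference of two linear functions is linear,
        # so its range over [0, 1] is attained at the endpoints.
        vals = (self(0.0) - other(0.0), self(1.0) - other(1.0))
        return (min(vals), max(vals))

X = CInterval(1.0, 3.0)
print(X - X)   # -> (0.0, 0.0): cancellation holds, whereas standard
               # interval arithmetic gives [1,3] - [1,3] = [-2, 2]
```

Because both occurrences of X share the same parameter x, the dependency between them is preserved; standard interval arithmetic treats the two copies as independent, which is exactly what inflates [1, 3] − [1, 3] to [−2, 2].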
Abstract:
Identification and classification of overlapping nodes in networks are important topics in data mining. In this paper, a network-based (graph-based) semi-supervised learning method is proposed. It is based on competition and cooperation among particles walking in a network to uncover overlapping nodes by generating continuous-valued outputs (soft labels), corresponding to the levels of membership of the nodes in each of the communities. Moreover, the proposed method can be applied to detect overlapping data items in a data set of general form, such as a vector-based data set, once it is transformed to a network. Usually, label propagation involves risks of error amplification. In order to avoid this problem, the proposed method offers a mechanism to identify outliers among the labeled data items, and consequently prevents error propagation from such outliers. Computer simulations carried out for synthetic and real-world data sets provide a numeric quantification of the performance of the method. © 2012 Springer-Verlag.
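As a baseline illustration of the kind of soft (continuous-valued) labels the method produces, plain graph label propagation is shown below; the competition/cooperation dynamics of the particle-based method itself are more involved and are not reproduced here.

```python
import numpy as np

# Baseline only: clamped label propagation on a graph, producing soft
# membership levels per community. Nodes bridging two communities end
# up with mixed (overlapping) memberships.

def propagate(adj, labels, n_iter=50):
    # adj: (n, n) symmetric adjacency matrix; labels: (n, c) one-hot
    # rows for labeled nodes, zero rows for unlabeled ones.
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1e-12)       # row-stochastic transitions
    F = labels.astype(float).copy()
    labeled = labels.sum(axis=1) > 0
    for _ in range(n_iter):
        F = P @ F
        F[labeled] = labels[labeled]       # clamp the labeled nodes
    return F                               # rows = soft memberships

# Two triangles joined by one edge; one labeled node per triangle.
adj = np.array([[0, 1, 1, 0, 0, 0],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 1, 0, 0],
                [0, 0, 1, 0, 1, 1],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 1, 1, 0]], dtype=float)
labels = np.zeros((6, 2))
labels[0, 0] = 1.0   # node 0 labeled community A
labels[5, 1] = 1.0   # node 5 labeled community B
F = propagate(adj, labels)
print(F[2], F[3])    # bridge nodes: mixed memberships, A- and B-leaning
```

The clamping step is also where the paper's outlier mechanism would differ: here labeled nodes always reimpose their labels, so a mislabeled outlier would propagate its error unchecked.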
Abstract:
This paper proposes a new hybrid multi-population genetic algorithm (HMPGA) as an approach to solve the multi-level capacitated lot sizing problem with backlogging. The method combines a multi-population-based metaheuristic with a fix-and-optimize heuristic and mathematical programming techniques. A total of four test sets from the MULTILSB (Multi-Item Lot-Sizing with Backlogging) library are solved and the results are compared with those reached by two other recently published methods. The results show that HMPGA performed better on most of the test sets, especially when longer computing times are allowed. © 2012 Elsevier Ltd.
Abstract:
The day-to-day of medical practice is marked by a constant search for accurate diagnosis and therapeutic assessment. For this purpose physicians draw on a wide variety of imaging techniques; however, methods using ionizing radiation are still the most widely used, since they are considered cheaper and, above all, very efficient when applied with control and quality. Optimization of the risk-benefit ratio is considered a major advance over conventional radiology, but this is not yet the reality of computed and digital radiology, for which Brazil has not established standards and protocols. This work aims to optimize computed chest radiographs (anterior-posterior projection, AP). To achieve this objective, homogeneous phantoms were used that simulate the absorption and scattering characteristics of the chest of a standard patient. Another factor studied was the subjective evaluation of image quality, carried out through visual grading assessment (VGA) by specialists in radiology using an anthropomorphic phantom, in order to identify the best image for a particular pathology (fracture or pneumonia). The images indicated by the radiologists were then quantified through physical parameters (Detective Quantum Efficiency, DQE; Modulation Transfer Function, MTF; and Noise Power Spectrum, NPS) using MatLab® software. © 2013 Springer-Verlag.
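Of the physical parameters mentioned, the MTF is the most straightforward to illustrate: it can be estimated as the normalized Fourier magnitude of the line spread function (LSF), itself the derivative of an edge spread function (ESF) measured from an edge image. The sketch below uses a synthetic ESF and is an illustration of the standard technique, not the paper's MatLab® code.

```python
import numpy as np

# Estimate an MTF from a (here, synthetic) edge spread function.
# The pixel pitch is an assumed value used only to label frequencies.

def mtf_from_esf(esf, pixel_pitch_mm=0.1):
    lsf = np.gradient(esf)                    # ESF -> LSF
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]              # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)  # cycles/mm
    return freqs, mtf

# Synthetic ESF: a smoothed step edge (logistic profile).
x = np.linspace(-5, 5, 256)
esf = 1.0 / (1.0 + np.exp(-x / 0.4))
freqs, mtf = mtf_from_esf(esf)
print(mtf[0], mtf[-1])   # unity at zero frequency, falling with frequency
```

A sharper edge (smaller width in the logistic profile) yields a broader LSF spectrum and hence an MTF that stays high out to larger spatial frequencies.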
Abstract:
Risk management, carried out effectively, leads software development to success and may influence the organization. Knowledge is part of such a process as a way to support decision making. This research aimed to analyze the use of Knowledge Management techniques for Risk Management in software project development and their possible influence on enterprise revenue. Its main study subjects were Brazilian incubated and graduated software development enterprises. The chosen research method was a survey. Multivariate statistical methods were used for the treatment and analysis of the results, identifying the most significant factors: those constraining the enterprises' achievements and those driving them. Among the latter we highlight the knowledge methodology, the age of the enterprise, the number of employees, and knowledge externalization. The results encourage actions that contribute to increased financial revenue. © 2013 Springer-Verlag.
Abstract:
Four-fermion operators have been utilized in the past to link the quark-exchange processes in the interaction of hadrons with the effective meson-exchange amplitudes. In this paper, we apply a similar idea of Fierz rearrangement to the electromagnetic processes and focus on the electromagnetic form factors of the nucleon and the electron. We explain the motivation of using four-fermion operators and discuss the advantage of this method in computing electromagnetic processes.
Abstract:
Four-fermion operators have been used in the past to link the quark-exchange processes in the interaction of hadrons with the effective meson-exchange amplitudes. In this paper, we apply a similar idea of Fierz rearrangement to the self-energy and electromagnetic processes and focus on the electromagnetic form factors of the nucleon and the electron. We explain the motivation of using four-fermion operators and discuss the advantage of this method in computing electromagnetic processes. © 2013 American Physical Society.
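Schematically, a Fierz rearrangement re-couples the fields of a four-fermion contact term. In a complete basis of Dirac bilinears Γ^A ∈ {1, γ^μ, σ^{μν}, γ^μγ_5, γ_5} it takes the generic form below; the coefficients C^A_B depend on the chosen basis and normalization and are not taken from the paper.

```latex
% Generic Fierz rearrangement of a four-fermion operator:
(\bar{\psi}_1 \Gamma^{A} \psi_2)\,(\bar{\psi}_3 \Gamma_{A} \psi_4)
  \;=\; \sum_{B} C^{A}{}_{B}\,
  (\bar{\psi}_1 \Gamma^{B} \psi_4)\,(\bar{\psi}_3 \Gamma_{B} \psi_2)
```

For anticommuting field operators the exchange of ψ_2 and ψ_4 introduces an additional overall sign.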
Abstract:
In many production processes, a key material is prepared and then transformed into different final products. The lot sizing decisions concern not only the production of final products, but also that of material preparation in order to take account of their sequence-dependent setup costs and times. The amount of research in recent years indicates the relevance of this problem in various industrial settings. In this paper, facility location reformulation and strengthening constraints are newly applied to a previous lot-sizing model in order to improve solution quality and computing time. Three alternative metaheuristics are used to fix the setup variables, resulting in much improved performance over previous research, especially regarding the use of the metaheuristics for larger instances. © 2013 Elsevier Ltd. All rights reserved.
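For readers unfamiliar with the facility location reformulation of lot sizing, its generic single-item form disaggregates production by destination period (the notation here is illustrative, not the paper's):

```latex
% z_{ts}: quantity produced in period t to meet the demand d_s of
% period s; y_t: setup indicator; s_t, c_{ts}: setup and unit costs.
\begin{align}
\min\ & \sum_{t=1}^{T} s_t\, y_t
        + \sum_{t=1}^{T} \sum_{s=t}^{T} c_{ts}\, z_{ts} \\
\text{s.t.}\ & \sum_{t=1}^{s} z_{ts} = d_s,
        && s = 1, \dots, T, \\
& z_{ts} \le d_s\, y_t, && 1 \le t \le s \le T, \\
& z_{ts} \ge 0, \qquad y_t \in \{0, 1\}.
\end{align}
```

The disaggregated constraints z_{ts} ≤ d_s y_t give a tighter LP relaxation than a single aggregated big-M setup-forcing constraint per period, which is what improves solution quality and computing time when the setup variables are then fixed by a metaheuristic.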
Abstract:
Wireless Sensor Networks (WSNs) can be used to monitor hazardous and inaccessible areas. In these situations, the power supply (e.g. battery) of each node cannot be easily replaced. One solution to deal with the limited capacity of current power supplies is to deploy a large number of sensor nodes, since the lifetime and dependability of the network will increase through cooperation among nodes. Applications on WSN may also have other concerns, such as meeting temporal deadlines on message transmissions and maximizing the quality of information. Data fusion is a well-known technique that can be useful for the enhancement of data quality and for the maximization of WSN lifetime. In this paper, we propose an approach that allows the implementation of parallel data fusion techniques in IEEE 802.15.4 networks. One of the main advantages of the proposed approach is that it enables a trade-off between different user-defined metrics through the use of a genetic machine learning algorithm. Simulations and field experiments performed in different communication scenarios highlight significant improvements when compared with, for instance, the Gur Game approach or the implementation of conventional periodic communication techniques over IEEE 802.15.4 networks. © 2013 Elsevier B.V. All rights reserved.
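The idea of letting a genetic algorithm balance user-defined metrics can be sketched generically; the candidate encoding and the fitness model below are entirely made up for the illustration and are not the paper's algorithm.

```python
import random

# Toy genetic search over per-node duty cycles, scoring candidates by
# a weighted, user-defined trade-off between network lifetime and
# sensing quality. All model details here are invented for the sketch.

N_NODES, POP, GENS = 8, 20, 60

def fitness(duty, w_lifetime=0.5, w_quality=0.5):
    lifetime = 1.0 - sum(duty) / len(duty)   # less awake time, longer life
    quality = min(duty)                      # worst-covered node dominates
    return w_lifetime * lifetime + w_quality * quality

def evolve(rng=None):
    rng = rng or random.Random(42)
    pop = [[rng.random() for _ in range(N_NODES)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: POP // 2]              # keep the best half
        children = []
        while len(elite) + len(children) < POP:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, N_NODES)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(N_NODES)       # point mutation, clamped
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0.0, 0.1)))
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(round(fitness(best), 3))
```

Changing the weights (or adding terms such as deadline-miss penalties) steers the search toward a different point of the trade-off, which is the property the abstract highlights.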
Abstract:
This paper presents a usability evaluation of the MTE (Ministry of Labor and Employment) website, measuring effectiveness, efficiency, and user satisfaction with the website. The participants were 12 users (7 female and 5 male). The results indicate that, despite the participants' education levels and computing experience, many of them had difficulty finding information and would not recommend the site. © 2013 Springer-Verlag Berlin Heidelberg.
Abstract:
This paper presents simulation results for the DNP3 communication protocol over a TCP/IP network, for Smart Grid applications. The simulation was performed using the NS-2 network simulator. The study used simulation to verify the performance of the DNP3 protocol in a heterogeneous LAN. Analyzing the results, it was possible to verify that DNP3 over a network with heterogeneous traffic works well, with low packet loss and low delay, when communication channel utilization is between 60 and 85 percent; however, with traffic above 85 percent, the use of DNP3 becomes unfeasible because information loss, retransmissions, and latency increase significantly. © 2013 IEEE.
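The threshold behavior reported is consistent with basic queueing intuition: in an M/M/1 queue, mean waiting time grows as ρ/(1 − ρ) and explodes as utilization ρ approaches 1. This back-of-the-envelope model only illustrates the shape of the effect; the paper's NS-2 results are empirical.

```python
# M/M/1 mean waiting time (normalized to the mean service time):
# W = rho / (1 - rho). Assumes Poisson arrivals and exponential
# service, which real DNP3/TCP traffic need not satisfy.

def mm1_mean_wait(rho, service_time=1.0):
    assert 0.0 <= rho < 1.0
    return service_time * rho / (1.0 - rho)

for rho in (0.60, 0.85, 0.95):
    print(rho, round(mm1_mean_wait(rho), 2))
# 0.60 -> 1.5, 0.85 -> 5.67, 0.95 -> 19.0 service times
```

Between 60 and 85 percent utilization the wait grows less than fourfold, but each further step toward saturation multiplies it again, matching the observed unfeasibility above 85 percent.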
Abstract:
The structural stability of vector fields with impasse regular curves on S² is studied and a version of Peixoto's Theorem is established. Moreover, a global analysis of normal forms of the constrained systems A(x)·ẋ = F(x), x ∈ R³, A ∈ M(3), F: R³ → R³, is made in the Poincaré ball (i.e., in the compactification of R³ with the sphere S² at infinity). © 2013 Elsevier Masson SAS.