983 results for Software radio architecture
Abstract:
Modern High-Performance Computing (HPC) systems are steadily growing in size and complexity, driven by the corresponding demand for larger simulations requiring more complicated tasks and higher accuracy. However, as Dennard scaling approaches its ultimate power limit, software efficiency also plays an important role in increasing the overall performance of a computation. Tools that measure application performance in these increasingly complex environments provide insights into the intricate ways in which software and hardware interact. Power consumption can be monitored, with the aim of saving energy, through processor interfaces such as Intel's Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool such as the Performance Application Programming Interface (PAPI). Since problems in many heterogeneous fields can be represented as large linear systems, an optimized and scalable linear system solver can significantly decrease the time spent computing their solution. One of the most widely used algorithms for solving the large systems arising in simulations is Gaussian Elimination, whose most popular implementation for HPC systems is found in the Scalable Linear Algebra PACKage (ScaLAPACK) library. Another relevant algorithm, which is gaining popularity in the academic field, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of Gaussian Elimination from ScaLAPACK, profiling their execution while solving linear systems on the HPC architecture offered by CINECA. Moreover, it collates the energy and power values for different rank, node, and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, which are integrated with the parallel execution of the algorithms managed through the Message Passing Interface (MPI).
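As a concrete illustration of the monitoring layer described above, the sketch below reads the package energy counter that RAPL exposes through the Linux powercap sysfs interface, the same counter that PAPI's rapl component wraps for instrumented C/MPI codes. It is a minimal example, assuming a Linux host with the intel-rapl:0 package domain present; the sleep stands in for the actual solver run.

```python
# Minimal sketch: sample the RAPL package-energy counter (microjoules)
# via the Linux powercap interface around a region of interest.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"        # package 0
RAPL_MAX = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def read_uj(path: str) -> int:
    with open(path) as f:
        return int(f.read())

start, t0 = read_uj(RAPL_ENERGY), time.time()

time.sleep(1.0)  # placeholder for the linear system solve under measurement

delta, elapsed = read_uj(RAPL_ENERGY) - start, time.time() - t0
if delta < 0:                      # the counter wraps around; undo one wrap
    delta += read_uj(RAPL_MAX)

print(f"energy: {delta / 1e6:.2f} J, "
      f"average power: {delta / 1e6 / elapsed:.2f} W")
```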
Abstract:
The Bundle Protocol (BP) version 7 has recently been standardized by the IETF in RFC 9171, but it is the whole DTN (Delay-/Disruption-Tolerant Networking) architecture, of which BP is the core, that is gaining renewed interest thanks to its planned adoption in future space missions. This is obviously positive, but at the same time it seems to make space agencies more interested in deployment than in research, with new BP implementations that may challenge the central role played until now by the historical BP reference implementations, such as ION and DTNME. To make Unibo research on DTN independent of space agency decisions, the development of an in-house BP implementation was in order. This is the goal of this thesis, which deals with the design and implementation of Unibo-BP: a novel, research-driven BP implementation, to be released as Free Software. Unibo-BP is fully compliant with RFC 9171, as demonstrated by a series of interoperability tests with ION and DTNME, and presents a few innovations, such as the ability to manage remote DTN nodes by means of the BP itself. Unibo-BP is compatible with pre-existing Unibo implementations of CGR (Contact Graph Routing) and LTP (Licklider Transmission Protocol) thanks to interfaces designed during the thesis. The thesis project also includes an implementation of TCPCLv3 (TCP Convergence Layer version 3, RFC 7242), which can be used as an alternative to LTPCL to connect with proximate nodes, especially in terrestrial networks. In summary, Unibo-BP is at the heart of a larger project, Unibo-DTN, which aims to implement the main components of a complete DTN stack (BP, TCPCL, LTP, CGR). Moreover, Unibo-BP is compatible with all DTNsuite applications, thanks to an extension of the Unified API library on which DTNsuite applications are based. The hope is that Unibo-BP and all the ancillary programs developed during this thesis will contribute to the growth of DTN popularity in academia and among space agencies.
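To give a flavor of what RFC 9171 compliance means at the wire level, here is a hedged sketch of a BPv7 primary block encoded in CBOR with the cbor2 package. The field order follows the RFC, but the endpoint IDs, flags, and lifetime are invented example values, and the CRC is omitted (CRC type 0) for brevity; a production implementation such as Unibo-BP would normally protect the block with a CRC-16 or CRC-32C.

```python
# Illustrative RFC 9171 (BPv7) primary block, CBOR-encoded with cbor2.
import time
import cbor2

DTN_EPOCH = 946684800  # 2000-01-01T00:00:00Z as Unix seconds

def dtn_time_ms() -> int:
    """Milliseconds since the DTN epoch, as BPv7 timestamps require."""
    return int((time.time() - DTN_EPOCH) * 1000)

primary_block = [
    7,                       # protocol version
    0,                       # bundle processing control flags
    0,                       # CRC type (0 = none; illustration only)
    [1, "//dest-node/app"],  # destination EID (URI scheme 1 = "dtn")
    [1, "//src-node/app"],   # source node EID
    [1, "//src-node/app"],   # report-to EID
    [dtn_time_ms(), 0],      # creation timestamp [time, sequence number]
    86_400_000,              # lifetime in milliseconds (one day)
]

print(cbor2.dumps(primary_block).hex())
```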
Abstract:
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar, and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with the i-CAT Next Generation scanner. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D®, and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCat (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
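The comparison against the gold standard can be reproduced with standard statistical tooling; the sketch below runs Dunnett's post-hoc test with each software package as a treatment group and the physical measurements as the control. It assumes SciPy 1.11 or later (which added scipy.stats.dunnett), and all numbers are made-up examples, not the study's data.

```python
# Hedged sketch: mean bias of each package vs the gold standard, plus
# Dunnett's test of every software group against the control group.
import numpy as np
from scipy import stats

gold = np.array([10.2, 11.5, 9.8, 12.1, 10.9])       # sectioned mandibles (mm)
xorancat = np.array([10.5, 11.7, 10.0, 12.4, 11.1])  # hypothetical values
ondemand3d = np.array([10.1, 11.4, 9.7, 12.0, 10.8])
kdis3d = np.array([10.0, 11.4, 9.6, 12.0, 10.7])

for name, vals in [("XoranCat", xorancat), ("OnDemand3D", ondemand3d),
                   ("KDIS3D", kdis3d)]:
    print(f"{name}: mean difference {np.mean(vals - gold):+.2f} mm")

res = stats.dunnett(xorancat, ondemand3d, kdis3d, control=gold)
print("p-values vs gold standard:", res.pvalue)
```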
Abstract:
Universidade Estadual de Campinas, Faculdade de Educação Física
Abstract:
This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using an open CNC for machining process and production monitoring is presented, with emphasis on a CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses CNC data and sensors to gather information about the machining process and production. It allows the development of monitoring systems at different levels with minimal investment, minimal need for sensor installation, and low intrusiveness to the process. Successful examples of the utilization of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.
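The following conceptual sketch illustrates the idea: a PC-based human-machine interface process periodically samples data already available in the open CNC alongside one external sensor, serving both process- and production-level monitoring. Every accessor here is a hypothetical stand-in returning simulated values; real bindings are vendor-specific.

```python
# Conceptual sketch of PC-based monitoring on an open CNC. The accessors
# are hypothetical placeholders that simulate values for illustration.
import random
import time

ALARM_SPINDLE_LOAD = 85.0  # percent; made-up threshold for illustration

def read_cnc_variable(name: str) -> float:
    """Stand-in for reading internal CNC data (no extra sensor needed)."""
    return random.uniform(0.0, 100.0)  # simulated value

def read_sensor(channel: int) -> float:
    """Stand-in for one low-cost external sensor on a DAQ channel."""
    return random.uniform(0.0, 1.0)    # simulated vibration level

def monitor(samples: int = 50, period_s: float = 0.1) -> None:
    for _ in range(samples):
        load = read_cnc_variable("spindle_load")   # process monitoring
        count = read_cnc_variable("part_counter")  # production monitoring
        vib = read_sensor(0)
        if load > ALARM_SPINDLE_LOAD:
            print(f"alarm: spindle load {load:.1f}% "
                  f"(parts={count:.0f}, vibration={vib:.2f})")
        time.sleep(period_s)

monitor()
```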
Abstract:
Stingless bees exhibit extraordinary variation in nest architecture within and among species. To test for phylogenetic association of behavioral traits across species of the Neotropical stingless bee genus Trigona s.s., a phylogenetic hypothesis was generated by combining sequence data of 24 taxa from one mitochondrial (16S rRNA) and four nuclear gene fragments (long-wavelength rhodopsin copy 1 (opsin), elongation factor-1 alpha copy F2, arginine kinase, and 28S rRNA). Fifteen characteristics of the nest architecture were coded and tested for phylogenetic association. Several characters show significant phylogenetic signal, including type of nesting substrate, nest construction material, and hemipterophily, the tending of hemipteroid insects in exchange for sugar excretions. Phylogenetically independent habits encountered in Trigona s.s. include coprophily and necrophagy.
Abstract:
Aims. We calculate the theoretical event rate of gamma-ray bursts (GRBs) from the collapse of massive first-generation (Population III; Pop III) stars. Pop III GRBs could be super-energetic, with isotropic energies up to E_iso ≳ 10^55–10^57 erg, providing a unique probe of the high-redshift Universe. Methods. We consider both the so-called Pop III.1 stars (primordial) and Pop III.2 stars (primordial but affected by radiation from other stars). We employ a semi-analytical approach that considers inhomogeneous hydrogen reionization and the chemical evolution of the intergalactic medium. Results. We show that Pop III.2 GRBs occur more than 100 times more frequently than Pop III.1 GRBs, and should thus be suitable targets for future GRB missions. Interestingly, our optimistic model predicts an event rate that is already constrained by current radio transient searches. We expect ~10–10^4 radio afterglows above ~0.3 mJy on the sky, with ~1 year variability and mostly without GRBs (orphans), detectable by ALMA, EVLA, LOFAR, and SKA. We expect to observe at most N < 20 GRBs per year integrated over z > 6 for Pop III.2 and N < 0.08 per year integrated over z > 10 for Pop III.1 with EXIST, and N < 0.2 Pop III.2 GRBs per year integrated over z > 6 with Swift.
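The semi-analytical estimate rests on a rate integral whose generic form is reconstructed below; this is the standard shape of such calculations, offered as a hedged sketch rather than the paper's exact formulation.

```latex
% Observable GRB rate per unit redshift: GRB formation efficiency times
% the Pop III star formation rate density, time-dilated by (1+z) and
% weighted by the comoving volume element.
\frac{dN}{dt\,dz} \;=\; \eta_{\mathrm{GRB}}\,
    \frac{\Psi_{\mathrm{III}}(z)}{1+z}\,\frac{dV}{dz}
```

Here Ψ_III(z) is the Pop III star formation rate density, η_GRB the number of observable (beamed) GRBs produced per unit stellar mass formed, and the factor 1/(1+z) accounts for cosmological time dilation.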
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PLs). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that are systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and a discussion of related work are also presented.
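As a language-agnostic illustration of the concepts SMartyProfile annotates in UML (variation points resolved into variants when a product is configured), the sketch below models one variation point and the resolution of a product configuration. The class and attribute names are this sketch's own, not SMarty stereotypes.

```python
# Toy model of a variation point and its resolution for one product.
from dataclasses import dataclass

@dataclass
class VariationPoint:
    name: str
    variants: list[str]
    kind: str = "alternative_OR"  # or "alternative_XOR": pick exactly one

    def resolve(self, chosen: set[str]) -> set[str]:
        picked = {v for v in self.variants if v in chosen}
        if self.kind == "alternative_XOR" and len(picked) != 1:
            raise ValueError(f"{self.name}: exactly one variant required")
        return picked

# Hypothetical configuration for an arcade-game product line.
movement = VariationPoint("movement", ["keyboard", "joystick", "touch"],
                          kind="alternative_XOR")
print(movement.resolve({"keyboard", "sound_fx"}))  # -> {'keyboard'}
```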
Abstract:
We study rf spectroscopy of a lithium gas with the goal of exploring the possibilities for photoemission spectroscopy of a strongly interacting p-wave Fermi gas. Radio-frequency spectra of quasibound p-wave molecules and of free atoms in the vicinity of the p-wave Feshbach resonance located at 159.15 G are presented. The spectra are free of detrimental final-state effects. The observed relative magnetic-field shifts of the molecular and atomic resonances confirm earlier measurements realized with direct rf association. Furthermore, evidence of molecule production by adiabatically ramping the magnetic field is observed. Finally, we propose the use of a one-dimensional optical lattice to study anisotropic superfluid gaps as the most direct proof of p-wave superfluidity.
Abstract:
The control of molecular architectures has been a key factor in the use of Langmuir-Blodgett (LB) films in biosensors, especially because biomolecules can be immobilized with preserved activity. In this paper we investigated the incorporation of tyrosinase (Tyr) into mixed Langmuir films of arachidic acid (AA) and a lutetium bisphthalocyanine (LuPc₂), which is confirmed by a large expansion in the surface pressure isotherm. These mixed AA-LuPc₂ + Tyr films could be transferred onto ITO and Pt electrodes, as indicated by FTIR and electrochemical measurements, and there was no need for crosslinking of the enzyme molecules to preserve their activity. Significantly, the activity of the immobilized Tyr was considerably higher than in previous work in the literature, which allowed Tyr-containing LB films to be used as highly sensitive voltammetric sensors to detect pyrogallol. Linear responses were found up to 400 µM, with a detection limit of 4.87 × 10⁻² µM (n = 4) and a sensitivity of 1.54 µA µM⁻¹ cm⁻². In addition, the Hill coefficient (h = 1.27) indicates cooperation with LuPc₂, which also acts as a catalyst. The enhanced performance of the LB-based biosensor therefore resulted from the preserved activity of Tyr combined with the catalytic activity of LuPc₂, in a strategy that can be extended to other enzymes and analytes by varying the LB film architecture.
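The reported Hill coefficient comes from fitting a cooperative binding model to the calibration curve; the sketch below shows such a fit with SciPy's curve_fit on made-up calibration points (the paper's raw data are not reproduced here).

```python
# Hedged sketch: fit the Hill equation to a pyrogallol calibration curve.
import numpy as np
from scipy.optimize import curve_fit

def hill(s, i_max, k, h):
    """Current response vs substrate concentration (Hill equation)."""
    return i_max * s**h / (k**h + s**h)

conc = np.array([5, 10, 25, 50, 100, 200, 400], dtype=float)  # µM (example)
current = np.array([0.9, 2.0, 5.6, 11.0, 19.5, 28.0, 34.0])   # µA cm⁻²

(i_max, k, h), _ = curve_fit(hill, conc, current, p0=(40.0, 100.0, 1.0))
print(f"i_max = {i_max:.1f} µA cm⁻², K = {k:.1f} µM, h = {h:.2f}")
```

A fitted h above 1, as in the reported h = 1.27, indicates positive cooperativity.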
Abstract:
Thousands of Free and Open Source Software Projects (FSP) were, and continually are, created on the Internet. This scenario increases the number of opportunities to collaborate to the same extent that it promotes competition for users and contributors, who can take projects to levels unachievable by founders alone. Thus, given that the main goal of FSP founders is to improve their projects by means of collaboration, it becomes important to understand and manage the project's capacity to attract users and contributors. To support researchers and founders in this challenge, this paper introduces the concept of attractiveness and develops a theoretical-managerial toolkit about the causes, indicators, and consequences of attractiveness, enabling its strategic management.
Abstract:
The information architecture project is one of the initial stages of a website project; thus, detecting and correcting errors at this stage is easier and less time-consuming than in the following stages. To minimize errors in information architecture projects, however, a methodology is necessary to organize the professional's work and guarantee the quality of the final product. We analyzed the profile of professionals who work with information architecture in Brazil (quantitative research by means of an online questionnaire), as well as the difficulties, techniques, and methodologies found in their projects (qualitative research by means of in-depth interviews supported by the Sense-Making approach). We conclude that information architecture project methodologies need to further incorporate User-Centered Design approaches and ways to evaluate their results.
Abstract:
Objective: To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting: The study was developed in Brazil. Method: Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a database of health sciences literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard for the identification of drug-drug interactions. Main outcome measures: Sensitivity, specificity, positive and negative predictive values. Results: The programs studied were Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion: The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams prevent and recognize drug-drug interactions, optimizing the safety and quality of care delivered in intensive care units.
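The four accuracy metrics follow directly from the 2×2 table produced by screening the 100 important and 100 unimportant interactions against the gold standard; the sketch below makes the arithmetic explicit with invented counts.

```python
# Minimal sketch of the accuracy metrics used above; the counts are
# made-up examples, not the study's results.
def accuracy_metrics(tp: int, fn: int, tn: int, fp: int) -> dict[str, float]:
    return {
        "sensitivity": tp / (tp + fn),  # important interactions flagged
        "specificity": tn / (tn + fp),  # unimportant ones left unflagged
        "ppv": tp / (tp + fp),          # flagged ones that are important
        "npv": tn / (tn + fn),          # unflagged ones that are unimportant
    }

# Example: 88/100 important interactions flagged, 11/100 unimportant
# interactions wrongly flagged.
print(accuracy_metrics(tp=88, fn=12, tn=89, fp=11))
```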
Abstract:
This paper presents a proposal for a reference model for software development aimed at small companies. Despite the importance of small software companies in Latin America, the lack of standards of their own, able to meet their specific needs, has created serious difficulties both in improving their processes and in achieving quality certification. As a contribution to a better understanding of the subject, we propose a reference model and, as a means to validate the proposal, present a report on its application in a small Brazilian company committed to certification under the MPS.BR quality model.