27 results for Reference Architecture
Abstract:
No literature data above atmospheric pressure could be found for the viscosity of TOTM. As a consequence, the present viscosity results could only be compared after extrapolation of the vibrating-wire data to 0.1 MPa. Independent viscosity measurements were performed at atmospheric pressure, using an Ubbelohde capillary, in order to compare with the vibrating-wire results extrapolated by means of the above-mentioned correlation. The two data sets agree within +/- 1%, which is commensurate with the mutual uncertainty of the experimental methods. Comparisons of the literature data obtained at atmospheric pressure with the present extrapolated vibrating-wire viscosity measurements have shown agreement within +/- 2% for temperatures up to 339 K and within +/- 3.3% for temperatures up to 368 K. (C) 2014 Elsevier B.V. All rights reserved.
Abstract:
In Part I of the present work we describe the viscosity measurements performed on tris(2-ethylhexyl) trimellitate, or 1,2,4-benzenetricarboxylic acid, tris(2-ethylhexyl) ester (TOTM), up to 65 MPa and at six temperatures from (303 to 373) K, using a new vibrating-wire instrument. The main aim is to support the proposal of that liquid as a potential reference fluid for high viscosity at high pressure and high temperature. The present Part II reports the density measurements of TOTM necessary not only to compute the viscosity data presented in Part I, but also as complementary data for the mentioned proposal. The density measurements were obtained using a vibrating U-tube densimeter, model DMA HP, with a model DMA 5000 as a reading unit, both instruments from Anton Paar GmbH. The measurements were performed along five isotherms from (293 to 373) K and at eleven different pressures up to 68 MPa. As far as the authors are aware, the viscosity and density results are the first above atmospheric pressure to be published for TOTM. Due to TOTM's high viscosity, its density data were corrected for the viscosity effect on the U-tube density measurements. This effect was estimated using two Newtonian viscosity standard liquids, 20 AW and 200 GW. The density data were correlated with temperature and pressure using a modified Tait equation, with deviations within +/- 0.25%. The expanded uncertainty of the present density results is estimated as +/- 0.2% at a 95% confidence level. Furthermore, the isothermal compressibility, kappa(T), and the isobaric thermal expansivity, alpha(p), were obtained by differentiation of the modified Tait equation used to correlate the density data. The corresponding uncertainties, at a 95% confidence level, are estimated to be less than +/- 1.5% and +/- 1.2%, respectively.
No isobaric thermal expansivity or isothermal compressibility data for TOTM were found in the literature. (C) 2014 Elsevier B.V. All rights reserved.
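The modified Tait correlation mentioned above is commonly written in the following standard form; this is a sketch of the usual parameterization, not the paper's fitted coefficients, and the derived properties follow directly from it:

```latex
% Modified Tait equation for density (rho_0, B, C are fitted; p_0 = 0.1 MPa)
\rho(T,p) = \frac{\rho_0(T)}{1 - C \ln\!\left(\dfrac{B(T)+p}{B(T)+p_0}\right)}

% Isothermal compressibility and isobaric thermal expansivity by differentiation
\kappa_T = \frac{1}{\rho}\left(\frac{\partial \rho}{\partial p}\right)_T ,
\qquad
\alpha_p = -\frac{1}{\rho}\left(\frac{\partial \rho}{\partial T}\right)_p
```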
Abstract:
A non-coherent vector delay/frequency-locked loop architecture for GNSS receivers is proposed. Two dynamics models are considered: PV (position and velocity) and PVA (position, velocity, and acceleration). In contrast with other vector architectures, the proposed approach does not require the estimation of signal amplitudes. Only coarse estimates of the carrier-to-noise ratios are necessary.
Abstract:
In this article, physical layer awareness in access, core, and metro networks is addressed, and a Physical Layer Aware Network Architecture Framework for the Future Internet is presented and discussed, as proposed within the framework of the European ICT Project 4WARD. Current limitations and shortcomings of the Internet architecture are driving research trends at a global scale toward a novel, secure, and flexible architecture. This Future Internet architecture must allow for the co-existence and cooperation of multiple networks on common platforms, through the virtualization of network resources. Possible solutions embrace a full range of technologies, from fiber backbones to wireless access networks. The virtualization of physical networking resources will enhance the possibility of handling different profiles, while providing the impression of mutual isolation. This abstraction strategy implies the use of well-elaborated mechanisms to deal with channel impairments and requirements in both wireless (access) and optical (core) environments.
Abstract:
Human exposure to Bisphenol A (BPA) results mainly from ingestion of food and beverages. Information regarding BPA effects on colon cancer, one of the major causes of death in developed countries, is still scarce. Likewise, little is known about BPA drug interactions, although its potential role in doxorubicin (DOX) chemoresistance has been suggested. This study aims to assess potential interactions between BPA and DOX on HT29 colon cancer cells. HT29 cell response was evaluated after exposure to BPA, DOX, or co-exposure to both chemicals. Transcriptional analysis of several cancer-associated genes (c-fos, AURKA, p21, bcl-xl and CLU) shows that BPA exposure induces slight up-regulation exclusively of bcl-xl without affecting cell viability. On the other hand, a sub-therapeutic DOX concentration (40 nM) results in highly altered c-fos, bcl-xl, and CLU transcript levels, and this is not affected by co-exposure with BPA. Conversely, DOX at a therapeutic concentration (4 μM) results in distinct and very severe transcriptional alterations of c-fos, AURKA, p21 and CLU that are counteracted by co-exposure with BPA, resulting in transcript levels similar to those of the control. Co-exposure with BPA slightly decreases apoptosis relative to 4 μM DOX alone, without affecting DOX-induced loss of cell viability. These results suggest that BPA exposure can influence chemotherapy outcomes and therefore emphasize the need for a better understanding of BPA interactions with chemotherapeutic agents in the context of risk assessment.
Abstract:
Brain dopamine transporter imaging by Single Photon Emission Computed Tomography (SPECT) with 123I-FP-CIT (DaTScanTM) has become an important tool in the diagnosis and evaluation of Parkinson syndromes. This diagnostic method allows the visualization of a portion of the striatum - where the healthy pattern resembles two symmetric commas - allowing the evaluation of the presynaptic dopamine system, in which dopamine transporters reabsorb dopamine released into the synaptic cleft back into the nigrostriatal nerve terminals, where it is stored or degraded. In daily practice, assessment of DaTScanTM commonly relies on visual inspection alone for diagnosis. However, this process is complex and subjective, as it depends on the observer's experience and is associated with high intra- and inter-observer variability. Studies have shown that semiquantification can improve the diagnosis of Parkinson syndromes. For semiquantification, image segmentation methods using regions of interest (ROIs) are necessary. ROIs are drawn over specific (striatum) and nonspecific (background) uptake areas, and specific binding ratios are then calculated. Low adherence to semiquantification in the diagnosis of Parkinson syndromes is related not only to the time it requires, but also to the need for a database of reference values adapted to the population concerned and to each department's examination protocol. Studies have concluded that this process increases the reproducibility of semiquantification. The aim of this investigation was to create and validate a database of healthy controls for dopamine transporters with DaTScanTM, named DBRV. The created database has been adapted to the Nuclear Medicine Department's protocol and to the population of Infanta Cristina's Hospital, located in Badajoz, Spain.
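As a sketch of the semiquantification step described above: the specific binding ratio is conventionally computed from the mean counts in the striatal (specific) and background (nonspecific) ROIs. The function name and count values below are illustrative assumptions, not data from this study:

```python
def specific_binding_ratio(striatal_counts: float, background_counts: float) -> float:
    """Specific binding ratio (SBR) from mean ROI counts per voxel.

    SBR = (specific - nonspecific) / nonspecific, where the striatal ROI
    captures specific uptake and the background ROI nonspecific uptake.
    """
    return (striatal_counts - background_counts) / background_counts

# Hypothetical mean counts/voxel: 12.0 in the striatal ROI, 4.0 in background
print(specific_binding_ratio(12.0, 4.0))  # → 2.0
```

A ratio well below the reference range of healthy controls would then flag a degraded presynaptic dopamine system.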
Abstract:
DaTScanTM and its semiquantification (SQ) can provide advantages in the diagnosis of Parkinsonian Syndromes (PS). To improve SQ, the creation of an adapted database (DB) with reference values for each Nuclear Medicine Department is recommended. Prior to this work, a database (DBRV) adapted to the Nuclear Medicine Department's protocol and population of Infanta Cristina's Hospital, located in Badajoz, was created for patients between the ages of 60 and 75, and reference values for the SQ were calculated. Aim: To evaluate the discriminative capacity of the department-adapted DB and its reference values of healthy controls for DaTScanTM.
Abstract:
Semiquantification (SQ) in DaTScan® studies is broadly used in daily clinical practice; however, there are doubts about its discriminative capability and its concordance with the diagnostic classification performed by the physician. Aim: To evaluate the discriminative capability of an adapted database of reference values of healthy controls for the dopamine transporters (DAT) with 123I-FP-CIT, named DBRV, adapted to the Nuclear Medicine Department's protocol and population of Infanta Cristina's Hospital, and its concordance with the physician's classification.
Abstract:
Sparse matrix-vector multiplication (SMVM) is a fundamental operation in many scientific and engineering applications. In many cases sparse matrices have thousands of rows and columns where most of the entries are zero, with the non-zero data spread over the matrix. This lack of data locality reduces the effectiveness of the data cache in general-purpose processors, considerably reducing their performance efficiency compared with what is achieved with dense matrix multiplication. In this paper, we propose a parallel processing solution for SMVM in a many-core architecture. The architecture is tested with known benchmarks using a ZYNQ-7020 FPGA. The architecture is scalable in the number of core elements and limited only by the available memory bandwidth. It achieves performance efficiencies of up to almost 70% and better performance than previous FPGA designs.
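The baseline computation that such many-core designs parallelize can be sketched as a matrix-vector product over the compressed sparse row (CSR) format, which stores only the non-zero values plus their column indices and per-row offsets. This is a generic illustration of SMVM, not the paper's hardware implementation:

```python
def csr_spmv(values, col_idx, row_ptr, x):
    """Compute y = A @ x for a sparse matrix A stored in CSR format.

    values  : non-zero entries, row by row
    col_idx : column index of each non-zero entry
    row_ptr : row_ptr[i]..row_ptr[i+1] delimits row i's entries
    """
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        acc = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            # Irregular, data-dependent access to x: the cache-unfriendly
            # pattern the abstract refers to
            acc += values[k] * x[col_idx[k]]
        y[i] = acc
    return y

# A = [[2, 0, 1],
#      [0, 3, 0]]
values  = [2.0, 1.0, 3.0]
col_idx = [0, 2, 1]
row_ptr = [0, 2, 3]
print(csr_spmv(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # → [3.0, 3.0]
```

The outer loop over rows is the natural unit of work to distribute across cores, which is why memory bandwidth, rather than compute, typically becomes the limit.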
Abstract:
The complexity associated with the fast growth of B2B and the lack of a (complete) suite of open standards make it difficult to maintain the underlying collaborative processes. In line with this challenge, this paper aims to contribute to an open architecture for a logistics and transport process management system. A model of an open integrated system is defined, covering the open computational responsibilities of the embedded (on-board) systems, together with a reference implementation (prototype) of a host system to validate the proposed open interfaces. The embedded subsystem can natively be prepared to cooperate with other on-board units and with IT systems in an infrastructure commonly referred to as a central information system or back-office. For interaction with a central system, the proposal is to adopt an open framework for cooperation in which the embedded unit, or a unit placed elsewhere (land/sea), interacts in response to a set of implemented capabilities.
Abstract:
This paper proposes an FPGA-based architecture for onboard hyperspectral unmixing. The method, based on Vertex Component Analysis (VCA), has several advantages: it is unsupervised, fully automatic, and works without a dimensionality reduction (DR) pre-processing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 SoC FPGA, based on the Artix-7 FPGA programmable logic, and tested using real hyperspectral datasets. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems.
Abstract:
Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral images comprise hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GBs per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks for hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive, and even highly power-consuming, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point processing performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for onboard data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced accesses to memory.
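For context, the linear mixing model that underlies endmember-extraction methods such as SISAL expresses each observed pixel spectrum as a combination of endmember signatures weighted by their fractional abundances. The NumPy sketch below uses a made-up 4-band, 2-endmember scene to illustrate the forward model only; it is not the SISAL algorithm or data from the paper:

```python
import numpy as np

def linear_mixture(E: np.ndarray, a: np.ndarray) -> np.ndarray:
    """Observed pixel spectrum under the linear mixing model: y = E @ a.

    E : (bands x endmembers) matrix of pure-material (endmember) signatures
    a : fractional abundances, nonnegative and summing to one
    """
    return E @ a

# Hypothetical 4-band scene with two endmember signatures
E = np.array([[0.8, 0.1],
              [0.6, 0.2],
              [0.4, 0.5],
              [0.2, 0.9]])
a = np.array([0.25, 0.75])  # abundances satisfy the sum-to-one constraint

y = linear_mixture(E, a)    # mixed-pixel spectrum: [0.275, 0.3, 0.475, 0.725]
```

Unmixing inverts this model: given many pixels y, recover E (the endmembers) and the abundances a; SISAL does so even when no pixel in the scene is a pure instance of any single material.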