961 results for General-purpose computing on graphics processing units (GPGPU)
Abstract:
The increasing complexity of applications has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these applications' constraints. General-purpose processors are inherently flexible, since they perform several tasks; however, they cannot reach high performance when compared to application-specific devices. Conversely, since application-specific devices perform only a few tasks, they achieve high performance, although they are less flexible. Reconfigurable architectures emerged as an alternative to the traditional approaches and have become an area of rising interest over the last decades. The purpose of this new paradigm is to modify the device's behavior according to the application. Thus, it is possible to balance flexibility and performance and to meet the applications' constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of data-flow-intensive applications to accelerate their execution on the reconfigurable logic. Instruction-level parallelism is extracted at compile time; thus, this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also presents a methodology based on hardware reuse of datapaths, named RoSE. RoSE aims to view the reconfigurable units through reusability levels, which provides area savings and datapath simplification. The architecture was implemented in a hardware description language (VHDL) and validated through simulation and prototyping. For performance characterization, a set of benchmarks was used, demonstrating a speedup of 11x on the execution of some applications.
Abstract:
This work presents the design of a fuzzy controller with a simplified architecture. This architecture aims to minimize the processing time spent in the several stages of fuzzy modeling of systems and processes. The basic fuzzification and defuzzification procedures are simplified as much as possible, while the inference procedures are computed in a dedicated way. Therefore, the simplified architecture allows fast and easy configuration of the fuzzy controller. All rules that define the control actions are determined by the inference procedures, and defuzzification is performed automatically using a simplified algorithm. The fuzzy controller operation is standardized and the control actions are calculated in advance. For general-purpose application and results, industrial fluid flow control systems are considered.
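To illustrate the idea of pre-calculated control actions combined with a simplified defuzzification step, here is a minimal Python sketch of such a controller: triangular fuzzification, one pre-computed crisp action per rule, and weighted-average defuzzification. All membership parameters, rule actions and names are hypothetical; this is a sketch in the spirit of the abstract, not the architecture described in the work.

```python
# Simplified fuzzy controller sketch: pre-computed rule actions, weighted-average defuzzification.

def triangular(x, left, centre, right):
    """Degree of membership of x in a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= centre:
        return (x - left) / (centre - left)
    return (right - x) / (right - centre)

# Fuzzy sets over the error signal (hypothetical parameters): negative, zero, positive
ERROR_SETS = {"neg": (-2.0, -1.0, 0.0), "zero": (-1.0, 0.0, 1.0), "pos": (0.0, 1.0, 2.0)}
# Pre-calculated crisp control action for each rule (one rule per fuzzy set here)
RULE_ACTIONS = {"neg": -1.0, "zero": 0.0, "pos": 1.0}

def fuzzy_control(error):
    """Weighted average of pre-computed rule actions (simplified defuzzification)."""
    weights = {name: triangular(error, *params) for name, params in ERROR_SETS.items()}
    total = sum(weights.values())
    if total == 0.0:
        return 0.0
    return sum(weights[name] * RULE_ACTIONS[name] for name in weights) / total

print(fuzzy_control(0.4))   # small positive error -> small positive corrective action
```

Because the rule consequents are crisp values fixed in advance, the run-time cost per control step reduces to a handful of membership evaluations and a weighted average, which is the kind of simplification the abstract describes.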
Abstract:
The present work introduces a new strategy for induction machine speed adjustment using an adaptive PID (Proportional Integral Derivative) digital controller with gain scheduling based on artificial neural networks. This digital controller uses an auxiliary variable to determine the ideal operating conditions of the induction machine and to establish the closed-loop gain of the system. The value of the auxiliary variable can be estimated from the information stored in a general-purpose artificial neural network based on the CMAC (Cerebellar Model Articulation Controller).
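As a rough illustration of PID gain scheduling driven by a CMAC-style lookup, the Python sketch below stores (Kp, Ki, Kd) triples in a small table of overlapping tilings indexed by the auxiliary variable and retrieves them at each control step. All class names, table sizes and gain values are assumptions for illustration; this is not the controller proposed in the work.

```python
import numpy as np

class CmacGainTable:
    """Tiny CMAC-style lookup: several overlapping tilings; gains = average of the hit cells."""
    def __init__(self, n_tilings=4, n_cells=16, span=(0.0, 1.0)):
        self.n_tilings, self.n_cells, self.span = n_tilings, n_cells, span
        # one (Kp, Ki, Kd) triple per cell per tiling (pre-set here; normally trained)
        self.weights = np.tile(np.array([2.0, 0.5, 0.1]), (n_tilings, n_cells, 1))

    def gains(self, aux):
        """Map the auxiliary variable to one cell per tiling and average their gain triples."""
        lo, hi = self.span
        x = (np.clip(aux, lo, hi) - lo) / (hi - lo)
        cells = [int(min(self.n_cells - 1, x * self.n_cells + t / self.n_tilings))
                 for t in range(self.n_tilings)]
        return np.mean([self.weights[t, c] for t, c in enumerate(cells)], axis=0)

class ScheduledPid:
    """Discrete PID whose gains are looked up from the CMAC-style table at every step."""
    def __init__(self, table, dt=0.01):
        self.table, self.dt, self.integral, self.prev_error = table, dt, 0.0, 0.0

    def step(self, setpoint, measurement, aux):
        kp, ki, kd = self.table.gains(aux)
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return kp * error + ki * self.integral + kd * derivative

pid = ScheduledPid(CmacGainTable())
print(pid.step(setpoint=100.0, measurement=95.0, aux=0.3))  # control output for one sample
```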
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
It has been demonstrated that, on abrupt withdrawal, patients with chronic exposure can experience a number of symptoms indicative of a dependent state. In clinical patients, the earliest to arise and most persistent sign of withdrawal from chronic benzodiazepine (Bzp) treatment is anxiety. In laboratory animals, anxiety-like effects following abrupt interruption of chronic Bzp treatment can also be reproduced. In fact, signs ranging from irritability to extreme fear behaviours and seizures have already been described. As anxiety remains one of the most important symptoms of Bzp withdrawal, in this study we evaluated the anxiety levels of rats withdrawn from diazepam. We also studied the effects on motor performance and the preattentive sensory gating process of rats under chronic diazepam treatment and upon 48-h withdrawal in three animal models of anxiety: the elevated plus-maze (EPM), ultrasonic vocalizations (USV) and startle + prepulse inhibition tests. The data obtained showed anxiolytic- and anxiogenic-like profiles of chronic intake of and withdrawal from the diazepam regimen in the EPM test, 22-kHz USV and startle reflex. Chronic diazepam treatment and its withdrawal were ineffective in promoting any alteration in prepulse inhibition (PPI). However, an increase in PPI was observed in both sucrose- and diazepam-pretreated rats on 48-h withdrawal, suggesting a procedural rather than a specific effect of withdrawal on sensory gating processes. It is also possible that the prepulse functions as a conditioned stimulus signaling the delivery of an aversive event, such as the auditory startle-eliciting stimulus. All these findings are indicative of a sensitization of the neural substrates of aversion in diazepam-withdrawn animals without concomitant changes in the processing of sensory information.
Abstract:
Adequate polymerization plays an important role in the longevity of composite resin restorations. Objectives: The aim of this study was to evaluate the effect of light-curing units, curing modes and storage media on the sorption, solubility and biaxial flexural strength (BFS) of a composite resin. Material and Methods: Two hundred and forty specimens were made of one composite resin (Esthet-X) in a stainless steel mold (2 mm x 8 mm diameter) and divided into 24 groups (n=10) established according to the 4 study factors: light-curing units: quartz tungsten halogen (QTH) lamp and light-emitting diodes (LED); energy densities: 16 J/cm² and 20 J/cm²; curing modes: conventional (CM) and pulse-delay (PD); and permeants: deionized water and 75% ethanol for 28 days. Sorption and solubility tests were performed according to ISO 4049:2000 specifications. All specimens were then tested for BFS according to the ASTM F394-78 specification. Data were analyzed by three-way ANOVA followed by Tukey, Kruskal-Wallis and Mann-Whitney tests (alpha=0.05). Results: In general, no significant differences were found in the sorption, solubility or BFS means for the light-curing units and curing modes (p>0.05). Only the LED unit using 16 J/cm² and PD with 10 s produced higher sorption and solubility values than QTH. Conversely, using CM (16 J/cm²), LED produced lower BFS values than QTH (p<0.05). The 75% ethanol permeant produced higher sorption and solubility values and lower BFS values than water (p<0.05). Conclusion: The ethanol storage medium caused more damage to the composite resin than water. In general, the LED and QTH curing units using 16 and 20 J/cm² with the CM and PD curing modes had no influence on the sorption, solubility or BFS of the tested resin.
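For reference, ISO 4049 water sorption and solubility are conventionally expressed per unit specimen volume from three masses: the conditioned mass before immersion (m1), the mass after immersion (m2), and the reconditioned mass after drying (m3). The formulas below state that general convention only and are not a restatement of this study's measurements.

```latex
W_{\mathrm{sp}} = \frac{m_2 - m_3}{V}, \qquad W_{\mathrm{sl}} = \frac{m_1 - m_3}{V}
```

Here V is the specimen volume in mm³, so both quantities are reported in micrograms per cubic millimetre.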
Abstract:
Ultrasound imaging is widely used in medical diagnostics as it is the fastest, least invasive, and least expensive imaging modality. However, ultrasound images are intrinsically difficult to interpret. In this scenario, Computer Aided Detection (CAD) systems can be used to support physicians during diagnosis by providing a second opinion. This thesis discusses efficient ultrasound processing techniques for computer-aided medical diagnostics, focusing on two major topics: (i) Ultrasound Tissue Characterization (UTC), aimed at characterizing and differentiating between healthy and diseased tissue; (ii) Ultrasound Image Segmentation (UIS), aimed at detecting the boundaries of anatomical structures to automatically measure organ dimensions and compute clinically relevant functional indices. The research on UTC produced a CAD tool for prostate cancer detection to improve the biopsy protocol. In particular, this thesis contributes: (i) the development of a robust classification system; (ii) the exploitation of parallel computing on GPU for real-time performance; (iii) the introduction of both an innovative Semi-Supervised Learning algorithm and a novel supervised/semi-supervised learning scheme for CAD system training that improve system performance while reducing data collection effort and avoiding wasting collected data. The tool provides physicians with a risk map highlighting suspect tissue areas, allowing them to perform a lesion-directed biopsy. Clinical validation demonstrated the system's validity as a diagnostic support tool and its effectiveness at reducing the number of biopsy cores required for an accurate diagnosis. For UIS, the research developed a heart disease diagnostic tool based on Real-Time 3D Echocardiography. The thesis contributions to this application are: (i) the development of an automated GPU-based level-set segmentation framework for 3D images; (ii) the application of this framework to myocardium segmentation. Experimental results showed the high efficiency and flexibility of the proposed framework. Its effectiveness as a tool for quantitative analysis of 3D cardiac morphology and function was demonstrated through clinical validation.
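As context for the level-set framework mentioned above, the sketch below shows one explicit update step of a Chan-Vese-style level set on a 3D volume using NumPy. It is a minimal CPU illustration of the kind of per-voxel computation such a framework parallelizes on the GPU, not the thesis's implementation; the function name, parameters and synthetic data are assumptions.

```python
import numpy as np

def chan_vese_step(phi, img, dt=0.1, mu=0.2, lam1=1.0, lam2=1.0):
    """One explicit Chan-Vese-style level-set update on a 3D volume (illustrative only)."""
    inside = phi > 0
    c1 = img[inside].mean() if inside.any() else 0.0      # mean intensity inside the contour
    c2 = img[~inside].mean() if (~inside).any() else 0.0  # mean intensity outside the contour
    grads = np.gradient(phi)
    norm = np.sqrt(sum(g**2 for g in grads)) + 1e-8
    # curvature = divergence of the normalized gradient of phi
    curvature = sum(np.gradient(g / norm)[i] for i, g in enumerate(grads))
    force = -lam1 * (img - c1)**2 + lam2 * (img - c2)**2
    dirac = 1.0 / (np.pi * (1.0 + phi**2))                # smoothed delta function
    return phi + dt * dirac * (mu * curvature + force)

# Tiny synthetic usage: evolve an initial sphere towards a bright cubic blob
img = np.zeros((32, 32, 32)); img[10:22, 10:22, 10:22] = 1.0
z, y, x = np.mgrid[:32, :32, :32]
phi = 10.0 - np.sqrt((z - 16)**2 + (y - 16)**2 + (x - 16)**2)
for _ in range(50):
    phi = chan_vese_step(phi, img)
```

Each voxel update depends only on a small neighbourhood, which is why this kind of iteration maps well to one GPU thread per voxel.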
Abstract:
The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in Molecular Dynamics simulations. WHAM works in post-processing in cooperation with another algorithm called Umbrella Sampling. Umbrella Sampling adds a bias to the potential energy of the system in order to force the system to sample a specific region of configurational space. N independent simulations are performed in order to sample the whole region of interest. Subsequently, the WHAM algorithm is used to estimate the original system energy starting from the N atomic trajectories. The parallelization of WHAM has been performed with CUDA, a language that allows programs to run on the GPUs of NVIDIA graphics cards, which have a parallel architecture. The parallel implementation can significantly speed up WHAM execution compared to previous serial CPU implementations, whereas the serial WHAM CPU code becomes time-critical for very large numbers of iterations. The algorithm has been written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfactory, showing a performance increase when the model was executed on graphics cards with higher compute capability. Nonetheless, the GPUs used to test the algorithm are quite old and not designed for scientific computing. It is likely that a further performance increase would be obtained if the algorithm were executed on GPU clusters with high computational efficiency. The thesis is organized as follows: I will first describe the mathematical formulation of Umbrella Sampling and the WHAM algorithm, with their applications to the study of ionic channels and to Molecular Docking (Chapter 1); then, I will present the CUDA architectures used to implement the model (Chapter 2); finally, the results obtained on model systems will be presented (Chapter 3).
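To make the self-consistent WHAM equations concrete, here is a minimal serial sketch in Python/NumPy of the iteration that such a parallelization targets: the unbiased bin probabilities and the per-window free-energy shifts are updated alternately until convergence. It is an illustrative reimplementation under assumed array shapes, not the thesis's C++/CUDA code.

```python
import numpy as np

def wham(counts, bias, n_samples, beta, tol=1e-7, max_iter=10000):
    """
    Serial WHAM iteration (illustrative sketch).
    counts    : (n_windows, n_bins) histogram of the reaction coordinate per umbrella window
    bias      : (n_windows, n_bins) biasing potential U_i evaluated at the bin centres
    n_samples : (n_windows,) number of samples collected in each window
    beta      : 1 / (k_B T)
    Returns the unbiased free-energy profile per bin and the window shifts f_i.
    """
    f = np.zeros(counts.shape[0])          # free-energy shift of each window
    total = counts.sum(axis=0)             # total counts per bin over all windows
    for _ in range(max_iter):
        # unbiased probability of each bin (first WHAM equation)
        denom = (n_samples[:, None] * np.exp(beta * (f[:, None] - bias))).sum(axis=0)
        p = total / denom
        p /= p.sum()
        # update window free energies (second WHAM equation)
        f_new = -np.log((p[None, :] * np.exp(-beta * bias)).sum(axis=1)) / beta
        if np.max(np.abs(f_new - f)) < tol:
            break
        f = f_new
    free_energy = -np.log(p + 1e-300) / beta
    return free_energy, f
```

The per-bin and per-window sums inside the loop are the parts that map naturally onto GPU threads, since each element of the sums can be computed independently before the reduction.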
Abstract:
Stata is a general-purpose software package that has become popular in various disciplines such as epidemiology, economics, and the social sciences. Users like Stata for its scientific approach, its robustness and reliability, and the ease with which its functionality can be extended by user-written programs. In this talk I will first give a brief overview of the functionality of Stata and then discuss two specific features: survey estimation and predictive margins/marginal effects. Most surveys are based on complex samples that contain multiple sampling stages, are stratified or clustered, and feature unequal selection probabilities. Standard estimators can produce misleading results in such samples unless the peculiarities of the sampling plan are taken into account. Stata offers survey statistics for complex samples for a wide variety of estimators and supports several variance estimation procedures such as linearization, jackknife, and balanced repeated replication (see Kreuter and Valliant, 2007, Stata Journal 7: 1-21). In the talk I will illustrate these features using applied examples, and I will also show how user-written commands can be adapted to support complex samples. The models we fit to our data can also be complex, making them difficult to interpret, especially in the case of nonlinear or non-additive models (Mood, 2010, European Sociological Review 26: 67-82). Stata provides a number of highly useful commands for making the results of such models accessible by computing and displaying predictive margins and marginal effects. In my talk I will discuss these commands and provide various examples demonstrating their use.
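To give a concrete sense of what a marginal-effects command reports, the Python sketch below fits a logistic regression to simulated data and computes the average marginal effect of one predictor by numerically differentiating the predicted probabilities; this mirrors the quantity Stata's margins, dydx() displays. The simulated data, variable names and use of statsmodels are assumptions for illustration; this is not Stata code from the talk.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data (hypothetical): binary outcome y, continuous predictor x, covariate z
rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
z = rng.normal(size=n)
y = (rng.random(n) < 1 / (1 + np.exp(-(0.5 + 1.2 * x - 0.8 * z)))).astype(int)

X = sm.add_constant(np.column_stack([x, z]))
fit = sm.Logit(y, X).fit(disp=False)

# Average marginal effect of x: numerical derivative of the predicted probability
# with respect to x, averaged over the observed sample
eps = 1e-5
X_hi = X.copy(); X_hi[:, 1] += eps
X_lo = X.copy(); X_lo[:, 1] -= eps
ame_x = np.mean((fit.predict(X_hi) - fit.predict(X_lo)) / (2 * eps))
print(f"Average marginal effect of x: {ame_x:.3f}")
```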
Abstract:
This paper addresses the novel notion of offering a radio access network as a service. Its components may be instantiated on general-purpose platforms with pooled resources (both radio and hardware) dimensioned on demand, elastically and following the pay-per-use principle. A novel architecture that supports this concept is proposed. The architecture's success lies in its modularity, well-defined functional elements and clean separation between operational and control functions. By moving much of the processing traditionally located in hardware to computation in the cloud, it allows hardware utilization to be optimised and deployment and operation costs to be reduced. It enables operators to upgrade their network as well as to quickly deploy and adapt resources to demand. Also, new players may easily enter the market, permitting a virtual network operator to provide connectivity to its users.
Abstract:
Cloudification of the Centralized Radio Access Network (C-RAN), in which signal processing runs on general-purpose processors inside virtual machines, has lately received significant attention. Due to the short deadlines of the LTE Frequency Division Duplex access method, processing-time fluctuations introduced by virtualization have a deep impact on C-RAN performance. This paper evaluates bottlenecks of the OpenAirInterface (OAI, an open-source software-based implementation of LTE) cloud performance, provides feasibility studies on C-RAN execution, and introduces a cloud architecture that significantly reduces the encountered execution problems. In typical cloud environments, the OAI processing-time deadlines cannot be guaranteed. Our proposed cloud architecture shows good characteristics for OAI cloud execution. As an example, in our setup more than 99.5% of processed LTE subframes meet reasonable processing deadlines, close to the performance of a dedicated machine.
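To make the evaluation criterion above concrete, here is a small Python sketch of how one could estimate the fraction of LTE subframes that meet a processing deadline from a trace of per-subframe processing times. It is not OAI code; the 3 ms budget and the simulated timing distributions are assumptions for illustration only.

```python
import numpy as np

DEADLINE_MS = 3.0   # assumed subframe processing budget; adjust to the actual target

def deadline_compliance(processing_times_ms):
    """Return the fraction of subframes whose processing time stays within the deadline."""
    t = np.asarray(processing_times_ms, dtype=float)
    return np.mean(t <= DEADLINE_MS)

# Hypothetical trace: baseline processing times plus occasional virtualization-induced delays
rng = np.random.default_rng(1)
base = rng.normal(loc=2.0, scale=0.2, size=10_000)     # dedicated-machine-like times (ms)
jitter = rng.exponential(scale=0.1, size=10_000)       # cloud-induced jitter (ms)
print(f"deadline compliance: {deadline_compliance(base + jitter):.4%}")
```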
Abstract:
The effect of a traditional Ethiopian lupin processing method on the chemical composition of lupin seed samples was studied. Two sampling districts, namely Mecha and Sekela, representing the mid- and high-altitude areas of north-western Ethiopia, respectively, were randomly selected. Different types of traditionally processed and marketed lupin seed samples (raw, roasted, and finished) were collected in six replications from each district. Raw samples are unprocessed, and roasted samples are roasted using firewood. Finished samples are those ready for human consumption as a snack. Thousand-seed weight for raw and roasted samples within a study district was similar (P > 0.05), but it was lower (P < 0.01) for finished samples compared to raw and roasted samples. The crude fibre content of the finished lupin seed sample from Mecha was lower (P < 0.01) than that of raw and roasted samples. However, the different lupin samples from Sekela had similar crude fibre content (P > 0.05). The crude protein and crude fat contents of finished samples within a study district were higher (P < 0.01) than those of raw and roasted samples, respectively. Roasting had no effect on the crude protein content of lupin seed samples. The crude ash content of raw and roasted lupin samples within a study district was higher (P < 0.01) than that of finished lupin samples of the respective study districts. The quinolizidine alkaloid content of finished lupin samples was lower than that of raw and roasted samples. There was also an interaction effect between location and lupin sample type. The traditional processing method of lupin seeds in Ethiopia contributes positively to improving the crude protein and crude fat content and to lowering the alkaloid content of the finished product. The study showed the possibility of adopting the traditional processing method to process bitter white lupin for use as a protein supplement in livestock feed in Ethiopia, but further work has to be done on the processing method and animal evaluation.
Abstract:
Populations keep growing, and thus the concept of smart and cognitive cities is becoming more important. Developed countries are aware of and working towards the needed changes in city management, but emerging countries also require the optimization of their own city management. This chapter illustrates, based on a use case, how a city in an emerging country can progress quickly using the concept of smart and cognitive cities. Nairobi, the capital of Kenya, is chosen as the test case. More than half of the population of Nairobi lives in slums with poor sanitation, and many slum inhabitants often share a single toilet, so the proper functioning and reliable maintenance of toilets are crucial. For this purpose, an approach for processing text messages based on cognitive computing (using soft computing methods) is introduced. Slum inhabitants can inform the responsible center via text message when toilets are not functioning properly. Through cognitive computer systems, the responsible center can fix the problem quickly and efficiently by sending repair workers to the area. Focusing on the slum of Kibera, an easy-to-handle approach for slum inhabitants is presented, which can make the city more efficient, sustainable and resilient (i.e., cognitive).
Abstract:
BACKGROUND AND OBJECTIVES: Multiple-breath washout (MBW) is an attractive test to assess ventilation inhomogeneity, a marker of peripheral lung disease. Standardization of MBW is hampered because little data exist on possible measurement bias. We aimed to identify potential sources of measurement bias arising from MBW software settings. METHODS: We used unprocessed data from nitrogen (N2) MBW (Exhalyzer D, Eco Medics AG) applied in 30 children aged 5-18 years: 10 with CF, 10 formerly preterm, and 10 healthy controls. This setup calculates the tracer gas N2 mainly from measured O2 and CO2 concentrations. The following software settings for MBW signal processing were changed by at least 5 units or >10% in both directions, or completely switched off: (i) environmental conditions, (ii) apparatus dead space, (iii) O2 and CO2 signal correction, and (iv) signal alignment (delay time). The primary outcome was the change in lung clearance index (LCI) compared to the LCI calculated with the recommended settings. A change in LCI exceeding 10% was considered relevant. RESULTS: Changes in both environmental and dead space settings resulted in uniform but modest LCI changes and exceeded 10% in only two measurements. Changes in signal alignment and O2 signal correction had the most relevant impact on LCI. A decrease of the O2 delay time by 40 ms (7%) led to a mean LCI increase of 12%, with a >10% LCI change in 60% of the children. An increase of the O2 delay time by 40 ms resulted in a mean LCI decrease of 9%, with LCI changing >10% in 43% of the children. CONCLUSIONS: Accurate LCI results depend crucially on signal processing settings in MBW software. In particular, correct signal delay times are possible sources of incorrect LCI measurements. Algorithms for signal processing and signal alignment should thus be optimized to avoid susceptibility of MBW measurements to this significant measurement bias.
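For context on why the processing settings matter, the sketch below computes the LCI in its conventional simplified form: the cumulative expired volume until the end-tidal N2 concentration first falls below 1/40 of its starting value, divided by the functional residual capacity. It is a minimal Python illustration under that simplified end-of-washout definition, not the Exhalyzer D algorithm; function and parameter names are assumptions.

```python
import numpy as np

def lung_clearance_index(n2_end_tidal, expired_volumes, frc):
    """
    Simplified LCI sketch (illustrative only).
    n2_end_tidal    : end-tidal N2 concentration per breath during the washout
    expired_volumes : expired volume per breath (L)
    frc             : functional residual capacity (L)
    LCI = cumulative expired volume until N2 falls below 1/40 of its start, divided by FRC.
    """
    n2 = np.asarray(n2_end_tidal, dtype=float)
    vol = np.asarray(expired_volumes, dtype=float)
    threshold = n2[0] / 40.0                   # 2.5% of the starting concentration
    below = np.nonzero(n2 <= threshold)[0]
    if below.size == 0:
        raise ValueError("washout did not reach 1/40 of the initial N2 concentration")
    end = below[0]
    cev = vol[: end + 1].sum()                 # cumulative expired volume to washout end
    return cev / frc
```

Because both the N2 trace and the flow-derived volumes feed this calculation, a shift in the O2 signal delay changes the reconstructed N2 concentrations and hence the breath at which the 1/40 threshold is reached, which is consistent with the sensitivity reported above.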
Abstract:
IBAMar (http://www.ba.ieo.es/ibamar) is a regional database that brings together all physical and biochemical data obtained by multiparametric probes (CTDs equipped with different sensors) during the cruises managed by the Balearic Center of the Spanish Institute of Oceanography (COB-IEO). It has recently been extended to include data obtained with classical hydro casts using oceanographic Niskin or Nansen bottles. The result is a database that includes a main core of hydrographic data: temperature (T), salinity (S), dissolved oxygen (DO), fluorescence and turbidity; complemented by biochemical data: dissolved inorganic nutrients (phosphate, nitrate, nitrite and silicate) and chlorophyll-a. In the IBAMar database, different technologies and methodologies were used by different teams over the four decades of data sampling at the COB-IEO. Despite this, the data have been reprocessed using the same protocols, and a standard QC has been applied to each variable. It therefore provides a regional database of homogeneous, good-quality data. Data acquisition and quality control (QC): 94% of the data come from Sbe911 and Sbe25 CTDs. S and DO were calibrated on board using water samples whenever a rosette was available (70% of the cases). All data from Seabird CTDs were reviewed and post-processed with the software provided by Sea-Bird Electronics, and data were averaged to a 1 dbar vertical resolution. The general sampling methodology and pre-processing are described at https://ibamardatabase.wordpress.com/home/. Manual QC includes visual checks of metadata, duplicate data and outliers. Automatic QC includes a range check of variables by area (north of the Balearic Islands, south of the BI, and the Alboran Sea) and depth (27 standard levels), a check for spikes, and a check for density inversions. Nutrient QC includes a preliminary control and a range check at the observed level of the data to detect outliers around objectively analyzed data fields. A quality flag is assigned as an integer number, depending on the result of the QC check.
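As an illustration of the automatic QC steps described above (range check, spike check, density-inversion check, and integer quality flags), here is a minimal Python sketch. The thresholds, the flag convention (1 good, 3 suspect, 4 bad) and the function names are assumptions for illustration; they are not the IBAMar implementation.

```python
import numpy as np

def range_check(values, valid_min, valid_max):
    """Flag values outside a region- and depth-dependent valid range as bad (4)."""
    v = np.asarray(values, dtype=float)
    flags = np.ones(v.shape, dtype=int)
    flags[(v < valid_min) | (v > valid_max)] = 4
    return flags

def spike_check(values, threshold):
    """Flag points deviating from their neighbours' mean by more than the threshold as suspect (3)."""
    v = np.asarray(values, dtype=float)
    flags = np.ones(v.shape, dtype=int)
    spike = np.abs(v[1:-1] - 0.5 * (v[:-2] + v[2:])) - 0.5 * np.abs(v[2:] - v[:-2])
    flags[1:-1][spike > threshold] = 3
    return flags

def density_inversion_check(density, tolerance=0.03):
    """Flag levels where density decreases with depth beyond the tolerance as suspect (3)."""
    d = np.asarray(density, dtype=float)
    flags = np.ones(d.shape, dtype=int)
    flags[1:][np.diff(d) < -tolerance] = 3
    return flags

# Example: combine per-test flags by keeping the worst (highest) flag per sample
temperature = [13.1, 13.0, 19.5, 12.9, 12.8]
flags = np.maximum(range_check(temperature, 5.0, 30.0), spike_check(temperature, 2.0))
print(flags)   # the third value is flagged as a spike
```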