19 results for Application specific algorithm
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
There are several variants of the widely used Fuzzy C-Means (FCM) algorithm that support clustering of data distributed across different sites. These methods have been studied under different names, such as collaborative and parallel fuzzy clustering. In this study, we augment two FCM-based clustering algorithms used to cluster distributed data by deriving constructive ways of determining their essential parameters (including the number of clusters) and by forming a set of systematically structured guidelines, such as selecting the specific algorithm depending on the nature of the data environment and the assumptions made about the number of clusters. A thorough complexity analysis, covering space, time, and communication aspects, is reported. A series of detailed numerical experiments illustrates the main ideas discussed in the study.
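For context, the sketch below shows the standard (centralized) Fuzzy C-Means iteration that the distributed and collaborative variants discussed above build on: alternating membership and prototype updates until convergence. The fuzzifier m, the number of clusters, and the synthetic data are illustrative choices, not values taken from the study.

```python
import numpy as np

def fcm(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Standard Fuzzy C-Means: alternate membership and prototype updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                                   # memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)     # cluster prototypes
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U_new = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=0))
        if np.abs(U_new - U).max() < tol:
            return V, U_new
        U = U_new
    return V, U

# Illustrative run on synthetic 2-D data with three well-separated groups
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.5, size=(50, 2)) for loc in ([0, 0], [5, 5], [0, 5])])
V, U = fcm(X, c=3)
print("prototypes:\n", V)
```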
Abstract:
Creating high-quality quad meshes from triangulated surfaces is a highly nontrivial task that requires consideration of various application-specific quality metrics. In our work, we follow the premise that automatic reconstruction techniques may not generate outputs meeting all the subjective quality expectations of the user. Instead, we put the user at the center of the process by providing a flexible, interactive approach to quadrangulation design. By combining scalar field topology and combinatorial connectivity techniques, we present a new framework, following a coarse-to-fine design philosophy, which allows for explicit control of the subjective quality criteria on the output quad mesh at interactive rates. Our quadrangulation framework uses the new notion of Reeb atlas editing to define, with a small number of interactions, a coarse quadrangulation of the model that captures the main features of the shape, with user-prescribed extraordinary vertices and alignment. Fine-grained tuning is easily achieved with the notion of connectivity texturing, which allows for the specification of additional extraordinary vertices and explicit feature alignment to capture high-frequency geometries. Experiments demonstrate the interactivity and flexibility of our approach, as well as its ability to generate quad meshes of arbitrary resolution with high-quality statistics, while meeting the user's own subjective requirements.
Abstract:
In this preliminary study, occupational safety and health (OSH) practices among flower greenhouse workers were evaluated. The study was carried out in the Alto Tiete region, located in Sao Paulo State, Brazil. Inadequate welfare facilities; poor pesticide storage, use and disposal conditions; use of highly toxic pesticides; lack of adequate data regarding pesticide use; and incorrect use and maintenance of PPE were observed in most of the visited greenhouses. These results suggest that, in greenhouses, workers may be at higher risk of pesticide exposure due to many factors that can intensify exposure, such as the lack of control of reentry intervals after pesticide application. Specific regulations are needed to ensure better OSH practices in pesticide use and to improve working conditions in greenhouses, in order to deal with the peculiarities of the greenhouse working environment. Some of the special requirements for greenhouse workers' protection are the establishment of ventilation criteria for the restricted-entry interval (REI); clear reentry restrictions; and PPE for workers other than applicators who need to enter the greenhouse before the REI expires. Other important ways to improve OSH practices among workers include the distribution of simple guidelines on the dos and don'ts of OSH practices in greenhouses and extensive training interventions to change the perception of hazards and behavior towards risk. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Field-Programmable Gate Arrays (FPGAs) are becoming increasingly important in embedded and high-performance computing systems. They allow performance levels close to the ones obtained with Application-Specific Integrated Circuits, while still keeping design and implementation flexibility. However, to efficiently program FPGAs, one needs the expertise of hardware developers in order to master hardware description languages (HDLs) such as VHDL or Verilog. Attempts to furnish a high-level compilation flow (e.g., from C programs) still have to address open issues before broader efficient results can be obtained. Bearing in mind the resources available on an FPGA, LALP (Language for Aggressive Loop Pipelining), a novel language for programming FPGA-based accelerators, has been developed together with its compilation framework, including mapping capabilities. The main ideas behind LALP are to provide a higher abstraction level than HDLs, to exploit the intrinsic parallelism of hardware resources, and to allow the programmer to control execution stages whenever the compiler techniques are unable to generate efficient implementations. These features are particularly useful for implementing loop pipelining, a well-regarded technique used to accelerate computations in several application domains. This paper describes LALP and shows how it can be used to achieve high-performance computing solutions.
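To make the loop-pipelining motivation concrete, the toy calculation below contrasts the cycle count of a loop executed one iteration at a time with a pipelined schedule in which a new iteration starts every initiation-interval cycles. The stage count and initiation interval are illustrative assumptions and do not correspond to any particular LALP construct or compiler result.

```python
def loop_cycles(iterations, stages, initiation_interval=None):
    """Cycle count for a loop whose body is a chain of pipeline stages.

    Without pipelining, each iteration occupies the whole pipeline, so the
    loop takes iterations * stages cycles.  With pipelining, a new iteration
    enters every `initiation_interval` cycles, overlapping iterations.
    """
    if initiation_interval is None:
        return iterations * stages
    return stages + (iterations - 1) * initiation_interval

n, depth = 1000, 5
print("sequential:    ", loop_cycles(n, depth))        # 5000 cycles
print("pipelined II=1:", loop_cycles(n, depth, 1))     # 1004 cycles
```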
Abstract:
Current SoC design trends are characterized by the integration of a larger number of IPs targeting a wide range of application fields. Such multi-application systems are constrained by a set of requirements. In this scenario, networks-on-chip (NoCs) are becoming more important as the on-chip communication structure. Designing an optimal NoC that satisfies the requirements of each individual application requires the specification of a large set of configuration parameters, leading to a wide solution space. It has been shown that IP mapping is one of the most critical parameters in NoC design, strongly influencing SoC performance. IP mapping has been solved for single-application systems using single- and multi-objective optimization algorithms. In this paper we propose the use of a multi-objective adaptive immune algorithm (M(2)AIA), an evolutionary approach, to solve the multi-application NoC mapping problem. Latency and power consumption were adopted as the target objective functions. To assess the efficiency of our approach, our results are compared with those of genetic and branch-and-bound multi-objective mapping algorithms. We tested 11 well-known benchmarks, including random and real applications, combining up to 8 applications in the same SoC. The experimental results show that M(2)AIA reduces power consumption and latency by, on average, 27.3% and 42.1% compared to the branch-and-bound approach and by 29.3% and 36.1% compared to the genetic approach.
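The flavor of the mapping problem can be illustrated with a simple hop-count cost model: each candidate IP-to-tile assignment is scored by hop-weighted traffic volume (a latency proxy) and by link/router energy (a power proxy). The traffic matrix, energy constants, mesh size, and the exhaustive weighted-sum search below are illustrative stand-ins, not the M(2)AIA algorithm or the benchmarks evaluated in the paper.

```python
import itertools
import numpy as np

def hop_distance(a, b, cols):
    """Manhattan (XY-routing) hop count between two tiles of a mesh NoC."""
    ax, ay = divmod(a, cols)
    bx, by = divmod(b, cols)
    return abs(ax - bx) + abs(ay - by)

def objectives(mapping, traffic, cols, e_link=1.0, e_router=1.5):
    """Latency and power proxies for one IP-to-tile mapping (hop-count model)."""
    latency = power = 0.0
    for i, src_tile in enumerate(mapping):
        for j, dst_tile in enumerate(mapping):
            vol = traffic[i, j]
            if vol:
                hops = hop_distance(src_tile, dst_tile, cols)
                latency += vol * hops
                power += vol * (hops * e_link + (hops + 1) * e_router)
    return latency, power

# Exhaustive search over all mappings of 4 IPs onto a 2x2 mesh (toy size);
# real design spaces require heuristics such as evolutionary algorithms.
rng = np.random.default_rng(1)
traffic = rng.integers(0, 10, size=(4, 4)) * (1 - np.eye(4, dtype=int))
best = min(itertools.permutations(range(4)),
           key=lambda m: sum(objectives(m, traffic, cols=2)))  # equal-weight scalarization
print("best mapping:", best, "-> (latency, power):", objectives(best, traffic, cols=2))
```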
Abstract:
Rapid in vitro methods for measuring digestibility may be useful in analysing aquafeeds if the extent and limits of their application are clearly defined. The pH-stat protein digestibility routine with shrimp hepatopancreas enzymes was previously related to apparent protein digestibility in juvenile Litopenaeus vannamei fed diets containing different protein ingredients. The potential of the method to predict the culture performance of shrimp fed six commercial feeds (T3, T4, T5, T6, T7 and T8) with a declared crude-protein content of 350 g kg(-1) was assessed. The consistency of results obtained using hepatopancreas enzyme extracts from either pond- or clear water-raised shrimp was further verified in terms of reproducibility and possible diet history effects upon in vitro outputs. Shrimp were previously acclimated and then maintained over 56 days (initial mean weight 3.28 g) on each diet in 500-L tanks at 114 ind m(-2), in a clear water closed system with continuous renewal and mechanical filtering (50 mu m), with four replicates per treatment. Feeds were offered four times daily (six days a week), delivered in trays at feeding rates ranging from 4.0% to 7.0% of stocked shrimp biomass. Feed was accessible to the shrimp for 1-h feeding periods (4 h daily in total), after which uneaten feed was recovered. Growth and survival were determined every 14 days from a sample of 16 individuals per tank. Water quality was monitored daily (pH, temperature and salinity) and managed by back-flushing filter cleaning every 7-10 days. Feeds were analysed for crude protein, gross energy, amino acids and pepsin digestibility. The in vitro pH-stat degree of protein hydrolysis (DH%) was determined for each feed using hepatopancreas enzyme extracts from experimental (clear water) or pond-raised shrimp. Feeds resulted in significant differences in shrimp performance (P < 0.05), as seen in the differences in growth rates (0.56-0.98 g week(-1)), final weight and feed conversion ratio (FCR). Shrimp performance and in vitro DH% with pond-raised shrimp enzymes showed significant correlations (P < 0.05) for yield (R-2 = 0.72), growth rates (R-2 = 0.72-0.80) and FCR (R-2 = -0.67). Other feed attributes (protein : energy ratio, amino acids, true protein, non-protein nitrogen contents and in vitro pepsin digestibility) showed no or limited correlation with shrimp culture performance. Additional correlations were found between growth rates and methionine (R-2 = 0.73), FCR and histidine (R-2 = -0.60), and DH% and methionine or methionine+cystine feed contents (R-2 = 0.67-0.92). pH-stat assays with shrimp enzymes generated reproducible DH% results with either pond (CV <= 6.5%) or clear water (CV <= 8.5%) hepatopancreas enzyme sources. Moreover, correlations between shrimp growth rates and feed DH% were significant regardless of the enzyme origin (pond- or clear water-raised shrimp) and showed consistent R-2 values. The results suggest the feasibility of using standardized hepatopancreas enzyme extracts for in vitro protein digestibility assessment.
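As a small illustration of the kind of feed-level correlation analysis reported above (e.g., in vitro DH% versus growth rate), the snippet below fits a least-squares line and computes R-2 for paired values; the numbers are placeholders, not data from the trial.

```python
import numpy as np

# Placeholder per-feed values: in vitro degree of hydrolysis (DH%) and
# observed growth rate (g per week) for six hypothetical feeds.
dh = np.array([28.0, 31.5, 33.0, 35.2, 36.8, 39.0])
growth = np.array([0.56, 0.62, 0.71, 0.80, 0.88, 0.98])

slope, intercept = np.polyfit(dh, growth, 1)
predicted = slope * dh + intercept
ss_res = np.sum((growth - predicted) ** 2)
ss_tot = np.sum((growth - growth.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"growth ~ {slope:.3f} * DH% + {intercept:.3f},  R2 = {r2:.2f}")
```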
Abstract:
Improving the charge capacity, electrochemical reversibility and stability of anode materials is among the main challenges for the development of Ni-based rechargeable batteries and devices. The combination of cobalt as an additive and nanostructuring of the electrode material proved to be a very promising approach for this purpose. The new alpha-NiCo mixed hydroxide based electrodes exhibited high specific charge/discharge capacity (355-714 C g(-1)) and outstanding structural stability, withstanding up to 700 redox cycles without any significant phase transformation, as confirmed by cyclic voltammetry, electrochemical quartz crystal microbalance and X-ray diffractometry. In short, the nanostructured alpha-NiCo mixed hydroxide materials possess superior electrochemical properties and stability, making them strong candidates for application in high-performance batteries and devices. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Objectives: To identify people affected by leprosy with impairments after completing multidrug therapy for leprosy, and to assess their limitations in conducting daily activities by applying the Screening of Activity Limitation and Safety Awareness (SALSA) scale. Methods: A cross-sectional study was performed of all residents of a medium-sized city who were treated for leprosy from 1998 to 2006. A specific questionnaire was applied to obtain general and clinical data, and the SALSA scale was used to assess limitations in activities. Impairments were assessed using the World Health Organization leprosy disability grading system (WHO-DG). Findings: Of the 335 people affected by leprosy treated in the period, 223 (62.1%) were located and interviewed. A total of 51.6% were female, with a mean age of 54 years (SD +/- 15.72), and 67.9% had up to 6 years of formal education. The borderline form predominated among interviewees (39.9%), and 54.3% suffered from associated diseases, with hypertension (29.1%) and diabetes (10.3%) being the most common. Pain was reported by 54.7% of interviewees. By multiple logistic regression analysis, associations were found between limitations in activities and being female (P < 0.025), family income <= 3 minimum wages (P < 0.003), reports of major lesions (P < 0.004), pain (P < 0.001), associated diseases (P < 0.023) and the WHO-DG (P < 0.001). Disabilities, as identified using the WHO-DG, were less common (32%) than limitations in activities as evaluated by the SALSA scale (57.8%). Conclusion: Limitations in activities proved to be common in people affected by leprosy and were associated with low income, being female, reported major lesions, disability, disease and pain.
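For readers unfamiliar with the analysis, the sketch below shows how a multiple logistic regression of an activity-limitation outcome on a few of the covariates named above could be fitted; the simulated respondent data and the statsmodels-based code are purely illustrative and are not the study's dataset or analysis scripts.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical respondent-level data (not the study's data): a binary
# "activity limitation" outcome and a few of the covariates named above.
rng = np.random.default_rng(42)
n = 223
female = rng.integers(0, 2, n)
low_income = rng.integers(0, 2, n)      # family income <= 3 minimum wages
pain = rng.integers(0, 2, n)
logit = -1.0 + 0.8 * female + 0.9 * low_income + 1.2 * pain
limited = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([female, low_income, pain]))
model = sm.Logit(limited, X).fit(disp=False)
print(model.summary(xname=["const", "female", "low_income", "pain"]))
```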
Properties of nanoparticles prepared from NdFeB-based compound for magnetic hyperthermia application
Abstract:
Nanoparticles were prepared from a NdFeB-based alloy using the hydrogen decrepitation process together with high-energy ball milling and tested as a heating agent for magnetic hyperthermia. In the milling time range evaluated (up to 10 h), the magnetic moment per unit mass at H = 1.59 MA m(-1) is above 70 A m(2) kg(-1), whereas the intrinsic coercivity stays below 20 kA m(-1). The material presents both ferromagnetic and superparamagnetic particles constituted by a mixture of phases due to the incomplete disproportionation reaction of Nd2Fe14BHx during milling. Solutions prepared with deionized water and magnetic particles exposed to an AC magnetic field (H-max similar to 3.7 kA m(-1) and f = 228 kHz) exhibited 26 K <= Delta T-max <= 44 K, with a maximum estimated specific absorption rate (SAR) of 225 W kg(-1). For the pure magnetic material milled for the longest period of time (10 h), the SAR was estimated as similar to 2500 W kg(-1). In vitro tests indicated that the powders have acceptable cytotoxicity over a wide range of concentrations (0.1-100 mu g ml(-1)) due to the coating applied during milling.
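The specific absorption rate quoted above is commonly estimated from the initial slope of the heating curve via the calorimetric relation SAR = c_p (m_solution / m_magnetic) dT/dt; the sketch below applies that relation to an illustrative heating curve, not to the measurements reported in the study.

```python
import numpy as np

def specific_absorption_rate(time_s, temp_K, c_p, mass_solution, mass_magnetic,
                             fit_points=10):
    """SAR in W per kg of magnetic material from the initial heating slope:
    SAR = c_p * (m_solution / m_magnetic) * dT/dt."""
    slope = np.polyfit(time_s[:fit_points], temp_K[:fit_points], 1)[0]  # K/s
    return c_p * (mass_solution / mass_magnetic) * slope

# Illustrative heating curve: 1 g of aqueous suspension containing 10 mg of
# magnetic material, warming at roughly 0.001 K/s under the AC field.
t = np.linspace(0.0, 60.0, 61)
T = 298.0 + 0.001 * t
print(f"SAR ~ {specific_absorption_rate(t, T, 4186.0, 1.0, 0.01):.0f} W/kg")
```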
Abstract:
Background: In the analysis of the effects of cell treatment, such as drug dosing, identifying changes in gene network structure between normal and treated cells is a key task. A possible way of identifying the changes is to compare the structures of networks estimated separately from data on normal and treated cells. However, this approach usually fails to estimate accurate gene networks due to the limited length of the time series data and measurement noise. Thus, approaches that identify changes in regulations by using time series data from both conditions in an efficient manner are needed. Methods: We propose a new statistical approach that is based on the state space representation of the vector autoregressive model and estimates gene networks under two different conditions in order to identify changes in regulations between the conditions. In the mathematical model of our approach, hidden binary variables are newly introduced to indicate the presence of regulations under each condition. The use of the hidden binary variables enables efficient data usage: data from both conditions are used for commonly existing regulations, while for condition-specific regulations only the corresponding data are applied. Also, the similarity of the networks under the two conditions is automatically taken into account through the design of the potential function for the hidden binary variables. For the estimation of the hidden binary variables, we derive a new variational annealing method that searches for the configuration of the binary variables maximizing the marginal likelihood. Results: For the performance evaluation, we use time series data from two topologically similar synthetic networks and confirm that our proposed approach estimates commonly existing regulations, as well as changes in regulations, with higher coverage and precision than other existing approaches in almost all the experimental settings. For a real data application, our proposed approach is applied to time series data from normal human lung cells and human lung cells treated by stimulating EGF receptors and dosing an anticancer drug termed Gefitinib. In the treated lung cells, a cancer cell condition is simulated by the stimulation of EGF receptors, but the effect would be counteracted by the selective inhibition of EGF receptors by Gefitinib. However, gene expression profiles actually differ between the conditions, and the genes related to the identified changes are considered possible off-targets of Gefitinib. Conclusions: From the synthetically generated time series data, our proposed approach can identify changes in regulations more accurately than existing methods. By applying the proposed approach to the time series data on normal and treated human lung cells, candidate off-target genes of Gefitinib are found. According to the published clinical information, one of the genes may be related to a factor in interstitial pneumonia, which is known as a side effect of Gefitinib.
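To convey the basic idea of comparing regulatory structure across two conditions, the simplified baseline below fits a first-order vector autoregressive model to each condition separately and flags coefficients that cross a threshold in one condition but not the other. This is only a naive stand-in for intuition; it is not the state-space representation with hidden binary variables or the variational annealing procedure proposed in the paper, and the data are random placeholders.

```python
import numpy as np

def fit_var1(series):
    """Least-squares fit of x_{t+1} = A x_t for a (time x genes) matrix."""
    X_past, X_next = series[:-1], series[1:]
    A, *_ = np.linalg.lstsq(X_past, X_next, rcond=None)
    return A.T      # A[i, j]: influence of gene j on gene i

def changed_edges(A_normal, A_treated, threshold=0.3):
    """Edges present (|coef| > threshold) in one condition but not the other."""
    normal = np.abs(A_normal) > threshold
    treated = np.abs(A_treated) > threshold
    return np.argwhere(normal != treated)

rng = np.random.default_rng(0)
normal_ts = rng.standard_normal((50, 5))       # placeholder expression series
treated_ts = rng.standard_normal((50, 5))
print(changed_edges(fit_var1(normal_ts), fit_var1(treated_ts)))
```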
Abstract:
OBJECTIVES: A number of complications exist with invasive mechanical ventilation and with the use of and withdrawal from prolonged ventilator support. The use of protocols that enable the systematic identification of patients eligible for an interruption in mechanical ventilation can significantly reduce the number of complications. This study describes the application of a weaning protocol and its results. METHODS: Patients who required invasive mechanical ventilation for more than 24 hours were included and assessed daily to identify individuals who were ready to begin the weaning process. RESULTS: We studied 252 patients with a median mechanical ventilation time of 3.7 days (interquartile range of 1 to 23 days), a rapid shallow breathing index value of 48 (median), a maximum inspiratory pressure of 40 cmH2O, and a maximum expiratory pressure of 40 cmH2O (median). Of these 252 patients, 32 (12.7%) had to be reintubated, which represented weaning failure. Noninvasive ventilation was used postextubation in 170 (73%) patients, and 15% of these patients were reintubated, which also represented weaning failure. The mortality rate of the 252 patients studied was 8.73% (22), and there was no significant difference in age, gender, mechanical ventilation time, or maximum inspiratory pressure between the survivors and nonsurvivors. CONCLUSIONS: The use of a specific weaning protocol resulted in a lower mechanical ventilation time and an acceptable reintubation rate. This protocol can be used as a comparative index in hospitals to improve the weaning system, its monitoring and the informative reporting of patient outcomes, and may represent a future tool and source of quality markers for patient care.
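For reference, the rapid shallow breathing index cited above is conventionally computed as the respiratory rate divided by the tidal volume expressed in litres; the helper below shows the arithmetic with illustrative numbers, not patient data from the study.

```python
def rapid_shallow_breathing_index(resp_rate_per_min, tidal_volume_ml):
    """RSBI = breaths per minute divided by tidal volume in litres."""
    return resp_rate_per_min / (tidal_volume_ml / 1000.0)

# Example: 24 breaths/min at a 500 mL tidal volume gives an RSBI of 48
print(rapid_shallow_breathing_index(24, 500))
```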
Abstract:
This work presents the application of Linear Matrix Inequalities (LMIs) to the robust and optimal adjustment of Power System Stabilizers with a pre-defined structure. Results of several tests show that gain and zero adjustments are sufficient to guarantee robust stability and performance with respect to various operating points. Making use of the flexible structure of LMIs, we propose an algorithm that minimizes the norm of the controllers' gain matrix while guaranteeing the damping factor specified for the closed-loop system, always using a controller with a flexible structure. The technique used here is pole placement, whose objective is to place the poles of the closed-loop system in a specific region of the complex plane. Results of tests with a nine-machine system are presented and discussed in order to validate the proposed algorithm. (C) 2012 Elsevier Ltd. All rights reserved.
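As a minimal illustration of pole placement, the underlying technique named above (not the LMI-based regional placement with a fixed controller structure developed in the paper), the snippet below moves the poles of a toy second-order system to better-damped locations using scipy.signal.place_poles; the system matrices are arbitrary example values, not a power-system model.

```python
import numpy as np
from scipy.signal import place_poles

# Toy second-order open-loop system with a poorly damped oscillatory mode
A = np.array([[0.0, 1.0],
              [-4.0, -0.2]])
B = np.array([[0.0],
              [1.0]])

# Desired closed-loop poles: similar frequency, much higher damping
desired = np.array([-1.0 + 1.7j, -1.0 - 1.7j])
result = place_poles(A, B, desired)
K = result.gain_matrix
print("state feedback gain:", K)
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```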
Abstract:
The leaf area index (LAI) is a key characteristic of forest ecosystems. Estimates of LAI from satellite images generally rely on spectral vegetation indices (SVIs) or radiative transfer model (RTM) inversions. We have developed a new and precise method suitable for practical application, consisting of building a species-specific SVI that is best suited to both sensor and vegetation characteristics. Such an SVI requires calibration on a large number of representative vegetation conditions. We developed a two-step approach: (1) estimation of LAI on a subset of satellite data through RTM inversion; and (2) calibration of a vegetation index on these estimated LAI values. We applied this methodology to Eucalyptus plantations, which have highly variable LAI in time and space. Previous results showed that an RTM inversion of Moderate Resolution Imaging Spectroradiometer (MODIS) near-infrared and red reflectance allowed good retrieval performance (R-2 = 0.80, RMSE = 0.41) but was computationally demanding. Here, the RTM results were used to calibrate a dedicated vegetation index (called "EucVI"), which gave similar LAI retrieval results in a simpler way. The R-2 of the regression between measured and EucVI-simulated LAI values on a validation dataset was 0.68, and the RMSE was 0.49. The additional use of stand age and day of year in the SVI equation slightly increased the performance of the index (R-2 = 0.77 and RMSE = 0.41). This simple index opens the way to an easily applicable retrieval of Eucalyptus LAI from MODIS data, which could be used operationally.
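The two-step idea, RTM-derived LAI used to calibrate a dedicated index, can be sketched generically as below. Since the exact form of EucVI is not given in the abstract, a plain normalized-difference index and a linear regression are used as placeholders, and the reflectance and LAI values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Step 1 stand-ins: MODIS-like red/NIR reflectances and LAI values that, in
# the real workflow, would come from radiative-transfer-model inversion.
red = rng.uniform(0.02, 0.08, 200)
nir = rng.uniform(0.25, 0.45, 200)
lai_rtm = 4.0 * (nir - red) / (nir + red) + rng.normal(0, 0.3, 200)

# Step 2: calibrate a simple normalized-difference index against those LAI
ndvi = (nir - red) / (nir + red)
a, b = np.polyfit(ndvi, lai_rtm, 1)
lai_svi = a * ndvi + b
rmse = np.sqrt(np.mean((lai_svi - lai_rtm) ** 2))
print(f"LAI ~ {a:.2f} * NDVI + {b:.2f},  RMSE = {rmse:.2f}")
```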
Abstract:
The extraction of information about neural activity timing from the BOLD signal is a challenging task, as the shape of the BOLD curve does not directly reflect the temporal characteristics of the electrical activity of neurons. In this work, we introduce the concept of neural processing time (NPT) as a parameter of the biophysical model of the hemodynamic response function (HRF). Through this new concept we aim to infer more accurately the duration of the neuronal response from the highly nonlinear BOLD effect. The face validity and applicability of the concept of NPT are evaluated through simulations and the analysis of experimental time series. The results of both the simulation and the application were compared with summary measures of HRF shape. The experiment analyzed consisted of a decision-making paradigm with simultaneous emotional distracters. We hypothesize that the NPT in primary sensory areas, such as the fusiform gyrus, is approximately the stimulus presentation duration. On the other hand, in areas related to the processing of an emotional distracter, the NPT should depend on the experimental condition. As predicted, the NPT in the fusiform gyrus is close to the stimulus duration, and the NPT in the dorsal anterior cingulate gyrus depends on the presence of an emotional distracter. Interestingly, the NPT in the right, but not the left, dorsolateral prefrontal cortex depends on the emotional content of the stimulus. The summary measures of HRF obtained by a standard approach did not detect the variations observed in the NPT. Hum Brain Mapp, 2012. (C) 2010 Wiley Periodicals, Inc.
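One way to picture the NPT parameter is as the duration of a boxcar of neural activity that is convolved with a hemodynamic response: longer assumed processing times yield wider, later-peaking BOLD predictions. The sketch below uses a canonical double-gamma HRF and a linear convolution as a stand-in for the nonlinear biophysical HRF model actually used in the paper.

```python
import numpy as np
from scipy.stats import gamma

dt = 0.1                                   # seconds
t = np.arange(0, 30, dt)

def double_gamma_hrf(t):
    """Canonical double-gamma HRF (peak near 5 s, undershoot near 15 s)."""
    return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

def bold_prediction(npt_seconds):
    """Convolve a boxcar of duration NPT with the HRF (linear approximation)."""
    neural = (t < npt_seconds).astype(float)
    return np.convolve(neural, double_gamma_hrf(t))[: t.size] * dt

# Longer assumed processing times give wider, later-peaking BOLD responses
for npt in (0.5, 2.0, 6.0):
    bold = bold_prediction(npt)
    print(f"NPT = {npt:4.1f} s -> peak {bold.max():.3f} at t = {t[bold.argmax()]:.1f} s")
```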
Abstract:
A semi-autonomous unmanned underwater vehicle (UUV), named LAURS, is being developed at the Laboratory of Sensors and Actuators at the University of Sao Paulo. The vehicle has been designed to provide inspection and intervention capabilities in specific missions in deep-water oil fields. In this work, a method for modeling and identifying the yaw motion dynamics of an open-frame underwater vehicle is presented. Using an on-board low-cost magnetic compass sensor, the method is based on an uncoupled 1-DOF (degree of freedom) dynamic system equation and the application of the integral method, i.e., the classical least-squares algorithm applied to the integral form of the dynamic system equation. Experimental trials with the actual vehicle were performed in a test tank and a diving pool. During these experiments, the thrusters responsible for yaw motion were driven by sinusoidal voltage signal profiles. An assessment of the feasibility of the method reveals that the estimated dynamic system models are more reliable when slow and small sinusoidal voltage signal profiles are considered, i.e., with larger periods and relatively small amplitude and offset.
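A minimal sketch of the integral method named above is given below, assuming a first-order 1-DOF yaw model r_dot = -a r + b u (r: yaw rate, u: thruster voltage). Integrating both sides turns the identification into an ordinary least-squares problem without differentiating noisy measurements; the simulated signals are placeholders, not LAURS trial data.

```python
import numpy as np

def identify_yaw_model(t, r, u):
    """Integral least-squares fit of the 1-DOF yaw model r_dot = -a*r + b*u.

    Integrating both sides from 0 to t gives
        r(t) - r(0) = -a * int_0^t r dtau + b * int_0^t u dtau,
    which is linear in (a, b) and avoids differentiating noisy measurements.
    """
    dt = np.diff(t)

    def running_integral(x):
        return np.concatenate(([0.0], np.cumsum(0.5 * (x[1:] + x[:-1]) * dt)))

    Phi = np.column_stack([-running_integral(r), running_integral(u)])
    theta, *_ = np.linalg.lstsq(Phi, r - r[0], rcond=None)
    return theta  # estimated (a, b)

# Synthetic check: simulate a known model driven by a slow sinusoidal voltage
a_true, b_true = 0.8, 0.5
t = np.arange(0, 60, 0.05)
u = 2.0 * np.sin(2 * np.pi * t / 20.0)
r = np.zeros_like(t)
for k in range(1, t.size):
    r[k] = r[k - 1] + 0.05 * (-a_true * r[k - 1] + b_true * u[k - 1])
print(identify_yaw_model(t, r, u))        # should be close to (0.8, 0.5)
```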