22 results for "Application specific algorithm"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Objective. To evaluate the perception of eating practices and the stages of change among adolescents. Methods. Cross-sectional study involving a representative sample of 390 adolescents from 11 public schools in the city of Piracicaba, Brazil, in 2004. Food consumption was identified by a food frequency questionnaire, and the perception of eating practices was evaluated by comparing food consumption with each individual's classification of the healthy aspects of the diet. The participants were classified within stages of change by means of a specific algorithm. A reclassification within new stages of change was proposed to identify adolescents with similar characteristics regarding food consumption and perception. Results. Low consumption of fruit and vegetables and high consumption of sweets and fats were identified. More than 44% of the adolescents had a mistaken perception of their diet. A significant relationship between the stages of change and food consumption was observed. Reclassification among the stages of change, by including the pseudo-maintenance and non-reflective action stages, was necessary, considering the high proportion of adolescents who erroneously classified their diets as healthy. Conclusion. Classification of the adolescents into stages of change, together with consumption and perception data, enabled identification of groups at risk, in accordance with their inadequate dietary habits and non-recognition of such habits. (C) 2009 Elsevier Inc. All rights reserved.
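The published stage-of-change algorithm is not reproduced in the abstract, so the following is only a hedged illustration of the reclassification idea it describes: crossing measured diet quality with self-perception yields the pseudo-maintenance and non-reflective action stages. All function names, inputs and rules here are hypothetical.

```python
# Illustrative sketch (NOT the study's published algorithm): assigning a
# stage of change from measured diet quality (e.g. from a food frequency
# questionnaire) and the adolescent's own perception of the diet.

def classify_stage(diet_is_healthy: bool, perceives_healthy: bool,
                   intends_to_change: bool) -> str:
    """Toy reclassification including the pseudo-maintenance and
    non-reflective action stages proposed in the abstract."""
    if diet_is_healthy and perceives_healthy:
        return "maintenance"
    if diet_is_healthy and not perceives_healthy:
        return "non-reflective action"   # eats well without recognizing it
    if not diet_is_healthy and perceives_healthy:
        return "pseudo-maintenance"      # mistaken perception of a poor diet
    return "preparation" if intends_to_change else "precontemplation"
```

Under this toy rule, an adolescent with an inadequate diet who rates it as healthy lands in pseudo-maintenance, matching the at-risk group the study highlights.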
Abstract:
This paper presents a new statistical algorithm to estimate rainfall over the Amazon Basin region using the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). The algorithm relies on empirical relationships derived for different raining-type systems between coincident measurements of surface rainfall rate and 85-GHz polarization-corrected brightness temperature as observed by the precipitation radar (PR) and TMI on board the TRMM satellite. The scheme includes rain/no-rain area delineation (screening) and system-type classification routines for rain retrieval. The algorithm is validated against independent measurements of the TRMM-PR and S-band dual-polarization Doppler radar (S-Pol) surface rainfall data for two different periods. Moreover, the performance of this rainfall estimation technique is evaluated against well-known methods, namely, the TRMM-2A12 [the Goddard profiling algorithm (GPROF)], the Goddard scattering algorithm (GSCAT), and the National Environmental Satellite, Data, and Information Service (NESDIS) algorithms. The proposed algorithm shows a normalized bias of approximately 23% for both PR and S-Pol ground truth datasets and a mean error of 0.244 mm h(-1) (PR) and -0.157 mm h(-1) (S-Pol). For rain volume estimates using PR as reference, a correlation coefficient of 0.939 and a normalized bias of 0.039 were found. With respect to rainfall distributions and rain area comparisons, the results showed that the formulation proposed is efficient and compatible with the physics and dynamics of the observed systems over the area of interest. The performance of the other algorithms showed that GSCAT presented low normalized bias for rain areas and rain volume [0.346 (PR) and 0.361 (S-Pol)], and GPROF showed rainfall distribution similar to that of the PR and S-Pol but with a bimodal distribution.
Last, the five algorithms were evaluated during the TRMM-Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) 1999 field campaign to verify the precipitation characteristics observed during the easterly and westerly Amazon wind flow regimes. The proposed algorithm presented a cumulative rainfall distribution similar to the observations during the easterly regime, but it underestimated for the westerly period for rainfall rates above 5 mm h(-1). NESDIS(1) overestimated for both wind regimes but presented the best westerly representation. NESDIS(2), GSCAT, and GPROF underestimated in both regimes, but GPROF was closer to the observations during the easterly flow.
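As a rough illustration of the retrieval chain described above (screening, system-type classification, then an empirical rain-rate relation from the 85-GHz polarization-corrected temperature, PCT85), here is a toy sketch; the screening threshold and the regression coefficients are invented for illustration, not the paper's fitted values.

```python
import math

def rain_rate(pct85_k: float, convective: bool) -> float:
    """Toy rain-rate retrieval (mm/h) from PCT85 in Kelvin.
    Threshold and coefficients are hypothetical placeholders."""
    if pct85_k > 250.0:          # screening: warm scene -> classified as no rain
        return 0.0
    # separate empirical curves per system type, as in the abstract's scheme;
    # lower (colder) PCT85 means stronger ice scattering, hence more rain
    a, b = (60.0, 0.012) if convective else (20.0, 0.010)
    return a * math.exp(-b * pct85_k)
```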
Abstract:
Antibody phage display libraries are a useful tool in proteomic analyses. This study evaluated an antibody recombinant library for identification of sex-specific proteins on the sperm cell surface. The Griffin.1 library was used to produce phage antibodies capable of recognizing membrane proteins from Nelore sperm cells. After producing soluble monoclonal scFv, clones were screened on Simental sperm cells by flow cytometry and those that bound to 40-60% of cells were selected. These clones were re-analyzed using Nelore sperm cells and all clones bound to 40-60% of cells. Positive clones were submitted to a binding assay against male and female bovine leukocytes by flow cytometry and one clone preferentially bound to male cells. The results indicate that phage display antibodies are an alternative method for identification of molecular markers on sperm cells. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
Background: Microarray techniques have become an important tool to the investigation of genetic relationships and the assignment of different phenotypes. Since microarrays are still very expensive, most of the experiments are performed with small samples. This paper introduces a method to quantify dependency between data series composed of few sample points. The method is used to construct gene co-expression subnetworks of highly significant edges. Results: The results shown here are for an adapted subset of a Saccharomyces cerevisiae gene expression data set with low temporal resolution and poor statistics. The method reveals common transcription factors with a high confidence level and allows the construction of subnetworks with high biological relevance that reveal characteristic features of the processes driving the organism's adaptations to specific environmental conditions. Conclusion: Our method allows a reliable and sophisticated analysis of microarray data even under severe constraints. The utilization of systems biology improves the biologists' ability to elucidate the mechanisms underlying cellular processes and to formulate new hypotheses.
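The dependency measure itself is the paper's contribution and is not given in the abstract; as a stand-in, the sketch below builds a co-expression subnetwork by scoring every gene pair with plain Pearson correlation and keeping only edges above a high threshold.

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

def coexpression_edges(profiles: dict, threshold: float = 0.95):
    """Return gene pairs whose |correlation| meets the threshold;
    profiles maps gene name -> expression series."""
    genes = sorted(profiles)
    return [(g, h) for i, g in enumerate(genes) for h in genes[i + 1:]
            if abs(pearson(profiles[g], profiles[h])) >= threshold]
```

With only a handful of sample points per series, as in the paper's setting, such naive correlations are noisy, which is precisely why a dedicated dependency measure with significance control is needed.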
Abstract:
Background: The Burns Specific Health Scale-Revised (BSHS-R) is easy to apply, can be self-administered, and is considered a good scale to evaluate various important life aspects of burn victims. Objectives: To translate and culturally adapt the BSHS-R into the Brazilian-Portuguese language and to evaluate the internal consistency and convergent validity of the translated BSHS-R. Methods: The cultural adaptation of the BSHS-R included translation and back-translation, discussions with professionals and patients to ensure conceptual equivalence, semantic evaluation, and pre-test of the instrument. The Final Brazilian-Portuguese Version (FBPV) of the BSHS-R was tested on a group of 115 burn patients for internal consistency and construct validity (using the Rosenberg Self-Esteem Scale (RSES) and the Beck Depression Inventory (BDI)). Results: All values of Cronbach's alpha were greater than .8, demonstrating that the internal consistency of the FBPV was very high. Self-esteem was highly correlated with affect and body image (r = .59, p < .001), and with interpersonal relationships (r = .51, p < .001). Correlations between the domains of the FBPV and the BDI were all negative but larger in magnitude than the correlations with the RSES. Depression was highly correlated with affect and body image (r = -.77, p < .001), and with interpersonal relationships (r = -.67, p < .001). Conclusions: The results showed that the version of the BSHS-R adapted into Brazilian-Portuguese fulfills the validity and reliability criteria required of an instrument for health status assessment of burn patients. (C) 2008 Elsevier Ltd and ISBI. All rights reserved.
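Cronbach's alpha, the internal-consistency statistic the validation relies on, follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), for k items scored across the same respondents. A minimal sketch:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, aligned across respondents."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(variance(it) for it in items)
                          / variance(totals))
```

Perfectly consistent items give alpha = 1; values above .8, as reported here, indicate high internal consistency.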
Abstract:
The main objective of this paper is to relieve power system engineers of the burden of the complex and time-consuming process of power system stabilizer (PSS) tuning. To achieve this goal, the paper proposes an automatic process for computerized tuning of PSSs, based on an iterative process that uses a linear matrix inequality (LMI) solver to find the PSS parameters. It is shown in the paper that PSS tuning can be written as a search problem over a non-convex feasible set. The proposed algorithm solves this feasibility problem using an iterative LMI approach and a suitable initial condition, corresponding to a PSS designed for nominal operating conditions only (which is quite a simple task, since the required phase compensation is uniquely defined). Some knowledge about PSS tuning is also incorporated in the algorithm through the specification of bounds defining the allowable PSS parameters. The application of the proposed algorithm to a benchmark test system and the nonlinear simulation of the resulting closed-loop models demonstrate the efficiency of this algorithm. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
The general flowshop scheduling problem is a production problem where a set of n jobs have to be processed with an identical flow pattern on m machines. In permutation flowshops the sequence of jobs is the same on all machines. A significant research effort has been devoted to sequencing jobs in a flowshop to minimize the makespan. This paper describes the application of a Constructive Genetic Algorithm (CGA) to makespan minimization on flowshop scheduling. The CGA was proposed recently as an alternative to traditional GA approaches, particularly for evaluating schemata directly. The population, initially formed only by schemata, evolves controlled by recombination to a population of well-adapted structures (schemata instantiation). The CGA implemented is based on the classic NEH heuristic, and a local search heuristic is used to define the fitness functions. The parameters of the CGA are calibrated using a Design of Experiments (DOE) approach. The computational results are compared against some other successful algorithms from the literature on Taillard's well-known standard benchmark. The computational experience shows that this innovative CGA approach provides competitive results for flowshop scheduling problems. (C) 2007 Elsevier Ltd. All rights reserved.
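The NEH heuristic the CGA builds on is well documented: sort jobs by decreasing total processing time, then insert each job at the position that minimizes the partial schedule's makespan. A compact sketch (the CGA itself, with schemata and recombination, is beyond an abstract-level example):

```python
def makespan(seq, p):
    """Makespan of permutation `seq` given p[job][machine] processing times."""
    comp = [0] * len(p[0])          # completion time per machine
    for j in seq:
        prev = 0                    # completion on the previous machine
        for k, t in enumerate(p[j]):
            comp[k] = max(comp[k], prev) + t
            prev = comp[k]
    return comp[-1]

def neh(p):
    """NEH constructive heuristic for permutation flowshop scheduling."""
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    for j in order:                 # try every insertion position for job j
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)
```

For the tiny 3-job, 2-machine instance in the test below, NEH reaches the optimal makespan; on Taillard's benchmark it is a strong but not optimal constructor, which is why it serves as a starting point for metaheuristics like the CGA.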
Abstract:
The specific methanogenic activity (SMA) test is an important tool for the monitoring of anaerobic digestion. This paper presents the behavior of the methanogenic archaea of an anaerobic sludge under different conditions of oxygenation in a fixed-bed anaerobic-aerobic reactor treating domestic sewage. The reactor was operated in a continuous manner under different liquid recycle ratios from aerobic to anaerobic zones in order to remove carbon and nitrogen. The application of the SMA test was adapted from several authors and the measurement of the accumulated methane in the reactor was carried out by means of gas chromatography. Methanogenic organisms were not inhibited by the presence of oxygen. On the contrary, the values of CH4 production rate by sludge exposed to oxygen were greater than those obtained for strictly anaerobic sludge.
Abstract:
A fully conserving algorithm is developed in this paper for the integration of the equations of motion in nonlinear rod dynamics. The starting point is a re-parameterization of the rotation field in terms of the so-called Rodrigues rotation vector, which results in an extremely simple update of the rotational variables. The weak form is constructed with a non-orthogonal projection corresponding to the application of the virtual power theorem. Together with an appropriate time-collocation, it ensures exact conservation of momentum and total energy in the absence of external forces. Appealing is the fact that nonlinear hyperelastic materials (and not only materials with quadratic potentials) are permitted without any prejudice on the conservation properties. Spatial discretization is performed via the finite element method and the performance of the scheme is assessed by means of several numerical simulations.
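The simplicity of the rotational update comes from the composition rule for Rodrigues (Gibbs) vectors r = tan(theta/2) n, which is purely algebraic, with no trigonometric evaluation. A minimal sketch of that rule (note the parameterization is singular at 180 degrees, one reason it is used for incremental updates):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as lists."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def compose(r1, r2):
    """Rotation r1 followed by r2, both as Rodrigues (Gibbs) vectors:
    r = (r1 + r2 + r2 x r1) / (1 - r1 . r2)."""
    dot = sum(a * b for a, b in zip(r1, r2))
    cx = cross(r2, r1)
    return [(a + b + c) / (1 - dot) for a, b, c in zip(r1, r2, cx)]
```

Composing two 60-degree rotations about the same axis, for instance, yields the tangent of 60 degrees along that axis, as the half-angle parameterization predicts.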
Abstract:
The calcium carbonate industry generates solid waste products which, because of their high alkaline content (CaO, CaCO3 and Ca(OH)2), have a substantial impact on the environment. The objectives of this study are to characterize and classify the solid waste products, which are generated during the hydration process of the calcium carbonate industry, according to ABNT's NBR 10.000 series, and to determine the potential and efficiency of using these solid residues to correct soil acidity. Initially, the studied residue was submitted to gross mass, leaching, solubility, pH, X-ray diffractometry, Inductively Coupled Plasma - Atomic Emission Spectrometry (ICP-AES), granularity and humidity analyses. The potential and efficiency of the residue for correcting soil acidity were determined by analysis of the quality attributes for soil correctives (PN, PRNT, Ca and Mg contents, granularity). The results show that the studied residue may be used as a soil acidity corrective, considering that a typical corrective compound is recommended for each different type of soil. Additionally, the product must be further treated (dried and ground) to suit the specific requirements of the consumer market.
Abstract:
Application of the thermal sum concept was developed to determine the optimal harvesting stage of new banana hybrids to be grown for export. It was tested on two triploid hybrid bananas, FlhorBan 916 (F916) and FlhorBan 918 (F918), created by CIRAD's banana breeding programme, using two different approaches. The first approach was used with F916 and involved calculating the base temperature of bunches sampled at two sites at the ripening stage, and then determining the thermal sum at which the stage of maturity would be identical to that of the control Cavendish export banana. The second approach was used to assess the harvest stage of F918 and involved calculating the two thermal parameters directly, but using more plants and a longer period. Using the linear regression model, the estimated thermal parameters were a thermal sum of 680 degree-days (dd) at a base temperature of 17.0 degrees C for cv. F916, and 970 dd at 13.9 degrees C for cv. F918. This easy-to-use method provides quick and reliable calculations of the two thermal parameters required at a specific harvesting stage for a given banana variety in tropical climate conditions. Determining these two values is an essential step for gaining insight into the agronomic features of a new variety and its potential for export. (C) 2011 Elsevier B.V. All rights reserved.
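The harvesting rule above translates directly into code: accumulate daily degree-days above the base temperature until the variety's thermal sum (e.g. 680 dd over 17.0 degrees C for F916, 970 dd over 13.9 degrees C for F918) is reached. Using a single daily-mean temperature and a simple cutoff at the base temperature are simplifying assumptions of this sketch:

```python
def harvest_day(daily_mean_temps, base_c, target_dd):
    """Return the 1-based day on which the cumulative thermal sum
    (degree-days above base_c) reaches target_dd, or None if it never does."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(t - base_c, 0.0)   # only temperatures above base count
        if total >= target_dd:
            return day
    return None
```

At a constant 27 degrees C, for example, F916 accumulates 10 dd per day and reaches its 680 dd threshold on day 68.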
Abstract:
Prediction of carbohydrate fractions using equations from the Cornell Net Carbohydrate and Protein System (CNCPS) is a valuable tool to assess the nutritional value of forages. In this paper these carbohydrate fractions were predicted using data from three sunflower (Helianthus annuus L.) cultivars, fresh or as silage. The CNCPS equations for fractions B2 and C include measurement of ash and protein-free neutral detergent fibre (NDF) as one of their components. However, NDF lacks pectin and other non-starch polysaccharides that are found in the cell wall (CW) matrix, so this work compared the use of a crude CW preparation instead of NDF in the CNCPS equations. There were no differences in the estimates of fractions B2 and C when CW replaced NDF; however, there were differences in fractions A and B1. Some of the CNCPS equations could be simplified when using CW instead of NDF. Notably, lignin could be expressed as a proportion of DM, rather than on the basis of ash and protein-free NDF, when predicting CNCPS fraction C. The CNCPS fraction B1 (starch + pectin) values were lower than pectin determined through wet chemistry. This finding, along with the results obtained by the substitution of CW for NDF in the CNCPS equations, suggests that pectin was not part of fraction B1 but present in fraction A. We suggest that pectin and other non-starch polysaccharides that are dissolved by the neutral detergent solution be allocated to a specific fraction (B2) and that another fraction (B3) be adopted for the digestible cell wall carbohydrates.
Abstract:
A rapid, sensitive and specific LC-MS/MS method was developed and validated for quantifying chlordesmethyldiazepam (CDDZ or delorazepam), the active metabolite of cloxazolam, in human plasma. In the analytical assay, bromazepam (internal standard) and CDDZ were extracted using a liquid-liquid extraction (diethyl ether/hexane, 80/20, v/v) procedure. The LC-MS/MS method on a RP-C18 column had an overall run time of 5.0 min and was linear (1/x weighted) over the range 0.5-50 ng/mL (R > 0.999). The between-run precision was 8.0% (1.5 ng/mL), 7.6% (9 ng/mL), 7.4% (40 ng/mL), and 10.9% at the lower limit of quantification (LLOQ, 0.500 ng/mL). The between-run accuracies were 0.1, -1.5, -2.7 and 8.7% for the above-mentioned concentrations, respectively. All current bioanalytical method validation requirements (FDA and ANVISA) were met, and the method was applied to a bioequivalence study (cloxazolam, test, Eurofarma Lab. Ltda; Olcadil(R), reference, Novartis Biociencias S/A). The relative bioavailability of the two formulations was assessed by calculating individual test/reference ratios for Cmax, AUClast and AUC0-inf. The pharmacokinetic profiles indicated bioequivalence, since all ratios were within the limits proposed by the FDA and ANVISA. Copyright (C) 2009 John Wiley & Sons, Ltd.
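A hedged sketch of the bioequivalence arithmetic mentioned: AUC(0-last) by the linear trapezoidal rule and the individual test/reference ratio (Cmax would be compared the same way). The full regulatory analysis, on log-transformed data with 90% confidence intervals against the acceptance limits, is not shown:

```python
def auc_last(times, concs):
    """AUC from time 0 to the last sampling time, linear trapezoidal rule."""
    return sum((t1 - t0) * (c0 + c1) / 2
               for (t0, c0), (t1, c1) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def tr_ratio(times, test_concs, ref_concs):
    """Individual test/reference AUC(0-last) ratio for one subject,
    assuming both formulations were sampled at the same times."""
    return auc_last(times, test_concs) / auc_last(times, ref_concs)
```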
Abstract:
The evolution of commodity computing led to the possibility of efficient usage of interconnected machines to solve computationally intensive tasks, which were previously solvable only by using expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential to perform effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate the current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, considering chaotic properties of such behavior and the automatic detection of critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of the process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online predictions due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
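The paper's chaos-aware prediction model is not specified in the abstract; as a minimal stand-in for the kind of low-cost online behavior prediction it argues for, here is the classic exponentially weighted moving average of observed CPU bursts (not the authors' method):

```python
class BurstPredictor:
    """Online EWMA predictor of a process's next CPU burst length.
    A conventional low-cost estimator, used here purely as illustration."""

    def __init__(self, alpha: float = 0.5, initial: float = 0.0):
        self.alpha = alpha          # weight given to the newest observation
        self.estimate = initial

    def observe(self, burst: float) -> None:
        """Fold a newly measured burst into the running estimate."""
        self.estimate = self.alpha * burst + (1 - self.alpha) * self.estimate

    def predict(self) -> float:
        """Current prediction of the next burst length."""
        return self.estimate
```

A scheduler can then place or delay processes based on the predicted burst, which is the role the abstract assigns to its (more sophisticated) behavior model.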
Abstract:
We consider the two-level network design problem with intermediate facilities. This problem consists of designing a minimum cost network respecting some requirements, usually described in terms of the network topology or in terms of a desired flow of commodities between source and destination vertices. Each selected link must receive one of two types of edge facilities, and the connection of different edge facilities requires a costly and capacitated vertex facility. We propose a hybrid decomposition approach which heuristically obtains tentative solutions for the number and location of the vertex facilities and uses these solutions to limit the computational burden of a branch-and-cut algorithm. We test our method on instances of the power system secondary distribution network design problem. The results show that the method is efficient both in terms of solution quality and computational times. (C) 2010 Elsevier Ltd. All rights reserved.