15 results for moment based approach
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Religious communities have been a challenge to HIV prevention globally. Focusing on the acceptability component of the right to health, this intervention study examined how local Catholic, Evangelical and Afro-Brazilian religious communities can collaborate to foster young people's sexual health and ensure their access to comprehensive HIV prevention in their communities in Brazil. This article describes the process of a three-stage sexual health promotion and HIV prevention initiative that used a multicultural human rights approach to intervention. Methods included 27 in-depth interviews with religious authorities on sexuality, AIDS prevention and human rights, and the training of 18 young people as research agents, who surveyed 177 youth on the same issues using self-administered questionnaires. The results, analysed using a rights-based perspective on health and the vulnerability framework, were discussed in daylong interfaith workshops. Emblematic of the collaborative process, the workshops are the focus of the analysis. Our findings suggest that this human rights framework is effective in increasing inter-religious tolerance and in providing a collective understanding of the sexuality and prevention needs of youth from different religious communities, and that it also serves as a platform for the expansion of state AIDS programmes based on secular principles.
Abstract:
In this paper, a modeling technique for small-signal stability assessment of unbalanced power systems is presented. Since power distribution systems are inherently unbalanced, due to the characteristics of their lines and loads, and the penetration of distributed generation into these systems is currently increasing, such a tool is needed to ensure the secure and reliable operation of these systems. The main contribution of this paper is the development of a phasor-based model for the study of dynamic phenomena in unbalanced power systems. Using an assumption on the net torque of the generator, it is possible to precisely define an equilibrium point for the phasor model of the system, thus enabling its linearization around this point and, consequently, its eigenvalue/eigenvector analysis for small-signal stability assessment. The modeling technique presented here was compared to the dynamic behavior observed in ATP simulations, and the results show that, for the generator and controller models used, the proposed modeling approach is adequate and yields reliable and precise results.
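For orientation, a minimal sketch of the generic eigenvalue workflow this abstract builds on: define the dynamics, locate an equilibrium, linearize numerically, and test whether all Jacobian eigenvalues have negative real parts. The single-machine swing equation and every parameter value below are illustrative assumptions, not the authors' unbalanced phasor model.

```python
import numpy as np

# Toy single-machine swing equation, state x = [delta, omega]:
#   d(delta)/dt = omega
#   d(omega)/dt = (Pm - Pe_max*sin(delta) - D*omega) / M
Pm, Pe_max, D, M = 0.8, 1.2, 0.1, 0.2  # illustrative parameters

def f(x):
    delta, omega = x
    return np.array([omega, (Pm - Pe_max * np.sin(delta) - D * omega) / M])

# Equilibrium: omega = 0 and Pm = Pe_max*sin(delta)
x_eq = np.array([np.arcsin(Pm / Pe_max), 0.0])

# Numerical Jacobian: linearization around the equilibrium point
eps = 1e-6
J = np.column_stack([(f(x_eq + eps * e) - f(x_eq - eps * e)) / (2 * eps)
                     for e in np.eye(2)])

# Small-signal stability: all eigenvalues must have negative real parts
eigvals = np.linalg.eigvals(J)
print("eigenvalues:", eigvals)
print("small-signal stable:", np.all(eigvals.real < 0))
```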
Abstract:
Traditional abduction imposes as a precondition the restriction that the background information may not derive the goal data. In first-order logic such a precondition is, in general, undecidable. To avoid this problem, we present a first-order cut-based abduction method, which has KE-tableaux as its underlying inference system. This inference system allows for the automation of non-analytic proofs in a tableau setting, which permits a generalization of traditional abduction that avoids the undecidable precondition problem. After demonstrating the correctness of the method, we show how this method can be dynamically iterated in a process that leads to the construction of non-analytic first-order proofs and, in some terminating cases, to refutations as well.
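To make the precondition concrete: in propositional logic, where entailment is decidable by truth-table enumeration, classical abduction first checks that the background B alone does not entail the goal g, then searches for hypotheses h with B ∧ h consistent and B ∧ h ⊨ g. The toy below (invented atoms, no KE-tableaux) only illustrates that classical scheme, not the paper's first-order cut-based method.

```python
from itertools import product

ATOMS = ["rain", "sprinkler", "wet"]

def models(formula):
    """All truth assignments satisfying the formula."""
    return [dict(zip(ATOMS, vals))
            for vals in product([False, True], repeat=len(ATOMS))
            if formula(dict(zip(ATOMS, vals)))]

def entails(premise, goal):
    return all(goal(v) for v in models(premise))

# Background: rain -> wet, sprinkler -> wet; goal: wet
B = lambda v: (not v["rain"] or v["wet"]) and (not v["sprinkler"] or v["wet"])
g = lambda v: v["wet"]

assert not entails(B, g)  # classical precondition: B alone does not derive g

# Abduce: atomic hypotheses h with B & h consistent and B & h |= g.
# Note "wet" itself shows up as a trivial explanation; real abductive
# systems add minimality/relevance criteria to filter such candidates.
for atom in ATOMS:
    h = lambda v, a=atom: v[a]
    conj = lambda v, h=h: B(v) and h(v)
    if models(conj) and entails(conj, g):
        print("candidate explanation:", atom)
```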
Abstract:
The present work shows a novel fractal dimension method for shape analysis. The proposed technique extracts descriptors from a shape by applying a multi-scale approach to the calculation of the fractal dimension. The fractal dimension is estimated by applying the curvature scale-space technique to the original shape. By applying a multi-scale transform to this calculation, we obtain a set of descriptors capable of describing the shape under investigation with high precision. We validate the computed descriptors in a classification process. The results demonstrate that the novel technique provides highly reliable descriptors, confirming the efficiency of the proposed method. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4757226]
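For orientation, a common baseline estimator of a shape's fractal dimension is box counting: the slope of log N(s) against log(1/s) over a range of box sizes s. The sketch below applies that baseline to a synthetic contour; the paper's method instead estimates the dimension via the curvature scale-space, which this sketch does not implement.

```python
import numpy as np

def box_count(img, box):
    """Number of box x box cells containing at least one foreground pixel."""
    h, w = img.shape
    H, W = h // box * box, w // box * box
    blocks = img[:H, :W].reshape(H // box, box, W // box, box)
    return np.count_nonzero(blocks.any(axis=(1, 3)))

def box_counting_dimension(img, boxes=(2, 4, 8, 16, 32)):
    counts = [box_count(img, b) for b in boxes]
    # fractal dimension estimated as the slope of log N(s) vs log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(boxes)), np.log(counts), 1)
    return slope

# Illustrative shape: a thin circular contour on a 256x256 grid
yy, xx = np.mgrid[:256, :256]
r = np.hypot(xx - 128, yy - 128)
contour = np.abs(r - 80) < 1.5

print("estimated fractal dimension:", box_counting_dimension(contour))
```

For a smooth contour like this the estimate lands near 1, as expected for a one-dimensional curve.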
Abstract:
Selective modulation of liver X receptor beta (LXR beta) has been recognized as an important approach to prevent or reverse the atherosclerotic process. In the present work, we have developed robust conformation-independent fragment-based quantitative structure-activity and structure-selectivity relationship models for a series of quinolines and cinnolines as potent modulators of the two LXR subtypes. The generated models were then used to predict the potency of an external test set, and the predicted values were in good agreement with the experimental results, indicating the potential of the models for untested compounds. The final 2D molecular recognition patterns obtained were integrated with 3D structure-based molecular modeling studies to provide useful insights into the chemical and structural determinants of increased LXR beta binding affinity and selectivity. (C) 2011 Elsevier Inc. All rights reserved.
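A hedged sketch of the modeling workflow the abstract describes: build a descriptor matrix, fit a regression model, and validate on a held-out "external" set. The fragment counts and potencies below are synthetic stand-ins; a real study would use descriptors computed for the quinoline/cinnoline series.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for fragment descriptors vs. measured potencies
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(120, 30)).astype(float)   # fragment counts
true_w = rng.normal(0, 0.3, size=30)
y = X @ true_w + rng.normal(0, 0.2, size=120)          # simulated potency

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1)
model = Ridge(alpha=1.0).fit(X_train, y_train)

# "External test set" prediction, mirroring the validation step above
print("external R^2:", r2_score(y_test, model.predict(X_test)))
```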
Abstract:
Blood-brain barrier (BBB) permeation is an essential property for drugs that act in the central nervous system (CNS) for the treatment of human diseases such as epilepsy, depression, Alzheimer's disease, Parkinson's disease and schizophrenia, among others. In the present work, quantitative structure-property relationship (QSPR) studies were conducted for the development and validation of in silico models for the prediction of BBB permeation. The data set used has substantial chemical diversity and a relatively wide distribution of property values. The generated QSPR models showed good statistical parameters and were successfully employed for the prediction of a test set containing 48 compounds. The predictive models presented herein are useful in the identification, selection and design of new drug candidates with improved pharmacokinetic properties.
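BBB permeation prediction can equally be cast as classification. The sketch below is an assumed, self-contained illustration: random stand-in descriptors, a simulated logBB value, an illustrative 0.0 cutoff for the BBB+ label, and cross-validation as the internal check; none of it reproduces the paper's data set or models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Random stand-ins for descriptors such as logP, TPSA, molecular weight
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 8))                       # descriptor matrix
logBB = X @ rng.normal(0, 0.5, 8) + rng.normal(0, 0.3, 200)
y = (logBB > 0.0).astype(int)                       # illustrative BBB+ cutoff

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)           # internal validation
print("mean CV accuracy:", scores.mean())
```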
Abstract:
Exergetic analysis can provide useful information as it enables the identification of the irreversible phenomena that bring about entropy generation and, therefore, exergy losses (also referred to as irreversibilities). As far as human thermal comfort is concerned, irreversibilities can be evaluated based on parameters related to both the occupant and their surroundings. In an attempt to provide further insight into the exergetic analysis of thermal comfort, this paper calculates irreversibility rates for a sitting person wearing fairly light clothes and subjected to combinations of ambient air and mean radiant temperatures. The thermodynamic model framework relies on the so-called conceptual energy balance equation together with empirical correlations for the invoked thermoregulatory heat transfer rates, adapted for a clothed body. Results suggested that a minimum irreversibility rate may exist for particular combinations of the aforesaid surrounding temperatures. By separately considering the contribution of each thermoregulatory mechanism, the total irreversibility rate proved more responsive to either convective or radiative clothing-influenced heat transfers, with exergy losses becoming lower if the body is able to transfer more heat to the ambient via convection.
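A back-of-the-envelope sketch of the underlying bookkeeping, via the Gouy-Stodola relation I = T0 · S_gen applied to heat crossing the finite temperature gap between the clothed-body surface and the ambient. The coefficients and temperatures are rough textbook-style assumptions; the minimum reported in the paper emerges only from the full thermoregulatory model, which this sketch does not reproduce.

```python
import numpy as np

T_cl = 306.0          # clothed-body surface temperature [K], assumed
A = 1.8               # body surface area [m^2], assumed
h_c, h_r = 3.0, 4.7   # convective / radiative coefficients [W/m^2.K], assumed

for T_amb in np.arange(288.0, 304.0, 2.0):   # ambient = mean radiant here
    Q_conv = h_c * A * (T_cl - T_amb)        # convective heat loss [W]
    Q_rad = h_r * A * (T_cl - T_amb)         # linearized radiative loss [W]
    # Entropy generated by heat crossing the finite temperature gap
    S_gen = (Q_conv + Q_rad) * (1 / T_amb - 1 / T_cl)
    # Gouy-Stodola: irreversibility = dead-state temperature times S_gen
    print(f"T_amb={T_amb:.0f} K  irreversibility ~ {T_amb * S_gen:.2f} W")
```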
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
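A toy sketch of the event-sequence-graph idea: events are nodes, admissible successions are edges (including an edge into an undesirable "fault" event), and tests are entry-to-exit event sequences that jointly cover every edge. The graph and event names are hypothetical, and the greedy generator below is a simplification of real ESG test generation.

```python
# Hypothetical event sequence graph: node -> allowed successor events
ESG = {
    "start":  ["login"],
    "login":  ["search", "fault"],   # "fault" models an undesirable event
    "search": ["book", "search"],
    "book":   ["end"],
    "fault":  ["end"],
    "end":    [],
}

def edge_covering_sequences(graph, entry="start", exit_="end"):
    """Greedy walks from entry to exit until every edge has been exercised."""
    uncovered = {(a, b) for a, succ in graph.items() for b in succ}
    tests = []
    while uncovered:
        node, seq = entry, [entry]
        while node != exit_:
            succ = graph[node]
            # prefer an uncovered outgoing edge, else take the first edge
            nxt = next((b for b in succ if (node, b) in uncovered), succ[0])
            uncovered.discard((node, nxt))
            seq.append(nxt)
            node = nxt
        tests.append(seq)
    return tests

for t in edge_covering_sequences(ESG):
    print(" -> ".join(t))
```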
Abstract:
Rare variants are becoming the new candidates in the search for genetic variants that predispose individuals to a phenotype of interest. Their low prevalence in a population requires the development of dedicated detection and analytical methods. A family-based approach could greatly enhance their detection and interpretation because rare variants are nearly family specific. In this report, we test several distinct approaches for analyzing the information provided by rare and common variants and how they can be effectively used to pinpoint putative candidate genes for follow-up studies. The analyses were performed on the mini-exome data set provided by Genetic Analysis Workshop 17. Eight approaches were tested, four using the trait's heritability estimates and four using QTDT models. These methods were compared in terms of sensitivity, specificity, and positive and negative predictive values in light of the simulation parameters. Our results highlight important limitations of current methods for dealing with rare and common variants: all methods presented reduced specificity and were consequently prone to false-positive associations. Methods analyzing common-variant information showed enhanced sensitivity compared with rare-variant methods. Furthermore, our limited knowledge of the use of biological databases for gene annotations, possibly for use as covariates in regression models, imposes a barrier to further research.
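The four comparison metrics reduce each approach to a confusion table over truly causal versus neutral genes; the sketch below just makes that bookkeeping explicit, with invented counts chosen to mimic the abstract's finding of reduced specificity.

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening metrics from a gene-level confusion table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# e.g. a method flagging many genes: decent sensitivity, poor specificity,
# hence many false-positive associations (counts are made up)
print(screening_metrics(tp=30, fp=120, fn=10, tn=840))
```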
Abstract:
Background: Over the last few years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks were designed considering the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring mainly two technical skills: (i) knowing syntax details of the programming language employed to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only be initiated once the development process reaches the implementation phase, preventing it from starting earlier. Method: In order to solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be automatically generated. Results: We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort of maintaining the resulting applications. Conclusion: Using the approach presented herein, it was possible to conclude that (i) the instantiation of CFs can be automated, and (ii) developer productivity is improved when a model-based instantiation approach is used.
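To make the reuse step concrete, a deliberately simplified sketch of model-driven instantiation: the engineer fills in a Reuse Model (here a plain dict) and glue code binding the application to the framework is generated from a template. The field names and the AspectJ-flavored output are hypothetical, not the paper's RRM/RM notation.

```python
from string import Template

# Hypothetical filled-in Reuse Model for a persistence concern
reuse_model = {
    "app_class": "Customer",
    "table":     "customers",
    "id_field":  "customer_id",
}

# Hypothetical code template for the generated binding
template = Template(
    "aspect ${app_class}Persistence {\n"
    "    declare parents: ${app_class} implements PersistentEntity;\n"
    "    String tableOf(${app_class} e)   { return \"${table}\"; }\n"
    "    String idFieldOf(${app_class} e) { return \"${id_field}\"; }\n"
    "}\n"
)

print(template.substitute(reuse_model))  # generated reuse code
```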
Abstract:
Many recent survival studies propose modeling data with a cure fraction, i.e., data in which part of the population is not susceptible to the event of interest. This event may occur more than once for the same individual (a recurrent event). We then have a scenario of recurrent event data in the presence of a cure fraction, which may appear in various areas such as oncology, finance and industry, among others. This paper proposes a multiple time scale survival model to analyze recurrent events with a cure fraction. The objective is to analyze, in terms of covariates and censoring, the efficiency of certain interventions in preventing the studied event from happening again. All estimates were obtained using a sampling-based approach, which allows prior information to be incorporated with lower computational effort. Simulations were carried out based on a clinical scenario in order to observe some frequentist properties of the estimation procedure for small and moderate sample sizes. An application to a well-known set of real mammary tumor data is provided.
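A small simulation sketch of the data-generating scenario (not the paper's estimation procedure): each subject is cured with some probability and otherwise accumulates exponentially distributed gap times until administrative censoring. All parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n, cure_prob, rate, tau = 500, 0.3, 0.4, 5.0   # tau = censoring time

event_counts = []
for _ in range(n):
    if rng.random() < cure_prob:
        event_counts.append(0)                 # cured: immune to the event
        continue
    t, k = 0.0, 0
    while True:
        t += rng.exponential(1 / rate)         # gap time between recurrences
        if t > tau:                            # administratively censored
            break
        k += 1
    event_counts.append(k)

print("mean events per subject:", np.mean(event_counts))
print("fraction with zero events:", np.mean(np.array(event_counts) == 0))
```

Note that subjects with zero observed events mix cured individuals with susceptible ones that were censored early, which is exactly what makes estimation of the cure fraction non-trivial.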
Abstract:
We report a morphology-based approach for the automatic identification of outlier neurons, as well as its application to the NeuroMorpho.org database, with more than 5,000 neurons. Each neuron in a given analysis is represented by a feature vector composed of 20 measurements, which are then projected into a two-dimensional space by applying principal component analysis. Bivariate kernel density estimation is then used to obtain the probability distribution for the group of cells, so that the cells with the highest probabilities are understood as archetypes while those with the smallest probabilities are classified as outliers. The potential of the methodology is illustrated in several cases involving uniform cell types as well as cell types for specific animal species. The results provide insights regarding the distribution of cells, yielding single and multi-variate clusters, and they suggest that outlier cells tend to be more planar and tortuous. The proposed methodology can be used in several situations involving one or more categories of cells, as well as for the detection of new categories and possible artifacts.
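The pipeline in this abstract maps directly onto standard tooling; the sketch below follows it step by step (features, 2D PCA projection, bivariate KDE, low-density cells flagged as outliers), with a random feature matrix standing in for the 20 morphological measurements and an assumed 5% density cutoff.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import gaussian_kde

# Random stand-in: each row would hold one neuron's 20 measurements
rng = np.random.default_rng(0)
features = rng.normal(size=(300, 20))

proj = PCA(n_components=2).fit_transform(features)   # 2D projection
density = gaussian_kde(proj.T)(proj.T)               # bivariate KDE per cell

cutoff = np.quantile(density, 0.05)        # assumed: bottom 5% as outliers
outliers = np.flatnonzero(density < cutoff)
archetypes = np.argsort(density)[-5:]      # highest-density "archetypes"
print("outlier indices:", outliers)
print("archetype indices:", archetypes)
```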
Abstract:
The analysis of spatial relations among objects in an image is an important vision problem that involves both shape analysis and structural pattern recognition. In this paper, we propose a new approach to characterize the spatial relation "along", an important feature of spatial configurations in space that has been overlooked in the literature up to now. We propose a mathematical definition of the degree to which an object A is along an object B, based on the region between A and B and a degree of elongatedness of this region. In order to better fit the perceptual meaning of the relation, distance information is included as well. In order to cover a wider range of potential applications, both the crisp and fuzzy cases are considered. In the crisp case, the objects are represented in terms of 2D regions or 1D contours, and the definition of the alongness between them is derived from a visibility notion and from the region between the objects. However, the computational complexity of this approach leads us to propose a new model to calculate the between-region using the convex hull of the contours. On the fuzzy side, the region-based approach is extended. Experimental results obtained using synthetic shapes and brain structures in medical imaging corroborate the proposed model and the derived measures of alongness, thus showing that they agree with common sense. (C) 2011 Elsevier Ltd. All rights reserved.
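A rough sketch of the convex-hull variant mentioned in the abstract: approximate the region between A and B by the convex hull of their union minus the two objects, then score the elongatedness of that region. The perimeter-squared-over-area measure and the example rectangles are assumptions, not the paper's exact definitions; the geometry is handled with the shapely library.

```python
from shapely.geometry import box
from shapely.ops import unary_union

A = box(0, 0, 10, 1)          # long thin rectangle
B = box(0, 2, 10, 3)          # parallel rectangle -> strongly "along"

# Between-region: convex hull of the union, minus the objects themselves
hull = unary_union([A, B]).convex_hull
between = hull.difference(A).difference(B)

# Elongatedness proxy: squared perimeter over area is minimal (4*pi)
# for a disc and grows for stretched regions
elongatedness = between.length ** 2 / between.area
print("between-region area:", between.area)
print("elongatedness:", elongatedness)
```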
Abstract:
Background: Tnt1 was the first active plant retrotransposon identified in tobacco after nitrate reductase gene disruption. The Tnt1 superfamily comprises elements from Nicotiana (Tnt1 and Tto1) and Lycopersicon (Retrolyc1 and Tlc1) species. The study presented here was conducted to characterise Tnt1-related sequences in 20 wild species of Solanum and five cultivars of Solanum tuberosum. Results: Tnt1-related sequences were amplified from total genomic DNA using a PCR-based approach. Purified fragments were cloned and sequenced, and clustering analysis revealed three groups that differ in their U3 region. Using a network approach with a total of 453 non-redundant sequences isolated from Solanum (197), Nicotiana (140) and Lycopersicon (116) species, it is demonstrated that the Tnt1 superfamily can be treated as a population to resolve previous phylogenetic multifurcations. The resulting RNAseH network revealed that sequences group according to the Solanaceae genus, supporting a strong association with the host genome, whereas tracing the U3 region sequence association characterises the modular evolutionary pattern within the Tnt1 superfamily. Within each genus, and irrespective of species, nearly 20% of the Tnt1 sequences analysed are identical, indicative of being part of an active copy. The network approach enabled the identification of putative "master" sequences and provided evidence that within a genus these master sequences are associated with distinct U3 regions. Conclusion: The results presented here support the hypothesis that the Tnt1 superfamily was present early in the evolution of the Solanaceae. The evidence also suggests that the RNAseH region of Tnt1 became fixed at the host genus level whereas, within each genus, propagation was ensured by the diversification of the U3 region. Different selection pressures seem to have acted on the U3 and RNAseH modules of ancestral Tnt1 elements, probably due to the distinct functions of these regions in the retrotransposon life cycle, resulting in both co-evolution and adaptation of the element population with its host.
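A toy illustration of the population/network treatment (invented 10-bp sequences, not real Tnt1 data): sequences are nodes, near-identical pairs are linked, and a highly connected node plays the role of a putative "master" sequence.

```python
from collections import Counter
from itertools import combinations

# Invented aligned sequences standing in for Tnt1-related clones
seqs = {
    "s1": "ATGCATGCAT",
    "s2": "ATGCATGCAT",   # identical to s1 (candidate active copy)
    "s3": "ATGCATGGAT",
    "s4": "ATGTATGGAT",
    "s5": "TTGTATGGAC",
}

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Link near-identical pairs (at most one mismatch)
edges = [(u, v) for u, v in combinations(seqs, 2)
         if hamming(seqs[u], seqs[v]) <= 1]

degree = Counter(n for e in edges for n in e)
print("network edges:", edges)
print("putative master sequence:", degree.most_common(1)[0][0])
```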