11 results for Competency-Based Approach

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Abstract:

Religious communities have been a challenge to HIV prevention globally. Focusing on the acceptability component of the right to health, this intervention study examined how local Catholic, Evangelical and Afro-Brazilian religious communities can collaborate to foster young people's sexual health and ensure their access to comprehensive HIV prevention in their communities in Brazil. This article describes the process of a three-stage sexual health promotion and HIV prevention initiative that used a multicultural human rights approach to intervention. Methods included 27 in-depth interviews with religious authorities on sexuality, AIDS prevention and human rights, and the training of 18 young people as research agents, who surveyed 177 youth on the same issues using self-administered questionnaires. The results, analysed using a rights-based perspective on health and the vulnerability framework, were discussed in day-long interfaith workshops. Emblematic of the collaborative process, these workshops are the focus of the analysis. Our findings suggest that this human rights framework is effective in increasing inter-religious tolerance and in providing a collective understanding of the sexuality and prevention needs of youth from different religious communities, and that it also serves as a platform for the expansion of state AIDS programmes based on secular principles.

Relevance: 100.00%

Abstract:

In this paper, a modeling technique for small-signal stability assessment of unbalanced power systems is presented. Since power distribution systems are inherently unbalanced, owing to the characteristics of their lines and loads, and the penetration of distributed generation into these systems is increasing, such a tool is needed to ensure their secure and reliable operation. The main contribution of this paper is the development of a phasor-based model for the study of dynamic phenomena in unbalanced power systems. Using an assumption on the net torque of the generator, it is possible to precisely define an equilibrium point for the phasor model of the system, thus enabling its linearization around this point and, consequently, its eigenvalue/eigenvector analysis for small-signal stability assessment. The modeling technique presented here was compared to the dynamic behavior observed in ATP simulations, and the results show that, for the generator and controller models used, the proposed modeling approach is adequate and yields reliable and precise results.
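The core idea of the eigenvalue-based assessment can be illustrated with a toy linearized system. The sketch below uses a classical single-machine swing-equation model with invented parameter values (H, D, Ks are assumptions, not from the paper): once the system is linearized around an equilibrium as x' = Ax, it is small-signal stable if and only if every eigenvalue of A has a negative real part.

```python
import numpy as np

# Toy single-machine example (illustrative values, not the paper's model):
# linearized swing equation x' = A x, with state x = [delta_dev, omega_dev].
H, D, Ks = 3.5, 2.0, 1.2          # inertia, damping, synchronizing torque (assumed)
ws = 2 * np.pi * 60               # synchronous speed, rad/s

A = np.array([
    [0.0,            ws],              # d(delta_dev)/dt = ws * omega_dev
    [-Ks / (2 * H), -D / (2 * H)],     # linearized swing equation
])

eigvals = np.linalg.eigvals(A)
# Small-signal stable iff every eigenvalue has a negative real part.
stable = bool(np.all(eigvals.real < 0))
print(eigvals, stable)
```

With positive damping the pair of complex eigenvalues sits in the left half-plane, so the toy system is reported as stable.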

Relevance: 100.00%

Abstract:

Traditional abduction imposes as a precondition the restriction that the background information may not derive the goal data. In first-order logic such a precondition is, in general, undecidable. To avoid this problem, we present a first-order cut-based abduction method that has KE-tableaux as its underlying inference system. This inference system allows for the automation of non-analytic proofs in a tableau setting, which permits a generalization of traditional abduction that avoids the undecidable precondition problem. After demonstrating the correctness of the method, we show how it can be dynamically iterated in a process that leads to the construction of non-analytic first-order proofs and, in some terminating cases, to refutations as well.
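The precondition at issue is entailment: abduction only makes sense when the background Γ does not already derive the goal G. In the propositional case this check is decidable by brute force, which the toy sketch below shows (the formula encoding as Python functions is my own illustration, not the paper's KE-tableaux machinery):

```python
from itertools import product

def entails(background, goal, atoms):
    """Check background |= goal by enumerating all truth assignments.
    Formulas are encoded as Python predicates over an assignment dict."""
    for values in product([False, True], repeat=len(atoms)):
        env = dict(zip(atoms, values))
        # A counter-model: background all true but goal false.
        if all(f(env) for f in background) and not goal(env):
            return False
    return True

# Background {p -> q} does not derive the goal q on its own, so the
# traditional-abduction precondition holds and an explanation (e.g. p)
# may be sought.
background = [lambda e: (not e["p"]) or e["q"]]
goal = lambda e: e["q"]
print(entails(background, goal, ["p", "q"]))  # False: precondition satisfied
```

In first-order logic no such exhaustive enumeration exists, which is exactly why the paper's cut-based method sidesteps the precondition instead of deciding it.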

Relevance: 100.00%

Abstract:

The present work shows a novel fractal dimension method for shape analysis. The proposed technique extracts descriptors from a shape by applying a multi-scale approach to the calculation of the fractal dimension. The fractal dimension is estimated by applying the curvature scale-space technique to the original shape. By applying a multi-scale transform to the calculation, we obtain a set of descriptors capable of describing the shape under investigation with high precision. We validate the computed descriptors in a classification process. The results demonstrate that the novel technique provides highly reliable descriptors, confirming the efficiency of the proposed method. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4757226]
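The paper estimates the fractal dimension via the curvature scale-space technique; as a simpler stand-in that shows the multi-scale idea, the sketch below uses box counting, fitting the log of the number of occupied boxes against the log of the inverse box size (the point set and scales are my own toy choices):

```python
import numpy as np

def box_counting_dimension(points, scales):
    """Estimate the fractal dimension of a 2-D point set by box counting:
    fit log N(s) against log(1/s), where N(s) is the number of occupied
    boxes of side s."""
    counts = []
    for s in scales:
        boxes = {tuple((p // s).astype(int)) for p in points}
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

# Sanity check: points on a straight segment should have dimension near 1.
t = np.linspace(0, 1, 2000)
line = np.column_stack([t, t])
dim = box_counting_dimension(line, scales=[0.2, 0.1, 0.05, 0.025])
print(round(dim, 2))
```

The slope of the log-log fit is the dimension estimate; richer shape contours would yield non-integer values, and computing the estimate at several scale ranges gives the kind of multi-scale descriptor vector the abstract describes.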

Relevance: 100.00%

Abstract:

Selective modulation of liver X receptor beta (LXR beta) has been recognized as an important approach to prevent or reverse the atherosclerotic process. In the present work, we have developed robust conformation-independent fragment-based quantitative structure-activity and structure-selectivity relationship models for a series of quinolines and cinnolines as potent modulators of the two LXR subtypes. The generated models were then used to predict the potency of an external test set, and the predicted values were in good agreement with the experimental results, indicating the potential of the models for untested compounds. The final 2D molecular recognition patterns obtained were integrated with 3D structure-based molecular modeling studies to provide useful insights into the chemical and structural determinants of increased LXR beta binding affinity and selectivity. (C) 2011 Elsevier Inc. All rights reserved.
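The essence of a fragment-based QSAR model is a regression from fragment-occurrence counts to potency. The sketch below is a minimal illustration with made-up fragments and activities (constructed to be exactly linear, so the fit recovers the coefficients); the real models in the paper are of course far richer:

```python
import numpy as np

# Toy fragment-count QSAR (synthetic data): each row counts occurrences
# of three hypothetical fragments in a molecule; y is a pIC50-like potency
# generated as y = 2*f1 + 0.5*f2 + 1*f3 + 1.
X = np.array([
    [2, 0, 1],
    [1, 1, 0],
    [0, 2, 1],
    [3, 0, 0],
    [1, 2, 2],
], dtype=float)
y = np.array([6.0, 3.5, 3.0, 7.0, 6.0])

# Fit activity = X @ w + b by least squares (bias folded in as a column).
Xb = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Predict an "external" compound with fragment counts [2, 1, 1].
pred = np.array([2.0, 1.0, 1.0, 1.0]) @ w
print(w, pred)  # recovers [2, 0.5, 1, 1]; prediction 6.5
```

Structure-selectivity models follow the same pattern with the difference (or ratio) of the two subtype potencies as the target.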

Relevance: 100.00%

Abstract:

Blood-brain barrier (BBB) permeation is an essential property for drugs that act in the central nervous system (CNS) for the treatment of human diseases such as epilepsy, depression, Alzheimer's disease, Parkinson's disease and schizophrenia, among others. In the present work, quantitative structure-property relationship (QSPR) studies were conducted for the development and validation of in silico models for the prediction of BBB permeation. The data set used has substantial chemical diversity and a relatively wide distribution of property values. The generated QSPR models showed good statistical parameters and were successfully employed for the prediction of a test set containing 48 compounds. The predictive models presented herein are useful in the identification, selection and design of new drug candidates with improved pharmacokinetic properties.
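A QSPR model of this kind maps molecular descriptors to the permeation measure (commonly logBB) and is judged on a held-out test set. The sketch below is a toy version with two assumed descriptors (logP and TPSA) and synthetic data generated from a known linear rule, so the external prediction can be checked exactly; the paper's descriptors and model form are not specified here:

```python
import numpy as np

# Hypothetical QSPR sketch: synthetic training data generated as
# logBB = 0.15*logP - 0.01*TPSA + 0.2 (an invented rule for illustration).
train = np.array([
    # logP,  TPSA,  logBB
    [2.0,   40.0,  0.100],
    [1.0,   90.0, -0.550],
    [3.5,   20.0,  0.525],
    [0.5,  120.0, -0.925],
    [4.0,   60.0,  0.200],
])
X = np.column_stack([train[:, 0], train[:, 1], np.ones(len(train))])
w, *_ = np.linalg.lstsq(X, train[:, 2], rcond=None)

# "External test set" of one compound: logP = 2.5, TPSA = 70.
pred = float(np.array([2.5, 70.0, 1.0]) @ w)
print(round(pred, 3))
```

Because the synthetic data follow the rule exactly, the fit recovers the coefficients and the external prediction matches the rule's value of -0.125; with real data the agreement on the test set is the validation statistic.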

Relevance: 100.00%

Abstract:

Exergetic analysis can provide useful information as it enables the identification of irreversible phenomena bringing about entropy generation and, therefore, exergy losses (also referred to as irreversibilities). As far as human thermal comfort is concerned, irreversibilities can be evaluated based on parameters related to both the occupant and his or her surroundings. To provide further insight into the exergetic analysis of thermal comfort, this paper calculates irreversibility rates for a sitting person wearing fairly light clothes and subjected to combinations of ambient air and mean radiant temperatures. The thermodynamic model framework relies on the so-called conceptual energy balance equation together with empirical correlations for the invoked thermoregulatory heat transfer rates, adapted for a clothed body. The results suggest that a minimum irreversibility rate may exist for particular combinations of the aforesaid surrounding temperatures. By separately considering the contribution of each thermoregulatory mechanism, the total irreversibility rate proved more responsive to either convective or radiative clothing-influenced heat transfers, with exergy losses becoming lower if the body is able to transfer more heat to the ambient via convection.
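The mechanism-by-mechanism accounting can be sketched with the Gouy-Stodola relation: the irreversibility of a heat rate Q flowing from a hot surface to a colder sink is T0 times the entropy generated. All numbers below (surface area, transfer coefficients, temperatures) are illustrative assumptions, not the paper's values:

```python
# Toy irreversibility-rate sketch (illustrative numbers, not the paper's model).
# Heat leaves a clothed body surface at T_cl toward air (convection) and
# toward the enclosure (radiation); each transfer generates entropy.

A = 1.8                 # body surface area, m^2 (typical adult, assumed)
h_c, h_r = 3.1, 4.7     # convective / radiative coefficients, W/(m^2 K) (assumed)
T_cl, T_a, T_mr = 305.0, 296.0, 294.0  # clothing, air, mean radiant temps, K
T0 = T_a                # ambient air taken as the reference (dead-state) temp

Q_conv = h_c * A * (T_cl - T_a)   # convective heat rate, W
Q_rad = h_r * A * (T_cl - T_mr)   # radiative heat rate, W

def irreversibility(Q, T_hot, T_cold, T0):
    """Gouy-Stodola: I = T0 * S_gen for heat Q flowing from T_hot to T_cold."""
    return T0 * Q * (1.0 / T_cold - 1.0 / T_hot)

I_total = (irreversibility(Q_conv, T_cl, T_a, T0)
           + irreversibility(Q_rad, T_cl, T_mr, T0))
print(round(Q_conv, 1), round(Q_rad, 1), round(I_total, 2))
```

Sweeping T_a and T_mr over a grid of combinations with a model of this shape is what lets one look for the minimum-irreversibility combination the abstract mentions.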

Relevance: 100.00%

Abstract:

The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
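An event sequence graph is just a directed graph of observable events whose edges are the allowed successions; positive tests are walks that cover every edge, while negative tests exercise forbidden event pairs. The sketch below uses a hypothetical booking composition (the event names and the greedy cover strategy are my own illustration, not the paper's tooling):

```python
# Minimal event-sequence-graph sketch with hypothetical composition events.
esg = {
    "start":   ["reserve"],
    "reserve": ["pay", "cancel"],
    "pay":     ["confirm"],
    "cancel":  ["end"],
    "confirm": ["end"],
    "end":     [],
}

def edge_covering_sequences(graph, entry="start", exit_="end"):
    """Greedily build entry-to-exit walks until every edge is covered."""
    uncovered = {(u, v) for u, vs in graph.items() for v in vs}
    sequences = []
    while uncovered:
        node, walk = entry, [entry]
        while node != exit_:
            succs = graph[node]
            # Prefer an uncovered outgoing edge; otherwise take any successor.
            nxt = next((v for v in succs if (node, v) in uncovered), succs[0])
            uncovered.discard((node, nxt))
            walk.append(nxt)
            node = nxt
        sequences.append(walk)
    return sequences

def faulty_event_pairs(graph):
    """Event pairs the ESG forbids: candidates for negative tests."""
    return [(u, v) for u in graph for v in graph
            if v not in graph[u] and u != "end"]

tests = edge_covering_sequences(esg)
print(tests)
```

Here two walks cover all six edges, and pairs such as ("pay", "cancel") become negative tests that probe the composition's behavior in undesirable situations.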

Relevance: 90.00%

Abstract:

Rare variants are becoming the new candidates in the search for genetic variants that predispose individuals to a phenotype of interest. Their low prevalence in a population requires the development of dedicated detection and analytical methods. A family-based approach could greatly enhance their detection and interpretation, because rare variants are nearly family-specific. In this report, we test several distinct approaches for analyzing the information provided by rare and common variants and how they can be effectively used to pinpoint putative candidate genes for follow-up studies. The analyses were performed on the mini-exome data set provided by Genetic Analysis Workshop 17. Eight approaches were tested, four using the trait's heritability estimates and four using QTDT models. These methods were compared in terms of sensitivity, specificity, and positive and negative predictive values in light of the simulation parameters. Our results highlight important limitations of current methods for dealing with rare and common variants: all methods presented reduced specificity and were, consequently, prone to false-positive associations. Methods analyzing common-variant information showed enhanced sensitivity compared to rare-variant methods. Furthermore, our limited knowledge of the use of biological databases for gene annotations, possibly for use as covariates in regression models, imposes a barrier to further research.
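The comparison metrics are the standard confusion-matrix quantities, computed by matching each method's flagged genes against the genes the simulation made causal. A minimal sketch, with invented gene sets for illustration:

```python
def screening_metrics(flagged, causal, universe):
    """Sensitivity, specificity, PPV and NPV for a set of genes flagged
    by an association method, given the truly causal genes."""
    tp = len(flagged & causal)
    fp = len(flagged - causal)
    fn = len(causal - flagged)
    tn = len(universe - flagged - causal)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical example: 100 genes, 10 truly causal, a method flags 20
# of which only 4 are real hits.
universe = {f"g{i}" for i in range(100)}
causal = {f"g{i}" for i in range(10)}
flagged = {f"g{i}" for i in range(4)} | {f"g{i}" for i in range(50, 66)}
m = screening_metrics(flagged, causal, universe)
print(m)
```

In this toy case the PPV of 0.2 reflects exactly the failure mode the abstract reports: low specificity methods produce many false-positive associations even at moderate sensitivity.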

Relevance: 90.00%

Abstract:

Abstract Background Over recent years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks were designed around the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring mainly two technical skills: (i) knowing the syntax details of the programming language used to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only begin once development reaches the implementation phase, preventing it from starting earlier. Method To solve these problems, we present in this paper a model-based approach for reusing CFs that shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically. Results We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions.
The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort of maintaining the applications. Conclusion Using the approach presented herein, it was possible to conclude the following: (i) the instantiation of CFs can be automated, and (ii) developer productivity is improved when a model-based instantiation approach is used.
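The mechanics of "fill in the Reuse Model, get the reuse code generated" can be sketched as simple template instantiation. Everything below is hypothetical: the model fields, the `PersistenceCF` name and the aspect-like output syntax are invented stand-ins for whatever the real tooling emits.

```python
# Toy sketch of model-driven CF instantiation: the application engineer fills
# a plain-data "Reuse Model" (no framework internals needed), and the reuse
# code is generated from a template. All names here are invented.

reuse_model = {
    "application_class": "Customer",
    "persistent_fields": ["name", "email"],
    "id_field": "customerId",
}

TEMPLATE = """\
// generated reuse code (hypothetical aspect syntax)
aspect {app}Persistence extends PersistenceCF {{
    declare @type: {app}: @Persistent;
    pointcut idField(): get(* {app}.{id});
    pointcut persistentState(): {fields};
}}"""

def generate_reuse_code(model):
    """Expand the template with the engineer-supplied model values."""
    fields = " || ".join(
        f"set(* {model['application_class']}.{f})"
        for f in model["persistent_fields"])
    return TEMPLATE.format(app=model["application_class"],
                           id=model["id_field"], fields=fields)

print(generate_reuse_code(reuse_model))
```

The point of the design is visible even in this sketch: the engineer only supplies application-level facts (class and field names), while the framework-specific nomenclature lives in the template, which is what shields them from the white-box details.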