9 results for interdisciplinary approach
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Promoting the inclusion of students with disabilities in e-learning systems has brought many challenges for researchers and educators. The use of synchronous communication tools such as interactive whiteboards has been regarded as an obstacle to inclusive education. In this paper, we propose an inclusive approach that enables blind students to participate in live learning sessions with whiteboard software. The approach is based on the provision of accessible textual descriptions by a live mediator. With these descriptions, students can navigate through the elements and explore the content of the class using screen readers. The method used for this study consisted of the implementation of a software prototype within a virtual learning environment and a case study with the participation of a blind student in a live distance class. The results from the case study show that this approach can be very effective and may be a starting point to provide blind students with resources they had previously been deprived of. The proof of concept implemented has shown that many further possibilities may be explored to enhance the interaction of blind users with educational content in whiteboards, and further pedagogical approaches can be investigated from this proposal. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Increasing efforts exist in integrating different levels of detail in models of the cardiovascular system. For instance, one-dimensional representations are employed to model the systemic circulation. In this context, effective, black-box-type decomposition strategies for one-dimensional networks are needed so as to: (i) employ domain decomposition strategies for large systemic models (1D-1D coupling) and (ii) provide the conceptual basis for dimensionally-heterogeneous representations (1D-3D coupling, among other possibilities). The strategy proposed in this article works in both scenarios, though the several applications shown to illustrate its performance focus on the 1D-1D coupling case. A one-dimensional network is decomposed in such a way that each coupling point connects exactly two of the sub-networks. At each of the M connection points two unknowns are defined: the flow rate and the pressure. These 2M unknowns are determined by 2M equations, since each sub-network provides one (non-linear) equation per coupling point. It is shown how to build the 2M x 2M non-linear system with an arbitrary and independent choice of boundary conditions for each of the sub-networks. The idea is then to solve this non-linear system to convergence, which guarantees strong coupling of the complete network. In other words, if the non-linear solver converges at each time step, the solution coincides with what would be obtained by monolithically modeling the whole network. The decomposition thus imposes no stability restriction on the choice of the time step size. Effective iterative strategies for the non-linear system that preserve the black-box character of the decomposition are then explored. Several variants of matrix-free Broyden's and Newton-GMRES algorithms are assessed as numerical solvers by comparing their performance on sub-critical wave propagation problems which range from academic test cases to realistic cardiovascular applications.
A specific variant of Broyden's algorithm is identified and recommended on the basis of its computational cost and reliability. (C) 2010 Elsevier B.V. All rights reserved.
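To make the black-box coupling idea concrete, the following is a minimal sketch of a Broyden iteration driving a coupling residual F(x) = 0 to zero. The two-equation residual stands in for two hypothetical sub-networks sharing one connection point; the model and numbers are illustrative, not the paper's:

```python
import numpy as np

def fd_jacobian(F, x, h=1e-6):
    """One-time finite-difference estimate of the Jacobian to seed Broyden."""
    f0 = F(x)
    J = np.zeros((len(f0), len(x)))
    for j in range(len(x)):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (F(xp) - f0) / h
    return J

def broyden_solve(F, x0, tol=1e-10, max_iter=50):
    """Broyden's 'good' method: solve F(x) = 0 treating each residual
    evaluation as a black box; the Jacobian approximation is corrected
    by rank-one secant updates instead of being re-assembled."""
    x = np.asarray(x0, dtype=float)
    f = F(x)
    B = fd_jacobian(F, x)
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        dx = np.linalg.solve(B, -f)
        x_new = x + dx
        f_new = F(x_new)
        df = f_new - f
        B += np.outer(df - B @ dx, dx) / (dx @ dx)   # rank-one secant update
        x, f = x_new, f_new
    return x

# Hypothetical coupling point with unknowns (Q, P) = (flow rate, pressure);
# each sub-network contributes one non-linear equation.
def residual(z):
    Q, P = z
    return np.array([P - (1.0 + 0.1 * Q**2),   # "sub-network 1"
                     Q - (2.0 - 0.5 * P)])     # "sub-network 2"

sol = broyden_solve(residual, np.array([1.0, 1.0]))
```

With M connection points the same loop applies to the 2M-vector of flows and pressures; a Newton-GMRES variant would instead approximate Jacobian-vector products from residual differences, keeping the same black-box character.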
Abstract:
In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models that ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
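For orientation, a common parameterization of the negative binomial long-term survival model (a sketch; the paper's exact notation may differ) gives population survival S_pop(t) = [1 + alpha*eta*F(t)]^(-1/alpha), where F is the cdf of the time to the event for a single cause, eta the mean number of competing causes, and alpha the dispersion; alpha -> 0 recovers the Poisson (promotion time) model exp(-eta*F(t)):

```python
import math

def s_pop(t, eta, alpha, base_cdf):
    """Population survival under negative binomial competing causes
    (illustrative parameterization, not necessarily the paper's)."""
    F = base_cdf(t)
    if abs(alpha) < 1e-12:                 # Poisson / promotion-time limit
        return math.exp(-eta * F)
    return (1.0 + alpha * eta * F) ** (-1.0 / alpha)

def cured_fraction(eta, alpha):
    """p0 = lim_{t -> inf} S_pop(t): the proportion never experiencing the event."""
    if abs(alpha) < 1e-12:
        return math.exp(-eta)
    return (1.0 + alpha * eta) ** (-1.0 / alpha)

# Example with a unit-rate exponential baseline cdf (synthetic choice):
F_exp = lambda t: 1.0 - math.exp(-t)
```

In the GAMLSS setting it is the cured fraction (through a link function) that is regressed on covariates; the gamlss package then handles estimation.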
Abstract:
Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper, the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
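The column generation machinery such formulations rely on can be illustrated by its pricing step: given the dual prices of the master LP, the most attractive new cutting pattern is an unbounded-knapsack solution. A minimal sketch, assuming the classical Gilmore-Gomory single-stock-size setting (piece lengths, roll length, and duals below are made-up numbers):

```python
def price_pattern(duals, piece_lengths, roll_length):
    """Pricing subproblem of Gilmore-Gomory column generation: find the
    cutting pattern maximizing total dual value via an unbounded-knapsack
    dynamic program over the roll capacity."""
    best = [0.0] * (roll_length + 1)        # best dual value at capacity c
    choice = [None] * (roll_length + 1)     # piece type chosen at capacity c
    for c in range(1, roll_length + 1):
        for i, L in enumerate(piece_lengths):
            if L <= c and best[c - L] + duals[i] > best[c]:
                best[c] = best[c - L] + duals[i]
                choice[c] = i
    # Recover the pattern (multiplicity of each piece type).
    pattern = [0] * len(piece_lengths)
    c = roll_length
    while choice[c] is not None:
        i = choice[c]
        pattern[i] += 1
        c -= piece_lengths[i]
    return pattern, best[roll_length]

# Synthetic example: a roll of length 10, piece lengths 3 and 5,
# with hypothetical dual prices 0.4 and 0.7 from the master LP.
pattern, value = price_pattern([0.4, 0.7], [3, 5], 10)
```

A pattern enters the master problem whenever its total dual value exceeds 1 (its reduced cost is then negative); the method stops when no such pattern exists.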
Abstract:
This paper presents an automatic method to detect and classify weathered aggregates by assessing changes in color and texture. The method allows the extraction of aggregate features from images and their automatic classification based on surface characteristics. The concept of entropy is used to extract features from digital images. An analysis of the use of this concept is presented, and two classification approaches, based on neural network architectures, are proposed. The classification performance of the proposed approaches is compared to the results obtained by other algorithms commonly considered for classification purposes. The obtained results confirm that the presented method strongly supports the detection of weathered aggregates.
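As a simple illustration of entropy as an image feature (a sketch; the abstract does not specify the paper's actual descriptors or preprocessing), the Shannon entropy of the gray-level histogram measures how spread-out the intensities are, which is one way texture changes from weathering could register:

```python
import math
from collections import Counter

def image_entropy(pixels):
    """Shannon entropy (bits) of the gray-level histogram
    of a flattened image patch."""
    n = len(pixels)
    counts = Counter(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A perfectly uniform patch has entropy 0; richer textures score higher, and scalar features of this kind can be fed directly to a neural network classifier.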
Structure-Based Approach for the Study of Estrogen Receptor Binding Affinity and Subtype Selectivity
Abstract:
Estrogens exert important physiological effects through the modulation of two human estrogen receptor (hER) subtypes, alpha (hER alpha) and beta (hER beta). Because the levels and relative proportions of hER alpha and hER beta differ significantly in different target cells, selective hER ligands could target specific tissues or pathways regulated by one receptor subtype without affecting the other. To understand the structural and chemical basis by which small-molecule modulators are able to discriminate between the two subtypes, we have applied three-dimensional target-based approaches employing a series of potent hER ligands. Comparative molecular field analysis (CoMFA) studies were applied to a data set of 81 hER modulators, for which binding affinity values were collected for both hER alpha and hER beta. Significant statistical coefficients were obtained (hER alpha, q^2 = 0.76; hER beta, q^2 = 0.70), indicating the internal consistency of the models. The generated models were validated using external test sets, and the predicted values were in good agreement with the experimental results. Five hER crystal structures were used in GRID/PCA investigations to generate molecular interaction field (MIF) maps. hER alpha and hER beta were separated using one factor. The resulting 3D information was integrated with the aim of revealing the most relevant structural features involved in hER subtype selectivity. The final QSAR and GRID/PCA models and the information gathered from the 3D contour maps should be useful for the design of novel hER modulators with improved selectivity.
Abstract:
Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed with BACE-1, as well as five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial charge calculation method was also analyzed. Several consistent and predictive models are reported, including one with r^2 = 0.88, q^2 = 0.69, and predictive r^2 = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. The visual analyses of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
A robust Bayesian approach to null intercept measurement error model with application to dental data
Abstract:
Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always be correct. The skew-normal/independent distributions are a class of asymmetric, thick-tailed distributions that includes the skew-normal distribution as a special case. In this paper, we explore the use of the skew-normal/independent distribution in the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal distribution, the skew-t distributions, the skew-slash distributions, and the skew contaminated normal distributions. The method developed is illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
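For readers new to the class, the standard (Azzalini) skew-normal density underlying it is f(x) = 2*phi(x)*Phi(lambda*x), where phi and Phi are the standard normal pdf and cdf and lambda controls asymmetry; a quick sketch:

```python
import math

def skew_normal_pdf(x, lam):
    """Standard skew-normal density f(x) = 2 * phi(x) * Phi(lam * x);
    lam = 0 recovers the standard normal density."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(lam * x / math.sqrt(2.0)))
    return 2.0 * phi * Phi
```

The skew-normal/independent family is then obtained by mixing over a positive scale variable, which yields the skew-t, skew-slash, and skew contaminated normal members mentioned above.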
Abstract:
The substitution of petroleum-based fuels with those from renewable sources has gained momentum worldwide. A UV-vis experiment for the quantitative analysis of biofuels (bioethanol or biodiesel) in (petroleum-based) diesel oil has been developed. Before the experiment, students were given a quiz on biofuels and were then asked to suggest a suitable UV-vis experiment for the quantification of biofuels in diesel oil. After discussing the results of the quiz, the experiment was conducted. This included the determination of the λmax of the medium-dependent, that is, solvatochromic, visible absorption band of the probe 2,6-bis[4-(tert-butyl)phenyl]-4-{2,4,6-tris[4-(tert-butyl)phenyl]pyridinium-1-yl}phenolate as a function of fuel composition. The students appreciated that the subject was linked to a daily situation and that they were asked to suggest the experiment. This experiment served to introduce the phenomena of solvation and solvatochromism.