983 results for EXPLICIT
Abstract:
In the protein folding problem, solvent-mediated forces are commonly represented by intra-chain pairwise contact energies. Although this approximation has proven useful in several circumstances, it is limited in other aspects of the problem. Here we show that it is possible to construct two models of the chain-solvent system, one with implicit and the other with explicit solvent, such that both reproduce the same thermodynamic results. First, lattice models treated by analytical methods were used to show that the implicit and explicit representations of solvent effects can be energetically equivalent only if local solvent properties are invariant in time and space. Next, applying the same reasoning used for the lattice models, two inter-consistent Monte Carlo off-lattice models with implicit and explicit solvent are constructed, where now, in the latter, the solvent properties are allowed to fluctuate. It is then shown that the chain configurational evolution, as well as the equilibrium globule conformation, are significantly distinct for the implicit- and explicit-solvent systems. Indeed, in strong contrast with the implicit-solvent version, the explicit-solvent model predicts: (i) a malleable globule, in agreement with the estimated large protein-volume fluctuations; (ii) thermal conformational stability, resembling the conformational heat resistance of globular proteins, whose radii of gyration are practically insensitive to thermal effects over a relatively wide range of temperatures; and (iii) smaller radii of gyration at higher temperatures, indicating that the chain conformational entropy in the unfolded state is significantly smaller than that estimated from random-coil configurations. Finally, we comment on the meaning of these results for the understanding of the folding process. (C) 2009 Elsevier B.V. All rights reserved.
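A minimal sketch of the implicit-solvent ingredient described above (an intra-chain pairwise contact energy sampled by Metropolis Monte Carlo on a lattice) might look as follows. The 2D chain, end-move set, and contact energy `eps` are illustrative assumptions, not the authors' actual model:

```python
import math
import random

def contacts(chain):
    """Count nonbonded nearest-neighbour contacts; implicit-solvent energy is -eps per contact."""
    occupied = set(chain)
    count = 0
    for i, (x, y) in enumerate(chain):
        for dx, dy in ((1, 0), (0, 1)):            # check each lattice pair once
            nb = (x + dx, y + dy)
            if nb in occupied and abs(i - chain.index(nb)) > 1:
                count += 1
    return count

def metropolis_end_move(chain, beta, eps=1.0):
    """Attempt one end move and accept/reject it with the Metropolis criterion."""
    new = list(chain)
    end = 0 if random.random() < 0.5 else len(chain) - 1
    anchor = new[1] if end == 0 else new[-2]
    dx, dy = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
    trial = (anchor[0] + dx, anchor[1] + dy)
    if trial in new:                               # keep the walk self-avoiding
        return chain
    new[end] = trial
    delta_e = -eps * (contacts(new) - contacts(chain))
    if delta_e <= 0 or random.random() < math.exp(-beta * delta_e):
        return new
    return chain

random.seed(0)
chain = [(i, 0) for i in range(8)]                 # straight initial conformation
for _ in range(500):
    chain = metropolis_end_move(chain, beta=1.0)
```

An explicit-solvent variant would add solvent degrees of freedom on the empty lattice sites, which is exactly the fluctuation the abstract argues cannot be absorbed into a fixed contact energy.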
Abstract:
We present a fast method for finding optimal parameters for a low-resolution (threading) force field intended to distinguish correct from incorrect folds for a given protein sequence. In contrast to other methods, the parameterization uses information from more than 10^7 misfolded structures as well as a set of native sequence-structure pairs. In addition to testing the resulting force field's performance on the protein sequence threading problem, results are shown that characterize the number of parameters necessary for effective structure recognition.
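The core idea of fitting a force field so that the native structure scores below a large decoy set can be caricatured with a perceptron-style update on contact-type features. The feature vectors, margin, and learning rate below are invented for illustration; the paper's actual optimization and its 10^7-structure decoy set are far larger:

```python
import numpy as np

# Hypothetical contact-type counts (HH, HP, PP) for one native fold and a few decoys.
native = np.array([6.0, 1.0, 1.0])
decoys = np.array([[2.0, 4.0, 3.0],
                   [1.0, 5.0, 3.0],
                   [3.0, 3.0, 2.0]])

def train_weights(native, decoys, margin=1.0, eta=1.0, epochs=100):
    """Perceptron-like fit: push the native energy w.phi below every decoy energy."""
    w = np.zeros(native.shape)
    for _ in range(epochs):
        updated = False
        for d in decoys:
            if w @ native - w @ d > -margin:       # separation constraint violated
                w += eta * (d - native)            # lower native energy relative to decoy
                updated = True
        if not updated:
            break                                  # every decoy separated by the margin
    return w

w = train_weights(native, decoys)
```

Counting how many feature weights are actually needed before recognition degrades is, in miniature, the parameter-count question the abstract raises.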
Abstract:
1. Although population viability analysis (PVA) is widely employed, forecasts from PVA models are rarely tested. This study in a fragmented forest in southern Australia contrasted field data on patch occupancy and abundance for the arboreal marsupial greater glider Petauroides volans with predictions from a generic spatially explicit PVA model. This work represents one of the first landscape-scale tests of its type. 2. Initially we contrasted field data from a set of eucalypt forest patches totalling 437 ha with a naive null model in which forecasts of patch occupancy were made, assuming no fragmentation effects and based simply on remnant area and measured densities derived from nearby unfragmented forest. The naive null model predicted an average total of approximately 170 greater gliders, considerably greater than the true count (n = 81). 3. Congruence was examined between field data and predictions from PVA under several metapopulation modelling scenarios. The metapopulation models performed better than the naive null model. Logistic regression showed highly significant positive relationships between predicted and actual patch occupancy for the four scenarios (P = 0.001-0.006). When the model-derived probability of patch occupancy was high (0.50-0.75, 0.75-1.00), there was greater congruence between actual patch occupancy and the predicted probability of occupancy. 4. For many patches, probability distribution functions indicated that model predictions for animal abundance in a given patch were not outside those expected by chance. However, for some patches the model either substantially over-predicted or under-predicted actual abundance. Some important processes, such as inter-patch dispersal, that influence the distribution and abundance of the greater glider may not have been adequately modelled. 5. Additional landscape-scale tests of PVA models, on a wider range of species, are required to assess further predictions made using these tools. 
This will help determine those taxa for which predictions are and are not accurate and give insights for improving models for applied conservation management.
Abstract:
Corporate portals, enabled by information and communication technology tools, provide the integration of heterogeneous data originating from internal information systems, made available for access and sharing by the interested community. They can be considered an important instrument for the evaluation of explicit knowledge in the organization, since they allow faster and safer information exchanges, enabling a healthy collaborative environment. In the specific case of major Brazilian universities, corporate portals assume a fundamental role, since they offer an enormous variety and amount of information and knowledge, owing to the multiplicity of their activities. This study aims to point out important aspects of the explicit knowledge expressed by the surveyed universities through the analysis of the content offered in their corporate portals. This is an exploratory study made through direct observation of the contents available in the corporate portals of two public universities as well as three private ones. A comparative analysis of the contents of these portals was carried out; it can be useful to evaluate their use as a factor for optimizing the explicit knowledge generated in the university. As a result, important differences could be verified in the composition and in the content of the corporate portals of the public universities compared to the private institutions. The main differences concern the kind of services offered and the destination of the information, which target different audiences.
It could also be concluded that the surveyed private universities focus on processes related to serving students, supporting courses, and spreading information to the public interested in joining the institution, whereas the analysed public universities prioritize more specific information, directed to the dissemination of research developed internally or with institutional objectives.
Abstract:
Reaction between 5-(4-amino-2-thiabutyl)-5-methyl-3,7-dithianonane-1,9-diamine (N3S3) and 5'-methyl-2,2'-bipyridine-5-carbaldehyde and subsequent reduction of the resulting imine with sodium borohydride results in a potentially ditopic ligand (L). Treatment of L with one equivalent of an iron(II) salt led to the monoprotonated complex [Fe(HL)](3+), isolated as the hexafluorophosphate salt. The presence of characteristic bands for the tris(bipyridyl)iron(II) chromophore in the UV/vis spectrum indicated that the iron(II) atom is coordinated octahedrally by the three bipyridyl (bipy) groups. The [Fe(bipy)3] moiety encloses a cavity composed of the N3S3 portion of the ditopic ligand. The mononuclear and monomeric nature of the complex [Fe(HL)](3+) has also been established by accurate mass analysis. [Fe(HL)](3+) displays reduced stability to base compared with the complex [Fe(bipy)3](2+). In aqueous solution [Fe(HL)](3+) exhibits irreversible electrochemical behaviour, with an oxidation wave ca. 60 mV more positive in potential than [Fe(bipy)3](2+). Investigations of the interaction of [Fe(L)](2+) with copper(II), iron(II), and mercury(II) using mass spectrometric and potentiometric methods suggested that, where complexation occurred, fewer than six of the N3S3 cavity donors were involved. The high affinity of the complex [Fe(L)](2+) for protons is one reason suggested to contribute to its reluctance to coordinate a second metal ion.
Abstract:
We detail the automatic construction of R matrices corresponding to (the tensor products of) the (O-m\alpha(n)) families of highest-weight representations of the quantum superalgebras Uq[gl(m|n)]. These representations are irreducible, contain a free complex parameter alpha, and are 2^(mn)-dimensional. Our R matrices are actually (sparse) rank-4 tensors, containing a total of 2^(4mn) components, each of which is in general an algebraic expression in the two complex variables q and alpha. Although the constructions are straightforward, we describe them in full here, to fill a perceived gap in the literature. As the algorithms are generally impracticable for manual calculation, we have implemented the entire process in MATHEMATICA, illustrating our results with Uq[gl(3|1)]. (C) 2002 Published by Elsevier Science B.V.
Abstract:
Most finite element packages use the Newmark algorithm for time integration of structural dynamics. Various algorithms have been proposed to better optimize the high frequency dissipation of this algorithm. Hulbert and Chung proposed both implicit and explicit forms of the generalized alpha method. The algorithms optimize high frequency dissipation effectively, and despite recent work on algorithms that possess momentum conserving/energy dissipative properties in a non-linear context, the generalized alpha method remains an efficient way to solve many problems, especially with adaptive timestep control. However, the implicit and explicit algorithms use incompatible parameter sets and cannot be used together in a spatial partition, whereas this can be done for the Newmark algorithm, as Hughes and Liu demonstrated, and for the HHT-alpha algorithm developed from it. The present paper shows that the explicit generalized alpha method can be rewritten so that it becomes compatible with the implicit form. All four algorithmic parameters can be matched between the explicit and implicit forms. An element interface between implicit and explicit partitions can then be used, analogous to that devised by Hughes and Liu to extend the Newmark method. The stability of the explicit/implicit algorithm is examined in a linear context and found to exceed that of the explicit partition. The element partition is significantly less dissipative of intermediate frequencies than one using the HHT-alpha method. The explicit algorithm can also be rewritten so that the discrete equation of motion evaluates forces from displacements and velocities found at the predicted mid-point of a cycle. Copyright (C) 2003 John Wiley & Sons, Ltd.
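For reference, the baseline Newmark scheme the abstract starts from can be sketched for a single-degree-of-freedom oscillator. The average-acceleration parameters (beta = 1/4, gamma = 1/2) and the test problem below are illustrative choices, not the generalized-alpha parameter sets the paper matches:

```python
import math

def newmark_step(u, v, a, dt, m, c, k, f, beta=0.25, gamma=0.5):
    """One Newmark step for m*a + c*v + k*u = f (average acceleration by default)."""
    u_pred = u + dt * v + 0.5 * dt**2 * (1.0 - 2.0 * beta) * a   # displacement predictor
    v_pred = v + dt * (1.0 - gamma) * a                          # velocity predictor
    a_new = (f - c * v_pred - k * u_pred) / (m + gamma * dt * c + beta * dt**2 * k)
    u_new = u_pred + beta * dt**2 * a_new                        # corrector
    v_new = v_pred + gamma * dt * a_new
    return u_new, v_new, a_new

# Undamped unit oscillator; the exact solution is u(t) = cos(t).
m, c, k, dt = 1.0, 0.0, 1.0, 0.01
u, v = 1.0, 0.0
a = (0.0 - c * v - k * u) / m                                    # consistent initial acceleration
for _ in range(100):                                             # integrate to t = 1
    u, v, a = newmark_step(u, v, a, dt, m, c, k, 0.0)
```

With beta = 1/4 and gamma = 1/2 this scheme is unconditionally stable and non-dissipative; the generalized-alpha family modifies where forces and accelerations are evaluated to add controllable high-frequency dissipation.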
Abstract:
Image segmentation is a ubiquitous task in medical image analysis, which is required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often requires user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to efficiently interact with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application where user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, thus contributing to higher segmentation accuracy, while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, where the user input is mapped to a non-Cartesian space while this information is used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves the interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy, as well as in terms of total analysis time. This contributes to a more efficient use of the existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire-based algorithm.
Abstract:
OBJECTIVE: To develop an instrument to assess discrimination effects on health outcomes and behaviors, capable of distinguishing harmful differential treatment effects from their interpretation as discriminatory events. METHODS: Successive versions of an instrument were developed based on a systematic review of instruments assessing racial discrimination, focus groups and review by a panel comprising seven experts. The instrument was refined using cognitive interviews and pilot-testing. The final version of the instrument was administered to 424 undergraduate college students in the city of Rio de Janeiro, Southeastern Brazil, in 2010. Structural dimensionality, two types of reliability and construct validity were analyzed. RESULTS: Exploratory factor analysis corroborated the hypothesis of the instrument's unidimensionality, and seven experts verified its face and content validity. The internal consistency was 0.8, and test-retest reliability was higher than 0.5 for 14 out of 18 items. The overall score was higher among socially disadvantaged individuals and correlated with adverse health behaviors/conditions, particularly when differential treatments were attributed to discrimination. CONCLUSIONS: These findings indicate the validity and reliability of the instrument developed. The proposed instrument enables the investigation of novel aspects of the relationship between discrimination and health.
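The internal-consistency figure quoted above (0.8) is conventionally Cronbach's alpha, computed from the item variances and the variance of the summed score. A hedged illustration on an invented item-response matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed score
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 4 items on a 0-3 scale (invented, strongly correlated items).
responses = np.array([[0, 1, 0, 1],
                      [1, 1, 1, 2],
                      [2, 2, 1, 2],
                      [2, 3, 2, 3],
                      [3, 3, 3, 3]])
alpha = cronbach_alpha(responses)
```

When all items measure the same construct perfectly (identical columns), the statistic reaches 1, consistent with the unidimensionality the factor analysis above corroborated.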
Abstract:
After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:
- WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
- A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
- An autoepistemic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
- A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the semantics of abduction coincides with WFSX.
- O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are applicable as well to the well-founded semantics of normal logic programs.
- Two approaches for dealing with the contradiction that may appear once explicit negation is introduced into logic programs, together with a proof of their equivalence. One approach consists in avoiding contradiction, and is based on restrictions in the adoption of abductive hypotheses. The other consists in removing contradiction, and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for contradiction.
Abstract:
The IEEE 802.15.4 standard provides appealing features to simultaneously support real-time and non-real-time traffic, but it is only capable of supporting real-time communications from at most seven devices. Additionally, it cannot guarantee delay bounds lower than the superframe duration. Motivated by this problem, in this paper we propose an Explicit Guaranteed time slot Sharing and Allocation scheme (EGSA) for beacon-enabled IEEE 802.15.4 networks. This scheme is capable of providing tighter delay bounds for real-time communications by splitting the Contention-Free Period (CFP) into smaller mini time slots and by means of a new guaranteed bandwidth allocation scheme for a set of devices with periodic messages. At the same time, the novel bandwidth allocation scheme can maximize the duration of the CFP for non-real-time communications. Performance analysis results show that the EGSA scheme works efficiently and outperforms competitor schemes both in terms of guaranteed delay and bandwidth utilization.
Abstract:
In this work, we present the explicit series solution of a specific mathematical model from the literature, the Deng bursting model, which mimics the glucose-induced electrical activity of pancreatic beta-cells (Deng, 1993). For this purpose, we use a technique developed to find analytic approximate solutions for strongly nonlinear problems. This analytical algorithm involves an auxiliary parameter which provides an efficient way to ensure rapid and accurate convergence to the exact solution of the bursting model. Using the homotopy solution, we investigate the dynamical effect of a biologically meaningful bifurcation parameter rho, which increases with the glucose concentration. Our analytical results are found to be in excellent agreement with the numerical ones. This work provides an illustration of how our understanding of biophysically motivated models can be directly enhanced by the application of a new analytic method.
Abstract:
Given an algebraic curve in the complex affine plane, we describe how to determine all planar polynomial vector fields which leave this curve invariant. If all (finite) singular points of the curve are nondegenerate, we give an explicit expression for these vector fields. In the general setting we provide an algorithmic approach, and as an alternative we discuss sigma processes.
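The invariance notion behind this abstract is that a polynomial vector field X = P ∂/∂x + Q ∂/∂y leaves the curve f = 0 invariant exactly when P f_x + Q f_y = K f for some polynomial cofactor K. A small sympy check on a classical example (the particular f, P, Q below are my choice, not taken from the paper):

```python
import sympy as sp

x, y = sp.symbols('x y')

def invariance_cofactor(f, P, Q):
    """Return the cofactor K if P*f_x + Q*f_y = K*f, else None."""
    Xf = sp.expand(P * sp.diff(f, x) + Q * sp.diff(f, y))
    K, r = sp.div(Xf, f, x, y)      # polynomial division of X(f) by f
    return sp.expand(K) if r == 0 else None

# The unit circle as an invariant curve of a planar polynomial field.
f = x**2 + y**2 - 1
P = -y + x * (1 - x**2 - y**2)
Q = x + y * (1 - x**2 - y**2)
K = invariance_cofactor(f, P, Q)    # here X(f) = -2*(x**2 + y**2)*f
```

When the remainder is nonzero the curve is not invariant, which is the membership test underlying the search for all vector fields admitting a given invariant curve.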