973 results for extended Hildebrand solubility approach


Relevance: 30.00%

Publisher:

Abstract:

HIV virulence, i.e. the time of progression to AIDS, varies greatly among patients. As for other rapidly evolving pathogens of humans, it is difficult to know if this variance is controlled by the genotype of the host or that of the virus because the transmission chain is usually unknown. We apply the phylogenetic comparative approach (PCA) to estimate the heritability of a trait from one infection to the next, which indicates the control of the virus genotype over this trait. The idea is to use viral RNA sequences obtained from patients infected by HIV-1 subtype B to build a phylogeny, which approximately reflects the transmission chain. Heritability is measured statistically as the propensity for patients close in the phylogeny to exhibit similar infection trait values. The approach reveals that up to half of the variance in set-point viral load, a trait associated with virulence, can be heritable. Our estimate is significant and robust to noise in the phylogeny. We also check for the consistency of our approach by showing that a trait related to drug resistance is almost entirely heritable. Finally, we show the importance of taking into account the transmission chain when estimating correlations between infection traits. The fact that HIV virulence is, at least partially, heritable from one infection to the next has clinical and epidemiological implications. The difference between earlier studies and ours comes from the quality of our dataset and from the power of the PCA, which can be applied to large datasets and accounts for within-host evolution. The PCA opens new perspectives for approaches linking clinical data and evolutionary biology because it can be extended to study other traits or other infectious diseases.
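
As a minimal illustration of the core statistical idea, that patients close in the phylogeny exhibit similar trait values, the Python sketch below correlates pairwise patristic distance with trait dissimilarity and assesses it with a permutation (Mantel-style) test. The tree distances and set-point viral load values are hypothetical toy data, and this is only a rough proxy for phylogenetic signal, not the heritability estimator used in the study.

```python
import numpy as np

def mantel_signal(patristic, trait, n_perm=999, seed=0):
    """Correlation between pairwise phylogenetic distance and trait dissimilarity.

    A clearly positive correlation means that patients close in the phylogeny
    tend to carry similar trait values (a rough proxy for phylogenetic signal).
    """
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(patristic, k=1)
    d = patristic[iu]

    def corr(x):
        t = np.abs(x[:, None] - x[None, :])[iu]
        return np.corrcoef(d, t)[0, 1]

    obs = corr(np.asarray(trait, dtype=float))
    perms = np.array([corr(rng.permutation(trait)) for _ in range(n_perm)])
    p_value = (np.sum(perms >= obs) + 1) / (n_perm + 1)
    return obs, p_value

# Hypothetical toy data: 4 patients, patristic distances and set-point viral loads.
D = np.array([[0.0, 0.1, 0.6, 0.7],
              [0.1, 0.0, 0.5, 0.6],
              [0.6, 0.5, 0.0, 0.2],
              [0.7, 0.6, 0.2, 0.0]])
spvl = np.array([4.2, 4.3, 5.0, 5.1])   # log10 copies/mL
print(mantel_signal(D, spvl))
```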

Relevance: 30.00%

Publisher:

Abstract:

We investigate the benefits and experimental feasibility of approaches enabling the shift from short (1.7 kDa on average) peptides in bottom-up proteomics to peptides roughly twice as long (~3.2 kDa on average) in so-called extended bottom-up proteomics. The Candida albicans secreted aspartic protease Sap9 was selected for evaluation as an extended bottom-up proteomics-grade enzyme because of its suggested dibasic cleavage specificity and ease of production. We report an extensive characterization of Sap9 specificity and selectivity, revealing that protein cleavage by Sap9 most often occurs in the vicinity of proximal basic amino acids and, in select cases, also at basic and hydrophobic residues. Sap9 cleaves a large variety of proteins in a relatively short (~1 h) period of time and is efficient over a broad pH range, including slightly acidic conditions (e.g., pH 5.5). Importantly, the resulting peptide mixtures contain representative peptides primarily in the target 3-7 kDa range. The utility and advantages of this enzyme in routine analysis of protein mixtures are demonstrated and its limitations are discussed. Overall, Sap9 has the potential to become an enzyme of choice in extended bottom-up proteomics, which is technically ready to complement traditional bottom-up proteomics for improved targeted protein structural analysis and expanded proteome coverage. BIOLOGICAL SIGNIFICANCE: Advances in biological applications of mass spectrometry-based bottom-up proteomics are often limited by the extreme complexity of biological samples, e.g., proteomes or protein complexes. One reason is the complexity of the mixtures of enzymatically produced (most often with trypsin) short (<3 kDa) peptides, which may exceed the analytical capabilities of liquid chromatography and mass spectrometry. Information on the localization of protein modifications may also be affected by the small size of the peptides typically produced. On the other hand, advances in high-resolution mass spectrometry and liquid chromatography have created an intriguing opportunity to improve proteome analysis by gradually increasing the size of enzymatically derived peptides in MS-based bottom-up proteomics. Bioinformatics has already confirmed the envisioned advantages of such an approach. The remaining bottleneck is an enzyme that can produce longer peptides. Here, we report the characterization of a candidate enzyme, Sap9, which may be considered for producing longer (e.g., 3-7 kDa) peptides and lead to the development of extended bottom-up proteomics.
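
As a back-of-the-envelope companion to the specificity data, the sketch below performs a naive in-silico digest of a protein sequence at K/R-K/R dibasic motifs and reports approximate peptide masses using an average residue mass of ~110 Da. The motif rule, the example sequence and the helper names are simplifications for illustration only; they do not capture Sap9's full cleavage specificity described above.

```python
import re

AVG_RESIDUE_MASS = 110.0  # rough average amino-acid residue mass in Da (approximation)

def dibasic_digest(sequence):
    """Naive in-silico digest: cut after the second residue of each K/R-K/R pair.

    This only sketches the 'cleavage near proximal basic residues' idea; the
    real Sap9 specificity reported in the paper is broader.
    """
    cut_sites = [m.end() for m in re.finditer(r"[KR][KR]", sequence)]
    bounds = [0] + cut_sites + [len(sequence)]
    return [sequence[a:b] for a, b in zip(bounds, bounds[1:]) if b > a]

def approx_mass_kda(peptide):
    return len(peptide) * AVG_RESIDUE_MASS / 1000.0

# Hypothetical sequence with sparse dibasic motifs.
seq = "MSTAVLENPGLGRKLSDFGQETSYIEDNCNQNGAISLIFSLKEEVGALAKRVLAQ" * 2
for pep in dibasic_digest(seq):
    print(f"{len(pep):3d} residues  ~{approx_mass_kda(pep):.1f} kDa")
```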

Relevance: 30.00%

Publisher:

Abstract:

Despite recent advances in the structural analysis of monoclonal antibodies with bottom-up, middle-down, and top-down mass spectrometry (MS), further improvements in analysis accuracy, depth, and speed are needed. The remaining challenges include quantitatively accurate assignment of post-translational modifications, reduction of artifacts introduced during sample preparation, increased sequence coverage per liquid chromatography (LC) MS experiment, and the ability to extend detailed characterization to simple antibody cocktails and more complex antibody mixtures. Here, we evaluate the recently introduced extended bottom-up proteomics (eBUP) approach, based on proteolysis with the secreted aspartic protease 9, Sap9, for the analysis of monoclonal antibodies. Key findings of the Sap9-based proteomics analysis of a single antibody include: (i) extensive antibody sequence coverage, up to 100% for the light chain and 99-100% for the heavy chain in a single LC-MS run; (ii) connectivity of complementarity-determining regions (CDRs) via Sap9-produced large proteolytic peptides (3.4 kDa on average) containing up to two CDRs per peptide; and (iii) reduced artifact introduction (e.g., deamidation) during proteolysis with Sap9 compared to conventional bottom-up proteomics workflows. The analysis of a mixture of six antibodies via Sap9-based eBUP produced comparable results. For these reasons, Sap9-produced proteolytic peptides improve the identification confidence of antibodies from mixtures compared to conventional bottom-up proteomics based on shorter proteolytic peptides.
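
The headline coverage figures boil down to a simple calculation that can be sketched as follows: map each identified proteolytic peptide back onto the chain sequence and report the fraction of residues covered. The helper, the light-chain fragment and the peptides shown are hypothetical, not the authors' software or data.

```python
def sequence_coverage(chain, peptides):
    """Fraction of residues in `chain` covered by at least one identified peptide."""
    covered = [False] * len(chain)
    for pep in peptides:
        start = chain.find(pep)
        while start != -1:
            for i in range(start, start + len(pep)):
                covered[i] = True
            start = chain.find(pep, start + 1)
    return sum(covered) / len(chain)

# Hypothetical light-chain fragment and two identified large proteolytic peptides.
light_chain = "DIQMTQSPSSLSASVGDRVTITCRASQDVNTAVAWYQQKPGKAPKLLIY"
peptides = ["DIQMTQSPSSLSASVGDRVTITCRASQ", "AVAWYQQKPGKAPKLLIY"]
print(f"coverage: {sequence_coverage(light_chain, peptides):.0%}")
```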

Relevance: 30.00%

Publisher:

Abstract:

Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value depends inversely on the size of the propensities of the different reaction channels and which needs to be re-evaluated after every firing event. Such a discrete-event simulation can be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the reaction channels. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step by allowing all reaction channels to fire within that step, with the number of firings drawn from a Poisson or binomial distribution. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that, with a proper selection of the coefficients, the variance of the extended τ-leap can be well behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to RK frameworks is clear in terms of computational speed, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
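
For readers unfamiliar with the baseline being extended, the sketch below implements the plain Poisson τ-leap for a toy reversible isomerization A ⇌ B with a fixed step size; the rate constants and system are hypothetical, and the Runge-Kutta combination of increments introduced in the paper is deliberately not reproduced here.

```python
import numpy as np

def poisson_tau_leap(x0, stoich, propensity, tau, n_steps, seed=0):
    """Plain Poisson tau-leap: every channel fires Poisson(a_j(x) * tau) times per step.

    A fixed tau is used for simplicity; practical tau-leap implementations
    select tau adaptively from the current propensities.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(n_steps):
        a = propensity(x)                      # channel propensities at the current state
        k = rng.poisson(a * tau)               # number of firings of each channel
        x = np.maximum(x + stoich.T @ k, 0.0)  # update state; clamp at zero
        traj.append(x.copy())
    return np.array(traj)

# Toy system: A -> B (rate c1*A), B -> A (rate c2*B).
c1, c2 = 1.0, 0.5
stoich = np.array([[-1, +1],    # channel 1: A -> B
                   [+1, -1]])   # channel 2: B -> A
prop = lambda x: np.array([c1 * x[0], c2 * x[1]])
traj = poisson_tau_leap([1000, 0], stoich, prop, tau=0.01, n_steps=500)
print(traj[-1])  # fluctuates around the deterministic steady state [~333, ~667]
```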

Relevance: 30.00%

Publisher:

Abstract:

We present a polyhedral framework for establishing general structural properties of optimal solutions to stochastic scheduling problems in which multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if the performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's), taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
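
To make the restless-bandit application concrete, the sketch below computes a Whittle-style index for a single two-action arm by bisecting on the passive subsidy at which the two actions become indifferent in a given state. It assumes the arm is indexable (the kind of property the paper's sufficient conditions are meant to certify), uses hypothetical transition and reward data, and is a generic textbook computation rather than the adaptive-greedy algorithm described above.

```python
import numpy as np

def q_gap(P, r, subsidy, state, beta=0.9, n_iter=1000):
    """Discounted value iteration for a two-action arm with a passive subsidy.

    P[a], r[a]: transition matrix and reward vector for action a (0=passive, 1=active).
    Returns Q(state, passive) - Q(state, active) under the given subsidy.
    """
    V = np.zeros(len(r[0]))
    for _ in range(n_iter):
        Q0 = r[0] + subsidy + beta * P[0] @ V
        Q1 = r[1] + beta * P[1] @ V
        V = np.maximum(Q0, Q1)
    return (Q0 - Q1)[state]

def whittle_index(P, r, state, lo=-10.0, hi=10.0, tol=1e-6):
    """Bisection on the subsidy that makes passive and active indifferent in `state`.

    Assumes indexability, i.e. the passive-active gap grows with the subsidy.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if q_gap(P, r, mid, state) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical 2-state arm: activity is rewarded in state 1 and pushes towards state 1.
P = [np.array([[0.9, 0.1], [0.6, 0.4]]),   # passive dynamics
     np.array([[0.3, 0.7], [0.1, 0.9]])]   # active dynamics
r = [np.array([0.0, 0.0]),                 # passive reward
     np.array([0.0, 1.0])]                 # active reward
print([whittle_index(P, r, s) for s in (0, 1)])
```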

Relevance: 30.00%

Publisher:

Abstract:

This paper shows how recently developed regression-based methods for the decomposition of health inequality can be extended to incorporate individual heterogeneity in the responses of health to the explanatory variables. We illustrate our method with an application to the Canadian NPHS of 1994. Our strategy for the estimation of heterogeneous responses is based on the quantile regression model. The results suggest that there is an important degree of heterogeneity in the association of health to explanatory variables which, in turn, accounts for a substantial percentage of inequality in observed health. A particularly interesting finding is that the marginal response of health to income is zero for healthy individuals but positive and significant for unhealthy individuals. The heterogeneity in the income response reduces both overall health inequality and income-related health inequality.
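
A minimal sketch of the two ingredients combined here, quantile regression for heterogeneous responses and a rank-based concentration index for income-related health inequality, is given below on simulated data designed to mimic the zero-response-for-the-healthy finding. It uses the generic covariance formula for the concentration index and statsmodels' quantile regression; it is not the authors' NPHS decomposition code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def concentration_index(health, income):
    """Concentration index: 2 * cov(health, fractional income rank) / mean(health)."""
    rank = (pd.Series(income).rank(method="average") - 0.5) / len(income)
    h = np.asarray(health, dtype=float)
    return 2.0 * np.cov(h, rank, bias=True)[0, 1] / h.mean()

# Simulated survey-like data: health responds to income only at the bottom of the
# health distribution, mimicking the heterogeneous-response finding above.
rng = np.random.default_rng(1)
n = 2000
income = rng.lognormal(mean=10.0, sigma=0.5, size=n)
noise = rng.normal(size=n)
health = 60 + 5 * np.where(noise < 0, np.log(income) - 10, 0) + 10 * noise
df = pd.DataFrame({"health": health, "log_income": np.log(income), "income": income})

print("concentration index:", round(concentration_index(df.health, df.income), 4))
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("health ~ log_income", df).fit(q=q)
    print(f"q={q}: income coefficient = {fit.params['log_income']:.2f}")
```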

Relevance: 30.00%

Publisher:

Abstract:

Liquid chromatography (LC) high-resolution (HR) mass spectrometry (MS) analysis can record HR full scans, a detection technique that shows selectivity and sensitivity comparable to ion transitions (SRM) performed with triple-quadrupole (TQ) MS but that allows de facto determination of "all" ions, including drug metabolites. This could be of potential utility in in vivo drug metabolism and pharmacovigilance studies, in order to gain a more comprehensive insight into differences in drug biotransformation profiles between patients. This simultaneous quantitative and qualitative (Quan/Qual) approach has been tested with 20 patients chronically treated with tamoxifen (TAM). The absolute quantification of TAM and three metabolites in plasma was performed using both HR- and TQ-MS and the results were compared. The same LC-HR-MS analysis allowed the identification and relative quantification of 37 additional TAM metabolites. A number of new metabolites were detected in patients' plasma, including metabolites identified as didemethyl-trihydroxy-TAM-glucoside and didemethyl-tetrahydroxy-TAM-glucoside conjugates, corresponding to TAM with six and seven biotransformation steps, respectively. Multivariate analysis allowed relevant patterns of metabolites and ratios to be associated with TAM administration and CYP2D6 genotype. Two hydroxylated metabolites, α-OH-TAM and 4'-OH-TAM, were newly identified as putative CYP2D6 substrates. The relative quantification was precise (<20%), and the semiquantitative estimation suggests that metabolite levels are non-negligible. Metabolites could play an important role in drug toxicity, but their impact on drug-related side effects has been partially neglected due to the tremendous effort required with previous MS technologies. With present HR-MS, this situation should evolve thanks to the straightforward determination of drug metabolites, enlarging the possibilities for studying inter- and intra-patient variability in drug metabolism and related effects.
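
As a worked illustration of how such highly transformed metabolites are assigned, the sketch below adds standard monoisotopic mass shifts for demethylation, hydroxylation and glucose conjugation to the tamoxifen monoisotopic mass; the combination shown (two demethylations, three hydroxylations, one glucosylation) corresponds to the six biotransformation steps of the didemethyl-trihydroxy-TAM-glucoside example. The shift values are textbook figures; the script is illustrative and not part of the study's workflow.

```python
# Monoisotopic masses (Da); standard values.
TAMOXIFEN = 371.2249            # C26H29NO
SHIFTS = {
    "demethylation": -14.0157,   # loss of CH2
    "hydroxylation": +15.9949,   # addition of O
    "glucosylation": +162.0528,  # addition of C6H10O5 (glucose conjugate)
}

def metabolite_mass(parent, steps):
    """Sum biotransformation mass shifts onto the parent monoisotopic mass."""
    return parent + sum(SHIFTS[name] * count for name, count in steps)

# Didemethyl-trihydroxy-TAM-glucoside: 2 + 3 + 1 = 6 biotransformation steps.
steps = [("demethylation", 2), ("hydroxylation", 3), ("glucosylation", 1)]
neutral = metabolite_mass(TAMOXIFEN, steps)
print(f"neutral monoisotopic mass ~ {neutral:.4f} Da, [M+H]+ ~ {neutral + 1.0073:.4f}")
```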

Relevance: 30.00%

Publisher:

Abstract:

First and second branchial arch syndromes (BAS) manifest as combined tissue deficiencies and hypoplasias of the face, external ear, middle ear and maxillary and mandibular arches. They represent the second most common craniofacial malformation after cleft lip and palate. Extended knowledge of the embryology and anatomy of each branchial arch derivative is mandatory for the diagnosis and grading of different BAS lesions and in the follow-up of postoperative patients. In recent years, many new complex surgical approaches and procedures have been designed by maxillofacial surgeons to treat extensive maxillary, mandibular and external and internal ear deformations. The purpose of this review is to evaluate the role of different imaging modalities (orthopantomogram (OPG), lateral and posteroanterior cephalometric radiographs, CT and MRI) in the diagnosis of a wide spectrum of first and second BAS, including hemifacial microsomia, mandibulofacial dysostosis, branchio-oto-renal syndrome, Pierre Robin sequence and Nager acrofacial dysostosis. Additionally, we aim to emphasize the importance of the systematic use of a multimodality imaging approach to facilitate the precise grading of these syndromes, as well as the preoperative planning of different reconstructive surgical procedures and their follow-up during treatment.

Relevance: 30.00%

Publisher:

Abstract:

Most sedimentary modelling programs developed in recent years focus on either terrigenous or carbonate marine sedimentation. Nevertheless, only a few programs have attempted to consider mixed terrigenous-carbonate sedimentation, and most of these are two-dimensional, which is a major restriction since geological processes take place in 3D. This paper presents the basic concepts of a new 3D mathematical forward simulation model for clastic sediments, which was developed from SIMSAFADIM, a previous 3D carbonate sedimentation model. The new extended model, SIMSAFADIM-CLASTIC, simulates processes of autochthonous marine carbonate production and accumulation, together with clastic transport and sedimentation, in three dimensions for both carbonate and terrigenous sediments. Other models and modelling strategies may also provide realistic and efficient tools for the prediction of stratigraphic architecture and facies distribution of sedimentary deposits. However, SIMSAFADIM-CLASTIC is an innovative model that attempts to simulate different sediment types using a process-based approach, and is therefore a useful tool for 3D prediction of stratigraphic architecture and facies distribution in sedimentary basins. The model is applied to the Neogene Vallès-Penedès half-graben (western Mediterranean, NE Spain) to show the capacity of the program when applied to a realistic geological situation involving interactions between terrigenous clastics and carbonate sediments.
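
To give a flavour of what a process-based forward model does, the toy sketch below advances a 2D elevation grid by diffusive terrigenous transport plus in-place, depth-dependent carbonate production. The equations, coefficients and boundary handling are deliberately simplistic placeholders and are not the SIMSAFADIM-CLASTIC formulation.

```python
import numpy as np

def forward_step(elev, clastic_src, dt, k=0.2, carb_max=0.002, z_opt=5.0):
    """One toy forward step (illustrative only, not the SIMSAFADIM-CLASTIC equations).

    Terrigenous sediment: diffusive evolution of the sediment surface plus a source term,
    dh/dt = k * laplacian(h) + src (periodic boundaries via np.roll, for brevity).
    Carbonate: produced in place, fastest near an optimal water depth, absent above sea level.
    """
    lap = (np.roll(elev, 1, 0) + np.roll(elev, -1, 0) +
           np.roll(elev, 1, 1) + np.roll(elev, -1, 1) - 4.0 * elev)
    clastic = dt * (k * lap + clastic_src)
    depth = np.maximum(-elev, 0.0)
    carbonate = dt * carb_max * np.exp(-((depth - z_opt) / z_opt) ** 2) * (depth > 0)
    return elev + clastic + carbonate, clastic, carbonate

# Hypothetical shelf-to-basin ramp with a terrigenous source along the landward edge.
elev = -np.tile(np.linspace(1.0, 50.0, 100), (50, 1))   # elevation below sea level (m)
src = np.zeros_like(elev); src[:, 0] = 0.01              # clastic input per step
clastic_total = np.zeros_like(elev); carb_total = np.zeros_like(elev)
for _ in range(1000):
    elev, dc, dk = forward_step(elev, src, dt=1.0)
    clastic_total += np.maximum(dc, 0.0)                 # record deposited thicknesses
    carb_total += dk
```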

Relevance: 30.00%

Publisher:

Abstract:

The recently developed variational Wigner-Kirkwood approach is extended to the relativistic mean field theory for finite nuclei. A numerical application to the calculation of the surface energy coefficient in semi-infinite nuclear matter is presented. The new method is contrasted with the standard density functional theory and the fully quantal approach.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we propose a generalization of density functional theory. The theory leads to single-particle equations of motion with a quasilocal mean-field operator, which contains a quasiparticle position-dependent effective mass and a spin-orbit potential. The energy density functional is constructed using the extended Thomas-Fermi approximation, and the ground-state properties of doubly magic nuclei are considered within the framework of this approach. Calculations were performed using the finite-range Gogny D1S force, and the results are compared with exact Hartree-Fock calculations.
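
For orientation, the lowest-order terms of the extended Thomas-Fermi kinetic energy density used in semiclassical functionals of this kind are the standard ones,

\[
\tau_{\mathrm{TF}}[\rho] = \tfrac{3}{5}\,(3\pi^{2})^{2/3}\,\rho^{5/3},
\qquad
\tau_{\mathrm{ETF}}[\rho] \simeq \tau_{\mathrm{TF}}[\rho]
+ \frac{1}{36}\,\frac{(\nabla\rho)^{2}}{\rho}
+ \frac{1}{3}\,\nabla^{2}\rho ,
\]

so that the kinetic part of the functional schematically becomes \(\int \frac{\hbar^{2}}{2m^{*}(\mathbf{r})}\,\tau_{\mathrm{ETF}}[\rho(\mathbf{r})]\,d^{3}r\) once the quasiparticle position-dependent effective mass is introduced; the additional effective-mass and spin-orbit gradient corrections of the quasilocal functional proposed in the paper are not written out here.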

Relevance: 30.00%

Publisher:

Abstract:

Rationale: The purpose of this article is to demonstrate the use of autologous cultured cells in treating an advanced cocoon formation of the hand and three extended squamous cell carcinomas of the lower and upper limb in a patient with recessive dystrophic epidermolysis bullosa. The preparation and application of these cells in the operating room are described. Methods: A number of surgical approaches have been described to correct these deformities in order to improve function. We propose a new therapeutic approach for treating loss of motion and independent digital function, as well as for coverage of large skin defects, in a patient with recessive dystrophic epidermolysis bullosa by using autologous cultured cells. Surgical treatment of these patients is very difficult because of the existing skin fragility. Furthermore, surgical wounds do not heal easily because of recurrent blisters and erosions, as well as the patients' poor nutritional status. Results: We report our experience with multiple extended cutaneous squamous cell carcinomas arising in our patient, which were successfully managed using autologous composite cultured skin grafts. The cocoon hand deformity was also treated, with the limb becoming functional. Conclusion: The use of autologous keratinocytes and fibroblasts in epidermolysis bullosa is hereby outlined for the first time.

Relevance: 30.00%

Publisher:

Abstract:

We present a new phenomenological approach to nucleation, based on the combination of the extended modified liquid drop model and dynamical nucleation theory. The new model proposes a new cluster definition, which properly includes the effect of fluctuations, and is consistent both thermodynamically and kinetically. The model successfully predicts the free energy of formation of the critical nucleus using only macroscopic thermodynamic properties. It also accounts for the spinodal and provides excellent agreement with the results of recent simulations.
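
For reference, the classical capillarity (liquid drop) baseline that the extended modified liquid drop model refines writes the work of forming an n-molecule cluster at supersaturation S, and the resulting nucleation barrier at the critical size, as

\[
\Delta G(n) = -n\,k_{B}T\ln S + (36\pi)^{1/3}\,v_{l}^{2/3}\,\sigma\,n^{2/3},
\qquad
\Delta G^{*} = \frac{16\pi\,\sigma^{3}\,v_{l}^{2}}{3\,\bigl(k_{B}T\ln S\bigr)^{2}},
\]

with \(\sigma\) the planar surface tension and \(v_{l}\) the molecular volume of the liquid. The model described above modifies this picture through its fluctuation-aware cluster definition and its coupling to dynamical nucleation theory, rather than relying on the sharp-interface droplet assumed here.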