995 results for Hepatic Elimination Models


Relevance:

20.00%

Publisher:

Abstract:

Chagas disease, or American trypanosomiasis, is, together with the geohelminth infections, the neglected disease that causes the greatest loss of healthy life years due to disability in Latin America. The factors and determinants of Chagas disease show that different contexts require different actions, whether preventing new cases or reducing the burden of disease. Control strategies must combine two general courses of action: prevention of transmission to avoid the occurrence of new cases (these measures are cost effective), and timely diagnosis and treatment of infected individuals to prevent the clinical evolution of the disease and allow them to recover their health. All actions should be implemented as fully as possible and in an integrated way to maximise their impact. Chagas disease cannot be eradicated because infected wild triatomines are demonstrably in permanent contact with domestic cycles, which contributes to the occurrence of at least a few new cases. However, it is possible to interrupt the transmission of Trypanosoma cruzi over a large territory and to eliminate Chagas disease as a public health problem, with a dramatic reduction of the burden of the disease.

Relevance:

20.00%

Publisher:

Abstract:

Functional divergence between homologous proteins is expected to affect amino acid sequences in two main ways, which can be considered proxies of biochemical divergence: a "covarion-like" pattern of correlated changes in evolutionary rates, and switches in conserved residues ("constant but different"). Although these patterns have been used in case studies, a large-scale analysis is needed to estimate their frequency and distribution. We use a phylogenomic framework of animal genes to answer three questions: 1) What is the prevalence of such patterns? 2) Can we link such patterns at the amino acid level with selection inferred at the codon level? 3) Are the patterns different between paralogs and orthologs? We find that covarion-like patterns are detected more frequently than "constant but different" ones, but that only the latter are correlated with a signal of positive selection. Finally, there is no obvious difference in patterns between orthologs and paralogs.

Relevance:

20.00%

Publisher:

Abstract:

Prevention of Trypanosoma cruzi infection in mammals likely depends on either preventing the invading trypomastigotes from infecting host cells or the rapid recognition and killing of the newly infected cells by T. cruzi-specific T cells. We show here that multiple rounds of infection and cure (by drug therapy) fail to protect mice from reinfection, despite the generation of potent T cell responses. This disappointing result is similar to that obtained with many other vaccine protocols used in attempts to protect animals from T. cruzi infection. We have previously shown that immune recognition of T. cruzi infection is significantly delayed both at the systemic level and at the level of the infected host cell. The systemic delay appears to result from a stealth infection process that fails to trigger substantial innate recognition mechanisms, while the delay at the cellular level is related to the immunodominance of highly variable gene family proteins, in particular those of the trans-sialidase family. Here we discuss how these previous studies and the new findings herein shape our thoughts on the potential of prophylactic vaccination to play a productive role in the prevention of T. cruzi infection and Chagas disease.

Relevance:

20.00%

Publisher:

Abstract:

Astrocytes are the main neural cell type responsible for the maintenance of brain homeostasis. They form highly organized anatomical domains that are interconnected into extensive networks. These features, along with the expression of a wide array of receptors, transporters, and ion channels, ideally position them to sense and dynamically modulate neuronal activity. Astrocytes cooperate with neurons on several levels, including neurotransmitter trafficking and recycling, ion homeostasis, energy metabolism, and defense against oxidative stress. The critical dependence of neurons on this constant support endows astrocytes with intrinsic neuroprotective properties, which are discussed here. Conversely, pathogenic stimuli may disturb astrocytic function, thus compromising neuronal functionality and viability. Using neuroinflammation, Alzheimer's disease, and hepatic encephalopathy as examples, we discuss how astrocytic defense mechanisms may be overwhelmed in pathological conditions, contributing to disease progression.

Relevance:

20.00%

Publisher:

Abstract:

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.
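
As a minimal sketch of the train-then-validate discipline described above (not the MAQC-II teams' actual pipelines), the following fits a classifier to one endpoint and scores it on samples withheld from training; the file name expression.csv and the column name endpoint are hypothetical.

```python
# Minimal sketch of the train-then-validate workflow described above.
# The file "expression.csv" and the column "endpoint" are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import matthews_corrcoef

data = pd.read_csv("expression.csv")           # rows: samples, columns: probes + endpoint label
X = data.drop(columns=["endpoint"]).values     # microarray intensities
y = data["endpoint"].values                    # binary endpoint (e.g., toxic vs. non-toxic)

# Hold out a validation set that the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Report performance on the held-out data with a metric such as the
# Matthews correlation coefficient.
print("MCC on held-out data:", matthews_corrcoef(y_test, model.predict(X_test)))
```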

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND & AIMS Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high ratio of aspartate aminotransferase (AST):ALT were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) and either ALT level greater than 3 × ULN, a ratio (R) value (ALT × ULN/alkaline phosphatase × ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the highest ×ULN/ alkaline phosphatase × ULN value) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of only ALT level identified them with 44% specificity. However, the level of ALT and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. An equal number of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 × ULN. An algorithm based on AST level greater than 17.3 × ULN, TBL greater than 6.6 × ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity. CONCLUSIONS When applied at DILI recognition, the nR criteria for Hy's Law provides the best balance of sensitivity and specificity whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.

Relevance:

20.00%

Publisher:

Abstract:

The objective of the EU-funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, are taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line and with in vivo data, such as acute human lethal blood concentrations, revealed that GABA(A) receptor function, acetylcholinesterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint on its own was sufficient to markedly improve the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.
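
As a rough, hypothetical sketch of the kind of in vitro-in vivo comparison described above, one could rank candidate endpoints by how strongly their in vitro effective concentrations correlate with acute human lethal blood concentrations; all endpoint names are taken from the abstract, but the values below are invented for illustration.

```python
# Hypothetical sketch: rank in vitro endpoints by correlation of their effective
# concentrations with in vivo acute lethal blood concentrations (log10 scale).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_chemicals = 23

# Invented data standing in for measured values.
log_lc_in_vivo = rng.normal(loc=-4, scale=1, size=n_chemicals)
endpoints = {
    "GABA(A) receptor function": log_lc_in_vivo + rng.normal(0, 0.5, n_chemicals),
    "acetylcholinesterase activity": log_lc_in_vivo + rng.normal(0, 0.8, n_chemicals),
    "general cytotoxicity": log_lc_in_vivo + rng.normal(0, 1.2, n_chemicals),
}

# Rank endpoints by strength of the in vitro-in vivo correlation.
for name, ec_in_vitro in sorted(
        endpoints.items(),
        key=lambda kv: -abs(pearsonr(kv[1], log_lc_in_vivo)[0])):
    r, p = pearsonr(ec_in_vitro, log_lc_in_vivo)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```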

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data-generating process. We assume that all agents employ the data they observe (which may differ across sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and can alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
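
The statistical learning algorithm in such models is commonly a recursive least-squares update of the agents' forecasting rule; the sketch below shows that belief-update step under our own notation and illustrative data, not the paper's specification.

```python
# Minimal sketch of recursive least-squares (decreasing-gain) learning, the kind
# of belief-updating algorithm typically assumed in adaptive-learning New
# Keynesian models. Notation and example data are ours, not the paper's.
import numpy as np

def rls_update(beta, R, x, y, gain):
    """One belief update: agents regress the observable y on regressors x.

    beta : current coefficient beliefs, shape (k,)
    R    : current estimate of the regressors' second-moment matrix, shape (k, k)
    x    : latest regressor vector, shape (k,)
    y    : latest observation of the variable being forecast (scalar)
    gain : learning gain (e.g., 1/t for decreasing gain)
    """
    R_new = R + gain * (np.outer(x, x) - R)
    beta_new = beta + gain * np.linalg.solve(R_new, x) * (y - x @ beta)
    return beta_new, R_new

# Example: private agents updating a forecast rule for inflation on (1, output gap).
beta = np.zeros(2)
R = np.eye(2)
for t, (gap, inflation) in enumerate([(0.5, 2.1), (0.3, 1.9), (-0.2, 1.7)], start=1):
    beta, R = rls_update(beta, R, np.array([1.0, gap]), inflation, gain=1.0 / t)
print("current beliefs:", beta)
```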

Relevance:

20.00%

Publisher:

Abstract:

The organophosphate temephos has been the main insecticide used against larvae of the dengue and yellow fever mosquito (Aedes aegypti) in Brazil since the mid-1980s. Reports of resistance date back to 1995; however, no systematic report of widespread temephos resistance has been produced to date. As resistance investigation is paramount for strategic decision-making by health officials, our objective here was to investigate the spatial and temporal spread of temephos resistance in Ae. aegypti in Brazil over the last 12 years, using discriminating temephos concentrations and the bioassay protocols of the World Health Organization. The mortality results obtained were subjected to spatial analysis with distance interpolation using semi-variance models to generate maps that depict the spread of temephos resistance in Brazil since 1999. The problem has been expanding: since 2002-2003, approximately half the country has exhibited mosquito populations resistant to temephos. The frequency of temephos resistance and, likely, of control failures, which begin when insecticide-induced mortality drops below 80%, has increased even further since 2004. By 2010/2011, few parts of Brazil were able to achieve the target 80% efficacy threshold, resulting in a significant risk of temephos control failure in most of the country. The widespread resistance to temephos in Brazilian Ae. aegypti populations greatly compromises effective mosquito control with this insecticide and indicates the urgent need to identify alternative insecticides, aided by the preventive elimination of potential mosquito breeding sites.
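
As a minimal sketch using only the 80% mortality benchmark cited above, the snippet below flags bioassay results that suggest a risk of temephos control failure; the site names and mortality figures are invented.

```python
# Hypothetical sketch: flag Ae. aegypti populations at risk of temephos control
# failure using the 80% mortality benchmark mentioned above. Mortality values
# would come from bioassays at the discriminating concentration; data are invented.
bioassay_mortality = {        # site -> % mortality at the discriminating dose
    "site_A": 99.0,
    "site_B": 84.5,
    "site_C": 62.0,
}

FAILURE_THRESHOLD = 80.0      # below this, control failures are expected

for site, mortality in bioassay_mortality.items():
    status = "risk of control failure" if mortality < FAILURE_THRESHOLD else "still effective"
    print(f"{site}: {mortality:.1f}% mortality -> {status}")
```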

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE We investigated the association between the proportion of long-chain n-3 polyunsaturated fatty acids (PUFA) in plasma phospholipids from blood samples drawn at enrollment and subsequent change in body weight. Sex, age, and BMI were considered as potential effect modifiers. METHOD A total of 1,998 women and men participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) were followed for a median of 4.9 years. The associations between the proportion of plasma phospholipid long-chain n-3 PUFA and change in weight were investigated using mixed-effect linear regression. RESULTS The proportion of long-chain n-3 PUFA was not associated with change in weight. Among all participants, the 1-year weight change was -0.7 g per 1% point higher long-chain n-3 PUFA level (95% confidence interval: -20.7 to 19.3). The results when stratified by sex, age, or BMI groups were not systematically different. CONCLUSION The results of this study suggest that the proportion of long-chain n-3 PUFA in plasma phospholipids is not associated with subsequent change in body weight within the range of exposure in the general population.
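
Below is a minimal sketch of the kind of mixed-effect linear regression described above, with annual weight change regressed on the baseline long-chain n-3 PUFA proportion and a random intercept per study centre; the file and column names are assumptions, not the actual EPIC variable names.

```python
# Hypothetical sketch of a mixed-effect linear regression: annual weight change
# regressed on the baseline plasma phospholipid long-chain n-3 PUFA proportion,
# with a random intercept per study centre. File and column names are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("epic_subset.csv")   # columns: weight_change, n3_pufa_pct, sex, age, bmi, centre

model = smf.mixedlm(
    "weight_change ~ n3_pufa_pct + sex + age + bmi",
    data=df,
    groups=df["centre"],              # random intercept for study centre
)
result = model.fit()
print(result.summary())               # coefficient on n3_pufa_pct: weight change per 1% point
```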

Relevance:

20.00%

Publisher:

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes that instruments are connected and report relevant features for monitoring. It also requires enough historical records with diagnosed breakdowns to make the probabilistic models reliable and useful for the predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model the cause-effect relationships that trigger predictive maintenance services, based on the influence between observed features and previously documented diagnostics.
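
As a minimal sketch under standard assumptions (constant, exponential failure and repair rates), a two-state Markov model yields a steady-state availability of mu / (lambda + mu); the rates below are illustrative only, not taken from the paper.

```python
# Two-state (up/down) Markov availability model based on estimated failure and
# repair rates, in the spirit of the proposal above. Rates are illustrative.
def steady_state_availability(failure_rate: float, repair_rate: float) -> float:
    """Steady-state availability A = mu / (lambda + mu) for exponential
    failure (lambda) and repair (mu) rates."""
    return repair_rate / (failure_rate + repair_rate)

# Example: an instrument failing on average every 2,000 h (lambda = 1/2000 per hour)
# and repaired on average within 24 h (mu = 1/24 per hour).
lam, mu = 1 / 2000, 1 / 24
print(f"Expected availability: {steady_state_availability(lam, mu):.4f}")  # ~0.9881
```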

Relevance:

20.00%

Publisher:

Abstract:

Our work is concerned with user modelling in open environments. Our proposal follows the line of contributions to advances in user modelling in open environments enabled by Agent Technology, in what has been called the Smart User Model. Our research comprises a holistic study of user modelling across several research areas related to users. We have developed a conceptualization of user modelling by means of examples from a broad range of research areas, with the aim of improving our understanding of user modelling and its role in the next generation of open and distributed service environments. This report is organized as follows. In chapter 1, we introduce our motivation and objectives. Chapters 2, 3, 4 and 5 provide the state of the art on user modelling: in chapter 2, we give the main definitions of the elements described in the report; in chapter 3, we present a historical perspective on user models; in chapter 4, we review user models from the perspective of different research areas, with special emphasis on the give-and-take relationship between Agent Technology and user modelling; and in chapter 5, we describe the main challenges that, from our point of view, need to be tackled by researchers wanting to contribute to advances in user modelling. The study of the state of the art is followed by exploratory work in chapter 6, where we define a SUM and a methodology to deal with it, and present some case studies to illustrate the methodology. Finally, we present the thesis proposal to continue the work, together with the corresponding work schedule and timeline.