887 results for Specific theories and interaction models
Abstract:
The determination of skeletal loading conditions in vivo and their relationship to the health of bone tissues remains an open question. Computational modeling of the musculoskeletal system is the only practicable method providing a valuable approach to muscle and joint loading analyses, although crucial shortcomings limit the translation of computational methods into orthopedic and neurological practice. Growing attention has focused on subject-specific modeling, particularly when pathological musculoskeletal conditions need to be studied. Nevertheless, subject-specific data cannot always be collected in research and clinical practice, and there is a lack of efficient methods and frameworks for building models and incorporating them in simulations of motion. The overall aim of the present PhD thesis was to introduce improvements to state-of-the-art musculoskeletal modeling for the prediction of physiological muscle and joint loads during motion. A threefold goal was articulated as follows: (i) develop state-of-the-art subject-specific models and analyze skeletal load predictions; (ii) analyze the sensitivity of model predictions to relevant musculotendon model parameters and kinematic uncertainties; (iii) design an efficient software framework simplifying the effort-intensive pre-processing phases of subject-specific modeling. The first goal underlined the relevance of subject-specific musculoskeletal modeling for determining physiological skeletal loads during gait, corroborating the choice of full subject-specific modeling for the analysis of pathological conditions. The second goal characterized the sensitivity of skeletal load predictions to major musculotendon parameters and kinematic uncertainties, and robust probabilistic methods were applied for methodological and clinical purposes. The last goal produced an efficient software framework for subject-specific modeling and simulation, which is practical, user-friendly and effort-effective.
Future research will aim at implementing more accurate models of lower-limb joint mechanics and musculotendon paths, and at assessing, through probabilistic modeling, an overall picture of the crucial model parameters affecting skeletal load predictions.
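The probabilistic sensitivity analysis named in goal (ii) can be sketched with a minimal Monte Carlo experiment. The joint-load predictor, the nominal optimal fiber length of 0.10 m and the perturbation width below are invented placeholders for illustration, not quantities from the thesis model:

```python
# Hypothetical Monte Carlo sensitivity sketch for goal (ii). The predictor,
# nominal fiber length and perturbation width are invented placeholders.
import random
import statistics

NOMINAL_FIBER_LENGTH = 0.10  # metres, hypothetical

def predicted_joint_load(optimal_fiber_length):
    # Toy predictor: load grows with deviation from the nominal parameter.
    deviation = abs(optimal_fiber_length - NOMINAL_FIBER_LENGTH)
    return 1000.0 * (1 + deviation / NOMINAL_FIBER_LENGTH)

rng = random.Random(42)
samples = [predicted_joint_load(rng.gauss(NOMINAL_FIBER_LENGTH, 0.01))
           for _ in range(5000)]
mean_load = statistics.mean(samples)
spread = statistics.stdev(samples)  # dispersion = sensitivity to the parameter
```

The spread of the sampled predictions is a crude measure of how strongly the (toy) load output responds to uncertainty in a single musculotendon parameter.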
Abstract:
Systems Biology is an innovative way of doing biology that has recently arisen in bioinformatics contexts, characterised by the study of biological systems as complex systems, with a strong focus on the system level and on the interaction dimension. In other words, the objective is to understand biological systems as a whole, putting in the foreground not only the study of the individual parts as standalone parts, but also their interaction and the global properties that emerge at the system level by means of the interaction among the parts. This thesis focuses on the adoption of multi-agent systems (MAS) as a suitable paradigm for Systems Biology, for developing models and simulations of complex biological systems. Multi-agent systems have recently been introduced in informatics contexts as a suitable paradigm for modelling and engineering complex systems. Roughly speaking, a MAS can be conceived as a set of autonomous and interacting entities, called agents, situated in some kind of environment, where they fruitfully interact and coordinate so as to obtain a coherent global system behaviour. The claim of this work is that the general properties of MAS make them an effective approach for modelling and building simulations of complex biological systems, following the methodological principles identified by Systems Biology. In particular, the thesis focuses on cell populations as biological systems. In order to support the claim, the thesis introduces and describes (i) a MAS-based model conceived for modelling the dynamics of systems of cells interacting inside cell environments called niches, and (ii) a computational tool developed for implementing the models and executing the simulations.
The tool is meant to work as a kind of virtual laboratory, on top of which various kinds of virtual experiments can be performed, characterised by the definition and execution of specific models implemented as MASs, so as to support the validation, falsification and improvement of the models through the observation and analysis of the simulations. A hematopoietic stem cell system is taken as the reference case study for formulating a specific model and executing virtual experiments.
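A minimal, assumed sketch of the MAS idea described above (not the thesis tool): autonomous cell "agents" situated in a shared niche, each deciding locally whether to divide or die in response to crowding. The rates and capacity are illustrative:

```python
# Assumed illustration of a MAS cell-population model: agents act on local
# state (niche crowding); the population dynamics emerge globally.
import random

NICHE_CAPACITY = 100  # hypothetical carrying capacity of the niche

class CellAgent:
    def __init__(self, rng):
        self.rng = rng
        self.alive = True

    def step(self, niche):
        # Each agent autonomously divides or dies based on current crowding.
        crowding = len(niche) / NICHE_CAPACITY
        r = self.rng.random()
        if r < 0.3 * (1 - crowding):
            niche.append(CellAgent(self.rng))   # divide into the niche
        elif r > 1 - 0.1 * crowding:
            self.alive = False                  # die under crowding pressure

rng = random.Random(0)
niche = [CellAgent(rng) for _ in range(10)]
for _ in range(20):                             # one "virtual experiment"
    for agent in list(niche):                   # snapshot: newborns act next round
        agent.step(niche)
    niche = [a for a in niche if a.alive]
population = len(niche)
```

Because division is suppressed as crowding approaches the niche capacity, the population stays bounded by the capacity, a global property produced only by local agent rules.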
Abstract:
Analyzing and modeling relationships between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects in chemical datasets is a challenging task for scientific researchers in the field of cheminformatics. (Q)SAR model validation is therefore essential to ensure the future predictivity of a model on unseen compounds. Proper validation is also one of the requirements of regulatory authorities for approving a model's use in real-world scenarios as an alternative testing method. However, at the same time, the question of how to validate a (Q)SAR model is still under discussion. In this work, we empirically compare k-fold cross-validation with external test set validation. The introduced workflow makes it possible to apply the built and validated models to large amounts of unseen data, and to compare the performance of the different validation approaches. Our experimental results indicate that cross-validation produces (Q)SAR models with higher predictivity than external test set validation and reduces the variance of the results. Statistical validation is important for evaluating the performance of (Q)SAR models, but it does not support the user in better understanding the properties of the model or the underlying correlations. We present the 3D molecular viewer CheS-Mapper (Chemical Space Mapper), which arranges compounds in 3D space such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. Comprehensive functionalities, including clustering, alignment of compounds according to their 3D structure, and feature highlighting, aid the chemist in better understanding patterns and regularities and in relating the observations to established scientific knowledge.
Even though visualization tools for analyzing (Q)SAR information in small molecule datasets exist, integrated visualization methods that allow for the investigation of model validation results are still lacking. We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. New functionalities in CheS-Mapper 2.0 facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. Our approach reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org.
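The two validation schemes compared in the study can be contrasted in a small, self-contained sketch. The 1-nearest-neighbour "model" and the toy activity labels below are invented for illustration and are not the study's workflow; the point is that k-fold cross-validation predicts every compound exactly once, while an external split holds back a fixed fraction that is never trained on:

```python
# Toy contrast of k-fold cross-validation with a single external test-set
# split, using an invented 1-nearest-neighbour rule on 1D descriptors.

def one_nn_predict(train, x):
    # Predict the label of the nearest training point.
    return min(train, key=lambda p: abs(p[0] - x))[1]

def accuracy(train, test):
    return sum(1 for x, y in test if one_nn_predict(train, x) == y) / len(test)

def k_fold_cv(data, k):
    # Average accuracy over k folds; every compound is tested exactly once.
    folds = [data[i::k] for i in range(k)]
    scores = []
    for i in range(k):
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        scores.append(accuracy(train, folds[i]))
    return sum(scores) / k

def external_validation(data, frac=0.3):
    # Single external hold-out: the last fraction is never trained on.
    cut = int(len(data) * (1 - frac))
    return accuracy(data[:cut], data[cut:])

data = [(x / 10, int(x >= 10)) for x in range(20)]  # (descriptor, activity)
cv_score = k_fold_cv(data, k=5)
ext_score = external_validation(data)
```

Averaging over folds also yields a variance estimate across splits, which is one way to see the reduced variance reported for cross-validation.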
Abstract:
The identification of molecular processes involved in cancer development and prognosis has opened avenues for targeted therapies, which have made treatment more tumor-specific and less toxic than conventional therapies. One important example is the epidermal growth factor receptor (EGFR) and EGFR-specific inhibitors (e.g. erlotinib). However, challenges such as drug resistance still remain in targeted therapies. Therefore, novel candidate compounds and new strategies are needed to improve therapy efficacy. Shikonin and its derivatives are cytotoxic constituents of the traditional Chinese herbal medicine Zicao (Lithospermum erythrorhizon). In this study, we investigated the molecular mechanisms underlying the anti-cancer effects of shikonin and its derivatives in glioblastoma cells and leukemia cells. Most of the shikonin derivatives showed strong cytotoxicity towards erlotinib-resistant glioblastoma cells, especially U87MG.ΔEGFR cells, which overexpress a deletion-activated EGFR (ΔEGFR). Moreover, shikonin and some derivatives worked synergistically with erlotinib in killing EGFR-overexpressing cells. Combination treatment with shikonin and erlotinib overcame the resistance of these cells to erlotinib. Western blotting analysis revealed that shikonin inhibited ΔEGFR phosphorylation and led to corresponding decreases in the phosphorylation of EGFR downstream molecules. By means of the Loewe additivity and Bliss independence drug interaction models, we found that erlotinib and shikonin or its derivatives cooperatively suppressed ΔEGFR phosphorylation. We believe this to be a main mechanism responsible for their synergism in U87MG.ΔEGFR cells. In leukemia cells, which did not express EGFR, shikonin and its derivatives exhibited even greater cytotoxicity, suggesting the existence of other mechanisms. Microarray-based gene expression analysis uncovered the transcription factor c-MYC as the molecule commonly deregulated by shikonin and its derivatives.
As validated by Western blotting analysis, DNA-binding assays and molecular docking, shikonin and its derivatives bound and inhibited c-MYC. Furthermore, the deregulation of ERK and JNK MAPK and of AKT activity was closely associated with the reduction of c-MYC, indicating the involvement of these signaling molecules in shikonin-triggered c-MYC inactivation. In conclusion, the inhibition of EGFR signaling, the synergism with erlotinib and the targeting of c-MYC illustrate the multi-targeted nature of natural naphthoquinones such as shikonin and its derivatives. This may open attractive possibilities for their use in molecularly targeted cancer therapy.
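The two null-reference models named above (Loewe additivity and Bliss independence) reduce to simple formulas; the effect fractions and doses below are invented example numbers, not measurements from the study:

```python
# Illustrative formulas for the two drug-interaction reference models;
# all numeric values are made-up examples.

def bliss_expected(e_a, e_b):
    # Bliss independence: expected fractional effect of two independently
    # acting drugs (effects expressed as fractions in [0, 1]).
    return e_a + e_b - e_a * e_b

def loewe_ci(d1, big_d1, d2, big_d2):
    # Loewe additivity combination index: d1, d2 are the doses used in the
    # combination; big_d1, big_d2 are the single-agent doses producing the
    # same effect. CI < 1 suggests synergy, CI = 1 additivity, CI > 1 antagonism.
    return d1 / big_d1 + d2 / big_d2

expected = bliss_expected(0.40, 0.50)  # 40% and 50% inhibition alone
observed = 0.85                        # hypothetical combined inhibition
synergy_excess = observed - expected   # positive -> beyond independent action
```

An observed combination effect exceeding the Bliss expectation (or a Loewe CI below one) is the quantitative sense in which two agents are said to act synergistically.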
Abstract:
Vascular surgeons perform numerous highly sophisticated and delicate procedures. Due to restrictions in training time and the advent of endovascular techniques, new concepts including alternative environments for training and assessment of surgical skills are required. Over the past decade, training on simulators and synthetic models has become more sophisticated and lifelike. This study was designed to evaluate the impact of a 3-day intense training course in open vascular surgery on both specific and global vascular surgical skills.
Abstract:
BACKGROUND: Interaction refers to the situation in which the effect of one exposure on an outcome differs across strata of another exposure. We surveyed epidemiologic studies published in leading journals to examine how interaction is assessed and reported. METHODS: We selected 150 case-control and 75 cohort studies published between May 2001 and May 2007 in leading general medicine, epidemiology, and clinical specialist journals. Two reviewers independently extracted data on study characteristics. RESULTS: Of the 225 studies, 138 (61%) addressed interaction. Among these, 25 (18%) presented no data or only a P value or a statement of statistical significance; 40 (29%) presented stratum-specific effect estimates but no meaningful comparison of these estimates; and 58 (42%) presented stratum-specific estimates and appropriate tests for interaction. Fifteen articles (11%) presented the individual effects of both exposures and also their joint effect or a product term, providing sufficient information to interpret interaction on both an additive and a multiplicative scale. Reporting was poorest in articles published in clinical specialist journals and most adequate in articles published in general medicine journals, with epidemiology journals in an intermediate position. CONCLUSIONS: A majority of articles reporting cohort and case-control studies address possible interactions between exposures. However, in about half of these, the information provided was unsatisfactory, and only 1 in 10 studies reported data that allowed readers to interpret interaction effects on both an additive and a multiplicative scale.
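Given the quantities the authors ask to be reported (the individual effects of both exposures and their joint effect), interaction can be assessed on both scales. The relative risks below are invented example numbers; RERI is the relative excess risk due to interaction:

```python
# Sketch of interaction assessment on the additive and multiplicative
# scales from reported relative risks (invented example values).

def additive_interaction(rr10, rr01, rr11):
    # RERI (relative excess risk due to interaction): departure from
    # additivity of risk differences; 0 means no additive interaction.
    return rr11 - rr10 - rr01 + 1

def multiplicative_interaction(rr10, rr01, rr11):
    # Ratio of the joint RR to the product of the single-exposure RRs;
    # 1 means no multiplicative interaction.
    return rr11 / (rr10 * rr01)

rr10, rr01, rr11 = 2.0, 3.0, 6.0  # hypothetical: exposure A only, B only, both
reri = additive_interaction(rr10, rr01, rr11)
mult = multiplicative_interaction(rr10, rr01, rr11)
# Here the joint effect is exactly multiplicative (mult == 1) yet shows
# positive additive interaction (reri == 2) -- the two scales can disagree,
# which is why reporting both exposures' joint effect matters.
```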
Abstract:
Chronic myeloid leukemia (CML) is a malignant myeloproliferative disease with a characteristic chronic phase (cp) of several years before progression to blast crisis (bc). The immune system may contribute to disease control in CML. We analyzed leukemia-specific immune responses in cpCML and bcCML in a retrovirally induced murine CML model. In the presence of cpCML and bcCML expressing the glycoprotein of lymphocytic choriomeningitis virus as a model leukemia antigen, leukemia-specific cytotoxic T lymphocytes (CTLs) became exhausted. They maintained only limited cytotoxic activity, and did not produce interferon-gamma or tumor necrosis factor-alpha or expand after restimulation. CML-specific CTLs were characterized by high expression of programmed death 1 (PD-1), whereas CML cells expressed PD-ligand 1 (PD-L1). Blocking the PD-1/PD-L1 interaction by generating bcCML in PD-1-deficient mice or by repetitive administration of anti-PD-L1 antibody prolonged survival. In addition, we found that PD-1 is up-regulated on CD8(+) T cells from CML patients. Taken together, our results suggest that blocking the PD-1/PD-L1 interaction may restore the function of CML-specific CTLs and may represent a novel therapeutic approach for CML.
Abstract:
This article addresses some basic questions concerning the coordination of seventeenth-century sacred music, approaching the phenomenon through a musical species of particular interest, though one largely unstudied from a performance-practice point of view. Roman polychorality, with its specific performing conditions, offers an illuminating perspective on principles of musical direction and interaction which differ significantly from our modern approach to these topics. The inquiry ranges from the basics of performance (such as sheet music, rehearsals, direction techniques, and the models and stylistic conditioning of the performers), through the concrete role of the maestro di cappella and of the tactus as cornerstones of sacred music practice, up to the philosophical content that the particular 'system of values' shining through this overall picture seems to represent. On this groundwork the essay opens a discussion of conventions which have remained literally unchallenged by generations of researchers and performers.
Abstract:
This article analyzes the interaction between theories of radicalization and state responses to militancy in India. Focusing on the interpretation of the increased frequency of terrorist attacks in Indian metropolises in the last decade, the article examines the narratives surrounding those classified as terrorists in the context of rising Muslim militancy in the country. Different state agencies operate with different theories about the links between processes of radicalization and terrorist violence. The scenarios of radicalization underlying legislative efforts to prevent terrorism, the construction of motives by the police, and the interpretation of violence by the judiciary all rely on assumptions about radicalization and violence. Such narratives are used to explain terrorism both to security agencies and to the public; they inform the categories and scenarios of prevention. Prevention relies on detection of future deeds, planning, intentions, and even potential intentions. "Detection" of potential intentions relies on assumptions about specific dispositions. Identification of such dispositions in turn relies on the context-specific theories of the causes of militancy. These determine what "characteristics" of individuals or groups indicate potential threats and form the basis for their categorization as "potentially dangerous." The article explores the cultural contexts of theories of radicalization, focusing on how they are framed by societal understandings of the causes of deviance and the relation between the individual and society emerging in contemporary India. It examines the shift in the perception of threat and the categories of "dangerous others" from a focus on role to a focus on ascriptive identity.
Abstract:
Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev's toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation, and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
Abstract:
The Youngest Toba Tuff (YTT, erupted ca. 74 ka ago) is a distinctive and widespread tephra marker across south and southeast Asia. The climatic, human and environmental consequences of the YTT eruption are widely debated. Although a considerable body of geochemical data is available for this unit, there has been no systematic study of the variability of the ash geochemistry. Intrinsic (magmatic) and extrinsic (post-depositional) chemical variations provide fundamental information on the petrogenesis of the magma, the distribution of the tephra and the interaction between the ash and the receiving environment. Considering the importance of the geochemistry of the YTT for stratigraphic correlations and eruptive models, it is central to the YTT debate to quantify and interpret such variations. Here we collate all published geochemical data on the YTT glass, including analyses from 68 sites described in the literature and three new samples. Two principal sources of chemical variation are investigated: (i) compositional zonation of the magma reservoir, and (ii) post-depositional alteration. Post-depositional leaching is responsible for differences of up to ca. 11% in Na2O/K2O and ca. 1% in SiO2/Al2O3 ratios in YTT glass from marine sites. Continental tephra are 2% higher in Na2O/K2O and 3% higher in SiO2/Al2O3 with respect to the marine tephra. We interpret such post-depositional glass alteration as related to seawater-induced alkali migration in marine environments, or to site-specific water pH. Crystal fractionation and the consequent magmatic differentiation, which produced the order-of-magnitude variations in trace element concentrations reported in the literature, also produced major element differences in the YTT glass. FeO/Al2O3 ratios vary by about 50%, which is analytically significant. These variations reflect magmatic fractionation involving Fe-bearing phases.
We also compared major element concentrations in YTT and Oldest Toba Tuff (OTT) ash samples, to identify potential compositional differences that could constrain the stratigraphic identity of the Morgaon ash (Western India); no differences between the OTT and YTT samples were observed.
Abstract:
In the present study we examined the interrelation of everyday-life handedness and hand preference in basketball, an area of expertise that requires individuals to be proficient with both their dominant and nondominant hands. A secondary aim was to elucidate the link between basketball-specific practice, hand preference in basketball and everyday-life handedness. To this end, 176 expert basketball players self-reported their hand preference for activities of daily living and for basketball-specific behavior, as well as details of their basketball-specific history, via questionnaire. We found that, compared to the general population, the one-hand bias was significantly reduced for both everyday-life and basketball-specific hand preference (i.e., a higher prevalence of mixed-handed individuals), and that the two concepts were significantly related. Moreover, only preference scores for lay-up and dribbling skills were significantly related to measures of basketball-specific practice. Consequently, training-induced modulations of lateral preference seem to be very specific to only a few basketball-specific skills, and neither generalize to other skills within the domain of basketball nor extend to everyday-life handedness. The results are discussed in terms of their relevance to theories of handedness and their practical implications for the sport of basketball.
Abstract:
The comparison of radiotherapy techniques with regard to secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of the different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity-modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses, for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underlining the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially across the different approaches to risk estimation. Sometimes the ratio of risk between two techniques ranged between values smaller and larger than one, which translates into inconsistent conclusions on the potentially higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches.
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
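The first approach described above, an organ-specific linear risk model applied to mean organ doses, amounts to a dose-weighted sum. The risk coefficients and doses below are invented placeholders, not ICRP or BEIR VII values:

```python
# Minimal sketch of an organ-specific linear risk model applied to mean
# organ doses. All coefficients and doses are invented placeholders.
RISK_PER_SV = {
    "lung": 0.01,                  # hypothetical excess risk per sievert
    "contralateral_breast": 0.008,
    "thyroid": 0.002,
}

def linear_risk(mean_organ_doses_sv):
    # Linear no-threshold form: organ risks add linearly.
    return sum(RISK_PER_SV[organ] * dose
               for organ, dose in mean_organ_doses_sv.items())

doses = {"lung": 2.0, "contralateral_breast": 1.5, "thyroid": 0.1}
total_risk = linear_risk(doses)
# Under a linear model, a technique delivering 80% of every organ dose
# yields exactly 80% of the risk, so risk *ratios* are scale-stable.
risk_ratio = linear_risk({k: 0.8 * d for k, d in doses.items()}) / total_risk
```

With non-linear risk models this scale-invariance of the ratio no longer holds, which is one way to see why the ratio of risk between techniques can vary across risk-estimation approaches.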
Abstract:
A variety of studies indicate that the process of atherosclerosis begins in childhood. There is limited information on the association between changes in anthropometric variables and blood lipids in school-age children and adolescents. Previous longitudinal studies of children, typically with insufficient frequency of observation, could not provide sound inference on the dynamics of change in blood lipids. The aims of this analysis are (1) to document the sex- and ethnic-specific trajectory and velocity curves of blood lipids (TC, LDL-C, HDL-C and TG); and (2) to evaluate the relationship of changes in anthropometric variables, such as height, weight and BMI, to blood lipids from age 8 to 18 years. Project HeartBeat! is a longitudinal study designed to examine the patterns of serial change in major cardiovascular risk factors. Cohorts at three different baseline ages, 8, 11 and 14 years, with a total of 678 participants, were enrolled. Each member of these cohorts was examined three times per year for up to four years. Sex- and ethnic-specific trajectory and velocity curves of blood lipids demonstrated complex and polyphasic longitudinal changes in TC, LDL-C, HDL-C and TG. The trajectory curves of TC, LDL-C and HDL-C with age showed curvilinear patterns of change. The velocity curves of TC, HDL-C and LDL-C were U-shaped for non-Blacks, while the velocity of TG was nearly linear for both Blacks and non-Blacks. The relationship of changes in anthropometric variables to blood lipids was evaluated by adding height, weight, or BMI and the associated interaction terms separately to the basic age-sex models. Height, or height gain, had a significant negative association with changes in TC, LDL-C and HDL-C. Weight or BMI gain showed positive associations with TC and LDL-C, and a negative relationship with HDL-C.
The dynamic changes in blood lipids of school-age children and adolescents observed in this analysis suggest that using fixed screening criteria under the current NCEP guidelines for all ages 2–19 may not be appropriate. The association of increasing BMI or weight with an adverse blood lipid profile found in this analysis also indicates that weight or BMI monitoring could be a future intervention in the pediatric population.
Abstract:
Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Due to the rapid development of genotyping and sequencing technologies, we are now able to assess the causal effects of many genetic and environmental factors more accurately. Genome-wide association studies have been able to localize many causal genetic variants predisposing to certain diseases. However, these studies explain only a small portion of the heritability of diseases. More advanced statistical models are urgently needed to identify and characterize additional genetic and environmental factors and their interactions, which will enable us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capabilities and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research, demonstrating superiority over standard approaches in certain research areas. Gene-environment and gene-gene interaction studies are among the areas where Bayesian methods can fully exert their functionality and advantages. This dissertation focuses on developing new Bayesian statistical methods for the analysis of data with complex gene-environment and gene-gene interactions, as well as extending some existing methods for gene-environment interactions to other related areas. It includes three parts: (1) deriving a Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending the application of two Bayesian statistical methods developed for gene-environment interaction studies to other related types of studies, such as adaptive borrowing of historical data.
We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-gene interactions (epistasis) and gene-environment interactions in the same model. It is well known that, in many practical situations, there exists a natural hierarchical structure between the main effects and interactions in a linear model. Here we propose a model that incorporates this hierarchical structure into the Bayesian mixture model, such that irrelevant interaction effects can be removed more efficiently, resulting in more robust, parsimonious and powerful models. We evaluate both the 'strong hierarchical' and 'weak hierarchical' models, which specify that both, or at least one, of the main effects of interacting factors must be present for the interaction to be included in the model. Extensive simulation results show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and yield a powerful approach for identifying the predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with the 'independent' model, which does not impose the hierarchical constraint, and observe their superior performance in most of the considered situations. The proposed models are applied to real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models have the advantage of allowing useful prior information to be incorporated in the modeling process. Moreover, the Bayesian mixture model outperforms the multivariate logistic model in terms of parameter estimation and variable selection in most cases.
Our proposed models impose hierarchical constraints that further improve the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions while successfully identifying the reported associations. This is practically appealing for investigating causal factors among a moderate number of candidate genetic and environmental factors along with a relatively large number of interactions. The natural and orthogonal interaction (NOIA) models of genetic effects were previously developed to provide an analysis framework in which the estimates of effects for a quantitative trait are statistically orthogonal regardless of the existence of Hardy-Weinberg Equilibrium (HWE) within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed the advantages of using this model for detecting true main effects and interactions, compared with the usual functional model. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model demonstrates more power at detecting non-null effects, with higher marginal posterior probabilities. We also review two Bayesian statistical models (the Bayesian empirical shrinkage-type estimator and Bayesian model averaging) that were developed for gene-environment interaction studies. Inspired by these Bayesian models, we develop two novel statistical methods able to handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the methods for gene-environment interactions in their success at balancing statistical efficiency and bias in a unified model.
Through extensive simulation studies, we compare the operating characteristics of the proposed models with those of existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow the historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
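The strong and weak hierarchical constraints evaluated in the first part can be made concrete by enumerating, for a single pair of factors, which inclusion patterns of the two main effects and their interaction a model may contain. This enumeration illustrates the constraint itself, not the thesis implementation:

```python
# Illustrative enumeration (assumed, not the dissertation code) of the
# strong and weak hierarchical constraints on interaction terms.
from itertools import product

def allowed(main_a, main_b, interaction, rule):
    # "strong": both main effects must be present for the A:B interaction;
    # "weak": at least one main effect; "independent": no constraint.
    if not interaction:
        return True
    if rule == "strong":
        return main_a and main_b
    if rule == "weak":
        return main_a or main_b
    return True  # "independent"

def count_models(rule):
    # Count admissible (A, B, A:B) inclusion patterns for one factor pair.
    return sum(allowed(a, b, i, rule)
               for a, b, i in product([False, True], repeat=3))

strong = count_models("strong")        # both main effects required
weak = count_models("weak")            # at least one main effect required
indep = count_models("independent")    # unconstrained
```

The strong rule admits 5 of the 8 patterns, the weak rule 7, and the unconstrained "independent" model all 8, which is the sense in which the hierarchical models prune irrelevant interaction effects from the search space.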