963 results for Compositional modulation
Abstract:
PARP inhibition can induce anti-neoplastic effects when used as monotherapy or in combination with chemo- or radiotherapy in various tumor settings; however, the basis for the anti-metastatic activity resulting from PARP inhibition remains unknown. PARP inhibitors may also act as modulators of tumor angiogenesis. Proteomic analysis of endothelial cells revealed that vimentin, an intermediate filament protein involved in angiogenesis and a specific hallmark of endothelial-to-mesenchymal transition (EndoMT), was down-regulated following loss of PARP-1 function in endothelial cells. VE-cadherin, an endothelial marker of vascular normalization, was up-regulated in HUVEC treated with PARP inhibitors or following PARP-1 silencing; vimentin over-expression was sufficient to drive an EndoMT phenotype. In melanoma cells, PARP inhibition reduced pro-metastatic markers, including vasculogenic mimicry. We also demonstrated that vimentin expression was sufficient to induce mesenchymal/pro-metastatic phenotypic changes in melanoma cells, including ILK/GSK3-β-dependent E-cadherin down-regulation, Snail1 activation, and increased cell motility and migration. In a murine model of metastatic melanoma, PARP inhibition counteracted the ability of melanoma cells to metastasize to the lung. These results suggest that inhibition of PARP interferes with key metastasis-promoting processes, leading to suppression of invasion and colonization of distal organs by aggressive metastatic cells.
Abstract:
While the influence of water on Helicobacter pylori culturability and membrane integrity has been extensively studied, there are few data concerning the effect of this environment on virulence properties. Therefore, we studied the culturability of water-exposed H. pylori and determined whether there was any relation with the bacterium's ability to adhere, produce functional components of pathogenicity, and induce inflammation and alterations in apoptosis in an experimental model of human gastric epithelial cells. H. pylori partially retained the ability to adhere to epithelial cells even after complete loss of culturability. However, the microorganism was no longer effective in eliciting in vitro host cell inflammation and apoptosis, possibly due to the non-functionality of the cag type IV secretion system. These H. pylori-induced host cell responses, which are lost along with culturability, are known to increase epithelial cell turnover and, consequently, could have a deleterious effect on the initial H. pylori colonisation process. The fact that adhesion is maintained by H. pylori to the detriment of other factors involved in later infection stages appears to point to a modulation of the physiology of the pathogen after water exposure and might provide the microorganism with the necessary means to, at least transiently, colonise the human stomach.
Abstract:
The aim of this review is to describe the contributions of the knowledge of T-cell responses to the understanding of the physiopathology and the responsiveness to etiological treatment during the chronic phase of Chagas disease. T-helper (Th)1 and interleukin (IL)-10 Trypanosoma cruzi-specific T-cells have been linked to the asymptomatic phase or to severe clinical forms of the disease, respectively, or vice versa, depending on the T. cruzi antigen source, the patient's location and the performed immunological assays. Parasite-specific T-cell responses are modulated after benznidazole (BZ) treatment in chronically T. cruzi-infected subjects in association with a significant decrease in T. cruzi-specific antibodies. Accumulating evidence has indicated that treatment efficacy during experimental infection with T. cruzi results from the combined action of BZ and the activation of appropriate immune responses in the host. However, strong support of this interaction in T. cruzi-infected humans remains lacking. Overall, the quality of T-cell responses might be a key factor in not only disease evolution, but also chemotherapy responsiveness. Immunological parameters are potential indicators of treatment response regardless of achievement of cure. Providing tools to monitor and provide early predictions of treatment success will allow the development of new therapeutic options.
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
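The multiplicative replacement described above can be sketched in a few lines of Python (a minimal illustration, assuming a single detection-limit value delta for all rounded zeros; the function name and API are ours, not the authors'):

```python
import numpy as np

def multiplicative_replacement(x, delta=1e-3):
    """Multiplicative rounded-zero replacement (sketch).

    Zeros become delta; non-zero parts are rescaled multiplicatively so
    the composition still sums to 1. Ratios among non-zero parts, and
    hence the covariance structure of zero-free subcompositions, are
    left untouched.
    """
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                           # close the composition to 1
    zeros = x == 0
    return np.where(zeros, delta, x * (1 - delta * zeros.sum()))

comp = np.array([0.50, 0.30, 0.20, 0.0])      # one rounded zero
rep = multiplicative_replacement(comp, delta=0.001)
```

Note the "natural" property claimed above: if the replacement value happens to equal the true small value, the original composition is recovered exactly, whereas the additive replacement perturbs every part.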
Abstract:
In the eighties, John Aitchison (1986) developed a new methodological approach for the statistical analysis of compositional data. This new methodology was implemented in Basic routines grouped under the name CODA and later NEWCODA in Matlab (Aitchison, 1997). After that, several other authors have published extensions to this methodology: Martín-Fernández and others (2000), Barceló-Vidal and others (2001), Pawlowsky-Glahn and Egozcue (2001, 2002) and Egozcue and others (2003). (...)
Abstract:
The log-ratio methodology makes available powerful tools for analyzing compositional data. Nevertheless, the use of this methodology is only possible for data sets without null values. Consequently, in data sets where zeros are present, a previous treatment becomes necessary. Recent advances in the treatment of compositional zeros have centred especially on zeros of a structural nature and on rounded zeros. These tools do not contemplate the particular case of count compositional data sets with null values. In this work we deal with "count zeros" and we introduce a treatment based on a mixed Bayesian-multiplicative estimation. We use the Dirichlet probability distribution as a prior and we estimate the posterior probabilities. Then we apply a multiplicative modification for the non-zero values. We present a case study where this new methodology is applied.
Key words: count data, multiplicative replacement, composition, log-ratio analysis
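A minimal sketch of such a Bayesian-multiplicative treatment (our own illustrative code, assuming a symmetric Dirichlet prior with strength s per part; the paper's actual prior choice may differ):

```python
import numpy as np

def bayes_mult_replacement(counts, s=0.5):
    """Bayesian-multiplicative replacement of count zeros (sketch).

    A Dirichlet(s, ..., s) prior on the multinomial parameter gives the
    posterior mean (n_j + s) / (N + s*D). Zero counts take their
    posterior proportion; non-zero proportions are shrunk
    multiplicatively so the result still sums to 1.
    """
    n = np.asarray(counts, dtype=float)
    N, D = n.sum(), n.size
    post = (n + s) / (N + s * D)      # posterior mean proportions
    p = n / N                         # observed proportions
    zeros = n == 0
    return np.where(zeros, post, p * (1 - post[zeros].sum()))

counts = np.array([12, 5, 0, 3])
rep = bayes_mult_replacement(counts)
```

As with the rounded-zero case, the multiplicative adjustment of the non-zero parts preserves their mutual ratios, so log-ratio analysis of the zero-free subcomposition is unaffected.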
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
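The ilr-plus-normal-kernel idea can be illustrated as follows (a sketch, not the authors' estimator: `scipy.stats.gaussian_kde` uses a full bandwidth matrix derived from the sample covariance and Scott's rule, whereas the cited work studies data-driven full-bandwidth selectors):

```python
import numpy as np
from scipy.stats import gaussian_kde

def ilr(x):
    """Isometric log-ratio transform: S^D -> R^(D-1), one common basis."""
    x = np.asarray(x, dtype=float)
    D = x.shape[-1]
    lx = np.log(x)
    z = np.empty(x.shape[:-1] + (D - 1,))
    for i in range(1, D):
        # sqrt(i/(i+1)) * ln( g(x_1..x_i) / x_{i+1} )
        z[..., i - 1] = np.sqrt(i / (i + 1)) * (lx[..., :i].mean(-1) - lx[..., i])
    return z

rng = np.random.default_rng(0)
raw = rng.dirichlet([5, 2, 3], size=200)   # sample compositions (no zeros)
coords = ilr(raw)                          # (200, 2) real coordinates
kde = gaussian_kde(coords.T)               # normal kernel, full bandwidth matrix
density = kde(coords.T)                    # density at the sample points
```

Because the ilr map is an isometry, density estimation in R^(D-1) respects the Aitchison geometry of the simplex, which is what the alr-based kernel of Aitchison and Lauder (1985) fails to do.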
Abstract:
The quantitative estimation of Sea Surface Temperatures from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern coretop samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as distance measure. Modern coretop datasets are characterised by a large amount of zeros. The zero replacement was carried out by adopting a Bayesian approach, based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by means of a multiple approach, considering the Proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. This new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.
Key words: Modern analogues, Aitchison distance, Proxies correlation matrix, Standardized Residual Sum of Squares
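The Aitchison distance at the core of CODAMAT is simply the Euclidean distance between clr-transformed compositions; a sketch (variable and function names ours), together with the kind of analogue look-up the method implies:

```python
import numpy as np

def aitchison_distance(x, y):
    """Aitchison distance: Euclidean distance after the centred
    log-ratio (clr) transform. Scale-invariant, so the two compositions
    need not be closed to the same constant first."""
    clr = lambda v: np.log(v) - np.log(v).mean()
    return np.linalg.norm(clr(np.asarray(x, float)) - clr(np.asarray(y, float)))

# hypothetical usage: rank modern coretop samples by closeness to a
# fossil assemblage (with all zeros already replaced):
# idx = np.argsort([aitchison_distance(fossil, c) for c in coretops])[:k]

x = np.array([0.6, 0.3, 0.1])
y = np.array([0.2, 0.5, 0.3])
d = aitchison_distance(x, y)
```

Scale invariance is the key property here: relative abundances matter, not the absolute counting effort behind each sample.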
Abstract:
In Catalonia, according to the nitrate directive (91/676/EU), nine areas have been declared vulnerable to nitrate pollution from agricultural sources (Decret 283/1998 and Decret 479/2004). Five of these areas have been studied by coupling hydrochemical data with a multi-isotopic approach (Vitòria et al. 2005, Otero et al. 2007, Puig et al. 2007), in an ongoing research project looking for an integrated application of classical hydrochemistry data with a comprehensive isotopic characterisation (δ15N and δ18O of dissolved nitrate, δ34S and δ18O of dissolved sulphate, δ13C of dissolved inorganic carbon, and δD and δ18O of water). Within this general frame, the contribution presented explores compositional ways of: (i) distinguishing agrochemical and manure N pollution, and (ii) quantifying natural attenuation of nitrate (denitrification) and identifying possible controlling factors. To achieve this two-fold goal, the following techniques have been used. Separate biplots of each suite of data show that each studied region has distinct δ34S and pH signatures, but the regions are homogeneous with regard to NO3--related variables. Also, the geochemical variables were projected onto the compositional directions associated with the possible denitrification reactions in each region. The resulting balances can be plotted together with some isotopes to assess their likelihood of occurrence.
Abstract:
Geochemical data derived from the whole or partial analysis of various geologic materials represent a composition of mineralogies or solute species. Minerals are composed of structured relationships between cations and anions which, through atomic and molecular forces, keep the elements bound in specific configurations. The chemical compositions of minerals have specific relationships that are governed by these molecular controls. In the case of olivine, there is a well-defined relationship between Mn-Fe-Mg and Si. Balances between the principal elements defining olivine composition and other significant constituents in the composition (Al, Ti) have been defined, resulting in a near-linear relationship between the logarithmic relative proportion of Si versus (Mg, Mn, Fe) and of Mg versus (Mn, Fe), which is typically described but poorly illustrated in the simplex.
The present contribution corresponds to ongoing research, which attempts to relate stoichiometry and geochemical data using compositional geometry. We describe here the approach by which stoichiometric relationships based on mineralogical constraints can be accounted for in the space of simplicial coordinates, using olivines as an example. Further examples for other mineral types (plagioclases and more complex minerals such as clays) are needed. Issues that remain to be dealt with include the reduction of a bulk chemical composition of a rock comprised of several minerals, from which appropriate balances can be used to describe the composition in a realistic mineralogical framework. The overall objective of our research is to answer the question: in cases where the mineralogy is unknown, are there suitable proxies that can be substituted?
Key words: Aitchison geometry, balances, mineral composition, oxides
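The balances mentioned (Si versus (Mg, Mn, Fe), and Mg versus (Mn, Fe)) follow the standard isometric log-ratio balance formula; a sketch with made-up olivine proportions (the numbers are illustrative, not measured data):

```python
import numpy as np

def balance(x, num, den):
    """Isometric log-ratio balance between two groups of parts:
    b = sqrt(r*s/(r+s)) * ln( g(x_num) / g(x_den) ),
    where g is the geometric mean and r, s are the group sizes."""
    x = np.asarray(x, dtype=float)
    r, s = len(num), len(den)
    gnum = np.exp(np.log(x[..., num]).mean(-1))
    gden = np.exp(np.log(x[..., den]).mean(-1))
    return np.sqrt(r * s / (r + s)) * np.log(gnum / gden)

# hypothetical oxide proportions, parts ordered [Si, Mg, Mn, Fe]
oliv = np.array([0.40, 0.45, 0.01, 0.14])
b1 = balance(oliv, [0], [1, 2, 3])   # Si versus (Mg, Mn, Fe)
b2 = balance(oliv, [1], [2, 3])      # Mg versus (Mn, Fe)
```

For a suite of olivine analyses, a near-linear scatter of b1 against b2 is the coordinate-space expression of the stoichiometric constraint described above.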
Abstract:
Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as "functional data" and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra.
The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques in order to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory: clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only.
A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
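The first step (spline smoothing of compositional trajectories) can be sketched for three-part compositions by smoothing each ilr coordinate and mapping back to the simplex (our illustrative code; the paper's smoothing penalty and spline basis may differ):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def smooth_compositional_trajectory(t, comps, s=0.05):
    """Spline-smooth a trajectory of 3-part compositions (sketch).

    Each ilr coordinate is smoothed separately with a smoothing spline,
    then mapped back to the simplex; dissimilarity metrics and
    clustering can then operate on the smoothed ilr curves."""
    lx = np.log(comps)
    # ilr coordinates for D = 3
    z1 = np.sqrt(1 / 2) * (lx[:, 0] - lx[:, 1])
    z2 = np.sqrt(2 / 3) * ((lx[:, 0] + lx[:, 1]) / 2 - lx[:, 2])
    s1 = UnivariateSpline(t, z1, s=s)(t)
    s2 = UnivariateSpline(t, z2, s=s)(t)
    # inverse ilr: rebuild clr coordinates, then close with exp
    y1 = s1 / np.sqrt(2) + s2 / np.sqrt(6)
    y2 = -s1 / np.sqrt(2) + s2 / np.sqrt(6)
    y3 = -2 * s2 / np.sqrt(6)
    e = np.exp(np.c_[y1, y2, y3])
    return e / e.sum(axis=1, keepdims=True)

t = np.linspace(0.0, 1.0, 10)
rng = np.random.default_rng(2)
traj = rng.dirichlet([3, 3, 3], size=10)   # a noisy 3-part trajectory
smooth = smooth_compositional_trajectory(t, traj, s=0.05)
```

Setting the smoothing parameter s to 0 makes the spline interpolate, so the trajectory is recovered exactly; larger s isolates the smooth part of the curve, as described above.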
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as non-zero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models: an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
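The data decomposition these models operate on (an incidence matrix plus a conditional compositional matrix) is easy to construct; a sketch (function name ours), assuming every row has at least one non-zero part:

```python
import numpy as np

def incidence_and_conditional(X):
    """Split essential-zero compositions into (i) a 0/1 incidence matrix
    recording where the zeros occur and (ii) a conditional compositional
    matrix in which the non-zero parts of each row are reclosed to 1."""
    X = np.asarray(X, dtype=float)
    incidence = (X > 0).astype(int)
    conditional = X / X.sum(axis=1, keepdims=True)
    return incidence, conditional

budget = np.array([[2.0, 0.0, 1.0, 1.0],   # e.g. household budget shares
                   [1.0, 1.0, 0.0, 2.0]])
inc, cond = incidence_and_conditional(budget)
```

The two-stage models then fit a binomial-type model to `inc` (where zeros occur) and a logistic normal model to the non-zero parts of `cond` (how the unit is shared out).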
Abstract:
The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows pointing out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or other reference parameters, allow defining monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO4^2- and Cl- concentrations, will be illustrated.
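A log-contrast from simplicial principal component analysis can be sketched as a clr transform followed by an SVD (an illustrative reduction with random stand-in data, not the Chiavenna Valley measurements; the coefficients of each principal direction sum to zero, which is the defining property of a log-contrast):

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform; rows are compositions."""
    lx = np.log(np.asarray(x, dtype=float))
    return lx - lx.mean(axis=1, keepdims=True)

def logcontrast_pca(comps):
    """Simplicial PCA: SVD of the doubly centred clr matrix. Each right
    singular vector with a non-negligible singular value defines a
    log-contrast (its coefficients sum to ~0)."""
    z = clr(comps)
    z = z - z.mean(axis=0)             # centre across samples
    _, svals, vt = np.linalg.svd(z, full_matrices=False)
    scores = z @ vt.T                  # log-contrast values per sample
    return scores, vt, svals

rng = np.random.default_rng(1)
data = rng.dirichlet([4, 3, 2, 1], size=50)   # stand-in for ion concentrations
scores, loadings, svals = logcontrast_pca(data)
```

The frequency distribution of a column of `scores` is the kind of log-contrast distribution the abstract proposes to monitor against reference parameters.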
Abstract:
The prefrontal (PFC) and orbitofrontal (OFC) cortices appear to be associated with both executive functions and olfaction. However, there are few data relating olfactory processing to executive functions in humans. The present study aimed at exploring the role of olfaction in executive functioning, making a distinction between primary and more cognitive aspects of olfaction. Three executive tasks of similar difficulty were used: one to assess hot executive functions (Iowa Gambling Task, IGT), and two as measures of cold executive functioning (Stroop Colour and Word Test, SCWT, and Wisconsin Card Sorting Test, WCST). Sixty-two healthy participants were included: 31 with normosmia and 31 with hyposmia. Olfactory abilities were assessed using the "Sniffin' Sticks" test, and olfactory threshold, odour discrimination and odour identification measures were obtained. All participants were female, aged between 18 and 60. Results showed that participants with hyposmia displayed worse performance in decision making (IGT; Cohen's d = 0.91) and cognitive flexibility (WCST; Cohen's d between 0.54 and 0.68) compared to those with normosmia. Multiple regression adjusted for the covariates of participants' age and education level showed a positive association between odour identification and the cognitive inhibition response (SCWT interference; Beta = 0.29; p = .034). Odour discrimination capacity was not a predictor of cognitive executive performance. Our results suggest that both hot and cold executive functions are associated with higher-order olfactory functioning in humans. These results support the hypothesis that olfaction and executive measures share a common neural substrate in the PFC and OFC, and suggest that olfaction might be a reliable cognitive marker in psychiatric and neurological disorders.