920 results for Subgroup-analyses
Abstract:
Stable isotopes have been used as food-chain tracers to characterize the relationship between consumers and their food, since isotopic fractionation implies discrimination against certain isotopes. However, stable isotope analyses (SIA) can also be carried out on fish reared on artificial diets, such as the gilthead sea bream (Sparus aurata), the most widely farmed species in the Mediterranean. Changes in the natural abundance of stable isotopes (13C and 15N) in tissues and their reserves can reflect changes in nutrient use and recycling, since the catabolic enzymes involved in decarboxylation and deamination processes show a preference for the lighter isotopes. These analyses can therefore provide useful information on the nutritional and metabolic status of fish. The aim of this project was to determine the capacity of stable isotopes to serve as potential markers of growth capacity and rearing conditions in gilthead sea bream. To this end, stable isotope analyses were combined with other metabolic indicators (cytochrome c oxidase, COX, and citrate synthase, CS, activities) and growth parameters (RNA/DNA ratio). The overall results obtained in the different studies carried out in this project demonstrate that SIA, in combination with other metabolic parameters, can serve as an effective tool for discriminating fish with better growth potential, as well as a sensitive marker of nutritional and fattening status. Moreover, combining stable isotope analysis with emerging tools such as proteomic techniques (2D-PAGE) provides new insights into the metabolic changes that occur in fish muscle during exercise-induced muscle growth.
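The natural-abundance measurements this abstract relies on are conventionally reported in delta notation relative to an international standard. A minimal sketch, assuming the usual reference ratios (VPDB for carbon, atmospheric N2 for nitrogen) and an illustrative trophic enrichment value that is not data from the study:

```python
# Stable-isotope delta notation: a minimal sketch. The standard ratios
# below are the conventional reference values; the diet/consumer values
# and the ~3.4 per-mil enrichment are illustrative assumptions only.

R_VPDB = 0.0111802   # 13C/12C of the VPDB carbon standard
R_AIR = 0.0036765    # 15N/14N of atmospheric N2

def delta_permil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Trophic discrimination: consumer tissue is typically enriched in the
# heavy isotope relative to its diet, because catabolic enzymes prefer
# the lighter isotope (hypothetical enrichment of ~3.4 per mil for d15N).
d15N_diet = 6.0
d15N_consumer = d15N_diet + 3.4
```

A sample with a higher proportion of the heavy isotope than the standard yields a positive delta value; depletion yields a negative one.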
Abstract:
Imaging mass spectrometry (IMS) represents an innovative tool in the cancer research pipeline, which is increasingly being used in clinical and pharmaceutical applications. The unique properties of the technique, especially the amount of data generated, make the handling of data from multiple IMS acquisitions challenging. This work presents a histology-driven IMS approach aiming to identify discriminant lipid signatures from the simultaneous mining of IMS data sets from multiple samples. The feasibility of the developed workflow is evaluated on a set of three human colorectal cancer liver metastasis (CRCLM) tissue sections. Lipid IMS on tissue sections was performed using MALDI-TOF/TOF MS in both negative and positive ionization modes after 1,5-diaminonaphthalene matrix deposition by sublimation. The combination of both positive and negative acquisition results was performed during data mining to simplify the process and interrogate a larger lipidome into a single analysis. To reduce the complexity of the IMS data sets, a sub data set was generated by randomly selecting a fixed number of spectra from a histologically defined region of interest, resulting in a 10-fold data reduction. Principal component analysis confirmed that the molecular selectivity of the regions of interest is maintained after data reduction. Partial least-squares and heat map analyses demonstrated a selective signature of the CRCLM, revealing lipids that are significantly up- and down-regulated in the tumor region. This comprehensive approach is thus of interest for defining disease signatures directly from IMS data sets by the use of combinatory data mining, opening novel routes of investigation for addressing the demands of the clinical setting.
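The data-reduction step described above (randomly selecting a fixed number of spectra per histologically defined region, then checking with PCA that regional selectivity is preserved) can be sketched as follows. The synthetic "spectra", their sizes, and the offset that separates the two regions are all invented stand-ins for real IMS data:

```python
import numpy as np

# Sketch of random spectrum subsampling followed by a PCA check.
# Synthetic data only: two "regions" of 1000 spectra x 50 m/z channels,
# with the first region offset on a few channels so PCA can separate them.

rng = np.random.default_rng(0)

def subsample(spectra: np.ndarray, n_keep: int, rng) -> np.ndarray:
    """Randomly keep n_keep rows (spectra) without replacement."""
    idx = rng.choice(spectra.shape[0], size=n_keep, replace=False)
    return spectra[idx]

def pca_scores(X: np.ndarray, n_components: int = 2) -> np.ndarray:
    """PCA via SVD of the mean-centered data matrix; returns scores."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

tumor = rng.normal(0.0, 1.0, (1000, 50))
tumor[:, :5] += 4.0                      # synthetic tumor-specific signal
healthy = rng.normal(0.0, 1.0, (1000, 50))

# 10-fold reduction, as in the workflow above: keep 100 spectra per region.
reduced = np.vstack([subsample(tumor, 100, rng), subsample(healthy, 100, rng)])
scores = pca_scores(reduced)
```

If the regions remain separable along the first principal component after subsampling, the molecular selectivity of the regions of interest has survived the reduction, which is the check the abstract reports.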
Abstract:
As the distribution of Candida species and their susceptibility to antifungal agents have changed, a new means of accurately and rapidly identifying these species is necessary for the successful early resolution of infection and the subsequent reduction of morbidity and mortality. The current work aimed to evaluate ribosomal RNA gene sequencing for the identification of medically relevant Candida species in comparison with a standard phenotypic method. Eighteen reference strains (RSs), 69 phenotypically identified isolates and 20 inconclusively identified isolates were examined. The internal transcribed spacer (ITS) regions and the D1/D2 region of the 26S ribosomal RNA gene were used as targets for sequencing. Additionally, the sequences of the ITS regions were used to establish evolutionary relationships. Sequencing of the ITS regions was successful for 88% (94/107) of the RSs and isolates, and the remaining 12% (13/107) of the samples were all successfully analysed by sequencing the D1/D2 region. Genotypic analysis thus identified all of the RSs and isolates, including the 20 isolates that could not be identified phenotypically. Phenotypic analysis, however, misidentified 10% (7/69) of the isolates. Phylogenetic analysis allowed confirmation of the relationships between evolutionarily close species. Currently, the use of genotypic methods is necessary for the correct identification of Candida species.
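The two-target strategy above (query ITS first, fall back to D1/D2 when ITS gives no confident match) can be sketched in a few lines. Everything here is hypothetical: the toy sequences, the 98% identity threshold, the ungapped comparison, and the tiny "reference database"; real identification would align queries against curated references with a tool such as BLAST:

```python
# Toy sketch of sequence-based species identification with a fallback
# target region. Sequences, thresholds, and the reference set are all
# invented for illustration.

def percent_identity(a: str, b: str) -> float:
    """Ungapped identity over the shorter sequence (toy comparison)."""
    n = min(len(a), len(b))
    matches = sum(1 for x, y in zip(a[:n], b[:n]) if x == y)
    return 100.0 * matches / n

def identify(query: dict, refs: dict, threshold: float = 98.0):
    """Try ITS first, then D1/D2; return (species, region) or None."""
    for region in ("ITS", "D1D2"):
        best = max(
            ((sp, percent_identity(query[region], seqs[region]))
             for sp, seqs in refs.items()),
            key=lambda t: t[1],
        )
        if best[1] >= threshold:
            return best[0], region
    return None

refs = {
    "C. albicans": {"ITS": "ACGTACGTACGTACGTACGT", "D1D2": "TTGGCCAATTGGCCAATTGG"},
    "C. glabrata": {"ITS": "ACGTTTTTACGTACGAACGT", "D1D2": "TTGGCCTTTTGGCCAATAGG"},
}
query = {"ITS": "ACGTACGTACGTACGTACGT", "D1D2": "TTGGCCAATTGGCCAATTGG"}
```

The fallback mirrors the reported workflow: samples whose ITS sequencing is uninformative are still resolved by the D1/D2 target.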
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution for a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, the number of contributors, and the actual value taken by N.
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
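The two strategies being compared can be sketched at a single locus. The deterministic rule uses only the fact that each contributor carries at most two alleles; the probabilistic rule weights each candidate N by the probability of the observed allele set given assumed allele frequencies. The allele labels, frequencies, prior range, and Monte-Carlo estimation below are illustrative assumptions; the paper's own analysis is built on Bayesian networks:

```python
import math
import random

# Sketch of inference on the number of contributors N at one locus.
# FREQS, the prior range on N, and the simulation size are hypothetical.

random.seed(1)
FREQS = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}  # assumed allele frequencies

def min_contributors(observed: set) -> int:
    """Deterministic rule: each contributor carries at most 2 alleles."""
    return math.ceil(len(observed) / 2)

def likelihood(observed: set, n: int, trials: int = 20000) -> float:
    """Monte Carlo estimate of P(exactly this allele set | N = n)."""
    alleles, weights = zip(*FREQS.items())
    hits = 0
    for _ in range(trials):
        drawn = set(random.choices(alleles, weights=weights, k=2 * n))
        if drawn == observed:
            hits += 1
    return hits / trials

def posterior(observed: set, n_max: int = 4) -> dict:
    """Posterior over N with a uniform prior on {min_contributors..n_max}."""
    n_min = min_contributors(observed)
    liks = {n: likelihood(observed, n) for n in range(n_min, n_max + 1)}
    z = sum(liks.values())
    return {n: l / z for n, l in liks.items()}

obs = {"A", "B", "C"}   # three distinct alleles seen in the mixed stain
post = posterior(obs)
```

The deterministic rule returns the single value 2 for this observation, whereas the posterior spreads probability over 2, 3 and 4 contributors, which is exactly the flexibility that avoids the inferential impasses discussed above.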
Abstract:
Scarcities of environmental services are no longer merely a remote hypothesis. Consequently, analysis of their inequalities between nations becomes of paramount importance for the achievement of sustainability, in terms either of international policy or of Universalist ethical principles of equity. This paper aims, on the one hand, to review methodological aspects of the inequality measurement of certain environmental data and, on the other, to extend the scarce empirical evidence relating to the international distribution of the Ecological Footprint (EF), by using a longer EF time series. Most of the techniques currently prominent in the literature are reviewed and then tested on EF data, with interesting results. We look in depth at Lorenz dominance analyses and consider the underlying properties of different inequality indices. The indices which fit best with environmental inequality measurement are CV2 and GE(2), because of their neutrality property; however, a trade-off may occur when subgroup decompositions are performed. A weighting factor decomposition method is proposed in order to isolate weighting factor changes in inequality growth rates. Finally, the only non-ambiguous way of decomposing inequality by source is the natural decomposition of CV2, which additionally allows the interpretation of marginal term contributions. Empirically, this paper contributes to the environmental inequality measurement of the EF: this inequality has been quite stable, and its change over time is due to per capita vector changes rather than population changes. Almost the entirety of EF inequality is explained by differences in the means between World Bank country groups. This finding suggests that international environmental agreements should be attempted on a regional basis in order to achieve greater consensus between the parties involved.
Additionally, source decomposition warns of the dangers of confining CO2 emissions reduction to crop-based energies because of the implications for basic needs satisfaction.
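The natural decomposition of CV2 mentioned above follows from the identity Var(x) = sum_k Cov(x_k, x) when the total x is a sum of sources x_k, so each source's share of CV2 is Cov(x_k, x) / Var(x). A minimal sketch with invented per-country footprint components (the source names and distributions are illustrative, not EF data):

```python
import numpy as np

# Natural decomposition of the squared coefficient of variation (CV^2)
# by source. Toy data: three hypothetical footprint components across
# 200 invented "countries"; shares sum to 1 by construction.

rng = np.random.default_rng(42)
n = 200
sources = {
    "cropland": rng.gamma(2.0, 0.5, n),
    "carbon":   rng.gamma(2.0, 1.5, n),
    "grazing":  rng.gamma(2.0, 0.3, n),
}
total = sum(sources.values())

def cv2(x: np.ndarray) -> float:
    """Squared coefficient of variation: Var(x) / mean(x)^2."""
    return x.var() / x.mean() ** 2

def source_shares(sources: dict, total: np.ndarray) -> dict:
    """Contribution of each source to overall CV^2: Cov(x_k, x)/Var(x)."""
    var_total = total.var()
    return {k: np.cov(x, total, bias=True)[0, 1] / var_total
            for k, x in sources.items()}

shares = source_shares(sources, total)
```

Because Cov is linear in its first argument, the shares sum to exactly 1, which is the non-ambiguity property the abstract highlights: every source's marginal contribution is well defined, with no unassigned interaction term.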
Abstract:
In this tutorial review, we detail both the rationale for and the implementation of a set of analyses of surface-recorded event-related potentials (ERPs) that uses the reference-free spatial (i.e. topographic) information available from high-density electrode montages to render statistical information concerning modulations in response strength, latency, and topography both between and within experimental conditions. In these and other ways, these topographic analysis methods allow the experimenter to glean additional information and neurophysiologic interpretability beyond what is available from canonical waveform analyses. In this tutorial we present the example of somatosensory evoked potentials (SEPs) in response to stimulation of each hand to illustrate these points. For each step of these analyses, we provide the reader with both a conceptual and a mathematical description of how the analysis is carried out, what it yields, and how to interpret its statistical outcome. We show that these topographic analysis methods are intuitive and easy-to-use approaches that can remove much of the guesswork often confronting ERP researchers and also assist in identifying the information contained within high-density ERP datasets.
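Two reference-free measures commonly used in this family of analyses are Global Field Power (GFP), the spatial standard deviation across electrodes at each time point (a strength measure), and global map dissimilarity between GFP-normalized scalp maps (a topography measure). A minimal sketch on synthetic data; the montage size, time axis, and random "ERP" are illustrative, not the SEP data from the tutorial:

```python
import numpy as np

# GFP and global map dissimilarity on a synthetic electrodes x time array.

rng = np.random.default_rng(7)
n_electrodes, n_times = 64, 100
erp = rng.normal(0.0, 1.0, (n_electrodes, n_times))

def gfp(data: np.ndarray) -> np.ndarray:
    """Spatial std across electrodes at each time point (average reference)."""
    ref = data - data.mean(axis=0, keepdims=True)  # re-reference to average
    return np.sqrt((ref ** 2).mean(axis=0))

def dissimilarity(map_a: np.ndarray, map_b: np.ndarray) -> float:
    """Global map dissimilarity between two scalp maps: the GFP of the
    difference of the average-referenced, GFP-normalized maps.
    0 = identical topographies, 2 = polarity-inverted topographies."""
    def norm(m):
        m = m - m.mean()
        return m / np.sqrt((m ** 2).mean())
    return float(np.sqrt(((norm(map_a) - norm(map_b)) ** 2).mean()))

strength = gfp(erp)   # one strength value per time point
```

Because the maps are normalized before comparison, dissimilarity is insensitive to overall amplitude: it responds only to changes in the spatial configuration of the field, which is what lets these methods separate strength modulations from genuine topographic (generator) differences.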
Abstract:
The dynamic properties of helix 12 in the ligand binding domain of nuclear receptors are a major determinant of AF-2 domain activity. We investigated the molecular and structural basis of helix 12 mobility, as well as the involvement of individual residues with regard to peroxisome proliferator-activated receptor alpha (PPARalpha) constitutive and ligand-dependent transcriptional activity. Functional assays of the activity of PPARalpha helix 12 mutants were combined with free energy molecular dynamics simulations. The agreement between the results from these approaches allows us to make robust claims concerning the mechanisms that govern helix 12 functions. Our data support a model in which PPARalpha helix 12 transiently adopts a relatively stable active conformation even in the absence of a ligand. This conformation provides the interface for the recruitment of a coactivator and results in constitutive activity. The receptor agonists stabilize this conformation and increase PPARalpha transcription activation potential. Finally, we disclose important functions of residues in PPARalpha AF-2, which determine the positioning of helix 12 in the active conformation in the absence of a ligand. Substitution of these residues suppresses PPARalpha constitutive activity, without changing PPARalpha ligand-dependent activation potential.
Abstract:
BACKGROUND AND HYPOTHESIS: Although prodromal angina occurring shortly before an acute myocardial infarction (MI) has protective effects against in-hospital complications, this effect has not been well documented after initial hospitalization, especially in older or diabetic patients. We examined whether angina 1 week before a first MI provides protection in these patients. METHODS: A total of 290 consecutive patients, 143 elderly (>64 years of age) and 147 adults (<65 years of age), 68 of whom were diabetic (23.4%) and 222 nondiabetic (76.6%), were examined to assess the effect of preceding angina on long-term prognosis (56 months) after initial hospitalization for a first MI. RESULTS: No significant differences were found in long-term complications after initial hospitalization in these adult and elderly patients according to whether or not they had prodromal angina (44.4% with angina vs 45.4% without in adults; 45.5% vs 58% in the elderly, P < 0.2). Nor were differences found according to diabetic status (61.5% with angina vs 72.7% without in diabetics; 37.3% vs 38.3% in nondiabetics; P = 0.4). CONCLUSION: The occurrence of angina 1 week before a first MI does not confer long-term protection against cardiovascular complications after initial hospitalization in adult or elderly patients, whether or not they have diabetes.
Abstract:
BACKGROUND: Prognosis of status epilepticus (SE) depends on its cause, but there is uncertainty as to whether SE represents an independent outcome predictor for a given etiology. Cerebral anoxia is a relatively homogeneous severe encephalopathy. Postanoxic SE is associated with nearly 100% mortality in this setting; however, it is still unclear whether this is a severity marker of the underlying encephalopathy or an independent factor influencing outcome. The goal of this study was to assess whether postanoxic SE is independently associated with mortality after cerebral anoxia. METHODS: This was a retrospective observational study of consecutive comatose survivors of cardiac arrest, including subjects treated with hypothermia. In the subgroup with EEG recordings in the first days of hospitalization, univariate and multivariate analyses were applied to potential determinants of in-hospital mortality and included the following variables: age, gender, type and length of cardiac arrest, occurrence of circulatory shock, use of therapeutic hypothermia, and electrographic SE. RESULTS: Of 166 postanoxic patients, 107 (64%) had an EEG (median latency from admission, 2 days); in this group, therapeutic hypothermia was administered in 59%. Death occurred in 71 (67%) patients. Postanoxic SE was associated with mortality regardless of the type of acute cardiac rhythm and the administration of hypothermic treatment. CONCLUSION: In this hospital-based cohort, postanoxic SE seems to be independently related to death in cardiac arrest survivors, suggesting that SE might determine a bad prognosis for a given etiology. Confirmation of these results in a prospective assessment is needed.
Abstract:
Gumbel analyses were carried out on rainfall time series at 151 locations in Switzerland for 4 different 30-year periods in order to estimate daily extreme precipitation for a return period of 100 years. These estimates were compared with the maximal daily values measured during the last 100 years (1911-2010) to test the efficiency of the analyses. This comparison shows that the analyses provide good results for 50 to 60% of locations in the country for the 1961-1990 and 1980-2010 time series. On the other hand, daily precipitation with a return period of 100 years is underestimated at most locations for the 1931-1960 and especially the 1911-1940 time series. This underestimation results from the increase in maximal daily precipitation recorded from 1911 to 2010 at 90% of locations in Switzerland.
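A Gumbel analysis of this kind fits the distribution to a series of annual maximum daily rainfalls and reads off the T-year return level x_T = mu - beta * ln(-ln(1 - 1/T)). A minimal sketch using the method of moments; the 30-year rainfall series below is invented, not Swiss station data:

```python
import math

# Method-of-moments Gumbel fit and T-year return level.
# beta = s * sqrt(6) / pi, mu = mean - gamma * beta (gamma: Euler's constant).

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(annual_maxima):
    """Fit Gumbel location mu and scale beta by the method of moments."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, t_years):
    """Daily precipitation exceeded on average once every t_years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / t_years))

# Hypothetical 30-year series of annual maximum daily rainfall (mm).
series = [42, 55, 38, 61, 47, 70, 52, 44, 58, 49,
          63, 41, 56, 48, 75, 39, 53, 60, 46, 51,
          67, 43, 57, 50, 64, 45, 59, 54, 62, 40]
mu, beta = gumbel_fit(series)
x100 = return_level(mu, beta, 100)
```

The sensitivity the abstract reports follows directly from this construction: if the fitting window (e.g. 1911-1940) has systematically lower annual maxima than the century as a whole, both mu and beta are fitted too low and the 100-year return level is underestimated.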
Abstract:
Chikungunya virus (CHIKV) is a mosquito-borne pathogen that emerged in Brazil by late 2014. In the country, two CHIKV foci, characterized by the East/Central/South African and Asian genotypes, were established in the North and Northeast regions. We characterized, by phylogenetic analyses of full and partial genomes, CHIKV from Rio de Janeiro state (2014-2015). These CHIKV strains belong to the Asian genotype, the genotype that defines the current Northern Brazilian focus, even though their genome sequences present particular single-nucleotide variations. This study provides the first genetic characterisation of CHIKV in Rio de Janeiro and highlights the potential impact of human mobility on the spread of an arthropod-borne virus.
Abstract:
A problem in the archaeometric classification of Catalan Renaissance pottery is that the clay supply of the pottery workshops was centrally organized by guilds, so that usually all potters of a single production centre produced chemically similar ceramics. However, when the glazes of the ware are analysed, a large number of inclusions is usually found in the glaze, revealing technological differences between individual workshops. These inclusions were used by the potters to opacify the transparent glaze and to achieve a white background for further decoration.

In order to distinguish the different technological preparation procedures of the individual workshops, the chemical composition of these inclusions, as well as their size in the two-dimensional cut, is recorded with a scanning electron microscope. Based on the latter, a frequency distribution of the apparent diameters is estimated for each sample and type of inclusion.

Following an approach by S.D. Wicksell (1925), it is in principle possible to transform the distributions of the apparent 2D diameters back to those of the true three-dimensional bodies. The applicability of this approach and its practical problems are examined using different ways of kernel density estimation and Monte-Carlo tests of the methodology. Finally, it is tested to what extent the obtained frequency distributions can be used to classify the pottery.
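The Wicksell (1925) corpuscle problem referred to above can be illustrated by Monte-Carlo simulation in the forward direction: a sphere of true diameter D cut by a random plane at distance z from its centre shows an apparent diameter d = sqrt(D^2 - 4z^2), so observed 2D diameters are systematically smaller than the true 3D ones (for a monodisperse population, E[d] = pi*D/4). The inclusion diameter and sample size below are illustrative, not measurements from the glazes:

```python
import math
import random

# Forward simulation of the Wicksell corpuscle problem: apparent 2D
# diameters of randomly sectioned spheres of one true diameter.

random.seed(3)

def apparent_diameter(true_d: float, rng=random) -> float:
    """Apparent diameter of a sphere cut by a uniform random plane.
    Given that the plane hits the sphere, the centre-to-plane distance
    z is uniform on [0, D/2], and d = sqrt(D^2 - 4*z^2)."""
    z = rng.uniform(0.0, true_d / 2.0)
    return math.sqrt(true_d ** 2 - 4.0 * z ** 2)

true_d = 10.0  # hypothetical inclusion diameter (micrometres)
samples = [apparent_diameter(true_d) for _ in range(100000)]
mean_apparent = sum(samples) / len(samples)
# Analytically, E[d] = (pi / 4) * D for a monodisperse population.
```

The inverse transformation, from the observed distribution of d back to the distribution of D, is the ill-posed step whose practical behaviour the kernel density estimates and Monte-Carlo tests in the text are designed to probe.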