916 results for quantitative data


Relevance: 30.00%

Publisher:

Abstract:

The HUPO Proteomics Standards Initiative has developed several standardized data formats to facilitate data sharing in mass spectrometry (MS)-based proteomics. These allow researchers to report their complete results in a unified way. However, at present, there is no format to describe the final qualitative and quantitative results for proteomics and metabolomics experiments in a simple tabular format. Many downstream analysis use cases are only concerned with the final results of an experiment and require an easily accessible format, compatible with tools such as Microsoft Excel or R. We developed the mzTab file format for MS-based proteomics and metabolomics results to meet this need. mzTab is intended as a lightweight supplement to the existing standard XML-based file formats (mzML, mzIdentML, mzQuantML), providing a comprehensive summary, similar in concept to the supplemental material of a scientific publication. mzTab files can contain protein, peptide, and small molecule identifications together with experimental metadata and basic quantitative information. The format is not intended to store the complete experimental evidence but provides mechanisms to report results at different levels of detail. These range from a simple summary of the final results to a representation of the results including the experimental design. This format is ideally suited to make MS-based proteomics and metabolomics results available to a wider biological community outside the field of MS. Several software tools for proteomics and metabolomics have already adopted the format as an output format. The comprehensive mzTab specification document and extensive additional documentation can be found online.
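Because mzTab is a plain tab-separated format, its sections can be pulled apart with generic tooling. The following is a minimal Python sketch (not part of the specification text above; the file name, the column handling and the accession lookup are illustrative assumptions) that splits a file into sections using the standard mzTab line prefixes.

```python
import csv
from collections import defaultdict

# Standard mzTab line prefixes: the first column identifies the section of each row.
SECTION_PREFIXES = {"MTD", "PRH", "PRT", "PEH", "PEP", "PSH", "PSM", "SMH", "SML", "COM"}

def read_mztab(path):
    """Split an mzTab file into sections keyed by line prefix (MTD, PRT, PEP, ...)."""
    sections = defaultdict(list)
    with open(path, newline="") as handle:
        for row in csv.reader(handle, delimiter="\t"):
            if not row or row[0] not in SECTION_PREFIXES:
                continue  # skip blank lines and anything unrecognised
            sections[row[0]].append(row[1:])
    return sections

# Hypothetical usage: list protein accessions from the protein table.
# sections = read_mztab("results.mzTab")
# header = sections["PRH"][0]              # protein table column names
# accession_idx = header.index("accession")
# accessions = [row[accession_idx] for row in sections["PRT"]]
```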

Relevance: 30.00%

Publisher:

Abstract:

This article describes the composition of fingermark residue as a complex system comprising numerous compounds from different sources, evolving over time from the initial composition (the composition immediately after deposition) to the aged composition (the initial composition as it evolves over time). This complex system additionally varies under the effects of numerous influence factors, grouped into five classes: donor characteristics, deposition conditions, substrate nature, environmental conditions and the applied enhancement techniques. The initial and aged compositions, as well as the influence factors, are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue to date. The analytical techniques used to obtain these data are also enumerated. This review highlights that, despite the numerous analytical procedures already proposed and tested to elucidate fingermark composition, detailed knowledge is still lacking. Thus, there is a real need for future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and the effects of influence factors. Such results are particularly important for advancing the development of fingermark enhancement and dating techniques.

Relevance: 30.00%

Publisher:

Abstract:

In this work we discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use the specific parameters of the model on the basis of the experience of its portfolio. In particular, this analysis focuses on the formula presented in the latest quantitative impact study (CEIOPS, 2010) for non-life underwriting premium and reserve risk. One of the keys of the standard model for premium and reserve risk is the correlation matrix between lines of business. In this work we present how the correlation matrix between lines of business can be estimated from a quantitative perspective, as well as the possibility of using a credibility model to estimate the correlation matrix between lines of business, merging the qualitative and quantitative perspectives.
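The abstract does not state the estimation formula. Purely as an illustration, the sketch below (Python; the loss-ratio history, the prior matrix and the credibility weight Z are invented) blends an empirical correlation matrix between lines of business with a prior, standard-formula-style matrix through a single credibility weight.

```python
import numpy as np

def credibility_correlation(loss_ratios, prior_corr, Z):
    """Blend an empirical correlation matrix with a prior matrix.

    loss_ratios : array of shape (n_years, n_lines), one column per line of business
    prior_corr  : prior correlation matrix (e.g. a standard-formula-style matrix)
    Z           : credibility weight in [0, 1] given to the portfolio's own data
    """
    empirical = np.corrcoef(loss_ratios, rowvar=False)
    return Z * empirical + (1.0 - Z) * prior_corr

# Hypothetical example with three lines of business and ten accident years.
rng = np.random.default_rng(0)
history = rng.normal(loc=0.75, scale=0.05, size=(10, 3))   # simulated loss ratios
prior = np.array([[1.00, 0.50, 0.25],
                  [0.50, 1.00, 0.25],
                  [0.25, 0.25, 1.00]])
blended = credibility_correlation(history, prior, Z=0.4)
```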

Relevance: 30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
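The snippet below is not the authors' algorithm, only a minimal Python analogue of its central step: a bivariate Gaussian kernel density is fitted to collocated log electrical and hydraulic conductivities, and hydraulic conductivity is then sampled conditional on an electrical-conductivity value available where no hydraulic data exist. The variable names, synthetic data and grid settings are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def conditional_sample(log_sigma_obs, kde, k_grid, rng):
    """Draw log-K from p(log K | log sigma) defined by a bivariate KDE."""
    pairs = np.vstack([np.full_like(k_grid, log_sigma_obs), k_grid])
    density = kde(pairs)
    density /= density.sum()                 # discretised conditional pdf over k_grid
    return rng.choice(k_grid, p=density)

rng = np.random.default_rng(42)

# Collocated borehole data: log electrical conductivity and log hydraulic conductivity.
log_sigma = rng.normal(-2.0, 0.3, size=200)
log_k = 1.5 * log_sigma + rng.normal(0.0, 0.2, size=200)   # synthetic petrophysical relation

kde = gaussian_kde(np.vstack([log_sigma, log_k]))          # non-parametric joint density
k_grid = np.linspace(log_k.min() - 1, log_k.max() + 1, 500)

# Simulate log-K at a location where only an (e.g. ERT-derived) sigma estimate exists.
simulated_log_k = conditional_sample(-2.1, kde, k_grid, rng)
```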

Relevance: 30.00%

Publisher:

Abstract:

Background: Gene expression analysis has emerged as a major biological research area, with real-time quantitative reverse transcription PCR (RT-QPCR) being one of the most accurate and widely used techniques for expression profiling of selected genes. In order to obtain results that are comparable across assays, a stable normalization strategy is required. In general, the normalization of PCR measurements between different samples uses one to several control genes (e.g. housekeeping genes), from which a baseline reference level is constructed. Thus, the choice of the control genes is of utmost importance, yet there is no generally accepted standard technique for screening a large number of candidates and identifying the best ones. Results: We propose a novel approach for scoring and ranking candidate genes for their suitability as control genes. Our approach relies on publicly available microarray data and allows the combination of multiple data sets originating from different platforms and/or representing different pathologies. The use of microarray data allows the screening of tens of thousands of genes, producing very comprehensive lists of candidates. We also provide two lists of candidate control genes: one which is breast cancer-specific and one with more general applicability. Two genes from the breast cancer list which had not previously been used as control genes are identified and validated by RT-QPCR. Open source R functions are available at http://www.isrec.isb-sib.ch/~vpopovic/research/. Conclusion: We have proposed a new method for identifying candidate control genes for RT-QPCR that ranks thousands of genes according to predefined suitability criteria, and we applied it to the case of breast cancer. We also showed empirically that translating the results from the microarray to the PCR platform is achievable.
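The abstract does not spell out the suitability criteria. Purely as an illustration of ranking control-gene candidates from microarray data, the sketch below (Python; the expression matrix, threshold and scoring rule are hypothetical) favours genes with low variability across samples and an adequate expression level.

```python
import numpy as np
import pandas as pd

def rank_control_gene_candidates(expr, min_mean=6.0, top_n=20):
    """Rank genes as control-gene candidates from a log2 expression matrix.

    expr : DataFrame, rows = genes, columns = samples (log2 intensities).
    Genes with low variability across samples and sufficient expression rank highest.
    """
    mean = expr.mean(axis=1)
    cv = expr.std(axis=1) / mean                                   # coefficient of variation
    candidates = pd.DataFrame({"mean_expr": mean, "cv": cv})
    candidates = candidates[candidates["mean_expr"] >= min_mean]   # drop weakly expressed genes
    return candidates.sort_values("cv").head(top_n)

# Hypothetical usage with a random expression matrix (1000 genes x 40 samples).
rng = np.random.default_rng(1)
expr = pd.DataFrame(rng.normal(8.0, 1.0, size=(1000, 40)),
                    index=[f"gene_{i}" for i in range(1000)])
print(rank_control_gene_candidates(expr))
```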

Relevance: 30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field. We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.

Relevance: 30.00%

Publisher:

Abstract:

Introduction: In the mid-1990s, the discovery of endogenous ligands for cannabinoid receptors opened a new era in this research field. Amides and esters of arachidonic acid have been identified as these endogenous ligands. Arachidonoylethanolamide (anandamide, AEA) and 2-arachidonoylglycerol (2-AG) appear to be the most important of these lipid messengers. In addition, virodhamine (VA), noladin ether (2-AGE) and N-arachidonoyl dopamine (NADA) have been shown to bind to CB receptors with varying affinities. In recent years, it has become increasingly evident that the endocannabinoid (EC) system is part of fundamental regulatory mechanisms in many physiological processes, such as stress and anxiety responses, depression, anorexia and bulimia, schizophrenia disorders, neuroprotection, Parkinson's disease, anti-proliferative effects on cancer cells, drug addiction, and atherosclerosis. Aims: This work presents the challenges of EC analysis and the contribution of information-dependent acquisition based on a hybrid triple quadrupole-linear ion trap (QqQLIT) system for the profiling of these lipid mediators. Methods: The method was developed on an LC Ultimate 3000 series (Dionex, Sunnyvale, CA, USA) coupled to a QTrap 4000 system (Applied Biosystems, Concord, ON, Canada). The ECs were separated on an XTerra C18 MS column (50 × 3.0 mm i.d., 3.5 μm) with a 5 min gradient elution. For confirmatory analysis, an information-dependent acquisition experiment was performed with selected reaction monitoring (SRM) as the survey scan and enhanced product ion (EPI) as the dependent scan. Results: The assay was found to be linear in the concentration ranges of 0.1-5 ng/mL for AEA, 0.3-5 ng/mL for VA, 2-AGE and NADA, and 1-20 ng/mL for 2-AG, using 0.5 mL of plasma. Repeatability and intermediate precision were below 15% over the tested concentration ranges. Under non-pathological conditions, only AEA and 2-AG were actually detected in plasma, with concentrations ranging from 104 to 537 pg/mL and from 2160 to 3990 pg/mL, respectively. We have focused particularly on the evaluation of EC level changes in biological matrices during drug addiction and atherosclerosis processes. We will present preliminary data obtained during a pilot study after administration of cannabis to human patients. Conclusion: ECs have been shown to play a key role in the regulation of many pathophysiological processes. Medical research in these different fields continues to grow in order to understand and highlight the predominant role of ECs in CNS and peripheral tissue signalling. Profiling these lipids requires the development of rapid, highly sensitive and selective analytical methods.
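Quantification in such assays ultimately rests on a calibration curve. The sketch below (Python; the calibrator levels, responses and the sample response are invented, and the validated assay described above may use a different regression or weighting) fits a linear calibration line for AEA and back-calculates a plasma concentration.

```python
import numpy as np

# Hypothetical AEA calibration levels (ng/mL) and instrument peak-area ratios.
conc = np.array([0.1, 0.25, 0.5, 1.0, 2.5, 5.0])
response = np.array([0.021, 0.049, 0.103, 0.198, 0.507, 0.995])

# Ordinary least-squares calibration line: response = slope * conc + intercept.
slope, intercept = np.polyfit(conc, response, deg=1)

def back_calculate(peak_area_ratio):
    """Convert a measured response into a concentration via the calibration line."""
    return (peak_area_ratio - intercept) / slope

# A plasma sample giving a response of 0.075 back-calculates to this many ng/mL.
print(round(back_calculate(0.075), 3))
```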

Relevance: 30.00%

Publisher:

Abstract:

Immune dysregulation, polyendocrinopathy, enteropathy, X-linked (IPEX) syndrome is a unique example of primary immunodeficiency characterized by autoimmune manifestations due to defective regulatory T (Treg) cells, in the presence of FOXP3 mutations. However, autoimmune symptoms phenotypically resembling IPEX often occur in the absence of detectable FOXP3 mutations. The cause of this "IPEX-like" syndrome presently remains unclear. To investigate whether a defect in Treg cells sustains the immunological dysregulation in IPEX-like patients, we measured the amount of peripheral Treg cells within the CD3(+) T cells by analysing demethylation of the Treg cell-specific demethylated region (TSDR) in the FOXP3 locus and demethylation of the T lymphocyte-specific demethylated region (TLSDR) in the CD3 locus, highly specific markers for stable Treg cells and overall T cells, respectively. TSDR demethylation analysis, alone or normalized to the total T cells, showed that the amount of peripheral Treg cells in a cohort of IPEX-like patients was significantly reduced compared to both healthy subjects and unrelated disease controls. This reduction could not be detected by flow cytometric analysis, which showed highly variable percentages of FOXP3(+) and CD25(+)FOXP3(+) T cells. These data provide evidence that a quantitative defect of Treg cells could be considered a common biological hallmark of IPEX-like syndrome. Since Treg cell suppressive function was not impaired, we propose that this reduction per se could sustain autoimmunity.
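Normalizing TSDR demethylation to total T cells amounts, in essence, to a ratio of demethylated copies. A minimal illustration follows (Python; all numbers are invented, and the actual assay readouts and corrections may differ).

```python
def treg_fraction(tsdr_demethylated, tlsdr_demethylated):
    """Fraction of stable Treg cells among total T cells, from demethylation assays.

    tsdr_demethylated  : demethylated FOXP3 TSDR copies (marks stable Treg cells)
    tlsdr_demethylated : demethylated CD3 TLSDR copies (marks all T cells)
    """
    return 100.0 * tsdr_demethylated / tlsdr_demethylated

# Invented example values: a patient sample versus a healthy-control sample.
print(treg_fraction(120, 10_000))   # about 1.2% of T cells
print(treg_fraction(450, 10_000))   # about 4.5% of T cells
```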

Relevance: 30.00%

Publisher:

Abstract:

Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 M single nucleotide polymorphisms and 5000 individuals.
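The paper's own approach is exact maximum-likelihood estimation with a mixture model; the sketch below instead shows the simpler, commonly used dosage-based regression against which such methods are compared, with invented genotype probabilities and trait values.

```python
import numpy as np
from scipy import stats

def dosage_association(genotype_probs, phenotype):
    """Test association between an uncertain genotype and a quantitative trait.

    genotype_probs : array (n_individuals, 3) of imputation probabilities for 0/1/2 copies
    phenotype      : array (n_individuals,) of trait values

    Regresses the trait on the expected allele dosage; returns slope and p-value.
    """
    dosage = genotype_probs @ np.array([0.0, 1.0, 2.0])   # expected allele count
    result = stats.linregress(dosage, phenotype)
    return result.slope, result.pvalue

# Invented data: 500 individuals, a weak true effect of the imputed allele.
rng = np.random.default_rng(7)
probs = rng.dirichlet([5.0, 3.0, 1.0], size=500)          # uncertain genotype calls
true_dosage = np.array([rng.choice(3, p=p) for p in probs])
trait = 0.2 * true_dosage + rng.normal(size=500)
print(dosage_association(probs, trait))
```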

Relevance: 30.00%

Publisher:

Abstract:

Quantitative ultrasound of bone is a promising method for bone assessment: radiation-free, portable and predictive of hip fracture. Its portability allowed us to study the relationships of ultrasonic parameters of bone with age and with non-vertebral fractures in elderly women living in 19 nursing homes. Broadband ultrasound attenuation (BUA) and speed of sound (SOS) of the calcaneus were measured (and the stiffness index calculated) in a sample of 270 institutionalized women, aged 85 ± 7 years, using an Achilles bone densitometer (Lunar). The effects of age, history of non-vertebral and non-traumatic fractures, body mass index, triceps skinfold and arm circumference on BUA, SOS and stiffness index were assessed. Furthermore, to evaluate longitudinally the influence of aging on the ultrasound parameters of bone, 60 subjects from the same group had a second ultrasound measurement after 1 year. The cross-sectional analysis of the data on all 270 women showed a significant decrease (p < 0.001) with age in BUA, SOS and stiffness index (-0.47%, -0.06% and -1.01% per year, respectively). In the 94 women (35%) with a history of previous non-vertebral fractures, ultrasound parameters were significantly lower (p < 0.0001) than in the 176 women with no history of fracture (-8.3% for BUA, -1.3% for SOS, -18.9% for stiffness index). In contrast, there was no significant difference in anthropometric measurements between the groups with and without previous non-vertebral fractures, although these measurements decreased significantly with age. In the longitudinal study, repeated quantitative ultrasound after 11.4 ± 0.8 months showed no significant decrease in BUA (-1%) but significant decreases in SOS (-0.3%, p < 0.0001) and in stiffness index (-3.6%, p < 0.0002). In conclusion, quantitative ultrasound of the calcaneus measures properties of bone which continue to decline in institutionalized elderly women, and is able to discriminate women with previous non-vertebral fractures.
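The abstract does not give the stiffness-index formula. A commonly cited expression for the Lunar Achilles device combines BUA (dB/MHz) and SOS (m/s) as below; it is included only as an illustrative assumption and should be checked against the device documentation.

```python
def stiffness_index(bua_db_per_mhz, sos_m_per_s):
    """Commonly cited Lunar Achilles stiffness index (illustrative; verify against the manual)."""
    return 0.67 * bua_db_per_mhz + 0.28 * sos_m_per_s - 420.0

# Illustrative calcaneal values for an elderly subject (invented numbers).
print(stiffness_index(95.0, 1500.0))
```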

Relevance: 30.00%

Publisher:

Abstract:

The dentate gyrus is one of only two regions of the mammalian brain where substantial neurogenesis occurs postnatally. However, detailed quantitative information about the postnatal structural maturation of the primate dentate gyrus is meager. We performed design-based, stereological studies of neuron number and size, and volume of the dentate gyrus layers in rhesus macaque monkeys (Macaca mulatta) of different postnatal ages. We found that about 40% of the total number of granule cells observed in mature 5-10-year-old macaque monkeys are added to the granule cell layer postnatally; 25% of these neurons are added within the first three postnatal months. Accordingly, cell proliferation and neurogenesis within the dentate gyrus peak within the first 3 months after birth and remain at an intermediate level between 3 months and at least 1 year of age. Although granule cell bodies undergo their largest increase in size during the first year of life, cell size and the volume of the three layers of the dentate gyrus (i.e. the molecular, granule cell and polymorphic layers) continue to increase beyond 1 year of age. Moreover, the different layers of the dentate gyrus exhibit distinct volumetric changes during postnatal development. Finally, we observe significant levels of cell proliferation, neurogenesis and cell death in the context of an overall stable number of granule cells in mature 5-10-year-old monkeys. These data identify an extended developmental period during which neurogenesis might be modulated to significantly impact the structure and function of the dentate gyrus in adulthood.
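The abstract states that the counts were design-based and stereological but does not name the estimator. The optical fractionator is the standard design-based estimator of total neuron number; the sketch below shows its form with invented sampling fractions and counts, purely for illustration.

```python
def optical_fractionator(counted_cells, ssf, asf, tsf):
    """Design-based estimate of total neuron number (optical fractionator).

    counted_cells : total cells counted in the disectors (sum of Q-)
    ssf : section sampling fraction (sections sampled / total sections)
    asf : area sampling fraction (counting-frame area / x-y step area)
    tsf : thickness sampling fraction (disector height / mean section thickness)
    """
    return counted_cells * (1.0 / ssf) * (1.0 / asf) * (1.0 / tsf)

# Invented sampling scheme: every 10th section, 5% of the area, 60% of the thickness.
print(f"{optical_fractionator(1500, ssf=0.1, asf=0.05, tsf=0.6):.2e}")
```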

Relevance: 30.00%

Publisher:

Abstract:

In this study, we quantitatively investigated the expression of the beta-site amyloid precursor protein cleaving enzyme (BACE) in the entorhinohippocampal and frontal cortex of Alzheimer's disease (AD) and old control subjects. The semiquantitative estimation indicated that the intensity of overall BACE immunoreactivity did not differ significantly between AD and controls, but that significantly stronger staining was observed in the hippocampal regions CA3-4 compared to other regions in both AD patients and controls. The quantitative estimation confirmed that the number of BACE-positive neuronal profiles was not significantly decreased in AD. However, some degeneration of BACE-positive profiles was indicated by the colocalization of neurons expressing BACE and exhibiting neurofibrillary tangles (NFT), as well as by a decrease in the surface area of BACE-positive profiles. In addition, BACE immunocytochemical expression was observed in and around senile plaques (SP), as well as in reactive astrocytes. BACE-immunoreactive astrocytes were localized in the vicinity of or close to the plaques, and their number was significantly increased in the AD entorhinal cortex. The higher amount of beta-amyloid SP and NFT in AD was not correlated with an increase in BACE immunoreactivity. Taken together, these data underscore that AD progression does not require an increased neuronal BACE protein level, but suggest an active role of BACE in immunoreactive astrocytes. Moreover, the strong expression in controls and in regions less vulnerable to AD points to the probable existence of alternative BACE functions.

Relevance: 30.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject in which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to formulations involving stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck equation and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigour but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. For this reason we place emphasis on calibrating the strategies' parameters to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts in market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
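As a small illustration of the mean-reversion side of the project (a Python sketch, not the thesis's MATLAB code; the spread, its parameters and the entry/exit thresholds are invented), the snippet below calibrates an Ornstein-Uhlenbeck model to a spread via an AR(1) regression and derives a simple z-score trading signal.

```python
import numpy as np

def fit_ou(spread, dt=1.0):
    """Fit a mean-reverting Ornstein-Uhlenbeck process to a spread via AR(1) regression.

    dS = theta * (mu - S) dt + sigma dW is discretised as S[t+1] = a + b * S[t] + noise,
    so b = exp(-theta * dt) and a = mu * (1 - b).
    """
    x, y = spread[:-1], spread[1:]
    b, a = np.polyfit(x, y, deg=1)
    theta = -np.log(b) / dt
    mu = a / (1.0 - b)
    resid = y - (a + b * x)
    sigma_eq = resid.std() / np.sqrt(1.0 - b ** 2)   # stationary std of the spread
    return mu, theta, sigma_eq

def zscore_signal(spread, mu, sigma_eq, entry=2.0, exit=0.5):
    """Very simple mean-reversion rule: short a rich spread, long a cheap one, flat near the mean."""
    z = (spread - mu) / sigma_eq
    position = np.zeros_like(z)
    position[z > entry] = -1.0      # spread too high -> expect reversion down
    position[z < -entry] = 1.0      # spread too low  -> expect reversion up
    position[np.abs(z) < exit] = 0.0
    return position

# Synthetic OU spread for illustration (invented parameters, 1000 one-minute steps).
rng = np.random.default_rng(3)
s = np.empty(1000)
s[0] = 0.0
for t in range(999):
    s[t + 1] = s[t] + 0.05 * (0.0 - s[t]) + 0.1 * rng.normal()

mu, theta, sigma_eq = fit_ou(s)
signal = zscore_signal(s, mu, sigma_eq)
```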

Relevance: 30.00%

Publisher:

Abstract:

The quantitative estimation of Sea Surface Temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern core-top samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern core-top datasets are characterised by a large number of zeros. The zero replacement was carried out by adopting a Bayesian approach, based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which the SST is reconstructed was determined by means of a multiple approach, considering the proxies correlation matrix, the standardized residual sum of squares and the mean squared distance. This new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.
Keywords: modern analogues, Aitchison distance, proxies correlation matrix, standardized residual sum of squares
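The Aitchison metric mentioned above is the Euclidean distance between centred log-ratio (clr) transformed compositions. The sketch below (Python; the counts are invented, and a crude multiplicative zero replacement stands in for the Bayesian multinomial procedure described in the abstract) computes it for a fossil sample and one modern analogue.

```python
import numpy as np

def replace_zeros(counts, pseudo=0.5):
    """Crude multiplicative zero replacement (stand-in for the Bayesian-multinomial approach)."""
    x = np.where(counts == 0, pseudo, counts).astype(float)
    return x / x.sum()

def clr(composition):
    """Centred log-ratio transform of a closed composition."""
    logx = np.log(composition)
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr-transformed compositions."""
    return np.linalg.norm(clr(x) - clr(y))

# Invented foraminiferal counts for a fossil sample and one modern core-top analogue.
fossil = replace_zeros(np.array([120, 30, 0, 8, 2]))
modern = replace_zeros(np.array([100, 45, 5, 10, 0]))
print(aitchison_distance(fossil, modern))
```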