970 results for DATASETS
Abstract:
Background: Co-localisation is a widely used measurement in immunohistochemical analysis to determine whether fluorescently labelled biological entities, such as cells, proteins or molecules, share the same location. However, measuring co-localisation is challenging because of the complex nature of such fluorescent images, especially when multiple focal planes are captured. Current state-of-the-art co-localisation measurements of 3-dimensional (3D) image stacks are biased by noise and by cross-overs from non-consecutive planes.
Method: In this study, we developed Co-localisation Intensity Coefficients (CICs) and Co-localisation Binary Coefficients (CBCs), which use rich z-stack data from neighbouring focal planes to identify similarities between the image intensities of two, and potentially more, fluorescently labelled biological entities. The method was developed using z-stack images from murine organotypic slice cultures of central nervous system tissue and two sets of pseudo-data. A large number of non-specific cross-over situations are excluded by this method, which is also shown to be robust in recognising co-localisations even when images are polluted with a range of noise types.
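As a rough illustration of the binary variant of this idea (not the authors' exact formulation), the sketch below flags a voxel as co-localised only if both channels exceed their intensity thresholds in the same focal plane and in at least one neighbouring plane of the z-stack; the thresholds, array layout and the specific neighbouring-plane rule are assumptions made for illustration.

import numpy as np

def cbc_sketch(stack_a, stack_b, thresh_a, thresh_b):
    """Toy co-localisation binary coefficient over a z-stack.

    stack_a, stack_b: 3-D arrays (z, y, x) for two fluorescence channels.
    A voxel counts as co-localised only if both channels exceed their
    thresholds in the same plane AND in at least one neighbouring focal
    plane, which mimics (but does not reproduce) the paper's use of
    adjacent planes to reject non-specific cross-overs.
    """
    a = stack_a > thresh_a
    b = stack_b > thresh_b
    both = a & b
    # require support from a neighbouring focal plane
    support = np.zeros_like(both)
    support[1:] |= both[:-1]
    support[:-1] |= both[1:]
    confirmed = both & support
    # fraction of above-threshold overlap supported by a neighbouring plane
    return confirmed.sum() / max(both.sum(), 1)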
Results: The proposed CBCs and CICs produce robust co-localisation measurements that are easy to interpret, resilient to noise and capable of removing a large amount of false positivity, such as non-specific cross-overs. This measurement method is significantly more accurate than existing measurements, as determined statistically using pseudo-datasets of known values. The method provides an important and reliable tool for fluorescent 3D neurobiological studies and will benefit other biological studies that measure fluorescence co-localisation in 3D.
Abstract:
The debate about the complex issues of human development during the Middle to Upper Palaeolithic transition period (45-35 ka BP) has been hampered by concerns about the reliability of the radiocarbon dating method. Large C-14 anomalies were postulated and radiocarbon dating was considered flawed. We show here that these issues are no longer relevant, because the postulated large anomalies would be artefacts whose magnitudes lie beyond plausible physical limits. Previous inconsistencies between C-14 datasets have been resolved, and a new radiocarbon calibration curve, IntCal09 (Reimer et al., 2009), has been created. Improved procedures for bone collagen extraction and charcoal pre-treatment generally result in older ages, consistent with independently dated time markers. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
This paper responds to recent calls for more academic research and critical discussion on the relationship between spatial planning and city branding. Through the lens of Liverpool, the article analyses how key planning projects have delivered major transformations in the city's built environment and cultural landscape. More specifically, in concentrating on the performative nature of spatial planning, it reveals the physical, symbolic and discursive re-imaging of Liverpool into a 'world class city'. The paper also presents important socioeconomic datasets and offers a critical reading of the re-branding, showing how it presents an inaccurate representation of Liverpool. The evidence provided indicates that a more accurate label for Liverpool is a polarised and divided city, thereby questioning the fictive spectacle of city branding. Finally, the paper ends with some critical commentary on the role of spatial planning as an accessory to the sophistry of city branding.
Abstract:
A novel non-linear dimensionality reduction method, called Temporal Laplacian Eigenmaps, is introduced to process time series data efficiently. In this embedding-based approach, temporal information is intrinsic to the objective function, which produces descriptions of low-dimensional spaces with time coherence between data points. Since the proposed scheme also includes bidirectional mapping between the data and embedded spaces and automatic tuning of key parameters, it offers the same benefits as mapping-based approaches. Experiments on a couple of computer vision applications demonstrate the superiority of the new approach over other dimensionality reduction methods in terms of accuracy. Moreover, its lower computational cost and generalisation abilities suggest it is scalable to larger datasets. © 2010 IEEE.
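A minimal sketch of the general idea, assuming a standard Laplacian Eigenmaps formulation with extra graph edges linking consecutive time samples; the authors' actual objective function, bidirectional mapping and automatic parameter tuning are not reproduced, and the neighbourhood size, temporal weight and embedding dimension below are illustrative choices.

import numpy as np
from scipy.linalg import eigh

def temporal_le_sketch(X, k=5, lam=1.0, d=2):
    """X: (n_frames, n_features) time-ordered data.
    Builds a k-nearest-neighbour graph, adds temporal edges between
    consecutive frames weighted by lam, and embeds via the generalised
    eigenproblem L v = w D v of the graph Laplacian."""
    n = len(X)
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(dist[i])[1:k + 1]:   # skip self
            W[i, j] = W[j, i] = 1.0
    for i in range(n - 1):                        # temporal coherence edges
        W[i, i + 1] = W[i + 1, i] = max(W[i, i + 1], lam)
    D = np.diag(W.sum(1))
    L = D - W                                     # graph Laplacian
    vals, vecs = eigh(L, D)                       # generalised eigenproblem
    return vecs[:, 1:d + 1]                       # drop the trivial constant vector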
Abstract:
We conducted data-mining analyses of genome-wide association (GWA) studies of the CATIE and MGS-GAIN datasets, and found 13 markers in the two physically linked genes, PTPN21 and EML5, showing nominally significant association with schizophrenia. Linkage disequilibrium (LD) analysis indicated that all 7 markers from PTPN21 shared high LD (r² > 0.8), including rs2274736 and rs2401751, the two non-synonymous markers with the most significant association signals (rs2401751, P = 1.10 × 10⁻³ and rs2274736, P = 1.21 × 10⁻³). In a meta-analysis of all 13 replication datasets with a total of 13,940 subjects, we found that the two non-synonymous markers are significantly associated with schizophrenia (rs2274736, OR = 0.92, 95% CI: 0.86-0.97, P = 5.45 × 10⁻³ and rs2401751, OR = 0.92, 95% CI: 0.86-0.97, P = 5.29 × 10⁻³). One SNP (rs7147796) in EML5 is also significantly associated with the disease (OR = 1.08, 95% CI: 1.02-1.14, P = 6.43 × 10⁻³). These 3 markers remain significant after Bonferroni correction. Furthermore, haplotype-conditioned analyses indicated that the association signals observed for rs2274736/rs2401751 and rs7147796 are statistically independent. Given that 2 non-synonymous markers in PTPN21 are associated with schizophrenia, further investigation of this locus is warranted.
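For illustration, pooled odds ratios of the kind reported above can be obtained with a standard inverse-variance fixed-effect meta-analysis on the log scale; the abstract does not state which meta-analysis model was used, so the sketch below is generic and its inputs are placeholders rather than the study's per-dataset estimates.

import numpy as np

def fixed_effect_meta(or_list, ci_low, ci_high):
    """Inverse-variance fixed-effect pooling of odds ratios given their
    95% confidence intervals. Returns the pooled OR and its 95% CI."""
    log_or = np.log(or_list)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from CI width
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * pooled_se),
            np.exp(pooled + 1.96 * pooled_se))

# e.g. pooled_or, lo, hi = fixed_effect_meta([0.90, 0.95], [0.82, 0.88], [0.99, 1.03])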
Abstract:
Barr and Clark published a series of maps depicting the distribution of end moraines across Far NE Russia. These moraines outlined the former distribution and dimensions of glaciers, and were identified through the analysis of Landsat ETM+ satellite images (15- and 30-m resolution). A number of freely available digital elevation model (DEM) datasets now cover the entire 4 million km² of Far NE Russia, including the 30-m resolution ASTER GDEM and the 90-m resolution Viewfinder Panorama DEM. Here we use these datasets, in conjunction with Landsat ETM+ images, to complete the process of systematically and comprehensively mapping end moraines. With the aid of the DEMs described above, we present a total dataset of 8414 moraines, which almost quadruples the inventory of Barr and Clark. This increase in the number of moraines is considered to reflect the utility of the DEMs for mapping glacial landforms. In terms of moraine distribution, the Barr and Clark map and the one presented here are comparable, with moraines found to cluster in highland regions and upon adjacent lowlands, attesting to the former occupation of the region by mountain-centred ice masses. This record is considered to reflect palaeoclimatic and topographic controls upon the extent and dynamics of palaeoglaciers, as well as spatial variability in moraine preservation.
Abstract:
As with all aspects of public management, the control, financing, and regulation of state-owned enterprises (SOEs) are matters subject to changing international trends and domestic political imperatives. The effects of the global financial crisis (GFC) on the ownership, financing, and role of SOEs are still unfolding, but undoubtedly will be heavily influenced by a new era of public sector reforms principally designed to reassert central political controls, as well as by fiscal pressures to balance state budgets. In this regard, the Irish experience is instructive, with the findings from two datasets being used here to examine various modes of state enterprise control and their corresponding autonomy. Significantly, there has been considerable variety within and across the SOE sector, demonstrating the need for more detailed understanding of how SOEs are managed. © 2011 Taylor & Francis.
Abstract:
Medical geology research has recognised a number of potentially toxic elements (PTEs), such as arsenic, cobalt, chromium, copper, nickel, lead, vanadium, uranium and zinc, known to influence human disease through their respective deficiency or toxicity. As the impact of infectious diseases has decreased and the population ages, cancer has become the most common cause of death in developed countries, including Northern Ireland. This research explores the relationship between environmental exposure to potentially toxic elements in soil and cancer disease data across Northern Ireland. The incidence of twelve different cancer types (lung, stomach, leukaemia, oesophagus, colorectal, bladder, kidney, breast, mesothelioma, melanoma, and non-melanoma (NM) basal and squamous) was examined in the form of twenty-five coded datasets comprising aggregates over the 12-year period from 1993 to 2006. A local modelling technique, geographically weighted regression (GWR), is used to explore the relationship between environmental exposure and cancer disease data. The results show comparisons of the geographical incidence of certain cancers (stomach and NM squamous skin cancer) in relation to concentrations of certain PTEs; arsenic levels in soils and radon were identified as relevant. Findings from the research have implications for regional human health risk assessments.
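The core of GWR is a separate weighted least-squares fit at each location, with weights that decay with distance from that location. A minimal sketch is given below, assuming a Gaussian kernel and a fixed bandwidth; the bandwidth calibration and diagnostics used in the study are omitted.

import numpy as np

def gwr_sketch(coords, X, y, bandwidth):
    """Minimal geographically weighted regression.
    coords: (n, 2) locations; X: (n, p) covariates; y: (n,) response.
    Returns an (n, p+1) array of local coefficients (intercept first)."""
    n, p = X.shape
    Xd = np.hstack([np.ones((n, 1)), X])          # add intercept column
    betas = np.empty((n, p + 1))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian distance-decay weights
        W = np.diag(w)
        betas[i] = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
    return betas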
Abstract:
The Hippo pathway restricts the activity of the transcriptional coactivators TAZ (WWTR1) and YAP. TAZ and YAP are reported to be overexpressed in various cancers; however, their prognostic significance in colorectal cancer remains unstudied. The expression levels of TAZ and YAP, and of their downstream transcriptional targets AXL and CTGF, were extracted from two independent colon cancer patient datasets available in the Gene Expression Omnibus database, totaling 522 patients. We found that mRNA expression of both TAZ and YAP was positively correlated with that of AXL and CTGF (p<0.05). High-level mRNA expression of TAZ, AXL or CTGF significantly correlated with shorter survival. Importantly, patients co-overexpressing all 3 genes had a significantly shorter survival time, and combinatorial expression of these 3 genes was an independent predictor of survival. The downstream target genes for TAZ-AXL-CTGF overexpression were identified using the Java application MyStats. Interestingly, genes that are associated with colon cancer progression (ANTXR1, EFEMP2, SULF1, TAGLN, VCAN, ZEB1 and ZEB2) were upregulated in patients co-overexpressing TAZ-AXL-CTGF. This TAZ-AXL-CTGF gene expression signature (GES) was then applied to Connectivity Map to identify small molecules that could potentially be utilized to reverse the GES. Of the top 20 small molecules identified by Connectivity Map, amiloride (a potassium-sparing diuretic) and tretinoin (all-trans retinoic acid) have shown therapeutic promise in inhibiting colon cancer cell growth. Using MyStats, we found that low-level expression of either ANO1 or SQLE was associated with a better prognosis in patients who co-overexpressed TAZ-AXL-CTGF, and that ANO1 was an independent predictor of survival together with TAZ-AXL-CTGF. Finally, we confirmed that TAZ regulates Axl and plays an important role in clonogenicity and non-adherent growth in vitro and tumor formation in vivo. These data suggest that TAZ could be a therapeutic target for the treatment of colon cancer.
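As a simple illustration of how such a co-overexpression signature might be screened in an expression matrix (not the statistics actually computed with MyStats or Connectivity Map), one can correlate TAZ with its putative targets and flag samples that lie above the median for all three genes; the dictionary-of-arrays data layout is an assumption.

import numpy as np
from scipy.stats import pearsonr

def coexpression_groups(expr, genes=("TAZ", "AXL", "CTGF")):
    """expr: dict mapping gene name -> 1-D array of expression values,
    one value per patient. Returns target correlations and a boolean
    mask of patients above the median for all signature genes."""
    r_axl, _ = pearsonr(expr["TAZ"], expr["AXL"])
    r_ctgf, _ = pearsonr(expr["TAZ"], expr["CTGF"])
    co_high = np.all([expr[g] > np.median(expr[g]) for g in genes], axis=0)
    return {"r_TAZ_AXL": r_axl, "r_TAZ_CTGF": r_ctgf, "co_high": co_high}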
Abstract:
The cytogenetically normal subtype of acute myeloid leukemia (CN-AML) is associated with intermediate risk, which complicates therapeutic options. Lower overall HOX/TALE expression appears to correlate with a more favorable prognosis and better response to treatment in some leukemias and solid cancers. The functional significance of the associated gene expression and response to chemotherapy is not known. Three independent microarray datasets obtained from large patient cohorts, along with quantitative PCR validation, were used to identify a four-gene HOXA/TALE signature capable of prognostic stratification. Biochemical analysis was used to identify interactions between the four encoded proteins, and targeted knockdown was used to examine the functional importance of sustained expression of the signature in leukemia maintenance and response to chemotherapy. An eleven-gene HOXA/TALE code identified in an Intermediate-risk group of patients (n=315) compared with a Favorable group (n=105) was reduced to a four-gene signature of HOXA6, HOXA9, PBX3 and MEIS1 by iterative analysis of independent platforms. This signature maintained the Favorable/Intermediate risk partition and, where applicable, correlated with overall survival in CN-AML. We further show that cell growth and function are dependent on maintained levels of these core genes and that direct targeting of HOXA/PBX3 sensitizes CN-AML cells to standard chemotherapy. Together the data support a key role for HOXA/TALE in CN-AML and demonstrate that targeting clinically significant HOXA/PBX3 elements may provide therapeutic benefit to these patients.
Abstract:
The number of well-dated pollen diagrams in Europe has increased considerably over the last 30 years and many of them have been submitted to the European Pollen Database (EPD). This allows for the construction of increasingly precise maps of Holocene vegetation change across the continent. Chronological information in the EPD has been expressed in uncalibrated radiocarbon years, and most chronologies to date are based on this time scale. Here we present new chronologies for most of the datasets stored in the EPD based on calibrated radiocarbon years. Age information associated with pollen diagrams is often derived from the pollen stratigraphy itself or from other sedimentological information. We reviewed these chronological tie points and assigned uncertainties to them. The steps taken to generate the new chronologies are described and the rationale for a new classification system for age uncertainties is introduced. The resulting chronologies are fit for most continental-scale questions. They may not provide the best age model for particular sites, but may be viewed as general purpose chronologies. Taxonomic particularities of the data stored in the EPD are explained. An example is given of how the database can be queried to select samples with appropriate age control as well as the suitable taxonomic level to answer a specific research question. © 2013 The Author(s).
Abstract:
OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.
DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.
SETTING: Primary and secondary care.
PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).
INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.
MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).
RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
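For reference, the ICER quoted here is simply the extra cost per additional QALY of one pathway over another; the sketch below uses placeholder numbers, not figures from the study.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY.
    All arguments are illustrative placeholders."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# e.g. a pathway costing £600 more while adding 0.015 QALYs gives
# £40,000 per QALY, above the £30,000 threshold cited above.
print(icer(2600, 2000, 1.515, 1.500))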
LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.
CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.
FUNDING: The National Institute for Health Research Health Technology Assessment Programme.
Abstract:
This research uses the multivariate geochemical dataset generated by the Tellus project to investigate the appropriate use of transformation methods to maintain the integrity of geochemical data and the inherently constrained behaviour of multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus Project, managed by GSNI and funded by the Department of Enterprise Trade and Development and the EU's Building Sustainable Prosperity Fund, involves the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous work has demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets require a methodology that reproduces the inherently complex multivariate relations. Previous investigation of the Tellus geochemical data has included the use of Gaussian-based techniques; however, earth science variables are rarely Gaussian, hence transformation of the data, as required for Gaussian-based geostatistical analysis, is integral to the approach. In particular, the stepwise conditional transform is investigated and developed for the geochemical datasets obtained as part of the Tellus project. The transform is applied to four variables in a bivariate nested fashion owing to the limited availability of data. Simulation of these transformed variables is then carried out, along with a corresponding back transformation to original units. Results show that the stepwise transform is successful in reproducing both the univariate statistics and the complex bivariate relations exhibited by the data. Greater fidelity to multivariate relationships will improve uncertainty models, which are required for consequent geological, environmental and economic inferences.
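For context, the normal score transform used as the baseline comparison maps each value to the standard normal quantile of its rank; a minimal version is sketched below. The stepwise conditional transform investigated in the study, which additionally conditions each variable on classes of the previously transformed one, is not shown.

import numpy as np
from scipy.stats import norm, rankdata

def normal_score_transform(x):
    """Map values to standard normal quantiles of their plotting-position
    ranks, giving an approximately Gaussian marginal distribution."""
    ranks = rankdata(x)                 # average ranks for ties
    p = (ranks - 0.5) / len(x)          # plotting positions in (0, 1)
    return norm.ppf(p)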
Abstract:
In recent years, gradient vector flow (GVF) based algorithms have been successfully used to segment a variety of 2-D and 3-D imagery. However, due to the compromise of internal and external energy forces within the resulting partial differential equations, these methods may lead to biased segmentation results. In this paper, we propose MSGVF, a mean shift based GVF segmentation algorithm that can successfully locate the correct borders. MSGVF is developed so that when the contour reaches equilibrium, the various forces resulting from the different energy terms are balanced. In addition, the smoothness constraint of image pixels is kept so that over- or under-segmentation can be reduced. Experimental results on publicly accessible datasets of dermoscopic and optic disc images demonstrate that the proposed method effectively detects the borders of the objects of interest.
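For orientation, the sketch below computes a plain GVF field by iterating the usual diffusion update on an edge map; MSGVF as described in the paper additionally applies a mean shift step to the evolving field and a pixel-smoothness constraint, neither of which is reproduced here, and the boundary handling and parameter values below are illustrative.

import numpy as np

def gvf_field(edge_map, mu=0.2, iters=200, dt=0.2):
    """Plain gradient vector flow field for a float 2-D edge map.
    Iterates u_t = mu * lap(u) - (u - f_x) * |grad f|^2 (and likewise for v),
    using a periodic-boundary Laplacian for simplicity."""
    fy, fx = np.gradient(edge_map)
    u, v = fx.copy(), fy.copy()
    mag2 = fx ** 2 + fy ** 2
    for _ in range(iters):
        lap_u = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
        lap_v = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4 * v)
        u += dt * (mu * lap_u - mag2 * (u - fx))
        v += dt * (mu * lap_v - mag2 * (v - fy))
    return u, v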
Abstract:
The advent of next generation sequencing (NGS) technologies has expanded the scope of genomic research, offering high coverage and increased sensitivity over older microarray platforms. Although the current cost of next generation sequencing still exceeds that of microarray approaches, the rapid advances in NGS will likely make it the platform of choice for future research in differential gene expression. Connectivity mapping is a procedure for examining the connections among diseases, genes and drugs through differential gene expression; it was initially based on microarray technology, with which a large collection of compound-induced reference gene expression profiles has been accumulated. In this work, we test the feasibility of incorporating NGS RNA-Seq data into the current connectivity mapping framework by utilizing the microarray-based reference profiles and constructing a differentially expressed gene signature from an NGS dataset. This allows connections to be established between the NGS gene signature and those microarray reference profiles, avoiding the cost of re-creating drug profiles with NGS technology. We examined the connectivity mapping approach on a publicly available NGS dataset of androgen stimulation of LNCaP cells in order to extract candidate compounds that could inhibit the proliferative phenotype of LNCaP cells and to elucidate their potential in a laboratory setting. In addition, we analyzed an independent microarray dataset with similar experimental settings. We found a high level of concordance between the top compounds identified using the gene signatures from the two datasets. The nicotine derivative cotinine was returned as the top candidate among the overlapping compounds, with potential to suppress this proliferative phenotype. Subsequent lab experiments validated this connectivity mapping hit, showing that cotinine inhibits cell proliferation in an androgen-dependent manner. Thus the results of this study suggest a promising prospect for integrating NGS data with connectivity mapping. © 2013 McArt et al.
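As a toy illustration of the connectivity mapping idea (not the exact statistic used in this work), a signature of up- and down-regulated genes can be scored against a reference compound profile by summing signed, normalised ranks; the data layout and scoring choices here are assumptions made for the sketch.

import numpy as np

def connection_score(signature, reference_profile):
    """signature: dict mapping gene -> +1 (up in query) or -1 (down).
    reference_profile: dict with 'genes' (list of names) and 'values'
    (differential expression, higher = more up-regulated under the compound).
    Returns a score in [-1, 1]: positive when the compound's profile matches
    the signature, negative when it tends to reverse it."""
    vals = np.asarray(reference_profile["values"], dtype=float)
    ordered = np.argsort(-vals)                       # most up-regulated first
    rank = {reference_profile["genes"][i]: r + 1 for r, i in enumerate(ordered)}
    n = len(rank)
    score = sum(sign * (1 - 2 * (rank[g] - 1) / (n - 1))
                for g, sign in signature.items() if g in rank)
    return score / max(len(signature), 1)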