147 results for Image Processing in Molecular Biology Research
Abstract:
Digital pathology and the adoption of image analysis have grown rapidly in the last few years. This is largely due to the implementation of whole-slide scanning, advances in software and computer processing capacity, and the increasing importance of tissue-based research for biomarker discovery and stratified medicine. This review sets out the key application areas for digital pathology and image analysis, with a particular focus on research and biomarker discovery. A variety of image analysis applications are reviewed, including nuclear morphometry and tissue architecture analysis, with emphasis on immunohistochemistry and fluorescence analysis of tissue biomarkers. Digital pathology and image analysis have important roles across the drug/companion diagnostic development pipeline, including biobanking, molecular pathology, tissue microarray analysis and molecular profiling of tissue, and these developments are reviewed. Underpinning all of them is the need for high-quality tissue samples, and the impact of pre-analytical variables on tissue research is discussed, together with practical advice on setting up and running a digital pathology laboratory. Finally, we discuss the need to integrate digital image analysis data with epidemiological, clinical and genomic data in order to fully understand the relationship between genotype and phenotype and to drive discovery and the delivery of personalized medicine.
Abstract:
We know considerably more about what makes cells and tissues resistant or sensitive to radiation than we did 20 years ago. Novel techniques in molecular biology have made a major contribution to our understanding at the level of signalling pathways. Before the “New Biology” era, radioresponsiveness was defined in terms of physiological parameters designated as the five Rs: repair, repopulation, reassortment, reoxygenation and radiosensitivity. Of these, only the role of hypoxia proved to be a robust predictive and prognostic marker, but radiotherapy regimens were nonetheless modified in terms of dose per fraction, fraction size and overall time, in ways that persist in clinical practice today. The first molecular techniques were applied to radiobiology about two decades ago and soon revealed the existence of genes/proteins that respond to and influence the cellular outcome of irradiation. The subsequent development of screening techniques using microarray technology has since revealed that a very large number of genes fall into this category. We can now obtain an adequately robust molecular signature predicting a radioresponsive phenotype, using gene expression and proteomic approaches. In parallel with these developments, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) can now detect specific biological molecules such as haemoglobin and glucose, so revealing a 3D map of tumour blood flow and metabolism. The key to personalised radiotherapy will be to extend this capability to the proteins of the molecular signature that determine radiosensitivity.
Abstract:
REMA is an interactive web-based program which predicts endonuclease cut sites in DNA sequences. It analyses multiple sequences simultaneously, predicts the number and size of fragments, and provides restriction maps. Users can select single or paired combinations of all commercially available enzymes. Additionally, REMA permits prediction of multiple-sequence terminal fragment sizes and suggests suitable restriction enzymes for maximally discriminatory results. REMA is an easy-to-use, web-based program that will have wide application in molecular biology research. Availability: REMA is written in Perl and is freely available for non-commercial use. Detailed information on installation can be obtained from Jan Szubert (jan.szubert@gmail.com), and the web-based application is accessible at http://www.macaulay.ac.uk/rema. Contact: b.singh@macaulay.ac.uk. (C) 2007 Elsevier B.V. All rights reserved.
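REMA's Perl source is not shown here, but the core computation it describes, finding cut sites for selected enzymes and deriving fragment sizes for a single or paired digest, can be sketched as follows. This is an illustrative sketch with hypothetical function names, not REMA's actual implementation; the enzyme table uses real recognition sites (e.g. EcoRI cuts G^AATTC).

```python
# Minimal restriction-map sketch (illustrative, not REMA's code).
# Recognition sites are real; cut offset is the position within the site.
ENZYMES = {
    "EcoRI": ("GAATTC", 1),    # G^AATTC
    "BamHI": ("GGATCC", 1),    # G^GATCC
    "HindIII": ("AAGCTT", 1),  # A^AGCTT
}

def cut_positions(seq, enzyme):
    """Return 0-based cut positions of one enzyme in a linear sequence."""
    site, offset = ENZYMES[enzyme]
    positions = []
    start = seq.find(site)
    while start != -1:
        positions.append(start + offset)
        start = seq.find(site, start + 1)
    return positions

def fragment_sizes(seq, enzymes):
    """Predict fragment sizes for a single or paired (multi-enzyme) digest."""
    cuts = sorted({p for e in enzymes for p in cut_positions(seq, e)})
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

seq = "AAAGAATTCTTTTGGATCCAAA"
print(fragment_sizes(seq, ["EcoRI"]))           # → [4, 18]
print(fragment_sizes(seq, ["EcoRI", "BamHI"]))  # → [4, 10, 8]
```

Running the digest with a second enzyme simply merges the two sets of cut positions, which is how a paired combination yields more, smaller fragments.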
Abstract:
Power has become a key constraint in current nanoscale integrated circuit design due to the increasing demands for mobile computing and a low carbon economy. As an emerging technology, inexact circuit design offers a promising approach to significantly reduce both dynamic and static power dissipation for error-tolerant applications. Although fixed-point arithmetic circuits have been studied in terms of inexact computing, floating-point arithmetic circuits have not been fully considered, although they require more power. In this paper, the first inexact floating-point adder is designed and applied to high dynamic range (HDR) image processing. Inexact floating-point adders are proposed by approximately designing the exponent subtractor and mantissa adder. Related logic, including the normalization and rounding modules, is also considered in terms of inexact computing. Two HDR images are processed using the proposed inexact floating-point adders to show the validity of the inexact design. HDR-VDP is used as a metric to measure the subjective results of the image addition. Significant improvements have been achieved in terms of area, delay and power consumption. Comparison results show that the proposed inexact floating-point adders can improve power consumption and the power-delay product by 29.98% and 39.60%, respectively.
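The circuit-level design itself is not reproduced in this abstract, but the underlying trade, giving up low-order mantissa precision for lower power, can be modelled in software. The sketch below (hypothetical names, not the authors' design) zeroes the k lowest mantissa bits of each single-precision operand before an exact add, a simple stand-in for a truncated mantissa adder.

```python
import struct

def truncate_low_bits(x, k):
    """Zero the k lowest mantissa bits of a float32 value.

    A software model of mantissa truncation; real inexact adders
    approximate the hardware datapath rather than the operands."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    bits &= ~((1 << k) - 1)  # clear k low-order mantissa bits
    return struct.unpack("<f", struct.pack("<I", bits))[0]

def inexact_add(a, b, k=12):
    """Approximate float32 addition with truncated operands."""
    return truncate_low_bits(a, k) + truncate_low_bits(b, k)

a, b = 3.14159, 2.71828
exact = a + b
approx = inexact_add(a, b)
print(exact, approx, abs(exact - approx) / exact)
```

With k=12 of the 23 mantissa bits dropped, the relative error per operand stays below about 2^-11, which is the kind of bounded error that error-tolerant applications such as HDR image addition can absorb.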
Abstract:
The purpose of this study was to investigate the occupational hazards within the tanning industry caused by contaminated dust. A qualitative assessment of the risk of human exposure to dust was made throughout a commercial Kenyan tannery. Using this information, high-risk points in the processing line were identified and dust sampling regimes developed. An optical set-up using microscopy and digital imaging techniques was used to determine dust particle numbers and size distributions. The results showed that chemical handling was the most hazardous activity (12 mg m(-3)). A Monte Carlo method was used to estimate the concentration of the dust in the air throughout the tannery during an 8 h working day. This showed that the high-risk area of the tannery was associated with mean dust concentrations greater than the limits stipulated by UK Statutory Instrument 2002 No. 2677, exceeding both 10 mg m(-3) (inhalable dust limit) and 4 mg m(-3) (respirable dust limit). This therefore has implications in terms of provision of personal protective equipment (PPE) to the tannery workers for the mitigation of occupational risk.
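As an illustration of the Monte Carlo step, the sketch below simulates 8 h time-weighted-average (TWA) dust concentrations by drawing hourly concentrations from a lognormal distribution and estimates the probability of exceeding the 10 mg m(-3) inhalable limit. The distribution parameters and function names are hypothetical placeholders, not the study's measured values.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

def simulate_twa(mean_log, sd_log, hours=8):
    """One simulated working day: lognormal concentration per hour,
    averaged into an 8 h time-weighted average (mg m^-3)."""
    draws = [random.lognormvariate(mean_log, sd_log) for _ in range(hours)]
    return statistics.mean(draws)

def exceedance_probability(limit=10.0, n_days=10_000,
                           mean_log=2.3, sd_log=0.5):
    """Fraction of simulated days whose TWA exceeds the inhalable limit.
    mean_log/sd_log are illustrative, not the tannery's fitted values."""
    days = [simulate_twa(mean_log, sd_log) for _ in range(n_days)]
    return sum(d > limit for d in days) / n_days

print(exceedance_probability())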
Abstract:
Donor hematopoiesis or donor chimerism in the host following allogeneic bone marrow transplantation (BMT) has appeared crucial to the engraftment process. However, as molecular techniques exploiting neutral variation in human genetic material have been used in the study of chimerism, the detection of residual host cells or mixed hemopoietic chimerism has indicated that donor chimerism is not obligatory following BMT. This review focuses on the detection and significance of mixed chimerism (MC) in patients transplanted for both malignant and non-malignant hemopoietic disease and attempts to tease out the contribution of MC to engraftment, leukemia relapse, graft rejection and long-term disease-free survival.
Abstract:
Non-invasive, real-time in vivo molecular imaging in small animal models has become the essential bridge between in vitro data and their translation into clinical applications. Tremendous technological progress in areas such as tumour modelling, monitoring of tumour growth and detection of metastasis has facilitated translational drug development and added to our knowledge of carcinogenesis. The modalities commonly used include Magnetic Resonance Imaging (MRI), Computed Tomography (CT), Positron Emission Tomography (PET), bioluminescence imaging, fluorescence imaging and multi-modality imaging systems. The ability to obtain multiple images longitudinally provides reliable information whilst reducing animal numbers. As yet there is no one modality that is ideal for all experimental studies. This review outlines the instrumentation available together with corresponding applications reported in the literature, with particular emphasis on cancer research. Advantages and limitations of current imaging technology are discussed and the issues concerning small animal care during imaging are highlighted.
Abstract:
Current understanding of risk associated with low-dose radiation exposure has for many years been embedded in the linear-no-threshold (LNT) approach, based on simple extrapolation from the Japanese atomic bomb survivors. Radiation biology research has supported the LNT approach although much of this has been limited to relatively high-dose studies. Recently, with new advances for studying effects of low-dose exposure in experimental models and advances in molecular and cellular biology, a range of new effects of biological responses to radiation has been observed. These include genomic instability, adaptive responses and bystander effects. Most have one feature in common in that they are observed at low doses and suggest significant non-linear responses. These new observations pose a significant challenge to our understanding of low-dose exposure and require further study to elucidate mechanisms and determine their relevance.
Abstract:
Background: Gene networks are considered to represent various aspects of molecular biological systems meaningfully because they naturally provide a systems perspective of molecular interactions. In this respect, the functional understanding of the transcriptional regulatory network is considered key to elucidating the functional organization of an organism.