861 results for Diagnostic imaging Digital techniques
Abstract:
Passive equipment operating in the 30-300 GHz (millimeter-wave) band is compared with equipment operating in the 300 GHz-3 THz (submillimeter) band. Equipment operating in the submillimeter band can measure distance as well as spectral information and has been used to address new opportunities in security. Solid-state spectral information is available in the submillimeter region, making it possible to identify materials, whereas in the millimeter region bulk optical properties determine the image contrast. The optical properties in the region from 30 GHz to 3 THz are discussed for some typical inorganic and organic solids. In the millimeter-wave region of the spectrum, obscurants such as poor weather, dust and smoke can be penetrated and useful imagery generated for surveillance. In the 30 GHz-3 THz region, dielectrics such as plastic and cloth are also transparent, and the detection of contraband hidden under clothing is possible. A passive millimeter-wave imaging concept based on a folded Schmidt camera has been developed and applied to poor-weather navigation and security. The optical design uses a rotating mirror and is folded using polarization techniques. The design is very well corrected over a wide field of view, making it ideal for surveillance and security. This produces a relatively compact imager which minimizes the receiver count.
Abstract:
OBJECTIVE: There is a widely recognised need to develop effective Alzheimer's disease (AD) biomarkers to aid the development of disease-modifying treatments, to facilitate early diagnosis and to improve clinical care. This overview aims to summarise the utility of key neuroimaging and cerebrospinal fluid (CSF) biomarkers for AD, before focusing on the latest efforts to identify informative blood biomarkers. DESIGN: A literature search was performed using PubMed up to September 2011 for reviews and primary research studies of neuroimaging (magnetic resonance imaging, magnetic resonance spectroscopy, positron emission tomography and amyloid imaging), CSF and blood-based (plasma, serum and platelet) biomarkers in AD and mild cognitive impairment. Citations within individual articles were examined to identify additional studies relevant to this review. RESULTS: Evidence of AD biomarker potential was available for imaging techniques reflecting amyloid burden and neurodegeneration. Several CSF measures are promising, including the 42-amino-acid β-amyloid peptide (Aβ42); total tau (T-tau) protein, reflecting axonal damage; and phosphorylated tau (P-tau), reflecting neurofibrillary tangle pathology. Studies of plasma Aβ have produced inferior diagnostic discrimination. Alternative plasma and platelet measures are described, which represent potential avenues for future research. CONCLUSIONS: Several imaging and CSF markers demonstrate utility in predicting AD progression and determining aetiology. These require standardisation before forming core elements of diagnostic criteria. The enormous potential available for identifying a minimally-invasive, easily-accessible blood measure as an effective AD biomarker currently remains unfulfilled. Copyright © 2012 John Wiley & Sons, Ltd.
Abstract:
The application of fine grain pipelining techniques in the design of high performance Wave Digital Filters (WDFs) is described. It is shown that significant increases in the sampling rate of bit parallel circuits can be achieved using most significant bit (msb) first arithmetic. A novel VLSI architecture for implementing two-port adaptor circuits is described which embodies these ideas. The circuit in question is highly regular, uses msb first arithmetic and is implemented using simple carry-save adders. © 1992 Kluwer Academic Publishers.
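The carry-save adders mentioned above defer carry propagation until a single final step, which is what makes the adaptor circuit fast and regular. As an illustrative sketch only (word-level Python, not the paper's msb-first bit-level VLSI circuit), carry-save reduction of several operands looks like this:

```python
def carry_save_add(a, b, c):
    """One carry-save stage: reduce three operands to two
    (partial sum and shifted carry) with no carry propagation."""
    partial_sum = a ^ b ^ c                     # bitwise sum, ignoring carries
    carry = ((a & b) | (a & c) | (b & c)) << 1  # majority function, shifted left
    return partial_sum, carry

def csa_sum(operands):
    """Sum non-negative integers by repeated carry-save reduction,
    resolving carries only once in a final carry-propagate addition."""
    s, c = 0, 0
    for x in operands:
        s, c = carry_save_add(s, c, x)
    return s + c  # the single carry-propagate add

total = csa_sum([3, 5, 7, 11])  # same result as ordinary addition
```

The attraction in hardware is that each carry-save stage has constant depth regardless of word length; only the final addition pays the carry-propagation cost.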
Abstract:
A systematic design methodology is described for the rapid derivation of VLSI architectures for implementing high performance recursive digital filters, particularly ones based on most significant digit (msd) first arithmetic. The method has been derived by undertaking theoretical investigations of msd first multiply-accumulate algorithms and by deriving important relationships governing the dependencies between circuit latency, levels of pipelining and the range and number representations of filter operands. The techniques described are general and can be applied to both bit parallel and bit serial circuits, including those based on on-line arithmetic. The method is illustrated by applying it to the design of a number of highly pipelined bit parallel IIR and wave digital filter circuits. It is shown that established architectures, which were previously designed using heuristic techniques, can be derived directly from the equations described.
Abstract:
The application of fine-grain pipelining techniques in the design of high-performance wave digital filters (WDFs) is described. The problems of latency in feedback loops can be significantly reduced if computations are organized most significant, as opposed to least significant, bit first and if the results are fed back as soon as they are formed. The result is that chips can be designed which offer significantly higher sampling rates than otherwise can be obtained using conventional methods. How these concepts can be extended to the more challenging problem of WDFs is discussed. It is shown that significant increases in the sampling rate of bit-parallel circuits can be achieved using most significant bit first arithmetic.
Abstract:
Nanostructure and molecular orientation play a crucial role in determining the functionality of organic thin films. In practical devices, such as organic solar cells consisting of donor-acceptor mixtures, crystallinity is poor and these qualities cannot be readily determined by conventional diffraction techniques, while common microscopy only reveals surface morphology. Using a simple nondestructive technique, namely, continuous-wave electron paramagnetic resonance spectroscopy, which exploits the well-understood angular dependence of the g-factor and hyperfine tensors, we show that in the solar cell blend of C-60 and copper phthalocyanine (CuPc)-for which X-ray diffraction gives no information-the CuPc, and by implication the C-60, molecules form nanoclusters, with the planes of the CuPc molecules oriented perpendicular to the film surface. This information demonstrates that the current nanostructure in CuPc:C-60 solar cells is far from optimal and suggests that their efficiency could be considerably increased by alternative film growth algorithms.
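The angular dependence of the g-factor exploited here can be made concrete. For an axially symmetric paramagnet such as CuPc, the effective g-factor at angle \(\theta\) between the applied field and the molecular symmetry axis follows the standard textbook relation (quoted here as background, not taken from the paper itself):

```latex
g_{\mathrm{eff}}(\theta) = \sqrt{g_{\parallel}^{2}\cos^{2}\theta + g_{\perp}^{2}\sin^{2}\theta},
\qquad
B_{\mathrm{res}}(\theta) = \frac{h\nu}{g_{\mathrm{eff}}(\theta)\,\mu_{B}}
```

Because the resonance field \(B_{\mathrm{res}}\) shifts with \(\theta\), recording EPR spectra of a film at different orientations reveals how the molecular planes are distributed relative to the film surface, which is the basis of the orientation determination described above.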
Abstract:
The adoption of each new level of automotive emissions legislation often requires the introduction of additional emissions reduction techniques or the development of existing emissions control systems. This, in turn, usually requires the implementation of new sensors and hardware which must subsequently be monitored by the on-board fault detection systems. The reliable detection and diagnosis of faults in these systems or sensors, which result in the tailpipe emissions rising above the progressively lower failure thresholds, provides enormous challenges for OBD engineers. This paper gives a review of the field of fault detection and diagnostics as used in the automotive industry. Previous work is discussed and particular emphasis is placed on the various strategies and techniques employed. Methodologies such as state estimation, parity equations and parameter estimation are explained with their application within a physical model diagnostic structure. The utilization of symptoms and residuals in the diagnostic process is also discussed. These traditional physical model based diagnostics are investigated in terms of their limitations. The requirements from the OBD legislation are also addressed. Additionally, novel diagnostic techniques, such as principal component analysis (PCA) are also presented as a potential method of achieving the monitoring requirements of current and future OBD legislation.
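The residual-based diagnostics described above compare measured signals against a physical model's predictions and flag persistent deviations as symptoms. A minimal sketch, with entirely hypothetical signal values, is:

```python
def detect_fault(measured, predicted, threshold):
    """Model-based residual check: a residual is the difference between a
    measured signal and the value predicted by a physical model; a residual
    whose magnitude exceeds a threshold becomes a symptom of a fault."""
    residuals = [m - p for m, p in zip(measured, predicted)]
    symptoms = [abs(r) > threshold for r in residuals]
    return residuals, symptoms

# Hypothetical sensor trace whose last two readings drift away from the model.
measured = [1.00, 1.02, 0.99, 1.45, 1.52]
predicted = [1.00, 1.01, 1.00, 1.01, 1.00]
res, sym = detect_fault(measured, predicted, threshold=0.2)
```

Real OBD monitors add filtering and persistence logic so that a single noisy sample does not trigger a fault code; only the core residual/threshold idea is shown here.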
Abstract:
Next Generation Sequencing (NGS) has the potential of becoming an important tool in clinical diagnosis and therapeutic decision-making in oncology, owing to its enhanced sensitivity in DNA mutation detection, fast turnaround of samples in comparison with current gold-standard methods and the potential to sequence a large number of cancer-driving genes at one time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent current standard-of-care, and its reliability to generate concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas and 2 malignant melanomas), already genotyped for EGFR, KRAS and BRAF mutations by current standard-of-care methods (Sanger sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms, and an additional 43 genes with one of these platforms. The results were analysed using closed platform-specific proprietary bioinformatics software as well as open third-party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically led validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.
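Diagnostic accuracy against a gold standard reduces to sensitivity and specificity computed from concordant and discordant calls. A hedged sketch with hypothetical per-sample mutation calls (not the study's data):

```python
def diagnostic_accuracy(calls, truth):
    """Compare per-sample mutation calls from a new assay against the
    gold-standard result and compute sensitivity and specificity."""
    tp = sum(1 for c, t in zip(calls, truth) if c and t)
    tn = sum(1 for c, t in zip(calls, truth) if not c and not t)
    fp = sum(1 for c, t in zip(calls, truth) if c and not t)
    fn = sum(1 for c, t in zip(calls, truth) if not c and t)
    sensitivity = tp / (tp + fn) if tp + fn else None
    specificity = tn / (tn + fp) if tn + fp else None
    return sensitivity, specificity

# Hypothetical calls for 8 samples (True = mutation detected).
ngs_calls  = [True, True, False, True, False, False, True, False]
gold_calls = [True, True, False, True, False, True,  True, False]
sens, spec = diagnostic_accuracy(ngs_calls, gold_calls)
```

In this toy example the NGS assay misses one gold-standard-positive sample (sensitivity 0.8) while calling no false positives (specificity 1.0).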
Abstract:
We obtained high-resolution, high-contrast optical imaging in the Sloan Digital Sky Survey i′ band with the LuckyCam camera mounted on the 2.56 m Nordic Optical Telescope, to search for faint stellar companions to 16 stars harbouring transiting exoplanets. The Lucky imaging technique uses very short exposures to obtain near diffraction-limited images yielding sub-arcsecond sensitivity, allowing us to search for faint stellar companions within the seeing disc of the primary planet host. Here, we report the detection of two candidate stellar companions to the planet host TrES-1 at separations <6.5 arcsec and we confirm stellar companions to CoRoT-2, CoRoT-3, TrES-2, TrES-4 and HAT-P-7 already known in the literature. We do not confirm the candidate companions to HAT-P-8 found via Lucky imaging by Bergfors et al., however, most probably because HAT-P-8 was observed in poor seeing conditions. Our detection sensitivity limits allow us to place constraints on the spectral types and masses of the putative bound companions to the planet host stars in our sample. If bound, the stellar companions identified in this work would provide stringent observational constraints to models of planet formation and evolution. In addition, these companions could affect the derived physical properties of the exoplanets in these systems.
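The frame-selection step at the heart of Lucky imaging can be sketched: score each short exposure by a sharpness proxy and co-add only the best fraction. The toy below uses the peak pixel value as the proxy and omits the sub-pixel frame registration a real pipeline performs:

```python
def lucky_select(frames, fraction=0.1):
    """Miniature Lucky imaging: rank short-exposure frames (2-D lists of
    pixel values) by a sharpness proxy (here, the peak pixel value) and
    average only the best fraction, i.e. the frames least blurred by
    atmospheric seeing."""
    scored = sorted(frames, key=lambda f: max(max(row) for row in f),
                    reverse=True)
    n_keep = max(1, int(len(scored) * fraction))
    best = scored[:n_keep]
    rows, cols = len(best[0]), len(best[0][0])
    return [[sum(f[r][c] for f in best) / n_keep for c in range(cols)]
            for r in range(rows)]

# Three toy 2x2 frames; keep only the single sharpest one.
frames = [[[0, 1], [1, 0]], [[0, 9], [1, 0]], [[0, 2], [2, 0]]]
stacked = lucky_select(frames, fraction=0.34)
```

Selecting, say, the best 1-10% of exposures is what yields the near diffraction-limited, sub-arcsecond sensitivity described above.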
Abstract:
Digital pathology and the adoption of image analysis have grown rapidly in the last few years. This is largely due to the implementation of whole slide scanning, advances in software and computer processing capacity and the increasing importance of tissue-based research for biomarker discovery and stratified medicine. This review sets out the key application areas for digital pathology and image analysis, with a particular focus on research and biomarker discovery. A variety of image analysis applications are reviewed, including nuclear morphometry and tissue architecture analysis, but with emphasis on immunohistochemistry and fluorescence analysis of tissue biomarkers. Digital pathology and image analysis have important roles across the drug/companion diagnostic development pipeline, including biobanking, molecular pathology, tissue microarray analysis and molecular profiling of tissue, and these important developments are reviewed. Underpinning all of these developments is the need for high-quality tissue samples, and the impact of pre-analytical variables on tissue research is discussed. This requirement is combined with practical advice on setting up and running a digital pathology laboratory. Finally, we discuss the need to integrate digital image analysis data with epidemiological, clinical and genomic data in order to fully understand the relationship between genotype and phenotype and to drive discovery and the delivery of personalized medicine.
Abstract:
Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is demonstrating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary, and what should be the validation strategy, when bringing NGS into diagnostic/clinical practice? This validation strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance such as sensitivity or specificity be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics may allow us all to make a much more effective use of resources.
Abstract:
Nanoparticles offer alternative options in cancer therapy both as drug delivery carriers and as direct therapeutic agents for cancer cell inactivation. More recently, gold nanoparticles (AuNPs) have emerged as promising radiosensitizers, achieving significantly elevated radiation dose enhancement factors when irradiated with both kilo-electron-volt and mega-electron-volt X-rays. Use of AuNPs in radiobiology is now being intensely driven by the desire to achieve precise energy deposition in tumours. As a consequence, there is a growing demand for efficient and simple techniques for detection, imaging and characterization of AuNPs in both biological and tumour samples. Spatially accurate imaging on the nanoscale poses a serious challenge requiring high- or super-resolution imaging techniques. In this mini review, we discuss the challenges in using AuNPs as radiosensitizers as well as various current and novel imaging techniques designed to validate their uptake, distribution and localization in mammalian cells. In our own work, we have used multiphoton-excited plasmon resonance imaging to map the AuNP intracellular distribution. The benefits and limitations of this approach will also be discussed in some detail. In some cases, the same "excitation" mechanism as is used in an imaging modality can be harnessed to make it also part of a therapy modality (e.g. phototherapy); such examples are discussed in passing as extensions to the imaging modality concerned.
Abstract:
Despite the increasing availability of digital slide viewing, and numerous advantages associated with its application, a lack of quality validation studies is amongst the reasons for poor uptake in routine practice. This study evaluated primary digital pathology reporting in the setting of routine subspecialist gastrointestinal pathology, commonplace in most tissue pathology laboratories and representing one of the highest volume specialties in most laboratories. Individual digital and glass slide diagnoses were compared amongst three pathologists reporting in a gastrointestinal subspecialty team, in a prospective series of 100 consecutive diagnostic cases from routine practice in a large teaching hospital laboratory. The study included a washout period of at least 6 months. Discordant diagnoses were classified, and the study was evaluated against recent College of American Pathologists (CAP) recommendations for evaluating digital pathology systems for diagnostic use. The study design met all 12 of the CAP recommendations. The 100 study cases generated 300 pairs of diagnoses, comprising 100 glass slide diagnoses and 100 digital diagnoses from each of the three study pathologists. 286 of 300 pairs of diagnoses were concordant, representing intraobserver concordance of 95.3%, broadly comparable to rates previously published in this field. In ten of the 14 discordant pairs, the glass slide diagnosis was favoured; in four cases, the digital diagnosis was favoured, but importantly, the 14 discordant intraobserver diagnoses were considered to be of minor clinical significance. Interobserver, or viewing modality independent, concordance was found in 94 of the total of 100 study cases, providing a comparable baseline discordance rate expected in any second viewing of pathology material. These overall results support the safe use of digital pathology in primary diagnostic reporting in this setting.
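The intraobserver concordance figure is a simple agreement fraction over (glass, digital) diagnosis pairs. The sketch below reproduces the 286-of-300 arithmetic with placeholder diagnosis labels, which are purely illustrative:

```python
def concordance_rate(pairs):
    """Fraction of (glass, digital) diagnosis pairs that agree."""
    agree = sum(1 for glass, digital in pairs if glass == digital)
    return agree / len(pairs)

# Placeholder labels standing in for the study's 286 concordant
# and 14 discordant diagnosis pairs.
pairs = ([("tubular adenoma", "tubular adenoma")] * 286
         + [("tubular adenoma", "hyperplastic polyp")] * 14)
rate = concordance_rate(pairs)  # 286 / 300
```

Rounded to one decimal place, this gives the 95.3% concordance quoted in the abstract.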
Abstract:
A novel digital image correlation (DIC) technique has been developed to track changes in textile yarn orientations during shear characterisation experiments, requiring only low-cost digital imaging equipment. Fabric shear angles and effective yarn strains are calculated and visualised using this new DIC technique for bias extension testing of an aerospace grade, carbon-fibre reinforcement material with a plain weave architecture. The DIC results are validated by direct measurement, and the use of a wide bias extension sample is evaluated against a more commonly used narrow sample. Wide samples exhibit a shear angle range 25% greater than narrow samples and peak loads which are 10 times higher. This is primarily due to excessive yarn slippage in the narrow samples; hence, the wide sample configuration is recommended for characterisation of shear properties which are required for accurate modelling of textile draping.
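The shear angle reported by such a DIC technique follows directly from the tracked yarn directions: it is 90° minus the current angle between the warp and weft yarns, so an unsheared orthogonal fabric has a shear angle of zero. A minimal sketch, assuming the yarn directions have already been extracted from the images as 2-D vectors:

```python
import math

def shear_angle_deg(warp_vec, weft_vec):
    """Fabric shear angle in degrees: 90 minus the current angle between
    the tracked warp and weft yarn direction vectors."""
    dot = warp_vec[0] * weft_vec[0] + warp_vec[1] * weft_vec[1]
    norm = math.hypot(*warp_vec) * math.hypot(*weft_vec)
    inter_yarn = math.degrees(math.acos(dot / norm))
    return 90.0 - inter_yarn

# Unsheared fabric: orthogonal yarns, shear angle 0.
unsheared = shear_angle_deg((1.0, 0.0), (0.0, 1.0))
```

Tracking these vectors frame by frame during a bias extension test gives the shear-angle histories the abstract describes.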
Abstract:
Efficient identification and follow-up of astronomical transients is hindered by the need for humans to manually select promising candidates from data streams that contain many false positives. These artefacts arise in the difference images that are produced by most major ground-based time-domain surveys with large format CCD cameras. This dependence on humans to reject bogus detections is unsustainable for next generation all-sky surveys, and significant effort is now being invested to solve the problem computationally. In this paper, we explore a simple machine learning approach to real-bogus classification by constructing a training set from the image data of approximately 32 000 real astrophysical transients and bogus detections from the Pan-STARRS1 Medium Deep Survey. We derive our feature representation from the pixel intensity values of a 20 x 20 pixel stamp around the centre of the candidates. This differs from previous work in that it works directly on the pixels rather than catalogued domain knowledge for feature design or selection. Three machine learning algorithms are trained (artificial neural networks, support vector machines and random forests) and their performances are tested on a held-out subset of 25 per cent of the training data. We find the best results from the random forest classifier and demonstrate that by accepting a false positive rate of 1 per cent, the classifier initially suggests a missed detection rate of around 10 per cent. However, we also find that a combination of bright star variability, nuclear transients and uncertainty in human labelling means that our best estimate of the missed detection rate is approximately 6 per cent.
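The feature representation described, raw pixel intensities of a stamp around each candidate, can be sketched as follows; the classifier training itself (random forests and the other learners) is omitted, and the image and centre coordinates are placeholders:

```python
def stamp_features(image, centre, half=10):
    """Build a candidate's feature vector: the raw pixel intensities of a
    (2*half) x (2*half) stamp around its centre, flattened into one list,
    with no hand-designed features."""
    r0, c0 = centre
    return [image[r][c]
            for r in range(r0 - half, r0 + half)
            for c in range(c0 - half, c0 + half)]

# Placeholder 40x40 difference image with a single bright pixel at (20, 20).
img = [[0] * 40 for _ in range(40)]
img[20][20] = 5
features = stamp_features(img, (20, 20))  # 400-element vector for half=10
```

Each candidate thus becomes a 400-dimensional vector that any standard classifier can consume directly, which is what lets the method skip feature engineering entirely.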