979 results for 2D synchrosqueezed transforms
Abstract:
We investigated the use of mice transgenic for human leucocyte antigen (HLA) A*0201 antigen-binding domains to test vaccines composed of defined HLA A*0201-restricted cytotoxic T-lymphocyte (CTL) epitopes of human papillomavirus (HPV) type 16 E7 oncoprotein. HPV is detected in >90% of cervical carcinomas. HPV16 E7 oncoprotein transforms cells of the uterine cervix and functions as a tumour-associated antigen to which immunotherapeutic strategies may be directed. We report that although the HLA A*0201 E7 epitope peptides function both to prime for E7 CTL responses and to sensitize target cells for E7-directed CTL killing in situations where antigen processing is not required, the epitopes are not processed out of either endogenously expressed or immunization-introduced E7 by the mouse antigen-processing and presentation machinery. Thus (1) CTL induced by HLA A*0201 peptide immunization killed E7 peptide-pulsed target cells, but did not kill target cells expressing whole E7; (2) immunization with whole E7 protein did not elicit CTL directed to HLA A*0201-restricted E7 CTL epitopes; (3) HLA A*0201-restricted CTL epitopes expressed in the context of a DNA polytope vaccine did not activate E7-specific T cells either in 'conventional' HLA A*0201 transgenic (A2.1Kb) mice, or in HHD transgenic mice in which expression of endogenous H-2 class I is precluded; and (4) HLA A*0201 E7 peptide epitope immunization was incapable of preventing the growth of an HLA A*0201- and E7-expressing tumour. There are generic implications for the universal applicability of HLA class I transgenic mice for studies of human CTL epitope presentation in murine models of human infectious disease where recognition of endogenously processed antigen is necessary. There are also specific implications for the use of HLA A2 transgenic mice for the development of E7-based therapeutic vaccines for cervical cancer.
Abstract:
A series of crown ether-appended macrocyclic amines has been prepared, comprising benzo-12-crown-4, benzo-15-crown-5, or benzo-18-crown-6 attached to a diamino-substituted cyclam. The Co(III) complexes of these three receptors have been prepared and characterized spectroscopically and structurally. Crystal structures of each receptor in complex with an alkali metal ion, and structures of the benzo-12-crown-4 and benzo-15-crown-5 receptors without guest ions, are reported. 2D NMR and molecular mechanics modeling have been used to examine conformational variations upon guest-ion complexation. Addition of cations to these receptors results in an appreciable anodic shift in the Co(III/II) redox potential, even in aqueous solution, but little cation selectivity is observed. Evidence for complex formation has been corroborated by Na-23 and Li-7 NMR spectroscopy and electrospray mass spectrometry.
Abstract:
The formation of molecular complexes (prereactive intermediates) between C3O2 and amines (ammonia, dimethylamine, trimethylamine, and 4-(dimethylamino)pyridine), as well as the subsequent transformation of the complexes into C3O2-amine zwitterions in cryogenic matrices (ca. 40 K), has been observed. In the case of dimethylamine, the formation of tetramethylmalonamide has also been documented. Calculations using density functional theory (B3LYP/6-31G(2d,p)) are used to assign all of the above species and are in excellent agreement with the IR spectra.
Abstract:
High-throughput screening (HTS) using high-density microplates is the primary method for the discovery of novel lead candidate molecules. However, new strategies that eschew 2D microplate technology, including technologies that enable mass screening of targets against large combinatorial libraries, have the potential to greatly increase throughput and decrease unit cost. This review presents an overview of state-of-the-art microplate-based HTS technology and includes a discussion of emerging miniaturized systems for HTS. We focus on new methods of encoding combinatorial libraries that promise throughputs of as many as 100 000 compounds per second.
Abstract:
Apart from a very few families in which a genetic mutation is directly causal, the causes of most cases of Parkinson's disease (PD) remain unclear. Many allelic association studies on polymorphisms of different candidate genes have been performed. Although such association studies do not imply a causal relationship, they do warrant further studies to elucidate the pathophysiologic significance. CYP1A1 polymorphisms have been reported to be associated with PD in a Japanese population sample. Since CYP1A1 transforms aromatic hydrocarbons into products that may be neurotoxic and perhaps lead to PD, we undertook a study to look at the possible association of CYP1A1 polymorphism and PD in a Chinese population. Contrary to the Japanese result, we did not find any statistically significant difference between the PD group and the control group in our study with a bigger sample size.
Abstract:
The flow field and the energy transport near thermoacoustic couples are simulated using a 2D full Navier-Stokes solver. The thermoacoustic couple plate is maintained at a constant temperature; plate lengths, which are short and long compared with the particle displacement lengths of the acoustic standing waves, are tested. Also investigated are the effects of plate spacing and the amplitude of the standing wave. Results are examined in the form of energy vectors, particle paths, and overall entropy generation rates. These show that a net heat-pumping effect appears only near the edges of thermoacoustic couple plates, within about a particle displacement distance from the ends. A heat-pumping effect can be seen even on the shortest plates tested when the plate spacing exceeds the thermal penetration depth. It is observed that energy dissipation near the plate increases quadratically as the plate spacing is reduced. The results also indicate that there may be a larger scale vortical motion outside the plates which disappears as the plate spacing is reduced. (C) 2002 Acoustical Society of America.
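The thermal penetration depth referred to in this abstract is the standard thermoacoustic boundary-layer scale; in the usual notation (not given in the abstract itself), with thermal diffusivity \(\kappa\) of the gas and angular frequency \(\omega\) of the standing wave, it reads

```latex
\delta_\kappa = \sqrt{\frac{2\kappa}{\omega}}
```

Plate spacings larger than \(\delta_\kappa\) are the regime in which the abstract reports a visible heat-pumping effect even on the shortest plates.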
Abstract:
Four novel sesquiterpenes, namely 7alpha,8beta,13-trihydroxy-5,13-marasmanolide (2), isoplorantinone (5), 4,8,14-trihydroxyilludala-2,6,8-triene (6), and 8-hydroxy-8,9-secolactara-1,6-dien-5,13-olide (10), together with six known ones, 7alpha,8beta-dihydroxy-5,13-marasmanolide (1), 7alpha,8alpha-dihydroxy-5,13-marasmanolide (3), isolactarorufin (4), blennin A (7), blennin D (8), and lactarorufin (9), were isolated from the ethanolic extract of Lactarius piperatus. The structures of these sesquiterpenes, representing diversified structural types, were determined mainly by spectroscopic methods, especially 2D-NMR techniques. The structure of 6 was further confirmed by a single-crystal X-ray-diffraction determination.
Abstract:
We consider pure continuous-variable entanglement with unequal correlations between orthogonal quadratures. We introduce a simple protocol which equates these correlations and in the process transforms the entanglement into a state with the minimum allowed number of photons. As an example we show that our protocol transforms, through unitary local operations, a single squeezed beam split on a beam splitter into the same entanglement that is produced when two squeezed beams are mixed orthogonally. We demonstrate that this technique can in principle facilitate perfect teleportation utilizing only one squeezed beam.
Abstract:
Subtractive imaging in confocal fluorescence light microscopy is based on the subtraction of a suitably weighted widefield image from a confocal image. An approximation to a widefield image can be obtained by detection with an opened confocal pinhole. The subtraction of images enhances the resolution in-plane as well as along the optic axis. Due to the linearity of the approach, the effect of subtractive imaging in Fourier-space corresponds to a reduction of low spatial frequency contributions leading to a relative enhancement of the high frequencies. Along the direction of the optic axis this also results in an improved sectioning. Image processing can achieve a similar effect. However, a 3D volume dataset must be acquired and processed, yielding a result essentially identical to subtractive imaging but superior in signal-to-noise ratio. The latter can be increased further with the technique of weighted averaging in Fourier-space. A comparison of 2D and 3D experimental data analysed with subtractive imaging, the equivalent Fourier-space processing of the confocal data only, and Fourier-space weighted averaging is presented. (C) 2003 Elsevier Ltd. All rights reserved.
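The core operation described above is a weighted subtraction of an open-pinhole (approximate widefield) image from the confocal image. A minimal numpy sketch, in which the function name, the weight `gamma`, and the clipping of negative intensities are illustrative choices rather than details from the paper:

```python
import numpy as np

def subtractive_image(confocal, widefield, gamma=0.5):
    """Subtractive imaging: remove a weighted open-pinhole (widefield-like)
    image from the confocal image, suppressing low spatial frequencies."""
    result = confocal - gamma * widefield
    # Negative intensities are not physical; clip them to zero.
    return np.clip(result, 0.0, None)
```

Because the operation is linear, the same effect can equivalently be produced by attenuating low spatial frequencies of the confocal image in Fourier space, which is the comparison the abstract describes.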
Abstract:
In this work the critical indices β, γ, and ν for a three-dimensional (3D) hardcore cylinder composite system with short-range interaction have been obtained. In contrast to the 2D stick system and the 3D hardcore cylinder system, the determined critical exponents do not belong to the same universality class as lattice percolation, although they obey the common hyperscaling relation for a 3D system. It is observed that the value of the correlation-length exponent is compatible with the predictions of mean field theory. It is also shown that, by using the Alexander-Orbach conjecture, the relation between the conductivity and the correlation-length critical exponents has a typical value for a 3D lattice system.
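For reference, the common hyperscaling relation invoked above is the standard percolation one relating the order-parameter, susceptibility, and correlation-length exponents to the spatial dimension \(d\):

```latex
2\beta + \gamma = d\nu, \qquad d = 3 .
```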
Abstract:
In recent years, it has become increasingly clear that neurodegenerative diseases involve protein aggregation, a process often used as a disease-progression readout and to develop therapeutic strategies. This work presents an image processing tool to automatically segment, classify, and quantify these aggregates and the whole 3D body of the nematode Caenorhabditis elegans. A total of 150 data-set images, containing different slices, were captured with a confocal microscope from animals of distinct genetic conditions. Because of the animals' transparency, most pixels in the slices appeared dark, hampering direct reconstruction of the body volume. Therefore, for each data set, all slices were stacked into one single 2D image in order to determine a volume approximation. The gradient of this image was input to an anisotropic diffusion algorithm that uses Tukey's biweight as the edge-stopping function. The median of the resulting image histogram was used to dynamically determine a thresholding level, which allows the determination of a smoothed exterior contour of the worm and, by thinning its skeleton, the medial axis of the worm body. Based on this exterior contour diameter and the medial animal axis, random 3D points were then calculated to produce a volume-mesh approximation. The protein aggregations were subsequently segmented based on an iso-value and blended with the resulting volume mesh. The results obtained were consistent with qualitative observations in the literature, allowing non-biased, reliable, and high-throughput protein-aggregate quantification. This may lead to a significant improvement in treatment planning and intervention prevention for neurodegenerative diseases.
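The anisotropic diffusion step with Tukey's biweight edge-stopping function can be sketched as a Perona-Malik-style iteration in which diffusion shuts off completely across gradients larger than a scale `sigma` (the Black et al. formulation). This is a minimal illustrative sketch, not the paper's implementation; parameter values and the periodic boundary handling via `np.roll` are simplifications:

```python
import numpy as np

def tukey_g(grad, sigma):
    """Tukey's biweight edge-stopping function: diffusion is suppressed
    smoothly and stops entirely for |gradient| > sigma."""
    return np.where(np.abs(grad) <= sigma,
                    (1.0 - (grad / sigma) ** 2) ** 2,
                    0.0)

def anisotropic_diffusion(img, n_iter=10, sigma=30.0, lam=0.2):
    """Edge-preserving smoothing via gradient-weighted diffusion
    on a 4-connected grid (periodic borders for brevity)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        north = np.roll(u, 1, axis=0) - u
        south = np.roll(u, -1, axis=0) - u
        east = np.roll(u, -1, axis=1) - u
        west = np.roll(u, 1, axis=1) - u
        u = u + lam * (tukey_g(north, sigma) * north +
                       tukey_g(south, sigma) * south +
                       tukey_g(east, sigma) * east +
                       tukey_g(west, sigma) * west)
    return u
```

The key property exploited by the thresholding step described above is that weak intensity variations are smoothed away while strong edges (such as the worm's exterior contour) are left intact.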
Abstract:
Minimally invasive cardiovascular interventions guided by multiple imaging modalities are rapidly gaining clinical acceptance for the treatment of several cardiovascular diseases. These images are typically fused with richly detailed pre-operative scans through registration techniques, enhancing the intra-operative clinical data and easing the image-guided procedures. Nonetheless, rigid models have been used to align the different modalities, not taking into account the anatomical variations of the cardiac muscle throughout the cardiac cycle. In the current study, we present a novel strategy to compensate for the beat-to-beat physiological adaptation of the myocardium. To this end, we intend to prove that a complete myocardial motion field can be quickly recovered from the displacement field at the myocardial boundaries, therefore being an efficient strategy to locally deform the cardiac muscle. We address this hypothesis by comparing three different strategies to recover a dense myocardial motion field from a sparse one, namely a diffusion-based approach, thin-plate splines, and multiquadric radial basis functions. Two experimental setups were used to validate the proposed strategy. First, an in silico validation was carried out on synthetic motion fields obtained from two realistic simulated ultrasound sequences. Then, 45 mid-ventricular 2D sequences of cine magnetic resonance imaging were processed to further evaluate the different approaches. The results showed that accurate boundary tracking combined with dense myocardial recovery via interpolation/diffusion is a potentially viable solution to speed up dense myocardial motion field estimation and, consequently, to deform/compensate the myocardial wall throughout the cardiac cycle. Copyright © 2015 John Wiley & Sons, Ltd.
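One of the three interpolants compared above, the multiquadric radial basis function, can be sketched in a few lines: fit weights at the sparse boundary samples, then evaluate the resulting smooth field anywhere in the myocardium. This is a generic illustration of the technique, not the paper's code; the function name and the shape parameter `eps` are assumptions:

```python
import numpy as np

def multiquadric_interpolate(pts, vals, query, eps=1.0):
    """Recover a dense field from sparse samples using multiquadric
    radial basis functions, phi(r) = sqrt(r^2 + eps^2)."""
    # Pairwise distances between sample points -> interpolation matrix.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    A = np.sqrt(d ** 2 + eps ** 2)
    # Solve for RBF weights so the interpolant matches the samples exactly.
    w = np.linalg.solve(A, vals)
    # Evaluate the interpolant at the dense query points.
    dq = np.linalg.norm(query[:, None, :] - pts[None, :, :], axis=-1)
    return np.sqrt(dq ** 2 + eps ** 2) @ w
```

Each displacement component would be interpolated independently this way; the thin-plate-spline variant differs only in the kernel.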
Abstract:
One of the current frontiers in the clinical management of Pectus Excavatum (PE) patients is the prediction of the surgical outcome prior to the intervention. This can be done through computerized simulation of the Nuss procedure, which requires an anatomically correct representation of the costal cartilage. To this end, we take advantage of the tubular structure of the costal cartilage to detect it through multi-scale vesselness filtering. This information is then used in an interactive 2D initialization procedure, which uses anatomical maximum intensity projections of 3D vesselness feature images to efficiently initialize the 3D segmentation process. We identify the cartilage tissue centerlines in these projected 2D images using a livewire approach. We finally refine the 3D cartilage surface through region-based sparse-field level-sets. We have tested the proposed algorithm on 6 non-contrast CT datasets from PE patients. A good segmentation performance was found against reference manual contouring, with an average Dice coefficient of 0.75±0.04 and an average mean surface distance of 1.69±0.30 mm. The proposed method requires roughly 1 minute for the interactive initialization step, which can positively contribute to an extended use of this tool in clinical practice, since current manual delineation of the costal cartilage can take up to an hour.
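The livewire approach mentioned above is, at its core, a minimal-cost path search between user-placed seed points on a cost image (low cost along the structure of interest, here derived from vesselness). A self-contained sketch using Dijkstra's algorithm on a 4-connected grid; the function name and the simple additive cost model are illustrative assumptions:

```python
import heapq

def livewire_path(cost, start, goal):
    """Minimal-cost path between two seed pixels on a 2D cost image
    (Dijkstra's algorithm, 4-connected grid) - the core of a livewire tool."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        r, c = node
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk predecessor links back from the goal to recover the path.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]
```

In an interactive setting the path updates live as the user moves the cursor (the goal pixel), snapping the contour to the low-cost centerline.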
Abstract:
Background: An accurate percutaneous puncture is essential for disintegration and removal of renal stones. Although this procedure has proven to be safe, some organs surrounding the renal target might be accidentally perforated. This work describes a new intraoperative framework in which tracked surgical tools are superimposed within 4D ultrasound imaging for safety assessment of the percutaneous puncture trajectory (PPT). Methods: A PPT is first generated from the skin puncture site towards an anatomical target, using the information retrieved by electromagnetic motion-tracking sensors coupled to the surgical tools. Then, 2D ultrasound images acquired with a tracked probe are used to reconstruct a 4D ultrasound volume around the PPT under GPU processing. Volume hole-filling was performed at different processing time intervals by a tri-linear interpolation method. At spaced time intervals, the volume of the anatomical structures was segmented to ascertain whether any vital structure lies along the PPT and might compromise surgical success. To enhance the visualization of the reconstructed structures, different render transfer functions were used. Results: Real-time US volume reconstruction and rendering at more than 25 frames/s was only possible when rendering three orthogonal slice views; 8-15 frames/s were achieved when using the whole reconstructed volume, and 3 frames/s when segmentation and detection of structures intersecting the PPT were introduced. Conclusions: The proposed framework creates a virtual and intuitive platform that can be used to identify and validate a PPT to safely and accurately perform the puncture in percutaneous nephrolithotomy.
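The tri-linear interpolation used above for volume hole-filling weights the eight voxels surrounding a fractional coordinate by their distance to it. A minimal numpy sketch of that kernel (the function name and boundary clamping are illustrative; the paper's GPU version would operate on many holes in parallel):

```python
import numpy as np

def trilinear(vol, x, y, z):
    """Trilinear interpolation of a 3D volume at a fractional coordinate -
    the kind of kernel used to fill holes in a reconstructed US volume."""
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    # Clamp the upper corner so queries on the last voxel stay in bounds.
    x1 = min(x0 + 1, vol.shape[0] - 1)
    y1 = min(y0 + 1, vol.shape[1] - 1)
    z1 = min(z0 + 1, vol.shape[2] - 1)
    fx, fy, fz = x - x0, y - y0, z - z0
    c = 0.0
    for xi, wx in ((x0, 1 - fx), (x1, fx)):
        for yi, wy in ((y0, 1 - fy), (y1, fy)):
            for zi, wz in ((z0, 1 - fz), (z1, fz)):
                c += wx * wy * wz * vol[xi, yi, zi]
    return c
```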
Abstract:
Dental implant recognition in patients without available records is a time-consuming and far from straightforward task. The traditional method is a completely user-dependent process, in which the expert compares a 2D X-ray image of the dental implant with a generic database. Due to the high number of implants available and the similarity between them, automatic/semi-automatic frameworks to aid implant model detection are essential. In this study, a novel computer-aided framework for dental implant recognition is suggested. The proposed method relies on image processing concepts, namely: (i) a segmentation strategy for semi-automatic implant delineation; and (ii) a machine learning approach for implant model recognition. Although the segmentation technique is the main focus of the current study, preliminary details of the machine learning approach are also reported. Two different scenarios are used to validate the framework: (1) comparison of the semi-automatic contours against manual implant contours on 125 X-ray images; and (2) classification of 11 known implants using a large reference database of 601 implants. Regarding experiment 1, a Dice metric of 0.97±0.01, a mean absolute distance of 2.24±0.85 pixels, and a Hausdorff distance of 11.12±6 pixels were obtained. In experiment 2, 91% of the implants were successfully recognized while reducing the reference database to 5% of its original size. Overall, the segmentation technique achieved accurate implant contours. Although the preliminary classification results prove the concept of the current work, more features and an extended database should be used in future work.
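The three contour-evaluation metrics reported in experiment 1 are standard. As a minimal sketch (illustrative code, not the paper's), the Dice metric and the symmetric Hausdorff distance on binary masks can be computed as:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks (1 = identical)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the foreground pixels of two
    masks: the largest distance from a point in one set to the other set."""
    pa = np.argwhere(a)
    pb = np.argwhere(b)
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

The mean absolute distance reported alongside them replaces the max in the Hausdorff definition with an average over boundary points.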