982 results for (2D)2PCA
Abstract:
A series of crown ether-appended macrocyclic amines has been prepared, comprising benzo-12-crown-4, benzo-15-crown-5, or benzo-18-crown-6 attached to a diamino-substituted cyclam. The Co-III complexes of these three receptors have been prepared and characterized spectroscopically and structurally. Crystal structures of each receptor in complex with an alkali metal ion, and structures of the benzo-12-crown-4 and benzo-15-crown-5 receptors without guest ions, are reported. 2D NMR and molecular mechanics modeling have been used to examine conformational variations upon guest ion complexation. Addition of cations to these receptors results in an appreciable anodic shift in the Co-III/II redox potential, even in aqueous solution, but little cation selectivity is observed. Evidence for complex formation has been corroborated by Na-23 and Li-7 NMR spectroscopy and electrospray mass spectrometry.
Abstract:
The formation of molecular complexes (prereactive intermediates) between C3O2 and amines (ammonia, dimethylamine, trimethylamine, and 4-(dimethylamino)pyridine), as well as the subsequent transformation of the complexes into C3O2-amine zwitterions in cryogenic matrixes (ca. 40 K), has been observed. In the case of dimethylamine, the formation of tetramethylmalonamide has also been documented. Calculations using density functional theory (B3LYP/6-31G(2d,p)) are used to assign all of the above species and are in excellent agreement with the IR spectra.
Abstract:
High-throughput screening (HTS) using high-density microplates is the primary method for the discovery of novel lead candidate molecules. However, new strategies that eschew 2D microplate technology, including technologies that enable mass screening of targets against large combinatorial libraries, have the potential to greatly increase throughput and decrease unit cost. This review presents an overview of state-of-the-art microplate-based HTS technology and includes a discussion of emerging miniaturized systems for HTS. We focus on new methods of encoding combinatorial libraries that promise throughputs of as many as 100 000 compounds per second.
Abstract:
The flow field and the energy transport near thermoacoustic couples are simulated using a 2D full Navier-Stokes solver. The thermoacoustic couple plate is maintained at a constant temperature; plate lengths, which are short and long compared with the particle displacement lengths of the acoustic standing waves, are tested. Also investigated are the effects of plate spacing and the amplitude of the standing wave. Results are examined in the form of energy vectors, particle paths, and overall entropy generation rates. These show that a net heat-pumping effect appears only near the edges of thermoacoustic couple plates, within about a particle displacement distance from the ends. A heat-pumping effect can be seen even on the shortest plates tested when the plate spacing exceeds the thermal penetration depth. It is observed that energy dissipation near the plate increases quadratically as the plate spacing is reduced. The results also indicate that there may be a larger scale vortical motion outside the plates which disappears as the plate spacing is reduced. (C) 2002 Acoustical Society of America.
Abstract:
Four novel sesquiterpenes, namely 7α,8β,13-trihydroxy-5,13-marasmanolide (2), isoplorantinone (5), 4,8,14-trihydroxyilludala-2,6,8-triene (6), and 8-hydroxy-8,9-secolactara-1,6-dien-5,13-olide (10), together with six known ones, 7α,8β-dihydroxy-5,13-marasmanolide (1), 7α,8α-dihydroxy-5,13-marasmanolide (3), isolactarorufin (4), blennin A (7), blennin D (8), and lactarorufin (9), were isolated from the ethanolic extract of Lactarius piperatus. The structures of these sesquiterpenes, representing diversified structural types, were determined mainly by spectroscopic methods, especially 2D-NMR techniques. The structure of 6 was further confirmed by a single-crystal X-ray diffraction determination.
Abstract:
Subtractive imaging in confocal fluorescence light microscopy is based on the subtraction of a suitably weighted widefield image from a confocal image. An approximation to a widefield image can be obtained by detection with an opened confocal pinhole. The subtraction of images enhances the resolution in-plane as well as along the optic axis. Due to the linearity of the approach, the effect of subtractive imaging in Fourier-space corresponds to a reduction of low spatial frequency contributions leading to a relative enhancement of the high frequencies. Along the direction of the optic axis this also results in an improved sectioning. Image processing can achieve a similar effect. However, a 3D volume dataset must be acquired and processed, yielding a result essentially identical to subtractive imaging but superior in signal-to-noise ratio. The latter can be increased further with the technique of weighted averaging in Fourier-space. A comparison of 2D and 3D experimental data analysed with subtractive imaging, the equivalent Fourier-space processing of the confocal data only, and Fourier-space weighted averaging is presented. (C) 2003 Elsevier Ltd. All rights reserved.
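Because the method is linear, the subtraction step described above can be sketched in a few lines of numpy; the weight `gamma` and the clipping of negative intensities are illustrative choices, not the authors' implementation:

```python
import numpy as np

def subtractive_image(confocal, widefield, gamma=0.5):
    """Subtract a weighted open-pinhole (approximate widefield) image
    from the confocal image.

    High spatial frequencies are relatively enhanced because the
    open-pinhole image mostly carries low-frequency content.
    gamma is an illustrative subtraction weight.
    """
    result = confocal - gamma * widefield
    return np.clip(result, 0.0, None)  # clip negative intensities to zero
```

In Fourier space this is equivalent to attenuating the low-frequency part of the transfer function, which is why the cited Fourier-space reweighting can reproduce the same effect with a better signal-to-noise ratio.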
Abstract:
In this work the critical indices β, γ, and ν for a three-dimensional (3D) hardcore cylinder composite system with short-range interaction have been obtained. In contrast to the 2D stick system and the 3D hardcore cylinder system, the determined critical exponents do not belong to the same universality class as lattice percolation, although they obey the common hyperscaling relation for a 3D system. It is observed that the value of the correlation length exponent is compatible with the predictions of mean field theory. It is also shown that, by using the Alexander-Orbach conjecture, the relation between the conductivity and correlation length critical exponents has a typical value for a 3D lattice system.
Abstract:
In recent years, it has become increasingly clear that neurodegenerative diseases involve protein aggregation, a process often used as a readout of disease progression and to develop therapeutic strategies. This work presents an image processing tool to automatically segment, classify, and quantify these aggregates and the whole 3D body of the nematode Caenorhabditis elegans. A total of 150 data set images, containing different slices, were captured with a confocal microscope from animals of distinct genetic conditions. Because of the animals' transparency, most of the slice pixels appeared dark, hampering direct reconstruction of the body volume. Therefore, for each data set, all slices were stacked into a single 2D image in order to determine a volume approximation. The gradient of this image was input to an anisotropic diffusion algorithm that uses Tukey's biweight as the edge-stopping function. The median of the resulting image histogram was used to dynamically determine a thresholding level, which allows the determination of a smoothed exterior contour of the worm and, by thinning its skeleton, the medial axis of the worm body. Based on this exterior contour diameter and the medial animal axis, random 3D points were then calculated to produce a volume mesh approximation. The protein aggregations were subsequently segmented based on an iso-value and blended with the resulting volume mesh. The results obtained were consistent with qualitative observations in the literature, allowing non-biased, reliable, and high-throughput protein aggregate quantification. This may lead to a significant improvement in neurodegenerative disease treatment planning and intervention prevention.
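The edge-stopping behaviour of Tukey's biweight — diffusion switched off entirely across gradients stronger than a scale sigma — can be sketched as follows (a minimal numpy illustration with wrap-around borders, not the authors' implementation):

```python
import numpy as np

def tukey_g(grad, sigma):
    """Tukey biweight edge-stopping function: conduction falls smoothly
    to zero and is exactly zero where |grad| exceeds sigma, so strong
    edges are preserved instead of blurred."""
    return np.where(np.abs(grad) <= sigma,
                    (1.0 - (grad / sigma) ** 2) ** 2, 0.0)

def diffuse_step(img, sigma=0.1, dt=0.2):
    """One explicit Perona-Malik-style iteration using Tukey's biweight.

    np.roll wraps at the border; a real implementation would replicate
    edge pixels instead.
    """
    n = np.roll(img, -1, 0) - img  # differences to the four neighbours
    s = np.roll(img, 1, 0) - img
    e = np.roll(img, -1, 1) - img
    w = np.roll(img, 1, 1) - img
    return img + dt * (tukey_g(n, sigma) * n + tukey_g(s, sigma) * s
                       + tukey_g(e, sigma) * e + tukey_g(w, sigma) * w)
```

Iterating `diffuse_step` smooths homogeneous regions of the stacked 2D image while leaving the worm's outline intact, which is what makes the subsequent median-based thresholding stable.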
Abstract:
Minimally invasive cardiovascular interventions guided by multiple imaging modalities are rapidly gaining clinical acceptance for the treatment of several cardiovascular diseases. These images are typically fused with richly detailed pre-operative scans through registration techniques, enhancing the intra-operative clinical data and easing the image-guided procedures. Nonetheless, rigid models have been used to align the different modalities, not taking into account the anatomical variations of the cardiac muscle throughout the cardiac cycle. In the current study, we present a novel strategy to compensate for the beat-to-beat physiological adaptation of the myocardium. Hereto, we intend to prove that a complete myocardial motion field can be quickly recovered from the displacement field at the myocardial boundaries, therefore being an efficient strategy to locally deform the cardiac muscle. We address this hypothesis by comparing three different strategies to recover a dense myocardial motion field from a sparse one, namely, a diffusion-based approach, thin-plate splines, and multiquadric radial basis functions. Two experimental setups were used to validate the proposed strategy. First, an in silico validation was carried out on synthetic motion fields obtained from two realistic simulated ultrasound sequences. Then, 45 mid-ventricular 2D sequences of cine magnetic resonance imaging were processed to further evaluate the different approaches. The results showed that accurate boundary tracking combined with dense myocardial recovery via interpolation/diffusion is a potentially viable solution to speed up dense myocardial motion field estimation and, consequently, to deform/compensate the myocardial wall throughout the cardiac cycle. Copyright © 2015 John Wiley & Sons, Ltd.
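One of the compared strategies — multiquadric radial basis functions — amounts to solving a small linear system for RBF weights at the sparse boundary points and evaluating the expansion at the dense grid. A minimal numpy sketch (the kernel normalization and `eps` are illustrative assumptions, not the paper's exact parameters):

```python
import numpy as np

def rbf_multiquadric_fit(points, values, eps=1.0):
    """Fit multiquadric RBF weights, phi(r) = sqrt(1 + (eps*r)^2).

    points : (n, 2) sparse boundary positions
    values : (n, 2) displacement vectors at those positions
    """
    r = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    A = np.sqrt(1.0 + (eps * r) ** 2)   # symmetric interpolation matrix
    return np.linalg.solve(A, values)

def rbf_multiquadric_eval(points, weights, query, eps=1.0):
    """Evaluate the fitted displacement field at dense query positions."""
    r = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return np.sqrt(1.0 + (eps * r) ** 2) @ weights
```

By construction the interpolant reproduces the sparse boundary displacements exactly, which is the property the dense-recovery comparison relies on.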
Abstract:
One of the current frontiers in the clinical management of Pectus Excavatum (PE) patients is the prediction of the surgical outcome prior to the intervention. This can be done through computerized simulation of the Nuss procedure, which requires an anatomically correct representation of the costal cartilage. To this end, we take advantage of the tubular structure of the costal cartilage to detect it through multi-scale vesselness filtering. This information is then used in an interactive 2D initialization procedure, which uses anatomical maximum intensity projections of 3D vesselness feature images to efficiently initialize the 3D segmentation process. We identify the cartilage tissue centerlines in these projected 2D images using a livewire approach. We finally refine the 3D cartilage surface through region-based sparse-field level sets. We have tested the proposed algorithm on 6 non-contrast CT datasets from PE patients. Good segmentation performance was found against reference manual contouring, with an average Dice coefficient of 0.75±0.04 and an average mean surface distance of 1.69±0.30 mm. The proposed method requires roughly 1 minute for the interactive initialization step, which can positively contribute to an extended use of this tool in clinical practice, since current manual delineation of the costal cartilage can take up to an hour.
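The maximum intensity projections used for the 2D initialization can be produced directly from the 3D vesselness feature image. Keeping the arg-max depth map alongside the projection is one plausible way to lift points picked in the 2D view back into 3D seeds — an assumption on our part, not necessarily the authors' scheme:

```python
import numpy as np

def mip_with_depth(vesselness, axis=0):
    """Maximum intensity projection of a 3D vesselness feature image.

    The depth map records, per projected pixel, the slice index of the
    maximal response along the projection axis, so a 2D livewire point
    can be mapped back to a 3D position (illustrative strategy).
    """
    mip = vesselness.max(axis=axis)
    depth = vesselness.argmax(axis=axis)
    return mip, depth
```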
Abstract:
Background: An accurate percutaneous puncture is essential for the disintegration and removal of renal stones. Although this procedure has proven to be safe, some organs surrounding the renal target might be accidentally perforated. This work describes a new intraoperative framework in which tracked surgical tools are superimposed within 4D ultrasound imaging for safety assessment of the percutaneous puncture trajectory (PPT). Methods: A PPT is first generated from the skin puncture site towards an anatomical target, using the information retrieved by electromagnetic motion tracking sensors coupled to the surgical tools. Then, 2D ultrasound images acquired with a tracked probe are used to reconstruct a 4D ultrasound volume around the PPT under GPU processing. Volume hole-filling was performed at different processing time intervals by a trilinear interpolation method. At spaced time intervals, the volume of the anatomical structures was segmented to ascertain whether any vital structure lies along the PPT and might compromise the surgical success. To enhance the volume visualization of the reconstructed structures, different render transfer functions were used. Results: Real-time US volume reconstruction and rendering at more than 25 frames/s was only possible when rendering three orthogonal slice views. When using the whole reconstructed volume, 8-15 frames/s were achieved, dropping to 3 frames/s when segmentation and detection of structures intersecting the PPT were introduced. Conclusions: The proposed framework creates a virtual and intuitive platform that can be used to identify and validate a PPT to safely and accurately perform the puncture in percutaneous nephrolithotomy.
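Hole-filling by trilinear interpolation weights the eight voxels surrounding a missing sample by their fractional distances along each axis. A minimal scalar-volume sketch (boundary handling omitted for brevity; the GPU version in the paper would vectorize this):

```python
import numpy as np

def trilinear(vol, x, y, z):
    """Trilinear interpolation of a scalar volume at fractional (x, y, z)."""
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    dx, dy, dz = x - x0, y - y0, z - z0
    value = 0.0
    for i in (0, 1):          # accumulate the 8 corner contributions
        for j in (0, 1):
            for k in (0, 1):
                w = ((dx if i else 1 - dx) * (dy if j else 1 - dy)
                     * (dz if k else 1 - dz))
                value += w * vol[x0 + i, y0 + j, z0 + k]
    return value
```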
Abstract:
Dental implant recognition in patients without available records is a time-consuming and not straightforward task. The traditional method is a completely user-dependent process, in which an expert compares a 2D X-ray image of the dental implant with a generic database. Due to the high number of implants available and the similarity between them, automatic/semi-automatic frameworks to aid implant model detection are essential. In this study, a novel computer-aided framework for dental implant recognition is suggested. The proposed method relies on image processing concepts, namely: (i) a segmentation strategy for semi-automatic implant delineation; and (ii) a machine learning approach for implant model recognition. Although the segmentation technique is the main focus of the current study, preliminary details of the machine learning approach are also reported. Two different scenarios are used to validate the framework: (1) comparison of the semi-automatic contours against manual implant contours on 125 X-ray images; and (2) classification of 11 known implants using a large reference database of 601 implants. Regarding experiment 1, values of 0.97±0.01, 2.24±0.85 pixels, and 11.12±6 pixels were obtained for the Dice metric, mean absolute distance, and Hausdorff distance, respectively. In experiment 2, 91% of the implants were successfully recognized while reducing the reference database to 5% of its original size. Overall, the segmentation technique achieved accurate implant contours. Although the preliminary classification results prove the concept of the current work, more features and an extended database should be used in future work.
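The Dice metric reported in experiment 1 measures the overlap between the semi-automatic and manual contour masks; a small numpy sketch of its standard definition:

```python
import numpy as np

def dice(a, b):
    """Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|).

    Returns 1.0 for two empty masks by convention.
    """
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```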
Abstract:
Dissertation presented to obtain the degree of Master in Mathematics Education in Pre-School Education and in the 1st and 2nd Cycles of Basic Education.
Abstract:
This work shows the importance of the integrated use of numerical models and results from the observation of dynamic behavior for the safety control of large structures, focusing on the case of the Cabril dam and its water intake tower. The fundamentals of structural dynamics are described from the perspective of carrying out studies in the time domain and in the frequency domain, and the principles underlying the modal identification methodologies used in the interpretation and analysis of ambient vibration test results are presented. The fundamentals of the finite element method are presented from the perspective of its computational implementation for the dynamic analysis of structures, and the program MEFDIN3D, developed in MATLAB within the scope of this work, is briefly described; it allows the static and dynamic analysis of structures using plate and three-dimensional finite elements. The dynamic parameters of the tower are analyzed, in terms of natural frequencies and mode shapes, using 2D numerical models (MEFDIN3D and SAP 2000) and a 3D model in SAP 2000. The results of these numerical models are compared with experimental results obtained from: i) ambient vibration tests with acceleration measurements at the top of the tower and in the dam body; and ii) a system for continuous monitoring of the dynamic behavior of the Cabril dam, recently installed on site by LNEC. After the calibration of the numerical models, a study predicting the dynamic behavior of the tower under seismic actions is presented, performing the analysis in the time domain and by response spectrum. Finally, results of a 3D seismic analysis of the Cabril dam with MEFDIN3D are presented.
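The natural frequencies and mode shapes compared across the numerical models come from the generalized eigenproblem K φ = ω² M φ of the finite element stiffness and mass matrices. A minimal numpy sketch of that step, reduced to a standard symmetric eigenproblem via a Cholesky factorization of M (illustrative matrices only, not the MEFDIN3D implementation):

```python
import numpy as np

def natural_frequencies(K, M):
    """Natural frequencies (Hz) and mode shapes of K x = w^2 M x.

    M is factored as L L^T; the problem becomes the standard symmetric
    eigenproblem (L^-1 K L^-T) y = w^2 y, with x = L^-T y.
    """
    L = np.linalg.cholesky(M)
    Linv = np.linalg.inv(L)
    A = Linv @ K @ Linv.T
    w2, Y = np.linalg.eigh(A)      # eigenvalues are w^2 in (rad/s)^2
    modes = Linv.T @ Y             # back-transform to physical mode shapes
    return np.sqrt(np.abs(w2)) / (2.0 * np.pi), modes
```

The returned mode shapes are M-orthonormal, which is the normalization commonly used when comparing computed modes against those identified from ambient vibration tests.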
Abstract:
In the Sparse Point Representation (SPR) method, the principle is to retain the function data indicated by significant interpolatory wavelet coefficients, which are defined as interpolation errors by means of an interpolating subdivision scheme. Typically, an SPR grid is coarse in smooth regions and refined close to irregularities. Furthermore, the computation of partial derivatives of a function from its SPR content is performed in two steps. The first is a refinement procedure that extends the SPR by including new interpolated point values in a security zone. Then, for points in the refined grid, such derivatives are approximated by uniform finite differences, using a step size proportional to each point's local scale. If required neighboring stencils are not present in the grid, the corresponding missing point values are approximated from coarser scales using the interpolating subdivision scheme. Using the cubic interpolating subdivision scheme, we demonstrate that such adaptive finite differences can be formulated in terms of a collocation scheme based on the wavelet expansion associated with the SPR. For this purpose, we prove some results concerning the local behavior of such wavelet reconstruction operators, which hold for SPR grids having appropriate structures. This statement implies that the adaptive finite difference scheme and the one using the step size of the finest level produce the same result at SPR grid points. Consequently, in addition to the refinement strategy, our analysis indicates that some care must be taken concerning the grid structure in order to keep the truncation error under a certain accuracy limit. Illustrative results are presented for numerical solutions of the 2D Maxwell equations.
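For the cubic interpolating subdivision scheme referred to above, the detail (wavelet) coefficient at an odd grid point is the interpolation error: the gap between the actual value and the four-point Deslauriers-Dubuc prediction with weights (-1, 9, 9, -1)/16 from the even-grid neighbours. A minimal 1D numpy sketch:

```python
import numpy as np

def detail_coefficients(f_coarse, f_mid):
    """Interpolatory wavelet coefficients as interpolation errors.

    f_coarse : values on the coarse (even) grid
    f_mid    : values at the midpoints between f_coarse[1:-2] and
               f_coarse[2:-1] (the odd grid points with full stencils)

    The cubic Deslauriers-Dubuc scheme predicts each midpoint from its
    four even-grid neighbours; the detail is actual minus predicted.
    """
    pred = (-f_coarse[:-3] + 9.0 * f_coarse[1:-2]
            + 9.0 * f_coarse[2:-1] - f_coarse[3:]) / 16.0
    return f_mid - pred
```

Since the scheme reproduces polynomials up to degree 3 exactly, the details vanish wherever the function is locally cubic, which is precisely why an SPR grid stays coarse in smooth regions.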