7 results for Time-resolved methods

in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

90.00%

Publisher:

Abstract:

Objective: To study the luminescence, reflectance, and color stability of dental composites and ceramics. Materials and Methods: IPS e.max, IPS Classic, Gradia, and Sinfony materials were tested, both as unpolished (as-cast) and polished specimens. Coffee, tea, red wine, and distilled water (control) were used as staining drinks. Disk-shaped specimens were soaked in the staining drinks for up to 5 days. Color was measured with a colorimeter. Fluorescence was recorded with a spectrofluorometer in the front-face geometry. Time-resolved fluorescence spectra were recorded with a nanosecond laser spectrofluorometer. Results: Exposure of the examined dental materials to staining drinks changed the color of the composites and ceramics, with the polished specimens exhibiting significantly smaller color changes than unpolished specimens. Composites exhibited lower color stability than ceramic materials. Water also caused perceptible color changes in most materials. The materials tested showed significantly different initial luminescence intensities. Upon exposure to staining drinks, luminescence weakened by up to 40%, depending on the drink and the material. Time-resolved luminescence spectra showed some red shift of the emission band at longer times, with lifetimes in the range of tens of nanoseconds. Conclusions: Unpolished specimens with a more developed surface have lower color stability. Specimens stored in water develop some changes in their visual appearance. The proposed methods are effective in evaluating the luminescence of dental materials. Luminescence needs to be tested in addition to color, as the two characteristics are uncorrelated. Further improvement of the color and luminescence stability of dental materials is important.
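As an illustration only (not the authors' analysis pipeline), lifetimes in the tens-of-nanoseconds range are typically estimated by fitting an exponential decay to the time-resolved signal; the minimal sketch below uses synthetic data and assumes a monoexponential model.

```python
# Minimal sketch: estimate a fluorescence lifetime by fitting a monoexponential
# decay I(t) = A*exp(-t/tau) + B to time-resolved data. All values are synthetic
# placeholders, not measurements from the study.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, amplitude, tau, background):
    return amplitude * np.exp(-t / tau) + background

t = np.linspace(0, 200, 400)                                  # time axis (ns)
counts = mono_exp(t, 1000.0, 35.0, 20.0)                      # assumed ~35 ns lifetime
counts += np.random.default_rng(0).normal(0.0, 10.0, t.size)  # detector noise

params, _ = curve_fit(mono_exp, t, counts, p0=(800.0, 20.0, 0.0))
print(f"fitted lifetime: {params[1]:.1f} ns")
```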

Relevance:

80.00%

Publisher:

Abstract:

The very high antiproliferative activity of [Co(Cl)(H2O)(phendione)2][BF4] (phendione is 1,10-phenanthroline-5,6-dione) against three human tumor cell lines (half-maximal inhibitory concentration below 1 μM) and its slight selectivity for the colorectal tumor cell line compared with healthy human fibroblasts led us to explore the mechanisms of action underlying this promising antitumor potential. As previously shown by our group, this complex induces cell cycle arrest in S phase and subsequent cell death by apoptosis, and it also reduces the expression of proteins typically upregulated in tumors. In the present work, we demonstrate that [Co(Cl)(phendione)2(H2O)][BF4] (1) does not reduce the viability of nontumorigenic breast epithelial cells by more than 85% at 1 μM, (2) promotes the upregulation of proapoptotic Bax and cell-cycle-related p21, and (3) induces release of lactate dehydrogenase, which is partially reversed by ursodeoxycholic acid. DNA interaction studies were performed to uncover the genotoxicity of the complex and demonstrate that even though it displays a Kb (± standard error of the mean) of (3.48 ± 0.03) × 10^5 M^-1 and is able to produce double-strand breaks in a concentration-dependent manner, it does not exert any clastogenic effect ex vivo, ruling out DNA as a major cellular target for the complex. Steady-state and time-resolved fluorescence spectroscopy studies are indicative of a strong and specific interaction of the complex with human serum albumin, involving one binding site at a distance of approximately 1.5 nm from the Trp214 indole side chain, with log Kb ≈ 4.7, thus suggesting that this complex can be efficiently transported by albumin in the blood plasma.
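Binding constants of this magnitude (log Kb ≈ 4.7) are commonly estimated from fluorescence quenching titrations via the double-logarithm relation log[(F0 - F)/F] = log Kb + n·log[Q]; the sketch below uses placeholder intensities rather than the study's data to show the fit.

```python
# Sketch of a double-log fluorescence quenching analysis used to estimate a
# binding constant (Kb) and number of binding sites (n):
#   log10[(F0 - F) / F] = log10(Kb) + n * log10([Q])
# The titration values below are illustrative placeholders, not the study's data.
import numpy as np

quencher = np.array([2e-6, 4e-6, 6e-6, 8e-6, 1e-5])  # complex concentration (M)
f0 = 100.0                                            # intensity without quencher
f = np.array([91.0, 84.0, 78.0, 73.0, 69.0])          # intensities with quencher

slope, intercept = np.polyfit(np.log10(quencher), np.log10((f0 - f) / f), 1)
print(f"n ~ {slope:.2f}, log Kb ~ {intercept:.2f}")
```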

Relevance:

80.00%

Publisher:

Abstract:

Objective: To summarize all relevant findings in the published literature regarding the potential dose reduction, relative to image quality, of Sinogram-Affirmed Iterative Reconstruction (SAFIRE) compared with Filtered Back Projection (FBP). Background: Computed Tomography (CT) is one of the most used radiographic modalities in clinical practice, providing high spatial and contrast resolution. However, it also delivers a relatively high radiation dose to the patient. Reconstructing raw data with Iterative Reconstruction (IR) algorithms has the potential to iteratively reduce image noise while maintaining or improving the image quality of low-dose standard FBP reconstructions. Nevertheless, long reconstruction times made IR impractical for clinical use until recently. Siemens Medical developed a new IR algorithm called SAFIRE, which uses up to 5 different strength levels and offers an alternative to conventional IR with a significantly shorter reconstruction time. Methods: The MEDLINE, ScienceDirect, and CINAHL databases were used to gather literature. Eleven articles were included in this review (from 2012 to July 2014). Discussion: This narrative review summarizes the results of eleven articles (covering studies on both patients and phantoms) and describes SAFIRE's strengths for noise reduction in low-dose acquisitions while providing acceptable image quality. Conclusion: Even though the results differ slightly, the literature gathered for this review suggests that the dose in current CT protocols can be reduced by at least 50% while maintaining or improving image quality. There is, however, a lack of literature concerning the paediatric population (with increased radiation sensitivity). Further studies should also assess the impact of SAFIRE on diagnostic accuracy.
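For context on how image quality is typically quantified in such comparisons (the specific metrics vary across the reviewed studies), here is a minimal sketch computing region-of-interest noise and contrast-to-noise ratio on two hypothetical reconstructions.

```python
# Sketch: region-of-interest (ROI) noise and contrast-to-noise ratio (CNR) for
# two hypothetical reconstructions of the same slice (an FBP-like image and a
# smoother IR-like image). Images, ROIs, and noise levels are made up.
import numpy as np

def roi(image, rows, cols):
    return image[rows[0]:rows[1], cols[0]:cols[1]]

def cnr(image, signal_box, background_box):
    signal = roi(image, *signal_box)
    background = roi(image, *background_box)
    return abs(signal.mean() - background.mean()) / background.std()

rng = np.random.default_rng(1)
fbp_like = 50.0 + rng.normal(0.0, 30.0, (256, 256))  # noisier reconstruction
ir_like = 50.0 + rng.normal(0.0, 15.0, (256, 256))   # lower-noise reconstruction
for image in (fbp_like, ir_like):
    image[100:140, 100:140] += 40.0                  # common low-contrast "lesion"

signal_box = ((100, 140), (100, 140))
background_box = ((10, 60), (10, 60))
print("CNR (FBP-like):", round(cnr(fbp_like, signal_box, background_box), 2))
print("CNR (IR-like): ", round(cnr(ir_like, signal_box, background_box), 2))
```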

Relevance:

30.00%

Publisher:

Abstract:

Epidemiological studies have shown an increased prevalence of respiratory symptoms and adverse changes in pulmonary function parameters in poultry workers, corroborating their increased exposure to risk factors such as fungal load and fungal metabolites. This study aimed to determine the occupational exposure threat due to fungal contamination caused by toxigenic isolates belonging to the Aspergillus flavus species complex and also by isolates from the Aspergillus fumigatus species complex. The study was carried out in seven Portuguese poultry farms, using cultural and molecular methodologies. For the conventional/cultural methods, air, surface, and litter samples were collected by the impaction method using the Millipore Air Sampler. For the molecular analysis, air samples were collected by the impinger method using the Coriolis μ air sampler. After DNA extraction, samples were analyzed by real-time PCR using specific primers and probes for toxigenic strains of the Aspergillus flavus complex and for detection of isolates from the Aspergillus fumigatus complex. Through conventional methods, and within the Aspergillus genus, different prevalences were detected for the Aspergillus flavus and Aspergillus fumigatus species complexes, namely 74.5% versus 1.0% in the air samples, 24.0% versus 16.0% on surfaces, 0% versus 32.6% in new litter, and 9.9% versus 15.9% in used litter. Through molecular biology, we were able to detect the presence of aflatoxigenic strains in pavilions in which Aspergillus flavus did not grow in culture. Aspergillus fumigatus was only found in one indoor air sample by conventional methods. Using molecular methodologies, however, the Aspergillus fumigatus complex was detected in seven indoor samples from three different poultry units. The characterization of fungal contamination caused by Aspergillus flavus and Aspergillus fumigatus raises concern about occupational threat, not only due to the detected fungal load but also because of the toxigenic potential of these species.

Relevance:

30.00%

Publisher:

Abstract:

Microarrays allow thousands of genes to be monitored simultaneously, quantifying the abundance of their transcripts under the same experimental condition at the same time. Among the various available array technologies, two-channel cDNA microarray experiments have arisen in numerous technical protocols associated with genomic studies, and they are the focus of this work. Microarray experiments involve many steps, and each one can affect the quality of the raw data. Background correction and normalization are preprocessing techniques for cleaning and correcting the raw data when undesirable fluctuations arise from technical factors. Several recent studies have shown that no preprocessing strategy outperforms the others in all circumstances, so it seems difficult to provide general recommendations. In this work, exploratory techniques are proposed to visualize the effects of preprocessing methods on the statistical analysis of two-channel cancer microarray data sets in which the cancer types (classes) are known. The arrow plot was used to select differentially expressed genes, and the profile graph resulting from correspondence analysis was used to visualize the results. Six background-correction methods and six normalization methods were combined, yielding 36 preprocessing strategies, which were applied to a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, in which the microarrays had already been classified by cancer type. All statistical analyses were performed using the R statistical software.
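The study itself was carried out in R; purely as an illustration of the kind of preprocessing being compared, the sketch below applies one simple background-correction and one normalization choice to synthetic two-channel intensities.

```python
# Minimal sketch of two-channel cDNA microarray preprocessing: simple background
# subtraction followed by global median normalization of the log-ratios (M values).
# Intensities are synthetic; the study compared 6 background-correction and
# 6 normalization methods (36 combinations) in R, not this particular pair.
import numpy as np

rng = np.random.default_rng(2)
red_fg = rng.uniform(200.0, 5000.0, 1000)    # foreground intensities, red channel
red_bg = rng.uniform(50.0, 150.0, 1000)      # local background, red channel
green_fg = rng.uniform(200.0, 5000.0, 1000)  # foreground intensities, green channel
green_bg = rng.uniform(50.0, 150.0, 1000)    # local background, green channel

# Background correction: subtract local background, clip to keep values positive
red = np.clip(red_fg - red_bg, 1.0, None)
green = np.clip(green_fg - green_bg, 1.0, None)

# M (log-ratio) and A (average log-intensity) per spot
m = np.log2(red / green)
a = 0.5 * np.log2(red * green)

# Global median normalization: center the M values at zero
m_normalized = m - np.median(m)
print("A range:", round(float(a.min()), 2), "-", round(float(a.max()), 2))
print("median M before:", round(float(np.median(m)), 3),
      "| after:", round(float(np.median(m_normalized)), 3))
```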

Relevance:

30.00%

Publisher:

Abstract:

Starting from the explanation of metanarrative as a sort of self-reflexive storytelling (as defended by Kenneth Weaver Hope in his unpublished PhD thesis), I propose to discuss enunciative practices that stress the telling more than the told. In line with some metafictional practices applied to cinema, such as the 'mindfuck' film (Jonathan Eig, 2003), the 'psychological puzzle film' (Elliot Panek, 2003), and the 'mind-game film' (Thomas Elsaesser, 2009), I will address the manipulations that a narrative film endures in order to produce a more fruitful and complex experience for the viewer. I will particularly concentrate on the misrepresentation of time as a way to produce a labyrinthine work of fiction in which the linear description of events is replaced by a game of time disclosure. The viewer is thus called upon to reconstruct the order of the various situations portrayed, in a process that I call 'temporal mapping'. However, as the viewer attempts to do this, the film, because of the intricate nature of the plot and the uncertain status of the characters, ironically resists the attempt. A sort of teasing takes place between the film and its spectator: an invitation to decode that is half-denied until the end, where the puzzle is finally solved. I will use three of Alejandro Iñárritu's films to better convey my point: Amores perros (2000), 21 Grams (2003), and Babel (2006). I will consider Iñárritu's methods of producing non-linear storytelling as a way to stress the importance of time and its validity as one of the elements that make up a metanarrative experience in film. I will focus especially on 21 Grams, which I consider to be a paragon of the labyrinth.

Relevance:

30.00%

Publisher:

Abstract:

Hyperspectral instruments have been incorporated into satellite missions, providing large amounts of high-spectral-resolution data of the Earth's surface. These data can be used in remote sensing applications that often require a real-time or near-real-time response. To avoid delays between hyperspectral image acquisition and its interpretation, the latter usually performed at a ground station, onboard systems have emerged to process data, reducing the volume of information to transfer from the satellite to the ground station. For this purpose, compact reconfigurable hardware modules, such as field-programmable gate arrays (FPGAs), are widely used. This paper proposes an FPGA-based architecture for hyperspectral unmixing. The method is based on vertex component analysis (VCA) and works without a dimensionality-reduction preprocessing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 system-on-chip FPGA (based on Artix-7 programmable logic) and tested using real hyperspectral data. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems, opening perspectives for onboard hyperspectral image processing.
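As a software-only illustration of the linear mixing model that unmixing methods such as VCA rely on (not the paper's FPGA design), the sketch below estimates abundances for one pixel by non-negative least squares, assuming the endmember spectra are already known.

```python
# Sketch of the linear mixing model behind hyperspectral unmixing: a pixel
# spectrum y is modeled as E @ a, where E holds endmember spectra (bands x
# endmembers, e.g., extracted by VCA) and a holds their abundances. Abundances
# are recovered by non-negative least squares; everything here is synthetic and
# this is not the paper's FPGA implementation.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_bands, n_endmembers = 50, 3
endmembers = rng.uniform(0.1, 1.0, (n_bands, n_endmembers))   # assumed known

true_abundances = np.array([0.6, 0.3, 0.1])
pixel = endmembers @ true_abundances + rng.normal(0.0, 0.005, n_bands)

abundances, _ = nnls(endmembers, pixel)
abundances /= abundances.sum()   # impose the sum-to-one constraint afterwards
print("estimated abundances:", np.round(abundances, 3))
```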