942 results for Non-uniform heat intensity
Abstract:
Background and Objectives: This study evaluated the hybrid layer (HL) morphology created by three adhesive systems (AS) on dentin surfaces treated with an Er:YAG laser using two irradiation parameters. Study Design: Flat occlusal dentin surfaces of 36 human third molars were assigned to nine groups (n = 4) according to the following ASs: the one-bottle etch-and-rinse Single Bond Plus (3M ESPE), the two-step self-etching Clearfil Protect Bond (Kuraray), and the all-in-one S3 Bond (Kuraray). The ASs were labeled with rhodamine B or fluorescein isothiocyanate dextran and applied to dentin surfaces that had been irradiated with the Er:YAG laser at either 120 mJ/pulse (38.7 J/cm²) or 200 mJ/pulse (64.5 J/cm²), or to untreated dentin surfaces (control groups). The ASs were light-activated following the manufacturers' instructions, and the bonded surfaces were restored with the resin composite Z250 (3M ESPE). After 24 hours of storage in distilled water, the restored teeth were serially sectioned in the vertical plane into 1-mm-thick slabs, whose adhesive interfaces were analyzed with a confocal laser scanning microscope (CLSM; LSM 510 Meta). CLSM images were recorded in fluorescence mode from three different regions along each bonded interface. Results: A non-uniform HL was created on laser-irradiated dentin surfaces regardless of the irradiation protocol for all ASs, whereas a regular, uniform HL was observed in the control groups. "Stretch mark"-like red lines were found within the HL as a result of resin infiltration into dentin microfissures, which were predominantly observed in the 200 mJ/pulse groups regardless of AS. Poor resin infiltration into peritubular dentin was observed in most regions of the adhesive interfaces created by all ASs on laser-irradiated dentin, resulting in thin resin tags with neither funnel-shaped morphology nor lateral resin projections. Conclusion: Laser irradiation of dentin surfaces at 120 or 200 mJ/pulse resulted in morphological changes in the HL and resin tags for all ASs evaluated in the study. Lasers Surg. Med. 42:662-670, 2010. (C) 2010 Wiley-Liss, Inc.
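As a quick check of the quoted parameters (an illustration added here, not part of the abstract), fluence is pulse energy divided by spot area, and both settings turn out to imply the same spot size of roughly 0.63 mm:

```python
# Quick consistency check (illustrative, not from the paper): fluence =
# pulse energy / spot area, so both laser settings should imply the same spot.
import math

for energy_mj, fluence in [(120, 38.7), (200, 64.5)]:
    area_cm2 = (energy_mj / 1000) / fluence          # J divided by J/cm^2
    diameter_mm = 2 * math.sqrt(area_cm2 / math.pi) * 10
    print(f"{energy_mj} mJ: spot area {area_cm2:.4f} cm^2, diameter {diameter_mm:.2f} mm")
```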
Abstract:
Objective. The goal of this paper is to undertake a literature search collecting all dentin bond strength data obtained for six adhesives with four tests (shear, microshear, tensile, and microtensile) and to critically analyze the results with respect to average bond strength, coefficient of variation, mode of failure, and product ranking. Method. A PubMed search was carried out for the years 1998 to 2009, identifying publications on bond strength measurements of resin composite to dentin using four tests: shear, tensile, microshear, and microtensile. The six adhesive resins were selected to cover three-step systems (OptiBond FL, Scotchbond Multi-Purpose Plus), two-step systems (Prime & Bond NT, Single Bond, Clearfil SE Bond), and a one-step system (Adper Prompt L-Pop). Results. Pooling results from 147 references showed an ongoing high scatter in the bond strength data regardless of which adhesive and which bond test was used. Coefficients of variation remained high (20-50%) even with the microbond tests. The reported modes of failure for all tests still included a high number of cohesive failures. The ranking seemed to be dependent on the test used. Significance. The scatter in dentin bond strength data remains regardless of which test is used, confirming Finite Element Analysis predictions of non-uniform stress distributions due to a number of geometrical, loading, material-property, and specimen-preparation variables. This reopens the question of whether an interfacial fracture mechanics approach to analyzing the dentin-adhesive bond would not be more appropriate for obtaining better agreement among dentin-bond-related papers. (C) 2009 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
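For readers unfamiliar with the metric, the coefficient of variation reported above is the standard deviation expressed as a percentage of the mean; a minimal sketch with hypothetical bond strength values (not data from the pooled studies):

```python
# Coefficient of variation on hypothetical bond strength values (MPa);
# the data below are illustrative, not taken from the pooled studies.
import statistics

bond_strengths = [18.2, 25.4, 31.0, 14.8, 22.5, 27.9]
cv = statistics.stdev(bond_strengths) / statistics.mean(bond_strengths) * 100
print(f"CV = {cv:.0f}%")  # about 26%, within the reported 20-50% range
```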
Abstract:
Research on the stability of flavours during high-temperature extrusion cooking is reviewed. The important factors that affect flavour and aroma retention during the extrusion process are illustrated. A substantial number of the flavour volatiles incorporated prior to extrusion are normally lost during expansion because of steam distillation. Therefore, a general practice has been to introduce a flavour mix after the extrusion process. This extra operation requires a binding agent (normally oil) and may also result in a non-uniform distribution of the flavour and low oxidative stability of the flavours exposed on the surface. Therefore, the importance of encapsulated flavours, particularly the β-cyclodextrin-flavour complex, is highlighted in this paper.
Abstract:
The chlorophyll meter (SPAD-502) is widely used to estimate chlorophyll content, but non-uniform chloroplast distribution can affect its accuracy. This study aimed to assess the effect of photon fluence (F, irradiance × time of illumination) in leaves with different chlorophyll contents and to determine the effect of the chlorophyll a/b ratio on SPAD values in four tropical tree species (Croton draconoides Müll. Arg., Hevea guianensis Aubl., Hymenaea courbaril L. and Matisia cordata H.B.K.). Calibration equations for the chlorophyll meter were also determined, and the effect of F on SPAD values was assessed between 07:00 h and 17:00 h. The calibration equations were obtained after determining leaf chlorophyll content in the laboratory. Increases in F with time caused a reduction in SPAD values in species with a high chlorophyll content, with reductions of 20% in M. cordata and 10% in H. guianensis. Leaves of C. draconoides and H. courbaril had lower chlorophyll contents and showed no changes in SPAD values with increasing F. The chlorophyll a/b ratio increased with SPAD values, and the SPAD/chlorophyll relationship was best described by an exponential equation. It seems that F may affect SPAD values in leaves with high chlorophyll content, probably due to non-uniform chloroplast distribution at high irradiance. This indicates that SPAD values tend to be more accurate if recorded early in the morning, when irradiance is low.
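A minimal sketch of how such an exponential calibration could be fitted, assuming hypothetical SPAD/chlorophyll pairs rather than the study's measurements:

```python
# Minimal sketch: fitting an exponential calibration curve of the form
# chlorophyll = a * exp(b * SPAD), with hypothetical paired measurements.
import numpy as np
from scipy.optimize import curve_fit

def exponential(spad, a, b):
    return a * np.exp(b * spad)

# Hypothetical SPAD readings and lab-determined chlorophyll contents
spad = np.array([15.0, 25.0, 35.0, 45.0, 55.0, 65.0])
chl = np.array([90.0, 160.0, 270.0, 430.0, 700.0, 1100.0])

params, _ = curve_fit(exponential, spad, chl, p0=(50.0, 0.05))
a, b = params
print(f"chlorophyll ~ {a:.1f} * exp({b:.3f} * SPAD)")
```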
Abstract:
The use of iris recognition for human authentication has been spreading in recent years. Daugman proposed a method for iris recognition composed of four stages: segmentation, normalization, feature extraction, and matching. In this paper we propose some modifications and extensions to Daugman's method to cope with noisy images. These modifications were developed after a study of images from the CASIA and UBIRIS databases. The major modification is to the computationally demanding segmentation stage, for which we propose a faster and equally accurate template matching approach. The extensions to the algorithm address the important issue of pre-processing, which depends on the image database and is mandatory when a non-infra-red camera, such as a typical webcam, is used. For this scenario, we propose methods for reflection removal and for pupil enhancement and isolation. The tests, carried out by our C# application on grayscale CASIA and UBIRIS images, show that the template matching segmentation method is more accurate and faster than the previous one for noisy images. The proposed algorithms are found to be efficient and necessary when dealing with non-infra-red images and non-uniform illumination.
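A minimal sketch of the template matching idea for pupil localization, here using OpenCV in Python rather than the authors' C# application; the image path and template radius are assumptions:

```python
# Minimal sketch: locating the pupil by template matching, in the spirit of
# replacing a costlier segmentation step. The image path and template radius
# are assumptions, not values from the paper.
import cv2
import numpy as np

image = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)

# Synthetic template: a dark disc on a light background, approximating a pupil.
radius = 30
template = np.full((2 * radius, 2 * radius), 200, dtype=np.uint8)
cv2.circle(template, (radius, radius), radius, 0, -1)

# Slide the template over the image; the best match gives the pupil centre.
result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
_, _, _, max_loc = cv2.minMaxLoc(result)
centre = (max_loc[0] + radius, max_loc[1] + radius)
print("estimated pupil centre:", centre)
```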
Abstract:
The recent trend of chip architectures toward higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a given set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
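The abstract does not give the calculation, but an illustrative way to bound the number of versions, assuming each update task writes the object at most once per period, is to count the updates that can commit while the longest read-only transaction is in flight:

```python
# Illustrative sketch (not the paper's exact derivation): upper-bound the
# number of versions of a shared object so that the longest read-only
# transaction always finds its starting snapshot still available.
# Assumption: each update task writes the object at most once per period.
import math

def versions_needed(longest_reader: float, update_periods: list[float]) -> int:
    # Updates that may commit while the longest reader is in flight...
    updates_in_flight = sum(math.ceil(longest_reader / p) for p in update_periods)
    # ...plus one version for the snapshot the reader started from.
    return updates_in_flight + 1

# Hypothetical task set: a reader runs for up to 12 ms; two writers with
# 5 ms and 8 ms periods update the shared object.
print(versions_needed(12.0, [5.0, 8.0]))  # ceil(12/5) + ceil(12/8) + 1 = 6
```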
Abstract:
The foreseen evolution of chip architectures toward higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler assure schedulability. For this purpose, the contention management policy should be aware of on-line scheduling information.
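One plausible instance of such a scheduling-aware policy (an assumption for illustration, not the paper's specific algorithm) is to resolve conflicts in favour of the transaction whose task has the earlier absolute deadline, mirroring EDF:

```python
# Hedged sketch of a deadline-aware contention manager: on conflict, the
# transaction belonging to the task with the later absolute deadline is
# aborted and retried, mirroring EDF priorities. All names are illustrative.
from dataclasses import dataclass

@dataclass
class Transaction:
    task_id: str
    absolute_deadline: float  # obtained from the on-line scheduler

def resolve_conflict(a: Transaction, b: Transaction) -> Transaction:
    """Return the transaction that must abort and retry."""
    return a if a.absolute_deadline > b.absolute_deadline else b

loser = resolve_conflict(Transaction("t1", 40.0), Transaction("t2", 25.0))
print(f"abort and retry: {loser.task_id}")  # t1, the later deadline
```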
Abstract:
Earthquakes are associated with negative events such as large numbers of casualties, the destruction of buildings and infrastructure, and the emergence of tsunamis. In this paper, we apply Multidimensional Scaling (MDS) analysis to earthquake data. MDS is a set of techniques that produce spatial or geometric representations of complex objects, such that objects perceived to be similar or distinct in some sense are placed nearby or distant on the MDS maps. The interpretation of the charts is based on the resulting clusters, since MDS produces a different locus for each similarity measure. In this study, over three million seismic occurrences, covering the period from January 1, 1904 up to March 14, 2012, are analyzed. The events, characterized by their magnitude and spatiotemporal distributions, are divided into groups, either according to the Flinn-Engdahl seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Space-time and space-frequency correlation indices are proposed to quantify the similarities among events. MDS has the advantage of avoiding sensitivity to the non-uniform spatial distribution of seismic data resulting from poorly instrumented areas, and is well suited for assessing the dynamics of complex systems. MDS maps prove to be an intuitive and useful visual representation of the complex relationships present among seismic events, which may not be perceived on traditional geographic maps. Therefore, MDS constitutes a valid alternative to classic visualization tools for understanding the global behavior of earthquakes.
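A minimal sketch of the embedding step, using scikit-learn's metric MDS with a precomputed dissimilarity matrix; the matrix below is a hypothetical stand-in for the proposed space-time and space-frequency indices:

```python
# Minimal sketch: embedding groups of seismic events in 2-D with metric MDS
# from a precomputed dissimilarity matrix. The matrix below is a hypothetical
# stand-in for the paper's space-time / space-frequency indices.
import numpy as np
from sklearn.manifold import MDS

dissimilarity = np.array([
    [0.0, 0.3, 0.8, 0.7],
    [0.3, 0.0, 0.6, 0.9],
    [0.8, 0.6, 0.0, 0.2],
    [0.7, 0.9, 0.2, 0.0],
])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
for region, (x, y) in enumerate(coords):
    print(f"region {region}: ({x:+.2f}, {y:+.2f})")
```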
Abstract:
Dissertation submitted to obtain the Master's degree in Informatics Engineering
Abstract:
Dissertation submitted to obtain the Master's degree in Electrical and Computer Engineering
Abstract:
One hundred and fourteen hectares of a "terra-firme" rain forest 70 km north of Manaus, Amazonas, Brazil, were surveyed for leaf-cutting ant colonies (Atta spp.). One half of this area was in isolated forest fragments (surrounded by pastures or second growth) of two sizes: 1 and 10 ha. The other half was in non-isolated fragments (connected to a large patch of forest) of the same sizes. Only two species occurred in this forest: Atta sexdens sexdens L. and A. cephalotes L. The former was the most abundant species, with a mean density of 0.35 colonies per ha. The mean density of A. cephalotes colonies was 0.03 per ha. The density of colonies was not significantly different between the isolated fragments and the continuous forest. Furthermore, the species composition did not change with isolation. However, pre-isolation data and long-term monitoring are necessary to conclude that the isolation of a forest fragment has no effect upon Atta colonies. The non-uniform spatial distribution of Atta colonies within the "terra-firme" forest must be taken into account when selecting conservation areas in the Amazon, in order to preserve this important group of ants together with their native habitat.
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
A simple protocol is described for the silver staining of polyacrylamide gradient gels used for the separation of restriction fragments of kinetoplast DNA [schizodeme analysis of trypanosomatids (Morel et al., 1980)]. The method overcomes the problems of non-uniform staining and strong background color that are frequently encountered when conventional protocols for the silver staining of linear gels are used. The method described has proven to be of general applicability for DNA, RNA and protein separations in gradient gels.
Abstract:
Gastromermis cordobensis n. sp. (Nematoda: Mermithidae), a parasite of larvae of the blackfly Simulium lahillei Paterson & Shannon (Diptera: Simuliidae) in Argentina, is described. Diagnostic characters of this species include a ventrally shifted mouth; six cephalic papillae; eight hypodermal chords; small, pear-shaped amphids; a long, S-shaped vagina; a single spicule, which is long, has non-uniform walls, and bears a sculptured tip; three rows of genital papillae, the middle row with 18 pre-anal and 10 post-anal papillae and the lateral rows with 36 papillae each; oval eggs; and post-parasitic juveniles with long, thin tails. Pre-parasitic and parasitic juveniles are included in the description.
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both of exact and approximate nature, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
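A minimal sketch of the idea, assuming a geometric bias over a greedy candidate list (the distribution and its parameter are illustrative choices, not the paper's specification), applied to a toy nearest-neighbour tour:

```python
# Sketch of biased randomization: instead of always taking the greedy best
# candidate, pick from the distance-sorted candidate list with a skewed
# (here, geometric) distribution, so repeated runs yield many good,
# distinct solutions. The distribution and beta parameter are illustrative.
import math
import random

def biased_index(n, beta=0.3):
    """Geometric pick: index 0 with prob beta, 1 with beta*(1-beta), ..."""
    idx = int(math.log(1.0 - random.random()) / math.log(1.0 - beta))
    return min(idx, n - 1)

def biased_greedy_tour(points, beta=0.3):
    """Nearest-neighbour tour where the next city is biased-randomly chosen."""
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        unvisited.sort(key=lambda j: math.dist(points[tour[-1]], points[j]))
        tour.append(unvisited.pop(biased_index(len(unvisited), beta)))
    return tour

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(8)]
print(biased_greedy_tour(cities))  # re-running without the seed varies the tour
```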