64 results for "attempt to obtain disclosure"


Relevance: 100.00%

Abstract:

Objective: The objective of this study was to evaluate the influence of the surface treatment and acid conditioning (AC) time of bovine sclerotic dentine on the microtensile bond strength (μTBS) to an etch-and-rinse adhesive system. Materials and methods: Thirty-six bovine incisors were divided into six groups (n = 6): G1, sound dentine submitted to AC for 15 s; G2-G6, sclerotic dentine: G2, AC for 15 s; G3, AC for 30 s; G4, EDTA and AC for 15 s; G5, diamond bur and AC for 15 s; G6, diamond paste and AC for 15 s. An adhesive system was applied to the treated dentine surfaces, followed by a hybrid composite inserted in increments and light-cured. After 24 h of storage in water at 37 °C, the specimens were cut perpendicularly with a low-speed diamond saw to obtain beams (0.8 mm x 0.8 mm cross-sectional dimensions) for μTBS testing. Data were compared by ANOVA followed by Tukey's test (P ≤ 0.05). Results: The mean μTBS values were G1: 18.87 ± 5.36 MPa; G2: 12.94 ± 2.09 MPa; G3: 11.73 ± 0.64 MPa; G4: 11.14 ± 1.50 MPa; G5: 22.75 ± 4.10 MPa; G6: 22.48 ± 2.71 MPa. G1, G5 and G6 presented similar bond strengths, significantly higher than those of all other groups. Conclusion: The surface treatment of sclerotic dentine significantly influenced the bond strength of the adhesive system. Mechanical treatment, using either a diamond bur or a diamond paste, improved bonding to bovine sclerotic dentine, reaching values similar to those for sound dentine.
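
The statistical pipeline named here, one-way ANOVA followed by Tukey's post-hoc test at P ≤ 0.05, is a standard one. Below is a minimal Python sketch (not the study's code) run on synthetic samples drawn around the reported group means; scipy.stats.tukey_hsd requires a recent SciPy, and statsmodels' pairwise_tukeyhsd is an alternative.

```python
# Minimal sketch of one-way ANOVA + Tukey's test; data are synthetic placeholders.
import numpy as np
from scipy.stats import f_oneway, tukey_hsd  # tukey_hsd needs a recent SciPy

rng = np.random.default_rng(0)
# Six groups of n = 6 beams each, drawn around the reported group means (MPa).
means = {"G1": 18.87, "G2": 12.94, "G3": 11.73, "G4": 11.14, "G5": 22.75, "G6": 22.48}
groups = {g: rng.normal(m, 2.5, size=6) for g, m in means.items()}

F, p = f_oneway(*groups.values())
print(f"ANOVA: F = {F:.2f}, p = {p:.4f}")

res = tukey_hsd(*groups.values())            # all pairwise comparisons
names = list(groups)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        flag = "*" if res.pvalue[i, j] <= 0.05 else " "
        print(f"{names[i]} vs {names[j]}: p = {res.pvalue[i, j]:.4f} {flag}")
```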

Relevance: 100.00%

Abstract:

Purpose: To evaluate the influence of cleaning procedures (pumice, anionic detergent and both procedures together) on the tensile bond strength of etch-and-rinse and self-etch adhesive systems to bovine enamel and dentin in vitro. Methods: Eighty non-carious bovine incisors were extracted and embedded in acrylic resin to obtain enamel/dentin specimens. Flat bonding surfaces were obtained by grinding. Groups were divided according to substrate (enamel or dentin), adhesive system [etch-and-rinse, Adper Single Bond 2 (SB), or self-etch, Clearfil Protect Bond (PB)] and cleaning substances (pumice, anionic detergent and their combination). The teeth were randomly divided into 20 groups (n=8): G1 - Enamel (E) + SB; G2 - E + oil (O) + SB; G3 - E + O + Pumice (P) + SB; G4 - E + O + Tergentol (T) + SB; G5 - E + O + P + T + SB; G6 - E + PB; G7 - E + O + PB; G8 - E + O + P + PB; G9 - E + O + T + PB; G10 - E + O + P + T + PB; G11 - Dentin (D) + SB; G12 - D + SB + O; G13 - D + SB + O + P; G14 - D + SB + O + T; G15 - D + SB + O + P + T; G16 - D + PB; G17 - D + O + PB; G18 - D + O + P + PB; G19 - D + O + T + PB; G20 - D + O + P + T + PB. Specimens were contaminated with handpiece oil for 5 seconds before bonding. Adhesive systems and resin composite were applied according to the manufacturers' instructions. Specimens were tested in tension after 24 hours of immersion using a universal testing machine at a crosshead speed of 0.5 mm/minute. Bond strengths were analyzed with ANOVA. Failure sites were observed and recorded. Results: Tensile bond strengths in MPa were: G1 (23.6 ± 0.9); G2 (17.3 ± 2.2); G3 (20.9 ± 0.9); G4 (20.6 ± 0.5); G5 (18.7 ± 2.3); G6 (23.0 ± 1.0); G7 (21.5 ± 2.4); G8 (19.9 ± 1.3); G9 (22.1 ± 1.2); G10 (19.1 ± 1.2); G11 (18.8 ± 1.3); G12 (15.7 ± 2.1); G13 (17.8 ± 3.3); G14 (15.3 ± 2.9); G15 (15.6 ± 1.9); G16 (14.7 ± 2.3); G17 (5.5 ± 0.9); G18 (19.3 ± 1.8); G19 (15.6 ± 1.6); G20 (20.3 ± 3.9). Statistical analysis showed that the main factors substrate and cleaning were statistically significant, as was the three-way interaction among the factors. However, the factor adhesive system showed no statistical difference. Oil contamination reduced bond strengths, being less detrimental to enamel than to dentin. The etch-and-rinse (SB) and two-step self-etch (PB) systems had similar bond strengths in the presence of oil contamination. For the etch-and-rinse system (SB), the cleaning procedures were able to clean enamel, but dentin was better cleaned by pumice. When the self-etch system (PB) was used on enamel, anionic detergent was the best cleaning substance, while on dentin the tested procedures were similarly efficient.

Relevance: 100.00%

Abstract:

The aim of this study was to evaluate the micro-shear bond strength to enamel of five adhesive systems: one single-bottle acid-etch adhesive (O), two self-etching primers (P) and two all-in-one self-etching adhesives (S). Method: Sixty premolar enamel surfaces (buccal or lingual) were ground flat with 400- and 600-grit SiC papers and randomly divided into 5 groups (n=12) according to the adhesive system: SB2 - Single Bond 2 (O); CSE - Clearfil SE Bond (P); ADS - AdheSE (P); PLP - Adper Prompt L-Pop (S); XE3 - Xeno III (S). Tygon tubing (0.8 mm inner diameter) restricted the bonding area for building the resin composite (Z250) cylinders. After storage in distilled water at 37 °C for 24 h and thermocycling, micro-shear testing was performed (crosshead speed of 0.5 mm/min). Data were submitted to one-way ANOVA and the Tukey test (α = 5%). Samples were also subjected to stereomicroscopic and SEM evaluations after micro-shear testing. Mean bond strength values (MPa ± SD) and the results of the Tukey test were: SB2: 36.36 (± 3.34)a; ADS: 33.03 (± 7.83)a; XE3: 32.76 (± 5.61)a; CSE: 30.61 (± 6.68)a; PLP: 22.17 (± 6.05)b. Groups with the same letter were not statistically different. It can be concluded that there was no significant difference among SB2, ADS, XE3 and CSE, in spite of the different etching patterns of these adhesives. Only PLP presented statistically lower bond strengths compared with the others. J Clin Pediatr Dent 35(3): 301-304, 2011

Relevance: 100.00%

Abstract:

Purpose: To evaluate the influence of surface treatments on the microtensile bond strength of luting resin cements to fiber posts. Materials and Methods: Forty-two quartz fiber posts (Light Post, RTD) were divided into 7 groups (n = 6) according to the surface treatment. I and II: experimental patented industrial treatment consisting of zirconium oxide coating and silanization (RTD); III: industrial treatment followed by adhesive application (XPBond, Dentsply Caulk); IV: adhesive (XPBond); V: adhesive (Prime&Bond NT, Dentsply Caulk); VI: silane (Calibra Silane, Dentsply Caulk); VII: no treatment. Adhesives were used in the self-curing mode. Two cements (Sealbond, RTD - group I, and Calibra, Dentsply Caulk - groups II to VII) were applied to the posts to produce cylindrical specimens. Post/cement interfaces were evaluated under SEM. The surface of the industrially coated posts was examined using energy-dispersive X-ray analysis. Cylinders were cut to obtain microtensile sticks, which were loaded in tension at a crosshead speed of 0.5 mm/min until failure. Statistical analysis was performed using Kruskal-Wallis analysis of variance followed by Dunn's multiple range test for post-hoc comparisons (p < 0.05). Weibull analysis was also performed. Results: The post/cement bond strength was significantly higher on fiber posts treated industrially (I: 23.14 ± 8.05 MPa; II: 21.56 ± 7.07 MPa; III: 22.37 ± 7.00 MPa) or treated with XPBond adhesive (IV: 21.03 ± 5.34 MPa) than after Prime&Bond NT application (V: 14.05 ± 5.06 MPa), silanization (VI: 6.31 ± 4.60 MPa) or no treatment (VII: 4.62 ± 4.31 MPa) of conventional fiber posts (p < 0.001). Conclusion: The experimental industrial surface treatment and the adhesive application enhanced the fiber post/resin cement interfacial strength. Industrial pretreatment may simplify the clinical luting procedure.
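
The non-parametric comparison and the Weibull reliability analysis named here can be reproduced with standard tools. A hedged sketch on synthetic bond-strength samples follows (Dunn's post-hoc test, available in e.g. the scikit-posthocs package, is omitted); the numbers echo groups I and VII but are invented:

```python
# Sketch of Kruskal-Wallis comparison plus a two-parameter Weibull fit;
# synthetic data, not the study's measurements.
import numpy as np
from scipy.stats import kruskal, weibull_min

rng = np.random.default_rng(1)
industrial = rng.normal(23.1, 8.0, 30).clip(min=0.1)   # ~ group I
untreated  = rng.normal(4.6, 4.3, 30).clip(min=0.1)    # ~ group VII

H, p = kruskal(industrial, untreated)
print(f"Kruskal-Wallis: H = {H:.2f}, p = {p:.4g}")

# Weibull fit with location fixed at 0: the shape parameter m is the Weibull
# modulus, a common reliability measure for bond-strength data.
for name, data in [("industrial", industrial), ("untreated", untreated)]:
    m, loc, scale = weibull_min.fit(data, floc=0)
    print(f"{name}: Weibull modulus m = {m:.2f}, "
          f"characteristic strength = {scale:.2f} MPa")
```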

Relevance: 100.00%

Abstract:

Aim: To assess the push-out strength of Epiphany SE, Epiphany and Hybrid Root SEAL to the dentine walls of root canals. Methodology: Sixty roots of canines were prepared and distributed into six groups (n = 10) according to the filling material: GI - Epiphany SE; GII - Epiphany primer and sealer; GIII - Epiphany primer, sealer and resinous solvent; GIV - Clearfil DC Bond and Epiphany sealer; GV - Clearfil, Epiphany sealer and solvent; GVI - Hybrid Root SEAL. Resilon cones were used in all groups. Roots were sectioned transversally to obtain three slices from each third. One slice was subjected to the push-out test (MPa), and results were analysed by ANOVA and Tukey's test (P < 0.05). The other two slices were prepared for scanning electron microscopy (SEM). Failure mode was also analysed. Results: A statistically significant difference (P < 0.05) occurred between Hybrid Root SEAL (5.27 ± 2.07) and the other materials, GI (0.40 ± 0.23), GII (0.78 ± 0.45), GIII (0.57 ± 0.28), GIV (0.40 ± 0.24) and GV (0.50 ± 0.41), which did not differ significantly from each other (P > 0.05). Adhesive failures predominated in groups I, II, IV and V, whilst mixed and cohesive failures were the most frequent in groups III and VI, respectively. There were gaps in the adhesive interface in GI and GII, areas of continuity between the filling material and dentine in GIV and GV, and good adaptation of the interface in GVI. Conclusion: Hybrid Root SEAL had greater push-out strength to root canal dentine than Epiphany SE and Epiphany. The use of primer, solvent and adhesive system did not influence the adhesion of Epiphany.

Relevance: 100.00%

Abstract:

The steady-state heat transfer in laminar flow of liquid egg yolk - an important pseudoplastic fluid food - in circular and concentric annular ducts was investigated experimentally. The average convection heat transfer coefficients, determined by measuring temperatures before and after heating sections held at constant wall temperature, were used to obtain simple new empirical expressions for estimating the Nusselt number in the thermal entrance region of fully established flow in the geometries considered. Comparisons with existing correlations for Newtonian and non-Newtonian fluids showed excellent agreement. The main contribution of this work is to supply practical and easily applicable correlations, which, especially for the annulus, are rather scarce and much needed in the design of heat transfer operations dealing with similar shear-thinning products. In addition, the experimental results may support existing theoretical analyses.
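
Thermal-entrance correlations of this kind are often cast as a power law in the Graetz number, Nu = a·Gz^b. The abstract does not give the fitted form, so the sketch below only illustrates, on invented data, how such coefficients are obtained from a log-log least-squares fit:

```python
# Fit an assumed power-law Nusselt correlation Nu = a * Gz**b to hypothetical
# measurements; the functional form and the data are assumptions, not the
# paper's expressions.
import numpy as np

Gz = np.array([40.0, 80.0, 150.0, 300.0, 600.0])   # hypothetical Graetz numbers
Nu = np.array([6.1, 7.8, 9.6, 12.2, 15.4])         # hypothetical average Nu

b, log_a = np.polyfit(np.log(Gz), np.log(Nu), 1)   # linear fit in log-log space
a = np.exp(log_a)
print(f"Nu = {a:.3f} * Gz^{b:.3f}")
print("predicted Nu at Gz = 200:", round(a * 200**b, 2))
```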

Relevance: 100.00%

Abstract:

We detail an innovative technique for measuring the two-dimensional (2D) velocity moments (rotation velocity, velocity dispersion and Gauss-Hermite coefficients h3 and h4) of the stellar populations of galaxy haloes using spectra from Keck DEIMOS (Deep Imaging Multi-Object Spectrograph) multi-object spectroscopic observations. The data are used to reconstruct 2D rotation velocity maps. Here we present data for five nearby early-type galaxies out to approximately three effective radii. We provide significant insights into the global kinematic structure of these galaxies, and challenge the accepted morphological classification in several cases. We show that between one and three effective radii the velocity dispersion declines very slowly, if at all, in all five galaxies. For the two galaxies with velocity dispersion profiles available from planetary nebulae data we find very good agreement with our stellar profiles. We find a variety of rotation profiles beyond one effective radius, i.e. rotation speed remaining constant, decreasing or increasing with radius. These results are of particular importance to studies which attempt to classify galaxies by their kinematic structure within one effective radius, such as the recent definition of the fast- and slow-rotator classes by the Spectrographic Areal Unit for Research on Optical Nebulae (SAURON) project. Our data suggest that the rotator class may change when larger galactocentric radii are probed. This has important implications for dynamical modelling of early-type galaxies. The data from this study are available online.
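
The quoted moments parametrize the line-of-sight velocity distribution (LOSVD) as a Gauss-Hermite series (van der Marel & Franx 1993). A small sketch of that parametrization, with illustrative parameter values rather than anything from the survey:

```python
# Gauss-Hermite LOSVD: a Gaussian modulated by Hermite terms h3 (asymmetric
# deviations) and h4 (symmetric deviations). Values below are illustrative.
import numpy as np

def losvd(v, V, sigma, h3, h4):
    """Gauss-Hermite line-of-sight velocity distribution at velocities v."""
    w = (v - V) / sigma
    H3 = (2 * np.sqrt(2) * w**3 - 3 * np.sqrt(2) * w) / np.sqrt(6)
    H4 = (4 * w**4 - 12 * w**2 + 3) / np.sqrt(24)
    return np.exp(-0.5 * w**2) * (1 + h3 * H3 + h4 * H4)

v = np.linspace(-600, 600, 241)                      # km/s grid
profile = losvd(v, V=120.0, sigma=180.0, h3=-0.05, h4=0.03)
print("velocity at profile peak:", v[np.argmax(profile)], "km/s")
```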

Relevance: 100.00%

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes in which both techniques are combined simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for analysing data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we will call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this information is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
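
The core of the method maps onto a few lines of linear algebra. A minimal sketch on a synthetic cube (not the authors' code), showing how eigenspectra and tomograms fall out of the eigendecomposition:

```python
# PCA tomography sketch: flatten a data cube (nx, ny, n_lambda) so each spaxel
# is one observation; eigenvectors of the wavelength covariance matrix are the
# eigenspectra, and projecting the data onto each one yields a 2-D tomogram.
import numpy as np

nx, ny, nl = 20, 20, 100
rng = np.random.default_rng(2)
cube = rng.normal(size=(nx, ny, nl))              # placeholder for a real cube

X = cube.reshape(nx * ny, nl)                     # rows: spaxels, cols: wavelengths
X = X - X.mean(axis=0)                            # mean-subtract each wavelength
cov = X.T @ X / (X.shape[0] - 1)
eigval, eigvec = np.linalg.eigh(cov)              # ascending eigenvalues
order = np.argsort(eigval)[::-1]                  # principal components first
eigval, eigvec = eigval[order], eigvec[:, order]

tomograms = (X @ eigvec).reshape(nx, ny, nl)      # tomograms[..., k] is the k-th
print("variance captured by PC1: {:.1%}".format(eigval[0] / eigval.sum()))
```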

Relevance: 100.00%

Abstract:

The Atlantic rainforest has the second highest biodiversity in Brazil. It has been shrinking rapidly in area as a result of intensive deforestation, and only 7% of the original cover now remains, as isolated patches or in ecological reserves. In order to obtain new information on the distribution of the Atlantic rainforest during the Quaternary, we examined herbarium data to locate relevant populations and extracted DNA from fresh leaves from 26 populations. The present-day distribution of endemic Podocarpus populations shows that they are widely dispersed across eastern Brazil, and that the expansion of Podocarpus recorded in single Amazonian pollen records may have originated from either western or eastern populations. Genetic analysis enabled us to determine the boundaries of their regional expansion: northern and central populations of P. sellowii appeared between 5° and 15° S some 16,000 years ago; populations of P. lambertii or P. sellowii have appeared between 15° and 23° S at different times, since the last glaciation at least; and P. lambertii appeared between 23° and 30° S during the recent expansion of Araucaria forests. The combination of botanical, pollen and molecular analyses proved to be a rapid means of inferring distribution boundaries for sparse populations and their regional evolution within tropical ecosystems. Today the rainforest refugia we identified have become hotspots that are crucial to the survival of the Atlantic forest under unfavourable climatic conditions and, as such, offer the only possible opportunity for this type of forest to expand in the event of future climate change.

Relevance: 100.00%

Abstract:

The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least-squares approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in mD. In order to perform the projection, only a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly for the task on which it was mostly tested, that is, mapping text sets.
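
A compact sketch of the idea as described above, on arbitrary synthetic data and with PCA standing in for the unspecified initial placement of control points: every point is constrained to lie at the centroid of its k nearest mD neighbors, the control points are additionally pinned to their 2-D positions, and the resulting overdetermined linear system is solved once in the least-squares sense.

```python
# Least-squares projection sketch; k, n_ctrl and the PCA seeding are our own
# assumptions for illustration, not prescribed by the paper.
import numpy as np

rng = np.random.default_rng(3)
n, d, k, n_ctrl = 200, 10, 8, 10
X = rng.normal(size=(n, d))                        # points in mD

# Control points: chosen at random, projected to 2-D with plain PCA.
ctrl = rng.choice(n, size=n_ctrl, replace=False)
Xc = X - X.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
ctrl_2d = Xc[ctrl] @ Vt[:2].T

# Laplacian-style rows: each point minus the mean of its k nearest neighbors.
D = np.linalg.norm(X[:, None] - X[None], axis=2)
A = np.eye(n)
for i in range(n):
    nb = np.argsort(D[i])[1:k + 1]                 # skip the point itself
    A[i, nb] = -1.0 / k
b = np.zeros((n, 2))

# Extra rows pinning the control points to their 2-D positions.
C = np.zeros((n_ctrl, n))
C[np.arange(n_ctrl), ctrl] = 1.0
A_full = np.vstack([A, C])
b_full = np.vstack([b, ctrl_2d])

Y, *_ = np.linalg.lstsq(A_full, b_full, rcond=None)  # final 2-D layout (n, 2)
print("projected layout shape:", Y.shape)
```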

Relevance: 100.00%

Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups, which are typically present in lot sizing problems, are relaxed, together with the integer frequencies of cutting patterns in the cutting problem. A large-scale linear optimization problem therefore arises, which is solved exactly by a column generation technique. It is worth noting that this combined problem still captures the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared with the usual industrial practice, which solves them in sequence.
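
Column generation is the workhorse here. The sketch below (illustrative data, cutting stock only, without the lot-sizing coupling of the paper) shows the classical Gilmore-Gomory loop: a restricted master LP priced by an integer knapsack subproblem.

```python
# Column generation for one-dimensional cutting stock; data are invented.
import numpy as np
from scipy.optimize import linprog

W = 100                               # stock roll width
sizes  = np.array([45, 36, 31, 14])   # part widths
demand = np.array([97, 610, 395, 211], dtype=float)

# Start with trivial patterns: one part type per roll.
patterns = [np.eye(len(sizes))[i] * (W // sizes[i]) for i in range(len(sizes))]

for _ in range(50):
    A = np.column_stack(patterns)
    # Restricted master LP: minimize rolls used, meet demand (A x >= d).
    res = linprog(c=np.ones(A.shape[1]), A_ub=-A, b_ub=-demand, method="highs")
    duals = -res.ineqlin.marginals           # dual prices of the demand rows

    # Pricing: unbounded knapsack maximizing dual value within the roll width.
    best = np.zeros(W + 1)
    choice = [None] * (W + 1)
    for w in range(1, W + 1):
        best[w], choice[w] = best[w - 1], choice[w - 1]
        for i, s in enumerate(sizes):
            if s <= w and best[w - s] + duals[i] > best[w]:
                best[w], choice[w] = best[w - s] + duals[i], (i, w - s)
    if best[W] <= 1 + 1e-9:                   # no pattern with negative reduced cost
        break
    col, w = np.zeros(len(sizes)), W          # rebuild the improving pattern
    while choice[w] is not None:
        i, w = choice[w]
        col[i] += 1
    patterns.append(col)

print(f"{len(patterns)} patterns, LP bound = {res.fun:.2f} rolls")
```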

Relevance: 100.00%

Abstract:

Ancient potteries are usually made of local clay material, which contains a relatively high concentration of iron. The powdered samples are usually quite black due to magnetite and, although they can be used for thermoluminescence (TL) dating, it is easier to obtain good TL readings when a clearer natural or pre-treated sample is used. In electron paramagnetic resonance (EPR) measurements, the huge signal due to iron spin-spin interaction produces intense interference, overlapping any other signal in this range. A sample is dated by dividing the accumulated radiation dose, determined from the concentration of paramagnetic species generated by irradiation, by the natural dose rate; as a consequence, EPR dating cannot be used directly, since the iron signal does not depend on radiation dose. In some cases, the density separation method using a hydrated solution of sodium polytungstate [Na6(H2W12O40)·H2O] is useful. However, sodium polytungstate is very expensive in Brazil; hence an alternative method for eliminating this interference is proposed. A chemical process that eliminates about 90% of the magnetite was developed. A sample of powdered ancient pottery was treated in a 3:1:1 mixture of HCl, HNO3 and H2O2 for 4 h. After that, it was washed several times in distilled water to remove all acid residues. The originally black sample became somewhat lighter. The resulting material was analyzed by inductively coupled plasma mass spectrometry (ICP-MS), with the result that the iron content was reduced by a factor of about 9. In EPR measurements, a non-treated natural ceramic sample shows a broad spin-spin interaction signal, whereas the chemically treated sample presents a narrow signal in the g = 2.00 region, possibly due to a (SiO3)3- radical mixed with the signal of the remaining iron [M. Ikeya, New Applications of Electron Spin Resonance, World Scientific, Singapore, 1993, p. 285]. This signal increases in intensity under gamma irradiation. However, still owing to the iron influence, the additive method yielded an age value that was too old. Since Toyoda and Ikeya [S. Toyoda, M. Ikeya, Geochem. J. 25 (1991) 427-445] state that annealing at 300 °C yields the E'1 signal with maximum intensity, while annealing at 400 °C eliminates the E'1 signal completely, subtracting the 400 °C spectrum from that of the 300 °C heat-treated sample isolates the E'1-like signal. Since this signal is radiation-dose dependent, we show that EPR dating now becomes possible.
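
The dating arithmetic underlying both the TL and EPR approaches is a single division; with invented numbers purely for illustration:

```python
# Age = accumulated (equivalent) dose / annual dose rate. Values are invented.
equivalent_dose_gy = 4.2      # Gy, from the additive-dose growth curve
annual_dose_gy = 3.5e-3       # Gy/year, from radioisotope content of sherd and soil
age_years = equivalent_dose_gy / annual_dose_gy
print(f"estimated age: {age_years:.0f} years")   # -> 1200 years
```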

Relevance: 100.00%

Abstract:

Unlike theoretical scale-free networks, most real networks present multi-scale behavior, with nodes structured into different types of functional groups and communities. While the majority of approaches for classifying nodes in a complex network have relied on local measurements of the topology/connectivity around each node, valuable information about node functionality can be obtained by concentric (or hierarchical) measurements. This paper extends previous methodologies based on concentric measurements by studying the possibility of using agglomerative clustering methods to obtain a set of functional groups of nodes, considering the nodes of a particular institutional collaboration network that includes various known communities (departments of the University of Sao Paulo). Among the interesting findings, we emphasize the scale-free nature of the obtained network, as well as the identification of different patterns of authorship emerging from different areas (e.g. human and exact sciences). Another interesting result concerns the relatively uniform distribution of hubs along concentric levels, in contrast to the non-uniform pattern found in theoretical scale-free networks such as the BA model.
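
A hedged sketch of this pipeline on a stand-in random graph: each node's feature vector counts the nodes at each concentric (shortest-path) level, and those vectors are fed to agglomerative clustering. The networkx Barabasi-Albert generator, the level cutoff and the cluster count are illustrative choices, not the paper's data or parameters.

```python
# Concentric measurements + agglomerative clustering of network nodes.
import numpy as np
import networkx as nx
from scipy.cluster.hierarchy import linkage, fcluster

G = nx.barabasi_albert_graph(120, 3, seed=4)   # scale-free stand-in network
max_level = 4

features = []
for v in G:
    dist = nx.single_source_shortest_path_length(G, v, cutoff=max_level)
    ring = np.zeros(max_level)
    for u, d in dist.items():
        if 1 <= d <= max_level:
            ring[d - 1] += 1                   # nodes at concentric level d
    features.append(ring)

Z = linkage(np.array(features), method="ward") # agglomerative clustering
labels = fcluster(Z, t=5, criterion="maxclust")
print("functional group sizes:", np.bincount(labels)[1:])
```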

Relevance: 100.00%

Abstract:

An entropy-based image segmentation approach is introduced and applied to color images obtained from Google Earth. Segmentation refers to the process of partitioning a digital image in order to locate different objects and regions of interest. The application to satellite images paves the way for automated monitoring of ecological catastrophes, urban growth, agricultural activity, maritime pollution, climate change and general surveillance. Regions representing aquatic, rural and urban areas are identified, and the accuracy of the proposed segmentation methodology is evaluated. Comparison with gray-level images revealed that color information is fundamental for obtaining an accurate segmentation.
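
One common way to build such a segmentation, sketched below under our own assumptions (window size, bin count and threshold are arbitrary, and a gray-level image stands in for the color channels the paper shows to be essential), is to threshold the Shannon entropy of local histograms so that smooth regions such as water separate from texture-rich urban areas:

```python
# Local-entropy segmentation sketch on a synthetic two-region image.
import numpy as np

def local_entropy(img, win=9, bins=32):
    """Shannon entropy of the gray-level histogram in a win x win window."""
    h = win // 2
    pad = np.pad(img, h, mode="reflect")
    out = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i + win, j:j + win]
            counts, _ = np.histogram(patch, bins=bins, range=(0, 256))
            p = counts[counts > 0] / counts.sum()
            out[i, j] = -(p * np.log2(p)).sum()
    return out

rng = np.random.default_rng(5)
smooth = rng.normal(120, 3, (32, 32))       # low-texture stand-in ("water")
rough = rng.uniform(0, 255, (32, 32))       # high-texture stand-in ("urban")
img = np.hstack([smooth, rough]).clip(0, 255)

H = local_entropy(img)
mask = H > 0.5 * H.max()                    # naive entropy threshold
print(f"fraction labelled high-entropy: {mask.mean():.2f}")  # ~ right half
```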

Relevance: 100.00%

Abstract:

When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of the transition probabilities. For example, natural uncertainty arises in the transition specification due to the elicitation of MDP transition models from an expert or their estimation from data, or from non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches, and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated.
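
For intuition, here is a tiny flat (non-factored) sketch of the robust backup such methods build on: when the imprecise transition model reduces to probability intervals, nature's worst-case distribution at each backup can be found greedily rather than by a nonlinear solver. The model below is invented for illustration and does not reproduce the paper's factored algorithms.

```python
# Robust ("pessimistic") value iteration for an interval MDP.
import numpy as np

n_s, n_a, gamma = 3, 2, 0.9
rng = np.random.default_rng(6)
R = rng.uniform(0, 1, (n_s, n_a))
# Interval bounds P_lo <= P(s'|s,a) <= P_hi built around a random base model;
# by construction sum(P_lo) <= 1 <= sum(P_hi), so the intervals are feasible.
base = rng.dirichlet(np.ones(n_s), size=(n_s, n_a))
P_lo = np.clip(base - 0.1, 0, 1)
P_hi = np.clip(base + 0.1, 0, 1)

def worst_case_expectation(v, lo, hi):
    """Min of p . v over distributions p with lo <= p <= hi, sum(p) = 1."""
    p = lo.copy()
    rest = 1.0 - p.sum()
    for s in np.argsort(v):                  # pour mass into low-value states
        take = min(hi[s] - lo[s], rest)
        p[s] += take
        rest -= take
    return p @ v

V = np.zeros(n_s)
for _ in range(200):
    Q = np.array([[R[s, a] + gamma * worst_case_expectation(V, P_lo[s, a], P_hi[s, a])
                   for a in range(n_a)] for s in range(n_s)])
    V_new = Q.max(axis=1)                    # agent maximizes against worst case
    if np.abs(V_new - V).max() < 1e-8:
        break
    V = V_new

print("robust values:", np.round(V, 3), " policy:", Q.argmax(axis=1))
```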