988 results for SELF-ADJUSTING FILE


Relevance: 100.00%

Abstract:

Objective: The goal of this study was to evaluate the efficacy of the Self-Adjusting File (SAF) and ProTaper systems for removing calcium hydroxide [Ca(OH)2] from root canals. Material and Methods: Thirty-six human mandibular incisors were instrumented with the ProTaper system up to instrument F2 and filled with a Ca(OH)2-based dressing. After 7 days, specimens were distributed into two groups (n=15) according to the method of Ca(OH)2 removal. Group I (SAF) was irrigated with 5 mL of NaOCl, and the SAF was used for 30 seconds under constant irrigation with 5 mL of NaOCl delivered by the Vatea irrigation device, followed by irrigation with 3 mL of EDTA and 5 mL of NaOCl. Group II (ProTaper) was irrigated with 5 mL of NaOCl, and the F2 instrument was used for 30 seconds, followed by irrigation with 5 mL of NaOCl, 3 mL of EDTA, and 5 mL of NaOCl. In 3 teeth the Ca(OH)2 was not removed (positive control), and in 3 teeth the canals were not filled with Ca(OH)2 (negative control). Teeth were sectioned and prepared for scanning electron microscopy. The amount of residual Ca(OH)2 was evaluated in the middle and apical thirds using a 5-score system. Results: Neither technique completely removed the Ca(OH)2 dressing. No difference was observed between SAF and ProTaper in removing Ca(OH)2 from the middle (P=0.11) and apical (P=0.23) thirds. Conclusion: The SAF system showed efficacy similar to that of the rotary instrument for removal of Ca(OH)2 from mandibular incisor root canals.

Relevance: 100.00%

Abstract:

Introduction: This study examined the anatomy of 4-rooted maxillary second molars by using micro-computed tomography. Methods: Twenty-five 4-rooted maxillary second molars were scanned to evaluate the size and curvature of the roots; the distance and spatial configuration between selected anatomical landmarks; the number of root canals and the position of the apical foramina; the occurrence of root fusion and enamel pearls; the configuration of the canal at the apical third; and the cross-sectional appearance, volume, and surface area of the root canals. Data were compared by using analysis of variance with the post hoc Tukey test (alpha = 0.05). Results: The specimens were classified as types I (n = 16), II (n = 7), and III (n = 2). The size of the roots was similar (P > .05), and most were straight with 1 canal, except the mesiobuccal root, which showed 2 canals in 24% of the samples. The pulp chamber configuration was mostly an irregular quadrilateral. The lowest mean distance between orifices was observed between the buccal roots (P < .05). Accessory canals were present mostly in the apical third. The location of the apical foramina varied considerably. Root fusion and enamel pearls occurred in 44% and 8% of the samples, respectively. The mean distance from the pulp chamber floor to the furcation was 2.15 +/- 0.57 mm. No statistical differences were found in the bi-dimensional and 3-dimensional analyses (P > .05). Conclusions: All analyzed parameters showed differences between roots, except for the length of the roots, the configuration of the canals at the apical third, and the cross-sectional appearance, volume, and surface area of the canals. (J Endod 2012;38:977-982)

Relevance: 90.00%

Abstract:

Vietnam launched its first-ever stock market, the Ho Chi Minh City Securities Trading Center (HSTC), on July 20, 2000. This is one of the pioneering works on the HSTC, finding empirical evidence for the following: anomalies in HSTC stock returns appear through clusters of limit-hits and limit-hit sequences; there is a strong herd effect toward extreme positive returns of the market portfolio; and an ARMA-GARCH specification captures serial correlation and fat tails fairly well for the stabilized period. Using further information and policy dummy variables, we show that policy decisions on the technicalities of trading can influence the risk level, through the conditional variance behavior of HSTC stock returns. Policies on trading and disclosure practices have had profound impacts on the Vietnam Stock Market (VSM). Overuse of policy tools can harm the market and investor sentiment. Price limits become increasingly irrelevant and prevent the market from self-adjusting to equilibrium. These results on the VSM have not been reported before in the literature on Vietnam's financial markets. Given the policy implications, we suggest that the Vietnamese authorities rethink the use of price limits and give more freedom to market participants.
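The ARMA-GARCH machinery the abstract refers to can be illustrated with a minimal GARCH(1,1) recursion; the coefficients below are hypothetical, not values fitted to HSTC returns:

```python
import math
import random

# Minimal GARCH(1,1) volatility recursion (illustrative coefficients, not
# values fitted to HSTC returns): the conditional variance h[t] reacts to
# the previous squared return via alpha and decays slowly via beta, which
# produces the volatility clustering an ARMA-GARCH fit captures.
omega, alpha, beta = 0.05, 0.10, 0.85   # hypothetical parameters
random.seed(0)

n = 5000
h = [omega / (1.0 - alpha - beta)]      # start at the unconditional variance
r = [math.sqrt(h[0]) * random.gauss(0.0, 1.0)]
for _ in range(1, n):
    h.append(omega + alpha * r[-1] ** 2 + beta * h[-1])
    r.append(math.sqrt(h[-1]) * random.gauss(0.0, 1.0))

sample_var = sum(x * x for x in r) / n  # should hover near omega/(1-alpha-beta) = 1.0
print(round(sample_var, 2))
```

Fitting rather than simulating reverses this recursion: one maximizes the likelihood of the observed returns over (omega, alpha, beta), which is how the conditional variance behavior of the HSTC series would be estimated.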

Relevance: 80.00%

Abstract:

We present a unified and systematic assessment of ten position control strategies for a hydraulic servo system with a single-ended cylinder driven by a proportional directional control valve. We aim at identifying the methods that achieve better tracking, have low sensitivity to system uncertainties, and offer a good balance between development effort and end results. A formal approach to this problem, based on several practical metrics, is introduced herein. The choice of metrics is important, as the comparison results between controllers can vary significantly depending on the selected criterion. Apart from the quantitative assessment, we also raise aspects that are difficult to quantify but must be kept in mind when considering the position control problem for this class of hydraulic servo systems.
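The kind of metric-driven comparison described above can be sketched on a toy plant (a generic first-order lag under proportional control, our stand-in rather than the paper's hydraulic servo model), scoring each gain with the classic IAE and ISE tracking criteria:

```python
# Score a proportional position controller on a toy first-order plant
# (dx/dt = (-x + u)/tau) with two standard tracking metrics. The plant,
# gains and horizon are illustrative, not the paper's servo system.
def track(Kp, tau=0.2, dt=0.001, T=2.0, ref=1.0):
    n = int(T / dt)
    x, iae, ise = 0.0, 0.0, 0.0
    for _ in range(n):
        e = ref - x
        u = Kp * e                  # proportional control law
        x += dt * (-x + u) / tau    # explicit Euler step of the plant
        iae += abs(e) * dt          # Integral of Absolute Error
        ise += e * e * dt           # Integral of Squared Error
    return iae, ise

for Kp in (1.0, 5.0, 20.0):
    iae, ise = track(Kp)
    print(Kp, round(iae, 3), round(ise, 3))
```

Ranking controllers by IAE rewards small steady-state error, while ISE penalizes large transient errors more heavily; as the abstract notes, the ranking can change with the chosen criterion.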

Relevance: 80.00%

Abstract:

Radiocarbon dating is routinely used in paleoecology to build chronologies of lake and peat sediments, with the aim of inferring a model that relates sediment depth to age. We present a new approach for chronology building (called "Bacon") that has received enthusiastic attention from paleoecologists. Our methodology is based on controlling core accumulation rates using a gamma autoregressive semiparametric model with an arbitrary number of subdivisions along the sediment. Prior knowledge about accumulation rates is crucial, and informative priors are routinely used. Since many sediment cores are currently analyzed, using different data sets and prior distributions, a robust (adaptive) MCMC is very useful. We use the t-walk (Christen and Fox, 2010), a self-adjusting, robust MCMC sampling algorithm that works acceptably well in many situations. Outliers are also addressed using a recent approach that considers a Student-t model for radiocarbon data. Two examples are presented, a peat core and a lake core, and our results are compared with other approaches.
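The role of the sampler can be illustrated with a plain random-walk Metropolis step on a single calendar age under the Student-t error model mentioned above (a deliberately simplified stand-in for the t-walk; the measurement, error and degrees of freedom below are hypothetical, and the prior is flat):

```python
import math
import random

# Random-walk Metropolis on one calendar age theta with Student-t errors,
# a simplified stand-in for the t-walk. y, sigma and nu are hypothetical.
random.seed(1)
y, sigma, nu = 4500.0, 30.0, 4.0

def log_post(theta):
    # Student-t log-density in theta (up to an additive constant); the
    # polynomial tails are what make the model robust to outlying dates
    return -0.5 * (nu + 1.0) * math.log(1.0 + ((y - theta) / sigma) ** 2 / nu)

theta, chain = 4400.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 25.0)   # symmetric proposal
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(theta))):
        theta = prop                          # accept the move
    chain.append(theta)

post = chain[5000:]                           # discard burn-in
post_mean = sum(post) / len(post)
print(round(post_mean))
```

The t-walk improves on this sketch by proposing from self-adjusting moves that need no hand-tuned step size, which matters when many cores with different data and priors are processed unattended.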

Relevance: 80.00%

Abstract:

Multi-scale representations of lines, edges and keypoints on the basis of simple, complex and end-stopped cells can be used for object categorisation and recognition (Rodrigues and du Buf, 2009 BioSystems 95 206-226). These representations are complemented by saliency maps of colour, texture, disparity and motion information, which also serve to model extremely fast gist vision in parallel with object segregation. We present a low-level geometry model based on a single type of self-adjusting grouping cell, with a circular array of dendrites connected to edge cells located at several angles.
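A toy reading of such a grouping cell (our own simplification, not the authors' model) is a unit whose dendrites sample an edge map at fixed angles on a circle of radius r, so that its response measures how completely that circle is populated with edge responses:

```python
import math

# Toy grouping cell: a circular array of dendrites reads edge responses at
# several angles and radius r. The edge image and geometry are invented.
def make_circle_image(size=21, cx=10, cy=10, r=5, samples=720):
    img = [[0.0] * size for _ in range(size)]
    for k in range(samples):
        a = 2.0 * math.pi * k / samples
        img[round(cy + r * math.sin(a))][round(cx + r * math.cos(a))] = 1.0
    return img

def edge_at(img, x, y):
    # nearest-pixel edge-cell response; zero outside the map
    xi, yi = round(x), round(y)
    if 0 <= yi < len(img) and 0 <= xi < len(img[0]):
        return img[yi][xi]
    return 0.0

def grouping_response(img, cx, cy, r, n_dendrites=16):
    hits = [edge_at(img, cx + r * math.cos(2.0 * math.pi * k / n_dendrites),
                    cy + r * math.sin(2.0 * math.pi * k / n_dendrites))
            for k in range(n_dendrites)]
    return sum(hits) / n_dendrites   # fraction of dendrites finding an edge

img = make_circle_image()
on = grouping_response(img, 10, 10, 5)   # dendrites aligned with the circle
off = grouping_response(img, 3, 3, 5)    # misaligned cell responds weakly
print(on, off)
```

A population of such cells over positions and radii would respond selectively to low-level geometric structure, the kind of grouping the model above attributes to a single self-adjusting cell type.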

Relevance: 80.00%

Abstract:

Mathematical models have been vitally important in the development of technologies in building engineering. A literature review identifies linear models as the most widely used building simulation models. The advent of intelligent buildings has added new challenges to the application of existing models, as an intelligent building requires learning and self-adjusting capabilities based on environmental and occupants' factors. It is therefore argued that linearity is an inappropriate basis for any model of complex building systems or occupant behaviours, whether for control or any other purpose. Chaos and complexity theory reflects the nonlinear dynamic properties of intelligent systems exercised by occupants and the environment, and has been used widely in modelling various engineering, natural and social systems. It is proposed that chaos and complexity theory be applied to the study of intelligent buildings. This paper gives a brief description of chaos and complexity theory, presents its current position and recent developments in building engineering research, and outlines future potential applications to intelligent building studies, providing a bridge between chaos and complexity theory and intelligent building research.
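The nonlinear behaviour invoked here can be illustrated with the logistic map, a standard textbook chaotic system (a toy example, not a building model): at r = 4, two orbits started 1e-9 apart diverge to order-one separation, the sensitive dependence on initial conditions that defines chaos and defeats linear extrapolation:

```python
# Logistic map x -> r*x*(1-x) at r = 4 (textbook example, not a building
# model): tiny differences in the initial condition grow exponentially.
def orbit(x0, r=4.0, n=40):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = orbit(0.2)
b = orbit(0.2 + 1e-9)                    # perturbed initial condition
max_gap = max(abs(u - v) for u, v in zip(a, b))
print(max_gap)
```

A linear model of the same data would predict the two trajectories stay close forever; the macroscopic gap after a few dozen iterations is the qualitative behaviour a chaos-theoretic treatment is meant to capture.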

Relevance: 80.00%

Abstract:

Graduate Program in Mechanical Engineering - FEG

Relevance: 80.00%

Abstract:

In the search for productivity increases, industry has invested in the development of intelligent, flexible and self-adjusting methods capable of controlling processes with the assistance of autonomous systems, whether implemented in hardware or software. Notwithstanding, modelling production systems with conventional computational techniques is rather challenging, given their complexity and non-linearity. Compared to traditional models, approaches based on Artificial Neural Networks (ANN) perform well at noise suppression and the treatment of non-linear data. Therefore, the challenges in the wood industry justify the use of ANN as a tool for process improvement and, consequently, for adding value to the final product. Furthermore, Artificial Intelligence techniques such as Neuro-Fuzzy Networks (NFNs) have proven effective, since NFNs combine the ANN's ability to learn from previous examples and generalize the acquired information with the capacity of Fuzzy Logic to transform linguistic variables into rules.
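The fuzzy-rule side of an NFN can be sketched with a tiny zero-order Sugeno inference step; the linguistic variables and rule consequents below are invented for illustration, not taken from the wood-industry model:

```python
# Zero-order Sugeno fuzzy inference with two invented rules mapping a
# "surface roughness" reading in [0, 5] to a feed-rate adjustment.
def tri(x, a, b, c):
    # triangular membership function rising on [a, b], falling on [b, c]
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def feed_adjust(roughness):
    low = tri(roughness, -5.0, 0.0, 5.0)    # "roughness is LOW"
    high = tri(roughness, 0.0, 5.0, 10.0)   # "roughness is HIGH"
    # weighted average of rule consequents: LOW -> +1 (speed up),
    # HIGH -> -1 (slow down); valid for readings in [0, 5]
    return (low * 1.0 + high * -1.0) / (low + high)

print(feed_adjust(0.0), feed_adjust(2.5), feed_adjust(5.0))
```

An NFN would tune the membership breakpoints and consequents from data, which is where the ANN-style learning from previous examples enters.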

Relevance: 80.00%

Abstract:

Electrodynamic tethered systems, in which an exposed portion of the conducting tether itself collects electrons from the ionosphere, promise to attain currents of 10 A or more in low Earth orbit. For the first time, another desirable feature of such bare-tether systems is reported and analyzed in detail: collection by a bare tether is relatively insensitive to the variations in electron density that are regularly encountered on each revolution of an orbit. This self-adjusting property of bare-tether systems occurs because the electron-collecting area on the tether is not fixed, but extends along its positively biased portion, and because the current varies as the collecting length to a power greater than unity. We show how this adjustment to density variations follows from the basic collection law of thin cylinders. The effect of variations in the motionally induced tether voltage is also analyzed. Both power and thruster modes are considered. The performance of bare-tether systems is compared with that of tethered systems using passive spherical collectors of fixed area, taking recent experimental results into consideration. Calculations accounting for motional voltage and plasma density around a realistic orbit are also presented for bare-tether systems suitable for space station applications.
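The collection law cited above can be checked numerically in a toy setting (illustrative constants, no tether circuit model): with a linear bias profile V(s) = Em*s along the positively biased segment, per-unit-length orbital-motion-limited collection dI/ds proportional to n_e*sqrt(V(s)) integrates to a current proportional to Lc**1.5, i.e. collecting length to a power greater than unity:

```python
import math

# Toy check of thin-cylinder (OML) collection with a linear bias profile
# V(s) = Em*s: dI/ds = c*n_e*sqrt(V(s)) integrates to
# I = (2/3)*c*n_e*sqrt(Em)*Lc**1.5. All constants are hypothetical.
c, n_e, Em = 1.0e-6, 1.0e11, 0.2

def current(Lc, steps=200000):
    ds = Lc / steps
    # midpoint-rule integration of the collection density along the tether
    return sum(c * n_e * math.sqrt(Em * (i + 0.5) * ds)
               for i in range(steps)) * ds

I1, I2 = current(100.0), current(200.0)
print(round(I2 / I1, 3))   # 2**1.5, i.e. about 2.828
```

Because the current grows as n_e times Lc**1.5, a drop in plasma density is partly offset when the positively biased (collecting) segment lengthens, which is the self-adjustment the paper analyzes.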

Relevance: 80.00%

Abstract:

The miniaturization, sophistication, proliferation, and accessibility of technologies are enabling the capture of more and previously inaccessible phenomena in Parkinson's disease (PD). However, more information has not translated into a greater understanding of disease complexity to satisfy diagnostic and therapeutic needs. Challenges include noncompatible technology platforms, the need for wide-scale and long-term deployment of sensor technology (among vulnerable elderly patients in particular), and the gap between the "big data" acquired with sensitive measurement technologies and their limited clinical application. Major opportunities could be realized if new technologies are developed as part of open-source and/or open-hardware platforms that enable multichannel data capture sensitive to the broad range of motor and nonmotor problems that characterize PD and are adaptable into self-adjusting, individualized treatment delivery systems. The International Parkinson and Movement Disorders Society Task Force on Technology is entrusted to convene engineers, clinicians, researchers, and patients to promote the development of integrated measurement and closed-loop therapeutic systems with high patient adherence that also serve to (1) encourage the adoption of clinico-pathophysiologic phenotyping and early detection of critical disease milestones, (2) enhance the tailoring of symptomatic therapy, (3) improve subgroup targeting of patients for future testing of disease-modifying treatments, and (4) identify objective biomarkers to improve the longitudinal tracking of impairments in clinical care and research. This article summarizes the work carried out by the task force toward identifying challenges and opportunities in the development of technologies with potential for improving the clinical management and the quality of life of individuals with PD. © 2016 International Parkinson and Movement Disorder Society.

Relevance: 80.00%

Abstract:

An indirect genetic algorithm for the non-unicost set covering problem is presented. The algorithm is a two-stage meta-heuristic, which in the past was successfully applied to similar multiple-choice optimisation problems. The two stages of the algorithm are an 'indirect' genetic algorithm and a decoder routine. First, the solutions to the problem are encoded as permutations of the rows to be covered, which are subsequently ordered by the genetic algorithm. Fitness assignment is handled by the decoder, which transforms the permutations into actual solutions to the set covering problem. This is done by exploiting both problem structure and problem-specific information. However, flexibility is retained by a self-adjusting element within the decoder, which allows adjustments both to the data and to stages within the search process. Computational results are presented.
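The permutation-plus-decoder idea can be sketched on a toy instance (the rows, columns and costs below are invented, and the decoder is reduced to a cheapest-column rule without the paper's self-adjusting element):

```python
import random

# Indirect encoding sketch for set covering: a chromosome is a permutation
# of the rows; the decoder satisfies rows in that order with the cheapest
# covering column. Instance and decoder rule are invented for illustration.
rows = [0, 1, 2, 3]
cols = {                       # column -> (cost, rows it covers)
    "A": (2, {0, 1}),
    "B": (3, {1, 2, 3}),
    "C": (1, {3}),
    "D": (4, {0, 2}),
}

def decode(perm):
    chosen, covered = [], set()
    for row in perm:
        if row in covered:
            continue
        # cheapest column covering the first unsatisfied row
        best_col = min((name for name in cols if row in cols[name][1]),
                       key=lambda name: cols[name][0])
        chosen.append(best_col)
        covered |= cols[best_col][1]
    return chosen, sum(cols[name][0] for name in chosen)

random.seed(0)
pop = [random.sample(rows, len(rows)) for _ in range(50)]  # stand-in "population"
best_cover, best_cost = min((decode(p) for p in pop), key=lambda t: t[1])
print(best_cover, best_cost)
```

In the full algorithm the genetic operators search over permutations only, so every chromosome decodes to a feasible cover; fitness is simply the decoded cost.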

Relevance: 40.00%

Abstract:

A new chiral amphiphilic salicylideneaniline bearing a terminal pyridine was synthesized. It formed reverse vesicles in toluene. The addition of Ag+, however, reversibly transforms these reverse vesicles into left-handed nanohelices accompanied by spontaneous gel formation at room temperature.

Relevance: 40.00%

Abstract:

Illustrated in this paper are two examples of altering planar growth into self-assembled island formation by adapting experimental conditions. Partial oxidation, an undersaturated solution and high temperature change Frank-Van der Merwe (FM) growth of Al0.3Ga0.7As in liquid phase epitaxy (LPE) into isolated island deposition. Low growth speed, high temperature and in situ annealing in molecular beam epitaxy (MBE) cause InAs/GaAs quantum dots (QDs) to originate in Stranski-Krastanow (SK) mode while the film is still below the critical thickness. Sample morphologies are characterized by scanning electron microscopy (SEM) or atomic force microscopy (AFM). It is suggested that such achievements are of value not only to fundamental research but also to device applications. (c) 2004 Elsevier B.V. All rights reserved.

Relevance: 40.00%

Abstract:

Recent measurements of local-area and wide-area traffic have shown that network traffic exhibits variability at a wide range of scales, i.e., self-similarity. In this paper, we examine a mechanism that gives rise to self-similar network traffic and present some of its performance implications. The mechanism we study is the transfer of files or messages whose size is drawn from a heavy-tailed distribution. We examine its effects through detailed transport-level simulations of multiple TCP streams in an internetwork. First, we show that in a "realistic" client/server network environment (i.e., one with bounded resources and coupling among traffic sources competing for resources), the degree to which file sizes are heavy-tailed can directly determine the degree of traffic self-similarity at the link level. We show that this causal relationship is not significantly affected by changes in network resources (bottleneck bandwidth and buffer capacity), network topology, the influence of cross-traffic, or the distribution of interarrival times. Second, we show that properties of the transport layer play an important role in preserving and modulating this relationship. In particular, the reliable transmission and flow control mechanisms of TCP (Reno, Tahoe, or Vegas) serve to maintain the long-range dependency structure induced by heavy-tailed file size distributions. In contrast, if a non-flow-controlled and unreliable (UDP-based) transport protocol is used, the resulting traffic shows little self-similarity: although still bursty at short time scales, it has little long-range dependence. If flow-controlled, unreliable transport is employed, the degree of traffic self-similarity is positively correlated with the degree of throttling at the source. Third, in exploring the relationship between file sizes, transport protocols, and self-similarity, we are also able to show some of the performance implications of self-similarity. We present data on the relationship between traffic self-similarity and network performance, as captured by performance measures including packet loss rate, retransmission rate, and queueing delay. Increased self-similarity, as expected, results in degradation of performance. Queueing delay, in particular, exhibits a drastic increase with increasing self-similarity. Throughput-related measures such as packet loss and retransmission rate, however, increase only gradually with increasing traffic self-similarity as long as a reliable, flow-controlled transport protocol is used.
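The heavy-tailed ingredient is easy to reproduce: drawing "file sizes" from a Pareto distribution with tail index alpha = 1.2 (infinite variance, characteristic of the heavy-tailed distributions studied here) makes a single transfer carry a visible share of the total workload, in a way an exponential sample with the same mean never does. The parameters are illustrative:

```python
import random

# Pareto "file sizes" (tail index alpha = 1.2, infinite variance) versus
# an exponential sample with the same mean; parameters are illustrative.
random.seed(7)
n, alpha, xm = 100000, 1.2, 1.0

# inverse-CDF sampling: F(x) = 1 - (xm/x)**alpha for x >= xm
pareto = [xm / (1.0 - random.random()) ** (1.0 / alpha) for _ in range(n)]
mean_size = sum(pareto) / n
expo = [random.expovariate(1.0 / mean_size) for _ in range(n)]

frac_pareto = max(pareto) / sum(pareto)   # share of the single largest transfer
frac_expo = max(expo) / sum(expo)
print(frac_pareto, frac_expo)
```

Feeding such sizes into ON/OFF sources or simulated TCP transfers is the mechanism that induces the long-range dependence measured at the link level.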