974 results for pacs: simulation techniques
Abstract:
Objectives: The aim of this study was to compare the fracture strength of three techniques used to re-attach tooth fragments in sound and endodontically treated fractured teeth, with or without fiber post placement. Material and methods: Ninety human lower incisors were randomly divided into three groups of 30 teeth each. In group A, teeth were not subjected to endodontic treatment; teeth from groups B and C were endodontically treated and the pulp chamber was restored with a composite resin. All teeth were fractured by an axial load applied to the buccal area in order to obtain tooth fragments. Teeth from each group were then divided into three subgroups according to the re-attachment technique: bonded-only, buccal chamfer and circumferential chamfer. Before the re-attachment procedures, fiber posts were placed in teeth from group C using a dual-cure resin luting cement (Duo-Link). All teeth (groups A-C) had the fragments re-attached using the same dual-cure resin luting cement. In the bonded-only subgroups, no additional preparation was made. After re-attachment of the fragment, teeth from the buccal chamfer and circumferential chamfer subgroups had a 1.0 mm deep chamfer placed along the fracture line, either on the buccal surface or along the buccal and lingual surfaces, respectively. Increments of a microhybrid composite resin (Tetric Ceram) were used in the buccal chamfer and circumferential chamfer subgroups to restore the chamfer. The specimens were loaded until fracture in the same pre-determined area. The force required to detach each fragment was recorded, and the data were subjected to a three-way analysis of variance, in which Group and Re-attachment technique were independent-measures factors and Time of fracture (first and second) was a repeated-measures factor, followed by Tukey's test (alpha = 0.05). Results: The main factors Re-attachment technique (p = 0.04) and Time of fracture (p = 0.02) were statistically significant.
The buccal and circumferential chamfer techniques were statistically similar (p > 0.05) and superior to the bonded-only technique (p < 0.05). The first time of fracture was statistically superior to the second (p < 0.001). Conclusions: The use of a fiber post is not necessary to reinforce the tooth structure when re-attaching fragments to endodontically treated teeth. When bonding a fractured fragment, the buccal or circumferential chamfer techniques are preferable to simple re-attachment without additional preparation. None of the re-attachment techniques restored the fracture strength of intact teeth. (C) 2008 Elsevier Ltd. All rights reserved.
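The full three-way design with a repeated measure cannot be reproduced from the abstract alone, but the comparison of the Re-attachment technique factor can be sketched with a minimal one-way ANOVA. The detachment forces below are hypothetical illustrative values, not the study's data.

```python
# Minimal one-way ANOVA sketch for the Re-attachment technique factor.
# All force values (N) are hypothetical, chosen only for illustration.
def one_way_anova(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

bonded_only = [180, 175, 190, 170, 185]
buccal_chamfer = [220, 230, 215, 225, 210]
circumferential = [225, 235, 218, 228, 222]
F = one_way_anova([bonded_only, buccal_chamfer, circumferential])
```

A large F statistic would then be followed by a post-hoc pairwise comparison such as Tukey's test, as in the study.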
Abstract:
Aim To compare the percentage of gutta-percha, sealer and voids and the influence of isthmuses in mesial root canals of mandibular molars filled with different techniques. Methodology Canals in 60 mesial roots of mandibular first molars were prepared with ProTaper instruments to size F2 (size 25, 0.08 taper) and filled using single-cone, lateral compaction, System B or Thermafil techniques. An epoxy resin sealer was labelled with Rhodamine-B dye to allow analysis under a confocal microscope. The percentage of gutta-percha, sealer and area of voids was calculated at 2, 4 and 6 mm from the apex using Image Tool 3.0 software. Statistical analysis was performed using nonparametric Kruskal-Wallis and Dunn tests (P < 0.05). The influence of isthmuses on the presence or absence of voids was evaluated using the Fisher test. Results At the 2 mm level, the percentage of gutta-percha, sealer and voids was similar amongst the System B, lateral compaction and single-cone techniques. The single-cone technique revealed significantly less gutta-percha and more sealer and voids in comparison with the Thermafil technique at the 2 and 4 mm levels (P < 0.05). The analysis of all sections (2, 4 and 6 mm) revealed that more gutta-percha and less sealer and voids were found in root canals filled with the Thermafil and System B techniques (P < 0.05). The Fisher test revealed that the presence of isthmuses increased the occurrence of voids in the lateral compaction group only (P < 0.05). Conclusion Gutta-percha, sealer filled area and voids were dependent on the canal-filling technique. The presence of isthmuses may influence the quality of root filling.
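The Kruskal-Wallis comparison of filling percentages across techniques can be sketched as below. The percentages are hypothetical stand-ins for the measured gutta-percha areas, and this minimal implementation assumes untied values (no tie correction).

```python
# Kruskal-Wallis H across four filling techniques; all percentages are
# hypothetical and assumed distinct (no tie correction applied).
def kruskal_wallis_h(groups):
    pooled = sorted(x for g in groups for x in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # assumes unique values
    n = len(pooled)
    return 12.0 / (n * (n + 1)) * sum(
        sum(rank[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)

single_cone = [62.1, 64.3, 60.5]
lateral_compaction = [70.2, 72.8, 69.9]
system_b = [78.4, 80.1, 77.6]
thermafil = [85.2, 83.7, 86.9]
H = kruskal_wallis_h([single_cone, lateral_compaction, system_b, thermafil])
```

A significant H would then be followed by Dunn's pairwise post-hoc test, as in the study.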
Bacterial leakage in root canals obturated by different techniques. Part 1: microbiologic evaluation
Abstract:
Objective. This study compared the coronal bacterial leakage of root canals obturated by different techniques and with different lengths of obturation. Study design. The canals of palatal roots of 160 maxillary molars were instrumented and divided into different groups according to the obturation technique used (lateral condensation, Microseal system, Touch 'n Heat + Ultrafil system, or Tagger's hybrid technique) and the length of obturation (5 mm or 10 mm). The roots were impermeabilized, sterilized in ethylene oxide, and mounted on a device for evaluation of the bacterial leakage. Results. Tagger's hybrid technique produced a statistically greater number of specimens with coronal leakage than the other techniques. There was no statistically significant difference between the lateral condensation, Touch 'n Heat + Ultrafil, and Microseal groups. Root canals with 10 mm of obturation produced a statistically significantly smaller number of specimens with leakage than root canals with 5 mm of obturation. Conclusion. Tagger's hybrid technique produced a greater number of specimens with coronal leakage than the other techniques, and a greater number of root canals with 5 mm of obturation leaked than root canals with 10 mm of obturation.
Abstract:
This study evaluated the stress levels at the core layer and the veneer layer of zirconia crowns (comprising an alternative core design vs. a standard core design) under mechanical/thermal simulation, and subjected simulated models to laboratory mouth-motion fatigue. The dimensions of a mandibular first molar were imported into computer-aided design (CAD) software and a tooth preparation was modeled. A crown was designed using the space between the original tooth and the prepared tooth. The alternative core presented an additional lingual shoulder that lowered the veneer bulk of the cusps. Finite element analyses evaluated the residual maximum principal stress fields at the core and veneer of both designs under loading and when cooled from 900 degrees C to 25 degrees C. Crowns were fabricated and mouth-motion fatigued, generating master Weibull curves and reliability data. Thermal modeling showed low residual stress fields throughout the bulk of the cusps for both groups. Mechanical simulation depicted a shift in stress levels to the core of the alternative design compared with the standard design. Significantly higher reliability was found for the alternative core. For the alternative configuration, the thermal and mechanical computer simulations showed core stresses that were, respectively, comparable to and higher than those of the standard configuration. Such a mechanical scenario probably led to the higher reliability of the alternative design under fatigue.
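The reliability comparison drawn from the master Weibull curves can be sketched with a two-parameter Weibull reliability function. The shape and characteristic-life parameters below are hypothetical, not the fitted values from the fatigue data.

```python
import math

# Two-parameter Weibull reliability R(n) = exp(-(n/eta)^beta).
# beta (shape) and eta (characteristic life, cycles) are hypothetical.
def weibull_reliability(cycles, beta, eta):
    return math.exp(-((cycles / eta) ** beta))

# Illustrative comparison at 100,000 load cycles: a longer characteristic
# life (assumed for the alternative core) yields higher reliability.
standard = weibull_reliability(100_000, beta=1.2, eta=150_000)
alternative = weibull_reliability(100_000, beta=1.2, eta=400_000)
```

In practice the parameters would come from fitting the mouth-motion fatigue failure data, and confidence bounds on the curves would drive the significance claim.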
Abstract:
Acetohydroxy acid isomeroreductase is a key enzyme involved in the biosynthetic pathway of the amino acids isoleucine, valine, and leucine. This enzyme is of great interest in agrochemical research because it is present only in plants and microorganisms, making it a potential target for specific herbicides and fungicides. Moreover, it catalyzes an unusual two-step reaction that is of great fundamental interest. With a view to characterizing both the mechanism of inhibition by potential herbicides and the complex reaction mechanism, various techniques of enzymology, molecular biology, mass spectrometry, X-ray crystallography, and theoretical simulation have been used. The results and conclusions of these studies are described briefly in this paper.
Abstract:
A combination of modelling and analysis techniques was used to design a six component force balance. The balance was designed specifically for the measurement of impulsive aerodynamic forces and moments characteristic of hypervelocity shock tunnel testing using the stress wave force measurement technique. Aerodynamic modelling was used to estimate the magnitude and distribution of forces and finite element modelling to determine the mechanical response of proposed balance designs. Simulation of balance performance was based on aerodynamic loads and mechanical responses using convolution techniques. Deconvolution was then used to assess balance performance and to guide further design modifications leading to the final balance design. (C) 2001 Elsevier Science Ltd. All rights reserved.
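The convolution/deconvolution step described above can be sketched as follows. The impulse response, load history, and regularisation constant are hypothetical stand-ins, not the balance's actual calibration; the deconvolution uses a simple regularised FFT inverse.

```python
import numpy as np

# Sketch: recover an applied load from a simulated balance response by
# regularised FFT deconvolution. g (impulse response) and u (load) are
# hypothetical; a real balance would use a measured/FE-derived response.
t = np.linspace(0, 1, 512)
g = np.exp(-5 * t) * np.sin(40 * t)      # assumed decaying-sine response
u = np.where(t < 0.2, 1.0, 0.0)          # hypothetical short step load
y = np.convolve(u, g)[: len(t)]          # forward problem: strain signal

eps = 1e-3                               # regularisation against |G| ~ 0
G = np.fft.rfft(g, 2 * len(t))
Y = np.fft.rfft(y, 2 * len(t))
u_hat = np.fft.irfft(Y * np.conj(G) / (np.abs(G) ** 2 + eps),
                     2 * len(t))[: len(t)]
```

The recovered `u_hat` approximates the step load; in the balance-design loop, the quality of this reconstruction is what guides the design modifications.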
Abstract:
The step size determines the accuracy of a discrete element simulation. Because the position and velocity updating calculation uses a pre-calculated table, step-size control cannot rely on the usual integration formulas. A step-size control scheme suitable for the table-driven velocity and position calculation instead uses the difference between the result of one big step and that of two small steps. This variable time step method automatically chooses a suitable time step size for each particle at each step according to the local conditions. Simulation using a fixed time step is compared with simulation using a variable time step. The difference in computation time for the same accuracy depends on the particular problem; for a simple test case the times are roughly similar. However, the variable step size achieves the required accuracy on the first run, whereas a fixed step size may require several runs to check the simulation accuracy, or a conservative step size that results in longer run times. (C) 2001 Elsevier Science Ltd. All rights reserved.
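The big-step/two-half-steps comparison can be sketched as below. The integrator and test equation are illustrative (explicit Euler on dy/dt = -y), not the paper's table-driven update; the control logic is the point.

```python
# Step-doubling error control: compare one step of size h with two steps
# of size h/2 and halve h until the difference meets the tolerance.
# Explicit Euler on dy/dt = -y is a stand-in for the table-driven update.
def euler(y, h, f):
    return y + h * f(y)

def adaptive_step(y, h, f, tol=1e-4):
    big = euler(y, h, f)
    half = euler(euler(y, h / 2, f), h / 2, f)
    if abs(big - half) > tol:        # too inaccurate: retry with h/2
        return adaptive_step(y, h / 2, f, tol)
    return half, h                   # accept the more accurate result

f = lambda y: -y
y, h = adaptive_step(1.0, 0.5, f)    # h shrinks until the estimate passes
```

A per-particle version would keep an `h` for each particle and also allow the step to grow again when the estimated error is comfortably below tolerance.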
Abstract:
Computer assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with the interplay between dose and dosage interval with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance) on drug concentrations using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others such as SAAM II, ModelMaker, and Stella represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons, and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors including price and networking capabilities of the package(s). Enhanced learning may result if the computer teaching of pharmacokinetics is supported by tutorials, especially where the techniques are applied to solving problems in which the link with healthcare practices is clearly established.
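The "what if" interplay between dose and dosage interval can be sketched with a standard one-compartment oral-dosing model. All parameter values below are illustrative teaching numbers, not from any of the packages named above.

```python
import math

# One-compartment model with first-order absorption, repeated dosing.
# dose (mg), F (bioavailability), V (L), ka/ke (1/h), tau (dosing interval, h)
# are all hypothetical teaching values.
def concentration(t, dose, F, V, ka, ke, tau):
    """Superpose the absorption/elimination term of every dose given so far."""
    c = 0.0
    for i in range(int(t // tau) + 1):
        td = t - i * tau
        if td >= 0:
            c += (F * dose * ka / (V * (ka - ke))) * (
                math.exp(-ke * td) - math.exp(-ka * td)
            )
    return c

# "What if" experiment: halving the dosing interval raises the trough level.
trough_q12 = concentration(47.9, dose=500, F=0.9, V=40, ka=1.0, ke=0.1, tau=12)
trough_q24 = concentration(47.9, dose=500, F=0.9, V=40, ka=1.0, ke=0.1, tau=24)
```

This is exactly the kind of library ("canned") model a calculator-simulator exposes: students vary `dose`, `tau`, `V` or `ke` and watch the concentration-time consequences rather than the algebra.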
Abstract:
Objective-To compare the accuracy and feasibility of harmonic power Doppler and digitally subtracted colour coded grey scale imaging for the assessment of perfusion defect severity by single photon emission computed tomography (SPECT) in an unselected group of patients. Design-Cohort study. Setting-Regional cardiothoracic unit. Patients-49 patients (mean (SD) age 61 (11) years; 27 women, 22 men) with known or suspected coronary artery disease were studied with simultaneous myocardial contrast echo (MCE) and SPECT after standard dipyridamole stress. Main outcome measures-Regional myocardial perfusion by SPECT, performed with Tc-99m tetrofosmin, scored qualitatively and also quantitated as per cent maximum activity. Results-Normal perfusion was identified by SPECT in 225 of 270 segments (83%). Contrast echo images were interpretable in 92% of patients. The proportions of normal MCE by grey scale, subtracted, and power Doppler techniques were respectively 76%, 74%, and 88% (p < 0.05) at > 80% of maximum counts, compared with 65%, 69%, and 61% at < 60% of maximum counts. For each technique, specificity was lowest in the lateral wall, although power Doppler was the least affected. Grey scale and subtraction techniques were least accurate in the septal wall, but power Doppler showed particular problems in the apex. On a per patient analysis, the sensitivity was 67%, 75%, and 83% for detection of coronary artery disease using grey scale, colour coded, and power Doppler, respectively, with a significant difference between power Doppler and grey scale only (p < 0.05). Specificity was also the highest for power Doppler, at 55%, but not significantly different from subtracted colour coded images. Conclusions-Myocardial contrast echo using harmonic power Doppler has greater accuracy than grey scale imaging and digital subtraction. However, power Doppler appears to be less sensitive for mild perfusion defects.
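The per-patient accuracy figures reduce to sensitivity and specificity computed from a 2x2 table against the SPECT reference. The counts below are hypothetical, not the study's tabulated data.

```python
# Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP), versus SPECT as the
# reference standard. The counts are hypothetical illustration values.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

sensitivity, specificity = sens_spec(tp=25, fn=5, tn=11, fp=8)
```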
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code the models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, comparing simulation data with that of commercial models leads only to the detection, not the isolation, of errors, and identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Secondly, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM 1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
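The final localisation step can be sketched as a nearest-subspace classification: project the residual onto the span of each class's feature matrix and pick the class with the smallest projection error. The feature matrices and residual below are hypothetical toy values, not derived from ASM 1.

```python
import numpy as np

# Nearest-subspace isolation: the error class whose feature matrix best
# explains the residual (smallest least-squares remainder) is the culprit.
# F1, F2 and r are hypothetical stand-ins for real feature matrices/residuals.
def isolate(residual, feature_matrices):
    errors = []
    for F in feature_matrices:
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        errors.append(np.linalg.norm(residual - F @ coeffs))
    return int(np.argmin(errors))

F1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # class 0 subspace
F2 = np.array([[0.0], [0.0], [1.0]])                 # class 1 subspace
r = np.array([0.0, 0.1, 2.0])                        # residual mostly along F2
culprit = isolate(r, [F1, F2])
```

With well-separated subspaces this also extends to simultaneous errors, by testing sums of subspaces rather than single ones.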
Abstract:
Computational simulations of the title reaction are presented, covering a temperature range from 300 to 2000 K. At lower temperatures we find that initial formation of the cyclopropene complex by addition of methylene to acetylene is irreversible, as is the stabilisation process via collisional energy transfer. Product branching between propargyl and the stable isomers is predicted at 300 K as a function of pressure for the first time. At intermediate temperatures (1200 K), complex temporal evolution involving multiple steady states begins to emerge. At high temperatures (2000 K) the timescale for subsequent unimolecular decay of thermalized intermediates begins to impinge on the timescale for reaction of methylene, such that the rate of formation of propargyl product does not admit a simple analysis in terms of a single time-independent rate constant until the methylene supply becomes depleted. Likewise, at the elevated temperatures the thermalized intermediates cannot be regarded as irreversible product channels. Our solution algorithm involves spectral propagation of a symmetrised version of the discretized master equation matrix, and is implemented in a high precision environment which makes hitherto unachievable low-temperature modelling a reality.
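The spectral-propagation idea can be sketched on a toy discretised master equation. The three-state rate matrix below is a hypothetical example obeying detailed balance, not the CH2 + C2H2 surface, and standard double precision stands in for the paper's high-precision environment.

```python
import numpy as np

# Toy master equation dp/dt = K p. Detailed balance lets us symmetrise
# K via D = diag(sqrt(peq)) and diagonalise once; peq and S are hypothetical.
peq = np.array([0.5, 0.3, 0.2])          # assumed equilibrium populations
d = np.sqrt(peq)
S = np.array([[0.0, 0.4, 0.1],           # symmetric coupling strengths
              [0.4, 0.0, 0.3],
              [0.1, 0.3, 0.0]])
K = (d[:, None] / d[None, :]) * S        # rates satisfying detailed balance
np.fill_diagonal(K, -K.sum(axis=0))      # columns sum to zero (conservation)

Ks = K * (d[None, :] / d[:, None])       # symmetrised matrix D^-1 K D
w, V = np.linalg.eigh(Ks)                # one diagonalisation, reused for all t

def propagate(p0, t):
    """p(t) = D V exp(w t) V^T D^-1 p0."""
    return d * (V @ (np.exp(w * t) * (V.T @ (p0 / d))))

p0 = np.array([1.0, 0.0, 0.0])           # everything starts in state 0
p_long = propagate(p0, 50.0)             # relaxes toward peq
```

Because the eigendecomposition is done once, populations at any time follow from a few matrix-vector products, which is what makes the temporal analysis (multiple steady states, time-dependent effective rates) tractable.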
Abstract:
The QU-GENE Computing Cluster (QCC) is a hardware and software solution to the automation and speedup of large QU-GENE (QUantitative GENEtics) simulation experiments that are designed to examine the properties of genetic models, particularly those that involve factorial combinations of treatment levels. QCC automates the management of the distribution of components of the simulation experiments among the networked single-processor computers to achieve the speedup.