987 results for pull-up testing


Relevance:

30.00%

Publisher:

Abstract:

Introduction: Unlike the cementation of metal posts, the cementation of fiber posts involves several details that can significantly influence the success of post retention. This study evaluated the effect of the relining procedure, the cement type, and the luted length of the post on fiber post retention. Methods: One hundred eighty bovine incisors were selected to assess post retention; after endodontic treatment, the canals were flared with diamond burs. Post holes were prepared in lengths of 5, 7.5, and 10 mm; the fiber posts were relined with composite resin and luted with RelyX ARC, RelyX Unicem, or RelyX Luting 2 (all cements manufactured by 3M ESPE, St. Paul, MN). Samples were subjected to a pull-out bond strength test in a universal testing machine; the results (N) were analyzed by three-way analysis of variance and the Tukey post hoc test (alpha = 0.05). Results: Post retention improved as the luted post length within the root canal increased, and the relining procedure improved the pull-out bond strength. RelyX Unicem and RelyX ARC showed similar retention values, both higher than RelyX Luting 2. Conclusion: Post length, the relining procedure, and the cement type are all important factors for improving the retention of fiber posts. (J Endod 2010;36:1543-1546)
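As background to the statistical step described in this abstract, the sketch below runs a three-way analysis of variance with a Tukey post hoc test on hypothetical pull-out force data; the factor levels mirror the study design, but the data, group sizes and effect sizes are invented for illustration only.

```python
# Illustrative three-way ANOVA + Tukey test on hypothetical pull-out force data (N).
# The data-generating model and sample sizes are assumptions, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
lengths = [5.0, 7.5, 10.0]                                  # luted length (mm)
relined = ["relined", "not relined"]                        # relining procedure
cements = ["RelyX ARC", "RelyX Unicem", "RelyX Luting 2"]   # luting cement

rows = []
for ln in lengths:
    for rl in relined:
        for ce in cements:
            for _ in range(10):                             # 10 hypothetical specimens per cell
                force = 20 * ln + (30 if rl == "relined" else 0) + rng.normal(0, 25)
                rows.append({"length": ln, "relined": rl, "cement": ce, "force": force})
df = pd.DataFrame(rows)

# Three-way ANOVA with all interactions (alpha = 0.05)
model = ols("force ~ C(length) * C(relined) * C(cement)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Tukey post hoc comparison between the three cements
print(pairwise_tukeyhsd(df["force"], df["cement"], alpha=0.05))
```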

Relevance:

30.00%

Publisher:

Abstract:

Two index patients from the family analyzed in this study underwent bilateral adrenalectomy due to pheochromocytoma. Genetic testing of the patients and of seven first-degree relatives was then performed. The two patients with pheochromocytoma and two other asymptomatic family members carried the c.496G>T mutation in exon 3 of the VHL gene. The family was lost to medical follow-up. Three years after the genetic evaluation, the patients' sister, a carrier of the mutation, was referred to our service after a pregnancy complicated by pre-eclampsia. She reported paroxysms suggestive of pheochromocytoma, but urinary metanephrines were negative. However, abdominal computed tomography revealed an adrenal mass that also showed uptake on metaiodobenzylguanidine (MIBG) scintigraphy. This study shows that molecular analysis of the index patient can lead to the identification of asymptomatic relatives carrying the mutation. Furthermore, even with negative urinary metanephrines, the identification of a specific mutation raised the level of suspicion and led to the detection of pheochromocytoma in the sister of those affected by the disease.

Relevance:

30.00%

Publisher:

Abstract:

Background: We describe an experimental model for transanal endorectal pull-through surgery using the method of de la Torre and Ortega that can be used for training purposes in experimental laboratories. Methods: Ten rabbits underwent the transanal endorectal pull-through technique of de la Torre and Ortega. Animals were randomly selected in the Botucatu School of Medicine experimental laboratory and weighed between 2800 and 4400 g. Colons were not prepared, and antibiotic therapy was not used; dipyrone was administered postoperatively for analgesia. We standardized the resected segment size, recorded surgical time, and observed survival and possible complications for 1 month. Results: All animals survived the initial follow-up period without infection. Bowel movements returned quickly, and all animals were evacuating regularly within the first 24 hours. Mean surgical time was 48.6 minutes. Conclusions: The experimental model proposed in this study is very useful for training and improving surgical techniques using the method of de la Torre and Ortega. The rabbit is an excellent animal for this surgery because of its size and postoperative resistance. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Objective: The purpose of this study was to assess the influence of conditioning methods and thermocycling on the bond strength between composite core and resin cement. Material and Methods: Eighty blocks (8x8x4 mm) were prepared with core build-up composite. The cementation surface was roughened with 120-grit carbide paper and the blocks were thermocycled (5,000 cycles, between 5°C and 55°C, with a 30 s dwell time in each bath). A layer of temporary luting agent was applied. After 24 h, the layer was removed, and the blocks were divided into five groups, according to surface treatment: (NT) no treatment (control); (SP) grinding with 120-grit carbide paper; (AC) etching with 37% phosphoric acid; (SC) sandblasting with 30 μm SiO2 particles followed by silane application; (AO) sandblasting with 50 μm Al2O3 particles followed by silane application. Two composite blocks were cemented to each other (n=8) and sectioned into sticks. Half of the specimens from each block were immediately tested for microtensile bond strength (μTBS), while the other half was subjected to storage for 6 months, thermocycling (12,000 cycles, between 5°C and 55°C, with a dwell time of 30 s in each bath) and then the μTBS test in a mechanical testing machine. Bond strength data were analyzed by repeated-measures two-way ANOVA and Tukey's test (α=0.05). Results: The μTBS was significantly affected by surface treatment (p=0.007) and thermocycling (p<0.001). Before aging, the SP group presented higher bond strength than the NT and AC groups, whereas all the other groups were statistically similar. After aging, all the groups were statistically similar. SP submitted to thermocycling showed lower bond strength than SP without thermocycling. Conclusion: Core composites should be roughened with a diamond bur before the luting process. Thermocycling tends to reduce the bond strength between composite and resin cement.
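For readers unfamiliar with the μTBS values reported here, the bond strength of a stick is simply the tensile failure load divided by the bonded cross-sectional area; the numbers in the example below are hypothetical and serve only to show the unit conversion.

```latex
% Microtensile bond strength = failure load / bonded cross-sectional area
\sigma_{\mu\mathrm{TBS}} = \frac{F}{w \times t}
  \quad\text{e.g.}\quad
  \frac{25~\mathrm{N}}{1.0~\mathrm{mm} \times 1.0~\mathrm{mm}}
  = 25~\mathrm{N/mm^2} = 25~\mathrm{MPa}
```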

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The aim of this prospective study was to cephalometrically analyze the stability of the dentoalveolar and skeletal changes produced by a removable appliance with a palatal crib associated with a high-pull chincup in individuals with anterior open bite treated for 12 months, and to compare them with untreated individuals of similar age and malocclusion followed over the same period. METHODS: Nineteen children with a mean age of 9.78 years treated for 12 months with a removable appliance with a palatal crib associated with chincup therapy were evaluated after 15 months (post-treatment period) and compared with a control group of 19 subjects with a mean age of 9.10 years and the same malocclusion, followed up over the same period. Seventy-six lateral cephalograms were evaluated at T1 (after correction) and T2 (follow-up), and the cephalometric variables were analyzed by statistical tests. RESULTS: The results did not show significant skeletal, soft tissue or maxillary dentoalveolar changes. Overall, the treatment effects in the experimental group were maintained at the T2 evaluation, with an increase of 0.56 mm in overbite. Overjet and maxillary incisor/molar position (vertical and sagittal) remained essentially unchanged during the study period. Only the mandibular incisors showed significant changes (labial inclination and protrusion) compared with the control group. CONCLUSIONS: Early open bite treatment with a removable appliance and palatal crib associated with high-pull chincup therapy provided stability of 95%.

Relevance:

30.00%

Publisher:

Abstract:

In this work we propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected over many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from observed disease counts and expected disease counts computed from reference population disease rates, an SMR is derived in each area as the MLE under a Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the small population underlying the area or the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classic and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without leaving the preliminary-study perspective that an analysis of SMR indicators is expected to keep. We implement control of the False Discovery Rate (FDR), a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical, fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all observations in the map under study, rather than only the single observation. This improves test power in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities p̂_i of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data (FDR-hat) can be calculated for any set of p̂_i's corresponding to areas declared at high risk (where the null hypothesis is rejected) by averaging the p̂_i's themselves. FDR-hat can then be used as an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that FDR-hat does not exceed a pre-specified value; we call these FDR-hat based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still providing estimates of the relative risks, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of estimation of the relative risks. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true, and the risk level in the remaining areas. In summarizing the simulation results we always consider FDR estimation in sets constituted by all p̂_i's lower than a threshold t. We show graphs of FDR-hat and of the true FDR (known by simulation) plotted against the threshold t to assess FDR estimation. Varying the threshold, we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against FDR-hat, we can check the sensitivity and specificity of the corresponding FDR-hat based decision rules. To investigate the over-smoothing of the relative risk estimates, we compare box-plots of these estimates in high-risk areas (known by simulation) obtained by both our model and the classic Besag, York and Mollié model. All summary tools are worked out for all simulated scenarios (54 in total).
Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence a conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aims. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat based decision rules is generally low, but specificity is high; in these scenarios a selection rule based on FDR-hat = 0.05 or FDR-hat = 0.10 can be suggested. In cases where the number of true alternative hypotheses (true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and an FDR-hat = 0.15 based decision rule gains power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05); this results in a loss of specificity of an FDR-hat = 0.05 based decision rule. In such scenarios, decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 cannot be recommended because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risks and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
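To make the FDR-hat based selection rule concrete, the sketch below averages posterior null probabilities over a candidate set of areas and keeps the largest set whose estimated FDR stays at or below a target; the probabilities are hypothetical placeholders standing in for the MCMC output of the hierarchical model.

```python
# Illustrative FDR-hat based selection rule (hypothetical posterior probabilities).
# p_null[i] is the MCMC-estimated posterior probability that area i has no excess risk.
import numpy as np

def select_high_risk(p_null, fdr_target=0.05):
    """Select as many areas as possible while keeping the estimated FDR <= fdr_target.

    The estimated FDR of a selected set is the mean posterior null probability
    within that set, so areas are added in order of increasing p_null.
    """
    order = np.argsort(p_null)                                    # most convincing areas first
    running_mean = np.cumsum(p_null[order]) / np.arange(1, len(p_null) + 1)
    n_selected = int(np.sum(running_mean <= fdr_target))
    selected = order[:n_selected]
    fdr_hat = running_mean[n_selected - 1] if n_selected > 0 else 0.0
    return selected, fdr_hat

# Hypothetical posterior null probabilities for 10 areas (placeholders, not real output)
p_null = np.array([0.01, 0.02, 0.03, 0.20, 0.35, 0.50, 0.70, 0.80, 0.90, 0.95])
areas, fdr_hat = select_high_risk(p_null, fdr_target=0.05)
print("high-risk areas:", areas, "estimated FDR:", round(fdr_hat, 3))
```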

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To evaluate the prevalence of the different HPV genotypes in patients diagnosed with CIN2/3 in the Emilia-Romagna Region, the genotype-specific persistence of HPV and the expression of the viral oncogenes E6/E7 during post-treatment follow-up as risk factors for recurrence/persistence or progression of disease, and to verify the applicability of new biomolecular diagnostic tests in cervical cancer screening. Methods: Patients with abnormal screening cytology who underwent excisional treatment (T0) for a diagnosis of CIN2/3 on targeted biopsy were included. At T0 and during follow-up at 6, 12, 18 and 24 months, in addition to the Pap test and colposcopy, HPV DNA detection and genotyping for 28 genotypes were performed. In cases positive for the DNA of the 5 genotypes 16, 18, 31, 33 and/or 45, E6/E7 HPV mRNA testing was carried out. Preliminary results: 95.8% of the 168 selected patients were HPV DNA positive at T0. In 60.9% of cases the infections were single (mainly HPV 16 and 31), and in 39.1% they were multiple. HPV 16 was the most frequently detected genotype (57%). Of the patients positive for the 5 HPV DNA genotypes, 94.3% (117/124) were mRNA positive. There was a drop-out of 38/168 patients. At 18 months (95% of patients), persistence of HPV DNA of any genotype was 46% and persistence of HPV DNA of the 5 genotypes was 39%, with mRNA expression in 21%. Disease recurrence (CIN2+) occurred in 10.8% (14/130) at 18 months. The Pap test was negative in 4/14 cases, the HPV DNA test was positive in all cases, and the mRNA test was positive in 11/12 cases. Conclusions: The HR-HPV DNA test is more sensitive than cytology, while the mRNA test is more specific in identifying recurrence. Definitive data will be available at the end of the scheduled follow-up.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is the power transient analysis of experimental devices placed within the reflector of the Jules Horowitz Reactor (JHR). Since the JHR material testing facility is designed to achieve 100 MW of core thermal power, a large reflector hosts fissile material samples that are irradiated up to a total relevant power of 3 MW. MADISON devices are expected to attain 130 kW, whereas the ADELINE nominal power is around 60 kW. In addition, MOLFI test samples are expected to reach 360 kW in the LEU configuration and up to 650 kW in the HEU configuration. Safety issues concern shutdown transients and require particular verification of the thermal power decrease of these fissile samples with respect to core kinetics, including the determination of single-device reactivity. A calculation model is conceived and applied to properly account for the different nuclear heating processes and the related time-dependent features of the device transients. An innovative methodology is developed in which the flux shape modification during control rod insertion is investigated for its impact on device power through core-reflector coupling coefficients; previous methods, which considered only nominal core-reflector parameters, are thereby improved. Moreover, the effect of delayed emissions is evaluated in terms of the spatial impact on the devices of a diffuse in-core delayed neutron source. The transport of delayed gammas, related to fission product concentrations, is taken into account through evolution calculations of different fuel compositions over an equilibrium cycle. Provided accurate device reactivity control, power transients are then computed for every sample according to the envisaged shutdown procedures. The results obtained in this study are aimed at design feedback and reactor management optimization by the JHR project team. Moreover, the Safety Report is intended to use the present analysis for improved device characterization.
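Purely as an illustration of the coupling-coefficient idea described in this abstract, the toy sketch below derives a device power from the core power through a core-reflector coupling coefficient that is corrected as control rods are inserted; the coefficient values, the linear correction and the exponential core transient are invented placeholders and are not taken from the JHR analysis.

```python
# Toy illustration of a core-reflector coupling coefficient: device power follows core
# power through a coefficient that changes with rod insertion. All values are invented.
import numpy as np

def core_power(t, p0=100e6, tau=12.0):
    """Core thermal power (W) during a simplified shutdown transient (toy exponential)."""
    return p0 * np.exp(-t / tau)

def coupling_coefficient(rod_insertion):
    """Hypothetical core-reflector coupling coefficient vs. rod insertion fraction (0-1).

    Rod insertion deforms the flux shape, so the nominal coefficient is corrected here
    with a purely illustrative linear term.
    """
    c_nominal = 130e3 / 100e6          # nominal device/core power ratio (MADISON-like)
    return c_nominal * (1.0 - 0.3 * rod_insertion)

def device_power(t, rod_insertion):
    """Device power (W) at time t (s), given the current rod insertion fraction."""
    return coupling_coefficient(rod_insertion) * core_power(t)

for t, rod in [(0.0, 0.0), (5.0, 0.5), (20.0, 1.0), (60.0, 1.0)]:
    print(f"t = {t:5.1f} s, rods = {rod:.1f} -> device power = {device_power(t, rod)/1e3:8.2f} kW")
```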

Relevance:

30.00%

Publisher:

Abstract:

Sub-grid scale (SGS) models are required in large-eddy simulations (LES) to model the influence of the unresolved small scales, i.e. the flow at the smallest scales of turbulence, on the resolved scales. In the following work two SGS models are presented and analyzed in depth in terms of accuracy through several LESs with different spatial resolutions, i.e. grid spacings. The first part of this thesis focuses on the basic theory of turbulence, the governing equations of fluid dynamics and their adaptation to LES. Furthermore, two important SGS models are presented: the Dynamic eddy-viscosity model (DEVM), developed by Germano et al. (1991), and the Explicit Algebraic SGS model (EASSM), by Marstorp et al. (2009). In addition, some details are given about the implementation of the EASSM in the pseudo-spectral Navier-Stokes code SIMSON (Chevalier et al., 2007). The performance of the two aforementioned models is investigated in the following chapters by means of LES of a channel flow, at friction Reynolds numbers from Re_τ = 590 up to Re_τ = 5200, with relatively coarse resolutions. Data from each simulation are compared to baseline DNS data. Results show that, in contrast to the DEVM, the EASSM has promising potential for flow prediction at high friction Reynolds numbers: the higher the friction Reynolds number, the better the EASSM behaves and the worse the DEVM performs. The better performance of the EASSM is attributed to its ability to capture flow anisotropy at the small scales through a correct formulation of the SGS stresses. Moreover, a considerable reduction in the required computational resources can be achieved using the EASSM compared with the DEVM. The EASSM therefore combines accuracy and computational efficiency, implying a clear potential for industrial CFD usage.
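As background for the two closures compared in this abstract, an eddy-viscosity SGS model relates the deviatoric SGS stress τ_ij to the resolved strain rate S̄_ij; the Smagorinsky form below (with filter width Δ and coefficient C_s, computed dynamically in the DEVM) states the standard textbook relations and is not taken from the thesis itself, while the EASSM replaces the scalar eddy viscosity with an explicit algebraic, anisotropy-resolving expression for the stresses.

```latex
% Eddy-viscosity closure for the deviatoric SGS stress (standard LES relations)
\tau_{ij} - \tfrac{1}{3}\delta_{ij}\,\tau_{kk} = -2\,\nu_t\,\bar{S}_{ij},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\!\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right),
\qquad
\nu_t = (C_s \Delta)^2 \lvert \bar{S} \rvert,
\quad
\lvert \bar{S} \rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}
```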

Relevance:

30.00%

Publisher:

Abstract:

The first part of this three-part review on the relevance of laboratory testing of composites and adhesives deals with approval requirements for composite materials. We compare the in vivo and in vitro literature data and discuss the relevance of in vitro analyses. The standardized ISO protocols are presented, with a focus on the evaluation of physical parameters. These tests all have a standardized protocol that describes the entire test set-up. The tests analyse flexural strength, depth of cure, susceptibility to ambient light, color stability, water sorption and solubility, and radiopacity. Some tests have a clinical correlation. A high flexural strength, for instance, decreases the risk of fractures of the marginal ridge in posterior restorations and incisal edge build-ups of restored anterior teeth. Other tests do not have a clinical correlation or the threshold values are too low, which results in an approval of materials that show inferior clinical properties (e.g., radiopacity). It is advantageous to know the test set-ups and the ideal threshold values to correctly interpret the material data. Overall, however, laboratory assessment alone cannot ensure the clinical success of a product.
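As an illustration of the flexural strength test mentioned above, the strength of a rectangular bar loaded in three-point bending follows from the failure load and the specimen geometry; the dimensions below correspond to the usual ISO 4049 bar and are given only as a worked example.

```latex
% Three-point bending flexural strength of a rectangular bar
% F: failure load, l: support span, b: specimen width, h: specimen height
\sigma_f = \frac{3 F l}{2 b h^2}
  \quad\text{e.g.}\quad
  \frac{3 \times 50~\mathrm{N} \times 20~\mathrm{mm}}
       {2 \times 2~\mathrm{mm} \times (2~\mathrm{mm})^2}
  = 187.5~\mathrm{MPa}
```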

Relevance:

30.00%

Publisher:

Abstract:

In the past few decades, integrated circuits have become a major part of everyday life. Every circuit that is created needs to be tested for faults so that faulty circuits are not sent to end users. The creation of these tests is time consuming, costly and difficult to perform on larger circuits. This research presents a novel method for fault detection and test pattern reduction in integrated circuitry under test. By leveraging the FPGA's reconfigurability and parallel processing capabilities, a speed-up in fault detection can be achieved over previous computer simulation techniques. This work presents the following contributions to the field of stuck-at fault detection. First, we present a new method for inserting faults into a circuit netlist: given any circuit netlist, our tool can insert multiplexers at the correct internal nodes to aid fault emulation on reconfigurable hardware. Second, we present a parallel method of fault emulation: the benefit of the FPGA is not only its ability to implement any circuit, but also its ability to process data in parallel, and this research uses that capability to create a more efficient emulation method that implements numerous copies of the same circuit in the FPGA. Third, we present a new method for organizing the most efficient faults: most methods for determining the minimum number of inputs that cover the most faults require sophisticated software programs that use heuristics, whereas by utilizing hardware, this research is able to process data faster and use a simpler method for minimizing inputs.
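A minimal software sketch of the mux-insertion idea is shown below: a two-input multiplexer placed at an internal node either passes the fault-free value or forces a stuck-at value, and a test pattern is judged by comparing faulty and fault-free outputs. The tiny netlist and node names are hypothetical, and the thesis performs this in FPGA hardware rather than in Python.

```python
# Toy stuck-at fault emulation via an injection mux at one internal node.
# The netlist, node names, and patterns are illustrative; the actual work runs
# many circuit copies in parallel on an FPGA rather than in software.
from itertools import product

def mux(select, normal, forced):
    """Injection multiplexer: pass the normal value, or force the stuck-at value."""
    return forced if select else normal

def circuit(a, b, c, inject=False, stuck_value=0):
    """Example netlist: n1 = a AND b (fault site), out = n1 OR c."""
    n1 = a & b
    n1 = mux(inject, n1, stuck_value)   # fault-injection mux at node n1
    return n1 | c

def detecting_patterns(stuck_value=0):
    """Return input patterns whose output differs between fault-free and faulty circuits."""
    detected = []
    for a, b, c in product((0, 1), repeat=3):
        good = circuit(a, b, c, inject=False)
        bad = circuit(a, b, c, inject=True, stuck_value=stuck_value)
        if good != bad:
            detected.append((a, b, c))
    return detected

print("patterns detecting n1 stuck-at-0:", detecting_patterns(0))
print("patterns detecting n1 stuck-at-1:", detecting_patterns(1))
```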

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Tuberculin skin testing (TST) and preventive treatment of tuberculosis (TB) are recommended for all persons with human immunodeficiency virus (HIV) infection. We aimed to assess the effect of TST and preventive treatment of TB on the incidence of TB in the era of combination antiretroviral therapy in an area with low rates of TB transmission. METHODS: We calculated the incidence of TB among participants who entered the Swiss HIV Cohort Study after 1995, and we studied the associations of TST results, epidemiological and laboratory markers, preventive TB treatment, and combination antiretroviral therapy with TB incidence. RESULTS: Of 6160 participants, 142 (2.3%) had a history of TB at study entry, and 56 (0.91%) developed TB during a total follow-up period of 25,462 person-years, corresponding to an incidence of 0.22 cases per 100 person-years. TST was performed for 69% of patients; 9.4% of patients tested had positive results (induration ≥5 mm in diameter). Among patients with positive TST results, TB incidence was 1.6 cases per 100 person-years if preventive treatment was withheld, but none of the 193 patients who received preventive treatment developed TB. Positive TST results (adjusted hazard ratio [HR], 25; 95% confidence interval [CI], 11-57), missing TST results (HR, 12; 95% CI, 4.8-20), origin from sub-Saharan Africa (HR, 5.8; 95% CI, 2.7-12.5), low CD4+ cell counts, and high plasma HIV RNA levels were associated with an increased risk of TB, whereas the risk was reduced among persons receiving combination antiretroviral therapy (HR, 0.44; 95% CI, 0.2-0.8). CONCLUSION: Screening for latent TB using TST and administering preventive treatment for patients with positive TST results is an efficacious strategy to reduce TB incidence in areas with low rates of TB transmission. Combination antiretroviral therapy reduces the incidence of TB.
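As a quick check of the incidence figure quoted in this abstract, the rate is simply the number of incident cases divided by the accumulated follow-up time:

```latex
% Incidence rate = incident cases / person-years of follow-up
\text{incidence} = \frac{56~\text{cases}}{25\,462~\text{person-years}}
                 \approx 0.0022~\text{per person-year}
                 = 0.22~\text{per 100 person-years}
```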

Relevance:

30.00%

Publisher:

Abstract:

This study evaluated the course of psychological variables during a 2-year follow-up of patients after common whiplash of the cervical spine. From a sample of 117 non-selected patients with common whiplash (investigated on average 7.2 ± 4.2 days after trauma), a total of 21 suffered trauma-related symptoms over the 2 years following the initial injury. These patients (symptomatic group) were compared with 21 age-, gender- and education-matched patients who showed complete recovery from trauma-related symptoms during the 2-year follow-up (asymptomatic group). Both groups underwent standardised testing procedures (i.e., the Freiburg Personality Inventory and the Well-Being Scale) at referral and at 3, 6 and 24 months. In the symptomatic group, no significant changes in ratings of neck pain or headache were found during follow-up. Significant differences between the groups and significant deviation of scores over time were found on the Well-Being and Nervousness Scales. There was no significant difference between the groups on the Depression Scale, indicating a possible somatic basis for the changes in psychological functioning in the investigated sample. With regard to the Extraversion and Neuroticism scales, there were neither significant differences between the groups nor significant deviation over time. These results highlight that patients' psychological problems are a consequence rather than a cause of somatic symptoms in whiplash.