860 results for Treatment Effectiveness Evaluation


Abstract:

Background
Total hip arthroplasty (THA) is a commonly performed procedure, and numbers are increasing with ageing populations. One of the most serious complications of THA is surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.

Objectives
The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. Identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty
2. Evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty
3. Construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies
4. Quantification of the effect of uncertainty in the model
5. Assessment of the value of perfect information among model parameters to inform future data collection

Methods
The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts and to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences, as well as to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and their limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. The strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients' transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty in model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.

Results
The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty). Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared with using antibiotic prophylaxis alone, and the error probability that this strategy might not have the largest NMB was very low (<5%). Not using antibiotic prophylaxis (No AP), or combining antibiotic prophylaxis with laminar air operating rooms (AP & LOR), resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and for higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.

Conclusions
Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
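
The decision rule behind the strategy ranking in the Results can be stated compactly: a strategy's net monetary benefit is its QALYs valued at the willingness-to-pay threshold, minus its cost, and the strategy with the largest NMB is preferred. A minimal sketch in Python, with purely illustrative cohort totals and an assumed AUD 50,000/QALY threshold (neither figure is taken from the thesis):

```python
# Illustrative net-monetary-benefit (NMB) ranking. The cohort costs/QALYs
# and the AUD 50,000/QALY threshold are placeholders, not model outputs.
WTP = 50_000  # assumed willingness to pay per QALY (AUD)

strategies = {
    # strategy: (total cohort cost, total cohort QALYs) -- hypothetical
    "No AP":    (312_000_000, 299_000),
    "AP":       (310_400_000, 299_163),
    "AP & ABC": (310_000_000, 299_230),
    "AP & LOR": (314_000_000, 299_100),
}

def nmb(cost, qalys, wtp=WTP):
    """Net monetary benefit: QALYs valued at the threshold, minus cost."""
    return qalys * wtp - cost

for name, (cost, qalys) in sorted(strategies.items(),
                                  key=lambda kv: -nmb(*kv[1])):
    print(f"{name:9s} NMB = ${nmb(cost, qalys):,.0f}")
```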

Abstract:

The objective of the study was to assess, from a health service perspective, whether a systematic program to modify kidney and cardiovascular disease reduced the costs of treating end-stage kidney failure. The participants in the study were 1,800 aboriginal adults with hypertension, diabetes with microalbuminuria or overt albuminuria, or overt albuminuria alone, living on two islands in the Northern Territory of Australia during 1995 to 2000. Perindopril was the primary treatment agent, and other medications were also used to control blood pressure. Control of glucose and lipid levels was attempted, and health education was offered. Program resource use and costs were evaluated at 3 and 4.7 years of follow-up. On an intention-to-treat basis, the numbers of dialysis starts and dialysis-years avoided were estimated by comparing the fate of the treatment group with that of historical control subjects, matched for disease severity, who were followed before the treatment program began. For the first three years, an estimated 11.6 person-years of dialysis were avoided, and over 4.7 years, 27.7 person-years of dialysis were avoided. The net cost of the program was 1,210 dollars more per person per year than status quo care, and the dialyses avoided gave net savings of 1.0 million dollars at 3 years and 3.4 million dollars at 4.7 years. The treatment program provided significant health benefit and impressive cost savings in dialysis avoided.
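
The savings reported above have a simple structure: dialysis-years avoided, valued at the annual cost of providing dialysis, net of the program's incremental cost. A back-of-envelope sketch; the annual dialysis cost and program cost below are hypothetical placeholders, not figures from the study:

```python
# Shape of the cost-offset calculation; the cost inputs are hypothetical.
def net_saving(dialysis_years_avoided, annual_dialysis_cost, extra_program_cost):
    """Dialysis costs avoided minus the program's incremental cost."""
    return dialysis_years_avoided * annual_dialysis_cost - extra_program_cost

# 27.7 dialysis-years avoided is the abstract's 4.7-year figure; the
# other two inputs are assumptions for illustration only.
print(f"${net_saving(27.7, 150_000, 750_000):,.0f}")
```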

Abstract:

Austroads called for responses to a tender to investigate options for rehabilitation in alcohol interlock programs. Following a successful application by the Centre for Accident Research and Road Safety – Queensland (CARRS-Q), a program of work was developed. The project has four objectives: 1. Develop a matrix outlining existing policies in national and international jurisdictions with respect to treatment and rehabilitation programs and criteria for eligibility for interlock removal; 2. Critically review the available literature with a focus on evaluation outcomes regarding the effectiveness of treatment and rehabilitation programs; 3. Analyse and assess the strengths and weaknesses of the programs/approaches identified; and 4. Outline options with an evidence base for consideration by licensing authorities.

Abstract:

Co-occurrence of HIV and substance abuse is associated with poor outcomes for HIV-related health and substance use. Integration of substance use and medical care holds promise for HIV patients, yet few integrated treatment models have been reported, and most of those lack data on treatment outcomes in diverse settings. This study examined the substance use outcomes of an integrated treatment model for patients with both HIV and substance use at three different clinics. Sites differed by type and degree of integration: one integrated academic medical center, one co-located academic medical center, and one co-located community health center. Participants (n = 286) received integrated substance use and HIV treatment for 12 months and were interviewed at 6-month intervals. We used linear generalized estimating equation (GEE) regression analysis to examine changes in Addiction Severity Index (ASI) alcohol and drug severity scores. To test whether the treatment was differentially effective across sites, we compared a full model including site-by-time-point interaction terms to a reduced model including only site fixed effects, as sketched below. Alcohol severity scores decreased significantly at 6 and 12 months. Drug severity scores decreased significantly at 12 months. Once baseline severity variation was incorporated into the model, there was no evidence of variation in alcohol or drug score changes by site. Substance use outcomes did not differ by age, gender, income, or race. This integrated treatment model offers an option for treating diverse patients with HIV and substance use in a variety of clinic settings. Studies with control groups are needed to confirm these findings.
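
A minimal sketch of the site-by-time comparison described above, using statsmodels GEE. The data frame, file and column names are hypothetical, and the working correlation structure and covariates are assumptions, since the abstract does not report the exact specification:

```python
# Hypothetical long-format data: one row per participant per time point,
# with columns patient_id, site, time (0/6/12 months), asi_alcohol, baseline.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("asi_scores.csv")  # hypothetical file

# Full model: site-by-time-point interaction terms.
full = smf.gee("asi_alcohol ~ C(time) * C(site) + baseline",
               groups="patient_id", data=df,
               cov_struct=sm.cov_struct.Exchangeable()).fit()

# Reduced model: site entered only as fixed effects.
reduced = smf.gee("asi_alcohol ~ C(time) + C(site) + baseline",
                  groups="patient_id", data=df,
                  cov_struct=sm.cov_struct.Exchangeable()).fit()

# Score test of the nested models: does change over time vary by site?
print(full.compare_score_test(reduced))
```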

Abstract:

Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research.

Data sources: Major electronic databases were searched up to December 2005.

Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken.

Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly the most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, the mean time to blindness in at least one eye was approximately 23 years without treatment, compared with 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds, with a screening interval of 10 years, to approach cost-effectiveness. Screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. Thus, general population screening at any age appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would cover only 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. Screening using a test with initial automated classification, followed by assessment by a specialised optometrist for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence, assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed; however, a highly specific test is required to reduce the large number of false-positive referrals. The finding that population screening is unlikely to be cost-effective is based on an economic model whose parameter estimates have considerable uncertainty; in particular, if the rate of progression and/or the costs of visual impairment are higher than estimated, then screening could be cost-effective.

Conclusions: While population screening is not cost-effective, targeted screening of high-risk groups may be. Procedures for identifying those at risk, for quality assuring the programme, and adequate service provision for those who screen positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination and by improving the performance of current testing, either by refining practice or by adding a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. Further research should aim to develop and provide quality data to populate the economic model: a feasibility study of interventions to improve detection, further data on the costs of blindness, risk of progression and health outcomes, and an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
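
The requirement for a highly specific test follows directly from Bayes' theorem: at the low prevalences quoted above, even an accurate test yields mostly false-positive referrals. A small illustration using values from the ranges in the abstract:

```python
# Positive predictive value at low prevalence -- why the review stresses
# high specificity. 85% sensitivity/specificity reflects the typical
# figures quoted above; the prevalences bracket the cohorts discussed.
def ppv(sensitivity, specificity, prevalence):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

for prev in (0.01, 0.04):
    print(f"prevalence {prev:.0%}: PPV = {ppv(0.85, 0.85, prev):.1%}")
# At 1% prevalence, roughly 19 of every 20 positives are false --
# hence the call for a highly specific first-line test.
```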

Abstract:

BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) due to abnormal blood vessels developing that leak fluid and blood at the macula.

OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.

DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).

REVIEW METHODS: Types of studies: direct and indirect studies reporting diagnostic outcomes.

INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).

COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool, version 2 (QUADAS-2). Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%) with nine strategies for diagnosis and/or monitoring, and a cost-utility analysis was conducted. An NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted (3.5%). Deterministic and probabilistic sensitivity analyses were performed.

RESULTS: In pooled estimates of the diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] were 88% (46% to 98%) and 78% (64% to 88%), respectively. For monitoring, the pooled sensitivity and specificity (95% CI) were 85% (72% to 93%) and 48% (30% to 67%), respectively. The strategy of FFA for diagnosis with nurse-technician-led monitoring had the lowest cost (£39,769; 10.473 QALYs) and dominated all others except FFA for diagnosis with ophthalmologist-led monitoring (£44,649; 10.575 QALYs; incremental cost-effectiveness ratio £47,768, recomputed below). The least costly strategy had a 46.4% probability of being cost-effective at a £30,000 willingness-to-pay threshold.
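
For reference, the incremental cost-effectiveness ratio quoted above is the extra cost per extra QALY of ophthalmologist-led monitoring over the least costly strategy; recomputing it from the rounded figures reported here gives approximately the published value:

```python
# ICER from the cost/QALY pairs reported above (rounded figures).
cost_nurse, qaly_nurse = 39_769, 10.473  # FFA + nurse-technician-led monitoring
cost_oph,   qaly_oph   = 44_649, 10.575  # FFA + ophthalmologist-led monitoring

icer = (cost_oph - cost_nurse) / (qaly_oph - qaly_nurse)
print(f"ICER = £{icer:,.0f} per QALY")  # ~£47.8k; the report gives £47,768
# Well above a £30,000/QALY threshold, consistent with the conclusions below.
```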

LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.

CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.

Abstract:

Aim
The aim of the study is to evaluate factors that enable or constrain the implementation and service delivery of early warning systems and acute care training in practice.

Background
To date there is limited evidence to support the effectiveness of acute care initiatives (early warning systems, acute care training, outreach) in reducing the number of adverse events (cardiac arrest, death, unanticipated Intensive Care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reasons posited are that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme without considering the social factors (the organisation, the people, external influences) that may have affected the process of implementation and hence the measured end-points. Further research that considers these social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4].

Method
The design is a multiple case study of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships, and institutional balance and infrastructures, in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period, to ascertain whether ALERT principles were followed.

Discussion and progress to date
This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings from this study will provide evidence for policy-makers to make informed decisions regarding the direction of strategic and service planning of acute care services, to improve the level of care provided to acutely ill patients in hospital.

References
1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidian A, et al. Investigating the effectiveness of Critical Care Outreach Services: a systematic review. Intensive Care Medicine 2006; 32: 1713-1721.
2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com
3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: a systematic review. Critical Care Medicine 2007; 35(5): 1238-43.
4. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.

Abstract:

BACKGROUND: Diabetic retinopathy is an important cause of visual loss. Laser photocoagulation preserves vision in diabetic retinopathy but is currently used at the stage of proliferative diabetic retinopathy (PDR).

OBJECTIVES: The primary aim was to assess the clinical effectiveness and cost-effectiveness of pan-retinal photocoagulation (PRP) given at the non-proliferative stage of diabetic retinopathy (NPDR) compared with waiting until the high-risk PDR (HR-PDR) stage was reached. There have been recent advances in laser photocoagulation techniques, and in the use of laser treatments combined with anti-vascular endothelial growth factor (VEGF) drugs or injected steroids. Our secondary questions were: (1) If PRP were to be used in NPDR, which form of laser treatment should be used? and (2) Is adjuvant therapy with intravitreal drugs clinically effective and cost-effective in PRP?

ELIGIBILITY CRITERIA: Randomised controlled trials (RCTs) for efficacy; other study designs were also used.

REVIEW METHODS: Systematic review and economic modelling.

RESULTS: The Early Treatment Diabetic Retinopathy Study (ETDRS), published in 1991, was the only trial designed to determine the best time to initiate PRP. It randomised one eye of 3711 patients with mild-to-severe NPDR or early PDR to early photocoagulation, and the other to deferral of PRP until HR-PDR developed. The risk of severe visual loss after 5 years for eyes assigned to PRP for NPDR or early PDR, compared with deferral of PRP, was reduced by 23% (relative risk 0.77, 99% confidence interval 0.56 to 1.06; see the check below). However, the ETDRS did not provide results separately for NPDR and early PDR. In economic modelling, the base case found that early PRP could be more effective and less costly than deferred PRP. Sensitivity analyses gave similar results, with early PRP continuing to dominate or having a low incremental cost-effectiveness ratio. However, there are substantial uncertainties. For our secondary aims we found 12 trials of lasers in diabetic retinopathy, with 982 patients in total (40 to 150 per trial). Most were in PDR, but five included some patients with severe NPDR. Three compared multi-spot pattern lasers against the argon laser. RCTs comparing laser applied in a lighter manner (less-intensive burns) with conventional methods (more intense burns) reported little difference in efficacy but fewer adverse effects. One RCT suggested that selective laser treatment targeting only ischaemic areas was effective. Observational studies showed that the most important adverse effect of PRP was macular oedema (MO), which can cause visual impairment, usually temporary. Ten trials of laser and anti-VEGF or steroid drug combinations consistently reported a reduction in the risk of PRP-induced MO.
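
As a quick check, the 23% figure above is simply the relative risk reduction implied by the reported relative risk:

```python
# Relative risk reduction implied by the ETDRS result quoted above.
rr = 0.77                  # relative risk of severe visual loss at 5 years
rrr = 1 - rr               # relative risk reduction
print(f"RRR = {rrr:.0%}")  # 23%, matching the figure in the text
```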

LIMITATION: The current evidence is insufficient to recommend PRP for severe NPDR.

CONCLUSIONS: There is, as yet, no convincing evidence that modern laser systems are more effective than the argon laser used in the ETDRS, but they appear to have fewer adverse effects. We recommend a trial of PRP for severe NPDR and early PDR compared with deferring PRP until the HR-PDR stage. The trial would use modern laser technologies and investigate the value of adjuvant prophylactic anti-VEGF or steroid drugs.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42013005408.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.

Abstract:

Background: Clostridium difficile (C. difficile) is a leading cause of infectious diarrhoea in hospitals. Sending faecal samples for testing expedites diagnosis and appropriate treatment. Clinical suspicion of C. difficile based on patient history, signs and symptoms is the basis for sampling. Sending faecal samples from patients with diarrhoea ‘just in case’ the patient has C. difficile may be an indication of poor clinical management.

Aim: To evaluate the effectiveness of an intervention by an Infection Prevention and Control Team (IPCT) in reducing inappropriate faecal samples sent for C. difficile testing.

Method: An audit of the number of faecal samples sent for C. difficile testing before and after the introduction of a decision-making algorithm. The number of samples received in the laboratory was counted retrospectively for 12-week periods before and after the algorithm was introduced.

Findings: There was a statistically significant reduction in the mean number of weekly faecal samples sent after the algorithm was introduced (see the sketch below). Results were compared with a similar intervention carried out in 2009, in which the same message was delivered by memorandum; the memorandum had no effect on the overall number of weekly samples sent.
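
A minimal sketch of the kind of before/after comparison described in the Findings, assuming weekly counts over the two 12-week periods are compared with a two-sample t-test (the audit's actual data and test statistic are not given in the abstract):

```python
# Hypothetical weekly sample counts for the two 12-week audit periods;
# illustrative data only, not the audit's counts.
from scipy import stats

before = [14, 17, 15, 16, 18, 13, 15, 19, 16, 14, 17, 15]
after  = [10,  9, 12,  8, 11, 10,  9, 12, 10,  8, 11,  9]

t, p = stats.ttest_ind(before, after)
print(f"t = {t:.2f}, p = {p:.4f}")  # significant fall in the weekly mean
```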

Conclusion: An algorithm intervention had an effect on the number of faecal samples being sent for C. difficile testing and thus contributed to the effective use of the laboratory service.

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-02

Abstract:

In the past two decades, numerous programs have emerged to treat individuals with developmental disabilities who have sexual offending behaviours. There have, however, been very few studies that systematically examine the effectiveness of long-term treatment with this population. The present research examines the therapeutic outcomes of a multi-modal behavioural approach with six individuals with intellectual disabilities previously charged with sexual assault. The participants also exhibited severe behavioural challenges that included verbal aggression, physical aggression, destruction and self-injury. These six participants (5 males, 1 female) were admitted to a Long Term Residential Treatment Program (LTRTP) due to the severity of their behaviours and their lack of treatment success in other programs. Individualized treatment plans focused on the reduction of maladaptive behaviours and the enhancement of skills such as positive coping strategies, socio-sexual knowledge, life skills, and recreation and leisure skills. The treatment program also included psychiatric, psychological, medical, behavioural and educational interventions. The participants remained in the LTRTP for 181 to 932 days (an average of 1.5 years). Pre- and post-treatment evaluations were conducted using the following tools: frequency of target behaviours, the Psychopathology Inventory for Mentally Retarded Adults (PIMRA), the Emotional Problems Scale (EPS), the Socio-Sexual Knowledge and Attitudes Assessment Tool (SSKAAT-R) and the Quality of Life Questionnaire (QOL-Q). Recidivism rates and the need for re-hospitalization were also noted for each participant. With high levels of individualized intervention, all six participants showed a 37% reduction in maladaptive behaviours, with zero to low rates of inappropriate sexual behaviour; there were no psychiatric hospitalizations, and there was no recidivism for 5 of 6 participants. In addition, medication was reduced. Mental health scores on the PIMRA fell across all participants by 25%, and scores on the Quality of Life Questionnaire increased for all participants by an average of 72%. These findings add to and build upon the existing literature on the long-term treatment benefits for individuals with an intellectual disability who sexually offend. By utilizing an individualized and multimodal treatment approach to reduce severe behavioural challenges, not only can maladaptive behaviours be reduced, but adaptive behaviours can be increased, mental health concerns can be managed, and overall quality of life can be improved.

Abstract:

Diffuse pollution, and the contribution from agriculture in particular, has become increasingly important as pollution from point sources has been addressed by wastewater treatment. Land management approaches, such as the construction of field wetlands, provide one group of mitigation options available to farmers. Although field wetlands are widely used for diffuse pollution control in temperate environments worldwide, there is a shortage of evidence for the effectiveness and viability of these mitigation options in the UK. The Mitigation Options for Phosphorus and Sediment Project aims to make recommendations regarding the design and effectiveness of field wetlands for diffuse pollution control in UK landscapes. Ten wetlands have been built on four farms in Cumbria and Leicestershire. This paper focuses on sediment retention within the wetlands, estimated from annual sediment surveys in the first two years, and discusses establishment costs. It is clear that the wetlands are effective in trapping a substantial amount of sediment. Estimates of annual sediment retention suggest higher trapping rates at sandy sites (0.5–6 t ha⁻¹ yr⁻¹) than at silty sites (0.02–0.4 t ha⁻¹ yr⁻¹) and clay sites (0.01–0.07 t ha⁻¹ yr⁻¹). Establishment costs for the wetlands ranged from £280 to £3100 and depended more on site-specific factors, such as fencing and gateways on livestock farms, than on wetland size or design. Wetlands with lower trapping rates would also have lower maintenance costs, as dredging would be required less frequently. The results indicate that field wetlands show promise for inclusion in agri-environment schemes, particularly if capital payments can be provided for establishment, to encourage uptake of these multi-functional features.

Abstract:

The aims of this in vivo study were to compare the effectiveness and color stability of at-home and in-office bleaching techniques and to evaluate whether the use of light sources can alter bleaching results. According to preestablished criteria, 40 patients were selected and randomly divided into four groups according to bleaching treatment: (1) at-home bleaching with 10% carbamide peroxide, (2) in-office bleaching with 35% hydrogen peroxide (HP) without a light source, (3) in-office bleaching with 35% HP with quartz-tungsten-halogen light, and (4) in-office bleaching with 35% HP with a light-emitting diode/laser. Tooth shade was evaluated using the VITA Classical Shade Guide before bleaching as well as after the first and third weeks of bleaching. Tooth shade was evaluated again using the same guide 1 and 6 months after the completion of treatment. The shade guide was arranged to yield scores that were used for statistical comparison. Statistical analysis using the Kruskal-Wallis test showed no significant differences among the groups for any time point (P > .01). There was no color rebound in any of the groups. The bleaching techniques tested were equally effective. Light sources are unnecessary to bleach teeth. (Int J Periodontics Restorative Dent 2012;32:303-309.)
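
A minimal sketch of the group comparison used above, assuming the shade-guide scores were compared across the four bleaching groups with the Kruskal-Wallis test; the scores below are hypothetical, not the trial's data:

```python
# Hypothetical shade-guide scores for the four bleaching groups,
# compared with the Kruskal-Wallis test as in the study above.
from scipy import stats

home      = [4, 5, 3, 4, 6, 5, 4, 5, 4, 3]  # 10% carbamide peroxide
office    = [5, 4, 4, 6, 5, 3, 5, 4, 5, 4]  # 35% HP, no light
office_qt = [4, 5, 5, 4, 3, 5, 4, 6, 4, 5]  # 35% HP + halogen light
office_ld = [5, 4, 3, 5, 4, 5, 6, 4, 4, 5]  # 35% HP + LED/laser

h, p = stats.kruskal(home, office, office_qt, office_ld)
print(f"H = {h:.2f}, p = {p:.3f}")  # no significant difference expected
```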

Abstract:

This work evaluated the clinical and therapeutic aspects, as well as serum levels of venom and antivenom IgG measured by enzyme-linked immunosorbent assay (ELISA), in experimental envenomation of dogs with Crotalus durissus terrificus venom. Twenty-eight mixed-breed adult dogs were divided into four groups of seven animals each: Group I, venom only; Group II, venom + 50 ml of anti-bothropic-crotalic serum (50 mg) + fluid therapy; Group III, venom + 50 ml of anti-bothropic-crotalic serum + fluid therapy + urine alkalinization; Group IV, 50 ml of anti-bothropic-crotalic serum only. The lyophilized venom of Crotalus durissus terrificus was reconstituted in saline solution and inoculated subcutaneously at a dose of 1 mg/kg body weight. The dogs presented clinical signs of local pain, weakness, mandibular ptosis, mydriasis, emesis and salivation. The venom levels detected by ELISA ranged from 0 to 90 ng/ml, according to the severity of the clinical signs. Serum antivenom ranged from 0 to 3 µg/ml and was detected for up to 138 h after treatment. The ELISA results showed the effectiveness of serum therapy for venom neutralization.

Abstract:

This study evaluated the effectiveness of acidic low-fluoride dentifrices compared with conventional neutral dentifrices. Enamel blocks were submitted to pH cycling and treatment with slurries of dentifrices containing 0, 275, 412, 550 and 1,100 µg F/g (pH 4.5 or 7.0), as well as a commercial dentifrice (1,100 µg F/g) and a commercial children's dentifrice (500 µg F/g). Variations in surface microhardness and in the mineral content of enamel after pH cycling were calculated. Enamel blocks treated with acidic dentifrices exhibited less mineral loss than those treated with neutral dentifrices (ANOVA; p < 0.05). The acidic dentifrices with 412 and 550 µg F/g were as effective as the neutral 1,100-µg F/g dentifrice and the commercial 1,100-µg F/g dentifrice. Copyright (c) 2007 S. Karger AG, Basel