44 results for "at risk" for school failure
Abstract:
Four studies, two of them published only as abstracts, have recently demonstrated the feasibility of oral treatment of pyelonephritis in children, with no increased risk of treatment failure, early urinary tract re-infection, or renal scarring. Before choosing this route, the pediatrician must ensure that: (1) the patient does not appear toxic and is not vomiting; (2) there is no known severe obstructive or refluxing uropathy; and (3) the parents are deemed adherent to the treatment. If these criteria are fulfilled, the pediatrician can start oral treatment with a third-generation cephalosporin for 10 to 14 days. Ambulatory follow-up is crucial, and persistence of fever after 3 days warrants a new outpatient visit, additional imaging studies (renal ultrasonography) and possibly a switch to intravenous treatment.
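As an illustration only, the three eligibility criteria above amount to a simple conjunction. The sketch below is a schematic restatement with hypothetical parameter names, not clinical decision software.

```python
# Schematic restatement of the three oral-treatment criteria above.
# All parameter names are hypothetical illustrations, not from the source.
def oral_treatment_eligible(appears_toxic: bool,
                            vomiting: bool,
                            severe_uropathy: bool,
                            parents_adherent: bool) -> bool:
    """True when all three criteria for oral pyelonephritis treatment hold."""
    return (not appears_toxic and not vomiting  # criterion 1
            and not severe_uropathy             # criterion 2
            and parents_adherent)               # criterion 3

print(oral_treatment_eligible(False, False, False, True))  # -> True
```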
Abstract:
In 13 patients, the development of supraspinatus muscle atrophy and fatty infiltration after rotator cuff tendon repair was quantified prospectively via magnetic resonance imaging. Intraoperative electrical nerve stimulation at repair showed that the maximal supraspinatus tension (up to 200 N) strongly correlated with the anatomic cross-sectional muscle area and with muscle fatty infiltration (specific tension ranging from 12 N/cm² in Goutallier stage 3 to 42 N/cm² in Goutallier stage 0). Within 1 year after successful tendon repair (n = 8), fatty infiltration did not recover, and atrophy improved partially at best; however, if the repair failed (n = 5), atrophy and fatty infiltration progressed significantly. The ability of the rotator cuff muscles to develop tension correlates not only with their atrophy but also closely with their degree of fatty infiltration. With current repair techniques, atrophy and fatty infiltration appear to be irreversible, despite successful tendon repair. Unexpectedly, not only weak but also very strong muscles are at risk for repair failure.
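For orientation, the specific-tension figures above convert to whole-muscle force once multiplied by the cross-sectional area; the sketch below uses a hypothetical 5 cm² supraspinatus cross-section purely as an illustration.

```python
# Worked example: maximal tension = specific tension x cross-sectional area.
# Specific tensions are taken from the abstract; the 5 cm^2 area is hypothetical.
specific_tension_n_per_cm2 = {0: 42.0, 3: 12.0}  # keyed by Goutallier stage
area_cm2 = 5.0                                   # hypothetical cross-sectional area

for stage, tension in sorted(specific_tension_n_per_cm2.items()):
    print(f"Goutallier stage {stage}: max tension ~ {tension * area_cm2:.0f} N")
# stage 0 -> ~210 N, consistent with the 'up to 200 N' measured intraoperatively
```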
Abstract:
OBJECTIVE: To investigate predictors of continued HIV RNA viral load suppression in individuals switched to abacavir (ABC), lamivudine (3TC) and zidovudine (ZDV) after successful previous treatment with a protease inhibitor or non-nucleoside reverse transcriptase inhibitor-based combination antiretroviral therapy. DESIGN AND METHODS: An observational cohort study, which included individuals in the Swiss HIV Cohort Study switching to ABC/3TC/ZDV following successful suppression of viral load. The primary endpoint was time to treatment failure, defined as the first of the following events: two consecutive viral load measurements > 400 copies/ml under ABC/3TC/ZDV, one viral load measurement > 400 copies/ml and subsequent discontinuation of ABC/3TC/ZDV within 3 months, AIDS or death. RESULTS: We included 495 individuals; 47 experienced treatment failure in 1459 person-years of follow-up [rate = 3.22 events/100 person-years; 95% confidence interval (95% CI), 2.30-4.14]. Of all failures, 62% occurred in the first year after switching to ABC/3TC/ZDV. In a Cox regression analysis, treatment failure was independently associated with earlier exposure to nucleoside reverse transcriptase inhibitor (NRTI) mono or dual therapy [hazard ratio (HR), 8.02; 95% CI, 4.19-15.35] and low CD4 cell count at the time of the switch (HR, 0.66; 95% CI, 0.51-0.87 per +100 cells/µl, up to 500 cells/µl). In patients without earlier exposure to mono or dual therapy, AIDS prior to the switch to simplified maintenance therapy was an additional risk factor. CONCLUSIONS: The failure rate was low in patients with suppressed viral load who switched to ABC/3TC/ZDV. Patients with earlier exposure to mono or dual NRTI therapy, a low CD4 cell count at the time of switch, or AIDS are at increased risk of treatment failure, limiting the use of ABC/3TC/ZDV in these patient groups.
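The crude failure rate quoted above can be reproduced from the raw counts. The sketch below assumes a normal approximation to the Poisson rate; the abstract does not name the confidence-interval method, but this assumption reproduces the reported interval.

```python
# Crude failure rate per 100 person-years with a normal-approximation
# Poisson confidence interval (an assumed method; counts from the abstract).
import math

events = 47          # treatment failures
person_years = 1459  # follow-up under ABC/3TC/ZDV

rate = events / person_years * 100           # events per 100 person-years
se = math.sqrt(events) / person_years * 100  # Poisson standard error, same scale
ci_low, ci_high = rate - 1.96 * se, rate + 1.96 * se

print(f"rate = {rate:.2f}/100 PY (95% CI {ci_low:.2f}-{ci_high:.2f})")
# -> rate = 3.22/100 PY (95% CI 2.30-4.14), matching the abstract
```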
Abstract:
BACKGROUND: Recently recommended treatment modalities for prosthetic joint infection (PJI) were evaluated. METHODS: A retrospective cohort analysis of 68 patients with PJI of the hip or knee who were treated from 1995 through 2004 was conducted at the University Hospital Bern (Bern, Switzerland). RESULTS: A 2-stage exchange was the most frequent (75.0%) surgical strategy, followed by retention and debridement (17.6%), 1-stage exchange (5.9%), and resection arthroplasty or suppressive antimicrobial treatment (1.5%). The chosen strategy was in 88% agreement with the recommendations. Adherence was only 17% for retention and debridement and 0% for 1-stage exchange. Most PJIs (84%) were treated with an adequate or partially adequate antimicrobial regimen. Recurrence-free survival was observed in 51.5% of PJI episodes after 24 months of follow-up. The risk of treatment failure was significantly higher for PJIs treated with a surgical strategy other than that recommended (hazard ratio, 2.34; 95% confidence interval, 1.10-4.70; P = .01) and for PJIs treated with antibiotics not corresponding to the recommendations (hazard ratio, 3.45; 95% confidence interval, 1.50-7.60; P = .002). Other risk factors associated with lack of healing were a high infection score at the time of diagnosis (hazard ratio, 1.29; 95% confidence interval, 1.10-1.40; P < .001) and the presence of a sinus tract (hazard ratio, 2.35; 95% confidence interval, 1.10-5.0; P = .02). CONCLUSIONS: Our study demonstrates the value of current treatment recommendations. Inappropriate choice of conservative surgical strategies (such as debridement and retention) and inadequate antibiotic treatment are associated with failure.
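As an illustration of how a 24-month recurrence-free survival figure like the one above is typically estimated, the sketch below fits a Kaplan-Meier curve to hypothetical follow-up data; the lifelines package is assumed for convenience, as the paper does not state which software was used.

```python
# Kaplan-Meier estimate of recurrence-free survival on hypothetical data.
from lifelines import KaplanMeierFitter

durations = [3, 8, 12, 15, 20, 24, 24, 24, 24, 24]  # months of follow-up (hypothetical)
observed  = [1, 1,  1,  0,  1,  0,  0,  0,  0,  0]  # 1 = treatment failure, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
print(kmf.predict(24))  # estimated recurrence-free survival at 24 months
```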
Abstract:
BACKGROUND: Transient left ventricular apical ballooning syndrome (TLVABS) is an acute cardiac syndrome mimicking ST-segment elevation myocardial infarction, characterized by transient wall-motion abnormalities involving the apical and mid-portions of the left ventricle in the absence of significant obstructive coronary disease. METHODS: In a search of the MEDLINE database, 28 case series met the eligibility criteria and were summarized in a narrative synthesis of the demographic characteristics, clinical features and pathophysiological mechanisms. RESULTS: TLVABS is observed in 0.7-2.5% of patients with suspected acute coronary syndrome (ACS), affects women in 90.7% of cases (95% CI: 88.2-93.2%) with a mean age ranging from 62 to 76 years, and most commonly presents with chest pain (83.4%, 95% CI: 80.0-86.7%) and dyspnea (20.4%, 95% CI: 16.3-24.5%) following an emotionally or physically stressful event. ECG on admission shows ST-segment elevations in 71.1% (95% CI: 67.2-75.1%), accompanied by usually mild troponin elevations in 85.0% (95% CI: 80.8-89.1%). Despite the dramatic clinical presentation and substantial risk of heart failure, cardiogenic shock and arrhythmias, LVEF improved from 20-49.9% to 59-76% within a mean time of 7-37 days, with an in-hospital mortality rate of 1.7% (95% CI: 0.5-2.8%), complete recovery in 95.9% (95% CI: 93.8-98.1%) and rare recurrence. The underlying etiology is thought to involve exaggerated sympathetic stimulation. CONCLUSION: TLVABS is an important differential diagnosis in ACS, especially in postmenopausal women with a preceding stressful event. Data on long-term follow-up are pending, and further studies will be necessary to clarify the etiology and reach consensus on the acute and long-term management of TLVABS.
Abstract:
OBJECTIVE: Virologic failure of HIV-positive patients is of special concern during pregnancy. We compared virologic failure and the frequency of treatment changes between pregnant and non-pregnant women in the Swiss HIV Cohort Study. METHODS: Using data on 372 pregnancies in 324 women, we describe antiretroviral therapy during pregnancy. Pregnant women on HAART at conception (n = 131) were matched to 228 non-pregnant women (interindividual comparison) and to a time period of equal length before and after pregnancy (intraindividual comparison). Women starting HAART during pregnancy (n = 145) were compared with 578 non-pregnant women starting HAART. FINDINGS: The median age at conception was 31 years, 16% (n = 50) were infected through injecting drug use, and the median CD4 cell count was 489 cells/µl. In the majority of pregnancies (n = 220, 59%), women had started ART before conception. When ART was started during pregnancy (n = 145, 39%), it was mainly during the second trimester (n = 100, 69%). Two thirds (n = 26) of the 35 women starting in the third trimester were diagnosed with HIV during pregnancy. The risk of virologic failure tended to be lower in pregnant than in non-pregnant women [adjusted odds ratio 0.52 (95% confidence interval 0.25-1.09, P = 0.08)], but was similar in the intraindividual comparison (adjusted odds ratio 1.04, 95% confidence interval 0.48-2.28). Women starting HAART during pregnancy changed treatment less often than non-pregnant women. CONCLUSION: Despite the physiological changes occurring during pregnancy, HIV-infected pregnant women are not at higher risk of virologic failure.
Abstract:
PURPOSE: To evaluate whether systemic diseases, with or without systemic medication, increase the risk of implant failure and therefore diminish the success and survival rates of dental implants. MATERIALS AND METHODS: A MEDLINE search was undertaken to find human studies reporting implant survival in subjects treated with osseointegrated dental implants who were diagnosed with at least one of 12 systemic diseases. RESULTS: For most conditions, no studies comparing patients with and without the condition in a controlled setting were found. For most systemic diseases there are only case reports or case series demonstrating that implant placement, integration, and function are possible in affected patients. For diabetes, heterogeneity of the material and the method of reporting data precluded a formal meta-analysis; no unequivocal tendency for subjects with diabetes to have higher failure rates emerged. The data from papers reporting on osteoporotic patients were also heterogeneous, and the evidence for an association between osteoporosis and implant failure was weak. Nevertheless, some reports now tend to focus on the medication used in osteoporotic patients, treating oral bisphosphonates as a potential risk factor for osteonecrosis of the jaws, rather than on osteoporosis itself as a risk factor for implant success and survival. CONCLUSIONS: The level of evidence indicative of absolute and relative contraindications to implant therapy due to systemic diseases is low. Studies comparing patients with and without the condition in a controlled setting are sparse. Prospective controlled studies are urgently needed, especially for patients with manifest osteoporosis on an oral bisphosphonate regimen.
Abstract:
INTRODUCTION Light curing of resin-based adhesives is the mainstay of orthodontic bonding. In recent years, alternatives to conventional halogen lights offering reduced curing time and the potential for lower attachment failure rates have emerged. The relative merits of curing lights in current use, including halogen-based lamps, light-emitting diodes (LEDs), and plasma arc lights, have not been analyzed systematically. In this study, we reviewed randomized controlled trials and controlled clinical trials to assess the risk of attachment failure and the bonding time in orthodontic patients in whom brackets were cured with halogen lights, LEDs, or plasma arc systems. METHODS Multiple electronic database searches were undertaken, including MEDLINE, EMBASE, and the Cochrane Oral Health Group's Trials Register, CENTRAL. Language restrictions were not applied. Unpublished literature was searched on ClinicalTrials.gov, the National Research Register, and the ProQuest Dissertations and Theses database. Search terms included randomized controlled trial, controlled clinical trial, random allocation, double blind method, single blind method, orthodontics, LED, halogen, bond, and bracket. Authors of primary studies were contacted as required, and reference lists of the included studies were screened. RESULTS Randomized controlled trials and controlled clinical trials directly comparing conventional halogen lights, LEDs, or plasma arc systems in patients with full-arch, fixed, bonded (not banded) orthodontic appliances and follow-up periods of at least 6 months were included. Using predefined forms, 2 authors undertook independent extraction of articles; disagreements were resolved by discussion. The assessment of the risk of bias of the randomized controlled trials was based on the Cochrane Risk of Bias tool. Ten studies met the inclusion criteria; 2 were excluded because of high risk of bias. In the comparison of bond failure risk with halogen lights and plasma arc lights, 1851 brackets were included in both groups. Little statistical heterogeneity was observed in this analysis (I² = 4.8%; P = 0.379). There was no statistical difference in bond failure risk between the groups (OR, 0.92; 95% CI, 0.68-1.23; prediction interval, 0.54-1.56). Similarly, no statistical difference in bond failure risk was observed in the meta-analysis comparing halogen lights and LEDs (OR, 0.96; 95% CI, 0.64-1.44; prediction interval, 0.07-13.32). The pooled estimate from both comparisons was OR, 0.93; 95% CI, 0.74-1.17; prediction interval, 0.69-1.17. CONCLUSIONS There is no evidence to support the use of one curing light type over another based on risk of attachment failure.
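For reference, the heterogeneity and pooling statistics quoted above (I², pooled OR) follow from standard formulas; the sketch below applies Higgins' I² and an inverse-variance fixed-effect pool to hypothetical study-level data, not to the review's actual studies.

```python
# Inverse-variance fixed-effect pooling and Higgins' I-squared.
# Per-study odds ratios and variances below are hypothetical.
import math

log_or = [math.log(0.85), math.log(1.10), math.log(0.95)]  # per-study log ORs
var    = [0.04, 0.09, 0.06]                                # their variances

w = [1 / v for v in var]                                   # inverse-variance weights
pooled = sum(wi * y for wi, y in zip(w, log_or)) / sum(w)  # pooled log OR

q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, log_or))  # Cochran's Q
df = len(log_or) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0          # Higgins' I^2 (%)

print(f"pooled OR = {math.exp(pooled):.2f}, I^2 = {i2:.1f}%")
```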
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered only the more accurate detection of treatment failure with POC-VL, and Scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL owing to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits less than 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4010-US$9230 compared with clinical monitoring and US$5960-US$25,540 compared with CD4 cell count monitoring. In Scenario B, the corresponding ICERs were US$2450-US$5830 and US$2230-US$10,380. In Scenario C, the ICER ranged between US$960 and US$2500 compared with clinical monitoring, and between cost-saving and US$2460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART improves with a higher detection limit, when the reduction in new HIV infections is taken into account, and when failure of first-line ART is assumed to be reduced by targeted adherence counselling.
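The incremental cost-effectiveness ratios above all follow from one formula: incremental cost divided by incremental effectiveness. The sketch below states it explicitly; the input numbers are hypothetical placeholders, not values from the model.

```python
# Incremental cost-effectiveness ratio (ICER): extra cost per QALY gained.
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """(C_new - C_old) / (E_new - E_old), in US$ per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# e.g. POC-VL vs clinical monitoring, with purely illustrative numbers
print(icer(cost_new=1200.0, cost_old=1000.0, qaly_new=4.25, qaly_old=4.20))
# -> 4000.0 (US$ per QALY gained)
```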
Abstract:
BACKGROUND The electrocardiographic PR interval increases with aging, differs by race, and is associated with atrial fibrillation (AF), pacemaker implantation, and all-cause mortality. We sought to determine the associations between PR interval and heart failure, AF, and mortality in a biracial cohort of older adults. METHODS AND RESULTS The Health, Aging, and Body Composition (Health ABC) Study is a prospective, biracial cohort. We used multivariable Cox proportional hazards models to examine PR interval (hazard ratios expressed per SD increase) and 10-year risks of heart failure, AF, and all-cause mortality. Multivariable models included demographic, anthropometric, and clinical variables in addition to established cardiovascular risk factors. We examined 2722 Health ABC participants (aged 74±3 years, 51.9% women, and 41% black). We did not identify significant effect modification by race for the outcomes studied. After multivariable adjustment, every SD increase (29 ms) in PR interval was associated with a 13% greater 10-year risk of heart failure (95% confidence interval, 1.02-1.25) and a 13% increased risk of incident AF (95% confidence interval, 1.04-1.23). PR interval >200 ms was associated with a 46% increased risk of incident heart failure (95% confidence interval, 1.11-1.93). PR interval was not associated with increased all-cause mortality. CONCLUSIONS We identified significant relationships of PR interval to heart failure and AF in older adults. Our findings extend prior investigations by examining PR interval and associations with adverse outcomes in a biracial cohort of older men and women.
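Because the hazard ratios above are expressed per SD (29 ms) of PR interval, they can be rescaled to other increments under the usual log-linear Cox assumption. The sketch below performs that rescaling; it is an illustration, not a computation from the paper.

```python
# Rescale a per-SD hazard ratio (1.13 per 29 ms, from the abstract)
# to other PR-interval increments, assuming a log-linear Cox model.
import math

hr_per_sd, sd_ms = 1.13, 29.0
beta = math.log(hr_per_sd) / sd_ms  # log-hazard per 1 ms of PR interval

for delta_ms in (10, 29, 50):
    print(f"HR per +{delta_ms} ms = {math.exp(beta * delta_ms):.2f}")
# -> 1.04 per +10 ms, 1.13 per +29 ms (one SD), 1.23 per +50 ms
```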
Abstract:
The aim of this study was to describe the long-term follow-up and differences in tear-film immune reactions following penetrating keratoplasty (PK) in horses when differently preserved corneas were used. This report describes for the first time the use of corneal grafts preserved in tissue culture media in equine PK. Eight experimental horses with normal eyes were included, and freshly harvested, frozen or tissue-culture-preserved corneal grafts were used for the PK. The graft harvesting and storage techniques, PK surgery, postoperative treatments and complications are described. The mean postoperative follow-up time was 286 days. Tear film samples taken before and periodically after surgery were measured for IgM, IgG and IgA content by direct ELISA. All grafts were incorporated into the recipient horse but were rejected to some degree. The differently preserved corneal grafts healed in the same manner and looked similar. Preoperatively, the clear corneas implied a low risk of graft failure, and both fresh and stored tissues provided intact endothelium; nevertheless, no graft site remained clear postoperatively. The presence of IgA, IgG and IgM in the tear film was demonstrated from the early postoperative period. IgG levels were lower than those of IgA or IgM and remained at a constant baseline in every case, whereas IgA and IgM varied greatly over time, with an individual pattern in each eye.
Abstract:
Serum samples from 142 calves and their dams were analyzed for gamma globulins (γG; calves) and selenium concentrations (Se; calves and dams). A questionnaire provided information about birth and colostrum management. The calves and their dams were divided into two groups according to the calves' γG concentration (<10 and ≥10 g/L), and Se concentrations were compared between the groups. The correlation between the γG and Se concentrations of the calves and their dams was analyzed. Risk factors for failure of passive transfer and for Se deficiency were assessed based on the questionnaire. The γG concentration of 42.9% of the calves was <10 g/L (median: 10.9 g/L). Calves showed significantly higher γG values after optimized colostrum administration than calves with suboptimal colostrum administration (p < 0.004). The median Se concentration was 26.8 µg/L for the calves and 36.5 µg/L for the dams. A high correlation was observed between the Se concentration of the dam and that of her calf (r = 0.72, p < 0.001). The calves' Se and γG concentrations were not significantly correlated. These results demonstrate that further efforts to better inform farmers about colostrum management and Se supply are warranted.
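The dam-calf correlation above is a plain Pearson coefficient; the sketch below computes one on hypothetical paired selenium values (statistics.correlation requires Python 3.10+).

```python
# Pearson correlation between dam and calf serum selenium (hypothetical data).
from statistics import correlation  # Python 3.10+

dam_se  = [30.1, 42.5, 25.0, 55.3, 38.2, 20.4]  # µg/L, hypothetical
calf_se = [22.4, 40.0, 26.1, 35.7, 21.5, 18.0]  # µg/L, hypothetical

print(f"Pearson r = {correlation(dam_se, calf_se):.2f}")
```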
Abstract:
Prostate cancer (CaP) is the most commonly diagnosed malignancy in males in the Western world, with one in six males diagnosed in their lifetime. Current clinical prognostication groupings use pathologic Gleason score, pre-treatment prostate-specific antigen and Union for International Cancer Control TNM staging to place patients with localized CaP into low-, intermediate- and high-risk categories. These categories represent increasing risks of biochemical failure and CaP-specific mortality; they also reflect the need for increasing treatment intensity and the justification for accepting greater side effects. In this article, we point out that 30-50% of patients will still fail image-guided radiotherapy or surgery despite the judicious use of clinical risk categories, owing to interpatient heterogeneity in treatment response. To improve treatment individualization, better predictors of prognosis and of radiotherapy treatment response are needed to triage patients to bespoke and intensified CaP treatment protocols. These should include pre-treatment genomic tests based on DNA or RNA indices and/or assays that reflect cancer metabolism, such as hypoxia assays, to define patient-specific CaP progression and aggression. More importantly, we argue that these novel prognostic assays could be even more useful if combined to drive forward precision cancer medicine for localized CaP.
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized with data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ monitoring or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring, compared with 12% with viral load monitoring, switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ monitoring, compared with 1.5% with viral load monitoring, experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but it reduced the time spent on failing ART, improved immunological response and increased switching to second-line ART.
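The mechanism behind the monitoring comparison above can be illustrated with a toy calculation: after virological failure, a viral load test detects failure at the next scheduled test, whereas CD4-based monitoring reacts only after a further immunological lag. The sketch below is a deliberately simplified illustration with hypothetical parameters, not the authors' simulation model.

```python
# Toy model of time spent on failing ART under two monitoring strategies.
# All parameters (failure times, 12-month immunological lag) are hypothetical.
import random

random.seed(1)

def mean_months_on_failing_art(test_interval: float, detection_lag: float,
                               n: int = 100_000) -> float:
    total = 0.0
    for _ in range(n):
        failure = random.uniform(0, 60)                 # failure time within 5 years
        next_test = (failure // test_interval + 1) * test_interval
        total += (next_test - failure) + detection_lag  # delay until switch
    return total / n

print(f"12-monthly viral load: {mean_months_on_failing_art(12, detection_lag=0):.1f} months")
print(f"6-monthly CD4        : {mean_months_on_failing_art(6, detection_lag=12):.1f} months")
```

Even with tests half as frequent, the strategy that detects failure directly (viral load) yields less time on failing ART in this toy setup, mirroring the direction of the modelled results.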
Abstract:
An 82-year-old patient with previously undiagnosed diffuse idiopathic skeletal hyperostosis of the cervical spine experienced life-threatening complications from a supraglottic airway device (i-gel) used during standard hip arthroplasty. Extensive mucosal erosion and denudation of the cricoid cartilage caused postoperative supraglottic swelling and prolonged respiratory failure requiring tracheostomy. In this case report, we highlight the importance of evaluating risk factors for failure of supraglottic airway devices.