981 results for "Non-informative prior"
Abstract:
OBJECTIVE: To describe an alternative method for the treatment of non-responsive self-mutilation injuries in three dogs after carpal/tarsal arthrodesis. STUDY DESIGN: Case series. ANIMALS: Two dogs with carpal injury and one dog with tarsal injury treated by arthrodesis. METHODS: All dogs developed self-mutilation injuries due to licking and/or chewing of the toes within 21-52 days of surgery. Clinical signs did not resolve within one week of conservative treatment with wound debridement and protective bandages. Following general anaesthesia, a deep horseshoe-shaped skin incision, including the subdermal tissue, was made proximal to the self-mutilation injury, transecting the sensory cutaneous afferent nerves. The skin incision was closed with simple interrupted sutures. RESULTS: All wounds healed without complication. Self-mutilation resolved completely within 24 hours after surgery in all dogs. No recurrence was observed (follow-up 5 months to 3 years). CONCLUSION: Non-selective cutaneous sensory neurectomy may lead to resolution of self-mutilation following arthrodesis in dogs. CLINICAL RELEVANCE: Failure of conservative treatment in self-mutilation injuries often leads to toe or limb amputation as a last resort. The technique described in this case series is a simple procedure that should be considered prior to amputation. The outcome of this procedure in dogs self-mutilating due to neurological or behavioral disturbances unrelated to carpal or tarsal arthrodesis is not known.
Abstract:
An experiment was conducted using 95 Continental crossbred steers. The cattle were sorted by ultrasound 160 days before slaughter into a low backfat group (Low BF) and a higher backfat group (High BF). Half of the Low BF and half of the High BF cattle were implanted, whereas the other halves were not. Data from the experiment were used in two hypothetical markets. One market was a high-yield beef program (HY) that did not allow the use of implants. The second market was a commodity beef program (CM) that allowed the use of implants. The cattle were priced as an unsorted group (ALL) and as two sorted groups (Low BF and High BF) within the HY (non-implanted) and CM (implanted) markets. The CM program had a base price of $1.05/lb hot carcass weight (HCW) with a $0.15/lb HCW discount for quality grade (QG) Select and a $0.20/lb HCW discount for yield grade (YG) 4. The HY program used a base price of $1.07/lb HCW with premiums ($/lb HCW) paid for YG ≤ 0.9 (0.15), 1.0-1.4 (0.10), and 1.5-1.9 (0.03). The carcasses were discounted ($/lb HCW) for YG 2.5-2.9 (0.03), 3.0-3.9 (0.15), and ≥ 4.0 (0.35). This data set provides good evidence that the end point at which to sell a group of cattle depends on the particular market. Sorting had an economic advantage over ALL in the HY Low BF and the CM High BF groups. The HY High BF cattle should have been sold sooner due to the discounts received for increased YG. The increased YG was directly affected by an increase in BF. Furthermore, the CM Low BF group should have been fed longer to increase the number of carcasses grading Choice.
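The two pricing schedules above can be made concrete with a short sketch. The base prices and brackets come from the abstract; the treatment of the unstated YG 2.0-2.4 band (no adjustment) and the reading of the "YG 4" discount as YG ≥ 4.0 are assumptions:

```python
def cm_price(qg, yg, base=1.05):
    """Commodity (CM) program price, $/lb hot carcass weight.
    Discounts: $0.15 for quality grade Select, $0.20 for yield grade 4
    (read here as YG >= 4.0 -- an assumption)."""
    price = base
    if qg == "Select":
        price -= 0.15
    if yg >= 4.0:
        price -= 0.20
    return price

def hy_price(yg, base=1.07):
    """High-yield (HY) program price, $/lb hot carcass weight,
    with the YG premium/discount brackets listed in the abstract."""
    if yg <= 0.9:
        return base + 0.15
    if 1.0 <= yg <= 1.4:
        return base + 0.10
    if 1.5 <= yg <= 1.9:
        return base + 0.03
    if 2.5 <= yg <= 2.9:
        return base - 0.03
    if 3.0 <= yg <= 3.9:
        return base - 0.15
    if yg >= 4.0:
        return base - 0.35
    return base  # YG 2.0-2.4: no adjustment stated in the abstract
```

For example, a Select, YG 4.2 carcass would earn $0.70/lb under the CM schedule, while the same yield grade would be discounted to $0.72/lb under the HY schedule.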
Abstract:
OBJECTIVES To identify potential prognostic factors affecting outcome in septic peritonitis caused by gastrointestinal perforation in dogs and cats. METHODS A retrospective study. Animals operated on for septic peritonitis because of gastrointestinal perforation were evaluated. Risk factors assessed included age, duration of clinical signs, recent prior abdominal surgery, recent prior anti-inflammatory drug administration, placement of a closed-suction drain and location of perforation. RESULTS Fifty-five animals (44 dogs and 11 cats) were included. The overall mortality was 63.6%. No association was found between age, duration of clinical signs or prior abdominal surgery and outcome. Animals with a history of prior anti-inflammatory drugs were significantly (P=0.0011) more likely to have perforation of the pylorus (73.3%). No significant difference in outcome was found between animals treated with closed-suction drains and those treated with primary closure or between pyloric perforation and perforation at other gastrointestinal sites. CLINICAL SIGNIFICANCE Administration of anti-inflammatory drugs in dogs and cats is a significant risk factor for pyloric perforation. Pyloric perforation was not associated with a poorer outcome than perforation at other gastrointestinal sites. Placement of a closed-suction drain did not improve outcome compared to primary closure.
Abstract:
BACKGROUND Outcome data are limited in patients with ST-segment elevation acute myocardial infarction (STEMI) or other acute coronary syndromes (ACSs) who receive a drug-eluting stent (DES). Data suggest that first-generation DES are associated with an increased risk of stent thrombosis when used in STEMI. Whether this observation persists with newer generation DES is unknown. The study objective was to analyze the two-year safety and effectiveness of Resolute™ zotarolimus-eluting stents (R-ZESs) implanted for STEMI, ACS without ST-segment elevation (non-STEACS), and stable angina (SA). METHODS Data from the Resolute program (Resolute All Comers and Resolute International) were pooled and patients with R-ZES implantation were categorized by indication: STEMI (n=335), non-STEACS (n=1416), and SA (n=1260). RESULTS Mean age was 59.8±11.3 years (STEMI), 63.8±11.6 (non-STEACS), and 64.9±10.1 (SA). Fewer STEMI patients had diabetes (19.1% vs. 28.5% vs. 29.2%; P<0.001), prior MI (11.3% vs. 27.2% vs. 29.4%; P<0.001), or previous revascularization (11.3% vs. 27.9% vs. 37.6%; P<0.001). Two-year definite/probable stent thrombosis occurred in 2.4% (STEMI), 1.2% (non-STEACS) and 1.1% (SA) of patients, with late/very late stent thrombosis (days 31-720) rates of 0.6% (STEMI and non-STEACS) and 0.4% (SA) (P=NS). The two-year mortality rate was 2.1% (STEMI), 4.8% (non-STEACS) and 3.7% (SA) (P=NS). Death or target vessel re-infarction occurred in 3.9% (STEMI), 8.7% (non-STEACS) and 7.3% (SA) (P=0.012). CONCLUSION R-ZES in STEMI and in other clinical presentations is effective and safe. Long-term outcomes are favorable, with an extremely rare incidence of late and very late stent thrombosis following R-ZES implantation across indications.
Abstract:
Infrared thermography (IRT) was used to detect digital dermatitis (DD) prior to routine claw trimming. A total of 1192 IRT observations were collected from 149 cows on eight farms. All cows were housed in tie-stalls. The maximal surface temperatures of the coronary band (CB) region and skin (S) of the fore and rear feet (mean value of the maximal surface temperatures of both digits for each foot separately, CBmax and Smax) were assessed. Grouping was performed at the foot level (presence of DD, n=99; absence, n=304) or at the cow level (all four feet healthy, n=24; at least one DD lesion on the rear feet, n=37). For individual cows (n=61), the IRT temperature difference was determined by subtracting the mean sum of CBmax and Smax of the fore feet from that of the rear feet. Feet with DD had higher CBmax and Smax (P<0.001) than healthy feet. Smax was significantly higher in feet with infectious DD lesions (M-stage: M2+M4; n=15) than in those with non-infectious M-lesions (M1+M3; n=84) (P=0.03), but this was not the case for CBmax (P=0.12). At the cow level, an optimal cut-off value for detecting DD of 0.99°C (IRT temperature difference between rear and front feet) yielded a sensitivity of 89.1% and a specificity of 66.6%. The results indicate that IRT may be a useful non-invasive diagnostic tool to screen for the presence of DD in dairy cows by measuring CBmax and Smax.
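The cow-level screening rule described above reduces to a temperature difference and a threshold. A minimal sketch follows; the exact averaging the authors used may differ, so treat this as an illustration:

```python
def _mean_sum(feet):
    """Mean of (CBmax + Smax) across the given feet;
    `feet` is a list of (cbmax, smax) pairs, one per foot."""
    return sum(cb + s for cb, s in feet) / len(feet)

def irt_difference(rear_feet, fore_feet):
    """Per-cow IRT temperature difference (degrees C):
    mean (CBmax + Smax) of the rear feet minus that of the fore feet."""
    return _mean_sum(rear_feet) - _mean_sum(fore_feet)

def flag_dd(diff_c, cutoff=0.99):
    """Flag a cow as suspect for digital dermatitis using the reported
    optimal cut-off of 0.99 C (sensitivity 89.1%, specificity 66.6%)."""
    return diff_c >= cutoff
```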
Abstract:
Breast cancer incidence and mortality rates for Hispanic women are lower than for non-Hispanic white (NHW) women, but recently rates have increased more rapidly among Hispanic women. Many studies have shown a consistent increase in breast cancer risk associated with modest or high alcohol intake, but few included Hispanic women. Alcohol consumption and breast cancer risk were investigated in a New Mexico statewide population-based case-control study. The New Mexico Tumor Registry ascertained women newly diagnosed with breast cancer (1992–1994), aged 30–74 years. Controls were identified by random digit dialing and were frequency-matched for ethnicity, age group, and health planning district. In-person interviews of 712 cases and 844 controls were conducted. Data were collected for breast cancer risk factors, including alcohol intake. Recent alcohol intake data were collected for a four-week period six months prior to interview. Past alcohol intake included information on alcohol consumption at ages 25, 35, and 50. A history of alcohol consumption was reported by 81% of cases and 85% of controls. Of these women, 42% of cases and 48% of controls reported recent alcohol intake. Results for past alcohol intake did not show any trend with breast cancer risk and were nonsignificant. Multivariate-adjusted odds ratios for recent alcohol intake and breast cancer suggested an increased risk at the highest level for both ethnic groups, but estimates were unstable and statistically nonsignificant. A low level of recent alcohol intake (<148 grams/week) was associated with a reduced risk for NHW women (Odds Ratio (OR) = 0.49, 95% Confidence Interval (CI) 0.35–0.69). This pattern was independent of hormone-receptor status. The reduced breast cancer risk for low alcohol intake was present for premenopausal (OR = 0.29, 95% CI 0.15–0.56) and postmenopausal NHW women (OR = 0.56, 95% CI 0.35–0.90).
The possibility of an increased risk associated with high alcohol intake could not be adequately addressed, because there were few drinkers with more than light to moderate intake, especially among Hispanic women. An alcohol-estrogen link is hypothesized to be the mechanism responsible for increased breast cancer risk, but has not been consistently substantiated. More studies are needed of the underlying mechanism for an association between alcohol intake and breast cancer.
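The odds ratios and confidence intervals reported above follow the standard 2x2-table calculation. The sketch below uses made-up counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10 exposed cases, 20 unexposed cases,
# 30 exposed controls, 15 unexposed controls.
example = odds_ratio_ci(10, 20, 30, 15)
```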
Abstract:
OBJECTIVE The cost-effectiveness of cast nonprecious frameworks has increased their prevalence in cemented implant crowns. The purpose of this study was to assess the effect of the design and height of the retentive component of a standard titanium implant abutment on the fit, possible horizontal rotation and retention forces of cast nonprecious alloy crowns prior to cementation. MATERIALS AND METHODS Two abutment designs were examined: Type A with a 6° taper and 8 antirotation planes (Straumann Tissue-Level RN) and Type B with a 7.5° taper and 1 antirotation plane (SICace implant). Both types were analyzed using 60 crowns: 20 with a full abutment height (6 mm), 20 with a medium abutment height (4 mm), and 20 with a minimal (2.5 mm) abutment height. The marginal and internal fit and the degree of possible rotation were evaluated using polyvinylsiloxane impressions under a light microscope (magnification of ×50). To measure the retention force, a custom force-measuring device was employed. STATISTICAL ANALYSIS: one-sided Wilcoxon rank-sum tests with Bonferroni-Holm corrections, Fisher's exact tests, and Spearman's rank correlation coefficient. RESULTS Type A exhibited increased marginal gaps (primary end-point: 55 ± 20 μm vs. 138 ± 59 μm, P < 0.001) but less rotation (P < 0.001) than Type B. The internal fit was also better for Type A than for Type B (P < 0.001). The retention force of Type A (2.49 ± 3.2 N) was higher (P = 0.019) than that of Type B (1.27 ± 0.84 N). Reduction in abutment height did not affect the variables observed. CONCLUSION Less-tapered abutments with more antirotation planes provide an increase in retention force, which limits horizontal rotation but widens the marginal gaps of the crowns. Thus, casting of nonprecious crowns with Type A abutments may result in clinically unfavorable marginal gaps.
Abstract:
Domestic dog rabies is an endemic disease in large parts of the developing world and also epidemic in previously free regions. For example, it continues to spread in eastern Indonesia and currently threatens adjacent rabies-free regions with high densities of free-roaming dogs, including remote northern Australia. Mathematical and simulation disease models are useful tools to provide insights on the most effective control strategies and to inform policy decisions. Existing rabies models typically focus on long-term control programs in endemic countries. However, simulation models describing the dog rabies incursion scenario in regions where rabies is still exotic are lacking. We here describe such a stochastic, spatially explicit rabies simulation model that is based on individual dog information collected in two remote regions in northern Australia. Illustrative simulations produced plausible results with epidemic characteristics expected for rabies outbreaks in disease-free regions (mean R0 1.7, epidemic peak 97 days post-incursion, vaccination as the most effective response strategy). Systematic sensitivity analysis identified that model outcomes were most sensitive to seven of the 30 model parameters tested. This model is suitable for exploring rabies spread and control before an incursion in populations of largely free-roaming dogs that live close together with their owners. It can be used for ad hoc contingency or response planning prior to and shortly after incursion of dog rabies in previously free regions. One challenge that remains is model parameterisation, particularly how dogs' roaming, contact and biting behaviours change following a rabies incursion into a previously rabies-free population.
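The role of R0 in the early phase of such an incursion can be illustrated with a toy branching process. This is not the authors' spatially explicit, individual-based model, only a minimal sketch in which each infectious dog generates a Poisson(R0) number of new cases:

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's algorithm for sampling a Poisson(lam) variate,
    adequate for small lam such as R0 = 1.7."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def generations(r0, n_gen, rng, seed_cases=1):
    """Case counts per generation of a branching process seeded with
    `seed_cases` infectious dogs; a count of 0 means the outbreak died out."""
    counts = [seed_cases]
    for _ in range(n_gen):
        counts.append(sum(poisson_draw(r0, rng) for _ in range(counts[-1])))
    return counts

rng = random.Random(42)
outbreak = generations(1.7, 5, rng)  # case counts over 5 generations
```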
Abstract:
Background. The rise in survival rates, along with more detailed follow-up using sophisticated imaging studies, among non-small cell lung cancer (NSCLC) patients has led to an increased risk of second primary tumors (SPT) among these cases. Population- and hospital-based studies of lung cancer patients treated between 1974 and 1996 have found an increasing risk over time for the development of all cancers following treatment of NSCLC. During this time the primary modalities for treatment were surgery alone, radiation alone, surgery and post-operative radiation therapy, or combinations of chemotherapy and radiation (sequentially or concurrently). There is limited information in the literature about the impact of treatment modalities on the development of second primary tumors in these patients. Purpose. To investigate the impact of treatment modalities on the risk of second primary tumors in patients receiving treatment with curative intent for non-metastatic (Stage I–III) NSCLC. Methods. The hospital records of 1,095 NSCLC patients who were diagnosed between 1980–2001 and received treatment with curative intent at M.D. Anderson Cancer Center with surgery alone, radiation alone (with a minimum total radiation dose of at least 45 Gy), surgery and post-operative radiation therapy, radiation therapy in combination with chemotherapy, or surgery in combination with chemotherapy and radiation were retrospectively reviewed. A second primary malignancy was defined as any tumor histologically different from the initial cancer, or of another anatomic location, or a tumor of the same location and histology as the initial tumor with an interval between cancers of at least five years. Only primary tumors occurring after treatment for NSCLC qualified as second primary tumors for this study. Results. The incidence of second primary tumors was 3.3%/year, and the rate increased over time following treatment.
The type of NSCLC treatment was not found to have a striking effect on SPT development. Increased rates were observed in the radiation-only and chemotherapy-plus-radiation treatment groups, but these increases did not exceed expected random variation. Higher radiation treatment dose, patient age, and weight loss prior to index NSCLC treatment were associated with higher SPT development.
Abstract:
Objective. To conduct a systematic review of published literature on preconception care in women with pre-existing diabetes, examining the effect of glycemic control and multivitamin use on the frequency of spontaneous abortion and birth defects. Methods. Articles were retrieved from Medline (1950–Dec 2007), Cochrane Library (1800–Dec 2007), Academic Search Complete (Ebsco) (Jan 1800–Dec 2007) and Maternal and Child Health Library (1965–Dec 2007). Studies included women with pre-existing, non-gestational diabetes and a comparison group. Participants must have either received preconception care and/or consumed a multivitamin as part of the study. Results. Overall, seven studies met the study criteria and were applicable to the study objectives. Four of these reported the frequency of spontaneous abortion. Only one found a statistically significant increased risk of spontaneous abortion among pregnant women who did not receive preconception care compared with those who did receive care (odds ratio 4.32; 95% CI 1.34 to 13.9). Of the seven studies, six reported the frequency of birth defects. Five of these six studies found a significantly increased rate of birth defects among pregnant women who did not receive preconception care compared with those who did receive care, with odds ratios ranging from 1.53 to 10.16. All seven studies based their preconception care intervention on glycemic control. One study also used multivitamins as part of the preconception care. Conclusion. Glycemic control was shown to be useful in reducing the prevalence of birth defects, but less useful in reducing the prevalence of spontaneous abortion. Insulin regimen options vary widely for the diabetic woman. No author excluded or controlled for women who may have been taking a multivitamin on their own.
Due to the small amount of literature available, it is still not known which preconception care option (glycemic control and/or multivitamin use) provides better protection from birth defects and spontaneous abortion for the diabetic woman. An area for future investigation would be glycemic control and the use of folic acid started before pregnancy and their effects on birth defects and spontaneous abortion.
Abstract:
Background. The number of infections of cardiac implantable electronic devices (CIED) continues to escalate out of proportion to the increasing rate of device implantation. Staphylococcal organisms account for 70% to 90% of all CIED infections. However, little is known about non-staphylococcal infections, which have been described only in case reports, small case series, or combined in larger studies with staphylococcal CIED infections, thereby diluting their individual impact. Methods. A retrospective review of hospital records identified patients admitted with CIED-related infections at four academic hospitals in Houston, Texas, between 2002 and 2009. Results. Of the 504 identified patients with CIED-related infection, 80 (16%) had a non-staphylococcal infection and were the focus of this study. Although the demographics and comorbidities of subjects were comparable to other reports, our study illustrates many key points: (a) the microbiologic diversity of non-staphylococcal infections was rather extensive, as it included other Gram-positive bacteria like streptococci and enterococci, a variety of Gram-negative bacteria, atypical bacteria including Nocardia and Mycobacteria, and fungi like Candida and Aspergillus; (b) the duration of CIED insertion prior to non-staphylococcal infection was relatively prolonged (mean, 109 ± 27 weeks); of these patients, 44% had their device previously manipulated within a mean of 29.5 ± 6 weeks; (c) non-staphylococcal organisms appear to be less virulent, cause prolonged clinical symptoms prior to admission (mean, 48 ± 12.8 days), and are associated with a lower mortality (4%) than staphylococcal organisms; (d) thirteen patients (16%) presented with CIED-related endocarditis; (e) although not described in prior reports, we identified 3 definite and 2 suspected cases of secondary Gram-negative bacteremia seeding of the CIED; and (f) inappropriate antimicrobial coverage was provided in approximately 50% of patients with
non-staphylococcal infections for a mean period of 2.1 days. Conclusions. Non-staphylococcal CIED-related infections are prevalent and diverse, with relatively low virulence and mortality. Since non-staphylococcal organisms are capable of secondarily seeding the CIED, a high suspicion for CIED-related infection is warranted in patients with bloodstream infection. Additionally, in patients with suspected CIED infection, adequate Gram-positive and Gram-negative antibacterial coverage should be administered until microbiologic data become available.
Abstract:
Objectives. To examine the association between prior rifamycin exposure and later development of C. difficile infection (CDI) caused by a rifamycin-resistant strain of C. difficile, and to compare patient characteristics between rifamycin-resistant and rifamycin-susceptible C. difficile infections. Methods. A case-control study was performed in a large university-affiliated hospital in Houston, Texas. Study subjects were patients with hospital-acquired C. difficile infection with culture-positive isolates of C. difficile for which in vitro rifaximin and rifampin susceptibility had been tested. Prior use of rifamycin and demographic and clinical characteristics were compared between case and control groups using univariate statistics. Results. A total of 49 C. difficile strains met the study inclusion criteria for rifamycin-resistant case isolates, and a total of 98 rifamycin-susceptible C. difficile strains were matched to case isolates. Of 49 case isolates, 12 (4%) were resistant to rifampin alone, 12 (4%) were resistant to rifaximin alone, and 25 (9%) were resistant to both rifampin and rifaximin. There was no significant association between prior rifamycin use and rifamycin-resistant CDI. Cases and controls did not differ according to demographic characteristics, length of hospital stay, known risk factors for CDI, type of CDI onset, or pre-infection medical co-morbidities. Our results on 37 rifaximin-resistant isolates (MIC ≥32 μg/ml) showed that more than half of the isolates had a rifaximin MIC ≥256 μg/ml, and of these, 19 isolates had MICs ≥1024 μg/ml. Conclusions. In a large series of rifamycin-non-susceptible isolates, no patient characteristics were independently associated with rifamycin-resistant CDI. These data suggest that factors beyond previous use of rifamycin antibiotics are primary risk factors for rifamycin-resistant C. difficile.
Abstract:
The high-altitude lake Tso Moriri (32°55'46'' N, 78°19'24'' E; 4522 m a.s.l.) is situated at the margin of the ISM and westerly influences in the Trans-Himalayan region of Ladakh. Human settlements are rare, and domestic and wild animals concentrate on the alpine meadows. A set of modern surface samples and fossil pollen from the deep-water TMD core was evaluated with a focus on indicator types revealing human impact, grazing activities and lake system development during the last ca. 12 cal ka BP. Furthermore, the non-pollen palynomorph (NPP) record, comprising remains of limnic algae and invertebrates as well as fungal spores and charred plant tissue fragments, was examined in order to attest palaeolimnic phases and human impact, respectively. Changes in the early and middle Holocene limnic environment are mainly influenced by regional climatic conditions and glacier-fed meltwater flow in the catchment area. The NPP record indicates low lake productivity with high influx of freshwater between ca. 11.5 and 4.5 cal ka BP, which is in agreement with the regional monsoon dynamics and published climate reconstructions. Geomorphologic observations suggest that during this period of enhanced precipitation the lake had a regular outflow and contributed large amounts of water to the Sutlej River, the lower reaches of which were an integral part of the Indus Civilization area. The inferred minimum freshwater input and maximum lake productivity between ca. 4.5-1.8 cal ka BP coincide with the reconstruction of greatest aridity and glaciation in the Korzong valley, resulting in significantly reduced or even ceased outflow. We suggest that lowered lake levels and river discharge on a larger regional scale may have caused irrigation problems and harvest losses in the Indus valley and lowlands occupied by sedentary agricultural communities. This scenario, in turn, supports the theory that Mature Harappan urbanism (ca. 4.5-3.9 cal ka BP) emerged in order to facilitate storage, protection, administration, and redistribution of crop yields, and that the eventual collapse of the Harappan Culture (ca. 3.5-3 cal ka BP) was promoted by prolonged aridity. There is no clear evidence for human impact around Tso Moriri prior to ca. 3.7 cal ka BP, with a more distinct record since ca. 2.7 cal ka BP. This suggests that the sedimentary record from Tso Moriri primarily archives the regional climate history.
Abstract:
Blind deconvolution is the estimation of a sharp image and a blur kernel from an observed blurry image. Because the blur model admits several solutions, it is necessary to devise an image prior that favors the true blur kernel and sharp image. Many successful image priors enforce the sparsity of the sharp image gradients. Ideally the L0 “norm” is the best choice for promoting sparsity, but because it is computationally intractable, some methods have used a logarithmic approximation. In this work we also study a logarithmic image prior. We show empirically how well the prior suits the blind deconvolution problem. Our analysis confirms experimentally the hypothesis that a prior need not model natural image statistics to correctly estimate the blur kernel. Furthermore, we show that a simple maximum a posteriori formulation is enough to achieve state-of-the-art results. To minimize this formulation we devise two iterative minimization algorithms that cope with the non-convexity of the logarithmic prior: one obtained via the primal-dual approach and one via majorization-minimization.
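A minimal sketch of the logarithmic prior and of the majorization-minimization ingredient, written here for a 1-D signal (the paper works with 2-D image gradients; the constant `eps` is an illustrative choice):

```python
import math

def log_prior(u, eps=1e-3):
    """Logarithmic sparsity prior on the gradients of a 1-D signal:
    E(u) = sum_i log((u[i+1] - u[i])**2 + eps),
    a smooth, non-convex surrogate for the L0 'norm'."""
    return sum(math.log((u[i + 1] - u[i]) ** 2 + eps)
               for i in range(len(u) - 1))

def mm_weights(u, eps=1e-3):
    """Majorization-minimization step: the concave log is majorized by
    its tangent at the current iterate, so the prior becomes a weighted
    quadratic with per-gradient weights
    w_i = 1 / ((u[i+1] - u[i])**2 + eps),
    which makes each subproblem a reweighted least-squares solve."""
    return [1.0 / ((u[i + 1] - u[i]) ** 2 + eps)
            for i in range(len(u) - 1)]
```

A piecewise-constant signal (one large jump) scores a lower energy than a smooth ramp between the same endpoints, which is exactly the sparsity-promoting behaviour the prior is chosen for.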
Abstract:
In recent years, several moving object detection strategies based on non-parametric background-foreground modeling have been proposed. To combine both models and to obtain the probability that a pixel belongs to the foreground, these strategies make use of Bayesian classifiers. However, these classifiers do not allow taking advantage of additional prior information at different pixels. We therefore propose a novel and efficient alternative Bayesian classifier that is suitable for this kind of strategy and that allows the use of arbitrary prior information. Additionally, we present an effective method to dynamically estimate prior probabilities from the results of a particle filter-based tracking strategy.
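A per-pixel classifier of this kind can be sketched as a direct application of Bayes' rule; the function name and the uniform-prior fallback are illustrative, not taken from the paper:

```python
def foreground_posterior(lik_fg, lik_bg, prior_fg):
    """P(foreground | observation) at one pixel, from non-parametric
    likelihoods p(x | FG), p(x | BG) and a spatially varying prior
    P(FG). With prior_fg = 0.5 everywhere this reduces to the classical
    likelihood-ratio classifier; the point of a per-pixel prior is that
    prior_fg can instead be estimated locally, e.g. from a
    particle-filter tracker."""
    num = lik_fg * prior_fg
    den = num + lik_bg * (1.0 - prior_fg)
    return num / den if den > 0 else 0.5  # uninformative fallback
```

For instance, with equal likelihoods the posterior simply echoes the prior, so a tracker that raises the prior inside a predicted object region raises the foreground probability there even when the pixel evidence is ambiguous.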