779 results for Associated management
Abstract:
OBJECTIVE To evaluate possible predictive factors for recurrence after laparoscopic segmental bowel resection for bowel endometriosis. DESIGN Cohort study. SETTING Academic tertiary referral center. METHODS 95 symptomatic women with bowel endometriosis who underwent laparoscopic segmental bowel resection at the Endometriosis clinic, University of Berne, between 2002 and 2012 were enrolled. Because 14 women were lost to follow-up, 81 formed the final cohort. Clinical and histological characteristics were examined as possible predictive factors for disease recurrence. MAIN OUTCOME MEASURES Recurrence, defined as a subsequent operation due to recurrent endometriosis-associated pain with a histologically confirmed endometriotic lesion. RESULTS Recurrence was observed in 13 (16%) patients. Variables significantly associated with recurrence in the Cox regression analysis were positive bowel resection margins (hazard ratio 6.5, 95% confidence interval 1.8-23.5, p = 0.005), age <31 years (hazard ratio 5.6, 95% confidence interval 1.7-18.6, p = 0.005) and body mass index ≥23 kg/m² (hazard ratio 11.0, 95% confidence interval 2.7-44.6, p = 0.001). CONCLUSIONS Positive bowel resection margins, age <31 years and body mass index ≥23 kg/m² appear to be independent predictors of disease recurrence.
Abstract:
Germline mutation testing in patients with colorectal cancer (CRC) is offered only to a subset of patients with a clinical presentation or tumor histology suggestive of familial CRC syndromes, probably underestimating familial CRC predisposition. The aim of our study was to determine whether unbiased screening of newly diagnosed CRC cases with next generation sequencing (NGS) increases the overall detection rate of germline mutations. We analyzed 152 consecutive CRC patients for germline mutations in 18 CRC-associated genes using NGS. All patients were also evaluated for Bethesda criteria, and all tumors were investigated for microsatellite instability, immunohistochemistry for mismatch repair proteins, and the BRAF*V600E somatic mutation. NGS identified 27 variants in 9 genes in 23 out of 152 patients studied (18%). Three of them had already been reported as pathogenic, and 12 were class 3 germline variants with an uncertain prediction of pathogenicity. Only 1 of these patients fulfilled Bethesda criteria and had a microsatellite instable tumor and an MLH1 germline mutation. The others would have been missed with current approaches: 2 with an MSH6 premature termination mutation and 12 with uncertain, potentially pathogenic class 3 variants in APC, MLH1, MSH2, MSH6, MSH3 and MLH3. The higher NGS mutation detection rate compared with current testing strategies based on clinicopathological criteria is probably due to the large genetic heterogeneity and overlapping clinical presentation of the various CRC syndromes. It can also identify apparently nonpenetrant germline mutations, complicating the clinical management of the patients and their families.
Abstract:
The Swiss Consultant Trust Fund (CTF) support covered the period from July to December 2007 and comprised four main tasks: (1) analysis of historic land degradation trends in the four watersheds of Zerafshan, Surkhob, Toirsu, and Vanj; (2) translation of standard CDE GIS training materials into Russian and Tajik to enable local government staff and other specialists to use geospatial data and tools; (3) demonstration of geospatial tools that show land degradation trends associated with land use and vegetative cover data in the project areas; and (4) preliminary training of government staff in using appropriate data, including existing information, global datasets, inexpensive satellite imagery and other datasets, and web-based visualization tools such as spatial data viewers. The project allowed the building of local awareness of, and skills in, up-to-date, inexpensive, easy-to-use GIS technologies, data sources, and applications relevant to natural resource management and especially to sustainable land management. In addition to supporting the implementation of the World Bank technical assistance activity to build capacity in the use of geospatial tools for natural resource management, the Swiss CTF support also aimed at complementing the Bank's supervision work on the ongoing Community Agriculture and Watershed Management Project (CAWMP).
Abstract:
Perceived profitability is a key factor in explaining farmers' decision to adopt or not adopt sustainable land management (SLM) technologies. Despite this importance, relatively little is known about the economics of SLM. This paper contributes to the literature by analysing data on costs and perceived cost/benefit ratios of SLM technologies. Data are taken from the World Overview of Conservation Approaches and Technologies technology database and cover 363 case studies conducted in a variety of countries between 1990 and 2012. Based on an in-depth descriptive analysis, we determine what costs accrue to local stakeholders and assess perceived short-term and long-term cost/benefit ratios. Our results show that a large majority of the technologies in our sample are perceived as being profitable: 73% were perceived to have a positive or at least neutral cost/benefit ratio in the short term, while 97% were perceived to have a positive or very positive cost/benefit ratio in the long term. An additional empirical analysis confirms that economic factors are key determinants of land users' decisions to adopt or not adopt SLM technologies. We conclude that a wide range of existing SLM practices generate considerable benefits not only for land users, but for other stakeholders as well. High initial investment costs associated with some practices may, however, constitute a barrier to their adoption; short-term support for land users can help to promote these practices where appropriate.
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possible inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all of his or her applications are met, and in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure providers are interested in optimally provisioning the virtual resources onto the available physical infrastructure so that their operational costs are minimized, while maximizing the performance of tenants' applications.
Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of application services bound to virtual machines (VMs). We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for the dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces, and we show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
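The kind of SLA-driven VM scaling the thesis describes can be illustrated with a minimal threshold-based control rule. This is a hypothetical sketch, not the thesis's actual algorithm: the function name, thresholds, and dead-band values are all illustrative assumptions.

```python
def scale_decision(current_vms, avg_response_ms, sla_ms,
                   min_vms=1, max_vms=20,
                   upper=0.9, lower=0.5):
    """Decide the VM count for the next control interval.

    Scale out when the measured response time approaches the SLA limit,
    scale in when there is ample headroom, otherwise hold steady.
    All thresholds are illustrative assumptions.
    """
    utilisation = avg_response_ms / sla_ms  # fraction of the SLA budget used
    if utilisation > upper:                 # close to violating the SLA
        return min(current_vms + 1, max_vms)
    if utilisation < lower:                 # plenty of headroom -> save cost
        return max(current_vms - 1, min_vms)
    return current_vms                      # inside the dead band: no change

# Example: the SLA guarantees 200 ms; the measured average is 190 ms,
# so the controller scales out by one VM.
print(scale_decision(current_vms=4, avg_response_ms=190, sla_ms=200))  # 5
```

A real controller of this kind would typically add cooldown periods between scaling actions to avoid oscillation; the thesis's genetic-algorithm allocator addresses the separate problem of placing the resulting VMs onto physical hosts.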
Abstract:
Ensuring sustainable use of natural resources is crucial for maintaining the basis for our livelihoods. With threats from climate change, disputes over water, biodiversity loss, competing claims on land, and migration increasing worldwide, the demand for sustainable land management (SLM) practices will only increase in the future. For years already, various national and international organizations (GOs, NGOs, donors, research institutes, etc.) have been working on alternative forms of land management, and numerous land users worldwide – especially small farmers – have been testing, adapting, and refining new and better ways of managing land. All too often, however, the resulting SLM knowledge has not been sufficiently evaluated, documented and shared. Among other things, this has often prevented valuable SLM knowledge from being channelled into evidence-based decision-making processes. Indeed, proper knowledge management is crucial for SLM to reach its full potential. For more than 20 years, the international WOCAT network has documented and promoted SLM through its global platform. As a whole, the WOCAT methodology comprises tools for documenting, evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing, analysis and use for decision support in the field, at the planning level, and in scaling up identified good practices. In early 2014, WOCAT's growth and ongoing improvement culminated in its being officially recognized by the UNCCD as the primary recommended database for SLM best practices. Over the years, the WOCAT network confirmed that SLM helps to prevent desertification, increase biodiversity, enhance food security, and make people less vulnerable to the effects of climate variability and change. In addition, it plays an important role in mitigating climate change through improving soil organic matter and increasing vegetation cover.
In-depth assessments of SLM practices from desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats. The impacts mentioned most were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improving soil moisture, or reducing runoff. Among others, favourable local-scale cost-benefit relationships of SLM practices play a crucial role in their adoption. An economic analysis from the WOCAT database showed that land users perceive a large majority of the technologies as having benefits that outweigh costs in the long term. The high investment costs associated with some practices may constitute a barrier to adoption; however, where appropriate, short-term support for land users can help to promote these practices. The increased global concerns about climate change, disaster risks and food security redirect attention to, and trigger more funds for, SLM. To provide the necessary evidence-based rationale for investing in SLM and to reinforce expert and land users' assessments of SLM impacts, more field research using inter- and transdisciplinary approaches is needed. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as currently aimed at within the EU FP7 projects CASCADE and RECARE.
Abstract:
OBJECTIVES Rates of TB/HIV coinfection and multi-drug resistant (MDR)-TB are increasing in Eastern Europe (EE). We aimed to study clinical characteristics, factors associated with MDR-TB and the predicted activity of empiric anti-TB treatment at the time of TB diagnosis among TB/HIV coinfected patients in EE, Western Europe (WE) and Latin America (LA). DESIGN AND METHODS Between January 1, 2011, and December 31, 2013, 1413 TB/HIV patients (62 clinics in 19 countries in EE, WE, Southern Europe (SE), and LA) were enrolled. RESULTS Significant differences were observed between EE (N = 844), WE (N = 152), SE (N = 164), and LA (N = 253) in the proportion of patients with a definite TB diagnosis (47%, 71%, 72% and 40%, p<0.0001), MDR-TB (40%, 5%, 3% and 15%, p<0.0001), and use of combination antiretroviral therapy (cART) (17%, 40%, 44% and 35%, p<0.0001). Injecting drug use (adjusted OR (aOR) = 2.03 (95% CI 1.00-4.09)), prior anti-TB treatment (3.42 (1.88-6.22)), and living in EE (7.19 (3.28-15.78)) were associated with MDR-TB. Among 585 patients with drug susceptibility test (DST) results, the empiric (i.e. without knowledge of the DST results) anti-TB treatment included ≥3 active drugs in 66% of participants in EE compared with 90-96% in other regions (p<0.0001). CONCLUSIONS In EE, TB/HIV patients were less likely to receive a definite TB diagnosis, more likely to harbour MDR-TB and commonly received empiric anti-TB treatment with reduced activity. Improved management of TB/HIV patients in EE requires better access to TB diagnostics including DSTs, empiric anti-TB therapy directed at both susceptible and MDR-TB, and more widespread use of cART.
Abstract:
AIMS Our aim was to report on a survey initiated by the European Association of Percutaneous Cardiovascular Interventions (EAPCI) collecting the opinion of the cardiology community on the invasive management of acute coronary syndrome (ACS), before and after the MATRIX trial presentation at the American College of Cardiology (ACC) 2015 Scientific Sessions. METHODS AND RESULTS A web-based survey was distributed to all individuals registered on the EuroIntervention mailing list (n=15,200). A total of 572 and 763 physicians responded to the pre- and post-ACC survey, respectively. The radial approach emerged as the preferable access site for ACS patients undergoing invasive management with roughly every other responder interpreting the evidence for mortality benefit as definitive and calling for a guidelines upgrade to class I. The most frequently preferred anticoagulant in ACS patients remains unfractionated heparin (UFH), due to higher costs and greater perceived thrombotic risks associated with bivalirudin. However, more than a quarter of participants declared the use of bivalirudin would increase after MATRIX. CONCLUSIONS The MATRIX trial reinforced the evidence for a causal association between bleeding and mortality and triggered consensus on the superiority of the radial versus femoral approach. The belief that bivalirudin mitigates bleeding risk is common, but UFH still remains the preferred anticoagulant based on lower costs and thrombotic risks.
Abstract:
Bullous pemphigoid is the most common autoimmune subepidermal blistering disease of the skin and mucous membranes. This disease typically affects the elderly and presents with itch and localized or generalized bullous lesions. In up to 20% of affected patients, bullae may be completely absent, and only excoriations, prurigo-like lesions, eczematous lesions, urticated lesions and/or infiltrated plaques are observed. The disease is significantly associated with neurological disorders. The morbidity of bullous pemphigoid and its impact on quality of life are significant. So far, a limited number of national treatment guidelines have been proposed, but no common European consensus has emerged. Our consensus for the treatment of bullous pemphigoid has been developed under the guidance of the European Dermatology Forum in collaboration with the European Academy of Dermatology and Venereology. It summarizes evidence-based and expert-based recommendations.
Abstract:
Pelvic discontinuity is a complex problem in revision total hip arthroplasty. Although rare, the incidence is likely to increase due to the ageing population and the increasing number of total hip arthroplasties being performed. The various surgical options available to solve this problem include plating, massive allografts, reconstruction rings, custom triflanged components and tantalum implants. However, the optimal solution remains controversial. None of the known methods completely solves the major obstacles associated with this problem, such as restoration of massive bone loss, implant failure in the short- and long-term and high complication rates. This review discusses the diagnosis, decision making, and treatment options of pelvic discontinuity in revision total hip arthroplasty.
Abstract:
BACKGROUND Symptoms associated with pes planovalgus or flatfeet occur frequently, even though some people with a flatfoot deformity remain asymptomatic. Pes planovalgus is proposed to be associated with foot/ankle pain and poor function. Concurrently, the multifactorial weakness of the tibialis posterior muscle and its tendon can lead to a flattening of the longitudinal arch of the foot. Those affected can experience functional impairment and pain. Less severe cases at an early stage are eligible for non-surgical treatment, and foot orthoses are considered to be the first-line approach. Furthermore, strengthening of arch- and ankle-stabilising muscles is thought to contribute to active compensation of the deformity, leading to stress relief of soft tissue structures. There is only limited evidence concerning the numerous therapy approaches, and so far, no data are available showing functional benefits that accompany these interventions. METHODS After clinical diagnosis and clarification of inclusion criteria (e.g., age 40-70, current complaint of foot and ankle pain for more than three months, posterior tibial tendon dysfunction stage I & II, longitudinal arch flattening verified by radiography), sixty participants with posterior tibial tendon dysfunction-associated complaints will be included in the study and will be randomly assigned to one of three different intervention groups: (i) foot orthoses only (FOO), (ii) foot orthoses and eccentric exercise (FOE), or (iii) sham foot orthoses only (FOS). Participants in the FOO and FOE groups will be allocated individualised foot orthoses, the latter combined with eccentric exercise for ankle stabilisation and strengthening of the tibialis posterior muscle. Participants in the FOS group will be allocated sham foot orthoses only.
During the intervention period of 12 weeks, all participants will be encouraged to follow an educational program for dosed foot load management (e.g., to stop activity if they experience increasing pain). Functional impairment will be evaluated pre- and post-intervention by the Foot Function Index. Further outcome measures include the Pain Disability Index, Visual Analogue Scale for pain, SF-12, kinematic data from 3D-movement analysis and neuromuscular activity during level and downstairs walking. Measuring outcomes pre- and post-intervention will allow the calculation of intervention effects by 3×3 Analysis of Variance (ANOVA) with repeated measures. DISCUSSION The purpose of this randomised trial is to evaluate the therapeutic benefit of three different non-surgical treatment regimens in participants with posterior tibial tendon dysfunction and accompanying pes planovalgus. Furthermore, the analysis of changes in gait mechanics and neuromuscular control will contribute to an enhanced understanding of functional changes and eventually optimise conservative management strategies for these patients. TRIAL REGISTRATION ClinicalTrials.gov Protocol Registration System: ClinicalTrials.gov ID NCT01839669.
Abstract:
OBJECTIVE To determine the success of medical management of presumptive cervical disk herniation in dogs and variables associated with treatment outcome. DESIGN Retrospective case series. ANIMALS Dogs (n=88) with presumptive cervical disk herniation. METHODS Dogs with presumptive cervical and thoracolumbar disk herniation were identified from medical records at 2 clinics and clients were mailed a questionnaire related to the success of therapy, clinical recurrence of signs, and quality of life (QOL) as interpreted by the owner. Signalment, duration and degree of neurologic dysfunction, and medication administration were determined from medical records. RESULTS Ninety-seven percent of dogs (84/87) with complete information were described as ambulatory at initial evaluation. Successful treatment was reported for 48.9% of dogs with 33% having recurrence of clinical signs and 18.1% having therapeutic failure. Bivariable logistic regression showed that non-steroidal anti-inflammatory drug (NSAID) administration was associated with success (P=.035; odds ratio [OR]=2.52). Duration of cage rest and glucocorticoid administration were not significantly associated with success or QOL. Dogs with less-severe neurologic dysfunction were more likely to have a successful outcome (OR=2.56), but this association was not significant (P=.051). CONCLUSIONS Medical management can lead to an acceptable outcome in many dogs with presumptive cervical disk herniation. Based on these data, NSAIDs should be considered as part of the therapeutic regimen. Cage rest duration and glucocorticoid administration do not appear to benefit these dogs, but this should be interpreted cautiously because of the retrospective data collection and use of client self-administered questionnaire follow-up. CLINICAL RELEVANCE These results provide insight into the success of medical management for presumptive cervical disk herniation in dogs and may allow for refinement of treatment protocols.
Abstract:
OBJECTIVE To determine the success of medical management of presumptive thoracolumbar disk herniation in dogs and the variables associated with treatment outcome. STUDY DESIGN Retrospective case series. ANIMALS Dogs (n=223) with presumptive thoracolumbar disk herniation. METHODS Medical records from 2 clinics were used to identify affected dogs, and owners were mailed a questionnaire about success of therapy, recurrence of clinical signs, and quality of life (QOL) as interpreted by the owner. Signalment, duration and degree of neurologic dysfunction, and medication administration were determined from medical records. RESULTS Eighty-three percent of dogs (185/223) were ambulatory at initial evaluation. Successful treatment was reported for 54.7% of dogs, with 30.9% having recurrence of clinical signs and 14.4% classified as therapeutic failures. From bivariable logistic regression, glucocorticoid administration was negatively associated with success (P=.008; odds ratio [OR]=.48) and QOL scores (P=.004; OR=.48). The duration of cage rest was not significantly associated with success or QOL. Nonambulatory dogs were more likely to have lower QOL scores (P=.01; OR=2.34). CONCLUSIONS Medical management can lead to an acceptable outcome in many dogs with presumptive thoracolumbar disk herniation. Cage rest duration does not seem to affect outcome and glucocorticoids may negatively impact success and QOL. The conclusions in this report should be interpreted cautiously because of the retrospective data collection and the use of client self-administered questionnaire follow-up. CLINICAL RELEVANCE These results provide an insight into the success of medical management for presumptive thoracolumbar disk herniation in dogs and may allow for refinement of treatment protocols.
Abstract:
Pleural infection is a frequent clinical condition. Prompt treatment has been shown to reduce hospital costs, morbidity and mortality. Recent advances in treatment have been variably implemented in clinical practice. This statement reviews the latest developments and concepts to improve clinical management and stimulate further research. The European Association for Cardio-Thoracic Surgery (EACTS) Thoracic Domain and the EACTS Pleural Diseases Working Group established a team of thoracic surgeons to produce a comprehensive review of the available scientific evidence, with the aim of covering all aspects of surgical practice related to its treatment, in particular focusing on: surgical treatment of empyema in adults; surgical treatment of empyema in children; and surgical treatment of post-pneumonectomy empyema (PPE). In the management of Stage 1 empyema, prompt pleural space chest tube drainage is required. In patients with Stage 2 or 3 empyema who are fit enough to undergo an operative procedure, there is a demonstrated benefit of surgical debridement or decortication [possibly by video-assisted thoracoscopic surgery (VATS)] over tube thoracostomy alone in terms of treatment success and reduction in hospital stay. In children, a primary operative approach is an effective management strategy, associated with a lower mortality rate and a reduction of tube thoracostomy duration, length of antibiotic therapy, reintervention rate and hospital stay. Intrapleural fibrinolytic therapy is a reasonable alternative to primary operative management. Uncomplicated PPE [without bronchopleural fistula (BPF)] can be effectively managed with minimally invasive techniques, including fenestration, pleural space irrigation and VATS debridement. PPE associated with BPF can be effectively managed with individualized open surgical techniques, including direct repair, myoplastic and thoracoplastic techniques.
Intrathoracic vacuum-assisted closure may be considered as an adjunct to the standard treatment. The current literature cements the role of VATS in the management of pleural empyema, even if the choice of surgical approach relies on the individual surgeon's preference.
Abstract:
OBJECTIVE There is controversy regarding the significance of radiological consolidation in the context of COPD exacerbation (eCOPD). While some studies into eCOPD exclude these cases, consolidation is a common feature of eCOPD admissions in real practice. This study aims to address the question of whether consolidation in eCOPD is a distinct clinical phenotype with implications for management decisions and outcomes. PATIENTS AND METHODS The European COPD Audit was carried out in 384 hospitals from 13 European countries between 2010 and 2011 to analyze guideline adherence in eCOPD. In this analysis, admissions were split according to the presence or absence of consolidation on the admission chest radiograph. Groups were compared in terms of clinical and epidemiological features, existing treatment, clinical care utilized and mortality. RESULTS 14,111 cases were included, comprising 2,714 (19.2%) with consolidation and 11,397 (80.8%) without. The risk of radiographic consolidation increased with age, female gender, cardiovascular diseases, having had two or more admissions in the previous year, and sputum color change. Previous treatment with inhaled steroids was not associated with consolidation. Patients with radiographic consolidation were significantly more likely to receive antibiotics, oxygen and non-invasive ventilation during the admission and had lower survival from admission to 90-day follow-up. CONCLUSIONS Patients admitted for COPD exacerbation who have radiological consolidation have a more severe illness course, are treated more intensively by clinicians and have a poorer prognosis. We recommend that these patients be considered a distinct subset in COPD exacerbation.