14 results for Primary history
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
In many patients, optimal results after pallidal deep brain stimulation (DBS) for primary dystonia may appear over several months, possibly beyond 1 year after implant. In order to elucidate the factors predicting such protracted clinical effect, we retrospectively reviewed the clinical records of 44 patients with primary dystonia and bilateral pallidal DBS implants. Patients with fixed skeletal deformities, as well as those with a history of prior ablative procedures, were excluded. The Burke-Fahn-Marsden Dystonia Rating Scale (BFMDRS) scores at baseline, 1 and 3 years after DBS were used to evaluate clinical outcome. All subjects showed a significant improvement after DBS implants (mean BFMDRS improvement of 74.9% at 1 year and 82.6% at 3 years). Disease duration (DD, median 15 years, range 2-42) and age at surgery (AS, median 31 years, range 10-59) showed a significant negative correlation with DBS outcome at 1 and 3 years. A partition analysis, using DD and AS, clustered subjects into three groups: (1) younger subjects with shorter DD (n = 19, AS < 27, DD ≤ 17); (2) older subjects with shorter DD (n = 8, DD ≤ 17, AS ≥ 27); (3) older subjects with longer DD (n = 17, DD > 17, AS ≥ 27). Younger patients with short DD benefitted more and faster than older patients, who nonetheless continued to improve by 10% on average beyond 1 year after DBS implants. Our data suggest that subjects with short DD may expect to achieve a better general outcome than those with longer DD and that AS may influence the time necessary to achieve maximal clinical response.
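The age-at-surgery/disease-duration partition described above can be sketched as a simple classifier. The thresholds (DD ≤ 17 years, AS ≥ 27 years) are taken from the abstract; the function itself is an illustrative reconstruction, not the authors' code, and it assumes, as held in this cohort, that every patient with DD > 17 was at least 27 at surgery:

```python
# Illustrative reconstruction of the reported partition of 44 dystonia
# patients by disease duration (DD, years) and age at surgery (AS, years).
# Thresholds come from the abstract; the code itself is hypothetical.

def dbs_outcome_group(AS: float, DD: float) -> int:
    """Assign a patient to one of the three reported partition groups."""
    if DD <= 17:
        return 1 if AS < 27 else 2  # shorter DD: split on age at surgery
    return 3                        # longer DD (all such patients had AS >= 27)

# A 25-year-old with 10 years of dystonia falls into the fast-responding group 1
assert dbs_outcome_group(25, 10) == 1
assert dbs_outcome_group(40, 10) == 2
assert dbs_outcome_group(40, 25) == 3
```

The group sizes in the abstract (19 + 8 + 17 = 44) account for all patients, which is why the sketch needs no fourth branch.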
Management of primary ciliary dyskinesia in European children: recommendations and clinical practice
Abstract:
The European Respiratory Society Task Force on primary ciliary dyskinesia (PCD) in children recently published recommendations for diagnosis and management. This paper compares these recommendations with current clinical practice in Europe. Questionnaires were returned by 194 paediatric respiratory centres caring for PCD patients in 26 countries. In most countries, PCD care was not centralised, with a median (interquartile range) of 4 (2-9) patients treated per centre. Overall, 90% of centres had access to nasal or bronchial mucosal biopsy. Samples were analysed by electron microscopy (77%) and ciliary function tests (57%). Nasal nitric oxide was used for screening in 46% of centres and saccharin tests in 36%. Treatment approaches varied widely, both within and between countries. European region, size of centre and the country's general government expenditure on health partly defined availability of advanced diagnostic tests and choice of treatments. In conclusion, we found substantial heterogeneity in management of PCD within and between countries, and poor concordance with current recommendations. This demonstrates how essential it is to standardise management and decrease inequality between countries. Our results also demonstrate the urgent need for research: to simplify PCD diagnosis, to understand the natural history and to test the effectiveness of interventions.
Abstract:
OBJECTIVE: To evaluate the effectiveness of a practice nurse led strategy to improve the notification and treatment of partners of people with chlamydia infection. DESIGN: Randomised controlled trial. SETTING: 27 general practices in the Bristol and Birmingham areas. PARTICIPANTS: 140 men and women with chlamydia (index cases) diagnosed by screening of a home collected urine sample or vulval swab specimen. INTERVENTIONS: Partner notification at the general practice immediately after diagnosis by trained practice nurses, with telephone follow up by a health adviser; or referral to a specialist health adviser at a genitourinary medicine clinic. MAIN OUTCOME MEASURES: Primary outcome was the proportion of index cases with at least one treated sexual partner. Specified secondary outcomes included the number of sexual contacts elicited during a sexual history, positive test result for chlamydia six weeks after treatment, and the cost of each strategy at 2003 prices in pounds sterling. RESULTS: 65.3% (47/72) of participants receiving practice nurse led partner notification had at least one partner treated compared with 52.9% (36/68) of those referred to a genitourinary medicine clinic (risk difference 12.4%, 95% confidence interval -1.8% to 26.5%). Of 68 participants referred to the clinic, 21 (31%) did not attend. The costs per index case were £32.55 for the practice nurse led strategy and £32.62 for the specialist referral strategy. CONCLUSION: Practice based partner notification by trained nurses with telephone follow up by health advisers is at least as effective as referral to a specialist health adviser at a genitourinary medicine clinic, and costs the same. TRIAL REGISTRATION: ClinicalTrials.gov NCT00112255.
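The headline result above is, arithmetically, a difference of two proportions. A minimal sketch recomputing it from the reported percentages and arm sizes; note the published confidence interval (-1.8% to 26.5%) comes from the trial's own analysis, so the crude Wald interval below will not reproduce it exactly:

```python
# Crude risk difference between trial arms, recomputed from the reported
# proportions (65.3% of 72 vs 52.9% of 68). Plain Wald interval only; the
# trial's published CI (-1.8% to 26.5%) used its own method, so the exact
# limits below differ slightly, though both straddle zero.
from math import sqrt

p1, n1 = 0.653, 72  # practice nurse led partner notification
p2, n2 = 0.529, 68  # referral to genitourinary medicine clinic

rd = p1 - p2                                        # 0.124, i.e. 12.4%
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)  # Wald standard error
ci = (rd - 1.96 * se, rd + 1.96 * se)               # crosses zero, as reported
```

Since the interval includes zero, the trial can claim non-inferiority ("at least as effective") rather than superiority, which matches the stated conclusion.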
Abstract:
OBJECTIVE: Initial presentation with primary spinal involvement in chronic recurrent multifocal osteomyelitis of childhood (CRMO) is rare. Our objective was to review the imaging appearances of three patients with CRMO who initially presented with isolated primary spinal involvement. DESIGN AND PATIENTS: The imaging, clinical, laboratory and histology findings of the three patients were retrospectively reviewed. Imaging included seven spinal MR imaging scans, one computed tomography scan, nine bone scans, two tomograms and 16 radiographs. These were reviewed by two musculoskeletal radiologists and a consensus view is reported. All three patients presented with atraumatic spinal pain and had extensive spinal bone pathology. The patients were aged 11, 13 and 12 years; two were female and one male. RESULTS AND CONCLUSIONS: The first patient had thoracic T6 and T8 vertebra plana. Bone scan showed additional vertebral body involvement. Follow-up was available over a 3 year period. The second patient had partial collapse of T9 and, 2 years later, of C6. Subsequently, extensive multifocal disease ensued and follow-up was available over 8 years. The third patient initially had L3 inferior partial collapse and 1 year later T8 involvement with multifocal disease. Follow-up was available over 3 years. The imaging findings of the three patients include partial and complete vertebra plana with a subchondral line adjacent to endplates associated with bone marrow MR signal alterations. Awareness of these imaging appearances may help the radiologist to include this entity in the differential diagnosis in children who present with spinal pathology and no history of trauma. Histopathological examination excludes tumor and infection but, with typical imaging findings, may not always be necessary.
Abstract:
BACKGROUND: Ischemic stroke is a leading cause of mortality worldwide and a major contributor to neurological disability and dementia. Terutroban is a specific TP receptor antagonist with antithrombotic, antivasoconstrictive, and antiatherosclerotic properties, which may be of interest for the secondary prevention of ischemic stroke. This article describes the rationale and design of the Prevention of cerebrovascular and cardiovascular Events of ischemic origin with teRutroban in patients with a history oF ischemic strOke or tRansient ischeMic Attack (PERFORM) Study, which aims to demonstrate the superiority of the efficacy of terutroban versus aspirin in secondary prevention of cerebrovascular and cardiovascular events. METHODS AND RESULTS: The PERFORM Study is a multicenter, randomized, double-blind, parallel-group study being carried out in 802 centers in 46 countries. The study population includes patients aged ≥55 years who have suffered an ischemic stroke (≤3 months) or a transient ischemic attack (≤8 days). Participants are randomly allocated to terutroban (30 mg/day) or aspirin (100 mg/day). The primary efficacy endpoint is a composite of ischemic stroke (fatal or nonfatal), myocardial infarction (fatal or nonfatal), or other vascular death (excluding hemorrhagic death of any origin). Safety is being evaluated by assessing hemorrhagic events. Follow-up is expected to last for 2-4 years. Assuming a relative risk reduction of 13%, the expected number of primary events is 2,340. To obtain statistical power of 90%, this requires inclusion of at least 18,000 patients in this event-driven trial. The first patient was randomized in February 2006. CONCLUSIONS: The PERFORM Study will explore the benefits and safety of terutroban in secondary cardiovascular prevention after a cerebral ischemic event.
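The sample-size statement above can be illustrated with the standard Schoenfeld approximation for event-driven log-rank trials. The hazard ratio of 0.87 (read off the 13% relative risk reduction) is an assumption of this sketch, and the trial's 2,340 planned events reflect further design details (composite endpoint, interim analyses) beyond this textbook formula:

```python
# Back-of-the-envelope required event count via Schoenfeld's formula:
#   D = 4 * (z_{1-a/2} + z_{1-b})^2 / (ln HR)^2    (1:1 allocation)
# HR = 0.87 (13% relative risk reduction), two-sided alpha = 0.05,
# power = 90% -- a textbook sketch, not the trial's actual calculation.
from math import log
from statistics import NormalDist

hr = 0.87
z_alpha = NormalDist().inv_cdf(0.975)  # ~1.96 for two-sided alpha = 0.05
z_beta = NormalDist().inv_cdf(0.90)    # ~1.28 for 90% power

events = 4 * (z_alpha + z_beta) ** 2 / log(hr) ** 2
print(round(events))  # 2167 -- same order as the trial's 2,340 planned events
```

The gap between ~2,170 and 2,340 is the usual price of real-world design features that a one-line formula ignores.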
Abstract:
An 8-year-old crossbred dog was presented with a one-month history of progressive weakness, respiratory impairment and abdominal distension. Surgical exploration revealed the presence of a splenic mass that infiltrated the mesentery and was adherent to the stomach and pancreas. The mass was composed of highly cellular areas of spindle-shaped cells arranged in interlacing bundles, streams, whorls and storiform patterns (Antoni A pattern) and less cellular areas with more loosely arranged spindle to oval cells (Antoni B pattern). The majority of neoplastic cells expressed vimentin, S-100 and glial fibrillary acidic protein (GFAP), but did not express desmin, alpha-smooth muscle actin or factor VIII. These morphological and immunohistochemical findings characterized the lesion as a malignant peripheral nerve sheath tumour (PNST). Primary splenic PNST has not been documented previously in the dog.
Abstract:
This study aimed to examine the aetiology of acute diarrhoea and the relapse rate in 100 client-owned dogs presented to a first-opinion clinic. History, physical examination, faecal testing and owner questionnaire data were collected at initial presentation (T0) and at either the time of relapse or at a recheck performed within 3 months. All dogs received treatment according to their clinical signs. Of 96 dogs that completed the study, 37 (38.5%) relapsed during the study period, 21 (21.9%) relapsed within 3 months, and 16 others (16.6%) at 3 months to 1 year after initial examination. Dogs that had undergone a change in housing location within 1 month prior to presentation and dogs <1 year old were significantly more likely to have positive parasitological analyses (P=0.02 and P=0.001, respectively). Pica was a risk factor for relapse (P=0.0002).
Abstract:
Varved lake sediments are excellent natural archives providing quantitative insights into climatic and environmental changes at very high resolution and chronological accuracy. However, due to the multitude of responses within lake ecosystems it is often difficult to understand how climate variability interacts with other environmental pressures such as eutrophication, and to attribute observed changes to specific causes. This is particularly challenging during the past 100 years when multiple strong trends are superposed. Here we present a high-resolution multi-proxy record of sedimentary pigments and other biogeochemical data from the varved sediments of Lake Żabińskie (Masurian Lake District, north-eastern Poland, 54°N–22°E, 120 m a.s.l.) spanning AD 1907 to 2008. Lake Żabińskie exhibits biogeochemical varves with highly organic late summer and winter layers separated by white layers of endogenous calcite precipitated in early summer. The aim of our study is to investigate whether climate-driven changes and anthropogenic changes can be separated in a multi-proxy sediment data set, and to explore which sediment proxies are potentially suitable for long quantitative climate reconstructions. We also test whether laborious analytical techniques (e.g. HPLC) can be substituted by rapid scanning techniques (visible reflectance spectroscopy VIS-RS; 380–730 nm). We used principal component analysis and cluster analysis to show that the recent eutrophication of Lake Żabińskie can be discriminated from climate-driven changes for the period AD 1907–2008. The eutrophication signal (PC1 = 46.4%; TOC, TN, TS, Phe-b, high ratios of total carotenoids to chlorophyll-a derivatives, TC/CD) is mainly expressed as increasing aquatic primary production, increasing hypolimnetic anoxia and a change in the algal community from green algae to blue-green algae. The proxies diagnostic for eutrophication show a smooth positive trend between 1907 and ca 1980 followed by a very rapid increase from ca. 1980 ± 2 onwards. We demonstrate that PC2 (24.4%, Chl-a-related pigments) is not affected by the eutrophication signal, but instead is sensitive to spring (MAM) temperature (r = 0.63, pcorr < 0.05, RMSEP = 0.56 °C; 5-yr filtered). Limnological monitoring data (2011–2013) support this finding. We also demonstrate that scanning visible reflectance spectroscopy (VIS-RS) data can be calibrated to HPLC-measured chloropigment data and be used to infer concentrations of sedimentary Chl-a derivatives {pheophytin a + pyropheophytin a}. This offers the possibility for very high-resolution (multi)millennial-long paleoenvironmental reconstructions.
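The VIS-RS-to-HPLC calibration mentioned above amounts to regressing a fast scanning index against reference HPLC pigment concentrations, then applying the fitted transfer function to scans alone. A minimal sketch on synthetic data (all numbers invented; the real calibration targets chlorophyll-a derivatives):

```python
# Sketch of calibrating a rapid scanning proxy (a VIS-RS index) against
# reference HPLC pigment measurements with ordinary least squares.
# The data below are synthetic and purely illustrative.
import random

random.seed(0)

# Synthetic "truth": pigment concentration linearly related to the VIS-RS index
true_slope, true_intercept = 2.5, 1.0
vis_rs = [i / 10 for i in range(50)]  # scanning index values
hplc = [true_slope * x + true_intercept + random.gauss(0, 0.3) for x in vis_rs]

# Closed-form OLS fit for a single predictor
n = len(vis_rs)
mx = sum(vis_rs) / n
my = sum(hplc) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(vis_rs, hplc))
         / sum((x - mx) ** 2 for x in vis_rs))
intercept = my - slope * mx

# Transfer function: infer pigment concentration from a scan alone
def predict(x: float) -> float:
    return slope * x + intercept
```

Once validated against HPLC on a calibration set, the transfer function lets the cheap scanning measurement stand in for the laborious wet-chemistry one down the rest of the core.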
Abstract:
Biomass burning is a major source of greenhouse gases and influences regional to global climate. Pre-industrial fire-history records from black carbon, charcoal and other proxies provide baseline estimates of biomass burning at local to global scales spanning millennia, and are thus useful to examine the role of fire in the carbon cycle and climate system. Here we use the specific biomarker levoglucosan together with black carbon and ammonium concentrations from the North Greenland Eemian (NEEM) ice cores (77.49° N, 51.2° W; 2480 m a.s.l) over the past 2000 years to infer changes in boreal fire activity. Increases in boreal fire activity over the periods 1000–1300 CE and decreases during 700–900 CE coincide with high-latitude NH temperature changes. Levoglucosan concentrations in the NEEM ice cores peak between 1500 and 1700 CE, and most levoglucosan spikes coincide with the most extensive central and northern Asian droughts of the past millennium. Many of these multi-annual droughts are caused by Asian monsoon failures, thus suggesting a connection between low- and high-latitude climate processes. North America is a primary source of biomass burning aerosols due to its relative proximity to the Greenland Ice Cap. During major fire events, however, isotopic analyses of dust, back trajectories and links with levoglucosan peaks and regional drought reconstructions suggest that Siberia is also an important source of pyrogenic aerosols to Greenland.
Abstract:
OBJECTIVES The aim of this study was to assess the safety of the concurrent administration of a clopidogrel and prasugrel loading dose in patients undergoing primary percutaneous coronary intervention. BACKGROUND Prasugrel is one of the preferred P2Y12 platelet receptor antagonists for ST-segment elevation myocardial infarction (STEMI) patients; its use, however, has been evaluated clinically only in clopidogrel-naive patients. METHODS Between September 2009 and October 2012, a total of 2,023 STEMI patients were enrolled in the COMFORTABLE (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI]) and the SPUM-ACS (Inflammation and Acute Coronary Syndromes) studies. Patients receiving a prasugrel loading dose were divided into 2 groups: 1) those given clopidogrel and a subsequent prasugrel loading dose; and 2) those given a prasugrel loading dose alone. The primary safety endpoint was Bleeding Academic Research Consortium types 3 to 5 bleeding in hospital at 30 days. RESULTS Of 2,023 patients undergoing primary percutaneous coronary intervention, 427 (21.1%) received clopidogrel and a subsequent prasugrel loading dose, 447 (22.1%) received a prasugrel loading dose alone, and the remainder received clopidogrel only. At 30 days, the primary safety endpoint was observed in 1.9% of those receiving clopidogrel and a subsequent prasugrel loading dose and 3.4% of those receiving a prasugrel loading dose alone (adjusted hazard ratio [HR]: 0.57; 95% confidence interval [CI]: 0.25 to 1.30, p = 0.18). The HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly) bleeding score tended to be higher in prasugrel-treated patients (p = 0.076). The primary safety endpoint results, however, remained unchanged after adjustment for these differences (clopidogrel and a subsequent prasugrel loading dose vs. prasugrel only; HR: 0.54 [95% CI: 0.23 to 1.27], p = 0.16).
No differences in the composite of cardiac death, myocardial infarction, or stroke were observed at 30 days (adjusted HR: 0.66, 95% CI: 0.27 to 1.62, p = 0.36). CONCLUSIONS This observational, nonrandomized study of ST-segment elevation myocardial infarction patients suggests that the administration of a loading dose of prasugrel in patients pre-treated with a loading dose of clopidogrel is not associated with an excess of major bleeding events. (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI] [COMFORTABLE]; NCT00962416; and Inflammation and Acute Coronary Syndromes [SPUM-ACS]; NCT01000701).
Abstract:
Knowledge about vegetation and fire history of the mountains of Northern Sicily is scanty. We analysed five sites to fill this gap and used terrestrial plant macrofossils to establish robust radiocarbon chronologies. Palynological records from Gorgo Tondo, Gorgo Lungo, Marcato Cixé, Urgo Pietra Giordano and Gorgo Pollicino show that under natural or near natural conditions, deciduous forests (Quercus pubescens, Q. cerris, Fraxinus ornus, Ulmus), that included a substantial portion of evergreen broadleaved species (Q. suber, Q. ilex, Hedera helix), prevailed in the upper meso-mediterranean belt. Mesophilous deciduous and evergreen broadleaved trees (Fagus sylvatica, Ilex aquifolium) dominated in the natural or quasi-natural forests of the oro-mediterranean belt. Forests were repeatedly opened for agricultural purposes. Fire activity was closely associated with farming, providing evidence that burning was a primary land use tool since Neolithic times. Land use and fire activity intensified during the Early Neolithic at 5000 bc, at the onset of the Bronze Age at 2500 bc and at the onset of the Iron Age at 800 bc. Our data and previous studies suggest that the large majority of open land communities in Sicily, from the coastal lowlands to the mountain areas below the thorny-cushion Astragalus belt (ca. 1,800 m a.s.l.), would rapidly develop into forests if land use ceased. Mesophilous Fagus-Ilex forests developed under warm mid Holocene conditions and were resilient to the combined impacts of humans and climate. The past ecology suggests a resilience of these summer-drought adapted communities to climate warming of about 2 °C. Hence, they may be particularly suited to provide heat and drought-adapted Fagus sylvatica ecotypes for maintaining drought-sensitive Central European beech forests under global warming conditions.
Abstract:
The spatial distributions of species of tree ≥10 cm gbh were examined in two 4 ha plots and related to the local variation in topography and soil chemistry. The plots were similar in their species composition, particularly in terms of the densities of small trees, and they showed very similar edaphic characteristics. Size class distributions varied little within and between plots. Ordination of 0.25 ha subplots highlighted parallel gradients in the vegetation of both plots when the densities of trees ≥10 cm gbh were considered. Focusing on understorey trees in the 10-<50 cm gbh class at the 0.04 ha subplot scale showed a similar vegetation gradient in both plots closely associated with change from lower slope to ridge. No relationship with soil chemistry was found. On the ridges a special group of understorey species formed clumps and these species contributed importantly to the ordinations. Borneo has a regional history of occasionally severe droughts. It is suggested here that the observed patterns in the understorey are due to differential responses to low soil water supply, the ridges probably tending to dryness more than the lower slopes. Within the large and diverse family Euphorbiaceae, which dominates the understorey at Danum, there may be ecophysiological groupings of species. The long-term effects of disturbance interacting with local edaphic factors on forest structure and composition are discussed.
Abstract:
Primary ciliary dyskinesia (PCD) is a rare heterogeneous recessive genetic disorder of motile cilia, leading to chronic upper and lower respiratory symptoms. Prevalence is estimated at around 1:10,000, but many patients remain undiagnosed, while others receive the label incorrectly. Proper diagnosis is complicated by the fact that the key symptoms, such as wet cough, chronic rhinitis and recurrent upper and lower respiratory infection, are common and nonspecific. There is no single gold standard test to diagnose PCD. Presently, the diagnosis is made in patients with a compatible medical history and physical examination following a demanding combination of tests including nasal nitric oxide, high-speed video microscopy, transmission electron microscopy, genetics, and ciliary culture. These tests are costly and need sophisticated equipment and experienced staff, restricting use to highly specialised centres. Therefore, it would be desirable to have a screening test for identifying those patients who should undergo detailed diagnostic testing. Three recent studies focused on potential screening tools: one paper assessed the validity of nasal nitric oxide for screening, and two studies developed new symptom-based screening tools. These simple tools are welcome, and will hopefully remind physicians whom to refer for definitive testing. However, they have been developed in tertiary care settings, where 10 to 50% of tested patients have PCD. Sensitivity and specificity of the tools are reasonable, but positive and negative predictive values may be poor in primary or secondary care settings. While these studies take an important step forward towards an earlier diagnosis of PCD, more remains to be done before we have tools tailored to different health care settings.
Abstract:
Symptoms of primary ciliary dyskinesia (PCD) are nonspecific and guidance on whom to refer for testing is limited. Diagnostic tests for PCD are highly specialised, requiring expensive equipment and experienced PCD scientists. This study aims to develop a practical clinical diagnostic tool to identify patients requiring testing. Patients consecutively referred for testing were studied. Information readily obtained from patient history was correlated with diagnostic outcome. Using logistic regression, the predictive performance of the best model was tested by receiver operating characteristic curve analyses. The model was simplified into a practical tool (PICADAR) and externally validated in a second diagnostic centre. Of 641 referrals with a definitive diagnostic outcome, 75 (12%) were positive. PICADAR applies to patients with persistent wet cough and has seven predictive parameters: full-term gestation, neonatal chest symptoms, neonatal intensive care admittance, chronic rhinitis, ear symptoms, situs inversus and congenital cardiac defect. Sensitivity and specificity of the tool were 0.90 and 0.75 for a cut-off score of 5 points. Area under the curve for the internally and externally validated tool was 0.91 and 0.87, respectively. PICADAR represents a simple diagnostic clinical prediction rule with good accuracy and validity, ready for testing in respiratory centres referring to PCD centres.
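The warning in the nearby screening-tool abstract, that predictive values may be poor outside tertiary care, follows directly from Bayes' rule. Using PICADAR's reported sensitivity (0.90) and specificity (0.75) and the 12% prevalence in this referral population, with an assumed 1% prevalence standing in for primary care:

```python
# Why a tool validated where ~12% of referrals have PCD may perform poorly
# in primary care: positive predictive value (PPV) depends on prevalence.
# Sensitivity 0.90 and specificity 0.75 come from the abstract; the 1%
# primary-care prevalence is an assumption for illustration.
def ppv(sens: float, spec: float, prev: float) -> float:
    true_pos = sens * prev              # fraction of all patients: true positives
    false_pos = (1 - spec) * (1 - prev)  # fraction of all patients: false positives
    return true_pos / (true_pos + false_pos)

print(f"referral centre (12% prevalence): PPV = {ppv(0.90, 0.75, 0.12):.2f}")
print(f"primary care    ( 1% prevalence): PPV = {ppv(0.90, 0.75, 0.01):.2f}")
```

At 12% prevalence the PPV is about 0.33; at 1% it collapses to roughly 0.04, meaning most positives would be false alarms, which is why a rule validated in referral centres cannot simply be transplanted to primary care.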