988 results for Long-term monitoring
Abstract:
The depth-dependent attenuation of the secondary cosmic-ray particle flux due to snow cover, and its effect on the production rates of cosmogenic nuclides, constitutes a potential source of uncertainty for studies conducted in regions characterized by frequent seasonal snow burial. Recent experimental and numerical modelling studies have yielded new constraints on the effect of hydrogen-rich media on the production rates of cosmogenic nuclides by low- and high-energy neutrons (<10⁻³ MeV and >10² MeV, respectively). Here we present long-term neutron-detector monitoring data from a natural setting that we use to quantify the effect of snow cover on the attenuation of fast neutrons (0.1-10 MeV), which are responsible for the production of ²¹Ne from Mg and ³⁶Cl from K. We use data measured between July 2001 and May 2008 at seven stations located throughout the Ecrins-Pelvoux massif (French Western Alps) and its surroundings, at elevations ranging from 200 to 2500 m a.s.l. From the cosmic-ray fluxes recorded during summer, when snow is absent, we infer an apparent attenuation length of 148 g cm⁻² in the atmosphere at a latitude of ~45°N and for altitudes ranging from ~200 to 2500 m a.s.l. Using snow water-equivalent (SWE) values obtained through snow-coring campaigns that overlap in time with the neutron monitoring at five stations, we show that fast neutrons are attenuated much more strongly in snow than predicted by a conventional mass-shielding formulation using the attenuation length estimated in the atmosphere. We suggest that such strong attenuation results from boundary effects at the atmosphere/snow interface induced by the high efficiency of water as a neutron moderator. Finally, we propose an empirical model for calculating snow-shielding correction factors as a function of SWE for studies using ²¹Ne and ³⁶Cl analyses in Mg- and K-rich minerals, respectively. This empirical model is of interest for cosmic-ray exposure-dating studies, particularly when the target rocks are mafic to ultramafic units in regions where seasonal snow cover is common.
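For reference, the conventional mass-shielding formulation mentioned above scales a production rate by exp(-SWE/Λ), with SWE in g cm⁻² and Λ an attenuation length. A minimal Python sketch, using the 148 g cm⁻² atmospheric value from the abstract purely for illustration (the paper's point is that the effective Λ in snow is substantially smaller, and its empirical model is not reproduced here):

```python
import math

def snow_shielding_conventional(swe_g_cm2: float,
                                attenuation_g_cm2: float = 148.0) -> float:
    """Conventional mass-shielding correction factor exp(-SWE / Lambda).

    swe_g_cm2: snow water equivalent in g/cm^2 (1 cm of water = 1 g/cm^2).
    attenuation_g_cm2: attenuation length; 148 g/cm^2 is the atmospheric
    value inferred in the abstract, used here only for illustration.
    """
    return math.exp(-swe_g_cm2 / attenuation_g_cm2)

# Example: 50 cm of snow at density 0.3 g/cm^3 -> SWE = 15 g/cm^2
print(snow_shielding_conventional(15.0))  # ~0.90, i.e. ~10% shielding
```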
Abstract:
AIMS: Device-based remote monitoring (RM) has been linked to improved clinical outcomes at short- to medium-term follow-up. Whether this benefit extends to long-term follow-up is unknown. We sought to assess the effect of device-based RM on long-term clinical outcomes in recipients of implantable cardioverter-defibrillators (ICD). METHODS: We performed a retrospective cohort study of consecutive patients who underwent ICD implantation for primary prevention. RM was initiated with patient consent according to the availability of RM hardware at implantation. Patients with concomitant cardiac resynchronization therapy were excluded. Data on hospitalizations, mortality and cause of death were systematically assessed using a nationwide healthcare platform. A Cox proportional hazards model was employed to estimate the effect of RM on mortality and on a composite endpoint of cardiovascular mortality and hospital admission due to heart failure (HF). RESULTS: 312 patients were included, with a median follow-up of 37.7 months (range 1 to 146). 121 patients (38.2%) were under RM from the first outpatient visit post-ICD, and 191 were in conventional follow-up. No differences were found regarding age, left ventricular ejection fraction, heart failure etiology or NYHA class at implantation. Patients under RM had higher long-term survival (hazard ratio [HR] 0.50, CI 0.27-0.93, p=0.029) and a lower incidence of the composite outcome (HR 0.47, CI 0.27-0.82, p=0.008). In multivariate survival analysis, overall survival was independently associated with younger age, higher LVEF, NYHA class lower than 3, and RM. CONCLUSION: RM was independently associated with increased long-term survival and a lower incidence of a composite endpoint of hospitalization for HF or cardiovascular mortality.
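For illustration, the kind of Cox proportional-hazards analysis described can be set up with the lifelines library; the data frame below is entirely hypothetical (toy values), and the study's actual covariate set and coding are not given in the abstract:

```python
# Hypothetical sketch of a Cox proportional-hazards fit (lifelines library);
# rows and column names are illustrative, not the study's data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months": [37.7, 12.0, 60.0, 24.5, 80.0, 9.0, 48.0, 30.0],  # follow-up
    "event":  [0, 1, 0, 1, 0, 1, 0, 1],    # 1 = death, 0 = censored
    "rm":     [1, 0, 1, 1, 0, 0, 1, 0],    # 1 = remote monitoring
    "age":    [64, 71, 58, 69, 60, 75, 55, 68],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
cph.print_summary()  # hazard ratios with CIs and p-values, as reported above
```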
Abstract:
Invasive insects that successfully establish in introduced areas can significantly alter natural communities. These pests require specific establishment criteria (e.g. host suitability) that, when known, can help quantify potential damage to infested areas. Emerald ash borer (Agrilus planipennis [Coleoptera: Buprestidae]) is an invasive phloem-feeding pest responsible for the death of millions of ash trees (Fraxinus spp. L.). Over 200 surviving ash trees were previously identified in the Huron-Clinton Metroparks in southeast Michigan. Trees were assessed over a four-year period, and a hierarchical cluster analysis was performed on dieback, vigor, and the presence of signs and symptoms in order to place trees into one of three tolerance groups. The clustering of trees with different responses to emerald ash borer attack suggests that there are different tolerance levels in North American ash trees in southeastern Michigan; the groups were designated apparently tolerant, intermediate, and not tolerant. Adult landing rates and evidence of adult emergence were significantly lower in the apparently tolerant group than in the not-tolerant group, but larval survival from eggs placed on trees did not differ between tolerance groups. It therefore appears that apparently tolerant trees survive because they are less attractive to adult beetles, which results in fewer eggs being laid on them. Trees in the apparently tolerant group remained of higher vigor over the four years of the study. North American ash may survive the emerald ash borer epidemic through natural variation and inherent resistance, despite the lack of a co-evolutionary history with emerald ash borer.
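The abstract does not give the clustering settings; as a purely illustrative sketch, an agglomerative (hierarchical) clustering on per-tree measurements cut into three groups might look as follows (the data, variable coding and Ward linkage are all assumptions):

```python
# Illustrative hierarchical cluster analysis: group trees into three
# tolerance classes from dieback, vigor, and a signs/symptoms score.
# All values and the Ward linkage choice are hypothetical.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# rows = trees; columns = [% canopy dieback, vigor rating, signs/symptoms]
X = np.array([
    [10, 4, 1],
    [15, 4, 1],
    [35, 3, 2],
    [40, 3, 2],
    [60, 2, 3],
    [70, 1, 4],
], dtype=float)

Z = linkage(X, method="ward")                     # agglomerative tree
groups = fcluster(Z, t=3, criterion="maxclust")   # cut into 3 clusters
print(groups)  # tolerance-group label per tree, e.g. [1 1 2 2 3 3]
```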
Abstract:
Background: Despite antihypertensive therapy, it is difficult to maintain optimal systemic blood pressure (BP) values in hypertensive patients (HPT). Exercise may reduce BP in untreated HPT. However, evidence regarding its effect under long-term antihypertensive therapy is lacking. Our purpose was to evaluate the acute effects of 40-minute continuous (CE) or interval exercise (IE) on cycle ergometers on BP in long-term treated HPT. Methods: Fifty-two treated HPT were randomized to CE (n=26) or IE (n=26) protocols. CE was performed at 60% of reserve heart rate (HR). IE alternated consecutively 2 min at 50% of reserve HR with 1 min at 80%. Two 24-h ambulatory BP monitoring sessions were performed, after exercise (postexercise) or after a non-exercise control period (control), in random order. Results: CE reduced mean 24-h systolic (S) BP (2.6 ± 6.6 mm Hg, p=0.05) and diastolic (D) BP (2.3 ± 4.6, p=0.01), as well as nighttime SBP (4.8 ± 6.4, p<0.001) and DBP (4.6 ± 5.2 mm Hg, p=0.001). IE reduced 24-h SBP (2.8 ± 6.5, p=0.03) and nighttime SBP (3.4 ± 7.2, p=0.02), and tended to reduce nighttime DBP (p=0.06). Greater reductions occurred at higher BP levels. The percentage of normal ambulatory BP values increased after CE (24-h: 42% to 54%; daytime: 42% to 61%; nighttime: 61% to 69%) and IE (24-h: 31% to 46%; daytime: 54% to 61%; nighttime: 46% to 69%). Conclusion: CE and IE reduced ambulatory BP in treated HPT, increasing the number of patients reaching normal ambulatory BP values. These effects suggest that continuous and interval aerobic exercise may have a role in BP management in treated HPT.
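The protocol intensities above (50%, 60% and 80% of reserve HR) presumably follow the standard heart-rate-reserve (Karvonen) calculation, which the abstract does not spell out; a small sketch under that assumption:

```python
# Target heart rate as a fraction of heart-rate reserve (Karvonen formula).
# The abstract does not give the formula; the standard form is assumed here.
def target_hr(hr_rest: float, hr_max: float, intensity: float) -> float:
    """intensity is a fraction of heart-rate reserve, e.g. 0.60 for 60%."""
    return hr_rest + intensity * (hr_max - hr_rest)

# Hypothetical example: resting HR 70 bpm, maximal HR 170 bpm
for pct in (0.50, 0.60, 0.80):
    print(f"{pct:.0%} of reserve HR -> {target_hr(70, 170, pct):.0f} bpm")
```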
Abstract:
The adaptations of muscle to sprint training can be separated into metabolic and morphological changes. Enzyme adaptations represent a major metabolic adaptation to sprint training, with the enzymes of all three energy systems showing signs of adaptation to training and some evidence of a return to baseline levels with detraining. Myokinase and creatine phosphokinase have shown small increases as a result of short-sprint training in some studies, and elite sprinters appear better able than the sub-elite to rapidly break down phosphocreatine (PCr). No changes in these enzyme levels have been reported as a result of detraining. Similarly, glycolytic enzyme activity (notably lactate dehydrogenase, phosphofructokinase and glycogen phosphorylase) has been shown to increase after training consisting of either long (>10-second) or short (<10-second) sprints. Evidence suggests that these enzymes return to pre-training levels after somewhere between 7 weeks and 6 months of detraining. Mitochondrial enzyme activity also increases after sprint training, particularly when long sprints or short recovery between short sprints are used as the training stimulus. Morphological adaptations to sprint training include changes in muscle fibre type, sarcoplasmic reticulum, and fibre cross-sectional area. An appropriate sprint training programme could be expected to induce a shift toward type IIa muscle, increase muscle cross-sectional area and increase sarcoplasmic reticulum volume to aid the release of Ca2+. A training volume and/or frequency of sprint training in excess of what is optimal for an individual, however, will induce a shift toward slower muscle contractile characteristics. In contrast, detraining appears to shift the contractile characteristics towards type IIb, although muscle atrophy is also likely to occur. Muscle conduction velocity appears to be a potential non-invasive method of monitoring contractile changes in response to sprint training and detraining. In summary, adaptation to sprint training is clearly dependent on the duration of sprinting, recovery between repetitions, total volume and frequency of training bouts. These variables have profound effects on the metabolic, structural and performance adaptations resulting from a sprint-training programme, and these changes take a considerable period of time to return to baseline after a period of detraining. However, the complexity of the interaction between these variables and training adaptation, combined with individual differences, clearly complicates the transfer of knowledge and advice from laboratory to coach to athlete.
Abstract:
OBJECTIVE: With the increased use of intracoronary stents, in-stent restenosis has become a clinically significant drawback in invasive cardiology. We retrospectively assessed the short- and long-term outcomes after excimer laser coronary angioplasty of in-stent restenosis. METHODS: Twenty-five patients with 33 incidents of in-stent restenosis treated with excimer laser coronary angioplasty (ELCA) were analyzed. Sixty-six percent were male, with a mean age of 73±11 years, and 83% were functional class III-IV (NYHA). ELCA was performed using 23 concentric and 10 eccentric catheters with diameters of 1.6-2.2 mm, followed by balloon angioplasty (PTCA) and ultrasound monitoring. The procedure was performed in the following vessels: left anterior descending artery, 10; left circumflex artery, 8; right coronary artery, 6; left main coronary artery, 2; and venous bypass graft, 7. RESULTS: ELCA was successful in 71% of cases, and PTCA was 100% successful. The diameter of the treated vessels was 3.44±0.5 mm; the minimal luminal diameter (MLD) increased from 0.30 mm pre-ELCA to 1.97 mm post-ELCA and to 2.94 mm post-PTCA (p<0.001). The percent stenosis was reduced from 91.4±9.5% before ELCA to 42.3±14.9% after ELCA and to 14.6±9.3% after PTCA (p<0.001). Seventeen (68%) patients were asymptomatic at 6 months and 15 (60%) at 1 year. New restenosis rates were 8/33 (24.2%) at 6 months and 9/33 (27.3%) at 12 months. CONCLUSION: ELCA is safe and effective for the treatment of in-stent restenosis. In the present sample, a slight increase in new restenotic lesions between 6 and 12 months was found.
Abstract:
The article presents a method for the long-term forecasting of frame alignment losses, based on bit-error-rate (BER) monitoring, for structure-agnostic circuit emulation service over Ethernet in a mobile backhaul network. The method and its accompanying algorithm detect instants of probable frame alignment loss far enough in advance to give engineering personnel extra time to take preventive measures. Moreover, the long-term forecast of frame alignment losses supports decisions about the volume of TDM data encapsulated into each circuit emulation frame, increasing the utilization of the emulated circuit. The formalized method can be regarded as cognitive and can act as part of a predictive network monitoring system.
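The abstract does not disclose the forecasting algorithm itself; one simple way to realize such a forecast, shown purely as an assumption-laden sketch, is to fit a trend to logged BER samples and extrapolate the instant at which the BER would cross a frame-alignment-loss threshold (the threshold and sample data below are illustrative):

```python
# Illustrative BER-trend extrapolation (not the paper's actual algorithm):
# fit a linear trend to log10(BER) and predict when it crosses a threshold
# above which frame alignment loss becomes probable.
import numpy as np

def hours_until_loss(t_hours: np.ndarray, ber: np.ndarray,
                     ber_threshold: float = 1e-3):
    """Estimated hours from the last sample until BER crosses the threshold,
    or None if the BER trend is flat or improving."""
    slope, intercept = np.polyfit(t_hours, np.log10(ber), 1)
    if slope <= 0:
        return None
    t_cross = (np.log10(ber_threshold) - intercept) / slope
    return max(t_cross - t_hours[-1], 0.0)

# Hypothetical hourly BER samples drifting upward from 1e-6
t = np.arange(24.0)
ber = 10.0 ** (-6 + 0.05 * t)
print(hours_until_loss(t, ber))  # e.g. 37.0 hours of warning
```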
Abstract:
Background. Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. Methods. On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with a virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). Results. Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had 1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence were becoming smaller in later years. Conclusions. Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
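As a sketch of the upper-estimate calculation described above, namely the frequency-weighted average of risk-group-specific probabilities of harboring resistance mutations (the group sizes and probabilities below are entirely hypothetical):

```python
# Upper estimate as a frequency-weighted average of risk-group-specific
# probabilities of harboring resistance mutations. Numbers are hypothetical.
risk_groups = [
    # (patients in group, P(resistance mutations present))
    (1500, 0.75),  # e.g. prior mono-/dual-NRTI exposure
    (2500, 0.40),
    (4000, 0.10),  # e.g. first-line boosted-PI combination ART
]

total = sum(n for n, _ in risk_groups)
upper_estimate = sum(n * p for n, p in risk_groups) / total
print(f"Upper estimate: {upper_estimate:.1%}")  # -> 31.6% for these numbers
```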
Abstract:
BACKGROUND: The dissemination of palliative care for patients presenting with complex chronic diseases at various stages has become an important matter of public health. A death census in Swiss long-term care (LTC) facilities was set up with the aim of monitoring the frequency of selected indicators of palliative care. METHODS: The survey covered 150 LTC facilities (105 nursing homes and 45 home health services), each of which was asked to complete a questionnaire for every non-accidental death over a period of six months. The frequency of 4 selected indicators of palliative care (recourse to a specialized palliative care service, administration of opiates, use of a pain measurement scale, and use of any other symptom measurement scale) was monitored with respect to the stage of care and analysed by gender, age, medical condition and place of residence. RESULTS: Overall, 1200 deaths were reported, 29.1% of which were related to cancer. The frequency of each indicator varied according to the type of LTC facility, most notably for the administration of opiates. Access to palliative care remained associated with cancer, terminal care and, in part, with age, whereas gender and the presence of mental disorders had no effect on the indicators. In addition, the use of drugs was much more frequent than the other indicators. CONCLUSION: The profile of patients with access to palliative care must become more diversified. Among other recommendations, equal access to opiates in nursing homes and home health services, palliative care at an earlier stage, and the systematic use of symptom management scales when resorting to opiates should become matters of prime concern.
Abstract:
BACKGROUND: Persistent metallic intraocular foreign bodies (IOFB) with ferrous content have been associated with ocular siderosis and retinal degeneration. We describe two patients in whom a metallic, iron-containing IOFB was left embedded in the choroid and sclera for many years after penetrating through the vitreous and the retina. HISTORY AND SIGNS: Two male patients, aged 41 and 48 years, presented with a metallic IOFB sustained during a work accident involving metal tools. THERAPY AND OUTCOME: In the first patient, surgery was deemed unwise because the IOFB was lodged very deeply in the choroid and sclera in the inferior temporal quadrant. The second patient underwent pars plana vitrectomy, but the IOFB could not be removed surgically as it was too deeply embedded in the sclera and choroid. After 6 years (Case 1) and 4 years (Case 2) of follow-up, visual acuity remained at 1.0, and the IOFB was encased in a fibrotic capsule in both cases. Full-field and multifocal electroretinograms showed an inter-ocular asymmetry at baseline, which remained stable during follow-up. CONCLUSIONS: Ocular siderosis may not develop in patients with a deeply embedded metallic IOFB. Regular monitoring of both visual function and the electroretinogram is mandatory when the IOFB is left inside the eye.
Abstract:
The National Council on Ageing and Older People has long been concerned about the quality of long-term residential care for older people in Ireland. In 1986, its predecessor, the National Council for the Aged, published "It's Our Home": The Quality of Life in Private and Voluntary Nursing Homes. In 1999 the Council commissioned a postal survey of all long-term residential care facilities in the country to determine whether facilities had quality initiatives in operation, providers' views and aspirations for the future provision of long-term care, and providers' views on the introduction of a national quality monitoring policy.
Abstract:
In 1999 the National Council on Ageing and Older People commissioned a postal survey of all long-term residential care facilities in the country to determine:
- whether facilities had quality initiatives in operation
- providers' views and aspirations for the future provision of long-term care
- providers' views on the introduction of a national quality monitoring policy
This report is the outcome of the programme of work conducted by the Council on the quality of long-term residential care provision for older people in Ireland. The aim of the report is to provide a framework for developing quality in long-term residential care settings, with a focus on the well-being, dignity and autonomy of older residents.
Abstract:
Imatinib (Glivec®) has transformed the treatment and short-term prognosis of chronic myeloid leukaemia (CML) and gastrointestinal stromal tumour (GIST). However, the treatment must be taken indefinitely and is not devoid of inconvenience and toxicity. Moreover, resistance to or escape from disease control occurs in a significant number of patients. Imatinib is a substrate of the cytochromes P450 CYP3A4/5 and of the multidrug transporter P-glycoprotein (product of the MDR1 gene), and it also binds to alpha1-acid glycoprotein (AAG) in plasma. Considering the large inter-individual differences in the expression and function of these systems, the disposition and clinical activity of imatinib can be expected to vary widely among patients, calling for dosage individualisation. The aim of this exploratory study was to determine the average pharmacokinetic parameters characterizing the disposition of imatinib in the target population, to assess their inter-individual variability, and to identify influential factors affecting them. A total of 321 plasma concentrations were measured in 59 patients receiving Glivec® at diverse dosage regimens, using a validated chromatographic method developed for this study. The results were analysed by non-linear mixed-effect modelling (NONMEM). A one-compartment model with first-order absorption described the data appropriately, with an average apparent clearance of 12.4 L/h, a volume of distribution of 268 L and an absorption rate constant of 0.47 h⁻¹. Clearance was affected by body weight, age and sex. No influence of interacting drugs was found. DNA samples were used for pharmacogenetic exploration. The MDR1 polymorphism 3435C>T and the AAG phenotype appear to modulate the disposition of imatinib. Large inter-individual variability (CV%) remained unexplained by the demographic covariates considered, both in clearance (40%) and in distribution volume (71%). Together with intra-patient variability (34%), this translates into an 8-fold width of the 90% prediction interval of plasma concentrations expected under a fixed dosing regimen. This is a strong argument for further investigating the possible usefulness of a therapeutic drug monitoring programme for imatinib. It may help individualise the dosing regimen before overt disease progression or treatment toxicity is observed, thus improving both the long-term therapeutic effectiveness and the tolerability of this drug.
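Using the reported population means, the concentration-time course implied by a one-compartment model with first-order absorption can be sketched as follows (single oral dose, no covariate or variability terms; the dose and bioavailability handling are illustrative, and the published NONMEM model is not reproduced):

```python
# One-compartment model with first-order absorption, parameterized with the
# population means reported above (apparent oral values). Illustrative only.
import math

CL, V, KA = 12.4, 268.0, 0.47    # clearance L/h, volume L, absorption 1/h
KE = CL / V                       # elimination rate constant, ~0.046 1/h

def concentration(dose_mg: float, t_h: float) -> float:
    """Plasma concentration (mg/L) at t_h hours after a single oral dose."""
    return (dose_mg * KA / (V * (KA - KE))
            * (math.exp(-KE * t_h) - math.exp(-KA * t_h)))

# Hypothetical example: 24 h after a single 400 mg dose
print(f"{concentration(400, 24):.2f} mg/L")  # ~0.55 mg/L
```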