59 results for Rectangular protocol in field
Abstract:
OBJECTIVE: To assess whether breath acetone concentration can be used to monitor the effects of prolonged physical activity on whole-body lipolysis and hepatic ketogenesis under field conditions. METHODS: Twenty-three non-diabetic, 11 type 1 diabetic, and 17 type 2 diabetic subjects provided breath and blood samples for this study. Samples were collected during the International Four Days Marches in the Netherlands. For each participant, breath acetone concentration was measured using proton transfer reaction ion trap mass spectrometry before and after a 30-50 km walk on four consecutive days. Blood non-esterified fatty acid (NEFA), beta-hydroxybutyrate (BOHB), and glucose concentrations were measured after walking. RESULTS: Breath acetone concentration was significantly higher after walking than before, and was positively correlated with blood NEFA and BOHB concentrations. The effect of walking on breath acetone concentration was observed repeatedly on all four consecutive days. Breath acetone concentrations were higher in type 1 diabetic subjects and lower in type 2 diabetic subjects than in control subjects. CONCLUSIONS: Breath acetone can be used to monitor hepatic ketogenesis during walking under field conditions. It may therefore provide real-time information on fat burning, which may be of use for monitoring lifestyle interventions.
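A minimal sketch of the two analyses this abstract reports, a paired pre/post comparison and a correlation of breath acetone with blood ketones, using hypothetical data; the abstract does not state which statistical tests were applied.

```python
import numpy as np
from scipy import stats

# Hypothetical breath acetone concentrations (ppb) for 10 walkers
acetone_pre = np.array([310, 280, 450, 390, 520, 300, 610, 420, 350, 480])
acetone_post = np.array([720, 650, 980, 810, 1100, 590, 1400, 900, 700, 1050])

# Hypothetical post-walk blood beta-hydroxybutyrate (mmol/L)
bohb_post = np.array([0.31, 0.25, 0.52, 0.40, 0.61, 0.22, 0.88, 0.47, 0.33, 0.58])

# Paired comparison: is breath acetone higher after walking than before?
t_stat, p_paired = stats.ttest_rel(acetone_post, acetone_pre)
print(f"pre vs. post: t = {t_stat:.2f}, p = {p_paired:.4g}")

# Correlation of post-walk breath acetone with blood BOHB
r, p_corr = stats.pearsonr(acetone_post, bohb_post)
print(f"acetone vs. BOHB: r = {r:.2f}, p = {p_corr:.4g}")
```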
Abstract:
Body mass and body condition are often tightly linked to animal health and fitness in the wild and thus are key measures for ecophysiologists and behavioral ecologists. In some animals, such as large seabird species, obtaining indexes of structural size is relatively easy, whereas measuring body mass under specific field circumstances may be more of a challenge. Here, we suggest an alternative, easily measurable, and reliable surrogate of body mass in field studies: body girth. Using 234 free-living king penguins (Aptenodytes patagonicus) at various stages of molt and breeding, we measured body girth under the flippers, body mass, and bill and flipper length. We found that body girth was strongly and positively related to body mass in both molting (R² = 0.91) and breeding (R² = 0.73) birds, with the mean error around our predictions being 6.4%. Body girth appeared to be a reliable proxy for body mass because the relationship did not vary according to year and experimenter, bird sex, or stage within breeding groups. Body girth was, however, a weak proxy for body mass in birds at the end of molt, probably because most of those birds had reached a critical depletion of energy stores. Body condition indexes established from ordinary least squares regressions of either body girth or body mass on structural size were highly correlated (r_s = 0.91), suggesting that body girth was as good as body mass in establishing body condition indexes in king penguins. Body girth may prove a useful proxy for body mass when estimating body condition in field investigations and could likely provide similar information in other penguins and large animals that may be complicated to weigh in the wild.
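A sketch of the body-condition-index construction described above: regress body mass (or body girth) on a structural size measure by ordinary least squares and use the residuals as the condition index. All data below are simulated for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
flipper = rng.normal(340, 10, 50)               # structural size (mm), simulated
mass = 0.05 * flipper + rng.normal(0, 1.0, 50)  # body mass (kg), simulated
girth = 4.0 * mass + rng.normal(0, 2.0, 50)     # body girth (cm), simulated

def condition_index(y, size):
    """Residuals from the OLS regression of y on structural size."""
    slope, intercept = np.polyfit(size, y, 1)
    return y - (intercept + slope * size)

# Spearman rank correlation between mass-based and girth-based indexes
# (the study reports r_s = 0.91 for the real data)
rho, p = spearmanr(condition_index(mass, flipper), condition_index(girth, flipper))
print(f"r_s = {rho:.2f} (p = {p:.3g})")
```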
Abstract:
BACKGROUND AND STUDY AIMS: The current gold standard in Barrett's esophagus monitoring consists of four-quadrant biopsies every 1-2 cm in accordance with the Seattle protocol. Adding brush cytology processed by digital image cytometry (DICM) may further increase the detection of patients with Barrett's esophagus who are at risk of neoplasia. The aim of the present study was to assess the additional diagnostic value and accuracy of DICM when added to standard histological analysis in a cross-sectional multicenter study of patients with Barrett's esophagus in Switzerland. METHODS: One hundred sixty-four patients with Barrett's esophagus underwent 239 endoscopies with biopsy and brush cytology. DICM was carried out on 239 cytology specimens. Measures of the test accuracy of DICM (relative risk, sensitivity, specificity, likelihood ratios) were obtained by dichotomizing the histopathology results (high-grade dysplasia or adenocarcinoma vs. all others) and the DICM results (aneuploidy/intermediate pattern vs. diploidy). RESULTS: DICM revealed diploidy in 83% of 239 endoscopies, an intermediate pattern in 8.8%, and aneuploidy in 8.4%. An intermediate DICM result carried a relative risk (RR) of 12, and aneuploidy an RR of 27, for high-grade dysplasia/adenocarcinoma. Adding DICM to the standard biopsy protocol, a pathological cytometry result (aneuploid or intermediate) was found in 25 of 239 endoscopies (11%; 18 patients) with low-risk histology (no high-grade dysplasia or adenocarcinoma). During follow-up of 14 of these 18 patients, histological deterioration was seen in 3 (21%). CONCLUSION: DICM from brush cytology may add important information to a standard biopsy protocol by identifying a subgroup of BE patients with high-risk cellular abnormalities.
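A sketch of the test-accuracy computation described in the methods: dichotomize DICM (aneuploid/intermediate vs. diploid) and histology (high-grade dysplasia/adenocarcinoma vs. all others) into a 2x2 table, then derive relative risk, sensitivity, specificity, and likelihood ratios. Only fp = 25 (pathological cytometry with low-risk histology) comes from the abstract; the other counts are hypothetical.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Accuracy measures from a 2x2 table of test result vs. disease status."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    # Relative risk of HGD/adenocarcinoma given a pathological DICM result
    rr = (tp / (tp + fp)) / (fn / (fn + tn))
    return dict(sensitivity=sens, specificity=spec,
                LR_plus=lr_pos, LR_minus=lr_neg, relative_risk=rr)

# tp: pathological DICM & HGD/AC   fp: pathological DICM & low-risk histology
# fn: diploid DICM & HGD/AC        tn: diploid DICM & low-risk histology
print(diagnostic_accuracy(tp=16, fp=25, fn=4, tn=194))  # 239 endoscopies total
```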
Abstract:
BACKGROUND: Newborn screening (NBS) for cystic fibrosis (CF) has been introduced in many countries, but there is no ideal protocol suitable for all countries. This retrospective study was conducted to evaluate whether the planned two-step CF NBS, using immunoreactive trypsinogen (IRT) and a 7-mutation CFTR panel, would have detected all clinically diagnosed children with CF in Switzerland. METHODS: IRT was measured using the AutoDELFIA Neonatal IRT kit in stored NBS cards. RESULTS: Between 2006 and 2009, 66 children with CF were reported, 4 of whom were excluded for various reasons (born in another country, NBS at 6 months, no informed consent). 98% (61/62) had significantly higher IRT than the matched control group. There was one false-negative IRT result in an asymptomatic child with atypical CF (normal pancreatic function and sweat test). CONCLUSIONS: All children would have been detected with the planned two-step protocol except the one with atypical CF.
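A sketch of the two-step screening logic evaluated above. The IRT cutoff and the handling of panel-negative samples are assumptions for illustration; the abstract does not state the cutoff used in the Swiss protocol.

```python
IRT_CUTOFF_NG_ML = 50.0  # hypothetical cutoff, e.g. a population percentile

def screen_newborn(irt_ng_ml: float, panel_mutations_found: int) -> str:
    """Two-step CF NBS: IRT first, then a 7-mutation CFTR panel if elevated."""
    if irt_ng_ml < IRT_CUTOFF_NG_ML:
        return "screen negative"  # step 1: IRT below cutoff, panel not run
    if panel_mutations_found == 0:
        return "IRT elevated, no panel mutation: follow-up per protocol"
    return "screen positive: refer for sweat testing"

print(screen_newborn(72.0, 1))  # -> "screen positive: refer for sweat testing"
```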
Abstract:
Purpose: To compare the additional information obtained with axial and sagittal T2-weighted fat-saturated (T2FS) and T1-weighted gadolinium-enhanced fat-saturated (T1FSGd) sequences for detecting degenerative inflammatory lumbar spine lesions. Materials and Methods: Our retrospective study included 73 patients (365 lumbar levels) with lumbar spine degenerative disease (25 males, 48 females, mean age 56 years). The MRI protocol comprised T1- and T2-weighted sagittal and T2-weighted axial sequences (standard protocol), plus axial and sagittal T2FS and T1FSGd. Images were independently analyzed by two musculoskeletal radiologists and a neurosurgeon. Two groups of sequences were analyzed: standard + T2FS (group 1) and standard + T1FSGd (group 2). Degenerative inflammatory lumbar spine lesions were noted at each level in the anterior column (vertebral endplate), spinal canal (epidural and peri-radicular fat), and posterior column (facet joint with capsular recess and subchondral bone). Results: Degenerative inflammatory lesions were present in 18% (66/365) of levels in group 1 and 48% (175/365) of levels in group 2. In detail, lesions were noted in groups 1 and 2, respectively: in 44 and 66 levels for the anterior column, in 22 and 131 levels for the posterior column, and in 0 and 36 levels for the spinal canal. All these differences were statistically significant. Intra- and interobserver agreements were good. Conclusion: The T1FSGd sequence is more sensitive than T2FS for showing degenerative inflammatory lumbar spine lesions, especially in the spinal canal and posterior column.
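Since both sequence groups were read on the same 365 lumbar levels, a paired comparison such as McNemar's test is the natural analysis; the abstract does not name the test used, and the discordant-pair split below is hypothetical (only the margins, 66 and 175 positive levels, come from the abstract).

```python
from statsmodels.stats.contingency_tables import mcnemar

# Rows: group 1 (standard + T2FS) lesion present / absent
# Columns: group 2 (standard + T1FSGd) lesion present / absent
table = [[60, 6],     # 60 levels positive in both, 6 positive in group 1 only
         [115, 184]]  # 115 positive in group 2 only, 184 negative in both

result = mcnemar(table, exact=False, correction=True)
print(f"McNemar chi2 = {result.statistic:.1f}, p = {result.pvalue:.2e}")
```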
Abstract:
High-intensity intermittent training in hypoxia: a double-blinded, placebo-controlled field study in youth football players. J Strength Cond Res 29(1): 226-237, 2015. This study examined the effects of 5 weeks (∼60 minutes per training session, 2 d·wk⁻¹) of run-based high-intensity repeated-sprint ability (RSA) and explosive strength/agility/sprint training in either normobaric hypoxia (repeated sprints in hypoxia, RSH; inspired oxygen fraction [FIO2] = 14.3%) or normoxia (repeated sprints in normoxia, RSN; FIO2 = 21.0%) on physical performance in 16 highly trained, under-18 male footballers. For both the RSH (n = 8) and RSN (n = 8) groups, lower-limb explosive power, sprinting (10-40 m) times, maximal aerobic speed, repeated-sprint (10 × 30 m, 30-s rest) and repeated-agility (RA) (6 × 20 m, 30-s rest) abilities were evaluated in normoxia before and after supervised training. Lower-limb explosive power (+6.5 ± 1.9% vs. +5.0 ± 7.6% for RSH and RSN, respectively; both p < 0.001) and performance during maximal sprinting increased (from -6.6 ± 2.2% vs. -4.3 ± 2.6% at 10 m to -1.7 ± 1.7% vs. -1.3 ± 2.3% at 40 m for RSH and RSN, respectively; p values ranging from <0.05 to <0.01) to a similar extent in RSH and RSN. Both groups improved best (-3.0 ± 1.7% vs. -2.3 ± 1.8%; both p ≤ 0.05) and mean (-3.2 ± 1.7%, p < 0.01 vs. -1.9 ± 2.6%, p ≤ 0.05 for RSH and RSN, respectively) repeated-sprint times, whereas sprint decrement did not change. Significant interaction effects (p ≤ 0.05) between condition and time were found for RA ability-related parameters, with very likely greater gains (p ≤ 0.05) for RSH than RSN (initial sprint: 4.4 ± 1.9% vs. 2.0 ± 1.7%; cumulated times: 4.3 ± 0.6% vs. 2.4 ± 1.7%). Maximal aerobic speed remained unchanged throughout the protocol. In highly trained youth football players, the addition of 10 repeated-sprint training sessions performed in hypoxia vs. normoxia to regular football practice over a 5-week in-season period was more efficient at enhancing RA ability (including direction changes), whereas it had no additional effect on improvements in lower-limb explosive power, maximal sprinting, and RSA performance.
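The sprint decrement mentioned above is conventionally computed as the percentage by which the summed sprint times exceed the ideal total (the best time repeated every rep); a sketch with hypothetical times for a 10 x 30 m set:

```python
def sprint_decrement(times_s):
    """Sdec (%) = 100 * (sum of times / (n * best time) - 1)."""
    best = min(times_s)
    return 100 * (sum(times_s) / (len(times_s) * best) - 1)

# Hypothetical 10 x 30 m sprint times (s)
sprints = [4.35, 4.41, 4.48, 4.52, 4.55, 4.60, 4.63, 4.66, 4.70, 4.72]
print(f"Sdec = {sprint_decrement(sprints):.1f}%")
```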
Abstract:
Objectives: Imatinib has been increasingly proposed for therapeutic drug monitoring (TDM), as trough concentrations (Cmin) correlate with response rates in CML patients. This analysis aimed to evaluate the impact of imatinib exposure on optimal molecular response rates in a large European cohort of patients followed by centralized TDM. Methods: Sequential PK/PD analysis was performed in NONMEM 7 on 2230 plasma (PK) samples obtained along with molecular response (PD) data from 1299 CML patients. Model-based individual Bayesian estimates of exposure, parameterized as initial-dose-adjusted and log-normalized Cmin (log-Cmin) or clearance (CL), were investigated as potential predictors of optimal molecular response, while accounting for time under treatment (stratified at 3 years), gender, CML phase, age, potentially interacting comedication, and TDM frequency. The PK/PD analysis used mixed-effect logistic regression (iterative two-stage method) to account for intra-patient correlation. Results: In univariate analyses, CL, log-Cmin, time under treatment, TDM frequency, gender (all p<0.01), and CML phase (p=0.02) were significant predictors of the outcome. In multivariate analyses, all but log-Cmin remained significant (p<0.05). Our model estimates a 54.1% probability of optimal molecular response in a female patient with a median CL of 14.4 L/h, increasing by 4.7% with a 35% decrease in CL (10th percentile of the CL distribution) and decreasing by 6% with a 45% increase in CL (90th percentile), respectively. Male patients were less likely than female patients to be in optimal response (odds ratio: 0.62, p<0.001), with an estimated probability of 42.3%. Conclusions: Beyond CML phase and time on treatment, which are expectedly correlated with the outcome, an effect of initial imatinib exposure on the probability of achieving optimal molecular response was confirmed under field conditions by this multivariate analysis. Interestingly, male patients had a higher risk of suboptimal response, which might not derive exclusively from their 18.5% higher CL, but also from reported lower adherence to treatment. A prospective longitudinal study would be desirable to confirm the clinical importance of the identified covariates and to exclude biases possibly affecting this observational survey.
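The reported gender effect can be checked on the logit scale: applying the odds ratio of 0.62 to the 54.1% reference probability for female patients should reproduce the reported male probability of roughly 42.3%, up to rounding.

```python
def apply_odds_ratio(p_ref: float, odds_ratio: float) -> float:
    """Convert a reference probability to odds, apply an OR, convert back."""
    odds = p_ref / (1 - p_ref)
    odds_new = odds * odds_ratio
    return odds_new / (1 + odds_new)

p_female = 0.541
p_male = apply_odds_ratio(p_female, 0.62)
print(f"predicted male probability: {p_male:.1%}")  # ~42.2%, matching 42.3%
```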
Abstract:
Recent laboratory studies have suggested that heart rate variability (HRV) may be an appropriate criterion for training load (TL) quantification. The aim of this study was to validate a novel HRV index that may be used to assess TL in field conditions. Eleven well-trained long-distance male runners performed four exercises of different duration and intensity. TL was evaluated using the Foster and Banister methods. In addition, HRV measurements were performed 5 minutes before exercise and 5 and 30 minutes after exercise. We calculated an HRV-based TL index (TLHRV) from the ratio between the HRV decrease during exercise and the HRV increase during recovery. The HRV decrease during exercise was strongly correlated with exercise intensity (R = -0.70; p < 0.01) but not with exercise duration or training volume. The TLHRV index was correlated with the Foster (R = 0.61; p = 0.01) and Banister (R = 0.57; p = 0.01) methods. This study confirms that HRV changes during exercise and the recovery phase are affected by both the intensity and the physiological impact of the exercise. Since the TLHRV formula takes into account both the disturbance and the return to homeostatic balance induced by exercise, this new method provides an objective and rational TL index. However, the measurement protocol could be simplified for field use.
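A sketch of a TLHRV-style index as described above: the ratio of the HRV drop during exercise to the HRV rebound during recovery. The exact formula is not given in the abstract, so this parameterization is an assumption; HRV is taken here as a generic index (e.g., RMSSD in ms) measured pre-exercise, immediately post-exercise, and 30 minutes into recovery.

```python
def tl_hrv(hrv_pre: float, hrv_post: float, hrv_recovery: float) -> float:
    """Assumed TLHRV: exercise-induced HRV decrease / recovery HRV increase."""
    decrease = hrv_pre - hrv_post        # disturbance induced by the exercise
    increase = hrv_recovery - hrv_post   # return toward homeostatic balance
    return decrease / increase if increase > 0 else float("inf")

# Hypothetical RMSSD values (ms): rest 58, immediately post 12, 30 min post 35
print(f"TLHRV = {tl_hrv(58, 12, 35):.2f}")
```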
Abstract:
PURPOSE: Optimal high-intensity interval training (HIIT) regimens for running performance are unknown, although most protocols result in some benefit to key performance factors (running economy (RE), anaerobic threshold (AT), or maximal oxygen uptake (V̇O2max)). Lower-body positive pressure (LBPP) treadmills offer the unique possibility to partially unload runners and reach supramaximal speeds. We studied the use of LBPP to test an overspeed HIIT protocol in trained runners. METHODS: Eleven trained runners (35 ± 8 yr; V̇O2max, 55.7 ± 6.4 mL·kg⁻¹·min⁻¹) were randomized to an LBPP (n = 6) or a regular treadmill (CON, n = 5) group for an eight-session, 4-wk HIIT program. Four to five intervals were run at 100% of the velocity at V̇O2max (vV̇O2max) for 60% of the time to exhaustion at vV̇O2max (Tlim), with a 1:1 work:recovery ratio. Performance outcomes were 2-mile track time trial, V̇O2max, vV̇O2max, vAT, Tlim, and RE. LBPP sessions were carried out at 90% body weight. RESULTS: Group-time effects were present for vV̇O2max (CON, 17.5 vs. 18.3 km·h⁻¹, P = 0.03; LBPP, 19.7 vs. 22.3 km·h⁻¹, P < 0.001) and Tlim (CON, 307.0 vs. 404.4 s, P = 0.28; LBPP, 444.5 vs. 855.5 s, P < 0.001). Simple main effects for time were present for field performance (CON, -18 s; LBPP, -25 s; P = 0.002), V̇O2max (CON, 57.6 vs. 59.6; LBPP, 54.1 vs. 55.1 mL·kg⁻¹·min⁻¹; P = 0.04), and submaximal HR (157.7 vs. 154.3 and 151.4 vs. 148.5 bpm; P = 0.002). RE was unchanged. CONCLUSIONS: A 4-wk HIIT protocol at 100% vV̇O2max improves field performance, vV̇O2max, V̇O2max, and submaximal HR in trained runners. Improvements are similar whether intervals are run on a regular treadmill or at higher speeds on an LBPP treadmill with a 10% body weight reduction. LBPP could provide an alternative for taxing HIIT sessions.
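A sketch of the interval prescription described in the methods: work bouts at 100% vV̇O2max lasting 60% of Tlim, with equal-duration recovery (1:1 ratio). The runner's values below are hypothetical; the session structure follows the abstract.

```python
def hiit_session(v_vo2max_kmh: float, tlim_s: float, n_intervals: int = 5):
    """Build an interval session: work = 60% of Tlim, recovery = work (1:1)."""
    work_s = 0.6 * tlim_s                      # 60% of time to exhaustion
    recovery_s = work_s                        # 1:1 work:recovery ratio
    distance_m = v_vo2max_kmh / 3.6 * work_s   # distance covered per work bout
    return [(round(work_s), round(recovery_s), round(distance_m))] * n_intervals

# Hypothetical runner: vVO2max = 18 km/h, Tlim = 400 s -> 240-s work bouts
for i, (w, r, d) in enumerate(hiit_session(18.0, 400.0), 1):
    print(f"interval {i}: run {w} s (~{d} m) at vVO2max, recover {r} s")
```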
Abstract:
The number of patients with a solid organ transplant in Switzerland, and their individual survival, has increased over the last decades. As a consequence of long-term immunosuppression, skin cancer in solid organ transplant recipients (SOTRs) has been recognized as an important problem. Screening and education of potential SOTRs about prevention of sun damage and early recognition of skin cancer are important before transplantation. Once transplanted, SOTRs should be seen by a dermatologist yearly for repeat education as well as early diagnosis, prevention, and treatment of skin cancer. Squamous cell carcinoma of the skin (SCC) is the most frequent cancer in the setting of long-term immunosuppression. Sun protection through behaviour, clothing, and daily sunscreen application is the most effective prevention. Cumulative sun damage results in field cancerisation with numerous in situ SCCs, such as actinic keratoses and Bowen's disease, which should be treated proactively. Invasive SCC is cured by complete surgical excision; early removal is the best precaution against potential metastases. Reduction of immunosuppression and a switch to mTOR inhibitors and, potentially, mycophenolate may reduce the incidence of further SCC. Chemoprevention with the retinoid acitretin reduces the recurrence rate of SCC. The dermatological follow-up of SOTRs should be integrated into comprehensive post-transplant care.
Abstract:
Purpose: Optimal induction and maintenance immunosuppressive therapies in renal transplantation are still a matter of debate. Chronic corticosteroid usage is a major cause of morbidity, but steroid-free (SF) immunosuppression can result in unacceptably high rates of acute rejection and even graft loss. Methods and materials: We conducted a prospective open-label clinical trial in the Geneva-Lausanne Transplant Network from March 2005 to May 2008. Twenty low-immunological-risk (<20% PRA, no DSA) adult recipients of a primary kidney allograft received a 4-day course of thymoglobulin (1.5 mg/kg/d) with methylprednisolone and maintenance immunosuppression based on tacrolimus and enteric-coated mycophenolic acid (MPA). The control arm consisted of 16 matched recipients treated with basiliximab induction, tacrolimus, mycophenolate mofetil, and corticosteroids. Primary endpoints were the percentage of recipients not taking steroids and the percentage of rejection-free recipients at 12 months. Secondary endpoints were allograft survival at 12 months and significant side effects of thymoglobulin and/or other drugs. Results: In the SF group, 85% of the kidney recipients remained steroid-free at 12 months. The 3 cases of steroid introduction were due to one acute tubulointerstitial rejection occurring at day 11, one tacrolimus withdrawal due to thrombotic microangiopathy, and one MPA withdrawal because of multiple sinusitis and CMV reactivations. No BK viremia or CMV disease was detected. The 6 CMV-negative patients who received a CMV-positive allograft had a symptomatic primary infection after their 6-month course of valganciclovir prophylaxis. In the steroid-based group, 3 acute rejection episodes (acute humoral rejection, acute tubulointerstitial Banff IA, and vascular Banff IIA) occurred in 2 recipients, and 3 BK virus nephropathies were diagnosed between 45 and 135 days post-transplant. No side effects were associated with thymoglobulin infusion. In the SF group, 4 recipients presented severe leukopenia or agranulocytosis, and one recipient had febrile hepatitis leading to transient MPA withdrawal. Discontinuation of MPA was needed in 2 patients for recurrent sinusitis and CMV reactivations. Patient and graft survival was 100% in both groups at the 12-month follow-up. Conclusion: Steroid-free immunosuppression with short-course thymoglobulin induction was a safe protocol in low-risk renal transplant recipients. Lower rates of acute rejection and BK virus infection episodes were seen compared with the steroid-based control group. Longer follow-up will be needed to determine whether this SF immunosuppressive regimen results in higher graft and patient survival.
Abstract:
INTRODUCTION: The phase III EORTC 22033-26033/NCIC CE5 intergroup trial compares 50.4 Gy radiotherapy with up-front temozolomide in previously untreated low-grade glioma. We describe the digital EORTC individual case review (ICR) performed to evaluate compliance with protocol radiotherapy (RT). METHODS: Fifty-eight institutions were asked to submit 1-2 randomly selected cases. Digital ICR datasets were uploaded to the EORTC server and accessed by three central reviewers. Twenty-seven parameters were analysed, including volume delineation, treatment planning, organ at risk (OAR) dosimetry, and verification. Consensus reviews were collated and summary statistics calculated. RESULTS: Fifty-seven of 72 requested datasets from 48 institutions were technically usable. 31/57 received a major deviation for at least one section. Relocation accuracy was per protocol in 45 cases. Just over 30% had acceptable target volumes. OAR contours were missing in an average of 25% of cases, and up to one-third of those present were incorrectly drawn, while dosimetry was largely protocol-compliant. Beam energy was acceptable in 97% of cases, and 48 patients had per-protocol beam arrangements. CONCLUSIONS: Digital RT plan submission and review within the EORTC 22033-26033 ICR provide a solid foundation for future quality assurance procedures. Strict evaluation resulted in overall grades of minor and major deviation for 37% and 32% of cases, respectively.
Abstract:
Animals may use plant compounds to defend themselves against parasites. Wood ants, Formica paralugubris, incorporate pieces of solidified conifer resin into their nests. This behaviour inhibits the growth of bacteria and fungi in nest material and protects the ants against some detrimental microorganisms. Here, we studied the resin-collecting behaviour of ants under field and laboratory conditions. First, we focused on an important assumption of the self-medication hypothesis, which is that the animals deliberately choose the active plant material. In field cafeteria tests, the ants indeed showed a strong preference for resin over twigs and stones, which are building materials commonly encountered in their environment. We detected seasonal variation in the ants' choice: the preference for resin over twigs was more pronounced in spring than in summer, whereas in autumn the ants collected twigs and resin at equal rates. Second, we found similar seasonal patterns when comparing the collection rates of pieces of wood that had been impregnated with turpentine (a distillate of oleoresin) and untreated pieces of wood, which reveals that the preference for resin is based on odour cues. Third, we tested whether the collection of resin is prophylactic or therapeutic. We found that the relative collection rate of resin versus stones did not depend on an experimental infection with the entomopathogenic fungus Metarhizium anisopliae in laboratory colonies. Together, these results show that the ants deliberately choose the resin and suggest that resin collection is prophylactic rather than therapeutic.
Abstract:
Arsenic contamination of natural waters is a worldwide concern, as the drinking water supplies for large populations can have high concentrations of arsenic. Traditional techniques to detect arsenic in natural water samples can be costly and time-consuming; therefore, robust and inexpensive methods to detect arsenic in water are highly desirable, and methods for detecting arsenic in the field have been greatly sought after. This article focuses on the use of bacteria-based assays as an emerging method, both robust and inexpensive, for the detection of arsenic in groundwater in the field and in the laboratory. The arsenic detection elements in bacteria-based bioassays are biosensor-reporter strains: genetically modified strains of, e.g., Escherichia coli, Bacillus subtilis, Staphylococcus aureus, and Rhodopseudomonas palustris. In response to the presence of arsenic, such bacteria produce a reporter protein, the amount or activity of which is measured in the bioassay. Some of these bacterial biosensor-reporters have been successfully used for comparative in-field analyses through simple solution-based assays, but future methods may concentrate on miniaturization using fiber-optic or microfluidic platforms. Additionally, there are other potential emerging bioassays for the detection of arsenic in natural waters, including nematodes and clams.
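A sketch of how such a bioassay readout is typically converted to a concentration: calibrate reporter signal against arsenic standards, then invert the calibration for field samples. The linear model and all numbers are assumptions for illustration; real assays are calibrated per batch and may need nonlinear models.

```python
import numpy as np

# Hypothetical calibration standards: arsenite (ug/L) vs. reporter signal (a.u.)
standards_ug_l = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
signal_au = np.array([120.0, 410.0, 980.0, 1870.0, 3650.0])

# Fit signal = slope * concentration + background, then invert for unknowns
slope, background = np.polyfit(standards_ug_l, signal_au, 1)

def arsenic_from_signal(sample_signal_au: float) -> float:
    """Estimate arsenic concentration (ug/L) from a reporter signal."""
    return (sample_signal_au - background) / slope

# A field sample reading 1500 a.u.; the WHO guideline value is 10 ug/L
print(f"estimated arsenic: {arsenic_from_signal(1500.0):.0f} ug/L")
```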