17 results for Total incident duration

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

80.00%

Publisher:

Abstract:

Aim. External fertilisation requires synchronisation of gamete release between the two sexes. Adequate synchronisation is essential in aquatic media because sperm is very short-lived in water. In the cichlid Lamprologus callipterus, fertilisation of the eggs takes place inside an empty snail shell: females stay inside the shell, and males have to ejaculate into the shell opening. This spawning pattern makes the coordination of gamete release difficult. Methods. This study examined the synchronisation of males and females during egg laying. Results. The results showed that the male initiates each spawning sequence and that sperm release and egg laying are very well synchronised: 68% of all sperm releases coincided exactly with the deposition of an egg, and 99% of ejaculations occurred within ±5 seconds of egg deposition. On average, 95 eggs are laid one by one, with intervals of several minutes between subsequent eggs, leading to a total spawning duration in excess of six hours. Conclusions. We discuss this exceptional spawning pattern and how it might reflect a conflict between the sexes, with males attempting to induce egg laying and females extending the egg-laying period to increase the chance for parasitic males to participate in spawning.
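
As a quick arithmetic check of the reported spawning duration: roughly 95 eggs laid one by one over more than six hours imply a mean inter-egg interval of about four minutes, which matches the stated "intervals of several minutes". A minimal sketch using only figures from the abstract:

```python
# Back-of-envelope check of the spawning duration reported above.
eggs = 95          # mean number of eggs per clutch (from the abstract)
total_hours = 6    # lower bound of the reported total spawning duration

mean_interval_min = total_hours * 60 / (eggs - 1)  # 94 intervals between 95 eggs
print(f"~{mean_interval_min:.1f} min between eggs")  # ~3.8 min
```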

Relevance:

80.00%

Publisher:

Abstract:

Evidence-based decisions on indicated prevention in early psychosis require large-scale studies on the pathways to care of high-risk subjects. EPOS (The European Prediction of Psychosis Study), a prospective, multi-center, naturalistic field study in four European countries (Finland, Germany, the Netherlands and England), was designed to acquire accurate knowledge about pathways to care and the delay in obtaining specialized high-risk care. Our high-risk sample (n=233) reported on average 2.9 help-seeking contacts, with an average delay of 72.6 weeks between the onset of relevant problems and the initial help-seeking contact, and of 110.9 weeks between the initial help-seeking contact and reaching specialized high-risk care. This resulted in a total estimated duration of unrecognized risk for psychosis of about 3.5 years. Across the EPOS EU regions, about 90% of care-pathway contacts occurred within professional health care sectors. Between EPOS regions, differences in pathway parameters, including early detection and health-care systems, were often very pronounced. High-risk participants who later made the transition to a full psychotic disorder had significantly longer delays between initial help-seeking and receiving appropriate interventions. Our study underlines the need for regionally adapted implementation of early detection and intervention programs within the respective mental health and health care networks, including enhancing public awareness of early psychosis.
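
The 3.5-year figure follows directly from the two reported delays; the minimal check below uses only the numbers quoted in the abstract:

```python
# Sum of the two reported delays, converted from weeks to years.
onset_to_first_contact = 72.6  # weeks (onset of problems -> first help-seeking)
first_contact_to_care = 110.9  # weeks (first contact -> specialized high-risk care)

total_weeks = onset_to_first_contact + first_contact_to_care
print(f"{total_weeks:.1f} weeks ~ {total_weeks / 52.18:.1f} years")  # 183.5 weeks ~ 3.5 years
```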

Relevance:

40.00%

Publisher:

Abstract:

Background Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have previously demonstrated that a patient's antibody reaction pattern in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score) provides information on the duration of infection which is unaffected by clinical, immunological and viral variables. In this report we set out to determine the diagnostic performance of Inno-Lia algorithms for identifying incident infections in patients with known duration of infection, and evaluated the algorithms in annual cohorts of HIV notifications. Methods Diagnostic sensitivity was determined in 527 treatment-naive patients infected for up to 12 months. Specificity was determined in 740 patients infected for longer than 12 months. Plasma was tested by Inno-Lia and classified as either incident (≤12 months) or older infection by 26 different algorithms. Incident infection rates (IIR) were calculated based on the diagnostic sensitivity and specificity of each algorithm and the rule that the total number of incident results is the sum of true-incident and false-incident results, both of which can be calculated from the pre-determined sensitivity and specificity. Results The 10 best algorithms had a mean raw sensitivity of 59.4% and a mean specificity of 95.1%. Adjustment for over-representation of patients in the first quarter-year of infection further reduced the sensitivity; in the preferred model, the mean adjusted sensitivity was 37.4%. Application of the 10 best algorithms to four annual cohorts of HIV-1 notifications totalling 2,595 patients yielded a mean IIR of 0.35 in 2005/6 (baseline) and of 0.45, 0.42 and 0.35 in 2008, 2009 and 2010, respectively. The increase between baseline and 2008 and the ensuing decreases were highly significant. Other adjustment models yielded different absolute IIRs, although the relative changes between the cohorts were identical for all models. Conclusions The method can be used for comparing IIRs in annual cohorts of HIV notifications. The use of several different algorithms in combination, each with its own sensitivity and specificity for detecting incident infection, is advisable, as this reduces the impact of individual imperfections stemming primarily from relatively low sensitivities and sampling bias.
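
The stated rule (observed incident results = true-incident + false-incident) can be inverted to recover the true IIR from the raw fraction of "incident" classifications. A minimal sketch of that inversion; the sensitivity and specificity are the means quoted above, while the observed fraction is purely illustrative:

```python
def adjusted_iir(observed_incident_fraction: float,
                 sensitivity: float,
                 specificity: float) -> float:
    """Back-calculate the true incident infection rate (IIR).

    Inverts: p_obs = IIR * sens + (1 - IIR) * (1 - spec),
    i.e. observed incident results = true-incident + false-incident.
    """
    false_positive_rate = 1.0 - specificity
    return (observed_incident_fraction - false_positive_rate) / (
        sensitivity - false_positive_rate
    )

# Sensitivity/specificity are the means quoted above; 0.30 is illustrative only.
print(adjusted_iir(observed_incident_fraction=0.30,
                   sensitivity=0.374, specificity=0.951))
```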

Relevance:

30.00%

Publisher:

Abstract:

Background Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have shown that a patient's antibody reaction in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score, Innogenetics) provides information on the duration of infection. Here, we sought to further investigate the diagnostic specificity of various Inno-Lia algorithms and to identify factors affecting it. Methods Plasma samples from 714 selected patients of the Swiss HIV Cohort Study, infected for longer than 12 months and representing all viral clades and stages of chronic HIV-1 infection, were tested blindly by Inno-Lia and classified as either incident (≤12 months) or older infection by 24 different algorithms. Of the total, 524 patients received HAART, 308 had HIV-1 RNA below 50 copies/mL, and 620 were infected with a non-B HIV-1 clade. Using logistic regression analysis, we evaluated factors that might affect the specificity of these algorithms. Results HIV-1 RNA <50 copies/mL was associated with significantly lower reactivity to all five HIV-1 antigens of the Inno-Lia and impaired the specificity of most algorithms. Among 412 patients who were either untreated or had HIV-1 RNA ≥50 copies/mL despite HAART, the median specificity of the algorithms was 96.5% (range 92.0-100%). The only factor that significantly promoted false-incident results in this group was age, with false-incident results increasing by a few percent per additional year. HIV-1 clade, HIV-1 RNA, CD4 percentage, sex, disease stage, and testing modalities had no significant effect. Results were similar among the 190 untreated patients. Conclusions The specificity of most Inno-Lia algorithms was high and not affected by HIV-1 variability, advanced disease, or other factors that promote false-recent results in other STARHS. Specificity should be good in any group of untreated HIV-1 patients.
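
Specificity here is simply the fraction of known-older infections (>12 months) that an algorithm correctly rules "older", i.e. one minus the false-incident rate. A minimal sketch; the count of false-incident results is illustrative, not from the abstract:

```python
def specificity(n_false_incident: int, n_older_tested: int) -> float:
    """Fraction of known-older infections correctly classified as older."""
    return 1.0 - n_false_incident / n_older_tested

# 14 false-incident results among the 412 untreated/viremic patients would
# give ~96.6%, close to the median specificity quoted above (illustrative).
print(f"{specificity(14, 412):.1%}")
```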

Relevance:

30.00%

Publisher:

Abstract:

Experiments were designed to investigate the suitability of combining a short manual teat stimulation with a short latency period before teat cup attachment to induce and maintain oxytocin release and milk ejection without interruption. In Experiment 1, seven dairy cows in mid lactation were manually pre-stimulated for 15, 30 or 45 s, followed by a latency period of either 30 s or 45 s. All treatments induced a similar release of oxytocin without interruption until the end of milking. In particular, a latency period of up to 45 s did not cause a transient decrease in oxytocin concentration. In Experiment 2, milking characteristics were recorded in seven cows each in early, mid, and late lactation. Because the course of milk ejection depends mainly on the degree of udder filling, individual milkings were classified based on the actual degree of udder filling, which differs between lactational stages as well as between morning and evening milkings. All animals underwent twelve different udder preparation treatments, i.e. 15, 30, or 45 s of pre-stimulation followed by latency periods of 0, 30, 45, or 60 s. Total milk yield, main milking time and average milk flow rate did not differ between treatments if the degree of udder filling at the start of milking was >40% of the maximum storage capacity. However, if the udder filling was <40%, main milking time decreased with the duration of the latency period up to 45 s, independent of the duration of pre-stimulation. Average milk flow at an udder filling of <40% was highest after a pre-stimulation of 45 s followed by a latency period of another 45 s. In contrast, average milk flow reached its lowest values with a pre-stimulation of 15 s and no additional latency period; however, average milk flow after a 15-s pre-stimulation increased with increasing latency period. In conclusion, a very short pre-stimulation followed by a latency period of up to 45 s before teat cup attachment remains a suitable alternative to continuous stimulation for inducing milk ejection.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To examine the duration of methicillin-resistant Staphylococcus aureus (MRSA) carriage, its determinants, and the influence of eradication regimens. DESIGN: Retrospective cohort study. SETTING: A 1,033-bed tertiary care university hospital in Bern, Switzerland, in which the prevalence of methicillin resistance among S. aureus isolates is less than 5%. PATIENTS: A total of 116 patients with first-time MRSA detection, identified at University Hospital Bern between January 1, 2000, and December 31, 2003, were followed up for a mean duration of 16.2 months. RESULTS: Sixty-eight patients (58.6%) cleared colonization, with a median time to clearance of 7.4 months. Independent determinants of shorter carriage duration were the absence of any modifiable risk factor (receipt of antibiotics, use of an indwelling device, or presence of a skin lesion) (hazard ratio [HR], 0.20 [95% confidence interval {CI}, 0.09-0.42]), absence of immunosuppressive therapy (HR, 0.49 [95% CI, 0.23-1.02]), and hemodialysis (HR, 0.08 [95% CI, 0.01-0.66]) at the time MRSA was first detected, as well as administration of a decolonization regimen in the absence of a modifiable risk factor (HR, 2.22 [95% CI, 1.36-3.64]). Failure of decolonization treatment was associated with the presence of risk factors at the time of treatment (P=.01). Intermittent screenings that were negative for MRSA were frequent (26% of patients), occurred early after first detection of MRSA (median, 31.5 days), and were associated with a lower probability of clearing colonization (HR, 0.34 [95% CI, 0.17-0.67]) and an increased risk of MRSA infection during follow-up. CONCLUSIONS: Risk factors for MRSA acquisition should be carefully assessed in all MRSA carriers and should inform infection control policies, such as the timing of decolonization treatment, the definition of MRSA clearance, and the decision of when to suspend isolation measures.
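
Hazard ratios like those above come from a time-to-event analysis of clearance. A hypothetical sketch of such a model with the lifelines library; the data are simulated stand-ins, and only the covariate names echo the abstract:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 116  # cohort size taken from the abstract

# Simulated stand-in data: one row per MRSA carrier.
df = pd.DataFrame({
    "months_followed": rng.exponential(scale=16.2, size=n),  # follow-up time
    "cleared": rng.integers(0, 2, size=n),                   # 1 = colonization cleared
    "modifiable_risk_factor": rng.integers(0, 2, size=n),
    "immunosuppression": rng.integers(0, 2, size=n),
    "hemodialysis": rng.integers(0, 2, size=n),
})

# Cox proportional hazards model of time to MRSA clearance.
cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="cleared")
cph.print_summary()  # hazard ratios with 95% CIs, analogous to those reported
```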

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Empirical antibiotic therapy is based on patients' characteristics and antimicrobial susceptibility data. Hospital-wide cumulative antibiograms may not sufficiently support informed decision-making for optimal treatment of hospitalized patients. METHODS: We studied different approaches to analysing the antimicrobial susceptibility rates (SRs) of all diagnostic bacterial isolates collected from patients hospitalized between July 2005 and June 2007 at the University Hospital in Zurich, Switzerland. We compared stratification for unit-specific, specimen-type-specific (blood, urinary, respiratory versus all specimens) and isolate-sequence-specific (first, follow-up versus all isolates) data with hospital-wide cumulative antibiograms, and studied changes in mean SR during the course of hospitalization. RESULTS: A total of 16,281 isolates (7,965 first, 1,201 follow-up and 7,115 repeat isolates) were tested. We found relevant differences in SRs across hospital departments. Mean SRs of Escherichia coli to ciprofloxacin ranged between 64.5% and 95.1% in the various departments, and mean SRs of Pseudomonas aeruginosa to imipenem and meropenem ranged from 54.2% to 100% and from 80.4% to 100%, respectively. Compared with hospital-wide cumulative antibiograms, lower SRs were observed in intensive care unit specimens, follow-up isolates and isolates causing nosocomial infections (except for Staphylococcus aureus). Decreasing SRs were observed in first isolates of coagulase-negative staphylococci with increasing interval between hospital admission and specimen collection. Isolates from different anatomical sites showed variations in SRs. CONCLUSIONS: We recommend reporting unit-specific rather than hospital-wide cumulative antibiograms. Decreasing antimicrobial susceptibility during hospitalization and variations in SRs among isolates from different anatomical sites should be taken into account when selecting empirical antibiotic treatment.
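
A unit-specific cumulative antibiogram is the susceptibility rate computed within each stratum rather than hospital-wide. A minimal sketch of the stratification with pandas (toy data, hypothetical column names):

```python
import pandas as pd

# Hypothetical isolate-level data: one row per tested isolate.
isolates = pd.DataFrame({
    "unit":        ["ICU", "ICU", "Medicine", "Medicine", "Surgery"],
    "organism":    ["E. coli"] * 5,
    "antibiotic":  ["ciprofloxacin"] * 5,
    "susceptible": [0, 1, 1, 1, 0],  # 1 = susceptible, 0 = resistant
})

# Unit-specific susceptibility rates (SRs), as the authors recommend,
# instead of a single hospital-wide figure.
sr_by_unit = (
    isolates.groupby(["unit", "organism", "antibiotic"])["susceptible"]
    .mean()
    .mul(100)
    .rename("SR_percent")
)
print(sr_by_unit)
```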

Relevance:

30.00%

Publisher:

Abstract:

Sequential studies of osteopenic bone disease in small animals require the availability of non-invasive, accurate and precise methods to assess bone mineral content (BMC) and bone mineral density (BMD). Dual-energy X-ray absorptiometry (DXA), which is currently used in humans for this purpose, can also be applied to small animals by means of adapted software. The precision and accuracy of DXA were evaluated in 10 rats weighing 50-265 g. The rats were anesthetized with a mixture of ketamine-xylazine administered intraperitoneally. Each rat was scanned six times consecutively in the antero-posterior projection, after repositioning, using the rat whole-body software for determination of whole-body BMC and BMD (Hologic QDR 1000, software version 5.52). Scan duration was 10-20 min depending on rat size. After the last measurement, rats were sacrificed and soft tissues were removed by dermestid beetles. Skeletons were then scanned in vitro (ultra-high-resolution software, version 4.47). Bones were subsequently ashed and dissolved in hydrochloric acid, and total body calcium was directly assayed by atomic absorption spectrophotometry (TBCa[chem]). Total body calcium was also calculated from the DXA whole-body in vivo measurement (TBCa[DXA]) and from the ultra-high-resolution measurement (TBCa[UH]) under the assumption that calcium accounts for 40.5% of the BMC expressed as hydroxyapatite. The precision error for whole-body BMC and BMD (mean ± SD) was 1.3% and 1.5%, respectively. Simple regression analysis between TBCa[DXA] or TBCa[UH] and TBCa[chem] revealed tight correlations (r = 0.991 and 0.996, respectively), with slopes and intercepts significantly different from 1 and 0, respectively. (ABSTRACT TRUNCATED AT 250 WORDS)
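
The conversion from DXA output to total body calcium rests on the assumption stated in the abstract, namely that calcium makes up 40.5% of BMC expressed as hydroxyapatite:

```python
# Calcium fraction of hydroxyapatite, as assumed in the abstract.
CALCIUM_FRACTION = 0.405

def total_body_calcium_from_bmc(bmc_grams: float) -> float:
    """Estimate total body calcium (g) from whole-body BMC (g hydroxyapatite)."""
    return bmc_grams * CALCIUM_FRACTION

# Illustrative value only; no per-animal BMC figures appear in the abstract.
print(total_body_calcium_from_bmc(10.0))  # -> 4.05 g
```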

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE Well-developed collaterals provide a survival benefit in patients with obstructive coronary artery disease (CAD). In this study we therefore sought to determine which clinical variables are associated with arteriogenesis. DESIGN Clinical and laboratory variables were collected before percutaneous coronary intervention. Multivariate analysis was performed to determine which variables are associated with the collateral flow index (CFI). PATIENTS Data from 295 chronic total occlusion (CTO) patients (Bern, Switzerland; Amsterdam, the Netherlands; and Jena, Germany) were pooled. In earlier studies, patients had varying degrees of stenosis, so collaterals at different stages of development were included. In our study, a unique group of patients with CTO was analysed. INTERVENTIONS Instead of the angiography used previously, we used a more accurate method to determine CFI, based on intracoronary pressure measurements. CFI was calculated from the occlusive pressure distal to the coronary lesion, the aortic pressure and the central venous pressure. RESULTS The mean CFI was 0.39 ± 0.14. After multivariate analysis, β blockers, hypertension and angina pectoris duration were positively associated with CFI (regression coefficient B=0.07, SE=0.03, p=0.02; B=0.040, SE=0.02, p=0.042; and B=0.001, SE=0.000, p=0.02, respectively). Also after multivariate analysis, high serum leucocytes, prior myocardial infarction and high diastolic blood pressure were negatively associated with CFI (B=-0.01, SE=0.005, p=0.03; B=-0.04, SE=0.02, p=0.03; and B=-0.002, SE=0.001, p=0.011, respectively). CONCLUSIONS In this unique cohort, high serum leucocytes and high diastolic blood pressure are associated with poorly developed collaterals. Interestingly, the use of β blockers is associated with well-developed collaterals, shedding new light on the potential mode of action of this drug in patients with CAD.
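
The abstract names the three pressures entering CFI without spelling out the formula; the conventional pressure-derived definition (an assumption here, not a quote from the paper) is CFI = (P_occl − CVP) / (P_ao − CVP):

```python
def collateral_flow_index(p_occlusive: float, p_aortic: float, cvp: float) -> float:
    """Pressure-derived collateral flow index (conventional definition).

    p_occlusive: pressure distal to the occluded lesion (mmHg)
    p_aortic:    aortic pressure (mmHg)
    cvp:         central venous pressure (mmHg)
    """
    return (p_occlusive - cvp) / (p_aortic - cvp)

# Illustrative numbers chosen to land near the cohort's mean CFI of 0.39:
print(round(collateral_flow_index(p_occlusive=40, p_aortic=95, cvp=5), 2))  # 0.39
```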

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To explore the risk of endometrial cancer in relation to metformin and other antidiabetic drugs. METHODS We conducted a case-control analysis exploring the association between use of metformin and other antidiabetic drugs and the risk of endometrial cancer, using the UK-based General Practice Research Database (GPRD). Cases were women with an incident diagnosis of endometrial cancer, and up to 6 controls per case were matched on age, sex, calendar time, general practice, and number of years of active history in the GPRD prior to the index date. Odds ratios (ORs) with 95% confidence intervals (95% CIs) were calculated, and results were adjusted by multivariate logistic regression analyses for BMI, smoking, a recorded diagnosis of diabetes mellitus, and diabetes duration. RESULTS A total of 2,554 cases with incident endometrial cancer and 15,324 matched controls were identified. Ever use of metformin, compared to never use, was not associated with an altered risk of endometrial cancer (adj. OR 0.86, 95% CI 0.63-1.18). Stratified by exposure duration, neither long-term (≥25 prescriptions) use of metformin (adj. OR 0.79, 95% CI 0.54-1.17) nor long-term use of sulfonylureas (adj. OR 0.96, 95% CI 0.65-1.44), thiazolidinediones (≥15 prescriptions; adj. OR 1.22, 95% CI 0.67-2.21), or insulin (adj. OR 1.05, 95% CI 0.79-1.82) was associated with the risk of endometrial cancer. CONCLUSION Use of metformin and other antidiabetic drugs was not associated with an altered risk of endometrial cancer.
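
For reference, an odds ratio and its confidence interval derive from a 2×2 exposure table. The study itself used matched sets and multivariate logistic regression, so the unadjusted sketch below (toy counts, not the study's data) only illustrates the basic mechanics:

```python
import math

def odds_ratio_with_ci(exp_cases, unexp_cases, exp_controls, unexp_controls, z=1.96):
    """Crude odds ratio with a Woolf (log-normal) 95% CI from a 2x2 table."""
    or_ = (exp_cases * unexp_controls) / (unexp_cases * exp_controls)
    se_log = math.sqrt(1/exp_cases + 1/unexp_cases + 1/exp_controls + 1/unexp_controls)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

# Toy counts only (the abstract reports adjusted ORs, not raw 2x2 tables):
print(odds_ratio_with_ci(120, 2434, 830, 14494))  # OR ~0.86 with its CI
```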

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND In postmenopausal women, yearly intravenous zoledronate (ZOL), compared to placebo (PLB), significantly increased bone mineral density (BMD) at the lumbar spine (LS), femoral neck (FN), and total hip (TH) and decreased fracture risk. The effects of ZOL on BMD at the tibial epiphysis (T-EPI) and diaphysis (T-DIA) are unknown. METHODS A randomized controlled ancillary study of the HORIZON trial was conducted at the Department of Osteoporosis of the University Hospital of Berne, Switzerland. Women with ≥1 follow-up DXA measurement who had received ≥1 dose of either ZOL (n=55) or PLB (n=55) were included. BMD was measured at the LS, FN, TH, T-EPI, and T-DIA at baseline and at 6, 12, 24, and 36 months. Morphometric vertebral fractures were assessed. Incident clinical fractures were recorded as adverse events. RESULTS Baseline characteristics were comparable with those in HORIZON and between groups. After 36 months, BMD was significantly higher in women treated with ZOL vs. PLB at the LS, FN, TH, and T-EPI (+7.6%, +3.7%, +5.6%, and +5.5%, respectively; p<0.01 for all) but not at the T-DIA (+1.1%). The number of patients with ≥1 incident non-vertebral or morphometric fracture did not differ between groups (9 ZOL/11 PLB). Mean changes in BMD did not differ between groups with and without incident fracture, except that women with an incident non-vertebral fracture had significantly greater bone loss at the predominantly cortical T-DIA (p=0.005). CONCLUSION ZOL was significantly superior to PLB at the T-EPI but not at the T-DIA. Women with an incident non-vertebral fracture experienced bone loss at the T-DIA.

Relevance:

30.00%

Publisher:

Abstract:

Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on the time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods We classified the Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined using the relationship 'Prevalence = Incidence × Duration' in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIRs, also derived from Inno-Lia results but using the relationship 'incident = true-incident + false-incident', and to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R² = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for the performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based and performance-based IIRs increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions IIR estimates from window-based and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts.
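
The window-based calculation rearranges the quoted relationship 'Prevalence = Incidence × Duration': the fraction of notifications testing "recent" is the prevalence of recency, and the window period is the duration. A minimal sketch; the cohort size of 748 is from the abstract, but the count of recent results and the window length are illustrative:

```python
def window_based_iir(n_recent: int, n_total: int, window_days: float) -> float:
    """Annualized incident infection rate: Incidence = Prevalence / Duration."""
    prevalence_recent = n_recent / n_total
    return prevalence_recent / (window_days / 365.25)

# 748 notifications (2005/06 cohort size from the abstract); 120 'recent'
# results and the 130-day window are illustrative values.
print(round(window_based_iir(n_recent=120, n_total=748, window_days=130.0), 3))  # ~0.451
```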

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE The aim of this study was to explore the risk of incident gout in patients with type 2 diabetes mellitus (T2DM) in association with diabetes duration, diabetes severity, and antidiabetic drug treatment. METHODS We conducted a case-control study in patients with T2DM using the UK-based Clinical Practice Research Datalink (CPRD). We identified case patients aged ≥18 years with an incident diagnosis of gout between 1990 and 2012 and matched one gout-free control patient to each case patient. We used conditional logistic regression analysis to calculate adjusted odds ratios (adj. ORs) with 95% CIs, adjusting our analyses for important potential confounders. RESULTS The study encompassed 7,536 T2DM cases with a first-time diagnosis of gout. Compared to a diabetes duration of <1 year, prolonged diabetes duration (1-3, 3-6, 7-9 and ≥10 years) was associated with decreased adj. ORs of 0.91 (95% CI 0.79 to 1.04), 0.76 (95% CI 0.67 to 0.86), 0.70 (95% CI 0.61 to 0.86), and 0.58 (95% CI 0.51 to 0.66), respectively. Compared to a reference A1C level of <7%, the risk estimates for increasing A1C levels (7.0-7.9, 8.0-8.9 and ≥9%) steadily decreased, with adj. ORs of 0.79 (95% CI 0.72 to 0.86), 0.63 (95% CI 0.55 to 0.72), and 0.46 (95% CI 0.40 to 0.53), respectively. Neither use of insulin, metformin, nor sulfonylureas was associated with an altered risk of incident gout. CONCLUSIONS Increased A1C levels, but not use of antidiabetic drugs, were associated with a decreased risk of incident gout among patients with T2DM.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE Mechanical loading is an important parameter that alters the homeostasis of the intervertebral disc (IVD). Studies have demonstrated the role of compression in altering the cellular metabolism and the anabolic and catabolic events of the disc, but little is known about how complex loading, such as combined torsion-compression, affects IVD cell metabolism and matrix homeostasis. Studying how the duration of torsion affects disc matrix turnover could provide guidelines for preventing overuse injury to the disc and suggest possible beneficial effects of torsion. The aim of the study was to evaluate the biological response of the IVD to different durations of torsional loading. METHODS Intact bovine caudal IVDs were isolated for organ culture in a bioreactor. Different daily durations of torsion were applied over 7 days at a physiological magnitude (±2°), in combination with 0.2 MPa compression, at a frequency of 1 Hz. RESULTS Nucleus pulposus (NP) cell viability and total disc volume decreased with 8 h of torsion-compression per day. Gene expression analysis suggested that MMP13 was down-regulated with increasing duration of torsion. Torsion-compression for 1 and 4 h per day tended to increase the glycosaminoglycan/hydroxyproline ratio in the NP tissue. CONCLUSIONS Our results suggest that load-duration thresholds exist in both torsion and compression, with an optimal load duration capable of promoting matrix synthesis, while overloading can be harmful to disc cells. Future research is required to evaluate the specific mechanisms underlying these observed effects.

Relevance:

30.00%

Publisher:

Abstract:

AIMS Our aim was to report on a survey initiated by the European Association of Percutaneous Cardiovascular Interventions (EAPCI) concerning opinion on the evidence relating to dual antiplatelet therapy (DAPT) duration after coronary stenting. METHODS AND RESULTS Results from three randomised clinical trials were scheduled to be presented at the American Heart Association Scientific Sessions 2014 (AHA 2014). A web-based survey was distributed to all individuals registered on the EuroIntervention mailing list (n=15,200) both before and after AHA 2014. A total of 1,134 physicians responded to the first survey (i.e., before AHA 2014) and 542 to the second (i.e., after AHA 2014). The majority of respondents interpreted the trial results as consistent with substantial equipoise regarding the benefits and risks of an extended versus a standard DAPT strategy. Two respondents out of ten believed extended DAPT should be implemented in selected patients. After AHA 2014, 46.1% of participants expressed uncertainty about the available evidence on DAPT duration, and 40.0% expressed the need for clinical guidance. CONCLUSIONS This EAPCI survey highlights considerable uncertainty within the medical community with regard to the optimal duration of DAPT after coronary stenting in light of recently reported trial results. Updated recommendations to guide practising physicians' treatment decisions in routine clinical practice should be provided by international societies.