47 results for Variable Sampling Interval Control Charts


Relevance: 30.00%

Abstract:

In the anti-saccade paradigm, subjects are instructed not to make a reflexive saccade to an appearing lateral target but to make an intentional saccade to the opposite side instead. The inhibition of reflexive saccade triggering is under the control of the dorsolateral prefrontal cortex (DLPFC). The critical time interval at which this inhibition takes place during the paradigm, however, is not exactly known. In the present study, we used single-pulse transcranial magnetic stimulation (TMS) to interfere with DLPFC function in 15 healthy subjects. TMS was applied over the right DLPFC either 100 ms before the onset of the visual target (i.e. -100 ms), at target onset (i.e. 0 ms) or 100 ms after target onset (i.e. +100 ms). Stimulation 100 ms before target onset significantly increased the percentage of anti-saccade errors to both sides, whereas stimulation at or after target onset had no significant effect. None of the three stimulation conditions had a significant influence on the latency of correct or erroneous anti-saccades. These findings show that the critical time interval at which the DLPFC controls the suppression of a reflexive saccade in the anti-saccade paradigm lies before target onset. In addition, the results support the view that the triggering of correct anti-saccades is not under direct control of the DLPFC.

Relevance: 30.00%

Abstract:

BACKGROUND: Many epidemiological studies indicate a positive correlation between cataract surgery and the subsequent progression of age-related macular degeneration (AMD). Such a correlation would have far-reaching consequences. However, in epidemiological studies it is difficult to determine the significance of a single risk factor such as cataract surgery. PATIENTS AND METHODS: We performed a retrospective case-control study of patients with new-onset exudative age-related macular degeneration to determine whether cataract surgery was a predisposing factor. A total of 1496 eyes were included in the study: 984 cases with new onset of exudative AMD and 512 control eyes with early signs of age-related maculopathy. Lens status (phakic or pseudophakic) was determined for each eye. RESULTS: There was no significant difference in lens status between the study and control groups (227/984 [23.1%] vs. 112/512 [21.8%] pseudophakic, p = 0.6487; OR = 1.071; 95% CI = 0.8284-1.384). In cases with bilateral pseudophakia (n = 64), no statistically significant difference was found in the interval between cataract surgery in either eye and the onset of exudative AMD in the study eye (225.9 +/- 170.4 vs. 209.9 +/- 158.2 weeks, p = 0.27). CONCLUSIONS: Our results provide evidence that cataract surgery is not a major risk factor for the development of exudative AMD.
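
The reported odds ratio and confidence interval can be reproduced from the counts given in the abstract (227/984 vs. 112/512 pseudophakic eyes). The sketch below uses the standard Woolf log-odds approximation for the confidence interval, which may differ slightly from the study's exact method; it is an illustration, not the authors' code.

```python
# Minimal sketch: odds ratio and 95% CI from the 2x2 counts reported above.
# Uses the Woolf (log-odds) approximation; the study's exact method is not
# stated in the abstract, so treat this as an illustration only.
import math

pseudophakic_cases, cases = 227, 984   # exudative AMD eyes
pseudophakic_ctrls, ctrls = 112, 512   # early-maculopathy control eyes

a, b = pseudophakic_cases, cases - pseudophakic_cases   # exposed / unexposed cases
c, d = pseudophakic_ctrls, ctrls - pseudophakic_ctrls   # exposed / unexposed controls

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.3f}, 95% CI = {ci_low:.4f}-{ci_high:.3f}")
# Closely reproduces the reported OR = 1.071, 95% CI = 0.8284-1.384.
```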

Relevance: 30.00%

Abstract:

BACKGROUND: Duplications and deletions in the human genome can cause disease or predispose persons to disease. Advances in technologies to detect these changes allow for the routine identification of submicroscopic imbalances in large numbers of patients. METHODS: We tested for the presence of microdeletions and microduplications at a specific region of chromosome 1q21.1 in two groups of patients with unexplained mental retardation, autism, or congenital anomalies and in unaffected persons. RESULTS: We identified 25 persons with a recurrent 1.35-Mb deletion within 1q21.1 from screening 5218 patients. The microdeletions had arisen de novo in eight patients, were inherited from a mildly affected parent in three patients, were inherited from an apparently unaffected parent in six patients, and were of unknown inheritance in eight patients. The deletion was absent in a series of 4737 control persons (P=1.1x10(-7)). We found considerable variability in the level of phenotypic expression of the microdeletion; phenotypes included mild-to-moderate mental retardation, microcephaly, cardiac abnormalities, and cataracts. The reciprocal duplication was enriched in nine children with mental retardation or autism spectrum disorder and other variable features (P=0.02). We identified three deletions and three duplications of the 1q21.1 region in an independent sample of 788 patients with mental retardation and congenital anomalies. CONCLUSIONS: We have identified recurrent molecular lesions that elude syndromic classification and whose disease manifestations must be considered in a broader context of development as opposed to being assigned to a specific disease. Clinical diagnosis in patients with these lesions may be most readily achieved on the basis of genotype rather than phenotype.

Relevance: 30.00%

Abstract:

Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function that respects the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all of the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed onto the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted to reduce these interpolation errors significantly. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may produce significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
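
The two required properties, integral conservation and non-negativity for positive data, can be illustrated with a monotone Hermite (PCHIP) interpolant of the cumulative integral. This is a sketch of the general technique only, not the authors' specific algorithm or its adjustable control parameter.

```python
# Hedged sketch: conservative 1-D resampling via a monotone Hermite (PCHIP)
# interpolant of the cumulative integral. Illustrates how Hermite-based
# interpolation can conserve the integral and keep positive data non-negative;
# it is not the authors' exact algorithm.
import numpy as np
from scipy.interpolate import PchipInterpolator

def conservative_resample(edges_src, values_src, edges_dst):
    """Resample bin-averaged data onto a new grid, conserving the integral.

    edges_src : (n+1,) bin edges of the source grid
    values_src: (n,)   bin-averaged values (e.g. CT intensities), assumed >= 0
    edges_dst : (m+1,) bin edges of the target grid (within the source range)
    """
    widths = np.diff(edges_src)
    # Cumulative integral at the source bin edges; non-decreasing if values >= 0.
    cum = np.concatenate(([0.0], np.cumsum(values_src * widths)))
    # Monotone Hermite interpolant of the cumulative integral: its derivative
    # (the reconstructed density) cannot become negative.
    F = PchipInterpolator(edges_src, cum)
    cum_dst = F(edges_dst)
    # Differencing the interpolated cumulative integral conserves the total.
    return np.diff(cum_dst) / np.diff(edges_dst)

# Example: re-grid a coarse positive profile onto a finer grid.
src_edges = np.linspace(0.0, 10.0, 11)
src_vals = np.array([0, 1, 5, 9, 4, 2, 2, 3, 1, 0], dtype=float)
dst_edges = np.linspace(0.0, 10.0, 41)
dst_vals = conservative_resample(src_edges, src_vals, dst_edges)

# Total integral is preserved (up to floating point):
assert np.isclose((src_vals * np.diff(src_edges)).sum(),
                  (dst_vals * np.diff(dst_edges)).sum())
```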

Relevance: 30.00%

Abstract:

To evaluate strategies used to select cases and controls and how reported odds ratios are interpreted, the authors examined 150 case-control studies published in leading general medicine, epidemiology, and clinical specialist journals from 2001 to 2007. Most of the studies (125/150; 83%) were based on incident cases; among these, the source population was mostly dynamic (102/125; 82%). A minority (23/125; 18%) sampled from a fixed cohort. Among studies with incident cases, 105 (84%) could interpret the odds ratio as a rate ratio. Fifty-seven (46% of 125) required the source population to be stable for such interpretation, while the remaining 48 (38% of 125) did not need any assumptions because of matching on time or concurrent sampling. Another 17 (14% of 125) studies with incident cases could interpret the odds ratio as a risk ratio, with 16 of them requiring the rare disease assumption for this interpretation. The rare disease assumption was discussed in 4 studies but was not relevant to any of them. No investigators mentioned the need for a stable population. The authors conclude that in current case-control research, a stable exposure distribution is much more frequently needed to interpret odds ratios than the rare disease assumption. At present, investigators conducting case-control studies rarely discuss what their odds ratios estimate.
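
The abstract's key distinction is that the rare disease assumption matters only when an odds ratio is read as a risk ratio, not when it is read as a rate ratio under density (concurrent) sampling. A minimal numeric illustration of the first point, using hypothetical risks chosen only to make the pattern visible, is below.

```python
# Illustration: the odds ratio approximates the risk ratio only when the
# outcome is rare. The risks below are hypothetical, chosen for illustration.
def odds(p):
    return p / (1 - p)

for risk_unexposed, risk_exposed in [(0.001, 0.002), (0.05, 0.10), (0.25, 0.50)]:
    rr = risk_exposed / risk_unexposed               # risk ratio = 2 in each case
    or_ = odds(risk_exposed) / odds(risk_unexposed)  # odds ratio
    print(f"risks {risk_unexposed:.3f} vs {risk_exposed:.3f}: RR = {rr:.2f}, OR = {or_:.2f}")

# Output: OR is about 2.00 when the outcome is rare, but drifts to 2.11 and
# 3.00 as the outcome becomes common - the rare disease assumption in action.
```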

Relevance: 30.00%

Abstract:

BACKGROUND: The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. METHODS: We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. RESULTS: During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (320 patients), and Kaposi sarcoma (308 patients). The greatest mortality hazard ratios were associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). CONCLUSIONS: In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. The proposed classification of ADEs may be useful in clinical end point trials, prognostic studies, and patient management.
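
A minimal sketch of the kind of cohort-stratified Cox proportional hazards model described above, using the lifelines package on a small synthetic dataset. All variable names and values are invented for illustration; none of the study's covariates or data are reproduced.

```python
# Sketch only: a cohort-stratified Cox proportional hazards model of the kind
# described above, fitted on a small synthetic dataset (all values invented).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "months_to_death_or_censor": rng.exponential(40, n).round(1),
    "died": rng.integers(0, 2, n),
    "nhl": rng.integers(0, 2, n),          # e.g. non-Hodgkin's lymphoma as first ADE
    "age": rng.normal(40, 10, n).round(),
    "baseline_cd4": rng.normal(200, 80, n).round(),
    "cohort": rng.integers(0, 5, n),       # stratification variable
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_death_or_censor", event_col="died",
        strata=["cohort"])                 # stratify by cohort, as described above
print(cph.summary)                         # exp(coef) column gives hazard ratios
```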

Relevance: 30.00%

Abstract:

HYPOTHESIS: Clinically apparent surgical glove perforation increases the risk of surgical site infection (SSI). DESIGN: Prospective observational cohort study. SETTING: University Hospital Basel, with an average of 28,000 surgical interventions per year. PARTICIPANTS: Consecutive series of 4147 surgical procedures performed in the Visceral Surgery, Vascular Surgery, and Traumatology divisions of the Department of General Surgery. MAIN OUTCOME MEASURES: The outcome of interest was SSI occurrence as assessed according to Centers for Disease Control and Prevention standards. The primary predictor variable was compromised asepsis due to glove perforation. RESULTS: The overall SSI rate was 4.5% (188 of 4147 procedures). Univariate logistic regression analysis showed a higher likelihood of SSI in procedures in which gloves were perforated compared with interventions with maintained asepsis (odds ratio [OR], 2.0; 95% confidence interval [CI], 1.4-2.8; P < .001). However, multivariate logistic regression analyses showed that the increase in SSI risk with perforated gloves differed between procedures with and without surgical antimicrobial prophylaxis (test for effect modification, P = .005). Without antimicrobial prophylaxis, glove perforation entailed significantly higher odds of SSI compared with the reference group with no breach of asepsis (adjusted OR, 4.2; 95% CI, 1.7-10.8; P = .003). In contrast, when surgical antimicrobial prophylaxis was applied, the likelihood of SSI was not significantly higher for operations in which gloves were punctured (adjusted OR, 1.3; 95% CI, 0.9-1.9; P = .26). CONCLUSION: Without surgical antimicrobial prophylaxis, glove perforation increases the risk of SSI.
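
The reported "test for effect modification" typically corresponds to an interaction term between glove perforation and antimicrobial prophylaxis in the logistic model. A minimal sketch of that structure with statsmodels on synthetic data (all variable names and values invented) is below.

```python
# Sketch: testing effect modification by including a perforation x prophylaxis
# interaction term in a logistic regression (synthetic data, invented values).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "perforation": rng.integers(0, 2, n),
    "prophylaxis": rng.integers(0, 2, n),
})
# Simulate SSI with a stronger perforation effect when no prophylaxis is given.
logit = -3.0 + 1.4 * df.perforation - 1.1 * df.perforation * df.prophylaxis
df["ssi"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("ssi ~ perforation * prophylaxis", data=df).fit(disp=False)
print(model.summary())
# The p-value of the perforation:prophylaxis term is the effect-modification
# test; np.exp(model.params) gives the corresponding odds ratios.
```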

Relevance: 30.00%

Abstract:

BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT, preoperative anemia, and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery, and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control and Prevention standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, the associations were substantially reduced for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310 and OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for 1-2 blood units and ≥3 blood units, respectively) and anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSI.

Relevance: 30.00%

Abstract:

BACKGROUND: Elevated plasma fibrinogen levels have prospectively been associated with an increased risk of coronary artery disease in different populations. Plasma fibrinogen is a measure of systemic inflammation crucially involved in atherosclerosis. The vagus nerve curtails inflammation via a cholinergic antiinflammatory pathway. We hypothesized that lower vagal control of the heart relates to higher plasma fibrinogen levels. METHODS: Study participants were 559 employees (age 17-63 years; 89% men) of an airplane manufacturing plant in southern Germany. All subjects underwent medical examination, blood sampling, and 24-hour ambulatory heart rate recording while kept on their work routine. The root mean square of successive differences in RR intervals during the night period (nighttime RMSSD) was computed as the heart rate variability index of vagal function. RESULTS: After controlling for demographic, lifestyle, and medical factors, nighttime RMSSD explained 1.7% (P = 0.001), 0.8% (P = 0.033), and 7.8% (P = 0.007), respectively, of the variance in fibrinogen levels in all subjects, in men, and in women. Nighttime RMSSD and fibrinogen levels were more strongly correlated in women than in men. In all workers, men, and women, respectively, there was a mean +/- SEM increase in fibrinogen of 0.41 +/- 0.13 mg/dL, 0.28 +/- 0.13 mg/dL, and 1.16 +/- 0.41 mg/dL for each one-millisecond decrease in nighttime RMSSD. CONCLUSIONS: Reduced vagal outflow to the heart correlated with elevated plasma fibrinogen levels independently of established cardiovascular risk factors. This relationship appeared stronger in women than in men. Such an autonomic mechanism might contribute to the atherosclerotic process and its thrombotic complications.
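
RMSSD, the vagal heart rate variability index used above, is simply the root mean square of successive differences between adjacent RR intervals. A minimal sketch with made-up RR intervals:

```python
# RMSSD: root mean square of successive differences between RR intervals (ms).
import numpy as np

def rmssd(rr_intervals_ms):
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return np.sqrt(np.mean(diffs ** 2))

# Example with made-up nighttime RR intervals (milliseconds):
rr = [812, 845, 790, 860, 835, 820, 805, 850]
print(f"nighttime RMSSD = {rmssd(rr):.1f} ms")
```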

Relevance: 30.00%

Abstract:

In all European Union countries, chemical residues are required to be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, resulting in low rates of drug residue detection under random sampling. An alternative approach is to target monitoring at farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling increased detection efficiency by up to 100%, depending on the prevalence of contaminated herds. Sensitivity analysis of this model showed the importance of accurate prior assumptions when conducting risk-based sampling. The resources gained by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
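
The random-versus-targeted comparison can be illustrated with a toy two-strata simulation. The herd counts, prevalences, and size of the suspect stratum below are invented for illustration and are not the inputs of the Swiss stochastic model described above.

```python
# Toy two-strata simulation: detection of residue-positive herds under random
# versus risk-based sampling. All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

n_herds = 10_000
n_sampled = 500        # sampling budget (herds tested)
n_suspect = 1_000      # herds flagged by risk factors (e.g. antimicrobial purchases)
p_pos_suspect = 0.10   # prevalence of contamination among suspect herds
p_pos_other = 0.01     # prevalence among the remaining herds

herd_is_suspect = np.arange(n_herds) < n_suspect
herd_positive = np.where(herd_is_suspect,
                         rng.random(n_herds) < p_pos_suspect,
                         rng.random(n_herds) < p_pos_other)

# Random sampling: draw herds uniformly from the whole population.
random_idx = rng.choice(n_herds, size=n_sampled, replace=False)

# Risk-based sampling: spend the budget on flagged (suspect) herds
# (the budget is smaller than the suspect stratum in this toy example).
risk_idx = rng.choice(n_suspect, size=n_sampled, replace=False)

print("random detections:    ", herd_positive[random_idx].sum())
print("risk-based detections:", herd_positive[risk_idx].sum())
```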

Relevance: 30.00%

Abstract:

BACKGROUND Infectious diseases and social contacts in early life have been proposed to modulate brain tumour risk during late childhood and adolescence. METHODS CEFALO is an interview-based case-control study in Denmark, Norway, Sweden and Switzerland, including children and adolescents aged 7-19 years with primary intracranial brain tumours diagnosed between 2004 and 2008 and matched population controls. RESULTS The study included 352 cases (participation rate: 83%) and 646 controls (71%). There was no association with various measures of social contact: daycare attendance, number of child-hours at daycare, attending baby groups, birth order or living with other children. Cases of glioma and embryonal tumours had more frequent sick days with infections in the first 6 years of life compared with controls. In 7-19 year olds with 4+ monthly sick days, the respective odds ratios were 2.93 (95% confidence interval: 1.57-5.50) and 4.21 (95% confidence interval: 1.24-14.30). INTERPRETATION There was little support for the hypothesis that social contacts influence childhood and adolescent brain tumour risk. The association between reported sick days due to infections and the risk of glioma and embryonal tumours may reflect involvement of immune functions, recall bias or reverse causality and deserves further attention.

Relevance: 30.00%

Abstract:

Few studies have addressed the interaction between instruction content and saccadic eye movement control. To assess the impact of instructions on top-down control, we instructed 20 healthy volunteers to deliberately delay saccade triggering, to make inaccurate saccades, or to redirect saccades, i.e. to glance towards the target and then immediately to the opposite side. Regular pro- and antisaccade tasks were used for comparison. Bottom-up visual input remained unchanged, a gap paradigm being used for all instructions. In the inaccuracy and delay tasks, both latency and accuracy were impaired by either type of instruction, and the variability of latency and accuracy was increased. The intersaccadic interval (ISI) required to correct erroneous antisaccades was shorter than the ISI for instructed direction changes in the redirection task. The exact instruction content thus interferes with top-down saccade control. Top-down control is a time-consuming process, which may override bottom-up processing only during a limited time period. It is questionable whether parallel processing is possible in top-down control, since the long ISI for instructed direction changes suggests sequential planning.

Relevance: 30.00%

Abstract:

Objectives: Etravirine (ETV) is metabolized by cytochrome P450 (CYP) 3A, 2C9, and 2C19. Metabolites are glucuronidated by uridine diphosphate glucuronosyltransferases (UGTs). To identify the potential impact of genetic and non-genetic factors involved in ETV metabolism, we carried out a two-step pharmacogenetics-based population pharmacokinetic study in HIV-1 infected individuals. Materials and methods: The study population included 144 individuals contributing 289 ETV plasma concentrations and four individuals contributing 23 ETV plasma concentrations collected in a rich sampling design. Genetic variants [n=125 single-nucleotide polymorphisms (SNPs)] in 34 genes with a predicted role in ETV metabolism were selected. A first-step population pharmacokinetic model included non-genetic and known genetic factors (seven SNPs in CYP2C, one SNP in CYP3A5) as covariates. Post-hoc individual ETV clearance (CL) was used in a second (discovery) step, in which the effect of the remaining 98 SNPs in CYP3A, P450 cytochrome oxidoreductase (POR), nuclear receptor genes, and UGTs was investigated. Results: A one-compartment model with zero-order absorption best characterized ETV pharmacokinetics. The average ETV CL was 41 l/h (CV 51.1%), the volume of distribution was 1325 l, and the mean absorption time was 1.2 h. The administration of darunavir/ritonavir or tenofovir was the only non-genetic covariate significantly influencing ETV CL, resulting in a 40% [95% confidence interval (CI): 13–69%] and a 42% (95% CI: 17–68%) increase in ETV CL, respectively. Carriers of rs4244285 (CYP2C19*2) had 23% (8–38%) lower ETV CL. Co-administered antiretroviral agents and genetic factors explained 16% of the variance in ETV concentrations. None of the SNPs in the discovery step influenced ETV CL. Conclusion: ETV concentrations are highly variable, and co-administered antiretroviral agents and genetic factors explained only a modest part of the interindividual variability in ETV elimination. Opposing effects of interacting drugs can effectively abrogate genetic influences on ETV CL, and vice versa.
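
A minimal sketch of the structural model named above, a one-compartment model with zero-order absorption, using the reported typical values (CL = 41 l/h, V = 1325 l, mean absorption time = 1.2 h). The 200 mg dose and the assumption that the zero-order input lasts twice the mean absorption time are illustrative additions, not taken from the study.

```python
# Sketch: one-compartment model with zero-order absorption, using the typical
# values reported above. The dose is hypothetical, and equating the zero-order
# input duration with twice the mean absorption time is an assumption.
import numpy as np

CL = 41.0               # clearance, l/h
V = 1325.0              # volume of distribution, l
D_abs = 2 * 1.2         # assumed zero-order absorption duration, h
dose = 200.0            # mg, hypothetical single dose
ke = CL / V             # elimination rate constant, 1/h
rate_in = dose / D_abs  # zero-order input rate, mg/h

def concentration(t):
    """Plasma concentration (mg/l) at time t (h) after a single dose."""
    t = np.asarray(t, dtype=float)
    during = rate_in / CL * (1 - np.exp(-ke * t))      # while drug is being absorbed
    c_end = rate_in / CL * (1 - np.exp(-ke * D_abs))   # at end of absorption
    after = c_end * np.exp(-ke * (t - D_abs))          # post-absorption decay
    return np.where(t <= D_abs, during, after)

times = np.array([0.5, 1.0, 2.4, 6.0, 12.0, 24.0])
print(dict(zip(times.tolist(), np.round(concentration(times), 4))))
```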

Relevance: 30.00%

Abstract:

Although persons infected with human immunodeficiency virus (HIV), particularly men who have sex with men, are at excess risk for anal cancer, it has been difficult to disentangle the influences of anal exposure to human papillomavirus (HPV) infection, immunodeficiency, and combined antiretroviral therapy. A case-control study that included 59 anal cancer cases and 295 individually matched controls was nested in the Swiss HIV Cohort Study (1988-2011). In a subset of 41 cases and 114 controls, HPV antibodies were tested. A majority of anal cancer cases (73%) were men who have sex with men. Current smoking was significantly associated with anal cancer (odds ratio (OR) = 2.59, 95% confidence interval (CI): 1.25, 5.34), as were antibodies against L1 (OR = 4.52, 95% CI: 2.00, 10.20) and E6 (OR = ∞, 95% CI: 4.64, ∞) of HPV16, as well as low CD4+ cell counts, whether measured at nadir (OR per 100-cell/μL decrease = 1.53, 95% CI: 1.18, 2.00) or at cancer diagnosis (OR per 100-cell/μL decrease = 1.24, 95% CI: 1.08, 1.42). However, the influence of CD4+ cell counts appeared to be strongest 6-7 years prior to anal cancer diagnosis (OR for <200 vs. ≥500 cells/μL = 14.0, 95% CI: 3.85, 50.9). Smoking cessation and avoidance of even moderate levels of immunosuppression appear to be important in reducing long-term anal cancer risks.
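
Because the CD4+ odds ratios above are reported per 100-cell/μL decrease, they combine multiplicatively for larger decreases (log-linearity is implicit in that style of reporting). A short illustration:

```python
# Per-unit odds ratios scale multiplicatively: an OR of 1.53 per 100-cell/uL
# decrease in nadir CD4+ implies, under the log-linear dose-response implied
# by that reporting, the following ORs for larger decreases (illustration only).
or_per_100 = 1.53
for drop in (100, 200, 300, 400):
    print(f"{drop:>3} cells/uL lower nadir CD4+: OR = {or_per_100 ** (drop / 100):.2f}")
# -> 1.53, 2.34, 3.58, 5.48
```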

Relevance: 30.00%

Abstract:

Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) decrease in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
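
A minimal sketch of the modelling approach named above, a linear mixed model for repeated blood pressure measurements with a random intercept per patient, using statsmodels on synthetic data. All names and values are invented; this is not Swiss HIV Cohort Study data.

```python
# Sketch: linear mixed model for repeated systolic blood pressure measurements,
# with a random intercept per patient (synthetic data, all values invented).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_patients, n_visits = 200, 6
pid = np.repeat(np.arange(n_patients), n_visits)
years = np.tile(np.arange(n_visits) * 0.5, n_patients)    # semi-annual visits
patient_effect = rng.normal(0, 8, n_patients)[pid]        # between-patient variation
sbp = 150 - 0.8 * years + patient_effect + rng.normal(0, 6, pid.size)

df = pd.DataFrame({"patient": pid, "years_on_treatment": years, "sbp": sbp})
model = smf.mixedlm("sbp ~ years_on_treatment", data=df, groups=df["patient"]).fit()
print(model.summary())   # the years_on_treatment coefficient is the mean annual change
```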