942 results for Variable Sampling Interval Control Charts


Relevance: 30.00%

Publisher:

Abstract:

A novel solution to the long-standing problem of chip entanglement and breakage in metal cutting is presented in this dissertation. This work attempts to achieve universal chip control in machining by using chip guidance and subsequent breakage by backward bending (tensile loading of the chip's rough top surface) to reduce long continuous chips to small segments. A major limitation of the chip breaker geometries on disposable carbide inserts is that their application range is confined to a narrow band of cutting conditions. Even within the recommended operating range, chip breakers do not function as designed because of the inherent variability of the cutting process. Moreover, for a particular process, matching the chip breaker geometry with the right cutting conditions to achieve effective chip control is a highly iterative exercise, and the large variety of proprietary chip breaker designs further complicates the implementation of a robust and comprehensive chip control technique. To address the need for a robust and universal chip control technique, a new method is proposed in this work: a single tool top-form geometry, coupled with a tooling system for inducing chip breaking by backward bending, achieves comprehensive chip control over a wide range of cutting conditions. A geometry-based model is developed to predict a variable edge inclination angle that guides the chip flow to a predetermined target location. Chip kinematics for the new tool geometry are examined via photographic evidence from experimental cutting trials, using both qualitative and quantitative characterization methods. Results from the chip characterization studies indicate that chip flow and final chip form are remarkably consistent across multiple workpiece and tool configurations as well as cutting conditions. A new tooling system is then designed to break the chip comprehensively by backward bending. Test results with the new tooling system show that, by exploiting the chip guidance and backward bending mechanism, long continuous chips can be broken consistently into smaller segments that are generally deemed acceptable or good chips. The proposed tool can be applied effectively over a wider range of cutting conditions than existing chip breakers, possibly taking the first step toward universal chip control in machining.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. METHODS: We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. RESULTS: During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (in 320 patients), and Kaposi sarcoma (in 308 patients). The greatest mortality hazard ratios were associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). CONCLUSIONS: In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. The proposed classification of ADEs may be useful in clinical end point trials, prognostic studies, and patient management.
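
As a concrete illustration of the survival analysis described above, the following minimal sketch fits a cohort-stratified Cox proportional hazards model in Python with the lifelines library. The file name and column names are hypothetical; the study does not report its software or variable coding.

```python
# Minimal sketch of a cohort-stratified Cox model, assuming the
# `lifelines` library; file and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ade_cohort.csv")  # hypothetical: one row per patient

cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="months_followup",  # follow-up time in months
    event_col="died",                # 1 = death observed, 0 = censored
    strata=["cohort"],               # stratification by cohort, as in the study
    formula="C(ade_type) + sex + transmission_group + n_drugs + regimen "
            "+ age + start_year + baseline_cd4 + baseline_rna",
)
cph.print_summary()  # hazard ratios with 95% confidence intervals
```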

Relevance: 30.00%

Publisher:

Abstract:

HYPOTHESIS: Clinically apparent surgical glove perforation increases the risk of surgical site infection (SSI). DESIGN: Prospective observational cohort study. SETTING: University Hospital Basel, with an average of 28,000 surgical interventions per year. PARTICIPANTS: Consecutive series of 4147 surgical procedures performed in the Visceral Surgery, Vascular Surgery, and Traumatology divisions of the Department of General Surgery. MAIN OUTCOME MEASURES: The outcome of interest was SSI occurrence as assessed pursuant to the Centers for Disease Control and Prevention standards. The primary predictor variable was compromised asepsis due to glove perforation. RESULTS: The overall SSI rate was 4.5% (188 of 4147 procedures). Univariate logistic regression analysis showed a higher likelihood of SSI in procedures in which gloves were perforated than in interventions with maintained asepsis (odds ratio [OR], 2.0; 95% confidence interval [CI], 1.4-2.8; P < .001). However, multivariate logistic regression analyses showed that the increase in SSI risk with perforated gloves differed between procedures with and without surgical antimicrobial prophylaxis (test for effect modification, P = .005). Without antimicrobial prophylaxis, glove perforation entailed significantly higher odds of SSI compared with the reference group with no breach of asepsis (adjusted OR, 4.2; 95% CI, 1.7-10.8; P = .003). By contrast, when surgical antimicrobial prophylaxis was applied, the likelihood of SSI was not significantly higher for operations in which gloves were punctured (adjusted OR, 1.3; 95% CI, 0.9-1.9; P = .26). CONCLUSION: Without surgical antimicrobial prophylaxis, glove perforation increases the risk of SSI.
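
The effect-modification test reported above (P = .005) amounts to an interaction term in the logistic model. A hedged sketch, assuming Python with statsmodels and hypothetical column names:

```python
# Sketch of the interaction (effect-modification) test; column names
# are hypothetical, and the original covariate set is not reproduced.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ssi_procedures.csv")  # hypothetical: one row per procedure

# `perforation * prophylaxis` expands to both main effects plus their
# interaction; the interaction's p-value is the effect-modification test.
model = smf.logit("ssi ~ perforation * prophylaxis", data=df).fit()
print(model.summary())
print(np.exp(model.params))  # coefficients exponentiated to odds ratios
```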

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT and preoperative anemia and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control and Prevention standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, the associations were substantially reduced for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310 for 1-2 blood units; OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for ≥3 blood units) and anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSIs.
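
The contrast between crude and adjusted odds ratios that drives this conclusion can be outlined as follows; this is an illustrative sketch with hypothetical column names, not the study's actual 13-covariate model.

```python
# Crude vs. adjusted odds ratios for transfusion and SSI; hypothetical
# column names, and only a subset of plausible adjustment covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("abt_cohort.csv")  # hypothetical: one row per procedure

# Crude OR from the 2x2 table of transfusion (0/1) by SSI (0/1)
tab = pd.crosstab(df["abt"], df["ssi"])
crude_or = (tab.loc[1, 1] * tab.loc[0, 0]) / (tab.loc[1, 0] * tab.loc[0, 1])
print(f"crude OR: {crude_or:.2f}")

# Adjusted ORs: transfusion in dose categories (0, 1-2, >=3 units) plus
# confounders; duration of surgery was the main confounder in the study
adj = smf.logit(
    "ssi ~ C(abt_units) + anemia + duration_min + age",
    data=df,
).fit()
print(np.exp(adj.params))  # adjusted odds ratios
```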

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Elevated plasma fibrinogen levels have prospectively been associated with an increased risk of coronary artery disease in different populations. Plasma fibrinogen is a measure of systemic inflammation crucially involved in atherosclerosis. The vagus nerve curtails inflammation via a cholinergic antiinflammatory pathway. We hypothesized that lower vagal control of the heart relates to higher plasma fibrinogen levels. METHODS: Study participants were 559 employees (age 17-63 years; 89% men) of an airplane manufacturing plant in southern Germany. All subjects underwent medical examination, blood sampling, and 24-hour ambulatory heart rate recording while kept on their work routine. The root mean square of successive differences in RR intervals during the night period (nighttime RMSSD) was computed as the heart rate variability index of vagal function. RESULTS: After controlling for demographic, lifestyle, and medical factors, nighttime RMSSD explained 1.7% (P = 0.001), 0.8% (P = 0.033), and 7.8% (P = 0.007), respectively, of the variance in fibrinogen levels in all subjects, men, and women. Nighttime RMSSD and fibrinogen levels were more strongly correlated in women than in men. In all workers, men, and women, respectively, there was a mean ± SEM increase of 0.41 ± 0.13 mg/dL, 0.28 ± 0.13 mg/dL, and 1.16 ± 0.41 mg/dL fibrinogen for each millisecond decrease in nighttime RMSSD. CONCLUSIONS: Reduced vagal outflow to the heart correlated with elevated plasma fibrinogen levels independently of established cardiovascular risk factors. This relationship appeared stronger in women than in men. Such an autonomic mechanism might contribute to the atherosclerotic process and its thrombotic complications.
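
The RMSSD index named above has a simple closed form: the square root of the mean of squared differences between successive RR intervals. A short sketch in Python (the numbers in the example call are made up):

```python
# RMSSD: root mean square of successive RR-interval differences.
import numpy as np

def rmssd(rr_ms):
    """RMSSD (ms) from a 1-D sequence of RR intervals in milliseconds."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Illustrative nighttime RR segment (ms); real input would be the full
# night period of a 24-hour ambulatory recording
print(rmssd([812, 830, 825, 841, 818, 833]))
```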

Relevance: 30.00%

Publisher:

Abstract:

These studies were designed to determine whether continuous intravenous infusion of increasing dosages of porcine relaxin during late pregnancy in beef heifers would influence circulating blood concentrations of relaxin, progesterone, and oxytocin, and the time of onset of parturition. Beef heifers were bred by artificial insemination and, on Day 277, fitted with indwelling jugular cannulas for hormone infusion and blood sampling from Day 277 to 286. Intravenous infusion of purified porcine relaxin (pRLX, 3000 U/mg) was started in heifers (n = 8) at increasing dosages (200 U/h on Days 277 and 278, 300 U/h on Days 279 and 280, 500 U/h on Day 281, 600 U/h on Day 282, and 700 U/h on Days 283 to 286). Phosphate-buffered saline (PBS, 10 ml/h) was infused during these same times in control (n = 6) animals. Relaxin treatment steadily increased the circulating plasma concentration of immunoreactive relaxin to more than 120 ng/ml, compared with less than 0.5 ng/ml in PBS-treated controls. Relaxin infusion in increasing dosages over the treatment time was associated with a significant decrease (P < 0.01) in plasma progesterone concentration compared with the PBS controls. Plasma levels of oxytocin sampled at 4-hour intervals remained similar (P > 0.05) during the pretreatment period and throughout continuous infusion of pRLX and PBS. Although continuous intravenous infusion of relaxin resulted in a decrease in circulating blood levels of progesterone, it did not significantly reduce the interval between the beginning of pRLX treatment and parturition compared with the PBS-infused control heifers. These results indicate that continuous intravenous infusion of high levels of porcine relaxin decreased progesterone secretion in late pregnant beef heifers.

Relevance: 30.00%

Publisher:

Abstract:

Financial, economic, and biological data collected from cow-calf producers who participated in the Illinois and Iowa Standardized Performance Analysis (SPA) programs were used in this study. Data were collected for the 1996 through 1999 calendar years, with each herd within a year representing one observation. This resulted in a final database of 225 observations (117 from Iowa and 108 from Illinois) from commercial herds ranging in size from 20 to 373 cows. Two analyses were conducted, one utilizing financial cost of production data, the other economic cost of production data. Each observation was analyzed as the difference from the mean for that given year. The dependent variable in both the financial and economic models, used as the indicator of profit, was return to unpaid labor and management per cow (RLM). The independent variables included the five factors that make up total annual cow cost: feed cost, operating cost, depreciation cost, capital charge, and hired labor, all on an annual per-cow basis. In the economic analysis, family labor was also included. Production factors evaluated as independent variables in both models were calf weight, calf price, cull weight, cull price, weaning percentage, and calving distribution. Herd size and investment were also analyzed. All financial factors analyzed were significantly correlated with RLM (P < .10) except cull weight and cull price. All economic factors analyzed were significantly correlated with RLM (P < .10) except calf weight, cull weight, and cull price. Results of the financial prediction equation indicate that eight measurements are capable of explaining over 82 percent of the farm-to-farm variation in RLM. Feed cost is the overriding factor driving RLM in both the financial and economic stepwise regression analyses: in both, over 50 percent of the herd-to-herd variation in RLM could be explained by feed cost. Financial feed cost is correlated (P < .001) with operating cost, depreciation cost, and investment. Economic feed cost is correlated (P < .001) with investment and operating cost, as well as capital charge. Operating cost, depreciation, and capital charge were all negatively correlated (P < .10) with herd size and positively correlated (P < .01) with feed cost in both analyses. Operating costs were positively correlated with capital charge and investment (P < .01) in both analyses. In the financial regression model, depreciation cost was the second critical factor, explaining almost 9 percent of the herd-to-herd variation in RLM, followed by operating cost (5 percent). Calf weight had a greater impact than calf price on RLM in both the financial and economic regression models; it was the fourth indicator of RLM in the financial model and similar in magnitude to operating cost. Investment was not a significant variable in either regression model; however, it was highly correlated with a number of the significant cost variables, including feed cost, depreciation cost, and operating cost (P < .001, financial; P < .10, economic). Cost factors were far more influential in driving RLM than production, reproduction, or producer-controlled marketing factors, and of these cost factors, feed cost had by far the largest impact. As producers focus attention on factors that affect the profitability of the operation, feed cost is the most critical control point because it was responsible for over 50 percent of the herd-to-herd variation in profit.
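
The within-year centering and profit regression described above can be outlined in a short sketch; the file and column names are hypothetical, and ordinary least squares stands in for the stepwise procedure actually used.

```python
# Sketch of within-year centering followed by a profit regression;
# hypothetical column names, OLS in place of the stepwise selection.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("spa_herds.csv")  # hypothetical: one row per herd-year

# Each observation analyzed as its difference from that year's mean
cols = ["rlm", "feed_cost", "operating_cost", "depreciation",
        "capital_charge", "hired_labor", "calf_weight", "calf_price"]
centered = df[cols] - df.groupby(df["year"])[cols].transform("mean")

# Regress return to unpaid labor and management (RLM) per cow on the
# annual per-cow cost factors and selected production factors
model = smf.ols(
    "rlm ~ feed_cost + operating_cost + depreciation + capital_charge "
    "+ hired_labor + calf_weight + calf_price",
    data=centered,
).fit()
print(model.rsquared)  # the study reports R^2 > 0.82 with eight measures
```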

Relevance: 30.00%

Publisher:

Abstract:

In all European Union countries, chemical residues are required to be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, resulting in a low detection of drug residues through random sampling. An alternative approach is to target-monitor farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling showed an increase in detection efficiency of up to 100% depending on the prevalence of contaminated herds. Sensitivity analysis of this model showed the importance of the accuracy of prior assumptions for conducting risk-based sampling. The resources gained by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
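
A toy Monte Carlo version of the random versus risk-based comparison is sketched below. The herd counts, prevalence, and targeting weights are illustrative assumptions, not the parameters of the Swiss model.

```python
# Toy Monte Carlo comparison of random vs. risk-based residue sampling;
# all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_herds, prevalence, n_samples = 10_000, 0.02, 500

contaminated = rng.random(n_herds) < prevalence

# Random sampling: every herd equally likely to be tested
random_hits = contaminated[rng.choice(n_herds, n_samples, replace=False)].sum()

# Risk-based sampling: doubled testing weight for contaminated herds,
# a crude stand-in for risk scores derived from prior information (in
# practice the weight would come from risk factors, not the true status)
weights = np.where(contaminated, 2.0, 1.0)
weights /= weights.sum()
risk_hits = contaminated[
    rng.choice(n_herds, n_samples, replace=False, p=weights)
].sum()

print(random_hits, risk_hits)  # risk-based detects more contaminated herds
```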

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND Infectious diseases and social contacts in early life have been proposed to modulate brain tumour risk during late childhood and adolescence. METHODS CEFALO is an interview-based case-control study in Denmark, Norway, Sweden and Switzerland, including children and adolescents aged 7-19 years with primary intracranial brain tumours diagnosed between 2004 and 2008 and matched population controls. RESULTS The study included 352 cases (participation rate: 83%) and 646 controls (71%). There was no association with various measures of social contacts: daycare attendance, number of child-hours at daycare, attending baby groups, birth order or living with other children. Cases of glioma and embryonal tumours had more frequent sick days with infections in the first 6 years of life compared with controls. In 7-19-year-olds with four or more sick days per month, the respective odds ratios were 2.93 (95% confidence interval: 1.57-5.50) and 4.21 (95% confidence interval: 1.24-14.30). INTERPRETATION There was little support for the hypothesis that social contacts influence childhood and adolescent brain tumour risk. The association between reported sick days due to infections and the risk of glioma and embryonal tumours may reflect involvement of immune functions, recall bias or inverse causality, and deserves further attention.

Relevance: 30.00%

Publisher:

Abstract:

Few studies have addressed the interaction between instruction content and saccadic eye movement control. To assess the impact of instructions on top-down control, we instructed 20 healthy volunteers to deliberately delay saccade triggering, to make inaccurate saccades, or to redirect saccades, i.e., to glance towards and then immediately opposite to the target. Regular pro- and antisaccade tasks were used for comparison. Bottom-up visual input remained unchanged and was a gap paradigm for all instructions. In the inaccuracy and delay tasks, both latencies and accuracies were detrimentally affected by either type of instruction, and the variability of latency and accuracy increased. The intersaccadic interval (ISI) required to correct erroneous antisaccades was shorter than the ISI for instructed direction changes in the redirection task. The word-by-word content of an instruction thus interferes with top-down saccade control. Top-down control is a time-consuming process, which may override bottom-up processing only during a limited time period. It is questionable whether parallel processing is possible in top-down control, since the long ISI for instructed direction changes suggests sequential planning.

Relevance: 30.00%

Publisher:

Abstract:

Objectives: Etravirine (ETV) is metabolized by cytochrome P450 (CYP) 3A, 2C9, and 2C19. Metabolites are glucuronidated by uridine diphosphate glucuronosyltransferases (UGT). To identify the potential impact of genetic and non-genetic factors involved in ETV metabolism, we carried out a two-step pharmacogenetics-based population pharmacokinetic study in HIV-1-infected individuals. Materials and methods: The study population included 144 individuals contributing 289 ETV plasma concentrations and four individuals contributing 23 ETV plasma concentrations collected in a rich sampling design. Genetic variants [n = 125 single-nucleotide polymorphisms (SNPs)] in 34 genes with a predicted role in ETV metabolism were selected. A first-step population pharmacokinetic model included non-genetic and known genetic factors (seven SNPs in CYP2C, one SNP in CYP3A5) as covariates. Post hoc individual ETV clearance (CL) was used in a second (discovery) step, in which the effect of the remaining 98 SNPs in CYP3A, P450 cytochrome oxidoreductase (POR), nuclear receptor genes, and UGTs was investigated. Results: A one-compartment model with zero-order absorption best characterized ETV pharmacokinetics. The average ETV CL was 41 l/h (CV 51.1%), the volume of distribution was 1325 l, and the mean absorption time was 1.2 h. The administration of darunavir/ritonavir or tenofovir was the only non-genetic covariate influencing ETV CL significantly, resulting in a 40% [95% confidence interval (CI): 13-69%] and a 42% (95% CI: 17-68%) increase in ETV CL, respectively. Carriers of rs4244285 (CYP2C19*2) had 23% (8-38%) lower ETV CL. Co-administered antiretroviral agents and genetic factors explained 16% of the variance in ETV concentrations. None of the SNPs in the discovery step influenced ETV CL. Conclusion: ETV concentrations are highly variable, and co-administered antiretroviral agents and genetic factors explained only a modest part of the interindividual variability in ETV elimination. Opposing effects of interacting drugs effectively abrogate genetic influences on ETV CL, and vice versa.
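
The reported structural model can be turned into a small simulation sketch. The clearance and volume are the population estimates given above; the dose and the zero-order input duration (2.4 h, implied by the reported mean absorption time of 1.2 h, since MAT equals half the input duration for a zero-order process) are assumptions.

```python
# One-compartment model with zero-order absorption, using the typical
# values reported above; the dose is an illustrative assumption.
import numpy as np

CL, V = 41.0, 1325.0   # l/h, l (population estimates from the study)
ke = CL / V            # first-order elimination rate constant, 1/h
dose, dur = 200.0, 2.4 # mg, h -- assumed dose; duration implied by MAT
k0 = dose / dur        # zero-order input rate, mg/h

def conc(t):
    """Plasma concentration (mg/l) at time t (h) after a single dose."""
    if t <= dur:  # during the zero-order input
        return k0 / CL * (1 - np.exp(-ke * t))
    # after the input stops: first-order decline from the end-of-input level
    return k0 / CL * (1 - np.exp(-ke * dur)) * np.exp(-ke * (t - dur))

print([round(conc(t), 4) for t in (1, 2, 6, 12, 24)])
```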

Relevance: 30.00%

Publisher:

Abstract:

Although persons infected with human immunodeficiency virus (HIV), particularly men who have sex with men, are at excess risk for anal cancer, it has been difficult to disentangle the influences of anal exposure to human papillomavirus (HPV) infection, immunodeficiency, and combined antiretroviral therapy. A case-control study that included 59 anal cancer cases and 295 individually matched controls was nested in the Swiss HIV Cohort Study (1988-2011). In a subset of 41 cases and 114 controls, HPV antibodies were tested. A majority of anal cancer cases (73%) were men who have sex with men. Current smoking was significantly associated with anal cancer (odds ratio (OR) = 2.59, 95% confidence interval (CI): 1.25, 5.34), as were antibodies against L1 (OR = 4.52, 95% CI: 2.00, 10.20) and E6 (OR = ∞, 95% CI: 4.64, ∞) of HPV16, as well as low CD4+ cell counts, whether measured at nadir (OR per 100-cell/μL decrease = 1.53, 95% CI: 1.18, 2.00) or at cancer diagnosis (OR per 100-cell/μL decrease = 1.24, 95% CI: 1.08, 1.42). However, the influence of CD4+ cell counts appeared to be strongest 6-7 years prior to anal cancer diagnosis (OR for <200 vs. ≥500 cells/μL = 14.0, 95% CI: 3.85, 50.9). Smoking cessation and avoidance of even moderate levels of immunosuppression appear to be important in reducing long-term anal cancer risks.

Relevance: 30.00%

Publisher:

Abstract:

Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg, respectively, on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) decrease in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg/yr and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increase in systolic blood pressure [hazard ratio, 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
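
A hedged sketch of the repeated-measures model named above, assuming Python's statsmodels and hypothetical column names (the study's actual covariate set is longer):

```python
# Linear mixed model for blood pressure over time; hypothetical file
# and column names, reduced covariate set.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("shcs_bp.csv")  # hypothetical: one row per clinic visit

# Random intercept and slope per patient; the fixed effect of
# `years_on_treatment` estimates the annual change in systolic BP
# (reported as -0.82 mm Hg/yr in treated patients).
model = smf.mixedlm(
    "systolic_bp ~ years_on_treatment + baseline_bp + ckd + age",
    data=df,
    groups=df["patient_id"],
    re_formula="~years_on_treatment",
).fit()
print(model.summary())
```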

Relevance: 30.00%

Publisher:

Abstract:

Due to the widespread development of anthelmintic resistance in equine parasites, recommendations for their control are currently undergoing marked changes, with a shift of emphasis toward more coprological surveillance and reduced treatment intensity. Denmark was the first nation to introduce prescription-only restrictions on anthelmintic drugs in 1999, but other European countries have implemented similar legislation over recent years. A questionnaire survey was performed in 2008 among Danish horse owners to provide a current status of practices and perceptions relating to parasite control. Questions aimed at describing the current use of coprological surveillance and resulting anthelmintic treatment intensities, evaluating knowledge and perceptions about the importance of various attributes of parasite control, and assessing respondents' willingness to pay for advice and parasite surveillance services from their veterinarians. A total of 1060 respondents completed the questionnaire. A large majority of respondents (71.9%) were familiar with the concept of selective therapy. Results illustrated that the respondents' self-evaluation of their knowledge about parasites and their control associated significantly with their level of interest in the topic and their type of education (P<0.0001). The large majority of respondents either dewormed their horses twice a year and/or performed two fecal egg counts per horse per year. This approach was almost equally pronounced in foals, horses aged 1-3 years, and adult horses. The respondents rated prevention of parasitic disease and prevention of drug resistance as the most important attributes, while cost and frequent fecal testing were rated least important. Respondents' actual spending on parasite control per horse in the previous year correlated significantly with the amount they declared themselves willing to spend (P<0.0001). However, 44.4% declared themselves willing to pay more than what they were spending. Altogether, results indicate that respondents were generally familiar with equine parasites and the concept of selective therapy, although there was some confusion over the terms small and large strongyles. They used a large degree of fecal surveillance in all age groups, with a majority of respondents sampling and/or treating around twice a year. Finally, respondents appeared willing to spend money on parasite control for their horses. It is of concern that the survey suggested that foals and young horses are treated in a manner very similar to adult horses, which is against current recommendations. Thus, the survey illustrates the importance of clear communication of guidelines for equine parasite control.

Relevance: 30.00%

Publisher:

Abstract:

Voluntary control of information processing is crucial for allocating resources and prioritizing the processes that are most important in a given situation; the algorithms underlying such control, however, are often unclear. We investigated possible control algorithms for performance of the majority function, in which participants searched for and identified one of two alternative categories (left- or right-pointing arrows) as composing the majority in each stimulus set. We manipulated the amount (set size of 1, 3, and 5) and content (ratio of left- and right-pointing arrows within a set) of the inputs to test competing hypotheses regarding the mental operations underlying information processing. Using a novel measure based on computational load, we found that reaction time was best predicted by a grouping search algorithm compared with alternative algorithms (i.e., exhaustive or self-terminating search). The grouping search algorithm involves sampling and resampling of the inputs before a decision is reached. These findings highlight the importance of investigating the implications of voluntary control via algorithms of mental operations.
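
As an illustration of the sampling-and-resampling idea, a minimal sketch of one plausible grouping search is given below; the precise grouping rule modeled in the study may differ, and this version tracks sampling rounds as a stand-in for computational load rather than guaranteeing a correct answer.

```python
# One plausible grouping search for the majority function: sample a
# small group, decide if it is unanimous, otherwise resample. Assumes
# an odd set size so a majority always exists.
import random

random.seed(0)  # reproducible sketch

def majority_by_grouping(arrows, group_size=2):
    """Return (decision, rounds) for a list of 'L'/'R' arrow items."""
    rounds = 0
    while True:
        rounds += 1
        group = random.sample(arrows, min(group_size, len(arrows)))
        if len(set(group)) == 1:  # unanimous group -> decide
            return group[0], rounds
        # mixed group -> resample; less skewed ratios (e.g., 3:2) need
        # more rounds on average, mirroring slower observed responses

print(majority_by_grouping(["L", "L", "R", "L", "L"]))  # e.g., ('L', 2)
```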