981 results for intoxication hazard
Abstract:
PURPOSE: Updated results are presented after a median follow-up of 7.3 years from the phase III First-Line Indolent Trial of yttrium-90 ((90)Y)-ibritumomab tiuxetan in advanced-stage follicular lymphoma (FL) in first remission. PATIENTS AND METHODS: Patients with CD20(+) stage III or IV FL with complete response (CR), unconfirmed CR (CRu), or partial response (PR) after first-line induction treatment were randomly assigned to (90)Y-ibritumomab consolidation therapy (rituximab 250 mg/m(2) days -7 and 0, then (90)Y-ibritumomab 14.8 MBq/kg day 0; maximum 1,184 MBq) or no further treatment (control). The primary end point was progression-free survival (PFS) from the date of random assignment. RESULTS: For 409 patients available for analysis ((90)Y-ibritumomab, n = 207; control, n = 202), estimated 8-year overall PFS was 41% with (90)Y-ibritumomab versus 22% for control (hazard ratio [HR], 0.47; P < .001). For patients in CR/CRu after induction, 8-year PFS with (90)Y-ibritumomab was 48% versus 32% for control (HR, 0.61; P = .008), and for PR patients, it was 33% versus 10% (HR, 0.38; P < .001). For (90)Y-ibritumomab consolidation, median PFS was 4.1 years (v 1.1 years for control; P < .001). Median time to next treatment (TTNT) was 8.1 years for (90)Y-ibritumomab versus 3.0 years for control (P < .001), with approximately 80% response rates to second-line therapy in either arm, including autologous stem-cell transplantation. No unexpected toxicities emerged during long-term follow-up. Estimated between-group 8-year overall survival rates were similar. The annualized incidence rate of myelodysplastic syndrome/acute myeloblastic leukemia was 0.50% versus 0.07% in the (90)Y-ibritumomab and control groups, respectively (P = .042). CONCLUSION: (90)Y-ibritumomab consolidation after achieving PR or CR/CRu to induction confers a 3-year benefit in median PFS, with a durable 19% PFS advantage at 8 years, and improves TTNT by 5.1 years for patients with advanced FL.
Abstract:
BACKGROUND: High-dose chemotherapy with autologous stem-cell transplantation is a standard treatment for young patients with multiple myeloma. Residual disease is almost always present after transplantation and is responsible for relapse. This phase 3, placebo-controlled trial investigated the efficacy of lenalidomide maintenance therapy after transplantation. METHODS: We randomly assigned 614 patients younger than 65 years of age who had nonprogressive disease after first-line transplantation to maintenance treatment with either lenalidomide (10 mg per day for the first 3 months, increased to 15 mg if tolerated) or placebo until relapse. The primary end point was progression-free survival. RESULTS: Lenalidomide maintenance therapy improved median progression-free survival (41 months, vs. 23 months with placebo; hazard ratio, 0.50; P<0.001). This benefit was observed across all patient subgroups, including those based on the β(2)-microglobulin level, cytogenetic profile, and response after transplantation. With a median follow-up period of 45 months, more than 70% of patients in both groups were alive at 4 years. The rates of grade 3 or 4 peripheral neuropathy were similar in the two groups. The incidence of second primary cancers was 3.1 per 100 patient-years in the lenalidomide group versus 1.2 per 100 patient-years in the placebo group (P=0.002). Median event-free survival (with events that included second primary cancers) was significantly improved with lenalidomide (40 months, vs. 23 months with placebo; P<0.001). CONCLUSIONS: Lenalidomide maintenance after transplantation significantly prolonged progression-free and event-free survival among patients with multiple myeloma. Four years after randomization, overall survival was similar in the two study groups. (Funded by the Programme Hospitalier de Recherche Clinique and others; ClinicalTrials.gov number, NCT00430365.).
Abstract:
BACKGROUND: The aim of this study was to explore the predictive value of longitudinal self-reported adherence data on viral rebound. METHODS: Individuals in the Swiss HIV Cohort Study on combined antiretroviral therapy (cART) with RNA <50 copies/ml over the previous 3 months and who were interviewed about adherence at least once prior to 1 March 2007 were eligible. Adherence was defined in terms of missed doses of cART (0, 1, 2 or >2) in the previous 28 days. Viral rebound was defined as RNA >500 copies/ml. Cox regression models with time-independent and -dependent covariates were used to evaluate time to viral rebound. RESULTS: A total of 2,664 individuals and 15,530 visits were included. Across all visits, missed doses were reported as follows: 1 dose 14.7%, 2 doses 5.1%, >2 doses 3.8%, taking <95% of doses 4.5%, and missing ≥2 consecutive doses 3.2%. In total, 308 (11.6%) patients experienced viral rebound. After controlling for confounding variables, self-reported non-adherence remained significantly associated with the rate of occurrence of viral rebound (compared with zero missed doses: 1 dose, hazard ratio [HR] 1.03, 95% confidence interval [CI] 0.72-1.48; 2 doses, HR 2.17, 95% CI 1.46-3.25; >2 doses, HR 3.66, 95% CI 2.50-5.34). Several variables significantly associated with an increased risk of viral rebound irrespective of adherence were identified: being on a protease inhibitor or triple nucleoside regimen (compared with a non-nucleoside reverse transcriptase inhibitor), >5 previous cART regimens, seeing a less-experienced physician, taking co-medication, and a shorter time virally suppressed. CONCLUSIONS: A simple self-report adherence questionnaire repeatedly administered provides a sensitive measure of non-adherence that predicts viral rebound.
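The Cox models with time-dependent covariates described above can be sketched in code. The following is an illustrative Python example using the lifelines package (the study does not state which software was used); the data are simulated, and the column names, visit spacing, and effect sizes are assumptions chosen only to mirror the structure of the analysis, not the cohort's actual data.

```python
# Illustrative only: simulated long-format data (one row per patient-interval,
# adherence updated at each visit), fitted with a time-varying Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for pid in range(200):
    t = 0.0
    while t < 24.0:                                    # up to 24 months of follow-up
        missed = rng.choice([0, 1, 2, 3], p=[0.76, 0.15, 0.05, 0.04])  # doses missed in past 28 days (3 = ">2")
        true_hr = {0: 1.0, 1: 1.0, 2: 2.2, 3: 3.7}[missed]             # assumed true effects
        gap = min(rng.exponential(60.0 / true_hr), 6.0)                # event time, or next visit at 6 months
        rows.append(dict(id=pid, start=t, stop=t + gap,
                         missed_1=int(missed == 1), missed_2=int(missed == 2),
                         missed_gt2=int(missed == 3), rebound=int(gap < 6.0)))
        if gap < 6.0:
            break
        t += 6.0

df = pd.DataFrame(rows)
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="rebound", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) gives the hazard ratio of each missed-dose category vs. none
```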
Abstract:
BACKGROUND: Pathological complete response (pCR) following chemotherapy is strongly associated with both breast cancer subtype and long-term survival. Within a phase III neoadjuvant chemotherapy trial, we sought to determine whether the prognostic implications of pCR, TP53 status and treatment arm (taxane versus non-taxane) differed between intrinsic subtypes. PATIENTS AND METHODS: Patients were randomized to receive either six cycles of anthracycline-based chemotherapy or three cycles of docetaxel, then three cycles of epirubicin/docetaxel (T-ET). pCR was defined as no evidence of residual invasive cancer (or very few scattered tumour cells) in primary tumour and lymph nodes. We used a simplified intrinsic subtypes classification, as suggested by the 2011 St Gallen consensus. Interactions between pCR, TP53 status, treatment arm and intrinsic subtype on event-free survival (EFS), distant metastasis-free survival (DMFS) and overall survival (OS) were studied using landmark and two-step multivariate analyses. RESULTS: Sufficient data for pCR analyses were available in 1212 (65%) of 1856 patients randomized. pCR occurred in 222 of 1212 (18%) patients: 37 of 496 (7.5%) luminal A, 22 of 147 (15%) luminal B/HER2 negative, 51 of 230 (22%) luminal B/HER2 positive, 43 of 118 (36%) HER2 positive/non-luminal, 69 of 221 (31%) triple negative (TN). The prognostic effect of pCR on EFS did not differ between subtypes, and pCR was an independent predictor for better EFS [hazard ratio (HR) = 0.40, P < 0.001 in favour of pCR], DMFS (HR = 0.32, P < 0.001) and OS (HR = 0.32, P < 0.001). Chemotherapy arm was an independent predictor only for EFS (HR = 0.73, P = 0.004 in favour of T-ET). The interaction between TP53, intrinsic subtypes and survival outcomes only approached statistical significance for EFS (P = 0.1). CONCLUSIONS: pCR is an independent predictor of favourable clinical outcomes in all molecular subtypes in a two-step multivariate analysis. CLINICALTRIALS.GOV: EORTC 10994/BIG 1-00 trial; registration number NCT00017095.
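The "landmark" element of the analysis above deserves a brief illustration: because pCR is only known after treatment, survival is compared from a landmark time at which every analysed patient is still event-free, with the clock restarted at that point. A minimal sketch in Python with the lifelines package follows; the data frame, column names, and the choice of landmark are hypothetical, not taken from the EORTC 10994/BIG 1-00 protocol.

```python
# Schematic landmark Cox analysis: keep patients event-free at the landmark,
# restart the clock there, then relate pCR status to subsequent survival.
import pandas as pd
from lifelines import CoxPHFitter

def landmark_cox(df: pd.DataFrame, landmark: float,
                 time_col: str = "efs_months", event_col: str = "event") -> CoxPHFitter:
    at_risk = df[df[time_col] > landmark].copy()        # still event-free and in follow-up
    at_risk[time_col] = at_risk[time_col] - landmark     # time measured from the landmark
    cph = CoxPHFitter()
    cph.fit(at_risk[[time_col, event_col, "pCR"]],
            duration_col=time_col, event_col=event_col)
    return cph

# Example use (hypothetical data): landmark_cox(patients, landmark=6.0).print_summary()
```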
Abstract:
This thesis presents three empirical studies in the field of health insurance in Switzerland. First, we investigate the link between health insurance coverage and health care expenditures. We use claims data for over 60 000 adult individuals covered by a major Swiss Health Insurance Fund, followed for four years; the data show a strong positive correlation between coverage and expenditures. Two methods are developed and estimated in order to separate selection effects (due to individual choice of coverage) and incentive effects ("ex post moral hazard"). The first method uses the comparison between inpatient and outpatient expenditures to identify both effects, and we conclude that both selection and incentive effects are significantly present in our data. The second method is based on a structural model of joint demand for health care and health insurance and exploits the change in the marginal cost of health care to identify selection and incentive effects. We conclude that the correlation between insurance coverage and health care expenditures may be decomposed into the two effects: 75% may be attributed to selection and 25% to incentive effects. Moreover, we estimate that a decrease in the coinsurance rate from 100% to 10% increases the marginal demand for health care by about 90%, and from 100% to 0% by about 150%. Secondly, having shown that selection and incentive effects exist in the Swiss health insurance market, we present the consequences of this result in the context of risk adjustment. We show that if individuals choose their insurance coverage as a function of their health status (selection effect), the optimal compensations should be a function of the selection and incentive effects. Therefore, a risk adjustment mechanism that ignores these effects, as is currently the case in Switzerland, will miss its main goal of eliminating incentives for sickness funds to select risks. Using a simplified model, we show that, in the case of self-selection, the optimal compensations have to take into account the distribution of risks across the insurance plans in order to avoid incentives to select risks. Then, we apply our propositions to Swiss data and propose a simple econometric procedure to control for self-selection in the estimation of the risk adjustment formula in order to compute the optimal compensations.
Abstract:
Due to various contexts and processes, forensic science communities may have different approaches, largely influenced by their criminal justice systems. However, forensic science practices share some common characteristics. One is the assurance of a high (scientific) quality within processes and practices. For most crime laboratory directors and forensic science associations, this issue is conditioned by the triangle of quality, which represents the current paradigm of quality assurance in the field. It consists of the implementation of standardization, certification, accreditation, and an evaluation process. It constitutes a clear and sound way to exchange data between laboratories and enables databasing due to standardized methods ensuring reliable and valid results; but it is also a means of defining minimum requirements for practitioners' skills for specific forensic science activities. The control of each of these aspects offers non-forensic science partners the assurance that the entire process has been mastered and is trustworthy. Most of the standards focus on the analysis stage and do not consider pre- and post-laboratory stages, namely, the work achieved at the investigation scene and the evaluation and interpretation of the results, intended for intelligence beneficiaries or for court. Such localized consideration prevents forensic practitioners from identifying where the problems really lie with regard to criminal justice systems. According to a performance-management approach, scientific quality should not be restricted to standardized procedures and controls in forensic science practice. Ensuring high quality also strongly depends on the way a forensic science culture is assimilated (into specific education, training, and workplaces) and on the way practitioners understand forensic science as a whole.
Abstract:
A high heart rate (HR) predicts future cardiovascular events. We explored the predictive value of HR in patients with high-risk hypertension and examined whether blood pressure reduction modifies this association. The participants were 15,193 patients with hypertension enrolled in the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) trial and followed up for 5 years. The HR was assessed from electrocardiographic recordings obtained annually throughout the study period. The primary end point was the interval to cardiac events. After adjustment for confounders, the hazard ratio of the composite cardiac primary end point for a 10-beats/min increment in baseline HR was 1.16 (95% confidence interval 1.12 to 1.20). Compared to the lowest HR quintile, the adjusted hazard ratio in the highest quintile was 1.73 (95% confidence interval 1.46 to 2.04). Compared to the pooled lower quintiles of baseline HR, the annual incidence of the primary end point in the top baseline quintile was greater in each of the 5 study years (all p <0.05). The adjusted hazard ratio for the primary end point in the highest in-trial HR quintile versus the lowest quintile was 1.53 (95% confidence interval 1.26 to 1.85). The incidence of primary end points in the highest in-trial HR group compared to the pooled 4 lower quintiles was 53% greater in patients with well-controlled blood pressure (p <0.001) and 34% greater in those with uncontrolled blood pressure (p = 0.002). In conclusion, an increased HR is a long-term predictor of cardiovascular events in patients with high-risk hypertension. This effect was not modified by good blood pressure control. It is not yet known whether a therapeutic reduction of HR would improve cardiovascular prognosis.
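For readers unfamiliar with how per-increment hazard ratios such as the 1.16 per 10 beats/min above are obtained: the Cox model returns a log-hazard coefficient per unit of the covariate, and multiplying it by the increment before exponentiating rescales the hazard ratio. A short numerical sketch follows; the per-beat coefficient below is simply back-calculated from the published figure, for illustration only.

```python
import math

# Back-calculate the per-beat/min log-hazard coefficient from HR 1.16 per 10 beats/min,
# then rescale it to other increments.  Purely illustrative arithmetic.
beta_per_bpm = math.log(1.16) / 10.0
print(round(math.exp(10 * beta_per_bpm), 2))   # 1.16  (per 10 beats/min, by construction)
print(round(math.exp(20 * beta_per_bpm), 2))   # 1.35  (per 20 beats/min)
```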
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are, therefore, of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. The use of a dynamic modeling approach in order to allow multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes in an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. While man-machine interactions are modeled as triggering events for the state spaces of the machines, the CREAM cognitive behavior model is used in order to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flow and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework. The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
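The MORM model itself is expressed in the CO-OPN object-oriented Petri net formalism, which is far richer than can be shown here. Purely as an illustration of the underlying token-game idea (places hold tokens; a transition fires when its input places are sufficiently marked, consuming and producing tokens), here is a minimal Python sketch with hypothetical "machine" and "operator" places; it is not the CO-OPN tool or the MORM model.

```python
# Minimal Petri-net sketch: basic token-game semantics only.
from dataclasses import dataclass

@dataclass
class PetriNet:
    marking: dict        # place -> token count
    transitions: dict    # name -> (input place->count, output place->count)

    def enabled(self, name: str) -> bool:
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name: str) -> None:
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Two interconnected "machines"; the operator's action is the triggering event
# for the press state space (hypothetical example, not from the MORM model).
net = PetriNet(
    marking={"part_waiting": 1, "press_idle": 1, "operator_ready": 1},
    transitions={
        "load_part": ({"part_waiting": 1, "press_idle": 1}, {"press_loaded": 1}),
        "operate":   ({"press_loaded": 1, "operator_ready": 1},
                      {"part_done": 1, "press_idle": 1}),
    },
)
net.fire("load_part")
net.fire("operate")
print(net.marking)
# {'part_waiting': 0, 'press_idle': 1, 'operator_ready': 0, 'press_loaded': 0, 'part_done': 1}
```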
Abstract:
Δ9-Tetrahydrocannabinol (THC) is frequently found in the blood of drivers suspected of driving under the influence of cannabis or involved in traffic crashes. The present study used a double-blind crossover design to compare the effects of medium (16.5 mg THC) and high doses (45.7 mg THC) of hemp milk decoctions or of a medium dose of dronabinol (20 mg synthetic THC, Marinol) on several skills required for safe driving. Forensic interpretation of cannabinoid blood concentrations was attempted using the models proposed by Daldrup (cannabis influencing factor, or CIF) and by Huestis and coworkers. First, the time-concentration profiles of THC, 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC, the active metabolite of THC), and 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) in whole blood were determined by gas chromatography-mass spectrometry-negative ion chemical ionization. Compared to smoking studies, relatively low concentrations were measured in blood. The highest mean THC concentration (8.4 ng/mL) was achieved 1 h after ingestion of the strongest decoction. The mean maximum 11-OH-THC level (12.3 ng/mL) slightly exceeded that of THC. THCCOOH reached its highest mean concentration (66.2 ng/mL) 2.5-5.5 h after intake. Individual blood levels showed considerable intersubject variability. The willingness to drive was influenced by the importance of the requested task. Under significant cannabinoid influence, the participants refused to drive when they were asked whether they would agree to accomplish several unimportant tasks (e.g., driving a friend to a party). Most of the participants reported a significant feeling of intoxication and did not appreciate the effects, notably those felt after drinking the strongest decoction. Road sign and tracking testing revealed obvious and statistically significant differences between placebo and treatments. A marked impairment was detected after ingestion of the strongest decoction. A CIF value greater than 10 (the CIF relies on the molar ratio of the main active to inactive cannabinoids) was found to correlate with a strong feeling of intoxication; it also matched a significant decrease in the willingness to drive and a significant impairment in tracking performance. The mathematical model II proposed by Huestis et al. (1992) provided at best a rough estimate of the time of oral administration, with 27% of actual values falling outside the 95% confidence interval. The sum of THC and 11-OH-THC blood concentrations provided a better estimate of impairment than THC alone. This controlled clinical study points out the negative influence of medium or high oral doses of THC or dronabinol on fitness to drive.
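Any molar index such as the CIF starts from converting the whole-blood mass concentrations above (ng/mL) into molar concentrations using the cannabinoids' molecular weights. The short sketch below does only that conversion for the peak mean values reported in the abstract; it is not the Daldrup CIF formula itself (which, among other details, distinguishes free from glucuronide-bound THCCOOH), so its output is not comparable to the CIF threshold of 10 quoted above.

```python
# Convert whole-blood cannabinoid concentrations from ng/mL to nmol/L.
# Molecular weights in g/mol; peak mean concentrations taken from the abstract.
MW = {"THC": 314.5, "11-OH-THC": 330.5, "THCCOOH": 344.4}
peaks_ng_ml = {"THC": 8.4, "11-OH-THC": 12.3, "THCCOOH": 66.2}

def nmol_per_l(ng_per_ml: float, mw: float) -> float:
    return ng_per_ml / mw * 1000.0          # ng/mL = µg/L; (µg/L) / (g/mol) = µmol/L

molar = {k: nmol_per_l(v, MW[k]) for k, v in peaks_ng_ml.items()}
active_sum = molar["THC"] + molar["11-OH-THC"]   # "sum of THC and 11-OH-THC" in molar terms
print({k: round(v, 1) for k, v in molar.items()}, round(active_sum, 1))
# {'THC': 26.7, '11-OH-THC': 37.2, 'THCCOOH': 192.2} 63.9
```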
Abstract:
BACKGROUND: The risk of falls is the most commonly cited reason for not providing oral anticoagulation, although the risk of bleeding associated with falls on oral anticoagulants is still debated. We aimed to evaluate whether patients on oral anticoagulation with high falls risk have an increased risk of major bleeding. METHODS: We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants. The outcome was the time to a first major bleed within a 12-month follow-up period adjusted for age, sex, alcohol abuse, number of drugs, concomitant treatment with antiplatelet agents, and history of stroke or transient ischemic attack. RESULTS: Among the 515 enrolled patients, 35 patients had a first major bleed during follow-up (incidence rate: 7.5 per 100 patient-years). Overall, 308 patients (59.8%) were at high risk of falls, and these patients had a nonsignificantly higher crude incidence rate of major bleeding than patients at low risk of falls (8.0 vs 6.8 per 100 patient-years, P=.64). In multivariate analysis, a high falls risk was not statistically significantly associated with the risk of a major bleed (hazard ratio 1.09; 95% confidence interval, 0.54-2.21). Overall, only 3 major bleeds occurred directly after a fall (incidence rate: 0.6 per 100 patient-years). CONCLUSIONS: In this prospective cohort, patients on oral anticoagulants at high risk of falls did not have a significantly increased risk of major bleeds. These findings suggest that being at risk of falls is not a valid reason to avoid oral anticoagulants in medical patients.
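The crude incidence rates quoted above are simply events divided by accumulated follow-up time, expressed per 100 patient-years. A tiny worked sketch follows; the person-time below is back-calculated from the 35 bleeds and the 7.5 per 100 patient-years rate, since the abstract does not give it explicitly.

```python
def rate_per_100_py(events: int, patient_years: float) -> float:
    # crude incidence rate per 100 patient-years of follow-up
    return 100.0 * events / patient_years

print(round(rate_per_100_py(35, 467.0), 1))   # ≈ 7.5 major bleeds per 100 patient-years
```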
Abstract:
BACKGROUND: This study was undertaken to determine whether use of the direct renin inhibitor aliskiren would reduce cardiovascular and renal events in patients with type 2 diabetes and chronic kidney disease, cardiovascular disease, or both. METHODS: In a double-blind fashion, we randomly assigned 8561 patients to aliskiren (300 mg daily) or placebo as an adjunct to an angiotensin-converting-enzyme inhibitor or an angiotensin-receptor blocker. The primary end point was a composite of the time to cardiovascular death or a first occurrence of cardiac arrest with resuscitation; nonfatal myocardial infarction; nonfatal stroke; unplanned hospitalization for heart failure; end-stage renal disease, death attributable to kidney failure, or the need for renal-replacement therapy with no dialysis or transplantation available or initiated; or doubling of the baseline serum creatinine level. RESULTS: The trial was stopped prematurely after the second interim efficacy analysis. After a median follow-up of 32.9 months, the primary end point had occurred in 783 patients (18.3%) assigned to aliskiren as compared with 732 (17.1%) assigned to placebo (hazard ratio, 1.08; 95% confidence interval [CI], 0.98 to 1.20; P=0.12). Effects on secondary renal end points were similar. Systolic and diastolic blood pressures were lower with aliskiren (between-group differences, 1.3 and 0.6 mm Hg, respectively) and the mean reduction in the urinary albumin-to-creatinine ratio was greater (between-group difference, 14 percentage points; 95% CI, 11 to 17). The proportion of patients with hyperkalemia (serum potassium level, ≥6 mmol per liter) was significantly higher in the aliskiren group than in the placebo group (11.2% vs. 7.2%), as was the proportion with reported hypotension (12.1% vs. 8.3%) (P<0.001 for both comparisons). CONCLUSIONS: The addition of aliskiren to standard therapy with renin-angiotensin system blockade in patients with type 2 diabetes who are at high risk for cardiovascular and renal events is not supported by these data and may even be harmful. (Funded by Novartis; ALTITUDE ClinicalTrials.gov number, NCT00549757.).
Abstract:
Tort claims resulting from alleged highway defects have introduced an additional element in the planning, design, construction, and maintenance of highways. A survey of county governments in Iowa was undertaken in order to quantify the magnitude and determine the nature of this problem. This survey included the use of mailed questionnaires and personal interviews with County Engineers. Highway-related claims filed against counties in Iowa amounted to about $52,000,000 during the period 1973 through 1978. Over $30,000,000 in claims was pending at the end of 1978. Settlements or judgments were made at a cost of 12.2% of the amount claimed for those claims that had been disposed of, not including costs for handling claims, attorney fees, or court costs. There was no clear time trend in the amount of claims for the six-year period surveyed, although the amount claimed in 1978 was about double the average for the preceding five years. Problems that resulted in claims for damages from counties have generally related to alleged omissions in the use of traffic control devices or defects, often temporary, resulting from alleged inadequacies in highway maintenance. The absence of stop signs or warning signs often has been the central issue in a highway-related tort claim. Maintenance problems most frequently alleged have included inadequate shoulders, surface roughness, ice or snow conditions, and loose gravel. The variation in the occurrence of tort claims among 85 counties in Iowa could not be related to any of the explanatory variables that were tested. Claims appeared to have occurred randomly. However, using data from a subsample of 11 counties, a significant relationship was shown probably to exist between the amount of tort claims and the extensiveness of use of warning signs on the respective county road systems. Although there was no indication in any county that their use of warning signs did not conform with provisions of the Manual on Uniform Traffic Control Devices (Federal Highway Administration, Government Printing Office, Washington, D.C., 1978), many more warning signs were used in some counties than would be required to satisfy this minimum requirement. Sign vandalism reportedly is a problem in all counties. The threat of vandalism and the added costs incurred thereby have tended to inhibit more extensive use of traffic control devices. It also should be noted that there is no indication from this research of a correlation between the intensiveness of sign usage and highway safety. All highway maintenance activities introduce some extraordinary hazard for motorists. Generally effective methodologies have evolved for use on county road systems for routine maintenance activities, procedures that tend to reduce the hazard to practical and reasonably acceptable levels. Blading of loose-surfaced roads is an example of such a routine maintenance activity. Alternative patterns for blading that were investigated as part of this research offered no improvements in safety when compared with the method in current use and introduced a significant additional cost that was unacceptable, given the existing limitations in resources available for county roads.
Abstract:
Refractory status epilepticus (RSE), that is, seizures resistant to at least two antiepileptic drugs (AEDs), is generally managed with barbiturates, propofol, or midazolam, despite a low level of evidence (Rossetti, 2007). When this approach fails, the need for alternative pharmacologic and nonpharmacologic strategies emerges. These have been investigated even less systematically than the aforementioned compounds, and are often used, sometimes in succession, in cases of extreme refractoriness (Robakis & Hirsch, 2006). Several possibilities are reviewed here. In view of the marked heterogeneity of reported information, etiologies, ages, and comedications, it is extremely difficult to evaluate a given method, let alone to compare different strategies among them. Pharmacologic Approaches: Isoflurane and desflurane may complete the armamentarium of anesthetics, and should be employed in a 'closed' environment in order to prevent intoxication of treating personnel. γ-Aminobutyric acid type A (GABAA) receptor potentiation represents the putative mechanism of action. In an earlier report, isoflurane was used for up to 55 h in nine patients, controlling seizures in all; mortality was, however, 67% (Kofke et al., 1989). More recently, the use of these inhalational anesthetics was described in seven subjects with RSE, for up to 26 days, with an end-tidal concentration of 1.2-5%. All patients required vasopressors, and paralytic ileus occurred in three; outcome was fatal in three patients (43%) (Mirsattari et al., 2004). Ketamine, known as an emergency anesthetic because of its favorable hemodynamic profile, is an N-methyl-D-aspartate (NMDA) antagonist; the interest in its use in RSE derives from animal work showing loss of GABAA efficacy and maintained NMDA sensitivity in prolonged status epilepticus (Mazarati & Wasterlain, 1999). However, to avoid possible neurotoxicity, it appears safer to combine ketamine with GABAergic compounds (Jevtovic-Todorovic et al., 2001; Ubogu et al., 2003), also because of a likely synergistic effect (Martin & Kapur, 2008). There are few reported cases in humans, describing progressive dosages up to 7.5 mg/kg/h for several days (Sheth & Gidal, 1998; Quigg et al., 2002; Pruss & Holtkamp, 2008), with moderate outcomes. Paraldehyde acts through a yet-unidentified mechanism and appears to be relatively safe in terms of cardiovascular tolerability (Ramsay, 1989; Thulasimani & Ramaswamy, 2002), but because of the risk of crystal formation and its reactivity with plastic, it should be used only as a freshly prepared solution in glass devices (Beyenburg et al., 2000). There are virtually no recent reports regarding its use in adult RSE, whereas rectal paraldehyde in children with status epilepticus resistant to benzodiazepines seems less efficacious than intravenous phenytoin (Chin et al., 2008). Etomidate is another anesthetic agent whose exact mechanism of action is unknown; it is also relatively favorable regarding cardiovascular side effects and may be used for rapid sedation. Its use in RSE was reported in eight subjects (Yeoman et al., 1989). After a bolus of 0.3 mg/kg, a drip of up to 7.2 mg/kg/h for up to 12 days was administered, with hypotension occurring in five patients; two patients died. A reversible inhibition of cortisol synthesis represents an important concern, limiting its widespread use and implying careful hormonal substitution during treatment (Beyenburg et al., 2000). Several nonsedating approaches have been reported.
The use of lidocaine, a class Ib antiarrhythmic agent modulating sodium channels, in RSE was reviewed in 1997 (Walker & Slovis, 1997). Initial boluses up to 5 mg/kg and perfusions of up to 6 mg/kg/h have been mentioned; somewhat surprisingly, at times lidocaine seemed to be successful in controlling seizures in patients who were refractory to phenytoin. The aforementioned dosages should not be exceeded, in order to keep lidocaine levels under 5 mg/L and avoid seizure induction (Hamano et al., 2006). A recent pediatric retrospective survey on 57 RSE episodes (37 patients) described a response in 36% and no major adverse events; mortality was not given (Hamano et al., 2006). Verapamil, a calcium-channel blocker, also inhibits P-glycoprotein, a multidrug transporter that may diminish AED availability in the brain (Potschka et al., 2002). Few case reports on its use in humans are available; this medication nevertheless appears relatively safe (under cardiac monitoring) up to dosages of 360 mg/day (Iannetti et al., 2005). Magnesium, a widely used agent for seizures elicited by eclampsia, has also been anecdotally reported in RSE (Fisher et al., 1988; Robakis & Hirsch, 2006), but with scarce results even at serum levels of 14 mM. The rationale may be found in the physiologic blockade of NMDA channels by magnesium ions (Hope & Blumenfeld, 2005). The ketogenic diet has been prescribed for decades, mostly in children, to control refractory seizures. Its use in RSE as 'ultima ratio' has been occasionally described: three of six children (Francois et al., 2003) and one adult (Bodenant et al., 2008) were responders. This approach displays its effect subacutely over several days to a few weeks. Because 'malignant RSE' seems at times to be the consequence of immunologic processes (Holtkamp et al., 2005), a course of immunomodulatory treatment is often advocated in this setting, even in the absence of definite autoimmune etiologies (Robakis & Hirsch, 2006); steroids, adrenocorticotropic hormone (ACTH), plasma exchanges, or intravenous immunoglobulins may be used alone or in sequential combination. Nonpharmacologic Approaches: These strategies are described somewhat less frequently than pharmacologic approaches. Acute implantation of vagus nerve stimulation (VNS) has been reported in RSE (Winston et al., 2001; Patwardhan et al., 2005; De Herdt et al., 2009). Stimulation was usually initiated in the operating room, and intensity progressively adapted over a few days up to 1.25 mA (with various regimens regarding the other parameters), allowing subacute seizure control; one transitory episode of bradycardia/asystole has been described (De Herdt et al., 2009). Of course, pending identification of a definite seizure focus, resective surgery may also be considered in selected cases (Lhatoo & Alexopoulos, 2007). Low-frequency (0.5 Hz) transcranial magnetic stimulation (TMS) at 90% of the resting motor threshold has been reported to be successful for about 2 months in a patient with epilepsia partialis continua, but with a waning effect afterward, implying the need for repetitive use (Misawa et al., 2005). More recently, TMS was applied in a combination of short 'priming' high-frequency stimulation (up to 100 Hz) and longer runs of low-frequency stimulation (1 Hz) at 90-100% of the motor threshold in seven other patients with simple-partial status, with mixed results (Rotenberg et al., 2009). Paradoxically at first glance, electroconvulsive treatment may be found in cases of extremely resistant RSE.
A recent case report illustrates its use in an adult patient with convulsive status, with three sessions (three convulsions each) carried out over 3 days, resulting in a moderate recovery; the mechanism is believed to be related to modification of the synaptic release of neurotransmitters (Cline & Roos, 2007). Therapeutic hypothermia, which is increasingly used in postanoxic patients (Oddo et al., 2008), has been the object of a recent case series in RSE (Corry et al., 2008). Reduction of energy demand, excitatory neurotransmission, and neuroprotective effects may account for the putative mechanism of action. Four adult patients in RSE were cooled to 31-34°C with an endovascular system for up to 90 h, and then passively rewarmed over 2-50 h. Seizures were controlled in two patients, one of whom died; one of the other two patients, in whom seizures continued, also died subsequently. Possible side effects are related to acid-base and electrolyte disturbances, and coagulation dysfunction including thrombosis, infectious risks, cardiac arrhythmia, and paralytic ileus (Corry et al., 2008; Cereda et al., 2009). Finally, anecdotal evidence suggests that cerebrospinal fluid (CSF)-air exchange may induce some transitory benefit in RSE (Kohrmann et al., 2006); although this approach was already in use in the middle of the twentieth century, the mechanism is unknown. Conclusion: A wide spectrum of pharmacologic (sedating and nonsedating) and nonpharmacologic (surgical, or involving electrical stimulation) regimens might be applied to attempt RSE control. Their use should be considered only after refractoriness to AEDs or anesthetics displaying a higher level of evidence. Although it seems unlikely that these uncommon and scarcely studied strategies will influence the RSE outcome in a decisive way, some may be interesting in particular settings. However, because the main prognostic determinant in status epilepticus appears to be related to the underlying etiology rather than to the treatment approach (Rossetti et al., 2005, 2008), the safety issue should always represent a paramount concern for the prescribing physician. Disclosure: The author confirms that he has read the Journal's position on issues involved in ethical publication and affirms that this paper is consistent with those guidelines.
Abstract:
The objective of this research was to investigate the application of integrated risk modeling to operations and maintenance activities, specifically moving operations, such as pavement testing, pavement marking, painting, snow removal, shoulder work, mowing, and so forth. The ultimate goal is to reduce the frequency and intensity of loss events (property damage, personal injury, and fatality) during operations and maintenance activities. This report includes a literature review that identifies the current and common practices adopted by different state departments of transportation (DOTs) and other transportation agencies for safe and efficient highway operations and maintenance (O/M) activities. The final appendix to the report includes information for eight innovative O/M risk mitigation technologies/equipment and covers the following for these technologies/equipment: appropriate conditions for deployment; performance/effectiveness, depending on hazard/activity; cost to purchase; cost to operate and maintain; and availability (resources and references).
Abstract:
INTRODUCTION: We investigated whether mRNA levels of E2F1, a key transcription factor involved in proliferation, differentiation and apoptosis, could be used as a surrogate marker for the determination of breast cancer outcome. METHODS: E2F1 and other proliferation markers were measured by quantitative RT-PCR in 317 primary breast cancer patients from the Stiftung Tumorbank Basel. Correlations to one another as well as to the estrogen receptor and ERBB2 status and clinical outcome were investigated. Results were validated and further compared with expression-based prognostic profiles using The Netherlands Cancer Institute microarray data set reported by Fan and colleagues. RESULTS: E2F1 mRNA expression levels correlated strongly with the expression of other proliferation markers, and low values were mainly found in estrogen receptor-positive and ERBB2-negative phenotypes. Patients with low E2F1-expressing tumors were associated with favorable outcome (hazard ratio = 4.3 (95% confidence interval = 1.8-9.9), P = 0.001). These results were consistent in univariate and multivariate Cox analyses, and were successfully validated in The Netherlands Cancer Institute data set. Furthermore, E2F1 expression levels correlated well with the 70-gene signature displaying the ability of selecting a common subset of patients at good prognosis. Breast cancer patients' outcome was comparably predictable by E2F1 levels, by the 70-gene signature, by the intrinsic subtype gene classification, by the wound response signature and by the recurrence score. CONCLUSION: Assessment of E2F1 at the mRNA level in primary breast cancer is a strong determinant of breast cancer patient outcome. E2F1 expression identified patients at low risk of metastasis irrespective of the estrogen receptor and ERBB2 status, and demonstrated similar prognostic performance to different gene expression-based predictors.
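As an illustration of the kind of survival comparison described above (low versus high E2F1 mRNA), the sketch below dichotomises expression at the median and runs a log-rank test and a univariate Cox model with the Python lifelines package. The data frame, column names, and the median cutoff are assumptions for illustration; the study's actual cutoffs, covariates, and software are described in the paper.

```python
# Schematic: split patients by E2F1 mRNA level and compare outcome.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

def compare_by_e2f1(df: pd.DataFrame, expr_col: str = "e2f1_mrna",
                    time_col: str = "dmfs_years", event_col: str = "event"):
    high = df[expr_col] > df[expr_col].median()          # hypothetical cutoff at the median
    lr = logrank_test(df.loc[high, time_col], df.loc[~high, time_col],
                      event_observed_A=df.loc[high, event_col],
                      event_observed_B=df.loc[~high, event_col])
    cph = CoxPHFitter()
    cph.fit(df.assign(e2f1_high=high.astype(int))[[time_col, event_col, "e2f1_high"]],
            duration_col=time_col, event_col=event_col)
    return lr.p_value, float(cph.hazard_ratios_["e2f1_high"])

# Example use (hypothetical data): p_value, hr = compare_by_e2f1(patients)
```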