121 results for Event Study Methodology
Abstract:
BACKGROUND: Ischemic stroke is a leading cause of mortality worldwide and a major contributor to neurological disability and dementia. Terutroban is a specific TP receptor antagonist with antithrombotic, antivasoconstrictive, and antiatherosclerotic properties, which may be of interest for the secondary prevention of ischemic stroke. This article describes the rationale and design of the Prevention of cerebrovascular and cardiovascular Events of ischemic origin with teRutroban in patients with a history oF ischemic strOke or tRansient ischeMic Attack (PERFORM) Study, which aims to demonstrate the superiority of terutroban over aspirin for the secondary prevention of cerebrovascular and cardiovascular events. METHODS AND RESULTS: The PERFORM Study is a multicenter, randomized, double-blind, parallel-group study being carried out in 802 centers in 46 countries. The study population includes patients aged ≥55 years who have suffered an ischemic stroke (≤3 months) or a transient ischemic attack (≤8 days). Participants are randomly allocated to terutroban (30 mg/day) or aspirin (100 mg/day). The primary efficacy endpoint is a composite of ischemic stroke (fatal or nonfatal), myocardial infarction (fatal or nonfatal), or other vascular death (excluding hemorrhagic death of any origin). Safety is being evaluated by assessing hemorrhagic events. Follow-up is expected to last for 2-4 years. Assuming a relative risk reduction of 13%, the expected number of primary events is 2,340. To obtain statistical power of 90%, this requires inclusion of at least 18,000 patients in this event-driven trial. The first patient was randomized in February 2006. CONCLUSIONS: The PERFORM Study will explore the benefits and safety of terutroban in secondary cardiovascular prevention after a cerebral ischemic event.
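The required number of events in an event-driven trial like this one follows from a standard log-rank power calculation. As a rough, hedged illustration (not the trial's own computation, which involved additional design assumptions), a Schoenfeld-type approximation in Python gives a figure of the same order as the 2,340 quoted above:

```python
from math import log
from scipy.stats import norm

def required_events(hazard_ratio, alpha=0.05, power=0.90):
    """Schoenfeld approximation: events needed in a two-arm, 1:1 event-driven
    trial to detect a given hazard ratio with a two-sided log-rank test."""
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance
    z_beta = norm.ppf(power)
    return 4 * (z_alpha + z_beta) ** 2 / log(hazard_ratio) ** 2

# A 13% relative risk reduction corresponds roughly to HR = 0.87.
print(round(required_events(0.87)))   # ~2170 events under these simplified assumptions
```

With a hazard ratio of about 0.87, two-sided alpha of 0.05 and 90% power, this simplified formula yields roughly 2,200 events; the trial's figure of 2,340 reflects its more detailed assumptions.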
Abstract:
Caspofungin at standard dose was evaluated as first-line monotherapy of mycologically documented probable/proven invasive aspergillosis (IA) (unmodified European Organisation for Research and Treatment of Cancer/Mycosis Study Group criteria) in allogeneic hematopoietic SCT patients. The primary efficacy end point was complete or partial response at end of caspofungin treatment. Response at week 12, survival and safety were additional end points. Enrollment was stopped prematurely because of low accrual, with 42 enrolled and 24 eligible, giving the study a power of 85%. Transplant was from unrelated donors in 16 patients; acute or chronic GVHD was present in 15. In all, 12 patients were neutropenic (<500/µl) at baseline, 10 received steroids, and 16 received calcineurin inhibitors or sirolimus. Median duration of caspofungin treatment was 24 days. At the end of caspofungin therapy, 10 (42%) patients had complete or partial response (95% confidence interval: 22-63%); 1 (4%) and 12 (50%) had stable and progressing disease, respectively; one was not evaluable. At week 12, eight patients (33%) had complete or partial response. Survival rates at week 6 and 12 were 79 and 50%, respectively. No patient had a drug-related serious adverse event or discontinued because of toxicity. Caspofungin first-line therapy was effective and well tolerated in allogeneic hematopoietic SCT patients with mycologically documented IA.
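The headline response rate and its interval are consistent with an exact binomial calculation; a minimal check, assuming a Clopper-Pearson interval was used (the paper does not state the method here):

```python
# Quick check of the reported 95% CI for 10 responders out of 24 evaluable patients,
# assuming a Clopper-Pearson (exact binomial) interval.
from statsmodels.stats.proportion import proportion_confint

low, high = proportion_confint(count=10, nobs=24, alpha=0.05, method="beta")
print(f"{low:.0%} - {high:.0%}")   # ≈ 22% - 63%, matching the reported interval
```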
Abstract:
Much research studies how individuals cope with disease threat by blaming out-groups and protecting the in-group. The model of collective symbolic coping (CSC) describes four stages by which representations of a threatening event are elaborated in the mass media: awareness, divergence, convergence, and normalization. We used the CSC model to predict when symbolic in-group protection (othering) would occur in the case of the avian influenza (AI) outbreak. Two studies documented CSC stages and showed that othering occurred during the divergence stage, characterized by an uncertain symbolic environment. Study 1 analysed media coverage of AI over time, documenting CSC stages of awareness and divergence. In Study 2, a two-wave repeated cross-sectional survey was conducted just after the divergence stage and a year later. Othering was measured by the number of foreign countries erroneously ticked by participants as having human victims. Individual differences in germ aversion and social dominance orientation interacted to predict othering during the divergence stage but not a year later. Implications for research on CSC and symbolic in-group protection strategies resulting from disease threat are discussed.
Abstract:
Aims: The HR-NBL1 Study of the European SIOP Neuroblastoma Group (SIOPEN) randomised two high-dose regimens to learn about potential superiority and toxicity profiles. Patients and Methods: At interim analysis, 1483 high-risk neuroblastoma patients (893 males) had been included since 2002, with either INSS stage 4 disease above 1 year of age (1383 pts), or stage 4 disease in infants (59 pts) and stage 2 or 3 disease at any age (145 pts) with MYCN amplification. The median age at diagnosis was 2.9 years (1 month-19.9 years), with a median follow-up of 3 years. Response eligibility criteria prior to randomisation, after Rapid COJEC induction (J Clin Oncol, 2010) ± 2 courses of TVD (Cancer, 2003), included complete bone marrow remission, at least partial response at skeletal sites with no more than 3 but improved mIBG-positive spots, and a PBSC harvest of at least 3×10⁶ CD34+ cells/kg body weight. The randomised regimens were BuMel [busulfan: oral until 2006, 4 × 150 mg/m² in 4 daily doses, or intravenous dosing according to body weight as licensed thereafter; melphalan: 140 mg/m²/day] and CEM [carboplatin continuous infusion: 4 × AUC 4.1 mg/ml·min/day; etoposide continuous infusion: 4 × 338 mg/m²/day or 4 × 200 mg/m²/day*; melphalan: 3 × 70 mg/m²/day or 3 × 60 mg/m²/day*; *reduced doses if GFR < 100 ml/min/1.73 m²]. Supportive care followed institutional guidelines. VOD prophylaxis included ursodiol, but randomised patients were not eligible for the prophylactic defibrotide trial. Local control included surgery and radiotherapy of 21 Gy. Results: Of 1483 patients, 584 had been randomised for the high-dose question at data lock. A significant difference in event-free survival (3-year EFS 49% vs. 33%, p<0.001) and overall survival (3-year OS 61% vs. 48%, p=0.003) favouring the BuMel regimen over the CEM regimen was demonstrated. The relapse/progression rate was significantly higher after CEM (0.60±0.03) than after BuMel (0.48±0.03) (p<0.001). Toxicity data had reached 80% completeness at the last analysis. The severe toxicity rate up to day 100 (ICU admissions and toxic deaths) was below 10%, but was significantly higher for CEM (p=0.014). The acute toxic death rate was 3% for BuMel and 5% for CEM (NS). The acute HDT toxicity profile favours the BuMel regimen in spite of a total VOD incidence of 18% (grade 3: 5%). Conclusions: The Peto rule of p<0.001 at interim analysis on the primary endpoint, EFS, was met. Hence randomisation was stopped, with BuMel as the recommended standard treatment in the HR-NBL1/SIOPEN trial, which is still accruing for the randomised immunotherapy question.
Abstract:
BACKGROUND/RATIONALE: Patient safety is a major concern in healthcare systems worldwide. Although most safety research has been conducted in the inpatient setting, evidence indicates that medical errors and adverse events are a threat to patients in the primary care setting as well. Since information about the frequency and outcomes of safety incidents in primary care is required, the goals of this study are to describe the type, frequency, seasonal and regional distribution of medication incidents in primary care in Switzerland and to elucidate possible risk factors for medication incidents. METHODS AND ANALYSIS: STUDY DESIGN AND SETTING: We will conduct a prospective surveillance study to identify cases of medication incidents among primary care patients in Switzerland over the course of the year 2015. PARTICIPANTS: Patients undergoing drug treatment by 167 general practitioners or paediatricians reporting to the Swiss Federal Sentinel Reporting System. INCLUSION CRITERIA: Any erroneous event, as defined by the physician, related to the medication process and interfering with the normal treatment course. EXCLUSION CRITERIA: Lack of treatment effect, adverse drug reactions or drug-drug or drug-disease interactions without detectable treatment error. PRIMARY OUTCOME: Medication incidents. RISK FACTORS: Age, gender, polymedication, morbidity, care dependency, hospitalisation. STATISTICAL ANALYSIS: Descriptive statistics to assess type, frequency, seasonal and regional distribution of medication incidents, and logistic regression to assess their association with potential risk factors. Estimated sample size: 500 medication incidents. LIMITATIONS: We will take into account under-reporting and selective reporting, among others, as potential sources of bias or imprecision when interpreting the results. ETHICS AND DISSEMINATION: No formal request was necessary because of fully anonymised data. The results will be published in a peer-reviewed journal. TRIAL REGISTRATION NUMBER: NCT0229537.
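The planned risk-factor analysis is a logistic regression of incident occurrence on the listed covariates. A minimal sketch of that kind of model is shown below; the file name and variable names are assumptions for illustration, not the study's dataset:

```python
# Minimal sketch (hypothetical variable names): logistic regression of medication-incident
# occurrence on the candidate risk factors listed in the protocol.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sentinel_reports.csv")   # hypothetical per-patient data, numeric/binary columns
model = smf.logit(
    "incident ~ age + gender + polymedication + morbidity + care_dependency + hospitalisation",
    data=df,
).fit()
print(model.summary())
```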
Abstract:
Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
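To make the inversion recipe concrete, a minimal, generic sketch of a structure-constrained pixel-based Metropolis sampler is shown below. The 1-D pixel layout, identity forward operator, Gaussian noise model and fixed regularization weight are illustrative assumptions; the hierarchical treatment of data errors and of the regularization weight described in the abstract is not reproduced here:

```python
# Generic sketch (not the authors' code): Metropolis sampling of a pixel-based model with
# an L2 data misfit and an L1 model-structure (smoothness) constraint acting as the prior.
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(model, data, forward, sigma):
    """Gaussian (L2) data misfit; an L1 misfit norm could be substituted."""
    residual = data - forward(model)
    return -0.5 * np.sum((residual / sigma) ** 2)

def log_prior(model, weight):
    """Model-structure constraint: penalize differences between neighbouring pixels."""
    return -weight * np.sum(np.abs(np.diff(model)))

def metropolis(data, forward, sigma, weight, n_iter=20000, step=0.05):
    model = np.zeros(len(data))
    log_post = log_likelihood(model, data, forward, sigma) + log_prior(model, weight)
    samples = []
    for _ in range(n_iter):
        proposal = model.copy()
        k = rng.integers(len(model))              # perturb one pixel at a time
        proposal[k] += rng.normal(0, step)
        lp = log_likelihood(proposal, data, forward, sigma) + log_prior(proposal, weight)
        if np.log(rng.random()) < lp - log_post:  # Metropolis acceptance rule
            model, log_post = proposal, lp
        samples.append(model.copy())
    return np.array(samples)

# Toy usage with an identity forward operator (purely illustrative)
true_model = np.concatenate([np.zeros(25), np.ones(25)])
data = true_model + rng.normal(0, 0.1, size=50)
samples = metropolis(data, forward=lambda m: m, sigma=0.1, weight=2.0)
print(samples[10000:].mean(axis=0))               # posterior mean after burn-in
```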
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty because of the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for measuring IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays.
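As a hedged illustration of how a decision limit at 99.99% specificity with a finite-sample uncertainty allowance can be computed: the sketch below uses a normal-theory one-sided tolerance bound on simulated scores. The actual GH-2000 score formula and the projects' published uncertainty allowance are not reproduced here; the data and function are assumptions.

```python
# Illustrative only: upper decision limit as a one-sided tolerance bound for a marker score.
import numpy as np
from scipy import stats

def decision_limit(scores, specificity=0.9999, confidence=0.95):
    """mean + k * SD, where k covers the chosen specificity with an allowance for
    finite-sample uncertainty (normal-theory tolerance factor via the noncentral t)."""
    n = len(scores)
    mean, sd = np.mean(scores), np.std(scores, ddof=1)
    z_spec = stats.norm.ppf(specificity)
    k = stats.nct.ppf(confidence, df=n - 1, nc=z_spec * np.sqrt(n)) / np.sqrt(n)
    return mean + k * sd

# Example with simulated GH-2000-like scores (hypothetical data, n = 998 male athletes)
rng = np.random.default_rng(1)
print(decision_limit(rng.normal(0, 1, 998)))
```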
Abstract:
BACKGROUND AND PURPOSE: The best time for administering anticoagulation therapy in acute cardioembolic stroke remains unclear. This prospective cohort study of patients with acute stroke and atrial fibrillation evaluated (1) the risk of recurrent ischemic events and severe bleeding; (2) the risk factors for recurrence and bleeding; and (3) the risks of recurrence and bleeding associated with anticoagulant therapy and its starting time after the acute stroke. METHODS: The primary outcome of this multicenter study was the composite of stroke, transient ischemic attack, symptomatic systemic embolism, symptomatic cerebral bleeding and major extracranial bleeding within 90 days from acute stroke. RESULTS: Of the 1029 patients enrolled, 123 had 128 events (12.6%): 77 (7.6%) ischemic stroke or transient ischemic attack or systemic embolism, 37 (3.6%) symptomatic cerebral bleeding, and 14 (1.4%) major extracranial bleeding. At 90 days, 50% of the patients were either deceased or disabled (modified Rankin score ≥3), and 10.9% were deceased. High CHA2DS2-VASc score, high National Institutes of Health Stroke Scale score, large ischemic lesion and type of anticoagulant were predictive factors for the primary study outcome. In adjusted Cox regression analysis, initiating anticoagulants 4 to 14 days from stroke onset was associated with a significant reduction in the primary study outcome, compared with initiating treatment before 4 or after 14 days: hazard ratio 0.53 (95% confidence interval 0.30-0.93). About 7% of the patients treated with oral anticoagulants alone had an outcome event, compared with 16.8% and 12.3% of the patients treated with low molecular weight heparins alone or followed by oral anticoagulants, respectively (P=0.003). CONCLUSIONS: Acute stroke in atrial fibrillation patients is associated with high rates of ischemic recurrence and major bleeding at 90 days. This study observed that high CHA2DS2-VASc score, high National Institutes of Health Stroke Scale score, large ischemic lesions, and type of anticoagulant administered were each independently associated with a greater risk of recurrence and bleeding. The data also showed that the best time for initiating anticoagulation treatment for secondary stroke prevention is 4 to 14 days from stroke onset. Moreover, patients treated with oral anticoagulants alone had better outcomes than patients treated with low molecular weight heparins alone or before oral anticoagulants.
Abstract:
In this study, we randomly compared high doses of the tyrosine kinase inhibitor imatinib combined with reduced-intensity chemotherapy (arm A) to standard imatinib/hyperCVAD (cyclophosphamide/vincristine/doxorubicin/dexamethasone) therapy (arm B) in 268 adults (median age, 47 years) with Philadelphia chromosome-positive (Ph+) acute lymphoblastic leukemia (ALL). The primary objective was the major molecular response (MMolR) rate after cycle 2, with patients then being eligible for allogeneic stem cell transplantation (SCT) if they had a donor, or for autologous SCT if in MMolR and without a donor. With fewer induction deaths, the complete remission (CR) rate was higher in arm A than in arm B (98% vs 91%; P = .006), whereas the MMolR rate was similar in both arms (66% vs 64%). With a median follow-up of 4.8 years, 5-year event-free survival and overall survival (OS) rates were estimated at 37.1% and 45.6%, respectively, without difference between the arms. Allogeneic transplantation was associated with a significant benefit in relapse-free survival (hazard ratio [HR], 0.69; P = .036) and OS (HR, 0.64; P = .02), with initial white blood cell count being the only factor significantly interacting with this SCT effect. In patients achieving MMolR, outcome was similar after autologous and allogeneic transplantation. This study validates an induction regimen combining reduced-intensity chemotherapy and imatinib in adult patients with Ph+ ALL and suggests that SCT in first CR remains a good option for these patients. This trial was registered at www.clinicaltrials.gov as #NCT00327678.
Abstract:
We all make decisions of varying levels of importance every day. Because making a decision implies that there are alternative choices to be considered, almost all decisions involve some conflict or dissatisfaction. Traditional economic models assume that a person weighs the positive and negative outcomes of each option and, based on these inferences, determines which option is best for that particular situation. However, individuals often act as irrational agents and tend to deviate from these rational choices. They instead evaluate the outcomes' subjective value: when facing a risky choice leading to losses, people are inclined to prefer risk over certainty, whereas when facing a risky choice leading to gains, they often avoid taking risks and choose the most certain option. It is now assumed that decision making is balanced between deliberative and emotional components. Distinct neural regions underpin these factors: the deliberative pathway, which corresponds to executive functions, implies activation of the prefrontal cortex, while the emotional pathway tends to activate the limbic system. These circuits appear to be altered in individuals with ADHD and result, among other things, in impaired decision-making capacities. Their impulsive and inattentive behaviors are likely the cause of their irrational attitude towards risk taking. One possible solution is to administer a drug treatment to these individuals, with the knowledge that it might have several side effects. However, an alternative treatment relying on cognitive rehabilitation might be appropriate. This project therefore aimed to investigate whether intensive working memory training could have a spillover effect on decision making in adults with ADHD and in age-matched healthy controls. We designed a decision-making task in which participants had to select an amount to gamble, with a one-in-three chance of winning four times the chosen amount; otherwise they could lose their investment. Their performance was recorded using electroencephalography before and after a one-month Dual N-Back training, and the possible near and far transfer effects were investigated. Overall, we found that performance during the gambling task was modulated by personality factors and by symptom severity at the pretest session. At posttest, we found that all individuals demonstrated an improvement on the Dual N-Back and on similar untrained dimensions. In addition, we discovered that not only did the adults with ADHD show a stable decrease in symptomatology, as evaluated by the CAARS inventory, but this reduction was also detected in the control sample. In addition, event-related potential (ERP) data are in favor of a change within prefrontal and parietal cortices. These results suggest that cognitive remediation can be effective in adults with ADHD and in healthy controls. An important complement to this work would be the examination of the data with regard to the attentional networks, which could support the view that complex programs covering the remediation of several dimensions of executive functions are not required; working memory training alone can be sufficient. -- We all make decisions of varying importance every day. Every decision has an element of conflict and dissatisfaction, because making a decision implies that there are alternative choices to consider.
Traditional economic models assume that a person weighs the positive and negative consequences of each option and, based on these inferences, determines which option is best in a particular situation. However, individuals may deviate from these rational choices. They instead evaluate the subjective value of outcomes: when facing a risky choice that may lead to losses, people tend to prefer risk over certainty, whereas when facing a risky choice that may lead to gains, they avoid taking risks and choose the most certain option. Nowadays, decision making is considered to be balanced between deliberative and emotional components. These factors are underpinned by distinct neural regions: the deliberative pathway, corresponding to executive functions, involves activation of the prefrontal cortex, whereas the emotional pathway activates the limbic system. These circuits appear to be dysfunctional in individuals with ADHD and result, among other things, in impaired decision-making capacities. Their impulsive and inattentive behaviours are probably the cause of these irrational attitudes towards risk. One possible solution is to administer a drug treatment, taking the potential side effects into account. An alternative treatment based on cognitive rehabilitation might be appropriate. The aim of this project was therefore to determine whether intensive working memory training can affect decision making in adults with ADHD and in age-matched healthy controls. We designed a decision-making task in which participants had to select an amount to gamble, with a one-in-three chance of winning four times the chosen amount; otherwise they could lose their investment. Their performance was recorded using electroencephalography before and after one month of Dual N-Back training, and the possible transfer effects were examined. Overall, we found at pretest that performance during the gambling task was modulated by personality factors and by symptom severity. At posttest, we found not only that the adults with ADHD showed a stable decrease in symptoms, as assessed by the CAARS questionnaire, but also that this reduction was observed in the control sample. The experimental results measured with electroencephalography suggest a change within the prefrontal and parietal cortices. These results suggest that cognitive remediation is effective in adults with ADHD but also produces an effect in healthy controls. An important complement to this work would be to examine the attention data, which could reinforce the idea that complex programmes covering the remediation of several dimensions of executive functions are not necessary; working memory training alone should suffice.
Abstract:
BACKGROUND: Rivaroxaban has become an alternative to vitamin-K antagonists (VKA) for stroke prevention in non-valvular atrial fibrillation (AF) patients due to its favourable risk-benefit profile in the restrictive setting of a large randomized trial. However, in the primary care setting, physicians' motivation to start rivaroxaban, treatment satisfaction, and the clinical event rate after the initiation of rivaroxaban are not known. METHODS: Prospective data collection by 115 primary care physicians in Switzerland on consecutive non-valvular AF patients with newly established rivaroxaban anticoagulation, with 3-month follow-up. RESULTS: We enrolled 537 patients (73±11 years, 57% men) with mean CHADS2 and HAS-BLED scores of 2.2±1.3 and 2.4±1.1, respectively: 301 (56%) were switched from VKA to rivaroxaban (STR group) and 236 (44%) were VKA-naïve (VN group). Absence of routine coagulation monitoring (68%) and fixed-dose once-daily treatment (58%) were the most frequent criteria for physicians to initiate rivaroxaban. In the STR group, patient satisfaction increased from 3.6±1.4 under VKA to 5.5±0.8 points (P<0.001), and overall physician satisfaction from 3.9±1.3 to 5.4±0.9 points (P<0.001) at 3 months of rivaroxaban therapy (score from 1 to 6, with higher scores indicating greater satisfaction). In the VN group, both patient (5.4±0.9) and physician satisfaction (5.5±0.7) at follow-up were comparable to the STR group. During follow-up, 1 (0.19%; 95% CI, 0.01-1.03%) ischemic stroke, 2 (0.37%; 95% CI, 0.05-1.34%) major non-fatal bleeding and 11 (2.05%; 95% CI, 1.03-3.64%) minor bleeding complications occurred. Rivaroxaban was stopped in 30 (5.6%) patients, with side effects being the most frequent reason. CONCLUSION: Initiation of rivaroxaban for patients with non-valvular AF by primary care physicians was associated with a low clinical event rate and with high overall patient and physician satisfaction.
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular floods and landslides, based on open-source geospatial software and technologies. The aim of the presented tool is to assist experts (risk managers) in analysing the impacts and consequences of a given hazard event in a considered region, providing an essential input to the decision-making process in the selection of risk management strategies by responsible authorities and decision makers. The tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform for risk management. Within this platform, users can import the necessary maps and information to analyse areas at risk. Based on the provided information and parameters, loss scenarios (amount of damage and number of fatalities) for a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated by combining the resulting loss scenarios with the different return periods of the hazard event. The application of the developed prototype is demonstrated using a regional data set from one of the case study sites of the Marie Curie ITN CHANGES project, the Fella River in northeastern Italy.
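As a rough sketch of the kind of aggregation involved in the annualized-risk step (the platform's actual rule and data model may differ), annualized risk can be approximated by integrating scenario losses over their annual exceedance probabilities; the return periods and losses below are illustrative values only:

```python
# Hypothetical sketch: expected annual loss from scenario losses at several return periods,
# using trapezoidal integration over annual exceedance probability (a common convention).
import numpy as np

return_periods = np.array([10, 30, 100, 300])      # years (example values)
losses = np.array([0.2e6, 1.5e6, 4.0e6, 9.0e6])    # damage per scenario (example values)

aep = 1.0 / return_periods                         # annual exceedance probabilities
order = np.argsort(aep)                            # integrate from small to large probability
annualized_risk = np.trapz(losses[order], aep[order])
print(f"Expected annual loss ≈ {annualized_risk:,.0f}")
```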
Abstract:
BACKGROUND: Patients with HIV exposed to the antiretroviral drug abacavir may have an increased risk of cardiovascular disease (CVD). There is concern that this association arises because of a channeling bias. Even if exposure is a risk, it is not clear how that risk changes as exposure cumulates. METHODS: We assess the effect of exposure to abacavir on the risk of CVD events in the Swiss HIV Cohort Study. We use a new marginal structural Cox model to estimate the effect of abacavir as a flexible function of past exposures while accounting for risk factors that potentially lie on a causal pathway between exposure to abacavir and CVD. RESULTS: A total of 11,856 patients were followed for a median of 6.6 years; 365 patients had a CVD event (4.6 events per 1000 patient-years). In a conventional Cox model, recent, but not cumulative, exposure to abacavir increased the risk of a CVD event. In the new marginal structural Cox model, continued exposure to abacavir during the past 4 years increased the risk of a CVD event (hazard ratio = 2.06; 95% confidence interval: 1.43 to 2.98). The estimated function for the effect of past exposures suggests that exposure during the past 6-36 months caused the greatest increase in risk. CONCLUSIONS: Abacavir increases the risk of a CVD event: the effect of exposure is not immediate; rather, the risk increases as exposure cumulates over the past few years. This gradual increase in risk is not consistent with a rapidly acting mechanism, such as acute inflammation.
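For readers unfamiliar with the approach, the sketch below shows a generic marginal structural Cox analysis: stabilized inverse-probability-of-treatment weights for the time-varying exposure, followed by a weighted Cox regression with a simple cumulative-exposure term. The file name, column names and covariates are hypothetical, and the sketch does not reproduce the paper's flexible function of past exposures:

```python
# Hypothetical sketch of a marginal structural Cox analysis (not the paper's model).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# One row per patient-interval; columns are illustrative and assumed numeric/binary.
df = pd.read_csv("person_period_data.csv")

baseline = ["age", "sex"]                      # baseline covariates (numerator model)
time_varying = baseline + ["cd4", "lipids"]    # adds covariates on the causal pathway

p_num = LogisticRegression().fit(df[baseline], df["abacavir"]).predict_proba(df[baseline])[:, 1]
p_den = LogisticRegression().fit(df[time_varying], df["abacavir"]).predict_proba(df[time_varying])[:, 1]

# Probability of the exposure actually received, then cumulate stabilized weights over time.
a = df["abacavir"].to_numpy()
ratio = np.where(a == 1, p_num / p_den, (1 - p_num) / (1 - p_den))
df["sw"] = pd.Series(ratio, index=df.index).groupby(df["patient_id"]).cumprod()

df["cum_exposure"] = df.groupby("patient_id")["abacavir"].cumsum()   # crude exposure history

cph = CoxPHFitter()
cph.fit(df[["interval_length", "cvd_event", "abacavir", "cum_exposure", "sw"]],
        duration_col="interval_length", event_col="cvd_event",
        weights_col="sw", robust=True)
cph.print_summary()
```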
Abstract:
The use of the life history calendar (LHC) or the event history calendar as tools for collecting retrospective data has received increasing attention in many fields of social science and medicine. However, little research has examined the use of this method with web-based surveys. In this study, we adapted this method to an on-line setting to collect information about young adults' life histories, sexual behaviors, and substance use. We hypothesized that the LHC method would help respondents to date sensitive and non-sensitive events more precisely than when using a conventional questionnaire. We conducted an experimental design study comparing university students' responses to an on-line LHC and a conventional on-line question list. A test-retest design in which the respondents completed the survey again two weeks later was also applied to test the precision and reliability of the participants' dating of events. The results showed that whereas the numbers of sensitive and non-sensitive events were generally similar for the two on-line questionnaires, the responses obtained with the LHC were more consistent across the two administrations. Analyses of the respondents' on-line behavior while completing the LHC confirmed that respondents used the LHC's graphic interface to correct and reedit previous answers, thus decreasing data errors.
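As an illustration of the kind of test-retest consistency check described above (the simulated data and the simple agreement metric are assumptions, not the study's analysis code):

```python
# Illustrative sketch: test-retest consistency of event dating across two administrations.
# 'wave1' and 'wave2' hold the month (1-12) each respondent assigned to the same event.
import numpy as np

rng = np.random.default_rng(42)
wave1 = rng.integers(1, 13, size=200)
noise = rng.choice([0, 0, 0, -1, 1], size=200)           # most answers repeated exactly
wave2 = np.clip(wave1 + noise, 1, 12)

exact_agreement = np.mean(wave1 == wave2)                # same month in both waves
within_one_month = np.mean(np.abs(wave1 - wave2) <= 1)   # tolerance of one month
print(f"exact: {exact_agreement:.2f}, ±1 month: {within_one_month:.2f}")
```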