14 results for Star-count Data

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models, in the form of Bayesian networks, address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are given primary attention because they allow the scientist to combine their prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
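The conjugate updating at the heart of such Bayesian parameter estimation can be sketched in a few lines. This is a minimal illustration with an assumed Beta prior and invented survey counts, not the paper's data:

```python
# Minimal sketch of conjugate Bayesian estimation for a proportion, e.g. the
# probability that a person unconnected to a shooting carries background GSR
# particles. The prior and the counts below are invented for illustration.

def beta_binomial_update(alpha, beta, successes, failures):
    """Beta(alpha, beta) prior + binomial data -> Beta posterior."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Posterior mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Weakly informative prior Beta(1, 1); hypothetical survey: 3 of 120 sampled
# individuals carried background GSR particles.
a, b = beta_binomial_update(1.0, 1.0, successes=3, failures=117)
print(beta_mean(a, b))   # posterior mean ≈ 0.0328
```

The posterior mean can then feed the corresponding probability table of the Bayesian network, and the full posterior carries the parameter uncertainty the paper discusses.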

Relevance: 80.00%

Abstract:

Failure to detect a species in an area where it is present is a major source of error in biological surveys. We assessed whether it is possible to optimize single-visit biological monitoring surveys of highly dynamic freshwater ecosystems by framing them a priori within a particular period of time. Alternatively, we also searched for the optimal number of visits and when they should be conducted. We developed single-species occupancy models to estimate the monthly probability of detection of pond-breeding amphibians during a four-year monitoring program. Our results revealed that detection probability was species-specific and changed among sampling visits within a breeding season and also among breeding seasons. Therefore, optimizing biological surveys with minimal survey effort (a single visit) is not feasible, as it proves impossible to select a priori an adequate sampling period that remains robust across years. Alternatively, a two-survey combination at the beginning of the sampling season yielded optimal results and constituted an acceptable compromise between sampling efficacy and survey effort. Our study provides evidence of the variability and uncertainty that likely affect the efficacy of monitoring surveys, highlighting the need for repeated sampling in both ecological studies and conservation management.
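The value of repeated visits follows from a simple identity that the occupancy framework builds on. This sketch is illustrative, not the authors' model, and the per-visit detection probability is hypothetical:

```python
# Illustrative sketch: with a per-visit detection probability p, the chance of
# detecting a species that is present at least once in k independent visits
# is 1 - (1 - p)^k. The value p = 0.4 below is hypothetical.

def cumulative_detection(p, k):
    """Probability of at least one detection in k independent visits."""
    return 1.0 - (1.0 - p) ** k

for k in (1, 2, 3):
    print(k, round(cumulative_detection(0.4, k), 3))
# 1 visit -> 0.4, 2 visits -> 0.64, 3 visits -> 0.784
```

Even a second visit raises the detection chance substantially, which is the quantitative reason a two-survey combination can outperform any single a priori sampling window.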

Relevance: 80.00%

Abstract:

Background: Estimated cancer mortality statistics were published for the years 2011 and 2012 for the European Union (EU) and its six most populous countries. Patients and methods: Using logarithmic Poisson count data joinpoint models and the World Health Organization mortality and population database, we estimated numbers of deaths and age-standardized (world) mortality rates (ASRs) in 2013 from all cancers and selected cancers. Results: The predicted number of cancer deaths in the EU in 2013 is 1 314 296 (737 747 men and 576 489 women). Between 2009 and 2013, all-cancer ASRs are predicted to fall by 6% to 140.1/100 000 in men, and by 4% to 85.3/100 000 in women. The ASRs per 100 000 are 6.6 in men and 2.9 in women for stomach cancer, 16.7 and 9.5 for intestinal cancers, 8.0 and 5.5 for pancreatic cancer, 37.1 and 13.9 for lung cancer, 10.5 in men for prostate cancer, 14.6 in women for breast cancer, 4.7 for uterine cancer, and 4.2 and 2.6 for leukaemia. Recent trends are favourable except for pancreatic cancer and lung cancer in women. Conclusions: Favourable trends will continue in 2013. Pancreatic cancer has become the fourth leading cause of cancer death in both sexes, while within a few years lung cancer will likely overtake breast cancer to become the leading cause of cancer mortality in women as well.
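The ASRs reported here are direct age standardizations: weighted averages of age-specific rates using standard-population weights. A minimal sketch, with all numbers invented for illustration rather than taken from the paper:

```python
# Sketch of direct age standardisation, which is what an ASR is: a weighted
# average of age-specific rates using standard-population weights. All
# numbers below are invented for illustration.

def asr(rates_per_100k, weights):
    """Age-standardized rate: weighted sum of age-specific rates."""
    assert abs(sum(weights) - 1.0) < 1e-9      # weights must sum to 1
    return sum(r * w for r, w in zip(rates_per_100k, weights))

rates = [2.0, 15.0, 120.0, 600.0]     # hypothetical rates per 100 000 by age band
weights = [0.35, 0.30, 0.25, 0.10]    # hypothetical standard-population weights
print(asr(rates, weights))            # ≈ 95.2 per 100 000
```

Standardizing to a common (world) population is what makes rates comparable across countries and years despite different age structures.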

Relevance: 80.00%

Abstract:

Summary: Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as that of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. The result is that, in a great variety of situations, the WML substantially improves on the initial estimator, both in terms of finite-sample mean squared error and in terms of bias under contamination. Moreover, the performance of the WML remains rather stable under a change of the MDE, even when the MDEs behave very differently. Two applications of the WML to real data are considered; in both, the necessity of a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
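The two-phase idea can be sketched for the simplest case, a Poisson mean. This is a hedged toy version: the "robust initial estimate" here is just the sample median standing in for a minimum disparity estimator, and the hard probability cutoff stands in for the adaptive weighting scheme:

```python
from math import exp, factorial

# Hedged sketch of the two-phase procedure for a Poisson model: (1) compute a
# crude robust initial estimate (here the sample median, standing in for an
# MDE), (2) give zero weight to observations that are very improbable under
# the initial fit, then compute a weighted ML estimate. For the Poisson mean,
# the weighted MLE is simply the weighted average.

def poisson_pmf(k, lam):
    return exp(-lam) * lam ** k / factorial(k)

def weighted_mle_poisson(data, cutoff=1e-3):
    ordered = sorted(data)
    lam0 = ordered[len(ordered) // 2]          # robust initial estimate (median)
    weights = [1.0 if poisson_pmf(x, lam0) > cutoff else 0.0 for x in data]
    return sum(w * x for w, x in zip(weights, data)) / sum(weights)

clean = [2, 3, 1, 4, 2, 3, 2, 1, 3, 2]
contaminated = clean + [40, 45]                # two gross outliers
print(weighted_mle_poisson(contaminated))      # ≈ 2.3, the clean-data mean
```

The plain MLE (the sample mean) of the contaminated data would be pulled above 8, while the weighted estimate ignores the two outliers and recovers the clean-data value.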

Relevance: 80.00%

Abstract:

General Introduction This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements, whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA, it has been claimed that the benefits from improved market access keep falling short of the full potential. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e., to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special-interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent. Part I In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Pr. Olivier Cadot, Celine Carrère and Pr. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows.
First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore, changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs. The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs.
In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e., restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with their current sets of RoOs, could be the model for future EU-centered PTAs. First, I have studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs, used before 1997, and the "single list" RoOs, used since 1997. Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs.
The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument that has the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Pr. Olivier Cadot and Pr. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis. First, using Poisson and negative binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases.
Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
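The count-data step can be illustrated with a toy Poisson regression, fitted here by iteratively reweighted least squares on synthetic data. All variable names and numbers are invented, not the chapter's dataset; a slope near 1 on log-initiations would correspond to the one-for-one five-year cycle:

```python
import numpy as np

# Hedged sketch: Poisson regression (log link) of yearly revocation counts on
# initiations lagged five years, fitted by iteratively reweighted least
# squares. The data are simulated, not the chapter's.

def poisson_irls(X, y, n_iter=25):
    """Poisson regression via iteratively reweighted least squares."""
    eta = np.log(y + 0.5)                      # safe starting linear predictor
    beta, *_ = np.linalg.lstsq(X, eta, rcond=None)
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                  # fitted means
        z = X @ beta + (y - mu) / mu           # working response
        WX = X * mu[:, None]                   # weighted design matrix
        beta = np.linalg.solve(X.T @ WX, WX.T @ z)
    return beta

rng = np.random.default_rng(0)
init_lag5 = rng.integers(5, 50, size=30)       # hypothetical lagged initiations
y = rng.poisson(0.8 * init_lag5)               # revocations: cycle weaker than 1:1
X = np.column_stack([np.ones(30), np.log(init_lag5)])
print(poisson_irls(X, y))                      # slope estimate close to 1
```

In the chapter's setting, the interest lies in how this coefficient differs before and after the 1995 Agreement, rather than in its absolute value.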

Relevance: 80.00%

Abstract:

BACKGROUND: Estimating current cancer mortality figures is important for defining priorities for prevention and treatment. MATERIALS AND METHODS: Using logarithmic Poisson count data joinpoint models on mortality and population data from the World Health Organization database, we estimated numbers of deaths and age-standardized rates in 2012 from all cancers and selected cancer sites for the whole European Union (EU) and its six most populous countries. RESULTS: Cancer deaths in the EU in 2012 are estimated at 1 283 101 (717 398 men and 565 703 women), corresponding to standardized overall cancer death rates of 139/100 000 in men and 85/100 000 in women. The fall from 2007 was 10% in men and 7% in women. In men, declines are predicted for stomach (-20%), leukemias (-11%), lung and prostate (-10%) and colorectal (-7%) cancers; in women, for stomach (-23%), leukemias (-12%), uterus and colorectum (-11%) and breast (-9%) cancers. Almost stable rates are expected for pancreatic cancer (+2-3%) and increases for female lung cancer (+7%). Younger women show the greatest falls in breast cancer mortality rates in the EU (-17%), and declines are expected in all individual countries except Poland. CONCLUSION: Apart from lung cancer in women and pancreatic cancer, continuing falls in mortality from the major cancers are expected in the EU.
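A minimal sketch of the projection idea, assuming a simple log-linear trend in place of the paper's joinpoint models, with all rates invented:

```python
import numpy as np

# Minimal sketch: fit a log-linear trend to recent age-standardized rates and
# extrapolate to 2012. Rates are invented; the paper's joinpoint models allow
# the trend to change slope at estimated breakpoints, which this does not.

years = np.array([2005, 2006, 2007, 2008, 2009])
rates = np.array([152.0, 149.1, 146.1, 143.2, 140.3])   # hypothetical ASRs per 100 000

slope, intercept = np.polyfit(years, np.log(rates), 1)  # linear fit on the log scale
pred_2012 = np.exp(intercept + slope * 2012)            # extrapolated 2012 rate
print(round(pred_2012, 1))
```

Working on the log scale means the fitted trend is a constant annual percentage change, which matches how cancer mortality trends are usually reported.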

Relevance: 80.00%

Abstract:

OBJECTIVES: Patients with inflammatory bowel disease (IBD) have high resource consumption, with considerable costs for the healthcare system. In a system with sparse resources, treatment is influenced not only by clinical judgement but also by resource consumption. We aimed to determine the resource consumption of IBD patients and to identify its significant predictors. MATERIALS AND METHODS: Data from the prospective Swiss Inflammatory Bowel Disease Cohort Study were analysed for the resource consumption endpoints hospitalization and outpatient consultations at enrolment [1187 patients; 41.1% ulcerative colitis (UC), 58.9% Crohn's disease (CD)] and at 1-year follow-up (794 patients). Predictors of interest were chosen through an expert panel and a review of the relevant literature. Logistic regressions were used for binary endpoints, and negative binomial regressions and zero-inflated Poisson regressions were used for count data. RESULTS: For CD, fistula, use of biologics and disease activity were significant predictors of hospitalization days (all P-values <0.001); age, sex, steroid therapy and biologics were significant predictors of the number of outpatient visits (P=0.0368, 0.023, 0.0002 and 0.0003, respectively). For UC, biologics, C-reactive protein, quitting smoking, age and sex were significantly predictive of hospitalization days (P=0.0167, 0.0003, 0.0003, 0.0076 and 0.0175, respectively); disease activity and immunosuppressive therapy predicted the number of outpatient visits (P=0.0009 and 0.0017, respectively). The results of the multivariate regressions are reported in detail. CONCLUSION: Several highly significant clinical predictors of resource consumption in IBD were identified that might be considered in medical decision-making. In terms of resource consumption and its predictors, CD and UC behave differently.
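The zero-inflated Poisson used for the count endpoints mixes a point mass at zero with an ordinary Poisson, which suits outcomes like hospitalization days where many patients are never hospitalized at all. A sketch with purely illustrative parameter values:

```python
from math import exp, factorial

# Hedged sketch of a zero-inflated Poisson: with probability pi the count is a
# structural zero (a patient who is never hospitalized), otherwise it follows
# Poisson(lam). The parameter values below are illustrative only.

def zip_pmf(k, pi, lam):
    """Probability mass of a zero-inflated Poisson at count k."""
    poisson = exp(-lam) * lam ** k / factorial(k)
    return pi * (k == 0) + (1.0 - pi) * poisson

print(zip_pmf(0, 0.6, 2.5))   # zero probability inflated above exp(-2.5)
print(zip_pmf(3, 0.6, 2.5))
```

A plain Poisson fit to such data would underestimate the share of zeros and overestimate the variance structure, which is why zero-inflated or negative binomial models are the usual choices here.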

Relevance: 80.00%

Abstract:

Background: Emergency department frequent users (EDFUs) account for a disproportionately high number of emergency department (ED) visits, contributing to overcrowding and high health-care costs. At the Lausanne University Hospital, EDFUs account for only 4.4% of ED patients but 12.1% of all ED visits. Our study tested the hypothesis that an interdisciplinary case management intervention would reduce EDFUs' number of ED visits. Methods: In this randomized controlled trial, we allocated adult EDFUs (5 or more visits in the previous 12 months) who visited the ED of the University Hospital of Lausanne, Switzerland between May 2012 and July 2013 either to an intervention (N=125) or a standard emergency care (N=125) group and monitored them for 12 months. Randomization was computer-generated and concealed, and patients and research staff were blinded to the allocation. Participants in the intervention group, in addition to standard emergency care, received case management from an interdisciplinary team at baseline and at 1, 3, and 5 months, in the hospital, in the ambulatory care setting, or at their homes. A generalized linear mixed-effects model for count data (Poisson distribution) was applied to compare participants' numbers of ED visits during the 12 months preceding recruitment (Period 1, P1) with the numbers of visits during the 12 months monitored (Period 2, P2).
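The pre/post comparison can be illustrated by the crude rate ratio that the Poisson model formalizes. All counts here are invented; the trial's mixed-effects model additionally accounts for repeated measures within patients:

```python
# Crude illustration of the P1-versus-P2 comparison: a rate ratio of total
# visits after versus before recruitment. Counts are invented.

p1_visits = [6, 8, 5, 12, 7]   # hypothetical visits in the 12 months before (P1)
p2_visits = [4, 5, 3, 9, 4]    # hypothetical visits in the 12 months after (P2)

rate_ratio = sum(p2_visits) / sum(p1_visits)
print(round(rate_ratio, 3))    # < 1 means fewer visits during follow-up
```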

Relevance: 30.00%

Abstract:

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in pamphlet no. 16. One issue not specifically addressed in the formalism occurs when the count rates reaching the detector are high enough to cause camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and the camera saturation is therefore time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for the relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed that takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count-rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and the measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h. Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e., time-dependent dead-time effects. The algorithm presented here accomplishes this task.
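A Newton-type inversion of a dead-time model is the core numerical step in such a correction. This sketch assumes the paralyzable dead-time model m = n·exp(-n·tau), which is a common choice but not necessarily the one used in the paper, and the dead time and rates are invented:

```python
from math import exp

# Hedged sketch: recover the true count rate n from the observed rate m under
# the paralyzable dead-time model m = n * exp(-n * tau), via Newton's method.
# tau and the rates are illustrative values, not the paper's parameters.

def true_rate(m, tau, tol=1e-10, max_iter=100):
    """Invert m = n * exp(-n * tau) for n by Newton iteration."""
    n = m                                       # start from the observed rate
    for _ in range(max_iter):
        f = n * exp(-n * tau) - m               # residual of the model
        df = exp(-n * tau) * (1.0 - n * tau)    # derivative with respect to n
        step = f / df
        n -= step
        if abs(step) < tol:
            break
    return n

tau = 5e-6                                      # assumed dead time: 5 microseconds
n_true = 50_000.0                               # true rate, counts per second
m_obs = n_true * exp(-n_true * tau)             # what a saturating camera records
print(round(true_rate(m_obs, tau), 3))          # recovers ~50000.0
```

Starting from the observed rate converges to the lower of the model's two roots, which is the physically relevant one as long as n·tau stays below 1; in the time-dependent setting, this inversion is repeated at each point of the sweep with the locally observed rate.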

Relevance: 30.00%

Abstract:

In previous years, several publications have reported cases of infants presenting neurological and gastrointestinal symptoms after ingestion of star anise tea. Such teas are given in various cultures for the treatment of infant colic pains. In most cases, the cause of intoxication was contamination of Chinese star anise (Illicium verum) by Japanese star anise (Illicium anisatum). Indeed, the toxicity of Illicium anisatum, also known as Shikimi, is caused by its content of potent neurotoxins (anisatin, neoanisatin, and pseudoanisatin), which act as non-competitive antagonists of GABA receptors. The main reasons for the frequent contaminations are the strong macroscopic resemblance of the two species, as well as the fact that the fruits are often sold partially broken or in ground form. Therefore, in most cases, chemical analysis is required to detect possible adulterations. CASE REPORT: A 2-month-old infant in good general health was brought to the emergency unit after 3 consecutive episodes of central cyanosis and tetany of the limbs with spontaneous recovery the same afternoon. The child was also very irritable, regurgitated a lot, and positioned himself in opisthotonos. Between these episodes, the neurological exam showed some perturbations (horizontal nystagmus and Bell's phenomenon, hypertonia of the extensor muscles, and mild hypotonia of the axial flexor muscles) with slow improvement over the following hours. The remaining clinical exam, the laboratory work-up (complete blood count, renal, hepatic, and muscular tests, capillary blood gas, plasma amino acids, and urinary organic acids), and the electroencephalogram findings were all normal. In the course of a detailed interview, the parents reported having given 3 bottles to their child, each containing 200 mL of an infusion of 4 to 5 fruits of star anise, in the hours preceding the symptoms, to relieve colic pains. The last seizure-like event took place approximately 8 h after the last ingestion. We were able to prove the ingestion of anisatin, the toxic substance found in Japanese star anise, and the contamination of the Chinese star anise by the Japanese species. Indeed, anisatin analysis by liquid chromatography-mass spectrometry (LC-MS) of a urine sample taken 22 h after the last infusion showed trace amounts of the substance. In another urine sample taken 33 h after ingestion, no anisatin could be detected. Furthermore, analysis of the fruit sample gave an anisatin concentration of 7800 μg/kg, while the maximum tolerated value in Switzerland is 1000 μg/kg. CONCLUSION: The evaluation of apparent life-threatening events (ALTE) in infants should always include the possibility of intoxication. Star anise is generally considered a harmless remedy. Nevertheless, it can sometimes cause severe intoxication resulting in various neurological and gastrointestinal symptoms. To prevent such events, not only parents but also care personnel and pharmacists must be informed about the possible adverse effects caused either by an overdose of Chinese star anise or by contamination of herbal teas with Japanese star anise. Better control of the substances by the health authorities is also necessary.

Relevance: 30.00%

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. A further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, the number of contributors, and the actual value taken by N. Using only modest assumptions, and with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to show that setting the number of contributors to a mixed crime stain in probabilistic terms is, under the conditions assumed in this study, preferable to a decision policy that relies on categorical assumptions about N.
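The deterministic strategy rests on a simple counting argument: each donor contributes at most two alleles per locus, so the locus with the most distinct alleles sets a floor on N. A sketch with an invented profile:

```python
from math import ceil

# Sketch of the deterministic rule discussed above: each donor contributes at
# most two alleles per locus, so the minimum number of contributors is driven
# by the locus showing the most distinct alleles. The profile is invented.

def min_contributors(alleles_per_locus):
    """Minimum N compatible with the observed allele counts."""
    return max(ceil(n / 2) for n in alleles_per_locus)

# Hypothetical mixed profile: distinct alleles observed at each locus
profile = {"D3S1358": 4, "vWA": 5, "FGA": 3}
print(min_contributors(profile.values()))   # 3, driven by the 5-allele locus
```

The probabilistic strategy goes further by placing a posterior distribution over N, weighting each candidate value by how well it explains the observed alleles given their population frequencies, rather than committing to this floor.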

Relevance: 30.00%

Abstract:

The identification of clinical risk factors for AIDS in patients with preserved immune function is of significant interest. We examined whether patients with fungal infection (FI) and a CD4 cell count ≥200/μl were at higher risk of disease progression in the era of cART. 11,009 EuroSIDA patients were followed from their first CD4 cell count ≥200/μl after 1 January 1997 until progression to any non-azoles/amphotericin B susceptible (AAS) AIDS disease, last visit or death. Initiation of antimycotic therapy (AMT) was used as a marker of FI and was modelled as a time-updated covariate using Poisson regression. After adjustment for current CD4 cell count, HIV-RNA, starting cART and diagnosis of AAS-AIDS, AMT was significantly associated with an increased incidence of non-AAS-AIDS (IRR=1.55, 95% CI 1.17-2.06, p=0.0024). Despite the low incidence of AIDS in the cART era, FI in patients with a CD4 cell count ≥200/μl is associated with a 55% higher risk of non-AAS-AIDS (95% confidence interval 1.17-2.06, p=0.0024). These data suggest that patients with FI are more immunocompromised than would be expected from their CD4 cell count alone. FI can be used as a clinical marker for disease progression and an indirect indicator for initiating or changing cART in settings where laboratory facilities are limited.

Relevance: 30.00%

Abstract:

BACKGROUND: Rivaroxaban has become an alternative to vitamin K antagonists (VKA) for stroke prevention in non-valvular atrial fibrillation (AF) patients due to its favourable risk-benefit profile in the restrictive setting of a large randomized trial. However, in the primary care setting, physicians' motivation to start rivaroxaban, treatment satisfaction and the clinical event rate after the initiation of rivaroxaban are not known. METHODS: Prospective data collection by 115 primary care physicians in Switzerland on consecutive non-valvular AF patients with newly established rivaroxaban anticoagulation, with 3-month follow-up. RESULTS: We enrolled 537 patients (73±11 years, 57% men) with mean CHADS2 and HAS-BLED scores of 2.2±1.3 and 2.4±1.1, respectively: 301 (56%) were switched from VKA to rivaroxaban (STR group) and 236 (44%) were VKA-naïve (VN group). Absence of routine coagulation monitoring (68%) and fixed-dose once-daily treatment (58%) were the most frequent criteria for physicians to initiate rivaroxaban. In the STR group, patient satisfaction increased from 3.6±1.4 under VKA to 5.5±0.8 points (P<0.001), and overall physician satisfaction from 3.9±1.3 to 5.4±0.9 points (P<0.001) at 3 months of rivaroxaban therapy (scores from 1 to 6, with higher scores indicating greater satisfaction). In the VN group, both patient (5.4±0.9) and physician satisfaction (5.5±0.7) at follow-up were comparable to the STR group. During follow-up, 1 (0.19%; 95% CI, 0.01-1.03%) ischemic stroke, 2 (0.37%; 95% CI, 0.05-1.34%) major non-fatal bleeding and 11 (2.05%; 95% CI, 1.03-3.64%) minor bleeding complications occurred. Rivaroxaban was stopped in 30 (5.6%) patients, with side effects being the most frequent reason. CONCLUSION: Initiation of rivaroxaban for patients with non-valvular AF by primary care physicians was associated with a low clinical event rate and with high overall patient and physician satisfaction.