889 results for Open Data, Dati Aperti, Open Government Data


Relevance: 60.00%

Abstract:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as the “networks on wheels”, can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, pose critical efficiency and reliability challenges for their implementation. Motivated by the great application potential of VANETs, this dissertation focuses on the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this research aims to enhance traffic safety and traffic efficiency, and to develop novel commercial applications based on VANETs, along four lines: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel applications that exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The relative position based message dissemination (RPB-MD) scheme is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone-of-relevance under varying traffic densities. Given the large volume of vehicular sensor data available in VANETs, the compressive sampling based data collection (CS-DC) scheme is proposed to efficiently collect spatially relevant data at large scale, especially in dense traffic. In addition, building on these solutions to the application-specific issues of data dissemination and data collection, several appealing value-added applications are developed to exploit the commercial potential of VANETs, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel applications, this dissertation helps push VANETs further toward the stage of mass deployment.
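
The abstract does not give the internals of the CS-DC scheme, but the core idea of compressive sampling, recovering a spatially sparse field from far fewer measurements than sensors, can be illustrated with a small sketch. The matrix sizes, sparsity level, and the use of scikit-learn's orthogonal matching pursuit solver below are illustrative assumptions, not the dissertation's actual algorithm.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

n_sensors = 200      # vehicular sensors along a road segment (assumed)
n_measurements = 40  # random linear measurements actually collected
sparsity = 5         # nonzero coefficients in the sparse representation

# Sparse signal: only a few road locations report significant readings.
x_true = np.zeros(n_sensors)
x_true[rng.choice(n_sensors, sparsity, replace=False)] = rng.normal(5, 1, sparsity)

# Random sensing matrix: each measurement is a random combination of readings.
Phi = rng.normal(size=(n_measurements, n_sensors)) / np.sqrt(n_measurements)
y = Phi @ x_true  # compressed measurements gathered over the network

# Recover the full field from the compressed measurements.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity, fit_intercept=False)
omp.fit(Phi, y)
x_hat = omp.coef_

print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```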

Relevance: 60.00%

Abstract:

By employing interpretive policy analysis, this thesis aims to assess, measure, and explain policy capacity for government and non-government organizations involved in reclaiming Alberta's oil sands. Using this type of analysis to assess policy capacity is a novel approach for understanding reclamation policy, and this research therefore provides a unique contribution to the literature on reclamation policy. The oil sands region in northeast Alberta, Canada is an area of interest for several reasons, primarily the vast reserves of bitumen and the environmental cost associated with developing this resource. An increase in global oil demand has created an incentive for industry to seek out and develop new reserves. Alberta's oil sands are one of the largest remaining reserves in the world, and there is significant interest in increasing production in this region. Furthermore, tensions in several oil-exporting nations in the Middle East remain unresolved, and this has garnered additional support for a supply-side solution to North American oil demand, a solution that relies on the development of reserves in both the United States and Canada. These compounding factors have contributed to increased development in the oil sands of northeastern Alberta. In essence, a rapid expansion of oil sands operations is ongoing and is the source of significant disturbance across the region. This disturbance, together with the promises of reclamation, is the subject of contentious debate among stakeholders and continues to be highly visible in the media. If oil sands operations are to retain their social license to operate, it is critical that reclamation efforts be effective. One concern expressed by non-governmental organizations (NGOs) is the current state of monitoring and enforcement of regulatory programs in the oil sands: Alberta's NGOs have suggested that the data made available to them originate from industrial sources and are generally unchecked by government. In an effort to discern the overall status of reclamation in the oil sands, this study explores several factors essential to policy capacity: work environment, training, employee attitudes, perceived capacity, policy tools, evidence-based work, and networking. Data were collected through key informant interviews with senior policy professionals in government and non-government agencies in Alberta. The agencies of interest in this research are: the Canadian Association of Petroleum Producers (CAPP); Alberta Environment and Sustainable Resource Development (AESRD); the Alberta Energy Regulator (AER); the Cumulative Environmental Management Association (CEMA); the Alberta Environmental Monitoring, Evaluation, and Reporting Agency (AEMERA); and the Wood Buffalo Environmental Association (WBEA). The aim of this research is to explain how and why reclamation policy is conducted in Alberta's oil sands, illuminating government capacity, NGO capacity, and the interaction between these two agency types. In addition to answering the research questions, another goal of this project is to show that interpretive analysis of policy capacity can be used to measure and predict policy effectiveness. The oil sands of Alberta are the focus of this project; however, future projects could examine any government policy scenario that utilizes evidence-based approaches.

Relevance: 60.00%

Abstract:

Back-pressure on a diesel engine equipped with an aftertreatment system is a function of the pressure drop across the individual components of the aftertreatment system, typically a diesel oxidation catalyst (DOC), catalyzed particulate filter (CPF) and selective catalytic reduction (SCR) catalyst. The pressure drop across the CPF is a function of the mass flow rate and temperature of the exhaust flowing through it, as well as of the mass of particulate matter (PM) retained in the substrate wall and in the cake layer that forms on the substrate wall. Therefore, in order to keep back-pressure on the engine low and to minimize fuel consumption, it is important to control the PM mass retained in the CPF. Chemical reactions involving the oxidation of PM under passive oxidation and active regeneration conditions can be exploited, together with computational models in the engine control unit (ECU), to control the pressure drop across the CPF. Hence, understanding and predicting the filtration and oxidation of PM in the CPF, and the effect of these processes on the pressure drop across the CPF, is necessary for developing aftertreatment control strategies that reduce back-pressure on the engine and, in turn, fuel consumption, particularly that caused by active regeneration. Numerical modeling of CPFs has been proven to reduce the development time and cost of production aftertreatment systems, as well as to facilitate understanding of the internal processes occurring during the different operating conditions to which the particulate filter is subjected. In this research, a numerical model of the CPF was developed and calibrated to data from passive oxidation and active regeneration experiments in order to determine the kinetic parameters for the oxidation of PM and nitrogen oxides, along with the model filtration parameters. The results include comparisons between the model and the experimental data for pressure drop, PM mass retained, filtration efficiency, CPF outlet gas temperature and species (NO2) concentration out of the CPF. Comparisons of PM oxidation reaction rates obtained from the model calibration to the experimental data for ULSD, 10% and 20% biodiesel-blended fuels are also presented.
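
As a rough illustration of the kind of relationship calibrated in this work, the sketch below integrates a single global Arrhenius rate for PM oxidation and ties the pressure drop to the retained PM mass through a lumped linear resistance. All parameter values and the one-equation form are illustrative assumptions; the actual CPF model resolves filtration, wall and cake layers, and NO2-assisted and thermal oxidation kinetics separately.

```python
import numpy as np

# Illustrative constants (assumed, not the calibrated values from this work)
A = 1.0e6          # pre-exponential factor, 1/s
Ea = 150e3         # activation energy, J/mol
R = 8.314          # gas constant, J/(mol K)
k_dp = 40.0        # lumped pressure-drop sensitivity, kPa per kg of PM (assumed)
dp_clean = 2.0     # clean-filter pressure drop at this flow rate, kPa (assumed)

def pm_history(m0, T, t_end, dt=1.0):
    """Evaluate dm/dt = -A*exp(-Ea/RT)*m and the resulting pressure drop over time."""
    k = A * np.exp(-Ea / (R * T))   # first-order oxidation rate at temperature T
    t = np.arange(0.0, t_end + dt, dt)
    m = m0 * np.exp(-k * t)         # analytic solution of the first-order decay
    dp = dp_clean + k_dp * m        # pressure drop rises linearly with PM load
    return t, m, dp

# Example: active regeneration at about 600 C (873 K), starting from 30 g of retained PM.
t, m, dp = pm_history(m0=0.030, T=873.0, t_end=1200.0)
print(f"PM remaining after 20 min: {m[-1] * 1e3:.2f} g, pressure drop: {dp[-1]:.2f} kPa")
```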

Relevance: 60.00%

Abstract:

How can we calculate earthquake magnitudes when the signal is clipped and over-run? When a volcano is very active, the seismic record may saturate (i.e., the full amplitude of the signal is not recorded) or be over-run (i.e., the end of one event is covered by the start of a new event). The duration, and sometimes the amplitude, of an earthquake signal is necessary for determining event magnitudes; thus, it may be impossible to calculate earthquake magnitudes when a volcano is very active. This problem is most likely to occur at volcanoes with limited networks of short-period seismometers. This study outlines two methods for calculating earthquake magnitudes when events are clipped and over-run. The first method models the shape of earthquake codas as a power-law function and extrapolates the duration from the decay of the function. The second method draws relations between the clipped duration (i.e., the length of time a signal is clipped) and the full duration. These methods allow magnitudes to be determined to within 0.2 to 0.4 magnitude units. This error is within the range of analyst hand-picks and within the acceptable limits of uncertainty when quickly quantifying volcanic energy release during volcanic crises. Most importantly, these estimates can be made when data are clipped or over-run. These methods were developed with data from the initial stages of the 2004-2008 eruption at Mount St. Helens. Mount St. Helens is a well-studied volcano with many instruments placed at varying distances from the vent, which makes the 2004-2008 eruption a good setting in which to calibrate and refine methodologies that can then be applied to volcanoes with limited networks.
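
The first method can be sketched as follows: fit the unclipped portion of the coda envelope with a power-law decay, A(t) = A0 * t^(-p), by linear regression in log-log space, and extrapolate to the time at which the envelope falls back to the pre-event noise level to recover the full duration. The synthetic envelope, noise level, and usable window below are assumptions for illustration; they are not the Mount St. Helens picks used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic coda envelope: power-law decay A(t) = A0 * t**(-p) plus noise (assumed).
A0, p, noise_level = 5000.0, 1.3, 20.0
t = np.arange(1.0, 300.0, 1.0)                        # seconds after event onset
envelope = A0 * t**(-p) + rng.normal(0, 2.0, t.size)

# Suppose the record is over-run and only the 10-60 s window of the coda is usable.
mask = (t >= 10.0) & (t <= 60.0)

# Fit log A = log A0 - p * log t on the usable window.
slope, intercept = np.polyfit(np.log(t[mask]), np.log(envelope[mask]), 1)
p_fit, A0_fit = -slope, np.exp(intercept)

# Extrapolate: duration = time at which the fitted envelope decays to the noise level.
duration = (A0_fit / noise_level) ** (1.0 / p_fit)
print(f"fitted decay exponent p = {p_fit:.2f}, estimated full duration = {duration:.0f} s")
```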

Relevance: 60.00%

Abstract:

With recent advances in remote sensing processing technology, it has become more feasible to begin analyzing the enormous historical archive of remotely sensed data. This historical data provides valuable information on a wide variety of topics that can influence the lives of millions of people if processed correctly and in a timely manner. One such field of benefit is landslide mapping and inventory, which provides a historical reference to those who live near high-risk areas so that future disasters may be avoided. In order to properly map landslides remotely, an optimal method must first be determined. Historically, mapping has been attempted using pixel-based methods such as unsupervised and supervised classification. These methods are limited in that they characterize an image only spectrally, based on single pixel values, producing results that are prone to false positives and often lack meaningful objects. Recently, several reliable methods of Object Oriented Analysis (OOA) have been developed which utilize a full range of spectral, spatial, textural, and contextual parameters to delineate regions of interest. A comparison of these two approaches on a historical dataset of the landslide-affected city of San Juan La Laguna, Guatemala demonstrates the benefits of OOA methods over unsupervised classification. Overall accuracies of 96.5% and 94.3% and F-scores of 84.3% and 77.9% were achieved for the OOA and unsupervised classification methods, respectively. The larger difference in F-score results from the low precision of unsupervised classification caused by poor false-positive removal, the greatest shortcoming of this method.
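
As a reminder of how the reported metrics relate, the sketch below computes overall accuracy, precision, recall, and F-score from a confusion matrix of landslide versus non-landslide samples. The counts are invented for illustration and are not the San Juan La Laguna results; they only reproduce the qualitative pattern of high accuracy but lower F-score when precision suffers.

```python
def classification_metrics(tp, fp, fn, tn):
    """Overall accuracy, precision, recall and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # low when false-positive removal is poor
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts for an OOA result and an unsupervised result (illustrative only).
for name, counts in {"OOA": (420, 80, 75, 9425), "unsupervised": (400, 230, 95, 9275)}.items():
    acc, prec, rec, f1 = classification_metrics(*counts)
    print(f"{name:>12}: accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} F1={f1:.3f}")
```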

Relevance: 60.00%

Abstract:

The continual eruptive activity, the occurrence of an ancestral catastrophic collapse, and the inherent geologic features of Pacaya volcano (Guatemala) demand an evaluation of potential collapse hazards. This thesis merges field and laboratory techniques for better rock mass characterization of volcanic slopes and slope stability evaluation. New field geological, structural, rock mechanical and geotechnical data on Pacaya are reported and integrated with laboratory tests to better define the physical-mechanical rock mass properties. Additionally, these data are used in numerical models for the quantitative evaluation of lateral instability, both for large sector collapses and for shallow landslides. Regional tectonics and local structures indicate that the local stress regime is transtensional, with an ENE-WSW σ3 (minimum principal stress) component. Aligned features trending NNW-SSE can be considered an expression of this weakness zone, which favors magma upwelling to the surface. Numerical modeling suggests that a large-scale collapse could be triggered by reasonable ranges of magma pressure (greater than or equal to 7.7 MPa if constant along a central dyke) and seismic acceleration (greater than or equal to 460 cm/s²), and that a layer of pyroclastic deposits beneath the edifice could have been a factor controlling the ancestral collapse. Finally, the formation of shear cracks within zones of maximum shear strain could provide conduits for lateral flow, which would account for the long lava flows erupted at lower elevations.
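
A very reduced proxy for the stability question examined numerically here is a pseudo-static limit-equilibrium factor of safety for a planar sliding surface, in which the seismic acceleration enters as a horizontal coefficient and magma pressure can be added as an extra driving force. The geometry, strength parameters, and force balance below are simplified assumptions for illustration; the thesis itself relies on full numerical models of the edifice rather than this hand calculation.

```python
import numpy as np

def pseudo_static_fos(weight, alpha_deg, cohesion, phi_deg, area, k_h, magma_force=0.0):
    """Factor of safety for sliding on a plane dipping alpha, with horizontal
    seismic coefficient k_h and an optional magma push parallel to the slide direction."""
    a = np.radians(alpha_deg)
    phi = np.radians(phi_deg)
    driving = weight * (np.sin(a) + k_h * np.cos(a)) + magma_force
    normal = weight * (np.cos(a) - k_h * np.sin(a))
    resisting = cohesion * area + normal * np.tan(phi)
    return resisting / driving

# Illustrative numbers only (not Pacaya's calibrated values).
W = 5.0e9  # sliding block weight, N
fos_static = pseudo_static_fos(W, alpha_deg=30, cohesion=0.5e6, phi_deg=35, area=2000.0, k_h=0.0)
fos_seismic = pseudo_static_fos(W, alpha_deg=30, cohesion=0.5e6, phi_deg=35, area=2000.0,
                                k_h=0.47)  # ~460 cm/s^2 expressed as a fraction of g
print(f"static FoS = {fos_static:.2f}, pseudo-static FoS = {fos_seismic:.2f}")
```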

Relevance: 60.00%

Abstract:

Analyzing large-scale gene expression data is a labor-intensive and time-consuming process. To make data analysis easier, we developed a set of pipelines for the rapid processing and analysis of poplar gene expression data for knowledge discovery. Among them, the differentially expressed genes (DEGs) pipeline identifies biologically important genes that are differentially expressed at one or more time points or conditions. The pathway analysis pipeline identifies differentially expressed metabolic pathways. The protein domain enrichment pipeline identifies protein domains enriched in the DEGs. Finally, the Gene Ontology (GO) enrichment analysis pipeline identifies GO terms enriched in the DEGs. Our pipeline tools can analyze both microarray data and high-throughput sequencing data, which are obtained by two different technologies: microarray technology measures gene expression levels via microarray chips, a collection of microscopic DNA spots attached to a solid (glass) surface, whereas high-throughput sequencing, also called next-generation sequencing, measures gene expression levels by directly sequencing mRNAs and obtaining each mRNA's copy number in cells or tissues. We also developed a web portal (http://sys.bio.mtu.edu/) to make all pipelines available to the public so that users can analyze their own gene expression data. In addition to the analyses mentioned above, it can also perform GO hierarchy analysis, i.e., construct GO trees from a list of GO terms given as input.
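
A GO enrichment step of this kind can be illustrated with a one-sided Fisher's exact test on a 2x2 table of DEG versus background genes that do or do not carry a given term. The gene counts below are invented for illustration, and a real pipeline would also apply multiple-testing correction across all tested terms.

```python
from scipy.stats import fisher_exact

def go_enrichment(deg_with_term, n_deg, bg_with_term, n_bg):
    """One-sided Fisher's exact test for over-representation of a GO term in the DEGs."""
    table = [
        [deg_with_term, n_deg - deg_with_term],                # DEG list
        [bg_with_term - deg_with_term,                         # rest of the genome
         (n_bg - n_deg) - (bg_with_term - deg_with_term)],
    ]
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    return odds_ratio, p_value

# Hypothetical counts: 40 of 500 DEGs vs 200 of 25,000 background genes carry the term.
odds, p = go_enrichment(deg_with_term=40, n_deg=500, bg_with_term=200, n_bg=25000)
print(f"odds ratio = {odds:.1f}, p = {p:.2e}")
```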

Relevance: 60.00%

Abstract:

To analyze the characteristics and predict the dynamic behaviors of complex systems over time, comprehensive research is needed to enable the development of systems that can intelligently adapt to evolving conditions and infer new knowledge with algorithms that are not predesigned. This dissertation studies the integration of techniques and methodologies from the fields of pattern recognition, intelligent agents, artificial immune systems, and distributed computing platforms to create technologies that can more accurately describe and control the dynamics of real-world complex systems. The need for such technologies is emerging in manufacturing, transportation, hazard mitigation, weather and climate prediction, homeland security, and emergency response. Motivated by the ability of mobile agents to dynamically incorporate additional computational and control algorithms into executing applications, mobile agent technology is employed in this research for adaptive sensing and monitoring in a wireless sensor network. Mobile agents are software components that can travel from one computing platform to another in a network, carrying the programs and data states needed to perform their assigned tasks. To support the generation, migration, communication, and management of mobile monitoring agents, an embeddable mobile agent system (Mobile-C) is integrated with the sensor nodes. Mobile monitoring agents visit distributed sensor nodes, read real-time sensor data, and perform anomaly detection using the equipped pattern recognition algorithms. The optimal control of agents is achieved by mimicking the adaptive immune response and applying multi-objective optimization algorithms. The mobile agent approach has the potential to reduce the communication load and energy consumption in monitoring networks. The major research work of this dissertation includes: (1) studying effective feature extraction methods for time series measurement data; (2) investigating the impact of feature extraction methods and dissimilarity measures on the performance of pattern recognition; (3) researching the effects of environmental factors on the performance of pattern recognition; (4) integrating an embeddable mobile agent system with wireless sensor nodes; (5) optimizing agent generation and distribution using artificial immune system concepts and multi-objective algorithms; (6) applying mobile agent technology and pattern recognition algorithms to adaptive structural health monitoring and driving cycle pattern recognition; and (7) developing a web-based monitoring network to enable the visualization and analysis of real-time sensor data remotely. The techniques and algorithms developed in this dissertation contribute to research advances in networked distributed systems operating under changing environments.
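
Items (1) and (2) of the research list, extracting features from time-series sensor measurements and flagging anomalies through a dissimilarity measure, can be sketched as follows. The specific features (mean, standard deviation, dominant FFT frequency) and the Euclidean distance to a healthy-baseline centroid are illustrative choices, not necessarily the ones evaluated in the dissertation.

```python
import numpy as np

def extract_features(x, fs=100.0):
    """Simple feature vector for one sensor window: mean, std, dominant frequency (Hz)."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return np.array([x.mean(), x.std(), freqs[spectrum.argmax()]])

def anomaly_scores(windows, baseline_windows, fs=100.0):
    """Euclidean dissimilarity of each window's features to the healthy-baseline centroid."""
    baseline = np.array([extract_features(w, fs) for w in baseline_windows])
    centroid = baseline.mean(axis=0)
    feats = np.array([extract_features(w, fs) for w in windows])
    return np.linalg.norm(feats - centroid, axis=1)

# Synthetic demo: healthy vibration at 5 Hz, anomalous window shifted to 12 Hz.
rng = np.random.default_rng(2)
t = np.arange(0, 2, 0.01)
healthy = [np.sin(2 * np.pi * 5 * t) + rng.normal(0, 0.1, t.size) for _ in range(20)]
test = [np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 12 * t)]
print("scores:", np.round(anomaly_scores(test, healthy), 2))  # the second window scores high
```

A threshold for declaring an anomaly could then be set from the distribution of baseline scores.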

Relevance: 60.00%

Abstract:

Given the ever-growing demand for energy, a new philosophy for managing energy consumption has developed, demand side management (DSM), which aims to encourage consumers to use energy more intelligently and conscientiously. This objective, combined with the storage of energy from renewable sources, will reduce the use of electricity produced from non-renewable and highly polluting sources such as fossil fuels, and will lower energy consumption, the cost of producing energy, and the cost of energy itself. Home automation and domotics in the domestic environment are an example of DSM. The goal of this thesis is to create a home automation system using open-source technologies. Devices such as an Arduino UNO board, a Raspberry Pi and a PC running GNU/Linux were used to build a simulation of a home automation system combined with the management of photovoltaic cells and energy storage. The system can switch off an electrical load under particular circumstances, for example when a given threshold of electricity consumption is exceeded. The software used is open source and aims to optimize energy consumption according to the user's goals. All of this demonstrates that a home automation system can be built to pair with the present and future of renewable energy sources using free technologies, thereby preserving privacy and security as well as customization and adaptability to different circumstances. In designing the system, an algorithm was implemented to handle various situations within a domestic environment. The implementation of this algorithm produced very good results in achieving the stated objectives. The project of this thesis can be further extended, and the code is available in a public repository.
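
The control rule described, switching off a load when a consumption threshold is exceeded, can be sketched in a few lines. The priority ordering, threshold value, and photovoltaic reading below are illustrative assumptions; the thesis implements this logic on Arduino UNO and Raspberry Pi hardware, whereas the sketch only simulates the decision step.

```python
# Illustrative loads with power draw (W) and a priority (lower = shed first).
loads = {
    "water_heater":    {"power": 1500, "priority": 1, "on": True},
    "washing_machine": {"power": 800,  "priority": 2, "on": True},
    "fridge":          {"power": 150,  "priority": 9, "on": True},  # essential, shed last
}

THRESHOLD_W = 2000   # grid-draw threshold (assumed)
PV_OUTPUT_W = 300    # current photovoltaic production (assumed reading)

def grid_draw(loads, pv_output):
    consumption = sum(l["power"] for l in loads.values() if l["on"])
    return max(consumption - pv_output, 0)

def shed_loads(loads, pv_output, threshold):
    """Switch off the lowest-priority loads until grid draw falls below the threshold."""
    for name, load in sorted(loads.items(), key=lambda kv: kv[1]["priority"]):
        if grid_draw(loads, pv_output) <= threshold:
            break
        if load["on"]:
            load["on"] = False
            print(f"shedding {name} ({load['power']} W)")

shed_loads(loads, PV_OUTPUT_W, THRESHOLD_W)
print("grid draw now:", grid_draw(loads, PV_OUTPUT_W), "W")
```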

Relevance: 60.00%

Abstract:

My thesis examines how, through this new product of computing called big data, information can be obtained and forecasts made about trends in tourism.

Relevance: 60.00%

Abstract:

County jurisdictions in America are increasingly exercising self-government in the provision of public community services within the context of second-order federalism. In states exercising this form of contemporary governance, county governments with “reformed” policy-making structures and professional management practices have begun to rival or surpass municipalities in the delivery of local services with regional implications, such as environmental protection (Benton 2002, 2003; Marando and Reeves, 1993). The voter referendum, a form of direct democracy, is an important component of county land preservation and environmental protection policies. The recent growth and success of land preservation voter referendums nationwide reflect an increase in citizen participation in government and a desire to protect vacant land and its natural environment from the threats of over-development, urbanization and sprawl, loss of open space and farmland, deterioration of ecosystems, and inadequate park and recreational amenities. The study employs a sequential mixed-method design. First, a quantitative approach employs the Heckman two-step model, fitted with variables for the non-random sample of 227 voter referendum counties and all non-voter-referendum counties in the U.S. from 1988 to 2009. Second, the qualitative data collected from an in-depth investigation of three South Florida county case studies, with twelve public administrator interviews, are transformed for integration with the quantitative findings. The purpose of the qualitative method is to complement, explain and enrich the statistical analysis of county demographic, socio-economic, terrain, regional, governance and government, political preference, environmentalism, and referendum-specific factors. The research finds that government factors are significant for the success of land preservation voter referendums; more specifically, the presence of self-government authority (home rule charter), a reformed structure (county administrator/manager or elected executive), and environmental interest groups. In addition, this study concludes that successful counties tend to be coastal, to exhibit population and housing growth, and to have older and more educated citizens who vote Democratic in presidential elections. The analysis of case study documents and public administrator interviews finds that pragmatic considerations of timing, local politics and the networking of regional stakeholders are also important features of success. Further research is suggested utilizing additional public participation, local government and public administration factors.
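
The Heckman two-step estimator used in the quantitative stage can be sketched with statsmodels: a first-stage probit models whether a county is selected (holds a referendum), the inverse Mills ratio from that stage is added as a regressor, and the second-stage OLS is run on the selected counties only. The variable names and simulated data below are placeholders, not the actual county covariates from the study.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 2000

# Simulated covariates standing in for county characteristics (placeholders).
z = rng.normal(size=(n, 2))                          # selection-equation covariates
x = z[:, :1]                                         # outcome-equation covariate
u = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)  # correlated errors

selected = (0.3 + z @ np.array([0.8, -0.5]) + u[:, 0]) > 0  # county holds a referendum
y = 1.0 + 2.0 * x[:, 0] + u[:, 1]                           # outcome, observed only if selected

# Step 1: probit for selection, then the inverse Mills ratio for the selected counties.
Xsel = sm.add_constant(z)
probit = sm.Probit(selected.astype(float), Xsel).fit(disp=0)
xb = Xsel @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected sample with the inverse Mills ratio as an extra regressor.
X2 = sm.add_constant(np.column_stack([x[selected], imr[selected]]))
ols = sm.OLS(y[selected], X2).fit()
print(ols.params)  # intercept, slope, and coefficient on the inverse Mills ratio
```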

Relevance: 60.00%

Abstract:

With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization and the resource management of such services are becoming increasingly important to deliver the user-desired Quality of Service (QoS). First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source online indexing and querying system for big geospatial data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing top-k spatial Boolean queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share their analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS targets. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% of resource usage compared to traditional peak-load-based resource allocation.
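
A top-k spatial Boolean query of the kind sksOpen answers (the k nearest objects to a query point that satisfy a set of keyword predicates) can be expressed very compactly for small data; sksOpen's contribution is answering it efficiently at scale through indexing. The in-memory dataset, keyword filter, and heap-based ranking below are a naive illustration, not the sksOpen API.

```python
import heapq
import math

# Toy dataset of points of interest: (lon, lat, set of keywords). Illustrative only.
pois = [
    (-80.19, 25.76, {"restaurant", "cuban"}),
    (-80.13, 25.79, {"restaurant", "seafood"}),
    (-80.21, 25.77, {"museum"}),
    (-80.18, 25.78, {"restaurant", "cuban", "outdoor"}),
]

def top_k_spatial_boolean(query_point, required, forbidden, k):
    """k nearest POIs containing all `required` and none of the `forbidden` keywords."""
    qlon, qlat = query_point
    matches = (
        (math.hypot(lon - qlon, lat - qlat), (lon, lat), kws)
        for lon, lat, kws in pois
        if required <= kws and not (forbidden & kws)
    )
    return heapq.nsmallest(k, matches, key=lambda m: m[0])

for dist, loc, kws in top_k_spatial_boolean((-80.19, 25.77), {"restaurant"}, {"seafood"}, k=2):
    print(f"{loc}  dist={dist:.3f}  keywords={sorted(kws)}")
```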

Relevance: 60.00%

Abstract:

This thesis presents a cloud-based software platform for sharing publicly available scientific datasets. The proposed platform leverages the potential of NoSQL databases and asynchronous I/O technologies, such as Node.JS, in order to achieve high performance and flexible solutions. The solution serves two main groups of users: dataset providers, the researchers responsible for sharing and maintaining datasets, and dataset users, those who wish to access the public data. The former are given tools to easily publish and maintain large volumes of data, whereas the latter are given tools to preview and create subsets of the original data through filter and aggregation operations. The choice of NoSQL over a more traditional RDBMS emerged from an extended benchmark between a relational database (MySQL) and a NoSQL database (MongoDB) that is also presented in this thesis. The results obtained confirm the theoretical expectation that NoSQL databases are more suitable for the kind of data our system's users will be handling, i.e., non-homogeneous data structures that can grow very quickly. It is envisioned that a platform like this can lead the way to a new era of scientific data sharing in which researchers are able to easily share and access all kinds of datasets, and, in more advanced scenarios, are presented with recommended datasets and existing research results built on top of those recommendations.
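
The subset-creation operations the platform exposes to dataset users, namely filters and aggregations over a shared collection, map naturally onto MongoDB queries. The collection name, fields, and pipeline below are hypothetical examples of such operations, not the platform's actual schema or endpoints.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
col = client["open_science"]["measurements"]       # hypothetical database/collection

# Filter: select a subset of the shared dataset by field predicates.
subset = col.find(
    {"station": "A-12", "temperature": {"$gte": 20.0}},
    projection={"_id": 0, "timestamp": 1, "temperature": 1},
)
for doc in subset.limit(3):
    print(doc)

# Aggregation: reduce the subset to daily averages before download.
daily_avg = col.aggregate([
    {"$match": {"station": "A-12"}},
    {"$group": {"_id": {"$substr": ["$timestamp", 0, 10]},   # YYYY-MM-DD prefix
                "avg_temp": {"$avg": "$temperature"}}},
    {"$sort": {"_id": 1}},
])
for doc in daily_avg:
    print(doc)
```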

Relevance: 60.00%

Abstract:

The key functional operability in the pre-Lisbon PJCCM pillar of the EU is the exchange of intelligence and information amongst the law enforcement bodies of the EU. The twin issues of data protection and data security within what was the EU's third pillar legal framework therefore come to the fore. With the Lisbon Treaty reform of the EU, the increased role of the Commission in PJCCM policy areas, and the integration of the PJCCM provisions with what have traditionally been the pillar I activities of Frontex, the opportunity arises for streamlining the data protection and data security provisions of the law enforcement bodies of the post-Lisbon EU. This is recognised by the Commission in their drafting of an amending regulation for Frontex, when they say that they would prefer “to return to the question of personal data in the context of the overall strategy for information exchange to be presented later this year and also taking into account the reflection to be carried out on how to further develop cooperation between agencies in the justice and home affairs field as requested by the Stockholm programme.” The literature published on this topic has, for the most part, focused on the data protection provisions in Pillar I, EC. While the focus of research has recently shifted to the former Pillar III PJCCM provisions on data protection, a more focused analysis of the interlocking issues of data protection and data security needs to be made in the context of the law enforcement bodies, particularly those which were based in the pre-Lisbon third pillar. This paper contributes to that debate, arguing that a review of both the data protection and data security provisions post-Lisbon is required, not only to reinforce individual rights, but also to support inter-agency operability in combating cross-border EU crime. The EC's provisions on data protection, as enshrined in Directive 95/46/EC, do not apply to the legal frameworks covering developments within the third pillar of the EU. Even Council Framework Decision 2008/977/JHA, which is supposed to cover data protection within PJCCM, expressly states that its provisions do not apply to “Europol, Eurojust, the Schengen Information System (SIS)” or to the Customs Information System (CIS). In addition, the post-Treaty of Prüm provisions covering the sharing of DNA profiles, dactyloscopic data and vehicle registration data pursuant to Council Decision 2008/615/JHA are not covered by the provisions of the 2008 Framework Decision. As stated by Hijmans and Scirocco, the regime is “best defined as a patchwork of data protection regimes”, with “no legal framework which is stable and unequivocal, like Directive 95/46/EC in the First pillar”. Data security issues are also key to the sharing of data in organised crime or counter-terrorism situations. This article critically analyses the current legal framework for data protection and security within the third pillar of the EU.

Relevance: 60.00%

Abstract:

Background: Edoxaban, an oral factor Xa inhibitor, is non-inferior for prevention of stroke and systemic embolism in patients with atrial fibrillation and is associated with less bleeding than well controlled warfarin therapy. Few safety data about edoxaban in patients undergoing electrical cardioversion are available.

Methods: We did a multicentre, prospective, randomised, open-label, blinded-endpoint evaluation trial in 19 countries with 239 sites comparing edoxaban 60 mg per day with enoxaparin–warfarin in patients undergoing electrical cardioversion of non-valvular atrial fibrillation. The dose of edoxaban was reduced to 30 mg per day if one or more factors (creatinine clearance 15–50 mL/min, low bodyweight [≤60 kg], or concomitant use of P-glycoprotein inhibitors) were present. Block randomisation (block size four), stratified by cardioversion approach (transoesophageal echocardiography [TEE] or not), anticoagulant experience, selected edoxaban dose, and region, was done through a voice-web system. The primary efficacy endpoint was a composite of stroke, systemic embolic event, myocardial infarction, and cardiovascular mortality, analysed by intention to treat. The primary safety endpoint was major and clinically relevant non-major (CRNM) bleeding in patients who received at least one dose of study drug. Follow-up was 28 days on study drug after cardioversion plus 30 days to assess safety. This trial is registered with ClinicalTrials.gov, number NCT02072434.

Findings: Between March 25, 2014, and Oct 28, 2015, 2199 patients were enrolled and randomly assigned to receive edoxaban (n=1095) or enoxaparin–warfarin (n=1104). The mean age was 64 years (SD 10·54) and mean CHA2DS2-VASc score was 2·6 (SD 1·4). Mean time in therapeutic range on warfarin was 70·8% (SD 27·4). The primary efficacy endpoint occurred in five (<1%) patients in the edoxaban group versus 11 (1%) in the enoxaparin–warfarin group (odds ratio [OR] 0·46, 95% CI 0·12–1·43). The primary safety endpoint occurred in 16 (1%) of 1067 patients given edoxaban versus 11 (1%) of 1082 patients given enoxaparin–warfarin (OR 1·48, 95% CI 0·64–3·55). The results were independent of the TEE-guided strategy and anticoagulation status.

Interpretation: ENSURE-AF is the largest prospective randomised clinical trial of anticoagulation for cardioversion of patients with non-valvular atrial fibrillation. Rates of major and CRNM bleeding and thromboembolism were low in the two treatment groups.

Funding: Daiichi Sankyo provided financial support for the study. © 2016 Elsevier Ltd