950 results for Open Data, Dati Aperti, Open Government Data
Abstract:
BACKGROUND: EGFR overexpression occurs in 27-55% of oesophagogastric adenocarcinomas, and correlates with poor prognosis. We aimed to assess addition of the anti-EGFR antibody panitumumab to epirubicin, oxaliplatin, and capecitabine (EOC) in patients with advanced oesophagogastric adenocarcinoma. METHODS: In this randomised, open-label phase 3 trial (REAL3), we enrolled patients with untreated, metastatic, or locally advanced oesophagogastric adenocarcinoma at 63 centres (tertiary referral centres, teaching hospitals, and district general hospitals) in the UK. Eligible patients were randomly allocated (1:1) to receive up to eight 21-day cycles of open-label EOC (epirubicin 50 mg/m² and oxaliplatin 130 mg/m² on day 1 and capecitabine 1250 mg/m² per day on days 1-21) or modified-dose EOC plus panitumumab (mEOC+P; epirubicin 50 mg/m² and oxaliplatin 100 mg/m² on day 1, capecitabine 1000 mg/m² per day on days 1-21, and panitumumab 9 mg/kg on day 1). Randomisation was blocked and stratified for centre region, extent of disease, and performance status. The primary endpoint was overall survival in the intention-to-treat population. We assessed safety in all patients who received at least one dose of study drug. After a preplanned independent data monitoring committee review in October, 2011, trial recruitment was halted and panitumumab withdrawn. Data for patients on treatment were censored at this timepoint. This study is registered with ClinicalTrials.gov, number NCT00824785. FINDINGS: Between June 2, 2008, and Oct 17, 2011, we enrolled 553 eligible patients. Median overall survival in 275 patients allocated EOC was 11.3 months (95% CI 9.6-13.0) compared with 8.8 months (7.7-9.8) in 278 patients allocated mEOC+P (hazard ratio [HR] 1.37, 95% CI 1.07-1.76; p=0.013). mEOC+P was associated with increased incidence of grade 3-4 diarrhoea (48 [17%] of 276 patients allocated mEOC+P vs 29 [11%] of 266 patients allocated EOC), rash (29 [11%] vs two [1%]), mucositis (14 [5%] vs none), and hypomagnesaemia (13 [5%] vs none) but reduced incidence of haematological toxicity (grade ≥ 3 neutropenia 35 [13%] vs 74 [28%]). INTERPRETATION: Addition of panitumumab to EOC chemotherapy does not increase overall survival and cannot be recommended for use in an unselected population with advanced oesophagogastric adenocarcinoma. FUNDING: Amgen, UK National Institute for Health Research Biomedical Research Centre.
Abstract:
Massive Open Online Courses (MOOCs) generate enormous amounts of data. The University of Southampton has run and is running dozens of MOOC instances. The vast amount of data resulting from our MOOCs can provide highly valuable information to all parties involved in the creation and delivery of these courses. However, analysing and visualising such data is a task that not all educators have the time or skills to undertake. The recently developed MOOC Dashboard is a tool aimed at bridging this gap: it provides reports and visualisations based on the data generated by learners in MOOCs. Speakers: Manuel Leon is currently a Lecturer in Online Teaching and Learning in the Institute for Learning Innovation and Development (ILIaD). Adriana Wilde is a Teaching Fellow in Electronics and Computer Science, with research interests in MOOCs and Learning Analytics. Darron Tang (4th Year BEng Computer Science) and Jasmine Cheng (BSc Mathematics & Actuarial Science, starting MSc Data Science shortly) have been working as interns over this summer (2016) and have been developing the MOOC Dashboard.
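For illustration only, here is a minimal sketch of the kind of aggregation such a dashboard might perform, assuming a hypothetical CSV export of learner step activity; the file name and the columns learner_id, step and completed_at are assumptions, not the actual MOOC Dashboard schema or code.

```python
# Hypothetical step-completion report; column names are assumed, not the
# real MOOC Dashboard schema.
import pandas as pd

def step_completion_report(csv_path: str) -> pd.DataFrame:
    """Summarise how many learners completed each step, and when."""
    activity = pd.read_csv(csv_path, parse_dates=["completed_at"])
    return (
        activity.dropna(subset=["completed_at"])
                .groupby("step")
                .agg(learners=("learner_id", "nunique"),
                     first_completion=("completed_at", "min"),
                     last_completion=("completed_at", "max"))
                .sort_index()
    )

if __name__ == "__main__":
    print(step_completion_report("step_activity.csv"))
```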
Abstract:
We study the existence of solutions of quasilinear elliptic systems involving $N$ equations and a measure on the right-hand side, of the form $$\left\{\begin{array}{ll} -\sum_{i=1}^{n}\dfrac{\partial}{\partial x_i}\left(\sum_{\beta=1}^{N}\sum_{j=1}^{n} a_{i,j}^{\alpha,\beta}(x,u)\,\dfrac{\partial u^{\beta}}{\partial x_j}\right)=\mu^{\alpha} & \mbox{ in }\Omega,\\ u=0 & \mbox{ on }\partial\Omega, \end{array}\right.$$ where $\alpha\in\{1,\dots,N\}$ is the equation index, $\Omega$ is an open bounded subset of $\mathbb{R}^{n}$, $u:\Omega\rightarrow\mathbb{R}^{N}$, and $\mu$ is a finite Radon measure on $\mathbb{R}^{n}$ with values in $\mathbb{R}^{N}$. Existence of a solution is proved for two different sets of assumptions on $A$. Examples are provided that satisfy our conditions, but do not satisfy the conditions required in previous works on this matter.
Abstract:
The last decades have been characterized by a continuous adoption of IT solutions in the healthcare sector, which has resulted in the proliferation of tremendous amounts of data over heterogeneous systems. Distinct data types are currently generated, manipulated, and stored in the various institutions where patients are treated. Data sharing and integrated access to this information will allow relevant knowledge to be extracted, which can lead to better diagnoses and treatments. This thesis proposes new integration models for gathering information and extracting knowledge from multiple and heterogeneous biomedical sources. The complexity of this scenario led us to split the integration problem according to data type and usage specificity. The first contribution is a cloud-based architecture for exchanging medical imaging services. It offers a simplified registration mechanism for providers and services, promotes remote data access, and facilitates the integration of distributed data sources. Moreover, it is compliant with international standards, ensuring the platform's interoperability with current medical imaging devices. The second proposal is a sensor-based architecture for the integration of electronic health records. It follows a federated integration model and aims to provide a scalable solution to search and retrieve data from multiple information systems. The last contribution is an open architecture for gathering patient-level data from dispersed and heterogeneous databases. All the proposed solutions were deployed and validated in real-world use cases.
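As a purely illustrative sketch of the federated integration model mentioned above, the snippet below fans a query out to several record systems and merges the results; the endpoint URLs, the /patients path and the query parameter are hypothetical and do not reflect the thesis' actual implementation.

```python
# Illustrative federated query fan-out; endpoints and API shape are invented.
from concurrent.futures import ThreadPoolExecutor
import requests

SOURCES = [
    "https://hospital-a.example/api",   # hypothetical institution endpoints
    "https://hospital-b.example/api",
]

def search_source(base_url: str, query: str) -> list[dict]:
    """Query one institution's record system and return its matches."""
    resp = requests.get(f"{base_url}/patients", params={"q": query}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def federated_search(query: str) -> list[dict]:
    """Fan the query out to every registered source and merge the results."""
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        per_source = list(pool.map(lambda url: search_source(url, query), SOURCES))
    return [record for records in per_source for record in records]
```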
Abstract:
Aim. Laparoscopic Appendectomy (LA) is widely performed for the treatment of acute appendicitis. However, the use of the laparoscopic approach for complicated appendicitis is controversial, in particular because an increased risk of postoperative Intra-Abdominal Abscess (IAA) has been reported. The aim of this study was to compare the outcomes of LA versus Open Appendectomy (OA) in the treatment of complicated appendicitis, especially with regard to the incidence of postoperative IAA. Patients and Methods. A retrospective study of all patients treated at our institution for complicated appendicitis, from May 2004 to June 2009, was performed. Data collection included demographic characteristics, postoperative complications, conversion rate, and length of hospital stay. Results. Thirty-eight patients with complicated appendicitis were analysed. Among these, 18 (47.3%) had LA and 20 (52.7%) had OA. There were no statistically significant differences in characteristics between the two groups. The incidence of postoperative IAA was higher in the LA group (16.6%) than in the OA group (5%), although the difference was not statistically significant. On the other hand, the rate of wound infection was lower in the LA group (5%) than in the OA group (20%). Conclusion. Our study indicates that LA should be used with caution in cases of perforated appendicitis, because it is associated with an increased risk of postoperative IAA compared with OA.
Abstract:
The Open Journal project has completed its three-year period of funding by the UK Electronic Libraries (eLib) programme (Rusbridge 1998). During that time, the number of journals available electronically leapt from a few tens to a few thousand. Some of these journals are now developing the sort of features the project has been advocating, in particular the use of links within journals, between different primary journals, with secondary journals data, and to non-journal sources. Assessing the achievements of the project and considering some of the difficulties it faced, we report on the different approaches to linking that the project developed, and summarise the important user responses that indicate what works and what does not. Looking ahead, there are signs of change, not just to simple linking within journals but to schemes in which links are the basis of "distributed" journals, where information may be shared and documents built from different sources. The significance of this has yet to be fully appreciated, but it would be a major change from printed journals. While projects such as this and others have provided the initial impetus, the motivation for distributed journals comes, perhaps surprisingly, from within certain parts of the industry, as the paper shows.
Abstract:
The Open Access movement has encouraged the availability of publicly funded research papers, data and learning content for barrier-free use without payment by the user. The impact of the increasing availability of content to researchers in European universities is understood in terms of easier access to previous research and greater exposure for new research results, bringing benefits to the research community itself. A new culture of informal sharing is evident within the teaching and learning communities and to some extent also within the research community, but as yet the growth in informal sharing has not had a major effect upon the use of formal publication choices. This briefing paper explores the impact of open access upon potential users of research outputs outside the walls of research-led European universities, where the economic value of open access may be even greater than the academic value within universities. The potential impact of open access is understood in many communities but requires a greater volume of open access content to be available for the full potential to be realised. More open access content will become available as the opportunities in open, internet-based digital scholarship are understood. This briefing paper was written in cooperation with SPARC Europe. All links provided in footnotes in this briefing paper are to studies available in open access.
Abstract:
In order to optimize frontal detection in sea surface temperature fields at 4 km resolution, a combined statistical and expert-based approach is applied to test different spatial smoothings of the data prior to the detection process. Fronts are usually detected at 1 km resolution using the histogram-based, single-image edge detection (SIED) algorithm developed by Cayula and Cornillon in 1992, with a standard preliminary smoothing using a median filter and a 3 × 3 pixel kernel. Here, detections are performed in three study regions (off Morocco, the Mozambique Channel, and north-western Australia) and across the Indian Ocean basin using the combination of multiple windows (CMW) method developed by Nieto, Demarcq and McClatchie in 2012, which improves on the original Cayula and Cornillon algorithm. Detections at 4 km and 1 km resolution are compared. Fronts are divided into two intensity classes (“weak” and “strong”) according to their thermal gradient. A preliminary smoothing is applied prior to the detection using different convolutions: three types of filters (median, average and Gaussian) combined with four kernel sizes (3 × 3, 5 × 5, 7 × 7, and 9 × 9 pixels) and three detection window sizes (16 × 16, 24 × 24 and 32 × 32 pixels), to test the effect of these smoothing combinations on reducing the background noise of the data and therefore on improving the frontal detection. The performance of the combinations on 4 km data is evaluated using two criteria: detection efficiency and front length. We find that the optimal combination of preliminary smoothing parameters for enhancing detection efficiency and preserving front length includes a median filter, a 16 × 16 pixel window size, and a 5 × 5 pixel kernel for strong fronts and a 7 × 7 pixel kernel for weak fronts. Results show an improvement in detection performance (from largest to smallest window size) of 71% for strong fronts and 120% for weak fronts. Despite the small window used (16 × 16 pixels), the length of the fronts is preserved relative to that found with 1 km data. This optimal preliminary smoothing and the CMW detection algorithm on 4 km sea surface temperature data are then used to describe the spatial distribution of the monthly frequencies of occurrence of both strong and weak fronts across the Indian Ocean basin. In general, strong fronts are observed in coastal areas whereas weak fronts, with some seasonal exceptions, are mainly located in the open ocean. This study shows that adequate noise reduction by a preliminary smoothing of the data considerably improves the frontal detection efficiency as well as the overall quality of the results. Consequently, the use of 4 km data enables frontal detections similar to those obtained with 1 km data (using a standard median 3 × 3 convolution) in terms of detectability, length and location. This method, using 4 km data, is easily applicable to large regions or at the global scale with far fewer constraints on data manipulation and processing time relative to 1 km data.
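For readers who want to experiment with the preliminary smoothing step, here is a minimal sketch assuming a 2-D NumPy array of sea surface temperature; the simple gradient test and the threshold value stand in for the SIED/CMW detection, which is not reimplemented here.

```python
# Preliminary median smoothing followed by a crude gradient-based front flag.
# The threshold is an assumption; the study itself uses the SIED/CMW algorithm.
import numpy as np
from scipy.ndimage import median_filter

def smooth_and_flag_fronts(sst: np.ndarray, kernel: int = 5,
                           grad_threshold: float = 0.04) -> np.ndarray:
    """Median-smooth an SST field, then flag pixels whose thermal gradient
    magnitude (degC per pixel) exceeds a threshold."""
    smoothed = median_filter(sst, size=kernel)   # e.g. 5x5 for strong fronts, 7x7 for weak
    dy, dx = np.gradient(smoothed)
    return np.hypot(dx, dy) > grad_threshold
```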
Abstract:
Data leakage is a serious issue and can result in the loss of sensitive data, compromising user accounts and details, potentially affecting millions of internet users. This paper contributes to research in online security and reducing personal footprint by evaluating the levels of privacy provided by the Firefox browser. The aim of identifying conditions that would minimize data leakage and maximize data privacy is addressed by assessing and comparing data leakage in the four possible browsing modes: normal and private modes, using either a browser installed on the host PC or a portable browser run from a connected USB device. To provide a firm foundation for analysis, a series of carefully designed, pre-planned browsing sessions was repeated in each of the various modes of Firefox. This included low-RAM environments, to determine any effects low RAM may have on browser data leakage. The results show that considerable data leakage may occur within Firefox. In normal mode, all of the browsing information is stored within the Mozilla profile folder in Firefox-specific SQLite databases and sessionstore.js. While passwords were not stored as plain text, other confidential information such as credit card numbers could be recovered from the form history under certain conditions. There is no difference when using a portable browser in normal mode, except that the Mozilla profile folder is located on the USB device rather than the host's hard disk. By comparison, private browsing reduces data leakage. Our findings confirm that no information is written to the Firefox-related locations on the hard disk or USB device during private browsing, implying that no deletion would be necessary and no remnants of data would be forensically recoverable from unallocated space. However, two aspects of data leakage occurred equally in all four browsing modes. Firstly, all of the browsing history was stored in live RAM and was therefore accessible while the browser remained open. Secondly, in low-RAM situations, the operating system pages RAM contents out to pagefile.sys on the host's hard disk. Irrespective of the browsing mode used, this may include Firefox history elements, which can then remain forensically recoverable for a considerable time.
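As an illustration of where this information sits in normal mode, the sketch below reads browsing history from a profile's places.sqlite; the query follows the commonly documented moz_places schema, and the database path is a placeholder, not output from the study.

```python
# Read browsing history from a Firefox profile's places.sqlite (normal mode).
# Schema follows the commonly documented moz_places table; path is a placeholder.
import sqlite3
from datetime import datetime, timezone

def read_history(places_db: str) -> list[tuple[str, str]]:
    """Return (UTC visit time, URL) pairs stored in places.sqlite."""
    conn = sqlite3.connect(places_db)
    try:
        rows = conn.execute(
            "SELECT last_visit_date, url FROM moz_places "
            "WHERE last_visit_date IS NOT NULL ORDER BY last_visit_date"
        ).fetchall()
    finally:
        conn.close()
    # last_visit_date is stored as microseconds since the Unix epoch.
    return [
        (datetime.fromtimestamp(ts / 1_000_000, tz=timezone.utc).isoformat(), url)
        for ts, url in rows
    ]
```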
Abstract:
Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have developed some form of legislation focusing on data protection. This paper proposes solutions for monitoring and enforcing data protection laws within an E-government Interoperability Platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. the Enterprise Service Bus) as well as recognized security standards (e.g. the eXtensible Access Control Markup Language), and were fully prototyped using the SwitchYard ESB product.
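Purely as a conceptual sketch of the enforcement idea (not the Uruguayan platform's actual code or an XACML policy), the snippet below shows a policy enforcement check that an ESB mediation flow could invoke before forwarding a request; the attribute names and rules are invented for illustration.

```python
# Conceptual policy enforcement point (PEP); attributes and rules are invented.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    subject_role: str      # e.g. "physician", "clerk"
    resource_type: str     # e.g. "personal_data"
    action: str            # e.g. "read", "transfer"
    consent_given: bool    # whether the data subject has consented

def decide(request: AccessRequest) -> str:
    """Return "Permit" or "Deny", mimicking a XACML-style decision."""
    if request.resource_type == "personal_data" and not request.consent_given:
        return "Deny"
    if request.action == "transfer" and request.subject_role != "data_controller":
        return "Deny"
    return "Permit"

def enforce(request: AccessRequest, forward):
    """Forward the service invocation only when the decision is Permit."""
    if decide(request) != "Permit":
        raise PermissionError("Blocked by data protection policy")
    return forward()
```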
Abstract:
SARAL/AltiKa GDR-T data are analyzed to assess the quality of the significant wave height (SWH) measurements. SARAL along-track SWH plots reveal cases of erroneous data, more or less isolated, that are not detected by the quality flags. The anomalies are often correlated with strong attenuation of the Ka-band backscatter coefficient, which is sensitive to clouds and rain. A quality test based on the 1 Hz standard deviation is proposed to detect such anomalies. Comparison with buoys shows that SARAL SWH is more accurate than Jason-2, particularly at low SWH, and globally does not require any correction. Results are better for open-ocean buoys than for coastal buoys; the scatter and the number of outliers are much larger for coastal buoys. SARAL is then compared with Jason-2 and Cryosat-2. The altimeter data are extracted from the global altimeter SWH Ifremer database, which includes specific corrections to calibrate the various altimeters. The comparison confirms the high quality of SARAL SWH. The 1 Hz standard deviation is much lower than for Jason-2 and Cryosat-2, particularly at low SWH. Furthermore, results show that the corrections applied to Jason-2 and to Cryosat-2 in the database are effective, improving the overall agreement between the three altimeters.
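A minimal sketch of the kind of quality test described, assuming arrays of 1 Hz SWH and their along-track standard deviations; the threshold value is an assumption for illustration, not the criterion used in the study.

```python
# Flag 1 Hz SWH records with anomalously high standard deviation.
# The threshold is illustrative, not the study's actual criterion.
import numpy as np

def flag_suspect_swh(swh_1hz: np.ndarray, swh_std_1hz: np.ndarray,
                     max_std: float = 1.0) -> np.ndarray:
    """Return a boolean mask that is True where a record looks erroneous."""
    return (swh_std_1hz > max_std) | ~np.isfinite(swh_1hz)
```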
Abstract:
Wind-generated waves in the Kara, Laptev, and East Siberian Seas are investigated using altimeter data from Envisat RA-2 and SARAL-AltiKa. Only isolated ice-free zones were selected for analysis, so that wind seas can be treated as pure wind-generated waves without any contamination by ambient swell. Such zones were identified using ice concentration data from microwave radiometers. Altimeter data, both significant wave height (SWH) and wind speed, for these areas were obtained for the period 2002-2012 from Envisat RA-2 measurements, and for 2013 from SARAL-AltiKa. Dependencies of dimensionless SWH and wavelength on the dimensionless spatial scale of wave generation are compared with known empirical dependencies for fetch-limited wind wave development. We further check the sensitivity of the Ka- and Ku-bands and discuss the new possibilities that AltiKa's higher resolution can open.
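For reference, a short sketch of the non-dimensional scaling used when comparing wind seas with fetch-limited growth laws; the JONSWAP-type reference curve in the last function is a commonly quoted relation, not necessarily the empirical dependence adopted in this study.

```python
# Non-dimensional SWH and fetch, plus a commonly quoted JONSWAP-type
# reference growth curve (an assumption, not necessarily the study's choice).
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def dimensionless_swh(hs: np.ndarray, u10: np.ndarray) -> np.ndarray:
    """H* = g * Hs / U10^2 (Hs in m, U10 in m/s)."""
    return G * hs / u10**2

def dimensionless_fetch(fetch_m: np.ndarray, u10: np.ndarray) -> np.ndarray:
    """X* = g * X / U10^2 (X in m, U10 in m/s)."""
    return G * fetch_m / u10**2

def jonswap_reference_swh(x_star: np.ndarray) -> np.ndarray:
    """Reference fetch-limited growth curve H* ≈ 1.6e-3 * sqrt(X*)."""
    return 1.6e-3 * np.sqrt(x_star)
```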
Abstract:
Due to the growth of design size and complexity, design verification is an important aspect of the logic circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification. This is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques; using models to replicate the behaviour of an actual system is called simulation. In this thesis, a software/data-structure architecture without explicit locks is proposed to accelerate logic gate circuit simulation. We call this system ZSIM. The ZSIM software architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and two other machines (Intel Xeon and AMD Opteron). The aim of these experiments is to:
• Verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (section 5.3.1).
• Verify that, on sufficiently large circuits, substantial gains can be made from multicore parallelism (section 5.3.2).
• Show that a simulator using this approach out-performs an existing commercial simulator on a standard workstation (section 5.3.3).
• Show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive supercomputers (section 5.3.5).
To evaluate ZSIM, two types of test circuits were used:
1. Circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators.
2. Circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits that are statistically representative of real logic circuits. The synthesizer allowed testing of a range of very large circuits, larger than those for which it was possible to obtain open source files.
The experimental results show that with SIMD acceleration and multicore parallelism, ZSIM gained a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, it was shown that this software architecture simulator running on a SIMD machine is much faster than, and can handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance achieved by ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. It was shown that the ZSIM simulator running on a Xeon Phi machine gives comparable simulation performance to the IBM Blue Gene supercomputer at very much lower cost. The experimental results have shown that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, the automatic cache management of the Xeon Phi handles and manages the on-chip local store without any explicit mention of the local store in the architecture of the simulator itself, whereas targeting GPUs requires explicit cache management in the program, which increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines. The same architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron.
To conclude, the two main achievements are restated as follows: the primary achievement of this work was demonstrating that the ZSIM architecture is faster than previously published logic simulators on low-cost platforms; the secondary achievement was the development of a synthetic testing suite that goes beyond the scale range previously publicly available, based on prior work showing that the synthesis technique is valid.
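As a loose, toy-scale illustration of the array-based, gather-friendly evaluation idea described in the abstract above (not the actual ZSIM data structure or code), the sketch below stores per-gate input indices in flat arrays so that evaluating one circuit level is two gathers plus an elementwise logic operation, with NumPy vectorisation standing in for SIMD and many stimulus vectors simulated at once.

```python
# Toy levelised gate evaluation: gather inputs via index arrays, then apply
# an elementwise logic op across many stimulus vectors in parallel.
# This is an illustration only, not the ZSIM implementation.
import numpy as np

def simulate_level(values: np.ndarray, in_a: np.ndarray, in_b: np.ndarray,
                   op: str) -> np.ndarray:
    """values: (num_signals, num_vectors) boolean signal values.
    in_a, in_b: per-gate indices of the two input signals (the gather step).
    Returns the (num_gates, num_vectors) outputs of this level."""
    a = values[in_a]          # gather first inputs for all gates, all vectors
    b = values[in_b]          # gather second inputs
    if op == "and":
        return a & b
    if op == "or":
        return a | b
    if op == "xor":
        return a ^ b
    raise ValueError(f"unsupported gate type: {op}")

# Example: two AND gates fed by signals (0, 1) and (2, 3) over four stimulus vectors.
values = np.array([[0, 1, 1, 1],
                   [1, 1, 0, 1],
                   [1, 0, 1, 1],
                   [1, 1, 1, 0]], dtype=bool)
outputs = simulate_level(values, np.array([0, 2]), np.array([1, 3]), op="and")
```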