889 results for Open Data, Dati Aperti, Open Government Data


Relevance: 60.00%

Publisher:

Abstract:

Three questions on the study of the sweat lodges of the NW Iberian Peninsula are posed. First, the new sauna of Monte Ornedo (Cantabria), the re-examination of that of Armea (Ourense), and the Cantabrian pedra formosa type are discussed. Second, the known types of sweat lodges are reconsidered, underlining the differences between the Cantabrian and the Douro-Minho groups, as these differences contribute to a better assessment of the saunas located outside those territories, such as those of Monte Ornedo or Ulaca. Third, a richer record demands a more specific terminology, a wider use of archaeometric analysis, and the application of landscape archaeology and art history methodologies. In this way the range of interpretation of the sweat lodges is broadened; as an example, an essay is proposed that builds on some already known proposals and suggests that the saunas are material metaphors of wombs, whose rationale derives from ideologies and ritual practices of Indo-European tradition.

Relevance: 60.00%

Publisher:

Abstract:

Here, we describe gene expression compositional assignment (GECA), a powerful, yet simple method based on compositional statistics that can validate the transfer of prior knowledge, such as gene lists, into independent data sets, platforms and technologies. Transcriptional profiling has been used to derive gene lists that stratify patients into prognostic molecular subgroups and assess biomarker performance in the pre-clinical setting. Archived public data sets are an invaluable resource for subsequent in silico validation, though their use can lead to data integration issues. We show that GECA can be used without the need for normalising expression levels between data sets and can outperform rank-based correlation methods. To validate GECA, we demonstrate its success in the cross-platform transfer of gene lists in different domains including: bladder cancer staging, tumour site of origin and mislabelled cell lines. We also show its effectiveness in transferring an epithelial ovarian cancer prognostic gene signature across technologies, from a microarray to a next-generation sequencing setting. In a final case study, we predict the tumour site of origin and histopathology of epithelial ovarian cancer cell lines. In particular, we identify and validate the commonly-used cell line OVCAR-5 as non-ovarian, being gastrointestinal in origin. GECA is available as an open-source R package.
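The abstract does not spell out GECA's internals, but the core idea of comparing a gene list's expression composition across platforms without normalisation can be illustrated with standard compositional statistics. The sketch below is not the GECA algorithm itself; it simply closes each sample's expression values over a gene list to relative proportions and compares samples with the Aitchison (clr-based) distance. All names and data are hypothetical.

```python
import numpy as np

def closure(x):
    """Rescale a vector of positive expression values to proportions summing to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(p):
    """Centred log-ratio transform of a composition (requires strictly positive parts)."""
    logp = np.log(p)
    return logp - logp.mean()

def aitchison_distance(x, y):
    """Compositional (Aitchison) distance between two expression profiles restricted
    to the same gene list; platform-specific scale differences cancel out."""
    return np.linalg.norm(clr(closure(x)) - clr(closure(y)))

# Hypothetical expression values for the same 5-gene list measured on two platforms.
microarray_sample = np.array([120.0, 85.0, 40.0, 300.0, 15.0])       # arbitrary intensities
rnaseq_sample     = np.array([2400.0, 1700.0, 900.0, 6100.0, 310.0])  # arbitrary counts

print(aitchison_distance(microarray_sample, rnaseq_sample))
```

Because the comparison is made on relative proportions rather than raw expression levels, the two samples can be compared directly even though the platforms report values on very different scales.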

Relevance: 60.00%

Publisher:

Abstract:

BACKGROUND: EGFR overexpression occurs in 27-55% of oesophagogastric adenocarcinomas, and correlates with poor prognosis. We aimed to assess addition of the anti-EGFR antibody panitumumab to epirubicin, oxaliplatin, and capecitabine (EOC) in patients with advanced oesophagogastric adenocarcinoma. METHODS: In this randomised, open-label phase 3 trial (REAL3), we enrolled patients with untreated, metastatic, or locally advanced oesophagogastric adenocarcinoma at 63 centres (tertiary referral centres, teaching hospitals, and district general hospitals) in the UK. Eligible patients were randomly allocated (1:1) to receive up to eight 21-day cycles of open-label EOC (epirubicin 50 mg/m(2) and oxaliplatin 130 mg/m(2) on day 1 and capecitabine 1250 mg/m(2) per day on days 1-21) or modified-dose EOC plus panitumumab (mEOC+P; epirubicin 50 mg/m(2) and oxaliplatin 100 mg/m(2) on day 1, capecitabine 1000 mg/m(2) per day on days 1-21, and panitumumab 9 mg/kg on day 1). Randomisation was blocked and stratified for centre region, extent of disease, and performance status. The primary endpoint was overall survival in the intention-to-treat population. We assessed safety in all patients who received at least one dose of study drug. After a preplanned independent data monitoring committee review in October, 2011, trial recruitment was halted and panitumumab withdrawn. Data for patients on treatment were censored at this timepoint. This study is registered with ClinicalTrials.gov, number NCT00824785. FINDINGS: Between June 2, 2008, and Oct 17, 2011, we enrolled 553 eligible patients. Median overall survival in 275 patients allocated EOC was 11.3 months (95% CI 9.6-13.0) compared with 8.8 months (7.7-9.8) in 278 patients allocated mEOC+P (hazard ratio [HR] 1.37, 95% CI 1.07-1.76; p=0.013). mEOC+P was associated with increased incidence of grade 3-4 diarrhoea (48 [17%] of 276 patients allocated mEOC+P vs 29 [11%] of 266 patients allocated EOC), rash (29 [11%] vs two [1%]), mucositis (14 [5%] vs none), and hypomagnesaemia (13 [5%] vs none) but reduced incidence of haematological toxicity (grade ≥ 3 neutropenia 35 [13%] vs 74 [28%]). INTERPRETATION: Addition of panitumumab to EOC chemotherapy does not increase overall survival and cannot be recommended for use in an unselected population with advanced oesophagogastric adenocarcinoma. FUNDING: Amgen, UK National Institute for Health Research Biomedical Research Centre.

Relevance: 60.00%

Publisher:

Abstract:

Massive Open Online Courses (MOOCs) generate enormous amounts of data. The University of Southampton has run, and is running, dozens of MOOC instances. The vast amount of data resulting from our MOOCs can provide highly valuable information to all parties involved in the creation and delivery of these courses. However, analysing and visualising such data is a task that not all educators have the time or skills to undertake. The recently developed MOOC Dashboard is a tool aimed at bridging that gap: it provides reports and visualisations based on the data generated by learners in MOOCs. Speakers: Manuel Leon is currently a Lecturer in Online Teaching and Learning in the Institute for Learning Innovation and Development (ILIaD). Adriana Wilde is a Teaching Fellow in Electronics and Computer Science, with research interests in MOOCs and Learning Analytics. Darron Tang (4th year BEng Computer Science) and Jasmine Cheng (BSc Mathematics & Actuarial Science, shortly starting an MSc in Data Science) have been working as interns over the summer of 2016 and have been developing the MOOC Dashboard.
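As an illustration of the kind of report such a dashboard produces, the sketch below aggregates a hypothetical step-activity export into per-step completion counts with pandas. The file name and column names (learner_id, step, first_visited_at, completed_at) are assumptions for illustration, not the actual MOOC platform export schema.

```python
import pandas as pd

# Hypothetical step-activity export; the real column names may differ.
activity = pd.read_csv("step-activity.csv",
                       parse_dates=["first_visited_at", "completed_at"])

# Learners who completed each step (a typical dashboard metric).
completions = (activity.dropna(subset=["completed_at"])
                       .groupby("step")["learner_id"]
                       .nunique()
                       .rename("learners_completed"))

# Completion rate per step, relative to learners who visited it.
visits = activity.groupby("step")["learner_id"].nunique().rename("learners_visited")
report = pd.concat([visits, completions], axis=1).fillna(0)
report["completion_rate"] = report["learners_completed"] / report["learners_visited"]

print(report.sort_index())
```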

Relevance: 60.00%

Publisher:

Abstract:

We study the existence of solutions of quasilinear elliptic systems involving $N$ equations and a measure on the right-hand side, of the form $$\left\{\begin{array}{ll} -\sum_{i=1}^n \frac{\partial}{\partial x_i}\left(\sum\limits_{\beta=1}^{N}\sum\limits_{j=1}^{n} a_{i,j}^{\alpha,\beta}\left( x,u\right)\frac{\partial}{\partial x_j}u^\beta\right)=\mu^\alpha & \mbox{ in }\Omega ,\\ u=0 & \mbox{ on }\partial\Omega, \end{array}\right.$$ where $\alpha\in\{1,\dots,N\}$ is the equation index, $\Omega$ is an open bounded subset of $\mathbb{R}^{n}$, $u:\Omega\rightarrow\mathbb{R}^{N}$, and $\mu$ is a finite Radon measure on $\mathbb{R}^{n}$ with values in $\mathbb{R}^{N}$. Existence of a solution is proved for two different sets of assumptions on $A$. Examples are provided that satisfy our conditions, but do not satisfy the conditions required in previous works on this matter.

Relevance: 60.00%

Publisher:

Abstract:

The last decades have been characterized by a continuous adoption of IT solutions in the healthcare sector, which has resulted in the proliferation of tremendous amounts of data over heterogeneous systems. Distinct data types are currently generated, manipulated, and stored in the various institutions where patients are treated. Data sharing and integrated access to this information will allow relevant knowledge to be extracted that can lead to better diagnostics and treatments. This thesis proposes new integration models for gathering information and extracting knowledge from multiple and heterogeneous biomedical sources. The complexity of this scenario led us to split the integration problem according to data type and usage specificity. The first contribution is a cloud-based architecture for exchanging medical imaging services. It offers a simplified registration mechanism for providers and services, promotes remote data access, and facilitates the integration of distributed data sources. Moreover, it is compliant with international standards, ensuring the platform's interoperability with current medical imaging devices. The second proposal is a sensor-based architecture for the integration of electronic health records. It follows a federated integration model and aims to provide a scalable solution for searching and retrieving data from multiple information systems. The last contribution is an open architecture for gathering patient-level data from dispersed and heterogeneous databases. All the proposed solutions were deployed and validated in real-world use cases.
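A minimal sketch of the federated idea behind the second architecture: a patient-level query is fanned out to source-specific adapters and the normalised results are merged, with provenance preserved. The adapter interface, source names, and record fields below are all hypothetical, not the thesis's actual design.

```python
from typing import Dict, List

class SourceAdapter:
    """Wraps one institution's information system and returns records
    in a common, source-neutral dictionary format."""
    def __init__(self, name: str, records: List[Dict]):
        self.name = name
        self._records = records  # stand-in for a real backend query

    def find_by_patient(self, patient_id: str) -> List[Dict]:
        hits = [r for r in self._records if r.get("patient_id") == patient_id]
        # Tag each record with its provenance so merged results remain traceable.
        return [{**r, "source": self.name} for r in hits]

def federated_query(patient_id: str, adapters: List[SourceAdapter]) -> List[Dict]:
    """Fan the query out to every registered source and merge the results."""
    merged: List[Dict] = []
    for adapter in adapters:
        merged.extend(adapter.find_by_patient(patient_id))
    return merged

# Hypothetical sources holding different data types for the same patient.
imaging = SourceAdapter("imaging_archive", [{"patient_id": "P42", "type": "CT", "date": "2014-03-02"}])
ehr     = SourceAdapter("hospital_ehr",    [{"patient_id": "P42", "type": "lab", "test": "HbA1c"}])

print(federated_query("P42", [imaging, ehr]))
```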

Relevance: 60.00%

Publisher:

Abstract:

Aim. Laparoscopic Appendectomy (LA) is widely performed for the treatment of acute appendicitis. However, the use of the laparoscopic approach for complicated appendicitis is controversial, in particular because an increased risk of postoperative Intra-Abdominal Abscess (IAA) has been reported. The aim of this study was to compare the outcomes of LA versus Open Appendectomy (OA) in the treatment of complicated appendicitis, especially with regard to the incidence of postoperative IAA. Patients and Methods. A retrospective study of all patients treated at our institution for complicated appendicitis, from May 2004 to June 2009, was performed. Data collection included demographic characteristics, postoperative complications, conversion rate, and length of hospital stay. Results. Thirty-eight patients with complicated appendicitis were analysed. Among these, 18 (47.3%) had LA and 20 (52.7%) had OA. There were no statistical differences in characteristics between the two groups. The incidence of postoperative IAA was higher in the LA group (16.6%) than in the OA group (5%), although the difference was not statistically significant. On the other hand, the rate of wound infection was lower in the LA group (5%) than in the OA group (20%). Conclusion. Our study indicates that LA should be used with caution in cases of perforated appendicitis, because it is associated with an increased risk of postoperative IAA compared with OA.
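To see why the higher IAA rate in the LA group does not reach statistical significance, one can run Fisher's exact test on the counts implied by the reported percentages (roughly 3 of 18 LA patients versus 1 of 20 OA patients). These reconstructed counts are an assumption based on the stated group sizes and rates, not figures taken from the paper.

```python
from scipy.stats import fisher_exact

# Counts reconstructed from the reported percentages (assumption):
# LA: 3 IAA out of 18 patients (~16.6%); OA: 1 IAA out of 20 patients (5%).
table = [[3, 18 - 3],
         [1, 20 - 1]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")  # p is well above 0.05
```

With such small counts, even a threefold difference in incidence is compatible with chance, which is consistent with the study's "not statistically significant" finding.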

Relevance: 60.00%

Publisher:

Abstract:

The Open Journal project has completed its three-year period of funding by the UK Electronic Libraries (eLib) programme (Rusbridge 1998). During that time, the number of journals available electronically leapt from a few tens to a few thousand. Some of these journals are now developing the sort of features the project has been advocating, in particular the use of links within journals, between different primary journals, with secondary journal data, and to non-journal sources. Assessing the achievements of the project and considering some of the difficulties it faced, we report on the different approaches to linking that the project developed, and summarise the important user responses that indicate what works and what does not. Looking ahead, there are signs of change, not just towards simple linking within journals but towards schemes in which links are the basis of "distributed" journals, where information may be shared and documents built from different sources. The significance of this has yet to be appreciated, but it would be a major change from printed journals. If projects such as this and others have provided the initial impetus, the motivation for distributed journals comes, perhaps surprisingly, from within certain parts of the industry, as the paper shows.

Relevance: 60.00%

Publisher:

Abstract:

The Open Access movement has encouraged the availability of publicly funded research papers, data, and learning content for barrier-free use of that content without payment by the user. The impact of the increasing availability of content to researchers in European universities is understood in terms of easier access to previous research and greater exposure for new research results, bringing benefits to the research community itself. A new culture of informal sharing is evident within the teaching and learning communities and, to some extent, also within the research community, but as yet the growth in informal sharing has not had a major effect upon formal publication choices. This briefing paper explores the impact of open access upon potential users of research outputs outside the walls of research-led European universities, where the economic value of open access may be even greater than the academic value within universities. The potential impact of open access is understood in many communities but requires a greater volume of open access content to be available for the full potential to be realised. More open access content will become available as the opportunities in open, internet-based digital scholarship are understood. This briefing paper was written in cooperation with SPARC Europe. All links provided in footnotes in this briefing paper are to studies available in open access.

Relevance: 60.00%

Publisher:

Abstract:

In order to optimize frontal detection in sea surface temperature fields at 4 km resolution, a combined statistical and expert-based approach is applied to test different spatial smoothings of the data prior to the detection process. Fronts are usually detected at 1 km resolution using the histogram-based, single image edge detection (SIED) algorithm developed by Cayula and Cornillon in 1992, with a standard preliminary smoothing using a median filter and a 3 × 3 pixel kernel. Here, detections are performed in three study regions (off Morocco, the Mozambique Channel, and north-western Australia) and across the Indian Ocean basin using the combination of multiple windows (CMW) method developed by Nieto, Demarcq and McClatchie in 2012, which improves on the original Cayula and Cornillon algorithm. Detections at 4 km and 1 km resolution are compared. Fronts are divided into two intensity classes ("weak" and "strong") according to their thermal gradient. A preliminary smoothing is applied prior to the detection using different convolutions: three types of filters (median, average, and Gaussian) combined with four kernel sizes (3 × 3, 5 × 5, 7 × 7, and 9 × 9 pixels) and three detection window sizes (16 × 16, 24 × 24, and 32 × 32 pixels), to test the effect of these smoothing combinations on reducing the background noise of the data and therefore on improving the frontal detection. The performance of the combinations on 4 km data is evaluated using two criteria: detection efficiency and front length. We find that the optimal combination of preliminary smoothing parameters for enhancing detection efficiency and preserving front length includes a median filter, a 16 × 16 pixel window size, and a 5 × 5 pixel kernel for strong fronts and a 7 × 7 pixel kernel for weak fronts. Results show an improvement in detection performance (from largest to smallest window size) of 71% for strong fronts and 120% for weak fronts. Despite the small window used (16 × 16 pixels), the length of the fronts is preserved relative to that found with 1 km data. This optimal preliminary smoothing and the CMW detection algorithm on 4 km sea surface temperature data are then used to describe the spatial distribution of the monthly frequencies of occurrence for both strong and weak fronts across the Indian Ocean basin. In general, strong fronts are observed in coastal areas whereas weak fronts, with some seasonal exceptions, are mainly located in the open ocean. This study shows that adequate noise reduction through a preliminary smoothing of the data considerably improves the frontal detection efficiency as well as the overall quality of the results. Consequently, the use of 4 km data enables frontal detections similar to those obtained with 1 km data (using a standard median 3 × 3 convolution) in terms of detectability, length, and location. This method, using 4 km data, is easily applicable to large regions or at the global scale, with far fewer constraints on data manipulation and processing time relative to 1 km data.
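The preliminary smoothing step can be reproduced with standard image-filtering routines. The sketch below applies the median filter with the kernel sizes retained for strong and weak fronts to a 4 km SST array; it covers only the smoothing, not the SIED/CMW detection itself, and the input array is a synthetic stand-in.

```python
import numpy as np
from scipy.ndimage import median_filter

# Synthetic stand-in for a 4 km sea surface temperature tile (degrees C).
rng = np.random.default_rng(0)
sst = 20 + np.cumsum(rng.normal(0, 0.05, size=(256, 256)), axis=1)  # weak zonal gradient + noise

# Preliminary smoothing retained by the study: median filter,
# 5 x 5 kernel for strong fronts, 7 x 7 kernel for weak fronts.
sst_strong = median_filter(sst, size=5)
sst_weak   = median_filter(sst, size=7)

# The smoothed fields would then be passed to the SIED/CMW detection
# (16 x 16 pixel windows), which is not implemented here.
print(sst_strong.shape, sst_weak.shape)
```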

Relevance: 60.00%

Publisher:

Abstract:

Data leakage is a serious issue and can result in the loss of sensitive data, compromising user accounts and details, potentially affecting millions of internet users. This paper contributes to research in online security and reducing personal footprint by evaluating the levels of privacy provided by the Firefox browser. The aim of identifying conditions that would minimize data leakage and maximize data privacy is addressed by assessing and comparing data leakage in the four possible browsing modes: normal and private modes, using either a browser installed on the host PC or a portable browser run from a connected USB device. To provide a firm foundation for analysis, a series of carefully designed, pre-planned browsing sessions was repeated in each of the various modes of Firefox. This included low-RAM environments to determine any effects low RAM may have on browser data leakage. The results show that considerable data leakage may occur within Firefox. In normal mode, all of the browsing information is stored within the Mozilla profile folder in Firefox-specific SQLite databases and sessionstore.js. While passwords were not stored as plain text, other confidential information such as credit card numbers could be recovered from the form history under certain conditions. There is no difference when using a portable browser in normal mode, except that the Mozilla profile folder is located on the USB device rather than the host's hard disk. By comparison, private browsing reduces data leakage. Our findings confirm that no information is written to the Firefox-related locations on the hard disk or USB device during private browsing, implying that no deletion would be necessary and no remnants of data would be forensically recoverable from unallocated space. However, two aspects of data leakage occurred equally in all four browsing modes. Firstly, all of the browsing history was stored in live RAM and was therefore accessible while the browser remained open. Secondly, in low-RAM situations, the operating system pages RAM out to pagefile.sys on the host's hard disk. Irrespective of the browsing mode used, this may include Firefox history elements, which can then remain forensically recoverable for a considerable time.
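For normal-mode browsing, the history referred to above lives in Firefox-specific SQLite databases inside the profile folder; places.sqlite with its moz_places table is the usual location, as sketched below. The profile path is an assumption for illustration, and the schema can vary between Firefox versions.

```python
import sqlite3

# Path to the Mozilla profile folder is machine-specific (assumption for illustration).
profile_db = r"C:\Users\alice\AppData\Roaming\Mozilla\Firefox\Profiles\abc123.default\places.sqlite"

con = sqlite3.connect(profile_db)
try:
    # moz_places holds the visited URLs recorded during normal-mode browsing;
    # private browsing leaves this table untouched.
    rows = con.execute(
        "SELECT url, title, visit_count FROM moz_places ORDER BY visit_count DESC LIMIT 10"
    ).fetchall()
    for url, title, visit_count in rows:
        print(visit_count, url, title)
finally:
    con.close()
```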

Relevance: 60.00%

Publisher:

Abstract:

Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery, and invocation of interoperable software services. At the same time, given that personal data handled by governments are often very sensitive, most governments have developed some sort of legislation focusing on data protection. This paper proposes solutions for monitoring and enforcing data protection laws within an E-government Interoperability Platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. Enterprise Service Bus) as well as recognized security standards (e.g. eXtensible Access Control Markup Language) and were fully prototyped using the SwitchYard ESB product.
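A highly simplified sketch of the enforcement idea: a policy decision point is consulted before the platform forwards a service request carrying personal data, and every decision is logged for monitoring. This is generic Python with hypothetical names; it does not reproduce the SwitchYard or XACML artefacts used in the actual prototype.

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("data-protection-monitor")

@dataclass
class ServiceRequest:
    consumer_agency: str
    target_service: str
    data_categories: tuple   # e.g. ("health", "address")
    purpose: str
    has_consent: bool

# Hypothetical rule set standing in for data-protection-law constraints.
SENSITIVE_CATEGORIES = {"health", "criminal_record"}

def decide(req: ServiceRequest) -> bool:
    """Policy decision point: sensitive categories require explicit consent."""
    needs_consent = any(c in SENSITIVE_CATEGORIES for c in req.data_categories)
    permitted = req.has_consent or not needs_consent
    log.info("decision=%s consumer=%s service=%s purpose=%s",
             "PERMIT" if permitted else "DENY",
             req.consumer_agency, req.target_service, req.purpose)
    return permitted

def forward(req: ServiceRequest) -> str:
    """Enforcement point interposed on the interoperability platform."""
    if not decide(req):
        raise PermissionError("request blocked by data protection policy")
    return f"forwarded to {req.target_service}"

print(forward(ServiceRequest("tax_agency", "health_records", ("health",), "audit", True)))
```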

Relevance: 60.00%

Publisher:

Abstract:

SARAL/AltiKa GDR-T data are analyzed to assess the quality of the significant wave height (SWH) measurements. SARAL along-track SWH plots reveal cases of erroneous data, more or less isolated, that are not detected by the quality flags. The anomalies are often correlated with strong attenuation of the Ka-band backscatter coefficient, which is sensitive to clouds and rain. A quality test based on the 1 Hz standard deviation is proposed to detect such anomalies. From buoy comparisons, it is shown that SARAL SWH is more accurate than Jason-2, particularly at low SWH, and globally does not require any correction. Results are better for open-ocean buoys than for coastal buoys; the scatter and the number of outliers are much larger for coastal buoys. SARAL is then compared with Jason-2 and Cryosat-2. The altimeter data are extracted from the global Ifremer altimeter SWH database, which includes specific corrections to calibrate the various altimeters. The comparison confirms the high quality of SARAL SWH. The 1 Hz standard deviation is much lower than for Jason-2 and Cryosat-2, particularly at low SWH. Furthermore, the results show that the corrections applied to Jason-2 and Cryosat-2 in the database are effective, improving the overall agreement between the three altimeters.
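The proposed quality test can be sketched as a simple screening of the along-track record: 1 Hz SWH values whose associated standard deviation is anomalously large relative to the surrounding track are flagged as suspect. The threshold and field names below are assumptions for illustration, not the values used in the study.

```python
import numpy as np

def flag_suspect_swh(swh, swh_std, factor=3.0):
    """Flag 1 Hz SWH records whose standard deviation is anomalously large.

    swh, swh_std : along-track 1 Hz significant wave height and its standard
                   deviation (e.g. spread of the high-rate measurements).
    factor       : multiple of the track's median std used as threshold (assumption).
    """
    swh = np.asarray(swh, dtype=float)
    swh_std = np.asarray(swh_std, dtype=float)
    threshold = factor * np.nanmedian(swh_std)
    return swh_std > threshold

# Hypothetical along-track segment: one record with strongly degraded scatter.
swh     = np.array([1.8, 1.9, 2.0, 2.1, 5.5, 2.0])
swh_std = np.array([0.15, 0.14, 0.16, 0.15, 1.20, 0.16])
print(flag_suspect_swh(swh, swh_std))   # only the degraded record is flagged
```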