939 results for Open Information Extraction


Relevance:

30.00%

Publisher:

Abstract:

The Universitat Oberta de Catalunya (UOC, Open University of Catalonia) is involved in several research projects and educational activities related to the use of Open Educational Resources (OER). Some of the issues discussed in the concept of OER are research issues which are being tackled in two EC projects (OLCOS and SELF). Besides the research part, the UOC aims at developing a virtual centre for analysing and promoting the concept of OER in Europe in the sector of Higher and Further Education. The objectives are to make information and learning services available to provide university management staff, eLearning support centres, faculty and learners with the practical information required to create, share and re-use such interoperable digital content, tools and licensing schemes. In the realisation of these objectives, the main activities are the following: to provide organisational and individual e-learning end-users with orientation; to develop perspectives and useful recommendations in the form of a medium-term Roadmap 2010 for OER in Higher and Further Education in Europe; to offer practical information and support services about how to create, share and re-use open educational content by means of tutorials, guidelines, best practices, and specimens of exemplary open e-learning content; to establish a larger group of committed experts throughout Europe and other continents who not only share their expertise but also steer networking, workshops, and clustering efforts; and to foster and support a community of practice in open e-learning content know-how and experiences.

Relevance:

30.00%

Publisher:

Abstract:

The Universitat Oberta de Catalunya (Open University of Catalonia, UOC) is an online university that makes extensive use of information and communication technologies to provide education. Ever since its establishment in 1995, the UOC has developed and tested methodologies and technological support services to meet the educational challenges posed by its student community and its teaching and management staff. The know-how it has acquired in doing so is the basis on which it has created the Open Apps platform, which is designed to provide access to open source technical applications, information on successful learning and teaching experiences, resources and other solutions, all in a single environment. Open Apps is an open, online catalogue, the content of which is available to all students for learning purposes, all IT professionals for downloading and all teachers for reusing. To contribute to the transfer of knowledge, experience and technology, each of the platform's apps comes with full documentation, plus information on cases in which it has been used and related tools. It is hoped that such transfer will lead to the growth of an external partner network, and that this, in turn, will result in improvements to the applications and teaching/learning practices, and in greater scope for collaboration. Open Apps is a strategic project that has arisen from the UOC's commitment to the open access movement and to giving knowledge and technology back to society, as well as its firm belief that sustainability depends on communities of interest.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to propose a new quantitative approach to assessing the quality of Open Access university institutional repositories. The results of this new approach are tested on the Spanish university repositories. The assessment method is based on a binary codification of a proposed set of features that objectively describe the repositories. The purposes of this method are to assess quality and to provide an almost automatic system for updating the data on these characteristics. First of all, a database was created with the 38 Spanish institutional repositories. The variables of analysis are presented and explained, whether they come from the bibliography or are a set of new variables. Among the characteristics analyzed are the features of the software, the services of the repository, the features of the information system, the Internet visibility and the licenses of use. Results from Spanish universities are provided as a practical example of the assessment and to give a picture of the state of development of the open access movement in Spain.
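The binary codification described above can be sketched as a simple feature checklist: each repository is coded with yes/no values and its quality score is the share of features it satisfies. The feature names below are hypothetical illustrations, not the study's actual variable set.

```python
# Hypothetical binary features, loosely grouped like the abstract's
# categories (software, services, information system, visibility, licenses).
FEATURES = [
    "oai_pmh_interface",
    "persistent_identifiers",
    "usage_statistics",
    "indexed_in_google_scholar",
    "cc_licenses_displayed",
]

def quality_score(repo: dict) -> float:
    """Fraction of binary features the repository satisfies (0.0-1.0)."""
    return sum(1 for f in FEATURES if repo.get(f, False)) / len(FEATURES)

example = {
    "oai_pmh_interface": True,
    "persistent_identifiers": True,
    "usage_statistics": False,
    "indexed_in_google_scholar": True,
    "cc_licenses_displayed": False,
}
print(quality_score(example))  # 3 of 5 features present -> 0.6
```

Because each feature is binary, the coding can be re-checked almost automatically (e.g. by probing the repository's public interfaces), which is the updating property the abstract highlights.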

Relevance:

30.00%

Publisher:

Abstract:

The World Wide Web, the world's largest resource for information, has evolved from organizing information using controlled, top-down taxonomies to a bottom-up approach that emphasizes assigning meaning to data via mechanisms such as the Social Web (Web 2.0). Tagging adds meta-data (weak semantics) to the content available on the web. This research investigates the potential for repurposing this layer of meta-data. We propose a multi-phase approach that exploits user-defined tags to identify and extract domain-level concepts. We operationalize this approach and assess its feasibility by application to a publicly available tag repository. The paper describes insights gained from implementing and applying the heuristics contained in the approach, as well as challenges and implications of repurposing tags for extraction of domain-level concepts.
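The general idea of repurposing tags (normalise user-defined tags, then keep those with enough support as candidate domain-level concepts) can be illustrated with a minimal sketch. The phases and threshold here are illustrative only, not the paper's actual heuristics.

```python
from collections import Counter

def extract_concepts(tagged_items, min_support=2):
    """Return tags attached to at least `min_support` items, after
    lower-casing and stripping whitespace (a weak normalisation phase)."""
    counts = Counter()
    for tags in tagged_items:
        # A set gives each item at most one vote per tag.
        counts.update({t.strip().lower() for t in tags})
    return sorted(t for t, c in counts.items() if c >= min_support)

items = [
    ["Python", "web", "tutorial"],
    ["python ", "scraping"],
    ["Web", "design"],
]
print(extract_concepts(items))  # ['python', 'web']
```

A real pipeline would add further phases (stemming, co-occurrence analysis, filtering of personal tags such as "toread"), but the support-filter step above conveys how weak tag semantics can be distilled into candidate concepts.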

Relevance:

30.00%

Publisher:

Abstract:

Organizations across the globe are creating and distributing products that include open source software. To ensure compliance with the open source licenses, each company needs to evaluate exactly what open source licenses and copyrights are included - resulting in duplicated effort and redundancy. This talk will provide an overview of the new Software Package Data Exchange (SPDX) specification. This specification will provide a common format to share information about the open source licenses and copyrights that are included in any software package, with the goal of saving time and improving data accuracy. This talk will review the progress of the initiative, discuss the benefits to organizations using open source, and share information on how you can contribute.
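The kind of record SPDX standardises can be illustrated with a toy parser for simple tag-value text. The field names below follow the SPDX tag-value style, but this is a hedged sketch of the shared-format idea, not a conformant SPDX implementation.

```python
def parse_tag_value(text: str) -> dict:
    """Parse 'Tag: value' lines into a dict, skipping blanks and comments."""
    record = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        tag, _, value = line.partition(":")
        record[tag.strip()] = value.strip()
    return record

# Illustrative license/copyright facts for a hypothetical package.
doc = """
PackageName: example-lib
PackageLicenseDeclared: Apache-2.0
PackageCopyrightText: Copyright 2011 Example Corp.
"""
info = parse_tag_value(doc)
print(info["PackageLicenseDeclared"])  # Apache-2.0
```

The point of a common format is exactly this: once license and copyright facts travel with the package in a machine-readable form, each downstream consumer can read them instead of re-auditing the code.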

Relevance:

30.00%

Publisher:

Abstract:

Softcatalà is a non-profit association created more than 10 years ago to fight the marginalisation of the Catalan language in information and communication technologies. It has led the localisation of many applications and the creation of a website which allows its users to translate texts between Spanish and Catalan using an external closed-source translation engine. Recently, the closed-source translation back-end has been replaced by a free/open-source solution completely managed by Softcatalà: the Apertium machine translation platform and the ScaleMT web service framework. Thanks to the openness of the new solution, it is possible to take advantage of the huge number of users of the Softcatalà translation service to improve it, using a series of methods presented in this paper. In addition, a study of the translations requested by the users has been carried out, and it shows that the translation back-end change has not affected the usage patterns.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Selective publication of studies, commonly called publication bias, is widely recognized. Over the years a new nomenclature for other types of bias related to non-publication or distortion in the dissemination of research findings has been developed. However, several of these different biases are often still summarized by the term 'publication bias'. METHODS/DESIGN: As part of the OPEN Project (To Overcome failure to Publish nEgative fiNdings) we will conduct a systematic review with the following objectives: to systematically review highly cited articles that focus on non-publication of studies and to present the various definitions of biases related to the dissemination of research findings contained in the articles identified; and to develop and discuss a new framework on the nomenclature of various aspects of distortion in the dissemination process that leads to public availability of research findings, in an international group of experts in the context of the OPEN Project. We will systematically search Web of Knowledge for highly cited articles that provide a definition of biases related to the dissemination of research findings. A specifically designed data extraction form will be developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. For the development of a new framework we will construct an initial table listing different levels and different hazards en route to making research findings public. An international group of experts will iteratively review the table and reflect on its content until no new insights emerge and consensus has been reached. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project, will serve as a basis for the development of future policies and guidelines regarding the assessment and prevention of publication bias.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Classical disease phenotypes are mainly based on descriptions of symptoms and the hypothesis that a given pattern of symptoms provides a diagnosis. With refined technologies there is growing evidence that disease expression in patients is much more diverse, and subtypes need to be defined to allow better targeted treatment. One of the aims of the Mechanisms of the Development of Allergy Project (MeDALL, FP7) is to re-define the classical phenotypes of IgE-associated allergic diseases from birth to adolescence, by consensus among experts using a systematic review of the literature, and to identify possible gaps in research for new disease markers. This paper describes the methods to be used for the systematic review of the classical IgE-associated phenotypes, applicable in general to other systematic reviews also addressing phenotype definitions based on evidence. METHODS/DESIGN: Eligible papers were identified by PubMed search (complete database through April 2011). This search yielded 12,043 citations. The review includes intervention studies (randomized and clinical controlled trials) and observational studies (cohort studies including birth cohorts, case-control studies) as well as case series. Systematic and non-systematic reviews, guidelines, position papers and editorials are not excluded but dealt with separately. Two independent reviewers in parallel conducted consecutive title and abstract filtering scans. For publications where the title and abstract fulfilled the inclusion criteria, the full text was assessed. In the final step, two independent reviewers abstracted data using a pre-designed data extraction form, with disagreements resolved by discussion among investigators. DISCUSSION: The systematic review protocol described here makes it possible to generate broad, multi-phenotype reviews and consensus phenotype definitions. The in-depth analysis of the existing literature on the classification of IgE-associated allergic diseases through such a systematic review will 1) provide relevant information on the current epidemiologic definitions of allergic diseases, 2) address heterogeneity and interrelationships and 3) identify gaps in knowledge.

Relevance:

30.00%

Publisher:

Abstract:

Perceiving the world visually is a basic act for humans, but for computers it is still an unsolved problem. The variability present in natural environments is an obstacle for effective computer vision. The goal of invariant object recognition is to recognise objects in a digital image despite variations in, for example, pose, lighting or occlusion. In this study, invariant object recognition is considered from the viewpoint of feature extraction. The differences between local and global features are studied, with emphasis on Hough transform- and Gabor filtering-based feature extraction. The methods are examined with respect to four capabilities: generality, invariance, stability, and efficiency. Invariant features are presented using both the Hough transform and Gabor filtering. A modified Hough transform technique is also presented in which the distortion tolerance is increased by incorporating local information. In addition, methods for decreasing the computational costs of the Hough transform employing parallel processing and local information are introduced.
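The standard Hough transform for lines, the global technique discussed above, can be sketched compactly: each point votes for every (rho, theta) line parameterisation passing through it, and peaks in the accumulator correspond to lines. This is a minimal illustration of the voting scheme, not the thesis's modified, distortion-tolerant variant.

```python
import math

def hough_lines(points, n_theta=180):
    """Vote in (rho, theta) space using rho = x*cos(theta) + y*sin(theta).

    Returns a sparse accumulator mapping (rounded rho, theta index) -> votes.
    """
    acc = {}
    for x, y in points:
        for ti in range(n_theta):
            theta = ti * math.pi / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(rho, ti)] = acc.get((rho, ti), 0) + 1
    return acc

# Ten points on the horizontal line y = 3: every point votes for the cell
# theta = 90 degrees (index 90 of 180) with rho = 3, so that cell peaks at 10.
pts = [(x, 3) for x in range(10)]
acc = hough_lines(pts)
print(acc[(3, 90)])  # 10
```

The cost is O(points x n_theta), which motivates the efficiency concerns raised in the abstract: restricting the vote to locally estimated orientations, or parallelising over points, cuts this down substantially.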

Relevance:

30.00%

Publisher:

Abstract:

The patent system was created for the purpose of promoting innovation by granting inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential of being used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent depends on the value of the invention it claims and on how it is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits.
Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investments as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with the focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal and business developments that concern software and business-method patents are investigated, and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change the patent system is facing, and of how these challenges are reflected in standard setting.

Relevance:

30.00%

Publisher:

Abstract:

1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49. These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now in the year 2005, after more than half a decade of the domestic implementation, it is yet uncertain as to the proper meaning and construction of the convoluted qualitative criteria the current legislation employs as a prerequisite for database protection both in Finland and within the European Union. Further, this opaque Pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and second, to realise the potential and risks inherent in the new legislation in economic, cultural and societal dimensions. 2. Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches either qualitatively or quantitatively the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer with markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form or even free by means of public lending libraries providing access to the information online. This also renders it possible for authors and publishers to make available and sell their products to markedly larger, international markets while production and distribution costs can be kept at a minimum due to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.
The fear of illegal copying can lead to stark technical protection that in turn can dampen down the demand for information goods and services and, furthermore, efficiently hamper the right of access to the materials available lawfully in electronic form, and thus weaken the possibility of access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy. 3. Particular issues in the Digital Economy and Information Networks All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the Mobile Internet, peer-to-peer networks, and Local and Wide Local Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables, previously protected partially by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet, and by the same token, importantly, numerous databases are collected in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example of this is a database consisting of the physical coordinates of a certain selected group of customers for marketing purposes through cellular phones, laptops and several handheld or vehicle-based devices connected online. These practical needs call for answers to a plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment?
According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are the materials regarded as arranged in a systematic or methodical way? Only when the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill or lead to interpretations that are at variance with the analogue domain as regards the lawful and illegal uses of information. This may well interfere with or rework the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services. 4. International sphere After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, the disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive importing practical, business-oriented solutions may well have application at the European level. This underlines the exigency for a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union.
This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, not yet having a Sui Generis database regime or its kin, while both the political and academic discourse on the matter abounds. 5. The objectives of the study The above-mentioned background, with its several open issues, calls for a detailed study of the following questions: What is a database-at-law and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? How is a database protected and what is its relation to other intellectual property regimes, particularly in the digital context? What are the opportunities and threats provided by the current protection to creators, users and society as a whole, including the commercial and cultural implications? And the difficult question of the relation between database protection and the protection of factual information as such. 6. Disposition The study, purporting to analyse and cast light on the questions above, is divided into three main parts. The first part has the purpose of introducing the political and rational background and subsequent legislative evolution path of European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and Sui Generis right facets in detail, together with the emergent application of the machinery in real-life societal and particularly commercial contexts.
Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the Sui Generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinize the implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue concerning the IPR protection of information per se, a new tenet in the domain of copyright and related rights.

Relevance:

30.00%

Publisher:

Abstract:

Background: Although randomized clinical trials (RCTs) are considered the gold standard of evidence, their reporting is often suboptimal. Trial registries have the potential to contribute important methodologic information for critical appraisal of study results. Methods and Findings: The objective of the study was to evaluate the reporting of key methodologic study characteristics in trial registries. We identified a random sample (n = 265) of actively recruiting RCTs using the World Health Organization International Clinical Trials Registry Platform (ICTRP) search portal in 2008. We assessed the reporting of relevant domains from the Cochrane Collaboration's 'Risk of bias' tool and other key methodological aspects. Our primary outcomes were the proportion of registry records with adequate reporting of random sequence generation, allocation concealment, blinding, and trial outcomes. Two reviewers independently assessed each record. Weighted overall proportions in the ICTRP search portal for adequate reporting of sequence generation, allocation concealment, blinding (including and excluding open-label RCTs) and primary outcomes were 5.7% (95% CI 3.0–8.4%), 1.4% (0–2.8%), 41% (35–47%), 8.4% (4.1–13%), and 66% (60–72%), respectively. The proportion of adequately reported RCTs was higher for registries that used specific methodological fields for describing methods of randomization and allocation concealment than for registries that did not. Concerning other key methodological aspects, weighted overall proportions of RCTs with adequately reported items were as follows: eligibility criteria (81%), secondary outcomes (46%), harm (5%), follow-up duration (62%), description of the interventions (53%) and sample size calculation (1%). Conclusions: Trial registries currently contain limited methodologic information about registered RCTs.
In order to permit adequate critical appraisal of trial results reported in journals and registries, trial registries should consider requesting details on key RCT methods to complement journal publications. Full protocols remain the most comprehensive source of methodologic information and should be made publicly available.
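The 95% confidence intervals quoted in the abstract are standard intervals for a proportion; the arithmetic of a simple (Wald) version can be sketched as p ± 1.96·sqrt(p(1-p)/n). The counts below are hypothetical, and the study itself reported weighted estimates across registries, which this sketch does not reproduce.

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Point estimate and simple Wald 95% CI for a proportion, clamped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical example: 15 of 265 registry records adequately reporting an item.
p, lo, hi = wald_ci(15, 265)
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f}%)")  # 5.7% (95% CI 2.9-8.4%)
```

For proportions near 0 or 1 (such as the 1.4% allocation-concealment figure) the Wald interval behaves poorly, which is why a Wilson or exact interval is usually preferred in practice.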

Relevance:

30.00%

Publisher:

Abstract:

Purpose - This study seeks to analyse the policies of library and information science (LIS) journals regarding the publication of supplementary materials, the number of journals and articles that include this feature, the kind of supplementary materials published with regard to their function in the article, the formats employed and the access provided to readers. Design/methodology/approach - The study analysed the instructions for authors of LIS journals indexed in the ISI Journal Citation Reports, as well as the supplementary materials attached to the articles published in their 2011 online volumes. Findings - Large publishers are more likely to have a policy regarding the publication of supplementary materials, and policies are usually homogeneous across all the journals of a given publisher. Most policies state the acceptance of supplementary materials, and even journals without a policy publish supplementary materials. The majority of supplementary materials provided in LIS articles are extended methodological explanations and additional results in the form of textual information in PDF or Word files. Some toll-access journals provide any reader with open access to these files. Originality/value - This study provides new insights into the characteristics of supplementary materials in LIS journals. The results may be used by journal publishers to establish a policy on the publication of supplementary materials and, more broadly, to develop data sharing initiatives in academic settings.

Relevance:

30.00%

Publisher:

Abstract:

Background: To evaluate the safety of immediate sequential bilateral cataract extraction (ISBCE) with respect to indications, visual outcomes, complications, benefits and disadvantages. Methods: This is a retrospective review of all ISBCEs performed at Kantonsspital Winterthur, Switzerland, between April 2000 and September 2013. The case notes of 500 eyes of 250 patients were reviewed. Of these 500 eyes, 472 (94.4%) had a straightforward phacoemulsification with posterior chamber intraocular lens implantation; 21 (4.2%) had a planned extracapsular cataract extraction; 4 (0.8%) had an intracapsular cataract extraction and 3 (0.6%) had a combined phacoemulsification with trabeculectomy. Results: Over 66% of eyes achieved improved visual acuity (at least 3 Snellen lines) following ISBCE. Median preoperative best corrected visual acuity (BCVA) was 0.5 LogMAR; the interquartile range (IQR) was [0.4, 1] LogMAR. At the one-week control, the median BCVA was 0.3 LogMAR, IQR [0.1, 0.5] LogMAR. At one month, the median BCVA was 0.15 LogMAR, IQR [0.05, 0.3] (p < 0.01). There were no sight-threatening intraoperative or postoperative complications observed. Conclusions: ISBCE is an effective and safe option with a high degree of patient satisfaction. The relative benefits of ISBCE should be balanced against the theoretically enhanced risks.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The management of unresectable metastatic colorectal cancer (mCRC) is a comprehensive treatment strategy involving several lines of therapy, maintenance, salvage surgery, and treatment-free intervals. Besides chemotherapy (fluoropyrimidine, oxaliplatin, irinotecan), molecular-targeted agents such as anti-angiogenic agents (bevacizumab, aflibercept, regorafenib) and anti-epidermal growth factor receptor agents (cetuximab, panitumumab) have become available. Ultimately, given the increasing cost of new active compounds, new strategy trials are needed to define the optimal use and the best sequencing of these agents. Such new clinical trials require alternative endpoints that can capture the effect of several treatment lines and be measured earlier than overall survival, to help shorten the duration and reduce the size and cost of trials. METHODS/DESIGN: STRATEGIC-1 is an international, open-label, randomized, multicenter phase III trial designed to determine an optimally personalized treatment sequence of the available treatment modalities in patients with unresectable RAS wild-type mCRC. Two standard treatment strategies are compared: first-line FOLFIRI-cetuximab, followed by oxaliplatin-based second-line chemotherapy with bevacizumab (Arm A) vs. first-line OPTIMOX-bevacizumab, followed by irinotecan-based second-line chemotherapy with bevacizumab, and by an anti-epidermal growth factor receptor monoclonal antibody with or without irinotecan as third-line treatment (Arm B). The primary endpoint is duration of disease control. A total of 500 patients will be randomized in a 1:1 ratio to one of the two treatment strategies. DISCUSSION: The STRATEGIC-1 trial is designed to give global information on the therapeutic sequences in patients with unresectable RAS wild-type mCRC that in turn is likely to have a significant impact on the management of this patient population. The trial has been open for inclusion since August 2013.
TRIAL REGISTRATION: STRATEGIC-1 is registered at Clinicaltrials.gov: NCT01910610, 23 July, 2013. STRATEGIC-1 is registered at EudraCT-No.: 2013-001928-19, 25 April, 2013.