988 results for Open-access algorithm
Abstract:
In this workshop seminar, delivered twice at the CoFHE/UCR 2006 conference, the author explored aspects of successful advocacy for Open Access and repositories. Areas covered included preconceptions on the part of academics and support staff, as well as models for implementing an advocacy programme. A large portion of the material pulls together experience and narrative evidence from the SHERPA Consortium partners and repository administrators, with a particular focus on their successes and failures and the lessons that have been learned.
Abstract:
Open access to publicly funded scholarly publications is, under the banner of "openness", part of an increasingly significant global development with structural consequences for scholarship, research and education. The disciplinary cultures and constellations of economic interests involved strongly determine in what form, with what reach and with what acceptance the open-access paradigm actually materialises. The present study investigates this question using the example of the inter- and pluridisciplinary field of educational science/educational research. First, the disciplinary and socio-cultural constellations of publishing in the field, the market conditions of the publishing industry, and the information-infrastructure conditions of the subject area are analysed to produce a differentiated overall picture. Second, drawing on an online survey of the educational science/educational research community, further insights are obtained into existing open-access experience in the field and into barriers to, and requirements for, the new publication model from the perspective of the researchers themselves, as well as, in an exploratory manner, from the perspective of students and educational practitioners. Key factors in assessing the potential and effects of open access in the field are academic status and role, interdisciplinarity and disciplinary provenance, and the relationship between educational practice and the academic sector. (DIPF/Orig.)
Abstract:
This chapter discusses the consequences of open-access (OA) publishing and dissemination for libraries in higher education institutions (HEIs). Key questions addressed in this chapter include: 1. How might OA help information provision? 2. What changes to library services will arise from OA developments (particularly if OA becomes widespread)? 3. How do these changes fit in with wider changes affecting the future role of libraries? 4. How can libraries and librarians help to address key practical issues associated with the implementation of OA (particularly transition issues)? The chapter looks at OA from the perspective of HE libraries and makes four key points: 1. Open access has the potential to bring benefits to the research community in particular and society in general by improving information provision. 2. If there is widespread open access to research content, there will be less need for library-based activity at the institutional level, and more need for information management activity at the supra-institutional or national level. 3. Institutional libraries will, however, continue to have an important role to play in areas such as managing purchased or licensed content, curating institutional digital assets, and providing support in the use of content for teaching and research. 4. Libraries are well placed to work with stakeholders within their institutions and beyond to help resolve current challenges associated with the implementation of OA policies and practices.
Abstract:
It is often assumed that open access repositories and peer-reviewed journals are in competition with each other and will therefore, in the long term, be unable to coexist. This paper takes a critical look at that assumption. It draws on the available evidence of actual practice, which indicates that coexistence is possible, at least in the medium term. It discusses possible future models of publication and dissemination that include open access, repositories, peer review and journals. The paper suggests that repositories and journals may coexist in the long term, but that both may have to undergo significant changes. Important areas where change needs to occur include: widespread deployment of repository infrastructure, development of version identification standards, development of value-added features, new business models, new approaches to quality control, and adoption of digital preservation as a repository function.
Abstract:
The mission of "the Depot" is to provide a service that enables all UK academics to share in the benefits of open access exposure for their post-print research outputs. It does this by providing a national intake and storage facility - the Depot - for use by any UK academic in Higher Education. This article describes the Depot and the service it provides.
Abstract:
Saltwater recreational fishing (SRF) in Portugal was for a long time an open-access activity, without restrictions of any kind. Restrictions to control the recreational harvest were first implemented in 2006 and were highly criticized by the angler community for being highly restrictive and lacking scientific support. The present study aimed to obtain socio-economic data on recreational shore anglers and gauge their perceptions of recreational fishing regulations and the newly implemented restrictions in Portugal. Roving creel surveys were conducted along the south and south-west coasts of Portugal during the pre- and post-regulation periods (2006-2007). A total of 1298 valid face-to-face interviews were conducted. Logit models were fitted to identify which characteristics influence anglers' perceptions of recreational fishing regulations. The majority of the interviewed anglers were aware of and agreed with the existence of recreational fishing regulations. However, most were against the recreational fishing regulations currently in place. The logit model estimates revealed that Portuguese anglers with a higher level of formal education and income are more likely to agree with the existence of recreational fishing regulations. In contrast, anglers who perceive that more limitations on, and better enforcement of, commercial fishing would improve fishing in the area are less likely to agree with the existence of SRF regulations. The findings from this study will help inform decision-makers about anglers' potential behaviour towards new and future regulations. Although the existence of fishing regulations is a good starting point for effective management, the lack of acceptance and detailed knowledge of the regulations in place by fishers may result in a lack of compliance and ultimately hinder the success of recreational fishing regulations in Portugal. (C) 2013 Elsevier Ltd. All rights reserved.
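As an illustration of the modelling step named in this abstract, a minimal binary logit fit is sketched below; the variables, coding and data are hypothetical placeholders, not the study's survey data.

```python
# Minimal sketch of a binary logit model of "agrees with regulations",
# loosely following the paper's approach; the data-generating process,
# variable names and coding below are hypothetical, not the survey data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical angler characteristics.
education = rng.integers(0, 4, n)        # 0 = none ... 3 = university
income = rng.normal(1000, 300, n)        # monthly income (made-up units)

# Hypothetical process: higher education/income -> more likely to agree
# with the existence of recreational fishing regulations.
logit = -2.0 + 0.6 * education + 0.002 * income
agree = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([education, income]))
model = sm.Logit(agree.astype(int), X).fit(disp=False)
print(model.summary(xname=["const", "education", "income"]))
```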
Abstract:
Audio coding is used to compress digital audio signals, thereby reducing the number of bits needed to transmit or store an audio signal. This is useful when network bandwidth or storage capacity is very limited. Audio compression algorithms are based on an encoding and decoding process. In the encoding step, the uncompressed audio signal is transformed into a coded representation, thereby compressing the audio signal. Thereafter, the coded audio signal eventually needs to be restored (e.g. for playback) through decoding. The decoder receives the bitstream and reconverts it into an uncompressed signal. ISO-MPEG is a standard for high-quality, low-bit-rate video and audio coding. The audio part of the standard is composed of algorithms for high-quality, low-bit-rate audio coding, i.e. algorithms that reduce the original bit rate while guaranteeing high quality of the audio signal. The audio coding algorithms comprise MPEG-1 (with three different layers), MPEG-2, MPEG-2 AAC, and MPEG-4. This work presents a study of the MPEG-4 AAC audio coding algorithm. In addition, it presents implementations of the AAC algorithm on different platforms and comparisons among them. The implementations are in C, in Intel Pentium assembly, in C for a DSP processor, and in HDL. Since each implementation has its own application niche, each one is valid as a final solution. A further purpose of this work is the comparison among these implementations, considering estimated costs, execution time, and the advantages and disadvantages of each one.
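As a rough illustration of the encode/decode structure described above (and only that; this is not MPEG-4 AAC), the following sketch implements a toy transform coder with a hypothetical block size and quantiser step:

```python
# Toy transform coder illustrating the generic encode/decode round trip
# (transform -> quantise -> coded representation -> dequantise -> inverse
# transform). This is NOT MPEG-4 AAC; block size and step are hypothetical.
import numpy as np
from scipy.fft import dct, idct

BLOCK = 256        # samples per block (hypothetical)
STEP = 0.02        # uniform quantiser step size (hypothetical)

def encode(signal):
    """Return quantised transform coefficients, one array per block."""
    blocks = [signal[i:i + BLOCK] for i in range(0, len(signal), BLOCK)]
    return [np.round(dct(b, norm="ortho") / STEP).astype(np.int32)
            for b in blocks]

def decode(coded):
    """Reconstruct the signal from quantised coefficients."""
    return np.concatenate([idct(c * STEP, norm="ortho") for c in coded])

# Round trip on a synthetic tone: the reconstruction is close but lossy.
t = np.arange(BLOCK * 8) / 44100.0
x = 0.5 * np.sin(2 * np.pi * 440 * t)
y = decode(encode(x))
print("max reconstruction error:", np.max(np.abs(x - y)))
```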
Abstract:
LEÃO, Adriano de Castro; DÓRIA NETO, Adrião Duarte; SOUSA, Maria Bernardete Cordeiro de. New developmental stages for common marmosets (Callithrix jacchus) using mass and age variables obtained by K-means algorithm and self-organizing maps (SOM). Computers in Biology and Medicine, v. 39, p. 853-859, 2009
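The clustering step named in this record can be sketched as follows; the data, scaling and number of stages (k = 4) are hypothetical placeholders, not the paper's values, and the SOM half of the method is omitted:

```python
# Rough sketch of the K-means part of the approach cited above: cluster
# animals on mass and age to propose developmental stages. The data, the
# scaling and the choice of k=4 are hypothetical, not the paper's values.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical (age in days, mass in grams) records.
age = rng.uniform(0, 700, 300)
mass = 25 + 0.45 * age + rng.normal(0, 20, 300)
X = StandardScaler().fit_transform(np.column_stack([age, mass]))

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
for stage in range(4):
    ages = age[km.labels_ == stage]
    print(f"stage {stage}: n={len(ages)}, "
          f"age range {ages.min():.0f}-{ages.max():.0f} days")
```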
Abstract:
This paper presents a performance comparison of known propagation models tuned with a least-squares algorithm for the 5.8 GHz frequency band. The studied environment comprises 12 cities located in the Amazon Region. After adjustments and simulations, the SUI model showed the smallest RMS error and standard deviation when compared with the COST231-Hata and ECC-33 models.
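A minimal sketch of the least-squares tuning step, assuming a generic log-distance path-loss model and made-up measurements in place of the SUI, COST231-Hata or ECC-33 models:

```python
# Minimal sketch of least-squares tuning of a propagation model: fit an
# offset and slope of a generic log-distance path-loss model to
# hypothetical measured data, then report the RMS error, as the paper
# does for each candidate model.
import numpy as np

rng = np.random.default_rng(2)
d = rng.uniform(0.1, 5.0, 200)                             # distance, km
measured = 100 + 38 * np.log10(d) + rng.normal(0, 6, 200)  # path loss, dB

# Linear least squares: solve [1, log10(d)] @ [a, b] ~= measured.
A = np.column_stack([np.ones_like(d), np.log10(d)])
(a, b), *_ = np.linalg.lstsq(A, measured, rcond=None)

resid = measured - (a + b * np.log10(d))
rms = np.sqrt(np.mean(resid ** 2))
print(f"tuned a={a:.1f} dB, b={b:.1f}; RMS error = {rms:.2f} dB")
```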
Abstract:
The transmission expansion planning problem in modern power systems is a large-scale, mixed-integer, nonlinear and non-convex problem. This paper presents a new mathematical model and a constructive heuristic algorithm (CHA) for solving the transmission expansion planning problem in the new environment of electricity restructuring. The CHA finds an acceptable solution in an iterative process in which, at each step, a circuit is chosen using a sensitivity index and added to the system. The proposed model considers multiple generation scenarios; the methodology therefore finds high-quality solutions that allow the power system to operate adequately in an environment with multiple generation scenarios. Case studies and simulation results using test systems show the feasibility of using the constructive heuristic algorithm in an open-access system.
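The iterative "choose the best circuit by sensitivity index and add it" loop can be sketched generically as below; the candidate set, sensitivity index and feasibility test are placeholder stand-ins, not the paper's power-system model:

```python
# Schematic sketch of a constructive heuristic of the kind described above:
# repeatedly score candidate circuits with a sensitivity index and add the
# best one until the expansion plan is feasible. The network model, index
# and feasibility test are placeholders, not the paper's formulation.
def constructive_heuristic(candidates, sensitivity_index, is_feasible):
    """Greedily add circuits until the expansion plan is feasible."""
    plan = []
    remaining = list(candidates)
    while remaining and not is_feasible(plan):
        best = max(remaining, key=lambda c: sensitivity_index(c, plan))
        plan.append(best)
        remaining.remove(best)
    return plan

# Toy usage: candidates carry a made-up capacity; the plan is "feasible"
# once the total added capacity covers a fixed demand.
candidates = [{"id": i, "capacity": cap}
              for i, cap in enumerate([30, 50, 20, 40])]
index = lambda c, plan: c["capacity"]     # placeholder sensitivity index
feasible = lambda plan: sum(c["capacity"] for c in plan) >= 90
print([c["id"] for c in constructive_heuristic(candidates, index, feasible)])
```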
Abstract:
Background: Identification of nontuberculous mycobacteria (NTM) based on phenotypic tests is time-consuming, labor-intensive, expensive and often provides erroneous or inconclusive results. In the molecular method referred to as PRA-hsp65, a fragment of the hsp65 gene is amplified by PCR and then analyzed by restriction digest; this rapid approach offers the promise of accurate, cost-effective species identification. The aim of this study was to determine whether species identification of NTM using PRA-hsp65 is sufficiently reliable to serve as the routine methodology in a reference laboratory. Results: A total of 434 NTM isolates were obtained from 5019 cultures submitted to the Institute Adolpho Lutz, Sao Paulo, Brazil, between January 2000 and January 2001. Species identification was performed for all isolates using conventional phenotypic methods and PRA-hsp65. For isolates for which these methods gave discordant results, definitive species identification was obtained by sequencing a 441 bp fragment of hsp65. Phenotypic evaluation and PRA-hsp65 were concordant for 321 (74%) isolates; these assignments were presumed to be correct. For the remaining 113 discordant isolates, definitive identification was based on sequencing the 441 bp fragment of hsp65. PRA-hsp65 identified 30 isolates with hsp65 alleles representing 13 previously unreported PRA-hsp65 patterns. Overall, species identification by PRA-hsp65 was significantly more accurate than by phenotypic methods (392 (90.3%) vs. 338 (77.9%), respectively; p < .0001, Fisher's test). Among the 333 isolates representing the most common pathogenic species, PRA-hsp65 provided an incorrect result for only 1.2%. Conclusion: PRA-hsp65 is a rapid and highly reliable method and deserves consideration by any clinical microbiology laboratory charged with performing species identification of NTM.
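The computational side of pattern-based identification can be sketched as an in-silico restriction digest; the recognition sites, amplicon and reference table below are invented for illustration and are not the actual PRA-hsp65 enzymes or patterns:

```python
# Illustrative in-silico restriction digest: turn a sequence into a
# fragment-size pattern and look it up in a table of known patterns.
# The recognition sites, amplicon and reference entry are all invented;
# they are not the real PRA-hsp65 enzymes or published patterns.
import re

def digest(seq, site):
    """Cut seq at each occurrence of a recognition site; return fragment sizes."""
    cuts = [m.start() for m in re.finditer(site, seq)]
    bounds = [0] + cuts + [len(seq)]
    return sorted(b - a for a, b in zip(bounds, bounds[1:]) if b > a)

# Hypothetical amplicon and recognition sites (both made up).
amplicon = "GACGTC" + "A" * 150 + "GAATTC" + "C" * 120 + "GACGTC" + "G" * 100
patterns = {"enzyme1": tuple(digest(amplicon, "GACGTC")),
            "enzyme2": tuple(digest(amplicon, "GAATTC"))}

# A reference table maps known (enzyme, fragment-size) patterns to species;
# the single entry below is a made-up example.
reference = {("enzyme1", patterns["enzyme1"]): "Mycobacterium sp. (example)"}
print(reference.get(("enzyme1", patterns["enzyme1"]), "unknown pattern"))
```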
Abstract:
The Brazilian Diabetes Society (SBD) is starting an innovative project of quantitative assessment of medical arguments and of implementing a new way of elaborating SBD position statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed in a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by the American Diabetes Association/European Association for the Study of Diabetes, published in January 2009.
Abstract:
Background: Since the multi-relational approach has emerged as an alternative for analysing structured data such as relational databases, allowing data mining to be applied over multiple tables directly and thus avoiding expensive join operations and semantic losses, this work proposes an algorithm with a multi-relational approach. Methods: Aiming to compare the performance of the traditional and multi-relational approaches for mining association rules, this paper presents an empirical study between PatriciaMine, a traditional algorithm, and its proposed multi-relational counterpart, MR-Radix. Results: This work showed the performance advantages of the multi-relational approach over several tables, which avoids the high cost of join operations over multiple tables and semantic losses. The algorithm MR-Radix proved faster than PatriciaMine, despite handling complex multi-relational patterns. The memory usage indicates a more conservative growth curve for MR-Radix than for PatriciaMine, showing that the increase in the number of frequent items in MR-Radix does not result in a significant growth of memory usage, as it does in PatriciaMine. Conclusion: The comparative study between PatriciaMine and MR-Radix confirmed the efficacy of the multi-relational approach in the data mining process, both in terms of execution time and memory usage. Moreover, the proposed multi-relational algorithm, unlike other algorithms of this approach, is efficient for use in large relational databases.
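Neither PatriciaMine nor MR-Radix is reproduced here; the sketch below shows only the baseline task both algorithms address, counting frequent itemsets over transactions, using a deliberately naive single-table pass with hypothetical data:

```python
# Naive frequent-itemset counting: the baseline task that PatriciaMine and
# MR-Radix solve far more efficiently (and, for MR-Radix, across multiple
# relational tables without joining them first). The transactions and
# support threshold are hypothetical; itemsets are capped at size 2 for
# brevity.
from itertools import combinations
from collections import Counter

transactions = [{"bread", "milk"}, {"bread", "butter"},
                {"milk", "butter"}, {"bread", "milk", "butter"}]
min_support = 2  # minimum number of transactions containing the itemset

counts = Counter()
for t in transactions:
    for size in (1, 2):
        for itemset in combinations(sorted(t), size):
            counts[itemset] += 1

frequent = {s: c for s, c in counts.items() if c >= min_support}
print(frequent)
```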
Abstract:
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
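The benchmarking idea, a synthetic series in which the inserted inhomogeneity is known so that a detector's output can be scored against the truth, can be sketched as follows; the simple t-statistic break detector below is a stand-in for real homogenisation algorithms, and all values are invented:

```python
# Sketch of the benchmarking idea described above: build a synthetic
# monthly temperature series where the "true" signal and an inserted
# inhomogeneity (a step change) are both known, run a simple break
# detector, and score how close the detected break is to the truth.
# The maximum-t-statistic detector stands in for real homogenisation
# algorithms; all parameters are invented.
import numpy as np

rng = np.random.default_rng(3)
n, true_break, shift = 600, 250, 0.8            # months, break index, degC

truth = 0.001 * np.arange(n) + rng.normal(0, 0.5, n)   # trend + noise
series = truth.copy()
series[true_break:] += shift                            # known inhomogeneity

def detect_break(x, margin=24):
    """Return the split point maximising the two-sample t statistic."""
    best, best_t = None, 0.0
    for k in range(margin, len(x) - margin):
        a, b = x[:k], x[k:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best, best_t = k, t
    return best

found = detect_break(series)
print(f"true break at {true_break}, detected at {found}, "
      f"error {abs(found - true_break)} months")
```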