982 results for Library Quality
Abstract:
An important part of implementing websites is their careful design. Sketches and various prototypes are used as user interface design tools in website implementation. With their help, the site plan is refined in collaboration with the client and future users. This master's thesis implements a component library for website design for a company named Uoma Oy. The components to be included in the library are identified by analyzing previously completed projects. The thesis also establishes quality criteria for the components and implements the library's components. The quality and efficiency of the library are evaluated by building a sample site. The work shows that using the library yields both quality benefits and improved efficiency compared with the company's previous way of working. The library can be used flexibly for the needs of different design phases.
Abstract:
This work proposes the design of an exhaustive grid of specific characteristics of a formal, informative, and ergonomic nature to make an academic library website high-quality, accessible, and able to guarantee access to information. To that end, a survey of the literature on the topic is carried out and a model that weights the selected characteristics is developed. The conclusions show that it is possible to design a specific model of attributes, establishing which ones best represent the quality of an academic library in the Web environment.
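To make the weighting idea concrete, here is a minimal sketch of an attribute-weighting model of the kind the abstract describes, in Python. The dimension names, weights, and 0-5 rating scale are illustrative assumptions, not the study's actual grid.

```python
# Hypothetical weighted attribute model for scoring a library website.
# Dimensions and weights are invented for illustration; the cited study
# derives its own attribute grid and weights from the literature survey.
WEIGHTS = {
    "formal": 0.25,       # layout, visual identity, consistency
    "informative": 0.45,  # content coverage, currency, accuracy
    "ergonomic": 0.30,    # navigation, accessibility, ease of use
}

def site_quality(ratings):
    """Weighted mean of per-dimension ratings, each on a 0-5 scale."""
    return sum(WEIGHTS[dim] * ratings[dim] for dim in WEIGHTS)

# Example: a site strong on ergonomics, weaker on content.
print(site_quality({"formal": 4.0, "informative": 3.5, "ergonomic": 4.5}))  # 3.925
```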
Abstract:
We have previously described ProxiMAX, a technology that enables the fabrication of precise, combinatorial gene libraries via codon-by-codon saturation mutagenesis. ProxiMAX was originally performed using manual, enzymatic transfer of codons via blunt-end ligation. Here we present Colibra™: an automated, proprietary version of ProxiMAX used specifically for antibody library generation, in which double-codon hexamers are transferred during the saturation cycling process. We describe the reduction in process complexity, the resulting library quality, and an unprecedented saturation of up to 24 contiguous codons. Utility of the method is demonstrated via fabrication of complementarity determining regions (CDRs) in antibody fragment libraries and next-generation sequencing (NGS) analysis of their quality and diversity.
Abstract:
Saturation mutagenesis is a powerful tool in modern protein engineering, which permits key residues within a protein to be targeted in order to potentially enhance specific functionalities. However, the creation of large libraries using conventional saturation mutagenesis with degenerate codons (NNN or NNK/S) has inherent redundancy and consequent disparities in codon representation. Therefore, both chemical (trinucleotide phosphoramidites) and biological methods (sequential, enzymatic single-codon additions) of non-degenerate saturation mutagenesis have been developed in order to combat these issues and so improve library quality. Large libraries with multiple saturated positions can be limited by the method used to screen them. Although cell-dependent methods such as phage display remain the traditional screening methods of choice, they are limited by the need for transformation. A number of cell-free screening methods, such as CIS display, which link the screened phenotype with the encoded genotype, have the capability of screening libraries with up to 10^14 members. This thesis describes the further development of ProxiMAX technology to reduce library codon bias and its integration with CIS display to screen the resulting library. Synthetic MAX oligonucleotides are ligated to an acceptor base sequence, amplified, and digested, subsequently adding a randomised codon to the acceptor; this forms an iterative cycle that uses the digested product of the previous cycle as the base sequence for the next. Initial use of ProxiMAX highlighted areas of the process where changes could be implemented in order to improve the codon representation in the final library. The refined process was used to construct a monomeric anti-NGF peptide library, based on two proprietary dimeric peptides (Isogenica) that bind NGF. The resulting library showed greatly improved codon representation, equating to a theoretical diversity of ~69%. The library was subsequently screened using CIS display and the discovered peptides were assessed for NGF-TrkA inhibition by ELISA. Despite binding to TrkA, these peptides showed lower levels of inhibition of the NGF-TrkA interaction than the parental dimeric peptides, highlighting the importance of dimerization for inhibition of NGF-TrkA binding.
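The codon-representation bias that motivates non-degenerate methods such as ProxiMAX can be illustrated numerically. The following sketch tabulates how unevenly degenerate NNK codons sample the 20 amino acids; the genetic-code table is standard, and the comparison itself is an illustration rather than anything taken from the thesis.

```python
# Count how many NNK codons encode each amino acid. Non-degenerate
# saturation mutagenesis assembles exactly one codon per amino acid,
# avoiding the sampling bias shown here.
from collections import Counter
from itertools import product

# Standard genetic code; bases ordered T, C, A, G (1st base = row of 16,
# 2nd base = block of 4, 3rd base = position within block).
BASES = "TCAG"
AA = ("FFLLSSSSYY**CC*W"   # T--
      "LLLLPPPPHHQQRRRR"   # C--
      "IIIMTTTTNNKKSSRR"   # A--
      "VVVVAAAADDEEGGGG")  # G--
CODON = {a + b + c: AA[16 * BASES.index(a) + 4 * BASES.index(b) + BASES.index(c)]
         for a, b, c in product(BASES, repeat=3)}

# NNK degeneracy: N = any base, K = G or T, giving 32 codons.
nnk = [a + b + c for a in BASES for b in BASES for c in "GT"]
per_aa = Counter(CODON[c] for c in nnk if CODON[c] != "*")

stops = sum(1 for c in nnk if CODON[c] == "*")
print(f"{len(nnk)} NNK codons -> {len(per_aa)} amino acids, {stops} stop codon(s)")
print(dict(per_aa))  # Leu/Arg/Ser get 3 codons each; the rest get 2 or 1
```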
Abstract:
Background: Light microscopic analysis of diatom frustules is widely used in both basic and applied research, notably taxonomy, morphometrics, water quality monitoring and paleo-environmental studies. In these applications, large numbers of frustules usually need to be identified and/or measured. Although there is a need for automation in these applications, and image processing and analysis methods supporting these tasks have previously been developed, they have not become widespread in diatom analysis. While methodological reports are available for a wide variety of methods for image segmentation, diatom identification and feature extraction, no single implementation exists that combines a subset of these into a readily applicable workflow accessible to diatomists. Results: The newly developed tool SHERPA offers a versatile image processing workflow focused on the identification and measurement of object outlines, handling all steps from image segmentation through object identification to feature extraction, and providing interactive functions for reviewing and revising results. Special attention was given to ease of use, applicability to a broad range of data and problems, and support for high-throughput analyses with minimal manual intervention. Conclusions: Tested with several diatom datasets from different sources and of various compositions, SHERPA proved its ability to successfully analyze large numbers of diatom micrographs depicting a broad range of species. SHERPA is unique in combining the following features: application of multiple segmentation methods and selection of the one giving the best result for each individual object; identification of shapes of interest based on outline matching against a template library; quality scoring and ranking of resulting outlines to support quick quality checking; extraction of a wide range of outline shape descriptors widely used in diatom studies and elsewhere; and minimizing the need for manual quality control and corrections while still enabling them. Although primarily developed for analyzing images of diatom valves originating from automated microscopy, SHERPA can also be useful for other object detection, segmentation and outline-based identification problems.
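A minimal sketch of the core idea described above (several segmentation attempts per image, outline matching against a template library, and match distance as a quality score), written with OpenCV. It is an independent illustration under assumed parameters, not SHERPA's actual code.

```python
# Segment with two alternative methods, then keep the outline that best
# matches any template, using Hu-moment shape distance as the quality
# score (lower is better). Thresholding parameters are arbitrary.
import cv2

def candidate_masks(gray):
    """Binary masks from two alternative segmentation methods."""
    _, otsu = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    adaptive = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                     cv2.THRESH_BINARY, 51, 2)
    return [otsu, adaptive]

def best_outline(gray, templates):
    """Return (score, contour) with the lowest distance to any template."""
    best_score, best_contour = float("inf"), None
    for mask in candidate_masks(gray):
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        for contour in contours:
            if cv2.contourArea(contour) < 100:  # skip small debris
                continue
            score = min(cv2.matchShapes(contour, t, cv2.CONTOURS_MATCH_I1, 0)
                        for t in templates)
            if score < best_score:
                best_score, best_contour = score, contour
    return best_score, best_contour
```

Ranking objects by this score supports the kind of quick quality checking the abstract mentions: low-scoring outlines can be accepted in bulk and only the remainder reviewed manually.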
Abstract:
This study presents a diagnosis of the situation of Portuguese public libraries with regard to Quality Management Systems (QMS) and an evaluation of the signs of emerging quality practices in this type of library, typifying the different kinds of emergence diagnosed and identifying the factors that drive QMS implementation as well as the reasons for retreating from, or advancing towards, quality management. The study is connected to the history of quality in Public Administration in Portugal, in order to contextualize the work and to understand the appearance of Quality Management Systems. Reasons are presented that explain the slowness of the predicted and announced proliferation of Quality Management Systems in Portuguese public libraries (as was forecast a decade ago), with brief references to some of the different validated quality management models implemented both in Portugal and abroad.
Abstract:
"In Service to Iowa : Public Library Measures of Quality, 4th edition" is the manual for the Accreditation and Standards program of the State Library of Iowa.
Abstract:
The flow concept describes a model of enjoyment that has relevance for understanding participation and experience across a wide range of activities (Csikszentmihalyi, 1997). The basic premise of the flow concept is that when challenges and skills are simultaneously balanced and extending the individual, a state of total absorption can occur. Research by Jackson and colleagues has examined the utility of the flow concept for understanding participation and performance in sport settings. Recently, Jackson and Eklund have examined flow in a range of performance settings: sport, exercise, dance, creative and performing arts, and music. In this paper, we present descriptive and construct validity data on how participants in these activities experienced flow, as assessed by the recently revised flow scales: the Dispositional Flow Scale-2 (DFS-2) and Flow State Scale-2 (FSS-2) (Jackson & Eklund, 2002). The findings will be discussed in relation to the utility of the flow concept for understanding participation across performance settings.
Abstract:
It is no secret that computer services have revolutionized the world. More than a service, they have become an indispensable tool for humanity, because they simplify and ease life in many of its domains through their various support options. To these advances are added Internet services, so that by the mid-1980s the need for such a research tool was recognized, in order to apply it in various fields of human knowledge. This research, on the use and quality of documentary information from the Internet, applied to the students of the Universidad Nacional, Costa Rica, in the case of the Joaquin Garcia Monge Library laboratory, is developed in the form of a thesis.
Abstract:
INTRODUCTION: Open access publishing is becoming increasingly popular within the biomedical sciences. SciELO, the Scientific Electronic Library Online, is a digital library covering a selected collection of Brazilian scientific journals, many of which provide open access to full-text articles. This library includes a number of dental journals, some of which may include reports of clinical trials in English, Portuguese and/or Spanish. Thus, SciELO could play an important role as a source of evidence for dental healthcare interventions, especially if it yields a sizeable number of high quality reports. OBJECTIVE: The aim of this study was to identify reports of clinical trials by handsearching of dental journals that are accessible through SciELO, and to assess the overall quality of these reports. MATERIAL AND METHODS: Electronic versions of six Brazilian dental journals indexed in SciELO were handsearched at www.scielo.br in September 2008. Reports of clinical trials were identified and classified as controlled clinical trials (CCTs - prospective, experimental studies comparing 2 or more healthcare interventions in human beings) or randomized controlled trials (RCTs - a random allocation method is clearly reported), according to Cochrane eligibility criteria. Criteria to assess methodological quality included: method of randomization, concealment of treatment allocation, blinded outcome assessment, handling of withdrawals and losses, and whether an intention-to-treat analysis had been carried out. RESULTS: The search retrieved 33 CCTs and 43 RCTs. A majority of the reports provided no description of either the method of randomization (75.3%) or concealment of the allocation sequence (84.2%). Participants and outcome assessors were reported as blinded in only 31.2% of the reports. Withdrawals and losses were clearly described in only 6.5% of the reports, and none mentioned an intention-to-treat analysis or any similar procedure. CONCLUSIONS: The results of this study indicate that a substantial number of reports of trials and systematic reviews are available in the dental journals listed in SciELO, and that these could provide valuable evidence for clinical decision making. However, it is clear that the quality of a number of these reports is of some concern, and that improvement in the conduct and reporting of these trials could be achieved if authors adhered to internationally accepted guidelines, e.g. the CONSORT statement.
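A minimal sketch of how such criterion-by-criterion reporting rates can be tallied. Only the criterion names follow the abstract; the two example records are invented.

```python
# Tally the percentage of trial reports describing each quality criterion.
CRITERIA = ["randomization_method", "allocation_concealment",
            "blinded_assessment", "withdrawals_described",
            "intention_to_treat"]

reports = [  # one dict per CCT/RCT report (illustrative data only)
    {"randomization_method": True, "allocation_concealment": False,
     "blinded_assessment": True, "withdrawals_described": False,
     "intention_to_treat": False},
    {"randomization_method": False, "allocation_concealment": False,
     "blinded_assessment": False, "withdrawals_described": False,
     "intention_to_treat": False},
]

for criterion in CRITERIA:
    pct = 100 * sum(r[criterion] for r in reports) / len(reports)
    print(f"{criterion}: reported in {pct:.1f}% of reports")
```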
Abstract:
Background: High-throughput molecular approaches for gene expression profiling, such as Serial Analysis of Gene Expression (SAGE), Massively Parallel Signature Sequencing (MPSS) or Sequencing-by-Synthesis (SBS), represent powerful techniques that provide global transcription profiles of different cell types through sequencing of short fragments of transcripts, denominated sequence tags. These techniques have improved our understanding of the relationships between these expression profiles and cellular phenotypes. Despite this, more reliable datasets are still necessary. In this work, we present a web-based tool named S3T: Score System for Sequence Tags, to index sequenced tags in accordance with their reliability. This is done through a series of evaluations based on a defined rule set. S3T allows the identification/selection of tags considered more reliable for further gene expression analysis. Results: This methodology was applied to a public SAGE dataset. In order to compare data before and after filtering, a hierarchical clustering analysis was performed on samples from the same type of tissue, in distinct biological conditions, using these two datasets. Our results provide evidence suggesting that it is possible to find more congruous clusters after using the S3T scoring system. Conclusion: These results substantiate the proposed application to generate more reliable data. This is a significant contribution to the determination of global gene expression profiles. The library analysis with S3T is freely available at http://gdm.fmrp.usp.br/s3t/. S3T source code and datasets can also be downloaded from the aforementioned website.
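A minimal sketch of the before/after-filtering comparison described above: score tags, drop the low-scoring ones, and hierarchically cluster the sample profiles with SciPy. The scores, threshold, and count matrix are stand-ins, not S3T's actual rule set.

```python
# Hierarchical clustering of expression profiles before and after
# filtering out unreliable tags. Data and scores are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(0)
counts = rng.poisson(5, size=(6, 40)).astype(float)  # 6 samples x 40 tags
tag_scores = rng.uniform(0, 1, size=40)              # stand-in reliability scores

def cluster(matrix):
    """Average-linkage clustering of samples (rows) by correlation distance."""
    return linkage(matrix, method="average", metric="correlation")

full = cluster(counts)                            # all tags
filtered = cluster(counts[:, tag_scores >= 0.5])  # reliable tags only
# Plotting both with scipy.cluster.hierarchy.dendrogram shows how the
# filtering step changes the sample groupings.
```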
Abstract:
Observational longitudinal research is particularly useful for assessing etiology and prognosis and for providing evidence for clinical decision making. However, there are no structured reporting requirements for studies of this design to assist authors, editors, and readers. The authors developed and tested a checklist of criteria related to threats to the internal and external validity of observational longitudinal studies. The checklist criteria concerned recruitment, data collection, biases, and data analysis and descriptive issues relevant to study rationale, study population, and generalizability. Two raters independently assessed 49 randomly selected articles describing stroke research published from 1999 to 2003 in six journals: American Journal of Epidemiology, Journal of Epidemiology and Community Health, Stroke, Annals of Neurology, Archives of Physical Medicine and Rehabilitation, and American Journal of Physical Medicine and Rehabilitation. On average, 17 of the 33 checklist criteria were reported. Criteria describing the study design were better reported than those related to internal validity. No relation was found between study type (etiologic or prognostic) or word count and quality of reporting. A flow diagram for summarizing participant flow through a study was developed. Editors and authors should consider using a checklist and flow diagram when reporting on observational longitudinal research.
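With two raters applying the checklist independently, a standard way to quantify their agreement on a given criterion is Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e). The sketch below shows the computation on invented ratings; it is illustrative, not an analysis taken from the study.

```python
# Cohen's kappa for two raters' judgements of one checklist criterion
# across articles (1 = reported, 0 = not reported). Ratings are invented.
def cohens_kappa(a, b):
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    categories = set(a) | set(b)
    p_e = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)               # chance-corrected agreement

rater1 = [1, 1, 0, 1, 0, 1, 1, 0]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1]
print(round(cohens_kappa(rater1, rater2), 3))    # 0.467
```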