896 results for software quality metrics


Relevance: 30.00%

Abstract:

Background: With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and to help with quality control, there is a strong need for a software system that can track samples and capture and manage data at the different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols and by recording and annotating data from every step of the workflow. Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high-throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. It leads the user through each step of the process, from starting an experiment to storing the output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and that all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through forms and file uploads. The LIMS provides functions to trace any genotypic data back to the electrophoresis gel files or the sample source, and to repeat experiments. The LIMS is presently used to capture high-throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusions: A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high-throughput genotyping laboratory. The application, with source code, is freely available to academic users and can be downloaded from http://www.icrisat.org/bt-software-d-lims.htm
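The abstract describes the system's sample tracking and trace-back functions but gives no implementation details. Purely as an illustration of that kind of record-keeping, here is a minimal Python sketch; all class, field and file names are hypothetical, not taken from the actual LIMS.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified data model illustrating the kind of sample
# tracking and trace-back the abstract describes; the real LIMS schema
# is not given in the abstract.

@dataclass
class Step:
    name: str            # e.g. "DNA extraction", "PCR", "electrophoresis"
    input_file: str      # uploaded data file for this step
    notes: str = ""

@dataclass
class Sample:
    sample_id: str
    steps: list[Step] = field(default_factory=list)

    def record(self, name: str, input_file: str, notes: str = "") -> None:
        """Capture data for one workflow step, in order."""
        self.steps.append(Step(name, input_file, notes))

    def trace_back(self, step_name: str) -> list[str]:
        """Return the files behind a given step, e.g. the gel images
        that produced a genotype call."""
        return [s.input_file for s in self.steps if s.name == step_name]

sample = Sample("plate12-A3")
sample.record("DNA extraction", "extraction_plate12.csv")
sample.record("electrophoresis", "gel_2024_03_12.tif")
print(sample.trace_back("electrophoresis"))  # -> ['gel_2024_03_12.tif']
```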

Relevance: 30.00%

Abstract:

The review of existing information has identified the following:

- the juvenile core in Araucaria is probably contained within the first 15 growth rings from the pith, with spiral grain being a chief determinant of its extent within the stem;
- a reduction in rotation length for a given site index will reduce ASV and mature wood volume, with an increase in the proportion of juvenile wood;
- for a given rotation length, lower-ASV stems were estimated to contain a lower proportion of juvenile wood (based on the assumptions made and on crude simulations using the WEEDS, PLYSIM and STEPS software); regardless of juvenile wood proportions, smaller stems will yield a higher proportion of pith-in material;
- an increase in the proportion of juvenile wood, due to a reduction in rotation length, could affect wood quality through an increase in the proportion of the recovery containing high spiral grain, shorter tracheids and higher micellar angle;
- high spiral grain and high micellar angles adversely affect wood quality through their influence on twist and longitudinal shrinkage, respectively;
- positive outcomes from a reduction in rotation length might include an increase in the proportion of live knots in upper stem sections and a reduction in the extent of brown-stain heartwood;
- the uniformity of basic density within Araucaria stems means that reduced rotation lengths and lower stem ASVs are unlikely to have a major impact on this wood property; and
- the effect of a reduction in rotation length on the incidence of compression wood and of timber susceptible to kiln staining could not be established from the available information.

Relevance: 30.00%

Abstract:

The quality of an online university degree is paramount to the student, to the reputation of the university and, most importantly, to the profession that the graduate will enter. At the School of Education within Curtin University, we aim to ensure that students in rural and remote areas are provided with high-quality degrees equal to those of their city counterparts who access face-to-face classes on campus. In 2010, the School of Education moved to flexible delivery of a fully online Bachelor of Education degree for its rural students. In previous years, the degree had been delivered in physical locations around the state. Although this served its purpose at the time, it restricted the degree to only those rural students who were able to access a physical campus. The new model in 2010 allows access for students in any rural area who have a computer and an internet connection, regardless of their geographical location. As a result, enrolments of new students have increased. Academic staff had previously used an asynchronous environment to deliver learning modules housed within a learning management system (LMS). To enhance the learning environment and to provide high-quality learning experiences to students learning at a distance, synchronous software was adopted. This software is a real-time virtual classroom environment that allows communication through Voice over Internet Protocol (VoIP) and videoconferencing, along with a large number of collaboration tools to engage learners. This research paper reports on the professional development of academic staff to integrate a live e-learning solution into their existing LMS environment. It involved professional development, including technical orientation, for teaching staff and course participants simultaneously. Further, pedagogical innovations were offered to engage the students in a collaborative learning environment. Data were collected from academic staff through semi-structured interviews and participant observation. The findings discuss the perceived value of the technology, the problems encountered and the solutions sought.

Relevance: 30.00%

Abstract:

The Transition Radiation Tracker (TRT) of the ATLAS experiment at the LHC is part of the Inner Detector. It is designed as a robust and powerful gaseous detector that provides tracking through individual drift tubes (straws) as well as particle identification via the detection of transition radiation (TR). The straw tubes are operated with Xe-CO2-O2 70/27/3, a gas that combines the advantages of efficient TR absorption, a short electron drift time and minimal ageing effects. The modules of the barrel part of the TRT were built in the United States, while the end-cap wheels are assembled at two Russian institutes. Acceptance tests of barrel modules and end-cap wheels are performed at CERN before assembly and integration with the Semiconductor Tracker (SCT) and the Pixel Detector. This thesis first describes simulations of the TRT straw tube. The argon-based acceptance gas mixture as well as two xenon-based operating gases are examined for their properties. Drift velocities and Townsend coefficients are computed with the program Magboltz and used to study electron drift and multiplication in the straw with the software Garfield. The inclusion of Penning transfers in the avalanche process leads to remarkable agreement with experimental data. A high level of cleanliness in the TRT's acceptance-test gas system is indispensable. To monitor gas purity, a small straw tube detector has been constructed and used extensively to study the ageing behaviour of the straw tube in Ar-CO2. A variety of ageing tests are presented and discussed. Acceptance tests for the TRT check dimensions, wire tension, gas-tightness, high-voltage stability and gas-gain uniformity along each individual straw. The thesis gives details on the acceptance criteria and measurement methods for the end-cap wheels, with special focus on wire tension and straw straightness. The effect of geometrically deformed straws on gas gain and energy resolution is examined in an experimental setup and compared to simulation studies. An overview of the most important results from the end-cap wheels tested up to this point is presented.
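For context (the abstract does not quote the formulas), the gas-gain simulation rests on the first Townsend coefficient, and a common way of folding Penning transfers into the avalanche model is an effective coefficient with a transfer rate fitted to data. A sketch of the standard relations, not necessarily the thesis's exact formulation:

```latex
% Gas gain G of a proportional counter from the first Townsend
% coefficient \alpha, integrated along the electron path from the
% point of creation r to the anode wire surface r_a:
\[
  G \;=\; \exp\!\left( \int_{r_a}^{r} \alpha\bigl(E(r')\bigr)\, dr' \right)
\]
% A simple Penning-transfer model replaces \alpha by an effective
% coefficient in which a fraction r_P of the excitations \nu_{exc}
% capable of ionizing the admixture actually do so:
\[
  \alpha_{\mathrm{eff}} \;=\; \alpha \left( 1 + r_P\, \frac{\nu_{\mathrm{exc}}}{\nu_{\mathrm{ion}}} \right)
\]
```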

Relevance: 30.00%

Abstract:

Water quality data are often collected at different sites over time to improve water quality management. Such data usually exhibit the following characteristics: non-normal distributions, outliers, missing values, values below detection limits (censored values) and serial dependence. It is essential to apply appropriate statistical methodology when analyzing water quality data in order to draw valid conclusions and hence provide useful advice for water management. In this chapter, we provide and demonstrate various statistical tools for analyzing such data, and introduce how to use the statistical software R to apply these methods. A dataset collected from the Susquehanna River Basin is used to demonstrate the statistical methods presented in this chapter; it can be downloaded from http://www.srbc.net/programs/CBP/nutrientprogram.htm.
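The chapter itself teaches these methods in R. Purely as an illustration of two of the listed issues, censoring and monotonic trend, here is a minimal Python sketch with made-up readings; the half-detection-limit substitution and the uncorrected Mann-Kendall test are common textbook choices, not necessarily the chapter's.

```python
import math

# Minimal sketch: substitute censored readings, then run a plain
# Mann-Kendall trend test (no correction for ties or serial
# dependence). Data values below are hypothetical.

def parse_value(raw: str, fraction: float = 0.5) -> float:
    """Substitute half the detection limit for censored readings
    such as '<0.05' (one common, if crude, convention)."""
    if raw.startswith("<"):
        return fraction * float(raw[1:])
    return float(raw)

def mann_kendall(x: list[float]) -> tuple[int, float]:
    """Return the Mann-Kendall S statistic and its normal z-score."""
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

readings = ["0.12", "<0.05", "0.20", "0.18", "0.31", "0.27", "0.40"]
values = [parse_value(r) for r in readings]
s, z = mann_kendall(values)
print(f"S = {s}, z = {z:.2f}")  # |z| > 1.96 suggests a trend at ~5% level
```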

Relevance: 30.00%

Abstract:

We consider the problem of wireless channel allocation to multiple users. A slot is given to the user with the highest metric (e.g., channel gain) in that slot. The scheduler may not know the channel states of all the users at the beginning of each slot. In this scenario, opportunistic splitting is an attractive solution. However, this algorithm requires that the metrics of the different users form independent, identically distributed (i.i.d.) sequences with the same distribution, and that their distribution and the number of users be known to the scheduler. This limits the usefulness of opportunistic splitting. In this paper we develop a parametric version of the algorithm whose optimal parameters are learnt online through a stochastic approximation scheme. Our algorithm does not require the metrics of different users to have the same distribution. The statistics of these metrics and the number of users can be unknown and can also vary with time. Each metric sequence can be Markov. We prove the convergence of the algorithm and show its utility by scheduling the channel to maximize its throughput while satisfying fairness and/or quality-of-service constraints.
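The paper's exact parametric scheme and update rule are not given in the abstract. The following is a much-simplified Python sketch in the same spirit: a single contention threshold adapted by a decreasing-step stochastic approximation so that, on average, about one user exceeds it per slot. The channel model, update constants and feedback handling are all illustrative, not the authors' algorithm.

```python
import random

# Toy threshold adaptation: idle slots lower the bar, collisions raise
# it, with a 1/n stochastic-approximation step. User metrics are drawn
# from deliberately non-identical distributions.

def run(num_users: int = 8, slots: int = 20000) -> float:
    theta = 1.0                 # contention threshold on the metric
    successes = 0
    for n in range(1, slots + 1):
        # Per-user channel metrics with different means per user.
        metrics = [random.expovariate(1.0 / (1.0 + u)) for u in range(num_users)]
        contenders = [m for m in metrics if m > theta]
        if len(contenders) == 1:
            successes += 1      # exactly one user above threshold: success
        step = 1.0 / n          # decreasing stochastic-approximation step
        if len(contenders) == 0:
            theta -= step       # idle slot: lower the bar
        elif len(contenders) > 1:
            theta += step       # collision: raise the bar
    return successes / slots

print(f"success rate ~ {run():.2f}")
```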

Relevance: 30.00%

Abstract:

We propose a set of metrics that evaluate the uniformity, sharpness, continuity, noise, stroke width variance, pulse width ratio, transient pixel density, entropy and variance of components to quantify the quality of a document image. The measures are intended to be used in any optical character recognition (OCR) engine to estimate a priori the expected performance of the OCR. The suggested measures have been evaluated on many document images in different scripts. The quality of each document image was manually annotated by users to create a ground truth, the idea being to correlate the values of the measures with the user-annotated data. If a calculated measure matches the annotated description, the metric is accepted; otherwise it is rejected. Of the metrics proposed, some are accepted and the rest rejected. We have defined metrics that are easy to estimate. The metrics proposed in this paper are based on feedback from home-grown OCR engines for Indic (Tamil and Kannada) languages. The metrics are independent of the script and depend only on the quality and age of the paper and of the printing. Experiments and results for each proposed metric are discussed. Actual recognition of the printed text is not performed to evaluate the proposed metrics. Occasionally, a document image containing broken characters is scored as a good image by the evaluated metrics; this remains an unsolved challenge. The proposed measures work on grayscale document images and fail to provide reliable information on binarized document images.
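The abstract names the metrics but not their formulas. As an illustration, here are two of the simpler ones, entropy and variance, computed with their standard definitions on a grayscale page held as a NumPy array; these are not necessarily the authors' exact definitions.

```python
import numpy as np

# Illustrative implementations of two of the named measures on a
# grayscale document image stored as a 2-D uint8 array.

def histogram_entropy(img: np.ndarray) -> float:
    """Shannon entropy of the gray-level histogram, in bits."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gray_variance(img: np.ndarray) -> float:
    """Variance of pixel intensities, a crude contrast/noise indicator."""
    return float(img.astype(float).var())

# Hypothetical "page": white background with a darker text-like band.
page = np.full((200, 300), 240, dtype=np.uint8)
page[80:120, 40:260] = 60
print(f"entropy  = {histogram_entropy(page):.3f} bits")
print(f"variance = {gray_variance(page):.1f}")
```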

Relevance: 30.00%

Abstract:

Video decoders used in emerging applications need to be flexible, to handle a large variety of video formats, and to deliver scalable performance under wide variations in workload. In this paper we propose a unified software and hardware architecture for video decoding that achieves scalable performance with flexibility. The lightweight processor tiles and the reconfigurable hardware tiles in our architecture enable software and hardware implementations to co-exist, while a programmable interconnect enables dynamic interconnection of the tiles. Our process-network-oriented compilation flow achieves realization-agnostic application partitioning and enables seamless migration across uniprocessor, multiprocessor, semi-hardware and full-hardware implementations of a video decoder. A scheduler aware of the application's quality of service monitors and controls the operation of the entire system. We prove the concept through a prototype of the architecture on an off-the-shelf FPGA. The FPGA prototype shows performance scaling from QCIF to 1080p resolutions in four discrete steps. We also demonstrate that the reconfiguration time is short enough to allow migration from one configuration to another without any frame loss.
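By way of illustration only, here is a toy process-network arrangement of decoder stages as threads connected by bounded FIFOs, with queue depth at the sink as a crude stand-in for the quality-of-service monitoring described above. The real architecture maps such stages onto processor and reconfigurable hardware tiles; all names and numbers here are hypothetical.

```python
import queue
import threading

# Toy process-network sketch: "tiles" as threads, FIFOs between them,
# and a backlog measurement that a scheduler could act on.

def parse(out_q: queue.Queue) -> None:
    for frame_id in range(50):
        out_q.put(frame_id)          # pretend: parsed bitstream units
    out_q.put(None)                  # end-of-stream token

def decode(in_q: queue.Queue, out_q: queue.Queue) -> None:
    while (item := in_q.get()) is not None:
        out_q.put(item * item)       # pretend: decoded frame
    out_q.put(None)

def display(in_q: queue.Queue, depths: list) -> None:
    while (item := in_q.get()) is not None:
        depths.append(in_q.qsize())  # monitor backlog at the sink

q1, q2, depths = queue.Queue(maxsize=8), queue.Queue(maxsize=8), []
tiles = [
    threading.Thread(target=parse, args=(q1,)),
    threading.Thread(target=decode, args=(q1, q2)),
    threading.Thread(target=display, args=(q2, depths)),
]
for t in tiles:
    t.start()
for t in tiles:
    t.join()
print(f"max sink backlog: {max(depths)} (reconfigure if persistently full)")
```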

Relevance: 30.00%

Abstract:

The objective of this project was to develop a software tool for measuring the performance of networks using 4G mobile technology, also known as LTE. To this end, a software system was created, composed of a mobile application and an application server. Together, the system collects quality indicators of various kinds from the mobile network, which are subsequently processed with mathematical software tools to produce graphs and maps for analysing the status and performance of a specific 4G network. The software was developed to prototype level, and real-world tests carried out with it gave positive operating results.

Relevance: 30.00%

Abstract:

This dissertation developed the SAQUA system (System for the Analysis of River Water Quality), which allows historical series of physico-chemical parameters to be tracked for the analysis of river water quality. SAQUA is fed from the text file generated by Hidroweb, the hydrological database system of ANA (the Brazilian National Water Agency), available on the internet. SAQUA is an interface for the spatio-temporal analysis of user-defined water quality parameters. The interface was built using the MapServer map server, the HTML and PHP languages, SQL queries and the Apache web server. Using a dynamic language such as PHP made it possible to use MapServer's internal features through functions that interact flexibly with present and future code, as well as with the HTML code. The system outputs a graphical representation of the historical series for each parameter and a map showing the locations of the monitoring stations under analysis, also defined by the user and generally associated with a given hydrographic region. In both the time-series plot and the map, a colour code highlights the monitoring station and the observations for which the limits established in CONAMA resolution 357/05 were not met. The CONAMA water-use class applied in the analysis can also be defined by the user. As a case study demonstrating SAQUA's functions, the Paraíba do Sul river basin, located in the South-East Atlantic hydrographic region of Brazil, was chosen. The application of the system showed excellent results and the potential of the computational tool to support water resources planning and management. It is also worth noting that the entire system was developed with software released under the free-software GPL licence, i.e. with no licence acquisition costs, demonstrating the potential of these tools in the water resources field.
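The exceedance check that drives SAQUA's colour coding is implemented in PHP/MapServer; a minimal Python sketch of the idea follows. The limit values below are placeholders, not the actual CONAMA 357/05 class limits.

```python
# Minimal sketch of a per-observation limit check like the one SAQUA's
# colour coding implies. Limits are illustrative placeholders only.

LIMITS = {
    # parameter: (min_allowed, max_allowed); None means unbounded
    "dissolved_oxygen_mg_l": (5.0, None),
    "ph": (6.0, 9.0),
}

def flag(parameter: str, value: float) -> str:
    lo, hi = LIMITS[parameter]
    ok = (lo is None or value >= lo) and (hi is None or value <= hi)
    return "ok" if ok else "violation"   # drives the map/plot colour

series = [("2005-01-12", 6.8), ("2005-02-15", 4.1), ("2005-03-10", 5.5)]
for date, oxygen in series:
    print(date, oxygen, flag("dissolved_oxygen_mg_l", oxygen))
```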

Relevance: 30.00%

Abstract:

In the 1990s, with the increase in the processing power and memory of computers, digital photogrammetry emerged; its main objective is the automatic mapping of natural and man-made terrain features using the digital photogrammetric image as the primary data source. Photogrammetric solutions became more compact and versatile. The E-FOTO educational digital photogrammetric workstation is a multidisciplinary project under development in the Digital Photogrammetry laboratory of the Rio de Janeiro State University (Universidade do Estado do Rio de Janeiro), resting on two pillars: self-instruction and free availability. The general objective of this work is to evaluate the quality of photogrammetric measurements made with the integrated version 1.0β of E-FOTO. Two blocks of photographs from distinct regions of the planet were used: one block (2005) of the municipality of Seropédica-RJ and one block of older photographs (1953) of the Santiago de Compostela region in Spain. The results obtained with E-FOTO were compared with those of the commercial digital photogrammetry software Leica Photogrammetry Suite (LPS 2010) and with the object-space coordinates of points measured by global satellite positioning (ground truth). This made it possible to evaluate the methodologies of the two software packages in obtaining the interior- and exterior-orientation parameters and in determining the accuracy of the object-space coordinates of checkpoints obtained in the E-FOTO stereoplotter module, version 1.64. The results obtained with the integrated version 1.0β of E-FOTO for the interior- and exterior-orientation parameters and for the checkpoint coordinates, without including additional parameters and self-calibration, are compatible with the processing carried out with LPS. The differences in the X0 and Y0 exterior-orientation parameters obtained with E-FOTO, when compared with those obtained with LPS including additional parameters and camera self-calibration, are not significant. Given the quality of the results, and according to the Brazilian Cartographic Accuracy Standard (Padrão de Exatidão Cartográfica), it would be possible to produce a Class A cartographic document for planimetry and Class B for altimetry at the 1:10,000 scale with the Rural project, and Class A for planimetry and Class C for altimetry at the 1:25,000 scale with the Santiago de Compostela project. The three-dimensional coordinates (E, N and H) of the checkpoints obtained photogrammetrically in the E-FOTO stereoplotter module, version 1.64, can be considered equivalent to those measured with satellite positioning technology.
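For context (the abstract does not quote them), interior and exterior orientation in packages of this kind rest on the standard collinearity equations relating an image point to its object-space point:

```latex
% Standard collinearity equations linking an image point (x, y) to an
% object-space point (X, Y, Z), given the principal point (x_0, y_0)
% and focal length f (interior orientation), and the projection centre
% (X_0, Y_0, Z_0) with rotation-matrix entries r_{ij} (exterior
% orientation):
\[
  x - x_0 = -f\,
    \frac{r_{11}(X - X_0) + r_{12}(Y - Y_0) + r_{13}(Z - Z_0)}
         {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)},
\]
\[
  y - y_0 = -f\,
    \frac{r_{21}(X - X_0) + r_{22}(Y - Y_0) + r_{23}(Z - Z_0)}
         {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)}.
\]
```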

Relevance: 30.00%

Abstract:

Educational games are an important teaching tool in Software Engineering, where students often go through no practical training mechanism at all. A good educational game must have well-defined educational objectives, motivate students and consolidate learning of the content. The use of games in Software Engineering teaching should be carried out in a systematic and controlled way, grounded in evaluation. The statistical technique of Experimentation allows the measurement and analysis of the variables involved in the deployment of games, so that they can be applied with quality. To better define experiments on the use of games for teaching Software Engineering, this work proposes guidelines for planning experiments with educational games, so as to make it possible to verify the influence and significance of using such games in the teaching and learning of Software Engineering concepts. An experiment with SimulES-W was carried out following these guidelines, demonstrating their applicability and the simplicity of their definition. The experience of using SimulES-W shows that learning with computer games is fun and interactive and that, although the results obtained were not statistically significant, it contributes in some measure to the teaching of Software Engineering, without prior knowledge of the content necessarily being required.

Relevance: 30.00%

Abstract:

Grass shrimp, Palaemonetes pugio, are common inhabitants of US East and Gulf coast salt marshes and are a food source for recreationally and economically important fish and crustacean species. Because of the relationship of grass shrimp with their ecosystem, any significant change in grass shrimp populations has the potential to affect the estuarine system. Land use is a crucial concern in coastal areas, where increasing development impacts the surrounding estuaries and salt marshes, and this has made grass shrimp population studies a logical choice for investigating the effects of urbanization. Any impact on tidal creeks will be an impact on grass shrimp populations and their associated micro-environment, whether predator, prey or parasitic symbiont. Anthropogenic stressors introduced into the grass shrimp ecosystem may even change the intensity of infections by parasitic symbionts. One ectoparasite found on P. pugio is the bopyrid isopod Probopyrus pandalicola; little is known about the factors that affect its occurrence in grass shrimp populations. The goal was to analyze the prevalence of P. pandalicola in grass shrimp in relation to land use classifications, water quality parameters and grass shrimp population metrics. Eight tidal creeks in coastal South Carolina were sampled monthly over a three-year period. The occurrence of P. pandalicola ranged from 1.2% to 5.7%. Analysis indicated that greater percent water and marsh coverage resulted in a higher incidence of bopyrid occurrence, and that higher bopyrid incidence occurred in creeks with higher salinity, temperature and pH but lower dissolved oxygen. The land use characteristics found to limit bopyrid incidence were also limiting to grass shrimp (definitive host) populations, and probably to copepod (intermediate host) populations as well.

Relevance: 30.00%

Abstract:

Ontologies play a core role in providing shared knowledge models to the semantics-driven applications targeted by the Semantic Web. Ontology metrics have become an important area because they can help ontology engineers to assess an ontology and to better control the project management and development of ontology-based systems, thereby reducing the risk of project failure. In this paper, we propose a set of ontology cohesion metrics focused on measuring (possibly inconsistent) ontologies in the context of a dynamic and changing Web: Number of Ontology Partitions (NOP), Number of Minimally Inconsistent Subsets (NMIS) and Average Value of Axiom Inconsistencies (AVAI). These metrics measure ontological semantics rather than ontological structure. They are theoretically validated to ensure their soundness, and further empirically validated on a standard test set of debugging ontologies. The algorithms used to compute them are also discussed. The metrics proposed in this paper can serve as a useful complement to existing ontology cohesion metrics.
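The abstract gives only the metric names. As a naive illustration of NMIS, here is a brute-force Python sketch over a toy axiom set, with a stub consistency checker standing in for a real reasoner; none of this reflects the paper's formal definitions or algorithms.

```python
from itertools import combinations

# Naive reading of NMIS (Number of Minimally Inconsistent Subsets):
# count the inconsistent axiom subsets all of whose proper subsets are
# consistent. "Inconsistent" is decided by a stub over hand-labelled
# conflicts; a real system would call a DL reasoner instead.

AXIOMS = ["Bird subClassOf Flier", "Penguin subClassOf Bird",
          "Penguin subClassOf NotFlier", "Penguin(tux)"]

# Stub: the only conflict in this toy set needs all four axioms.
CONFLICTS = [frozenset([0, 1, 2, 3])]

def inconsistent(subset: frozenset) -> bool:
    return any(c <= subset for c in CONFLICTS)

def minimally_inconsistent_subsets(n: int) -> list[frozenset]:
    """All inconsistent subsets whose proper subsets are consistent."""
    mis = []
    for k in range(1, n + 1):            # smallest subsets first
        for combo in combinations(range(n), k):
            s = frozenset(combo)
            if inconsistent(s) and not any(m < s for m in mis):
                mis.append(s)
    return mis

mis = minimally_inconsistent_subsets(len(AXIOMS))
print(f"NMIS = {len(mis)}")   # -> NMIS = 1 for this toy ontology
```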