972 results for Software quality


Relevance: 30.00%

Abstract:

Purpose: This study aims to investigate the influence of tube potential (kVp) variation on perceptual image quality and effective dose for pelvis examinations using automatic exposure control (AEC) and non-AEC in a computed radiography (CR) system. Methods and Materials: The effects of using AEC and non-AEC were determined by applying the 10 kVp rule in two experiments with an anthropomorphic pelvis phantom. Images were acquired in 10 kVp increments (60-120 kVp) for both experiments. The first experiment, based on seven AEC combinations, produced 49 images. The mean mAs from each kVp increment was used as the baseline for the second experiment, which produced 35 images. A total of 84 images were produced, and a panel of 5 experienced observers scored the images using 2-AFC visual grading software. PCXMC software was used to estimate the effective dose. Results: A decrease in perceptual image quality as the kVp increased was observed in both the non-AEC and AEC experiments; however, no statistically significant differences (p > 0.05) were found. Image quality scores from all observers at 10 kVp increments, for all mAs values in non-AEC mode, demonstrate a better score up to 90 kVp. Effective dose results show a statistically significant decrease (p = 0.000) in the 75th percentile, from 0.3 mSv at 60 kVp to 0.1 mSv at 120 kVp, when applying the 10 kVp rule in non-AEC mode. Conclusion: No significant reduction in perceptual image quality is observed when increasing kVp, whilst a marked and significant effective dose reduction is observed.
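As an illustration of the exposure logic behind the protocol above, the sketch below applies the classic 10 kVp rule (raise the tube potential by 10 kVp and roughly halve the mAs to keep detector exposure comparable). It is a minimal Python sketch; the starting technique of 60 kVp and 40 mAs is a hypothetical example, not a value taken from the study.

# Sketch of the 10 kVp rule used to derive exposure settings in non-AEC mode.
# The starting technique (60 kVp, 40 mAs) is a hypothetical example, not study data.

def ten_kvp_rule(kvp: float, mas: float, steps: int):
    """Yield (kVp, mAs) pairs, halving the mAs for every +10 kVp step."""
    for _ in range(steps + 1):
        yield kvp, mas
        kvp += 10
        mas /= 2  # approximate compensation to keep detector exposure similar

if __name__ == "__main__":
    for kvp, mas in ten_kvp_rule(60, 40, steps=6):  # covers the 60-120 kVp range
        print(f"{kvp:.0f} kVp -> {mas:.1f} mAs")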

Relevance: 30.00%

Abstract:

The change of paradigm imposed by the Bologna process, in which students become responsible for their own learning, and the presence of a new generation of students with stronger technological skills represent a huge challenge for higher education institutions. The use of new social Web concepts in the teaching process, supported by applications commonly called Web 2.0, with which these new students feel at ease, can bring benefits in terms of motivation and of the frequency and quality of students' involvement in academic activities. An e-learning platform complemented with web-based applications can contribute significantly to the development of different skills in higher education students, covering areas that are usually lacking.

Relevance: 30.00%

Abstract:

In the European Union, the energy used in buildings accounts for a large share of total consumption, around 40% of all energy produced, contributing on a large scale to greenhouse gas emissions such as CO2 [ADENE, 2014]. Minimising this consumption over a building's life cycle is a major challenge for both the environment and the economy. Today we increasingly witness the emergence of new technologies. Part of this reality is the growth and development of air handling units (UTAs), which respond to the human quest to optimise the comfort zone, indoor air quality and energy efficiency. Thus, so that thermal comfort is not sacrificed, indoor air quality must be reconciled with the energy spent to condition the spaces. To help minimise CO2 together with energy efficiency and thermal comfort, resulting in better air quality inside conditioned spaces, the objective of this work was to implement an application in LabVIEW to predict the behaviour of a real experiment. As a solution, mathematical models were used to describe the various thermal, mass and CO2 balances. The main conclusions of this work were: validation of the behaviour of the mathematical temperature model; validation of the behaviour of the mathematical CO2 model; and, for relative humidity, only 25% of the records were valid.
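A minimal sketch of the kind of CO2 mass balance described above, assuming a single well-mixed zone with volume V, ventilation flow Q and an indoor CO2 generation rate G; the governing equation is V·dC/dt = Q·(C_out − C) + G. The Python code and all numeric values are illustrative assumptions and do not reproduce the LabVIEW application or its parameters.

# Well-mixed single-zone CO2 mass balance:  V * dC/dt = Q * (C_out - C) + G
# All values below are illustrative assumptions, not parameters of the original model.

def simulate_co2(volume_m3=150.0, flow_m3_s=0.05, c_out_ppm=420.0,
                 gen_m3_s=5e-5, c0_ppm=420.0, dt_s=60.0, steps=120):
    """Explicit Euler integration of the zone CO2 concentration (in ppm)."""
    c = c0_ppm
    trace = [c]
    for _ in range(steps):
        dcdt = (flow_m3_s * (c_out_ppm - c) + gen_m3_s * 1e6) / volume_m3
        c += dcdt * dt_s
        trace.append(c)
    return trace

if __name__ == "__main__":
    print(f"CO2 after 2 h: {simulate_co2()[-1]:.0f} ppm")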

Relevance: 30.00%

Abstract:

Nowadays, companies distinguish themselves from their competitors by delivering products and services with quality and within the established deadlines. A software development company is no exception, and to achieve this the processes involved in the various phases of requirements gathering, development, implementation and support must be documented, be known throughout the organisation, and be put into practice daily in employees' activities. Continuous improvement of those same processes must contribute to this. CMMI-DEV, Capability Maturity Model Integration for Development, enables the introduction of good practices in the various process areas of software development, as well as the assessment of those areas and the identification of aspects that need to be improved or even disseminated throughout the organisation. This work involved a theoretical analysis of CMMI-DEV and its subsequent practical use in a corporate environment to assess that company's processes. For this second aspect, a questionnaire was designed to assess an organisation's processes according to the CMMI-DEV 1.3 model, the ease of use of the process assessment questionnaire was evaluated with the respondents, and the results obtained from those questionnaires were analysed.
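As a rough illustration of how such a questionnaire might be scored, the sketch below aggregates Likert-style answers per CMMI-DEV process area and flags areas below a satisfaction threshold. The process areas, responses, scale and threshold are all illustrative assumptions and not the questionnaire or data used in this work.

# Sketch: aggregate Likert-style questionnaire answers (1-5) per CMMI-DEV process
# area and flag areas below an illustrative "needs improvement" threshold.
from statistics import mean

answers = {  # hypothetical responses, not data from the dissertation
    "Requirements Management": [4, 5, 3, 4],
    "Project Planning": [2, 3, 2, 3],
    "Process and Product Quality Assurance": [3, 4, 4, 5],
}

THRESHOLD = 3.5  # illustrative cut-off

for area, scores in answers.items():
    avg = mean(scores)
    status = "OK" if avg >= THRESHOLD else "needs improvement"
    print(f"{area}: {avg:.2f} ({status})")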

Relevance: 30.00%

Abstract:

Computer graphics is a field that has grown considerably in recent years, in areas ranging from film to video games and animation; progress has been so great that the resemblance to reality keeps increasing. Nowadays practically every film contains effects generated through computer graphics, as do even simple television commercials, not to mention the realism of today's video games. This study aims to present two alternatives in the world of computer graphics and, to that end, two programs will be used: Blender and Unreal Engine. The scene in question will be modelled entirely from scratch and will be the same in both programs. Several renders of the scene will be produced in both programs, using different materials and different types of lighting, both in real time and offline, in order to show the various possible alternatives.

Relevance: 30.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance: 30.00%

Abstract:

The corporate world is becoming more and more competitive. This leads organisations to adapt to this reality by adopting more efficient processes, which result in lower costs as well as higher product quality. One of these processes consists in making proposals to clients, which necessarily include a cost estimation of the project. This estimation is the main focus of this project. In particular, one of the goals is to evaluate which estimation models best fit the Altran Portugal software factory, the organisation where the fieldwork of this thesis will be carried out. There is no broad agreement about which type of estimation model is most suitable for software projects. In contexts where plenty of objective information is available to be used as input to an estimation model, model-based methods usually yield better results than expert judgment. What happens more frequently, however, is not having this volume and quality of information, which has a negative impact on the performance of model-based methods and favours the use of expert judgment. In practice, most organisations use expert judgment, making themselves dependent on the expert. A common problem is that the performance of an expert's estimation depends on their previous experience with similar projects. This means that when new types of projects arrive, the estimation will have an unpredictable accuracy. Moreover, different experts will make different estimates, based on their individual experience. As a result, the company will not directly build up continuously growing knowledge about how estimates should be carried out. Estimation models depend on the input information collected from previous projects, the size of the project database and the resources available. Altran currently does not store the input information from previous projects in a systematic way; it has a small project database and a team of experts. Our work is targeted at companies that operate in similar contexts. We start by gathering information from the organisation in order to identify which estimation approaches can be applied considering the organisation's context. A gap analysis is used to understand what type of information the company would have to collect so that other approaches would become available. Based on our assessment, in our opinion, expert judgment is the most adequate approach for Altran Portugal in the current context. We analysed past development and evolution projects from Altran Portugal and assessed their estimates. This resulted in the identification of common estimation deviations, errors, and patterns, which led to the proposal of metrics to help estimators produce estimates that leverage past projects' quantitative and qualitative information in a convenient way. This dissertation aims to contribute to more realistic estimates by identifying shortcomings in the current estimation process and by supporting the self-improvement of the process through the gathering of as much relevant information as possible from each finished project.
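Since the work proposes metrics for assessing past estimates, the sketch below computes two measures that are standard in the software estimation literature, the mean magnitude of relative error (MMRE) and PRED(25). The (estimated, actual) effort pairs are invented for illustration and are not Altran data, nor necessarily the metrics proposed in the dissertation.

# Standard estimation-accuracy measures: MRE per project, MMRE, and PRED(25).
# The (estimated, actual) effort pairs below are invented examples, not Altran data.

def mre(estimate: float, actual: float) -> float:
    """Magnitude of relative error of a single estimate."""
    return abs(actual - estimate) / actual

def mmre(pairs):
    """Mean magnitude of relative error over a set of (estimate, actual) pairs."""
    errors = [mre(e, a) for e, a in pairs]
    return sum(errors) / len(errors)

def pred(pairs, level=0.25):
    """Fraction of estimates whose MRE is within the given level (PRED(25) by default)."""
    errors = [mre(e, a) for e, a in pairs]
    return sum(1 for err in errors if err <= level) / len(errors)

if __name__ == "__main__":
    projects = [(120, 150), (200, 180), (80, 95), (60, 58)]  # (estimated, actual) person-days
    print(f"MMRE = {mmre(projects):.2f}, PRED(25) = {pred(projects):.2f}")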

Relevance: 30.00%

Abstract:

Many municipal activities require updated large-scale maps that include both topographic and thematic information. For this purpose, the efficient use of very high spatial resolution (VHR) satellite imagery suggests the development of approaches that enable timely discrimination, counting and delineation of urban elements according to legal technical specifications and quality standards. Therefore, the nature of this data source and the expanding range of applications call for objective methods and quantitative metrics to assess the quality of the extracted information, going beyond traditional thematic accuracy alone. The present work concerns the development and testing of a new approach for using technical mapping standards in the quality assessment of buildings automatically extracted from VHR satellite imagery. Feature extraction software was employed to map buildings present in a pansharpened QuickBird image of Lisbon. Quality assessment was exhaustive and involved comparisons of extracted features against a reference data set, introducing cartographic constraints from scales 1:1,000, 1:5,000, and 1:10,000. The spatial data quality elements subject to evaluation were: thematic (attribute) accuracy, completeness, and geometric quality assessed on the basis of planimetric deviation from the reference map. Tests were developed and metrics analyzed considering thresholds and standards for the large mapping scales most frequently used by municipalities. Results show that values for completeness varied with mapping scale and were only slightly superior for scale 1:10,000. Concerning geometric quality, a large percentage of extracted features met the strict topographic standards of planimetric deviation for scale 1:10,000, while no buildings were compliant with the specification for scale 1:1,000.
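To make the quality elements concrete, here is a small sketch of how completeness and mean planimetric deviation could be computed once extracted buildings have been matched to reference buildings. The matching by identifier, the centroid coordinates and the tolerance value are assumptions for illustration only and do not reproduce the study's procedure or thresholds.

# Sketch: completeness and mean planimetric deviation for matched building centroids.
# Coordinates, identifiers and the tolerance below are illustrative assumptions.
from math import hypot

reference = {"b1": (10.0, 20.0), "b2": (55.2, 80.1), "b3": (120.4, 33.7)}  # reference centroids (m)
extracted = {"b1": (10.6, 19.5), "b3": (121.9, 34.2)}                      # extracted centroids (m)

matched = set(reference) & set(extracted)
completeness = len(matched) / len(reference)

deviations = [hypot(reference[k][0] - extracted[k][0],
                    reference[k][1] - extracted[k][1]) for k in matched]
mean_deviation = sum(deviations) / len(deviations)

TOLERANCE_M = 2.0  # hypothetical planimetric tolerance for a large mapping scale
within_tolerance = sum(1 for d in deviations if d <= TOLERANCE_M) / len(deviations)

print(f"completeness={completeness:.0%}, mean deviation={mean_deviation:.2f} m, "
      f"within tolerance={within_tolerance:.0%}")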

Relevance: 30.00%

Abstract:

Software product lines (SPL) are diverse systems that are developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products based on the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges when it comes to modelling: firstly, the formalism used for modelling SPLs needs to be modular and scalable; secondly, it should ensure that all products behave correctly by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same sort of benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Nets variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that includes the various behaviours exhibited by the products in an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
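The idea of conditioning transitions on selected features can be sketched with a very small data structure. This is a toy illustration under the assumption that a Feature Net attaches a feature guard to each Petri net transition; it is not the formal definition given in the paper, and the place, transition and feature names are invented.

# Toy sketch: a Petri net whose transitions carry feature guards (Feature Net flavour).
# Place/transition names and the feature guards are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    pre: dict                        # place -> tokens consumed
    post: dict                       # place -> tokens produced
    guard: frozenset = frozenset()   # features that must be selected in the product

@dataclass
class FeatureNet:
    marking: dict                    # place -> current tokens
    transitions: list = field(default_factory=list)

    def enabled(self, t: Transition, features: frozenset) -> bool:
        return t.guard <= features and all(self.marking.get(p, 0) >= n for p, n in t.pre.items())

    def fire(self, t: Transition, features: frozenset) -> None:
        if not self.enabled(t, features):
            raise ValueError(f"{t.name} is not enabled")
        for p, n in t.pre.items():
            self.marking[p] -= n
        for p, n in t.post.items():
            self.marking[p] = self.marking.get(p, 0) + n

net = FeatureNet(marking={"idle": 1}, transitions=[
    Transition("start_basic", {"idle": 1}, {"running": 1}),
    Transition("start_logged", {"idle": 1}, {"running": 1, "log": 1}, guard=frozenset({"Logging"})),
])
product_features = frozenset({"Logging"})
for t in net.transitions:
    print(t.name, "enabled:", net.enabled(t, product_features))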

Relevance: 30.00%

Abstract:

Dissertation for the Integrated Master's degree in Civil Engineering

Relevance: 30.00%

Abstract:

In recent years, the research group has characterised the quality of water bodies, detecting environmental quality gradients revealed by the appearance of toxicants that cause changes in the quality of water, sediment and, especially, biota. This project proposes an integrated assessment of the contamination of water resources, covering the study of both the causes of contamination and the biological responses produced by these alterations. Our main objective is therefore to assess the contamination of aquatic resources through the development and application of various tools. The project's multidisciplinary approach will make it possible to integrate the analyses of the different study areas in order to provide solutions to the widespread problem of aquatic contamination. The ultimate goal is to achieve a better assessment of temporal and spatial changes in the quality of river basins. We propose to analyse the presence and concentration of toxicants in water, soil, sediment and biota, together with the evaluation of their effects on organisms at different levels of organisation, which will make it possible to determine and select the most efficient indicators of environmental contamination. Molecular biomarkers based on gene expression in aquatic biota will be developed, along with morphological, histological and biochemical biomarkers. The effect of toxic stress on the swimming behaviour of fish will also be evaluated using software recently developed by the group. In addition, the search for specific biomarkers of endocrine disruption in fish, such as aromatase, vitellogenin, static and dynamic sperm parameters, and courtship and mating behaviour, will be intensified. The proposed plan will thus provide a set of tools of varying degrees of complexity for the correct assessment of the environmental impact of human activities. The working group aims to make a strong contribution to the baseline knowledge needed to raise awareness of the problems of the different basins under study, in order to carry out sustained monitoring of the quality of aquatic resources.

Relevance: 30.00%

Abstract:

Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
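One widely used file-level check of the kind this protocol covers is comparing each reported P-value with the P-value implied by the effect size and standard error (a "P-Z" check). The sketch below assumes a tab-delimited summary file with BETA, SE and P columns; the column names, file format and tolerance are assumptions for this example rather than a prescription from the protocol or the EasyQC package.

# Sketch of a study-file level "P-Z" consistency check on GWAS summary statistics:
# the reported P-value should match the P-value implied by BETA/SE.
# Column names (BETA, SE, P), file format and tolerance are assumptions.
import csv
import math

def p_from_z(z: float) -> float:
    """Two-sided P-value for a Z statistic, via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def pz_check(path: str, tol_log10: float = 0.5):
    """Yield rows whose reported P deviates from the BETA/SE-implied P by more than tol on the log10 scale."""
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle, delimiter="\t"):
            beta, se, p = float(row["BETA"]), float(row["SE"]), float(row["P"])
            if se <= 0 or p <= 0:
                yield row  # malformed entries are flagged as well
                continue
            implied = p_from_z(beta / se)
            if implied == 0.0:
                continue  # |Z| beyond double precision; skip the comparison
            if abs(math.log10(p) - math.log10(implied)) > tol_log10:
                yield row

# usage (hypothetical file name):
# for bad in pz_check("study1_summary.tsv"):
#     print("inconsistent row:", bad)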

Relevance: 30.00%

Abstract:

Images obtained from high-throughput mass spectrometry (MS) contain information that remains hidden when looking at a single spectrum at a time. Image processing of liquid chromatography-MS datasets can be extremely useful for quality control, experimental monitoring and knowledge extraction. The importance of imaging in differential analysis of proteomic experiments has already been established through two-dimensional gels and can now be foreseen with MS images. We present MSight, a new software designed to construct and manipulate MS images, as well as to facilitate their analysis and comparison.

Relevance: 30.00%

Abstract:

The effect of copper (Cu) filtration on image quality and dose in different digital X-ray systems was investigated. Two computed radiography systems and one digital radiography detector were used. Three different polymethylmethacrylate blocks simulated the pediatric body. The effect of Cu filters of 0.1, 0.2, and 0.3 mm thickness on the entrance surface dose (ESD) and the corresponding effective doses (EDs) was measured at tube voltages of 60, 66, and 73 kV. Image quality was evaluated in a contrast-detail phantom with automated analyzer software. Cu filters of 0.1, 0.2, and 0.3 mm thickness decreased the ESD by 25-32%, 32-39%, and 40-44%, respectively, the ranges depending on the respective tube voltages. There was no consistent decline in image quality due to increasing Cu filtration. The estimated ED of anterior-posterior (AP) chest projections was reduced by up to 23%. No relevant reduction in the ED was noted in AP radiographs of the abdomen and pelvis or in posterior-anterior radiographs of the chest. Cu filtration reduces the ESD, but generally does not reduce the effective dose. Cu filters can help protect radiosensitive superficial organs, such as the mammary glands in AP chest projections.

Relevance: 30.00%

Abstract:

OBJECTIVE: Quality assurance (QA) in clinical trials is essential to ensure treatment is safely and effectively delivered. As QA requirements have increased in complexity in parallel with evolution of radiation therapy (RT) delivery, a need to facilitate digital data exchange emerged. Our objective is to present the platform developed for the integration and standardization of QART activities across all EORTC trials involving RT. METHODS: The following essential requirements were identified: secure and easy access without on-site software installation; integration within the existing EORTC clinical remote data capture system; and the ability to both customize the platform to specific studies and adapt to future needs. After retrospective testing within several clinical trials, the platform was introduced in phases to participating sites and QART study reviewers. RESULTS: The resulting QA platform, integrating RT analysis software installed at EORTC Headquarters, permits timely, secure, and fully digital central DICOM-RT based data review. Participating sites submit data through a standard secure upload webpage. Supplemental information is submitted in parallel through web-based forms. An internal quality check by the QART office verifies data consistency, formatting, and anonymization. QART reviewers have remote access through a terminal server. Reviewers evaluate submissions for protocol compliance through an online evaluation matrix. Comments are collected by the coordinating centre and institutions are informed of the results. CONCLUSIONS: This web-based central review platform facilitates rapid, extensive, and prospective QART review. This reduces the risk that trial outcomes are compromised through inadequate radiotherapy and facilitates correlation of results with clinical outcomes.
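As a small illustration of the kind of automated consistency check described above (the internal verification of data anonymization before review), the sketch below uses the pydicom library to verify that patient-identifying tags in an uploaded DICOM-RT file have been blanked. The use of pydicom, the tag list and the policy are assumptions for this sketch; the platform's actual tooling is not described in the abstract.

# Sketch: verify that identifying tags in a submitted DICOM-RT file are anonymized.
# pydicom is assumed to be available; the tag list and policy are illustrative only.
import pydicom

IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate", "OtherPatientIDs"]

def check_anonymization(path: str) -> list:
    """Return the names of identifying tags that still carry a non-empty value."""
    dataset = pydicom.dcmread(path, stop_before_pixels=True)
    leaks = []
    for tag in IDENTIFYING_TAGS:
        value = getattr(dataset, tag, None)
        if value is not None and str(value).strip():
            leaks.append(tag)
    return leaks

# usage (hypothetical file name):
# problems = check_anonymization("RTPLAN_submission.dcm")
# print("anonymization issues:", problems or "none")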