954 results for Computer software - Quality control
Abstract:
We present a method to enhance fault localization for software systems based on a frequent pattern mining algorithm. Our method relies on a large set of test cases for a given set of programs in which faults can be detected. The test executions are recorded as function call trees. Based on test oracles, the tests are classified into successful and failing tests. A frequent pattern mining algorithm is used to identify frequent subtrees in successful and failing test executions. This information is used to rank functions according to their likelihood of containing a fault. The ranking suggests an order in which to examine the functions during fault analysis. We validate our approach experimentally using a subset of the Siemens benchmark programs.
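To make the ranking idea concrete, here is a minimal sketch (not the paper's actual algorithm, which mines frequent subtrees of call trees) that scores each function by how much more often it appears in failing than in passing executions and returns the suggested examination order; all names and data are illustrative.

```python
from collections import Counter

def rank_functions(passing_runs, failing_runs):
    """Rank functions by a simple suspiciousness score.

    passing_runs / failing_runs: lists of sets of function names observed
    in each test execution (a simplification of the call-tree patterns
    mined in the paper).
    """
    pass_freq = Counter(f for run in passing_runs for f in run)
    fail_freq = Counter(f for run in failing_runs for f in run)
    n_pass, n_fail = len(passing_runs), len(failing_runs)

    scores = {}
    for func in set(pass_freq) | set(fail_freq):
        # Functions that occur often in failing runs but rarely in
        # passing runs get a higher suspiciousness score.
        p = pass_freq[func] / n_pass if n_pass else 0.0
        q = fail_freq[func] / n_fail if n_fail else 0.0
        scores[func] = q / (p + q) if (p + q) > 0 else 0.0

    # Higher score first: the suggested order for fault analysis.
    return sorted(scores, key=scores.get, reverse=True)

# Example: 'parse' appears in every failing run but few passing runs.
passing = [{"main", "init", "parse"}, {"main", "init"}, {"main", "render"}]
failing = [{"main", "parse"}, {"main", "parse", "render"}]
print(rank_functions(passing, failing))
```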
Abstract:
This paper reviews the current state of development of both near-infrared (NIR) and mid-infrared (MIR) spectroscopic techniques for process monitoring, quality control, and authenticity determination in cheese processing. Infrared spectroscopy has been identified as an ideal process analytical technology tool, and recent publications have demonstrated the potential of both NIR and MIR spectroscopy, coupled with chemometric techniques, for monitoring coagulation, syneresis, and ripening as well as for determination of authenticity, composition, sensory, and rheological parameters. Recent research is reviewed and compared on the basis of experimental design and of the spectroscopic and chemometric methods employed, to assess the potential of infrared spectroscopy as a technology for improving process control and quality in cheese manufacture. Emerging research areas for these technologies, such as cheese authenticity and food chain traceability, are also discussed.
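As an illustration of the kind of chemometric calibration typically coupled with NIR/MIR spectra, the sketch below fits a partial least squares regression model to synthetic spectra; the data, the choice of PLS, and the number of components are assumptions and are not taken from the reviewed studies.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for NIR spectra: 60 cheese samples x 200 wavelengths,
# with a reference value (e.g. fat content) loosely encoded in two bands.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
y = 2.0 * X[:, 50] - 1.5 * X[:, 120] + rng.normal(scale=0.1, size=60)

# Partial least squares regression is a typical chemometric calibration model.
pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean())
```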
Abstract:
Climate data are used in a number of applications, including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. Available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, short periods of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-daily time scale. This involves quality control of rain gauge data, generating a locally calibrated version of the TAMSAT rainfall estimates, and combining these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as, or better than, other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms. There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
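The following is a simplified sketch of one way gauge observations can be merged with a gridded satellite rainfall field, by spreading gauge-minus-satellite residuals over the grid with inverse-distance weighting; the actual TAMSAT calibration and merging scheme is more elaborate, and all coordinates and values here are illustrative.

```python
import numpy as np

def merge_gauge_satellite(sat_grid, grid_lons, grid_lats,
                          gauge_lons, gauge_lats, gauge_vals, power=2.0):
    """Adjust a satellite rainfall field toward gauge observations by
    interpolating gauge-minus-satellite residuals with inverse-distance
    weighting (a simplified stand-in for the merging described above)."""
    # Residuals at gauge locations: observed minus the satellite estimate
    # at the nearest grid cell.
    residuals = []
    for lon, lat, val in zip(gauge_lons, gauge_lats, gauge_vals):
        i = np.abs(grid_lats - lat).argmin()
        j = np.abs(grid_lons - lon).argmin()
        residuals.append(val - sat_grid[i, j])

    # Spread the residuals over the grid with inverse-distance weights.
    lon2d, lat2d = np.meshgrid(grid_lons, grid_lats)
    correction = np.zeros_like(sat_grid)
    weight_sum = np.zeros_like(sat_grid)
    for lon, lat, r in zip(gauge_lons, gauge_lats, residuals):
        d = np.hypot(lon2d - lon, lat2d - lat) + 1e-6
        w = d ** -power
        correction += w * r
        weight_sum += w
    merged = sat_grid + correction / weight_sum
    return np.clip(merged, 0.0, None)  # rainfall cannot be negative

# Tiny example: a 4x4 satellite field corrected by two gauges.
lons, lats = np.linspace(37, 40, 4), np.linspace(7, 10, 4)
sat = np.full((4, 4), 12.0)
print(merge_gauge_satellite(sat, lons, lats, [37.5, 39.5], [7.5, 9.5], [15.0, 9.0]))
```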
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large data sets.
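A minimal sketch of a complex-valued extreme learning machine is shown below: random complex input weights are fixed and only the output weights are solved by least squares. The synthetic complex "spectra", the activation function, and the hidden-layer size are assumptions and do not reproduce the paper's kernels or data.

```python
import numpy as np

def train_celm(X, labels, n_hidden=100, seed=0):
    """Minimal complex-valued extreme learning machine.

    X: complex-valued feature matrix (samples x features), e.g. amplitude
    and phase of THz spectra combined as amplitude * exp(i * phase).
    labels: integer class labels.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Random complex input weights and biases stay fixed (never trained).
    W = rng.normal(size=(n_features, n_hidden)) + 1j * rng.normal(size=(n_features, n_hidden))
    b = rng.normal(size=n_hidden) + 1j * rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # hidden activations
    T = np.eye(labels.max() + 1)[labels]        # one-hot targets
    beta = np.linalg.pinv(H) @ T                # output weights by least squares
    return W, b, beta

def predict_celm(X, W, b, beta):
    H = np.tanh(X @ W + b)
    # Take the real part of the output scores before choosing a class.
    return (H @ beta).real.argmax(axis=1)

# Toy example with two classes of synthetic complex "spectra".
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 16)) * np.exp(1j * rng.normal(size=(40, 16)))
y = np.repeat([0, 1], 20)
X[y == 1] *= 1.5                                 # make the classes separable
W, b, beta = train_celm(X, y)
print("training accuracy:", (predict_celm(X, W, b, beta) == y).mean())
```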
Abstract:
When using digital halftone proofing systems, a closer print match can be achieved than was previously possible with analogue proofing systems. The same capabilities that allow these proofing systems to produce accurate print matches can also lead to poor print matches, since several print-related parameters can be adjusted manually in the system by the user. Therefore, more advanced knowledge of graphic arts technology is required of the user of the system. The prepress company Colorcraft AB wishes to verify that their colour proofs always have the right quality. This project was started with the purpose of finding a quality control method for Colorcraft's digital halftone proofing system (Kodak Approval XP4). Using software that supports spectral measurement, combined with a spectrophotometer and a control bar, a quality control system was assembled. This system detects variations that lie outside the proofing system's natural deviation. The prerequisite for this quality control system is that the tolerances are defined with consideration for the proofing system's natural deviations. Otherwise the quality control system will generate unnecessary false alarms and therefore not be reliable.
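As an illustration of the kind of check such a system performs, the sketch below compares measured control-bar patches against reference L*a*b* values using the CIE76 colour difference and flags patches outside a tolerance; the reference values and the 3.0 ΔE tolerance are hypothetical and are not taken from the project.

```python
import numpy as np

# Hypothetical reference L*a*b* values for a few control-bar patches and a
# tolerance chosen to sit just outside the proofing system's natural deviation.
reference = {"cyan":    (54.0, -37.0, -50.0),
             "magenta": (48.0,  74.0,  -3.0),
             "yellow":  (89.0,  -5.0,  93.0)}
tolerance_dE = 3.0   # assumed per-patch tolerance (not from the thesis)

def delta_e76(lab1, lab2):
    """CIE76 colour difference between two L*a*b* triples."""
    return float(np.linalg.norm(np.subtract(lab1, lab2)))

def check_proof(measured):
    """Return patches whose deviation from the reference exceeds tolerance."""
    return {name: delta_e76(lab, reference[name])
            for name, lab in measured.items()
            if delta_e76(lab, reference[name]) > tolerance_dE}

measured = {"cyan": (54.5, -36.2, -49.1), "magenta": (44.0, 70.0, -1.0),
            "yellow": (88.7, -5.3, 92.5)}
print("out-of-tolerance patches:", check_proof(measured))
```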
Abstract:
Single-page applications have historically been subject to strong market forces driving fast development and deployment in lieu of quality control and changeable code, which are important factors for maintainability. In this report we develop two functionally equivalent applications using AngularJS and React and compare their maintainability as defined by ISO/IEC 9126. AngularJS and React represent two distinct approaches to web development, with AngularJS being a general framework providing rich base functionality and React a small specialized library for efficient view rendering. The quality comparison was accomplished by calculating the Maintainability Index for each application. Version control analysis was used to determine quality indicators during development and subsequent maintenance, where new functionality was added in two steps. The results show no major differences in maintainability in the initial applications. As more functionality is added, the Maintainability Index decreases faster in the AngularJS application, indicating a steeper increase in complexity compared to the React application. Source code analysis reveals that changes in data flow require significantly larger modifications of the AngularJS application due to its inherent architecture for data flow. We conclude that frameworks are useful when they facilitate development of known requirements but less so when applications and systems grow in size.
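For reference, the classic Maintainability Index combines Halstead volume, cyclomatic complexity, and lines of code. The sketch below computes the unnormalised variant; the report may use a normalised form, and the example figures are invented.

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
    """Classic (unnormalised) Maintainability Index:
    MI = 171 - 5.2*ln(V) - 0.23*CC - 16.2*ln(LOC)
    Several variants exist (e.g. rescaled to 0..100), so treat this as illustrative.
    """
    return (171
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(loc))

# Example: a module growing in size and complexity sees its MI drop.
print(maintainability_index(halstead_volume=1200, cyclomatic_complexity=12, loc=300))
print(maintainability_index(halstead_volume=2600, cyclomatic_complexity=25, loc=700))
```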
Abstract:
For many years, drainage design was mainly about providing sufficient network capacity. This traditional approach has been successful with the aid of computer software and technical guidance. However, drainage design criteria have been evolving due to rapid population growth, urbanisation, climate change and increasing sustainability awareness. Sustainable drainage systems that bring benefits in addition to water management have been recommended as better alternatives to conventional pipes and storages. Although the concepts and good practice guidance have been communicated to decision makers and the public for years, network capacity still remains a key design focus in many circumstances, while the additional benefits are generally considered secondary. Yet the picture is changing. The industry is beginning to realise that delivering multiple benefits should be given top priority, while the drainage service itself can instead be considered a secondary benefit. The shift in focus means the industry has to adapt to new design challenges. New guidance and computer software are needed to assist decision makers. For this purpose, we developed a new decision support system. The system consists of two main components – a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. Users can systematically quantify the performance, life-cycle costs and benefits of different drainage systems using the evaluation framework. The optimisation tool can assist users in determining combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper we focus on the optimisation component of the decision support framework. The optimisation problem formulation, parameters and general configuration are discussed. We also look at the sensitivity of individual variables and the benchmark results obtained using common multi-objective optimisation algorithms. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
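The sketch below illustrates the flavour of such an optimisation: a toy design space of drainage components is evaluated for life-cycle cost and additional benefit, and the Pareto-optimal designs are extracted. The cost and benefit models, component types and sizes are purely illustrative assumptions, not the project's actual problem formulation or algorithms.

```python
from itertools import product

# Hypothetical candidate design space: storage size (m^3) and component type.
sizes = [50, 100, 200, 400]
types = {"pipe": (1.0, 0.0), "pond": (1.8, 0.6), "swale": (1.4, 0.9)}
# Each type maps to (cost factor per m^3, amenity benefit per m^3); illustrative only.

def evaluate(size, kind):
    cost_factor, benefit_factor = types[kind]
    cost = 500 + cost_factor * size            # crude life-cycle cost proxy
    benefit = benefit_factor * size ** 0.8     # diminishing additional benefits
    return cost, benefit

def pareto_front(candidates):
    """Keep designs not dominated in (minimise cost, maximise benefit)."""
    front = []
    for c in candidates:
        _, cost_c, ben_c = c
        dominated = any(cost_o <= cost_c and ben_o >= ben_c and
                        (cost_o < cost_c or ben_o > ben_c)
                        for _, cost_o, ben_o in candidates)
        if not dominated:
            front.append(c)
    return front

candidates = [((s, k), *evaluate(s, k)) for s, k in product(sizes, types)]
for design, cost, benefit in pareto_front(candidates):
    print(design, round(cost, 1), round(benefit, 1))
```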
Abstract:
The present work aimed to study the uniformity of distribution of herbicide spray mixtures in perennial shrub crops, using combinations of spray nozzles on a shielded side boom operated at a short distance from the target along the crop row. For this purpose, a computer program was developed that simulates the overlap of the spray fan, of the shielded portion of the boom, and of the fan formed by the outermost nozzle of the boom, in a manner different from other programs. After the best nozzle combinations were selected by simulating the deposition patterns of the individual nozzles and retaining combinations with coefficients of variation below 10%, some of these combinations were tested in the field by applying a systemic herbicide (glyphosate) and a contact herbicide (paraquat). The results indicated that the computer program developed can be a valuable aid in selecting the best spray nozzle combinations. In applications of both glyphosate and paraquat at reduced spray volumes, below 100 L ha-1, the most efficient arrangements were: a) TT110015 nozzles spaced 52.5 cm apart, combined with a TK-0.5 nozzle at the end of the boom, 50 cm from the last nozzle, operating at a speed of 5 km h-1 and a pressure of 103 kPa (15 lbf in-2), with the travel path 20 cm from the tree trunk; b) SMCE2 nozzles spaced 15 cm apart, combined with a TK-0.5 nozzle at the end of the boom, 20 cm from the last nozzle, operating at a speed of 4 km h-1 and a pressure of 414 kPa (60 lbf in-2), with the travel path 30 cm from the tree trunk; and c) TLX-2 nozzles spaced 15 cm apart, combined with a TK-0.5 nozzle at the end of the boom, 20 cm from the last nozzle, operating at a speed of 5 km h-1 and a pressure of 414 kPa (60 lbf in-2), with the travel path 30 cm from the tree trunk. A sprayer travel speed of 5 km h-1 provided better weed control with the herbicides studied than a travel speed of 4 km h-1.
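A rough sketch of the kind of overlap simulation described above is given below: each nozzle is assumed to produce a triangular deposition pattern, the patterns are summed along the row, and the coefficient of variation of the combined profile is computed. The triangular model and the numeric values are simplifying assumptions, not the program's actual measured single-nozzle patterns.

```python
import numpy as np

def deposition_profile(boom_positions, fan_half_width, x):
    """Summed deposition along the crop row from nozzles at boom_positions (cm),
    each assumed to produce a triangular deposition pattern (a simplification
    of the single-nozzle patterns measured in the study)."""
    total = np.zeros_like(x)
    for pos in boom_positions:
        total += np.clip(1.0 - np.abs(x - pos) / fan_half_width, 0.0, None)
    return total

def coefficient_of_variation(profile):
    return 100.0 * profile.std() / profile.mean()

# Nozzles spaced 52.5 cm apart plus an end nozzle 50 cm beyond the last one,
# evaluated over the central portion of the sprayed band.
positions = [0.0, 52.5, 105.0, 155.0]
x = np.linspace(20.0, 135.0, 500)
profile = deposition_profile(positions, fan_half_width=40.0, x=x)
print(f"CV = {coefficient_of_variation(profile):.1f}%  (target: below 10%)")
```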
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Nowadays, when market competition requires products with better quality, constant cost savings and better use of raw materials, the search for more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through the composition of the products. However, chemical composition analysis has a long measurement time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide a better process yield. Natural gas processing is one of the most important activities in the petroleum industry. The main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. In this work, an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in LPG and the propane mole fraction in the residual gas. The goal is to provide the values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor the LPG quality and to reduce the propane loss in the process. To develop this work, an NGPU composed of two distillation columns, a deethanizer and a debutanizer, was simulated in HYSYS software. The inference is performed using the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. A simple strategy is also proposed to correct the inferential system in real time, based on measurements from the chromatographs that may exist in the process under study.
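A minimal sketch of the hybrid inferential structure described above (PCA to reduce the inputs, followed by a single multilayer network estimating the three compositions) is given below using synthetic data; the variables, relationships and hyperparameters are illustrative assumptions, not the thesis's simulated NGPU data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for PID controller variables from the deethanizer and
# debutanizer columns (temperatures, pressures, flows): 500 samples x 20 tags.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
# Three inferred outputs: ethane and pentane mole fractions in LPG and the
# propane fraction in residual gas (synthetic relationships for illustration).
Y = np.column_stack([
    0.02 + 0.01 * np.tanh(X[:, 0] - 0.5 * X[:, 3]),
    0.01 + 0.005 * np.tanh(X[:, 7] + 0.3 * X[:, 12]),
    0.03 + 0.02 * np.tanh(0.4 * X[:, 5] - X[:, 15]),
]) + rng.normal(scale=1e-3, size=(500, 3))

# PCA reduces the number of network inputs before a single multilayer network
# estimates all three compositions (the hybrid inferential structure above);
# the hyperparameters here are illustrative.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=8),
    MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
)
model.fit(X[:400], Y[:400])
print("held-out R^2:", model.score(X[400:], Y[400:]))
```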