921 results for Objective Image Quality
Abstract:
The objective of this paper was to show the additional insight that results from adding greenhouse gas (GHG) emissions to plant performance evaluation criteria, such as the effluent quality (EQI) and operational cost (OCI) indices, when evaluating (plant-wide) control/operational strategies in wastewater treatment plants (WWTPs). The proposed GHG evaluation is based on a set of comprehensive dynamic models that estimate the most significant potential on-site and off-site sources of CO2, CH4 and N2O. The study calculates and discusses the changes in EQI, OCI and GHG emissions that result from varying four process variables: (i) the set point of the aeration controller in the activated sludge section; (ii) the total suspended solids (TSS) removal efficiency of the primary clarifier; (iii) the temperature in the anaerobic digester (AD); and (iv) the control of the flow of AD supernatants coming from sludge treatment. Based upon the assumptions built into the model structures, simulation results highlight the potential undesirable effect of increased GHG production when local energy optimization of the aeration system and energy recovery from the AD are carried out. Although off-site CO2 emissions may decrease, the effect is counterbalanced by increased N2O emissions, especially since N2O has a roughly 300-fold stronger greenhouse effect than CO2. The reported results emphasize the importance and usefulness of using multiple evaluation criteria to compare and evaluate (plant-wide) control strategies in a WWTP for more informed operational decision making.
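The trade-off described above hinges on how the gases are weighted: aggregating CO2, CH4 and N2O into a single CO2-equivalent figure multiplies each mass by its global warming potential (GWP). A minimal sketch of that aggregation, where the daily totals are hypothetical and the GWP factors are assumptions about the chosen horizon (the ~300 factor for N2O matches the figure cited in the abstract):

```python
# Sketch: aggregating plant emissions into CO2-equivalents.
# GWP factors are assumptions (100-year-horizon, IPCC-style values);
# the ~300 factor for N2O matches the figure cited in the abstract.
GWP = {"CO2": 1.0, "CH4": 28.0, "N2O": 300.0}

def co2_equivalent(emissions_kg: dict) -> float:
    """Return total emissions in kg CO2-equivalent."""
    return sum(mass * GWP[gas] for gas, mass in emissions_kg.items())

# Hypothetical daily plant-wide totals (kg/d) for illustration only.
daily = {"CO2": 5000.0, "CH4": 20.0, "N2O": 6.0}
total = co2_equivalent(daily)  # 5000 + 560 + 1800 = 7360 kg CO2e/d
```

Even a small absolute increase in N2O mass dominates the total, which is the counterbalancing effect the abstract reports.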
Abstract:
This paper synthesises the results obtained during five air quality measurement campaigns carried out in 2003 and 2004 in the Portuguese city of Viana do Castelo, both to characterise the baseline situation and to accompany the Polis Programme, an urban requalification and environmental enhancement plan. The main objective of the monitoring programme was to evaluate the atmospheric pollutants whose levels were liable to increase during the urban public works. The results presented refer to measurements performed at two distinct sites in the city, each comprising several consecutive days of acquisition that include at least one weekend day.
Abstract:
In a short time, the Internet has become the medium most used by tourists to plan, organize and purchase a trip; it is therefore proposed to offer the same facilities at the destination. Dynamic Advertising, or "Digital Signage", is a new communication service consisting of a set of technologies and computer applications that make it possible to broadcast multimedia messages and thus communicate in an innovative way with each company's target audience; adding an independent, multimedia, interactive system that can be used to provide information and/or carry out transactions maximizes the service. It is thus proposed to create a Multimedia Digital Network of Interactive Kiosks, each supported by a plasma screen for the Digital Signage technology. The strategically chosen locations are points of greatest tourist traffic, such as hotel entrances. The aim is to create closed circuits in the geographic areas containing the main tourist centres of Mallorca. The possibility of reaching population segments highly interesting for the product or service multiplies, since this is an easy, effective and highly suggestive way of promoting whatever is intended. One advantage is the simplicity of the required technological infrastructure: the device on which the messages are displayed is a conventional plasma screen, together with a point-of-sale terminal installed in a busy location. Each module is connected to the ADSL network through a local Internet server. The network connection is essential so that content maintenance and updates can be performed remotely. The main objective of this work is to study the feasibility of deploying the network by means of a market study analysing the key groups for its implementation: hoteliers, the tourism industry and the Balearic Government.
The benefits the new service will bring and the repercussions of its installation are identified. Among the most notable results of this study are the acceptance of the idea among the hoteliers interviewed and the positive response from the tourism industry. The study recognizes an improvement in the sector's image, the service's potential use as a tourism promotion tool by the Government, and its contribution to economic sustainability, since it increases company competitiveness and thereby improves service quality.
Abstract:
Large enterprises have for many years employed eBusiness solutions to improve their efficiency. Smaller companies, however, have not been able to leverage these technologies because of the high level of know-how and resources required to implement them. To solve this, novel software services are being developed to facilitate eBusiness adoption for small enterprises, with the aim of making B2B integration (B2Bi) feasible not only between large organisations but also between trading partners of all sizes. The objective of this study was to find which standards and techniques for eBusiness, software testing and quality assurance fit best for building these new kinds of software, considering the requirements their unique eBusiness approach poses. The research was conducted as a literature study focused on standards for software testing and quality assurance together with standards for eBusiness. The study showed that current software testing and quality assurance standards do not possess characteristics that would make particular standards evidently better suited for building this type of software, which was established to be best developed as web services in order to meet its requirements. A selection of eBusiness standards and technologies was proposed to support this approach. The main finding of the study, however, was that web services with high interoperability requirements must be able to carry out automated interoperability and conformance testing as part of their operation; this objective dictates how the software is built and how testing during development is to be done. The study showed that research on automated interoperability and conformance testing for web services is still limited, and more research is needed to make the building of highly interoperable web services more feasible.
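As an illustration of what automated conformance testing of a service message might involve, the sketch below checks a JSON payload against a hand-written field specification. The payload format, field names and spec are hypothetical assumptions; a real B2Bi service would validate against a formal schema language (e.g. XML Schema or JSON Schema) rather than this toy check:

```python
# Minimal sketch of automated conformance checking for a B2B message,
# assuming a JSON payload and a hand-written field spec (hypothetical).
import json

SPEC = {"order_id": str, "quantity": int, "unit_price": float}

def conforms(payload: str, spec: dict) -> list:
    """Return a list of conformance violations (empty list = conformant)."""
    try:
        msg = json.loads(payload)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    violations = []
    for field, ftype in spec.items():
        if field not in msg:
            violations.append(f"missing field: {field}")
        elif not isinstance(msg[field], ftype):
            violations.append(f"wrong type for {field}")
    return violations

ok = conforms('{"order_id": "A-17", "quantity": 3, "unit_price": 9.5}', SPEC)
bad = conforms('{"order_id": "A-17"}', SPEC)  # two fields missing
```

Running such a check at message-exchange time, rather than only during development, is the kind of in-operation conformance testing the study calls for.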
Abstract:
As its primary objective, this thesis examines Finnair Technical Procurement's service quality and its underlying process. As an internal unit, Technical Procurement serves as a link between external suppliers and internal customers. It is argued that external service quality requires a certain quality level within the organization. At the same time, the aircraft maintenance business is subject to economic constraints. Therefore, a methodology was developed around a modified House of Quality that assists management in analyzing and evaluating Technical Procurement's service level and the connected process steps. It could be shown that qualitative and quantitative objectives do not per se exclude each other.
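The thesis's modified House of Quality is not specified in this abstract; the sketch below shows only the generic QFD-style scoring step, in which requirement weights are propagated through a requirement-to-process-step relationship matrix to rank process steps. All weights, relationship strengths and the 0/1/3/9 convention are illustrative assumptions:

```python
# Sketch of the House of Quality scoring step: customer-requirement
# weights are propagated through a requirement x process-step
# relationship matrix. All numbers below are hypothetical.
import numpy as np

# Importance of three internal-customer requirements (sums to 1).
req_weights = np.array([0.6, 0.3, 0.1])

# Relationship strengths (common 0/1/3/9 QFD convention) between
# requirements (rows) and three procurement process steps (columns).
relations = np.array([
    [9, 3, 0],
    [1, 9, 1],
    [0, 3, 9],
])

priorities = req_weights @ relations    # weighted technical importance
ranking = priorities.argsort()[::-1]    # most important step first
```

The ranked priorities are what management would use to decide which process steps to analyze first.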
Abstract:
The objective of this work has been to study why systems thinking should be used in combination with TQM, what the main benefits of the integration are, and how it could best be done. The work analyzes the development of systems thinking and TQM over time and the main differences between them. It defines the prerequisites for adopting a systems approach and the organizational factors that underpin the development of an efficient learning organization. The work proposes a model, based on a combination of an interactive management model and organizational redesign, for applying the systems approach with TQM in practice. The results indicate that there are clear differences between systems thinking and TQM which justify their combination: the systems approach provides an additional, complementary perspective to quality management. TQM focuses on optimizing operations at the operational level, while interactive management and organizational redesign focus on optimizing operations at the conceptual level, providing a holistic system for value generation. The empirical study demonstrates the applicability of the proposed model in one case study company, but its application is tenable and possible beyond this particular company as well. System dynamics modeling and other systems-based techniques, such as cognitive mapping, are useful methods for increasing understanding of and learning about the behavior of systems. The empirical study also emphasizes the importance of using a proper early warning system.
Abstract:
In this thesis, a Peer-to-Peer communication middleware for mobile environments is developed using the Qt framework and the Qt Mobility extension. The Peer-to-Peer middleware, called PeerHood, is intended for service sharing in the network neighborhood. In addition, PeerHood provides service connectivity and device monitoring functionality. The PeerHood concept is already available as a native C++ implementation on the Linux platform using services from that platform. In this work, the concept is reimplemented on top of the Qt framework. The objective of the new solution is to increase PeerHood's quality by using functionality from the Qt framework and the Qt Mobility extension. Furthermore, by using the Qt framework, the PeerHood middleware can be implemented as a portable cross-platform middleware. The quality of the new implementation is evaluated with defined quality factors and compared with the existing PeerHood. Reliability, CPU usage, memory usage and static code analysis metrics are used in the evaluation. The new PeerHood is shown to be more reliable and flexible than the existing one.
Abstract:
With the increasing use of digital media, methods for multimedia protection have become extremely important. The number of solutions to the problem, from encryption to watermarking, is large and growing every year. This work considers digital image watermarking, specifically a novel method for digital watermarking of color and spectral images. An overview of existing methods for watermarking color and grayscale images is given. Methods using independent component analysis (ICA) for detection, and those using the discrete wavelet transform (DWT) and discrete cosine transform (DCT), are considered in more detail. The novel watermarking method proposed in this paper allows a color or spectral watermark image to be embedded into a color or spectral image, respectively, and the watermark to be successfully extracted from the resulting watermarked image. A number of experiments were performed on the quality of extraction as a function of the parameters of the embedding procedure. Another set of experiments tested the robustness of the proposed algorithm. Three techniques were chosen for that purpose: the median filter, the low-pass filter (LPF) and the discrete cosine transform (DCT), which are part of the widely known StirMark Image Watermarking Robustness Test. The study shows that the proposed watermarking technique is fragile, i.e. the watermark is altered by simple image processing operations. Moreover, we found that the contents of the image to be watermarked do not affect the quality of the extraction. The mixing coefficients, which determine the amount of the key and watermark image in the result, should not exceed 1% of the original. The proposed algorithm has proven successful in the task of watermark embedding and extraction.
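The role of a small mixing coefficient can be illustrated with the simplest additive (linear mixing) scheme; this is only a sketch of the general idea, not the ICA-based method the abstract proposes, and the image and watermark arrays are hypothetical:

```python
# Sketch of linear-mixing watermark embedding and (non-blind) extraction.
# alpha plays the role of the mixing coefficient, kept small (<= 1%)
# as the abstract recommends; the data below is hypothetical.
import numpy as np

def embed(image, watermark, alpha=0.01):
    """Additively mix a watermark into an image with weight alpha."""
    return (1.0 - alpha) * image + alpha * watermark

def extract(watermarked, original, alpha=0.01):
    """Recover the watermark, assuming the original image is known."""
    return (watermarked - (1.0 - alpha) * original) / alpha

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))   # hypothetical small RGB image
wm = rng.random((8, 8, 3))    # hypothetical watermark
recovered = extract(embed(img, wm), img)
```

Because the watermark contributes only ~1% of the signal, any filtering of the watermarked image perturbs it strongly relative to its amplitude, which is consistent with the fragility the study reports.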
Abstract:
The objective of this essay was to correlate the lignin content resulting from thigmomorphogenesis induced by stem swaying with the survival and post-planting growth of P. taeda seedlings. Seedlings were subjected to daily frequencies (0, 5, 10, 20 and 40 movements) of stem swaying for 60 days. At the end of the treatments, we determined the lignin content of below- and aboveground seedling tissues. Four replicates per treatment were planted in an area cultivated with pines. Ninety days after planting, survival and the increments in seedling height, stem diameter and stem volume were quantified. Application of 20 stem swayings increased lignin in both below- and aboveground plant tissues. Outplanted seedling survival was reduced with 40 stem swayings, while growth increments were increased with both 10 and 20 stem swayings. Lignin content in belowground plant tissues was positively correlated with outplanted seedling survival, while lignin in aboveground tissues correlated with height and stem volume increments. P. taeda seedlings with higher lignin content have a higher chance of survival after planting.
Abstract:
Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, be easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not even be aware of the presence of a software system until it malfunctions or fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems, where quality is an essential issue, since any deficiency may lead to considerable financial loss or endanger lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development from the early design stages onwards, increase the likelihood of obtaining a system that works as required. However, formal methods are often considered difficult to use in traditional development processes. It is therefore important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs through graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that certain techniques for the evaluation of rigorous developments need to be established. Since we study various development settings and methods, a specific measurement plan and set of metrics need to be created for each setting.
Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. We therefore focus on the specification and modelling phases, as well as the related artefacts, e.g. models, which have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to leverage the quality of a system via metrics and measurements, as well as generic refinement patterns applied to models and specifications. We argue that these can facilitate the process of creating software systems, e.g. by controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on the structural, syntactic and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurement as an indispensable part of the quality control process and as a strategy towards quality improvement.
Abstract:
Quality is not only free; it can be a profit maker. Every dollar that is not spent on doing things wrong becomes a dollar right on the bottom line. The main objective of this thesis is to show how the cost of poor quality can be measured in a theoretically correct way. Different calculation methods for the cost of poor quality are presented and discussed in order to give a comprehensive picture of the measurement process. The second objective is to apply the knowledge from the literature review to create a method for measuring the cost of poor quality in supplier performance rating. The literature review indicates that the P-A-F (prevention-appraisal-failure) model together with the ABC (activity-based costing) methodology provides a means for quality cost calculation: these models answer what should be measured and how the measurement should be carried out. However, when product or service quality costs are incurred as a quality characteristic deviates from its target value, the quality loss function (QLF) seems to be the most appropriate methodology for quality cost calculation. These methodologies were applied to create a quality cost calculation method for supplier performance ratings.
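The quality loss function mentioned above, in its common Taguchi nominal-the-best form, is L(y) = k(y - T)^2: loss grows quadratically as a quality characteristic y deviates from its target T. A minimal sketch with hypothetical cost and tolerance figures:

```python
# Sketch of the Taguchi quality loss function (QLF).
# The repair cost and tolerance below are hypothetical.
def quality_loss(y: float, target: float, k: float) -> float:
    """Taguchi QLF: L(y) = k * (y - target)**2."""
    return k * (y - target) ** 2

# k is usually derived from the repair cost A at the tolerance limit d:
# k = A / d**2. Assume a $50 repair cost at a tolerance of 0.5 mm.
A, d = 50.0, 0.5
k = A / d ** 2                                 # 200 $/mm^2
loss_on_target = quality_loss(10.0, 10.0, k)   # 0.0: no loss at target
loss_at_limit = quality_loss(10.5, 10.0, k)    # 50.0: full repair cost
```

Unlike a pass/fail tolerance view, the QLF assigns a nonzero cost to any deviation from target, which is why it suits the supplier-rating use case described above.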
Abstract:
The objective of this study was to evaluate the effects of applying different water depths and nitrogen and potassium doses on the quality of Tanzania grass in the south of the state of Tocantins. The experiment was conducted in strips under conventional sprinklers, using as treatments fertilizer combinations of N and K2O, always in the ratio of 1 N:0.8 K2O. Throughout the experiment, plant height (PH), crude protein (CP) and neutral detergent fiber (NDF) were determined. The greatest plant height obtained was 132.4 cm, with a fertilizer dose of 691.71 kg ha-1 in the proportion of 1 N:0.8 K2O, i.e. 384.28 kg ha-1 of N and 307.43 kg ha-1 of K2O, and a water depth of 80% of the ETc. The highest crude protein content was 12.2%, with a fertilizer dose of 700 kg ha-1 yr-1 in the proportion of 1 N to 0.8 K2O, i.e. 388.89 kg ha-1 of N and 311.11 kg ha-1 of K2O, and no irrigation. The lowest neutral detergent fiber level was 60.7%, with the application of the smallest fertilizer dose and the greatest water depth. It was concluded that plant height increased with increasing fertilizer dose and water depth. The crude protein content increased by 5.4% in the dry season with increasing fertilizer dose and water depth. In the dry season, the NDF content increased by 4.5% with increasing fertilizer application and water depth.
Abstract:
The objective of this study was to evaluate the performance of two genotypes of elephant grass, fertilized with and without N, for biomass production for energy use under the edaphoclimatic conditions of the Cerrado. The genotypes Roxo and Paraíso, grown in a field experiment on a Latosol in the Cerrado region, were evaluated for biomass yield, nitrogen accumulation, C:N and stem:leaf ratios, fibre, ash, P and K contents, and calorific value. The accumulated dry biomass ranged from 30 to 42 Mg ha-1 and showed no response to nitrogen fertilization, with the lowest biomass obtained by the genotype Paraíso and the highest by Roxo. Total N accumulation followed the same pattern as dry matter, ranging from 347 to 539 kg N ha-1. The C:N and stem:leaf ratios of the biomass produced did not vary with treatments. The fibre contents were higher in genotype Paraíso, and the ash levels were highest in genotype Roxo. The K content of the biomass was higher in genotype Roxo, while P did not vary between genotypes. The calorific value averaged 18 MJ kg-1 of dry matter and did not vary with the levels of N in the leaves and stems of the plant. Both genotypes, independent of N fertilization, produced over 30 Mg ha-1 of biomass under Cerrado conditions.
Abstract:
The objective of this study was to map land use and occupation and to evaluate the quality of the irrigation water used in Salto do Lontra, in the state of Paraná, Brazil. SPOT-5 satellite images were used to perform supervised classification with the Maximum Likelihood algorithm (MAXVER), and the water quality parameters analyzed were pH, EC, HCO3-, Cl-, PO4(3-), NO3-, turbidity, temperature and thermotolerant coliforms, in two distinct rainfall periods. The water quality data were subjected to statistical analysis using principal component analysis (PCA) and factor analysis (FA) to identify the most relevant variables for assessing irrigation water quality. The characterization of land use and occupation by the MAXVER classifier allowed the identification of the following classes: crops, bare soil/stubble, forests and urban area. The PCA technique applied to the irrigation water quality data explained 53.27% of the variation in water quality among the sampled points. Nitrate, thermotolerant coliforms, temperature, electrical conductivity and bicarbonate were the parameters that best explained the spatial variation of water quality.
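The PCA step described above reduces to a singular-value decomposition of the centred data matrix, and the explained-variance figure (53.27%) is the share of total variance carried by the retained components. A minimal sketch on hypothetical data (the sample counts, parameter names and values are assumptions, not the study's):

```python
# Sketch of the PCA variance decomposition used in water-quality studies:
# each principal component's share of total variance comes from the
# squared singular values of the centred data matrix.
import numpy as np

def explained_variance_ratio(X):
    """Fraction of total variance carried by each principal component."""
    Xc = X - X.mean(axis=0)                  # centre each variable
    s = np.linalg.svd(Xc, compute_uv=False)  # singular values, descending
    var = s ** 2
    return var / var.sum()

rng = np.random.default_rng(1)
# 20 sampling points x 5 hypothetical parameters (pH, EC, NO3-, ...).
X = rng.random((20, 5))
ratios = explained_variance_ratio(X)  # e.g. ratios[:2].sum() would be the
                                      # variance explained by the first two PCs
```

Summing the leading entries of `ratios` gives the kind of "X% of variation explained" figure the study reports.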
Abstract:
Precision agriculture (PA) allows farmers to identify and address variations within an agricultural field. Management zones (MZs) make PA more feasible and economical. The most important method for defining MZs is the fuzzy C-means algorithm, but selecting the variables to use as the input layer in the fuzzy process is problematic. BAZZI et al. (2013) used Moran's bivariate spatial autocorrelation statistic to identify variables that are spatially correlated with yield, and proposed that all redundant variables be eliminated and the remaining variables be considered appropriate for the MZ generation process. Thus, the objective of this work, a case study, was to test the hypothesis that redundant variables can harm the MZ delineation process. The work was conducted in a 19.6-ha commercial field, and 15 MZ designs were generated by a fuzzy C-means algorithm, divided into two to five classes. Each design used a different composition of the variables copper, silt, clay and altitude. Some combinations of these variables produced superior MZs, but none performed statistically better than the MZ generated without redundant variables; thus, the redundant variables can be disregarded. The design with all variables did not provide greater separation and organization of the data among MZ classes and is not recommended.
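The fuzzy C-means algorithm named above alternates between updating cluster centres and fuzzy membership degrees until the partition stabilises. A minimal numpy sketch on hypothetical 2-D data (the study's field variables, data and class counts are not reproduced; m is the usual fuzziness exponent):

```python
# Minimal fuzzy C-means sketch: alternate centre and membership updates.
# Data and parameters below are illustrative, not the paper's.
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Return (centres, membership matrix U) for c fuzzy clusters."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted means
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)             # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)       # membership update
    return centres, U

# Two well-separated hypothetical "field" clusters in 2-D.
X = np.vstack([np.zeros((10, 2)), np.ones((10, 2)) * 5.0])
centres, U = fuzzy_cmeans(X, c=2)
labels = U.argmax(axis=1)                    # hard assignment per point
```

In an MZ workflow, each selected field variable becomes a column of `X`, and the hardened labels define the zone map whose class separation is then evaluated.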