23 results for quality requirements

in Aston University Research Archive


Relevance: 70.00%

Abstract:

In this work we present a quality-driven approach to DASH (Dynamic Adaptive Streaming over HTTP) for segment selection in varying network conditions. Current adaptation algorithms focus largely on regulating data rates using network-layer parameters, selecting the level of quality on offer that can eliminate buffer underrun without considering picture fidelity. In reality, viewers may accept a level of buffer underrun in order to achieve an improved level of picture fidelity. In this case, conventional DASH algorithms can cause extreme degradation of picture fidelity when attempting to eliminate buffer underrun with scarce bandwidth availability. Our work is concerned with a quality-aware rate adaptation scheme that maximizes the client's quality of experience in terms of both continuity and fidelity (picture quality). Results show that the proposed scheme can maintain a high level of quality for streaming services, especially at low packet loss rates. It is also shown that eliminating buffer underrun completely greatly reduces the PSNR that reflects the picture quality of the video. Our scheme exposes the trade-off between continuity-based quality and resolution-based quality, which can be used to set threshold values for the level of quality desired by clients with different quality requirements. © 2013 IEEE.
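A minimal sketch of the kind of quality-aware segment selection the abstract describes, assuming per-representation PSNR values and a simple buffer model; all names, numbers and thresholds here are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of quality-aware DASH segment selection.
# Unlike a purely rate-driven picker, it tolerates a bounded risk of
# buffer underrun when that buys a large gain in picture fidelity (PSNR).

def select_representation(representations, buffer_s, throughput_bps,
                          min_psnr_db=34.0, min_buffer_s=4.0):
    """representations: list of (bitrate_bps, psnr_db), sorted by bitrate."""
    # Highest-fidelity representation the network can sustain outright.
    safe = [r for r in representations if r[0] <= throughput_bps]
    choice = safe[-1] if safe else representations[0]
    # If the safe choice falls below the client's fidelity threshold and
    # the buffer holds some reserve, step up one level and spend buffer.
    if choice[1] < min_psnr_db and buffer_s > min_buffer_s:
        idx = representations.index(choice)
        if idx + 1 < len(representations):
            choice = representations[idx + 1]
    return choice

reps = [(500_000, 31.0), (1_000_000, 35.5), (2_500_000, 39.0)]
print(select_representation(reps, buffer_s=8.0, throughput_bps=900_000))
```

Here the client accepts some underrun risk (stepping above measured throughput) because the fidelity threshold is the binding constraint, which is the behaviour the abstract contrasts with conventional rate-driven adaptation.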

Relevance: 60.00%

Abstract:

The need for low bit-rate speech coding is the result of growing demand on the available radio bandwidth for mobile communications, both for military purposes and for the public sector. To meet this growing demand, the available bandwidth must be utilized in the most economic way to accommodate more services. Two low bit-rate speech coders have been built and tested in this project. The two coders combine predictive coding with delta modulation, a property which enables them to meet the low bit-rate and good speech quality requirements simultaneously. To enhance their efficiency, the predictor coefficients and the quantizer step size are updated periodically in each coder. This enables the coders to keep up with changes in the characteristics of the speech signal over time and with changes in the dynamic range of the speech waveform. However, the two coders differ in the method of updating their predictor coefficients. One updates the coefficients once every one hundred sampling periods, extracting them from the input speech samples; it is known in this project as the Forward Adaptive Coder. Since the coefficients are extracted from the input speech samples, they must be transmitted to the receiver to reconstruct the transmitted speech, thus adding to the transmission bit rate. The other updates its coefficients every sampling period, based on information from the output data; this coder is known as the Backward Adaptive Coder. Results of subjective tests showed both coders to be reasonably robust to quantization noise. Both were graded quite good, with the Forward Adaptive coder performing slightly better, but at a slightly higher transmission bit rate for the same speech quality, than its Backward counterpart. The coders yielded acceptable speech quality at 9.6 kbit/s for the Forward Adaptive coder and 8 kbit/s for the Backward Adaptive coder.
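As a rough illustration of the backward-adaptive idea (predictor and step size adapted from the reconstructed signal only, so no side information needs to be transmitted), here is a toy one-tap adaptive delta modulator; it is a deliberate simplification under assumed parameters, not the coders actually built in the project:

```python
# Toy backward-adaptive delta modulator: a one-tap predictor and a step
# size that adapt from reconstructed samples only, so encoder and decoder
# stay in sync without transmitting coefficients (no side information).
import math

def encode(samples, step=0.1, a=0.5):
    bits, recon_prev = [], 0.0
    for x in samples:
        pred = a * recon_prev                 # one-tap prediction
        bit = 1 if x >= pred else 0           # 1-bit quantizer
        recon = pred + (step if bit else -step)
        # Backward step adaptation: grow the step on repeated bits
        # (slope overload), shrink it when the bit alternates (granular noise).
        step = max(0.01, step * (1.5 if bits and bit == bits[-1] else 0.66))
        bits.append(bit)
        recon_prev = recon
    return bits

signal = [math.sin(2 * math.pi * 0.05 * n) for n in range(40)]
print(encode(signal)[:16])
```

A forward-adaptive variant would instead re-estimate `a` from a block of input samples (e.g. every hundred sampling periods) and transmit it, which is the source of the extra bit rate the abstract mentions.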

Relevance: 60.00%

Abstract:

A comprehensive examination is made of the characteristics and quality requirements of bio-oil from fast pyrolysis of biomass. This considers all aspects of the special characteristics of bio-oil: how they arise and the solutions available to help meet requirements for utilisation. Particular attention is paid to chemical and catalytic upgrading, including synthesis gas and hydrogen production, which has seen a wide range of new research activity; more limited attention is given to chemicals recovery. An appreciation of the potential for bio-oil to meet a broad spectrum of applications in renewable energy has led to significantly increased R&D activity focused on addressing liquid quality issues, both for direct use for heat and power and for indirect use for biofuels and green chemicals. This increased activity is evident in North America, Europe and Asia, with many new entrants as well as expansion of existing activities. The only disappointment is the more limited industrial development and deployment of the fast pyrolysis processes that are necessary to provide the basic bio-oil raw material.

Relevance: 60.00%

Abstract:

Coke oven liquor is a toxic wastewater produced in large quantities by the iron and steel and coking industries, and gives rise to major effluent treatment problems in those industries. Conscious of the potentially serious environmental impact of the discharge of such wastes, pollution control agencies in many countries have imposed progressively more stringent quality requirements on the discharge of the treated waste. The most common means of treating the waste is the activated sludge process. Problems with achieving consistently satisfactory treatment by this process have been experienced in the past. The need to improve the quality of the discharge prompted earlier attempts by Tomlins to model the process using adenosine triphosphate (ATP) as a measure of biomass, but these were unsuccessful. This thesis describes work carried out to determine the significance of ATP in the activated sludge treatment of the waste. The use of ATP measurements in wastewater treatment was reviewed. Investigations were conducted into the ATP behaviour of the batch activated sludge treatment of two major components of the waste, phenol and thiocyanate, and of the continuous activated sludge treatment of the liquor itself, using laboratory-scale apparatus. On the basis of these results, equations were formulated to describe the significance of ATP as a measure of activity and biomass in the treatment system. These were used as the basis for proposals to use ATP as a control parameter in the activated sludge treatment of coke oven liquor, and of wastewaters in general. These proposals are relevant both to the treatment of the waste in the reactor and to the settlement of the sludge produced in the secondary settlement stage of the treatment process.

Relevance: 60.00%

Abstract:

This review covers the production and utilisation of liquids from the thermal processing of biomass and related materials to substitute for synthetic phenol and formaldehyde in phenol-formaldehyde resins. These resins are primarily employed in the manufacture of wood panels such as plywood, MDF, particleboard and OSB. The most important thermal conversion methods for this purpose are fast pyrolysis and vacuum pyrolysis, pressure liquefaction and phenolysis. Many feedstocks have been tested for their suitability as sources of phenolics, including hardwoods and softwoods, bark and residual lignins. Resins have been prepared utilising either the whole liquid product or a phenolics-enriched fraction obtained after fractional condensation or further processing, such as solvent extraction. None of the phenolics production and fractionation techniques covered in this review is believed to allow substitution of 100% of the phenol content of the resin without impacting its effectiveness compared to commercial formulations based on petroleum-derived phenol. This survey shows that considerable progress has been made towards the goal of a price-competitive renewable resin, but that further research is required to meet the twin challenges of low renewable resin cost and satisfactory quality. Particular areas of concern are wood panel press times, variability of renewable resin properties, odour, lack of reactive sites compared to phenol, and the potential for increased emissions of volatile organic compounds.

Relevance: 60.00%

Abstract:

A comprehensive examination is made of the characteristics and quality requirements of bio-oil from fast pyrolysis of biomass. An appreciation of the potential for bio-oil to meet a broad spectrum of applications in renewable energy has led to significantly increased R&D activity focused on addressing liquid quality issues, both for direct use for heat and power and for indirect use for biofuels and green chemicals. This increased activity is evident in North America, Europe, and Asia, with many new entrants as well as expansion of existing activities. The only disappointment is the more limited industrial development and deployment of the fast pyrolysis processes that are necessary to provide the basic bio-oil raw material. © 2012 American Institute of Chemical Engineers (AIChE).

Relevance: 60.00%

Abstract:

Tool life is an important factor to consider when optimising a machining process, since cutting parameters can be adjusted to optimise tool changing, reducing the cost and time of production. The performance of a tool is also directly linked to the surface roughness it generates, which matters in cases where there are strict surface quality requirements. The prediction of tool life and the resulting surface roughness in milling operations has attracted considerable research effort. The research reported herein focuses on defining the influence of milling cutting parameters, namely cutting speed, feed rate and axial depth of cut, on three major tool performance measures: tool life, material removal and surface roughness. The research seeks to define methods that will allow the selection of optimal parameters for best tool performance when face milling 416 stainless steel bars. For this study the Taguchi method was applied using an orthogonal array design that allows the entire parameter space to be studied with only a small number of experiments, saving experimental cost and time. The findings were that cutting speed has the most influence on tool life and surface roughness and very limited influence on material removal. Finally, tool performance can be judged either from tool life or from the volume of material removed.
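A small sketch of the kind of Taguchi analysis described, assuming a standard L9(3^3) orthogonal array over three levels each of cutting speed, feed rate and depth of cut, with made-up tool-life responses (the study's actual data are not reproduced here) and a larger-is-better signal-to-noise ratio:

```python
# Taguchi-style analysis sketch: an L9(3^3) orthogonal array over cutting
# speed, feed rate and axial depth of cut. Responses are illustrative only.
import math

L9 = [(0,0,0),(0,1,1),(0,2,2),(1,0,1),(1,1,2),(1,2,0),(2,0,2),(2,1,0),(2,2,1)]
tool_life_min = [42, 39, 35, 33, 30, 31, 22, 24, 20]   # made-up responses

# Larger-is-better S/N ratio for each run (single replicate): -10*log10(1/y^2).
sn = [-10 * math.log10(1.0 / y**2) for y in tool_life_min]

# Mean S/N per level of each factor; the factor with the widest range of
# level means has the strongest influence on the response.
for f, name in enumerate(["speed", "feed", "depth"]):
    means = [sum(s for run, s in zip(L9, sn) if run[f] == lvl) / 3
             for lvl in range(3)]
    print(name, [round(m, 2) for m in means],
          "range:", round(max(means) - min(means), 2))
```

With these assumed responses the "speed" factor shows the widest range of level means, mirroring the abstract's finding that cutting speed dominates tool life and surface roughness.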

Relevance: 60.00%

Abstract:

A segment selection method controlled by Quality of Experience (QoE) factors for Dynamic Adaptive Streaming over HTTP (DASH) is presented in this paper. Current rate adaptation algorithms aim to eliminate buffer underrun events by significantly reducing the code rate when experiencing pauses in replay. In reality, however, viewers may choose to accept a level of buffer underrun in order to achieve an improved level of picture fidelity, or to accept a degradation in picture fidelity in order to maintain service continuity. The rate adaptation scheme proposed in our work can maximize the user QoE in terms of both continuity and fidelity (picture quality) in DASH applications. It is shown that using this scheme a high level of quality for streaming services, especially at low packet loss rates, can be achieved. Our scheme can also maintain the best trade-off between continuity-based quality and fidelity-based quality by determining proper threshold values for the level of quality intended by clients with different quality requirements. In addition, the integration of the rate adaptation mechanism with the scheduling process is investigated in the context of a mobile communication network, and the related performance is analyzed.
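To make the continuity/fidelity trade-off concrete, here is a hypothetical composite QoE score that weighs rebuffering against picture quality, with a client-set weight standing in for the per-client thresholds the abstract mentions; the weighting, PSNR span and values are illustrative assumptions, not taken from the paper:

```python
# Hypothetical composite QoE score: continuity (rebuffering ratio) traded
# against fidelity (mean PSNR). A fidelity-sensitive client sets a higher
# fidelity_weight; a continuity-sensitive client sets a lower one.

def qoe(rebuffer_ratio, mean_psnr_db, fidelity_weight=0.5):
    continuity = 1.0 - min(rebuffer_ratio, 1.0)                   # 1.0 = no stalls
    fidelity = min(max((mean_psnr_db - 25.0) / 15.0, 0.0), 1.0)   # 25-40 dB span
    return (1.0 - fidelity_weight) * continuity + fidelity_weight * fidelity

# With weight 0.7 this client prefers brief stalls at high PSNR over
# stall-free playback at low PSNR; with weight 0.3 the ranking flips.
print(qoe(0.05, 38.0, fidelity_weight=0.7))   # some stalling, high fidelity
print(qoe(0.00, 30.0, fidelity_weight=0.7))   # no stalling, low fidelity
```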

Relevance: 40.00%

Abstract:

Some of the factors affecting colonisation of a colonisation sampler, the Standard Aufwuchs Unit (S. Auf. U.), were investigated, namely the immersion period, whether the unit was anchored on the bottom or suspended, and the influence of riffles. It was concluded that a four-week immersion period was best. S. Auf. U. anchored on the bottom collected both more taxa and more individuals than suspended ones. Fewer taxa but more individuals colonised S. Auf. U. in the potamon zone compared to the rhithron zone, with a consequent reduction in the values of pollution and diversity indices. It was concluded that a completely different scoring system was necessary for lowland rivers. Macroinvertebrates colonising S. Auf. U. in simulated streams, lowland rivers and the R. Churnet reflected water quality. A variety of pollution and diversity indices were applied to results from lowland river sites. Instead of these, it was recommended that an abbreviated species and relative abundance list be used to summarise biological data for lowland river surveillance. An intensive study of gastropod populations was made in simulated streams. Lymnaea peregra increased in abundance whereas Potamopyrgus jenkinsi decreased with increasing sewage effluent concentration. No clear-cut differences in reproduction were observed. The presence or absence of eight gastropod taxa was compared with concentrations of various pollutants in lowland rivers. On the basis of all the field work, it appeared that ammonia, nitrite, copper and zinc were the toxicants most likely to be detrimental to gastropods, and that P. jenkinsi and Theodoxus fluviatilis were the least tolerant taxa. 96 h acute toxicity tests on P. jenkinsi using ammonia and copper were carried out in a flow-through system after a variety of static range-finding tests. P. jenkinsi was intolerant of both toxicants compared to reports on other taxa, and the results suggested that these toxicants would affect the distribution of this species in the field.

Relevance: 30.00%

Abstract:

Developing economies offer tremendous potential for future growth, and organizations appreciating these consumers' requirements stand to reap considerable returns. However, compared with more developed economies, published consumer studies are few. In particular, there is a dearth of service quality research and hardly any from Africa. Furthermore, the little available research tends to apply Western methodologies, which may not be entirely appropriate. This research investigates East African consumer perceptions of retail banking using an approach that takes account of the research context. Qualitative research was undertaken to define the relevant service attributes. Performance along these was then investigated through a survey with over 2000 respondents. Principal component analysis identifies 13 core service dimensions, and multinomial logistic regression reveals which of these are the key drivers of customer satisfaction. Comparison of the results with studies from other regions confirms that established standardized research instruments are likely to miss or under-represent service attributes important in developing countries.
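A minimal sketch of the analysis pipeline the abstract describes, dimension reduction followed by multinomial logistic regression, using scikit-learn on synthetic stand-in data, since the survey data themselves are not public:

```python
# Sketch of the reported pipeline: reduce survey attribute scores to core
# service dimensions with PCA, then relate those dimensions to a
# satisfaction rating with multinomial logistic regression.
# All data below are synthetic stand-ins, not the study's survey responses.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 30))          # 30 service attribute scores
y = rng.integers(0, 3, size=2000)        # satisfaction: low / mid / high

pca = PCA(n_components=13)               # 13 core dimensions, as reported
components = pca.fit_transform(X)

clf = LogisticRegression(max_iter=1000).fit(components, y)
# Coefficient magnitudes hint at which dimensions drive satisfaction most.
print(np.abs(clf.coef_).mean(axis=0).round(3))
```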

Relevance: 30.00%

Abstract:

This research was conducted at the Space Research and Technology Centre of the European Space Agency (ESA) at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions was approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work, allowing the interests of the research to be related to European public policy. The principles of communication were also considered in the topic of information visualisation: this emerging technology exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field of endeavour. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.

Relevance: 30.00%

Abstract:

Objectives - Powdered and granulated particulate materials make up most of the ingredients of pharmaceuticals and are often at risk of undergoing unwanted agglomeration, or caking, during transport or storage. This is particularly acute when bulk powders are exposed to extreme swings in temperature and relative humidity, which is now common as drugs are produced and administered in increasingly hostile climates and are stored for longer periods of time prior to use. This study explores the possibility of using a uniaxial unconfined compression test to compare the strength of caked agglomerates exposed to different temperatures and relative humidities. It forms part of a longer-term study to construct a protocol for predicting the caking tendency of a new bulk material from individual particle properties. The main challenge is to develop techniques that provide repeatable results yet are presented simply enough to be useful to a wide range of industries. Methods - Powdered sucrose, a major pharmaceutical ingredient, was poured into a split die and exposed to cycles of high and low relative humidity at room temperature. The typical ranges were 20–30% for the lower value and 70–80% for the higher value. The outer die casing was then removed and the resultant agglomerate was subjected to an unconfined compression test using a plunger fitted to a Zwick compression tester. Force against displacement was logged so that the dynamics of failure as well as the failure load of the sample could be recorded. The experimental matrix included varying the number of cycles, the difference between the maximum and minimum relative humidity, the height and diameter of the samples, and the particle size. Results - Trends showed that the tensile strength of the agglomerates increased with the number of cycles and with more extreme swings in relative humidity. This agrees with previous work on alternative methods of measuring the tensile strength of sugar agglomerates formed by humidity cycling (Leaper et al 2003). Conclusions - The results show that the uniaxial tester is at the very least a good comparative tester for examining the caking tendency of powdered materials, with an arrangement and operation simple enough to be compatible with the requirements of industry. However, further work is required to continue to optimize the height/diameter ratio used in the tests.
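For context, the failure strength in an unconfined compression test is typically derived from the peak load in the logged force-displacement trace and the sample cross-section; a tiny illustrative calculation, with all numbers assumed rather than taken from the study:

```python
# Illustrative unconfined-compression calculation: failure stress from the
# peak force in a logged force-displacement trace and the sample's
# cross-sectional area. All values are assumed, not from the study.
import math

force_n = [0, 12, 25, 41, 58, 66, 71, 63, 40]   # logged force trace (N)
diameter_m = 0.025                               # 25 mm sample diameter

area_m2 = math.pi * diameter_m**2 / 4
failure_stress_pa = max(force_n) / area_m2       # peak load at failure
print(round(failure_stress_pa / 1e3, 1), "kPa")
```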

Relevance: 30.00%

Abstract:

Indicators which summarise the characteristics of spatiotemporal data coverages significantly simplify quality evaluation, decision making and justification processes by providing a number of quality cues that are easy to manage, and by avoiding information overload. Criteria which are commonly prioritised in evaluating spatial data quality and assessing a dataset's fitness for use include lineage, completeness, logical consistency, positional accuracy, and temporal and attribute accuracy. However, user requirements may go far beyond these broadly accepted spatial quality metrics to incorporate specific and complex factors which are less easily measured. This paper discusses the results of a study of high-level user requirements in geospatial data selection and data quality evaluation. It reports on the geospatial data quality indicators which were identified as user priorities, and which can potentially be standardised to enable intercomparison of datasets against user requirements. We briefly describe the implications for tools and standards to support the communication and intercomparison of data quality, and the ways in which these can contribute to the generation of a GEO label.

Relevance: 30.00%

Abstract:

Requirements-aware systems address the need to reason about uncertainty at runtime in order to support adaptation decisions, by representing quality of service (QoS) requirements for service-based systems (SBS) with precise values in run-time queryable specification models. However, current approaches do not support updating the specification to reflect changes in the service market, such as newly available services or improved QoS of existing ones. Thus, even if the specification models reflect requirements that were acceptable at design time, they may become obsolete and miss opportunities for system improvement through self-adaptation. This article proposes to distinguish "abstract" and "concrete" specification models: the former consists of linguistic variables (e.g. "fast") agreed upon at design time, and the latter consists of precise numeric values (e.g. "2ms") that are dynamically calculated at run-time, thus incorporating up-to-date QoS information. If and when freshly calculated concrete specifications are no longer satisfied by the current service configuration, an adaptation is triggered. The approach was validated using four simulated SBS that use services from a previously published, real-world dataset; in all cases, the system was able to detect unsatisfied requirements at run-time and trigger suitable adaptations. Ongoing work focuses on policies that determine when specifications should be recalculated. This approach will allow engineers to build SBS that are protected against market-caused obsolescence of their requirements specifications. © 2012 IEEE.
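A compact sketch of the abstract/concrete split described above: a linguistic variable such as "fast" is resolved at run-time against the QoS currently offered on the service market, and an adaptation is triggered when the bound service no longer satisfies the recalculated value. The names, the quartile mapping and the thresholds are illustrative assumptions, not the paper's actual policy:

```python
# Sketch of "abstract" vs "concrete" QoS specifications: the abstract model
# fixes a linguistic variable at design time; the concrete value is
# recalculated at run-time from the QoS the service market currently offers.

def concretize(linguistic, market_response_times_ms):
    ranked = sorted(market_response_times_ms)
    # Illustrative mapping: "fast" = best quartile of the current market.
    if linguistic == "fast":
        return ranked[max(len(ranked) // 4 - 1, 0)]
    raise ValueError(f"unknown linguistic variable: {linguistic}")

def needs_adaptation(current_service_ms, linguistic, market_ms):
    threshold = concretize(linguistic, market_ms)
    return current_service_ms > threshold    # spec no longer satisfied

market = [2, 3, 3, 4, 6, 9, 12, 20]          # ms; improves as market evolves
print(needs_adaptation(current_service_ms=5, linguistic="fast", market_ms=market))
```

The point of the split is visible here: the design-time agreement ("fast") never changes, while the numeric threshold it denotes tightens automatically as better services enter the market, so a once-acceptable service can legitimately trigger adaptation later.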

Relevance: 30.00%

Abstract:

Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organisation. Achieving the desired level of commitment from top management can, however, be difficult. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution, in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry, where it has been found there is still some confusion over the roles of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both the design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed.
This paper is then followed by a contribution by Tanaka and colleagues, in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers, they develop an analytical model which is applicable to the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints, and that the approach is consistent with, and can be instrumental in, moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars, who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern.
Together, however, they also argue for, and demonstrate, the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.