995 results for attractive quality
Abstract:
In order to grow and survive, a firm must create value with consumers in ways that both fit consumer demands and stand out from competitors. Understanding how consumers and firms assess and create value has become a central concern in the contemporary strategic management and marketing literature for explaining firm survival and success. Consequently, the overall aim of this thesis is to provide a conceptually and empirically grounded understanding of consumers' and managers' value assessments and behavior in value creation. The thesis draws on a consumer experience perspective and on theories of social construction, organizational identity, self-congruence, and attractive quality, and combines multiple qualitative and quantitative studies. The findings shed light on the interplay between consumers, firms, and contextual structures in value creation. Contextual structural, cultural, and political forces are shown to affect, and be affected by, the shared and individual cognitions that firms and consumers use in their assessment and creation of value. The results enhance the understanding of how firms can adopt various strategic schemas or organizing logics to optimize different types of use-value creation when choosing between opposing and contradictory demands in their value creation. Furthermore, the thesis provides a deeper understanding of the hierarchical nature of consumer value judgments, which can be used to make firm prioritizations more effective and to serve as a foundation for future value-creating strategies.
Abstract:
In tephritid fruit flies of the genus Bactrocera Macquart, a group of plant-derived compounds (sensu amplo 'male lures') enhance the mating success of the males that have consumed them. For flies responding to the male lure methyl eugenol, this is due to the accumulation of lure-derived chemicals in the male rectal gland (the site of pheromone synthesis) and the subsequent release of an attractive pheromone. Cuelure, raspberry ketone and zingerone form a second, related group of male lures to which many Bactrocera species respond. Raspberry ketone and cuelure are both known to accumulate in the rectal gland of males as raspberry ketone, but it is not known whether the emitted male pheromone is thereby altered in complexity or made more attractive to females. Using Bactrocera tryoni as our test insect, and cuelure and zingerone as our test chemicals, we assess: (i) lure accumulation in the rectal gland; (ii) whether the lures are released exclusively in association with the male pheromone; and (iii) whether the pheromone of lure-fed males is more attractive to females than that of lure-unfed males. As previously documented, we found that cuelure was stored in its hydroxy form, raspberry ketone, while zingerone was stored largely in an unaltered state. Small but consistent amounts of raspberry ketone and β-(4-hydroxy-3-methoxyphenyl)-propionic acid were also detected in zingerone-fed flies. Males released the ingested lures, or their analogues, along with endogenous pheromone chemicals only during the dusk courtship period. More females responded to squashed rectal glands extracted from cuelure-fed flies than to glands from control flies, and more females responded to the pheromone of calling cuelure-fed males than to that of control males. In both cases the response to the zingerone treatments did not differ from the control. The results show that male B. tryoni release ingested lures as part of their pheromone blend and that, at least for cuelure, this attracts more females.
Abstract:
Ginger autotetraploids were produced by immersing shoot tips in a 0.5% w/v colchicine, 2% v/v dimethyl sulfoxide solution for 2 h. Stomatal measurements were used as an early indicator of ploidy differences in culture, with the mean stomata length of tetraploids (49.2 μm) being significantly larger than that of the diploid (38.8 μm). Of the 500 shoot tips treated, 2% were characterised as stable autotetraploid lines following field evaluation over several seasons. Results were confirmed by flow cytometry and, of the 7 lines evaluated for distinctness and uniformity, 6 were solid tetraploid mutants and 1 was a periclinal chimera. Significant differences were noted between individual tetraploid lines in shoot length, leaf length, leaf width, size of rhizome sections (knob weight) and fibre content. The solid autotetraploid lines had significantly wider, greener leaves than the diploids and significantly fewer but thicker shoots; although 'Queensland' (the diploid parent from which the tetraploids were derived) had a greater total rhizome mass at harvest, its knob size was significantly smaller. From the autotetraploid lines, one was selected for commercial release as 'Buderim Gold'. It compared favourably with 'Queensland' in aroma/flavour profile and fibre content at early harvest, and had consistently good rhizome yield. More importantly, it produced large rhizome sections, resulting in a higher recovery of premium-grade confectionery ginger and a more attractive fresh-market product.
Abstract:
Sown pasture rundown and declining soil fertility for forage crops are too serious to ignore, with losses in beef production of up to 50% across Queensland. The feasibility of using strategic applications of nitrogen (N) fertiliser to address these losses was assessed by analysing a series of scenarios using data drawn from published studies, local fertiliser trials and expert opinion. While N fertiliser can dramatically increase productivity (growth, feed quality and beef production gains of over 200% in some scenarios), the estimated economic benefits, derived from paddock-level enterprise budgets for a fattening operation, were much more modest. In the best-performing sown grass scenarios, average gross margins were doubled or tripled at the assumed fertiliser response rates, and internal rates of return of up to 11% were achieved. Using fertiliser on forage sorghum or oats was a much less attractive option and, under the paddock-level analysis and assumptions used, forages struggled to be profitable even on fertile sites with no fertiliser input. The economics of nitrogen fertilising on grass pasture were sensitive to the assumed response rates in both pasture growth and liveweight gain. Consequently, targeted research is proposed to re-assess the responses used in this analysis, which are largely based on research conducted 25-40 years ago, when soils were generally more fertile and pastures less rundown.
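The paddock-level economics described above rest on two standard calculations: a gross margin built from an assumed liveweight-gain response and input costs, and an internal rate of return on the fertiliser investment. A minimal sketch of both follows; all figures and parameter names are purely hypothetical illustrations, not the study's actual budget inputs.

```python
def gross_margin(lwg_kg_per_ha, beef_price, fert_kg_n, n_cost, other_costs):
    """Paddock-level gross margin ($/ha): revenue from liveweight gain
    minus fertiliser and other variable costs. Illustrative only."""
    return lwg_kg_per_ha * beef_price - fert_kg_n * n_cost - other_costs

def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection: the discount rate at which
    the net present value of the cash-flow series crosses zero."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid       # NPV still positive: true rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical fattening scenario: $200/ha margin, IRR on a made-up series.
margin = gross_margin(lwg_kg_per_ha=200, beef_price=2.0,
                      fert_kg_n=100, n_cost=1.5, other_costs=50)
rate = irr([-100, 60, 60])  # upfront outlay, two years of returns
```

The sensitivity noted in the abstract corresponds to how strongly `margin` and `rate` move when the assumed response rate (here, `lwg_kg_per_ha`) is varied.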
Abstract:
We consider the problem of wireless channel allocation to multiple users. A slot is given to the user with the highest metric (e.g., channel gain) in that slot. The scheduler may not know the channel states of all the users at the beginning of each slot. In this scenario, opportunistic splitting is an attractive solution. However, this algorithm requires that the metrics of different users form independent, identically distributed (i.i.d.) sequences with the same distribution, and that this distribution and the number of users be known to the scheduler. These requirements limit the usefulness of opportunistic splitting. In this paper we develop a parametric version of the algorithm, whose optimal parameters are learnt online through a stochastic approximation scheme. Our algorithm does not require the metrics of different users to have the same distribution; the statistics of the metrics and the number of users can be unknown and may vary with time, and each metric sequence can be Markov. We prove the convergence of the algorithm and show its utility by scheduling the channel to maximize throughput while satisfying fairness and/or quality-of-service constraints.
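The idea of learning a scheduling threshold online can be illustrated with a toy single-round contention model. The paper's algorithm resolves contention over several mini-slots and handles Markov metrics; the exponential metric model, step size, and target value below are illustrative assumptions, not the paper's construction.

```python
import random

def simulate(num_slots=20000, num_users=8, step=0.01, target=1.0):
    """Toy parametric threshold scheduler (sketch).

    Each slot, users whose channel metric exceeds a threshold contend for
    the slot. The threshold is adapted by a Robbins-Monro stochastic
    approximation so that, on average, about `target` users exceed it,
    without knowing the metric distribution or the number of users.
    Returns the fraction of slots won by exactly one user and the
    final threshold.
    """
    threshold = 1.0
    successes = 0
    for _ in range(num_slots):
        # Stand-in i.i.d. exponential channel metrics (an assumption).
        metrics = [random.expovariate(1.0) for _ in range(num_users)]
        contenders = sum(m > threshold for m in metrics)
        if contenders == 1:
            successes += 1
        # Push the expected number of contenders toward `target`.
        threshold += step * (contenders - target)
    return successes / num_slots, threshold
```

With 8 exponential users the threshold settles near ln 8, where the probability of exactly one contender per slot is close to its maximum for single-round threshold policies.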
Abstract:
The growing need to assess the environmental status of Mediterranean coastal marine habitats, and the large amount of data collected by Reef Check Italia onlus (RCI) volunteers, suggest the possibility of developing innovative and reliable indices that may support decision makers in applying conservation strategies. The aims of this study were to check the reliability of data collected by RCI volunteers, analyse the spatial and temporal distribution of the available RCI data, summarise the knowledge on the biology and ecology of the monitored species, and develop innovative indices to assess the ecological quality of Mediterranean subtidal rocky shores and coralligenous habitats. Subtidal rocky shores and coralligenous habitats were chosen because they are the habitats most attractive to divers, so most of the data refer to them; moreover, subtidal rocky bottoms are strongly affected by coastal urbanisation, land use, fishing and tourist activities, which increase pollution, turbidity and sedimentation. Non-indigenous species (NIS) have been recognised as a major threat to the integrity of native Mediterranean communities because of their proliferation, spread and impact on resident communities. Monitoring the spreading dynamics of NIS at the basin spatial scale is difficult but urgent. According to a field test, the training provided by RCI appears adequate to obtain reliable data from volunteers. Based on data collected by RCI volunteers, three main categories of indices were developed: indices based on species diversity, indices based on the occurrence of non-indigenous species, and indices based on species sensitive to physical, chemical and biological disturbances. As case studies, the indices were applied to stretches of coastline defined according to management criteria (province territories and marine protected areas).
The assessments of ecological quality in the Tavolara Marine Protected Area using the species sensitivity index were consistent with those previously obtained with traditional methods.
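As a rough illustration of the third category of index, a score of this kind can be computed as an abundance-weighted mean of per-species sensitivity values. The function and its inputs below are hypothetical placeholders, not RCI's actual scoring scheme.

```python
def sensitivity_index(sightings, sensitivity):
    """Toy ecological-quality score: abundance-weighted mean sensitivity
    of the species recorded by volunteers on a stretch of coastline.

    sightings:   {species: count} from volunteer surveys (hypothetical).
    sensitivity: {species: score in [0, 1]}, higher = more sensitive to
                 disturbance (hypothetical scale).
    """
    total = sum(sightings.values())
    if total == 0:
        return 0.0  # no records, no assessment
    return sum(n * sensitivity.get(sp, 0.0)
               for sp, n in sightings.items()) / total
```

Under such a scheme, a site dominated by disturbance-sensitive species scores high, while one dominated by tolerant or unlisted species scores low.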
Abstract:
BACKGROUND: Thoracic endovascular aortic repair (TEVAR) represents an attractive alternative to open aortic repair (OAR). The aim of this study was to assess outcome and quality of life in patients treated either by TEVAR or OAR for diseased descending thoracic aorta. METHODS: A post hoc analysis of a prospectively collected consecutive series of 136 patients presenting with surgical diseases of the descending aorta between January 2001 and December 2005 was conducted. Fourteen patients were excluded because of involvement of the ascending aorta. Assessed treatment cohorts were TEVAR (n = 52) and OAR (n = 70). Mean follow-up was 34 ± 18 months. End points were perioperative and late mortality rates and long-term quality of life as assessed by the Short Form Health Survey (SF-36) and Hospital Anxiety and Depression Score questionnaires. RESULTS: Mean age was significantly higher in TEVAR patients (69 ± 10 years versus 62 ± 15 years; p = 0.002). Perioperative mortality rates were 9% (OAR) and 8% (TEVAR), respectively (p = 0.254). Accordingly, cumulative long-term mortality rates were similar in both cohorts. Overall quality-of-life scores were 93 (63-110, OAR) and 83 (60-112, TEVAR), respectively. Normal quality-of-life scores range from 85 to 115. Anxiety and depression scores were not increased after open surgery. CONCLUSIONS: Thoracic endovascular aortic repair and OAR both provide excellent long-term results in treatment of thoracic aortic disease. Long-term quality of life, however, is reduced after thoracic aortic repair. Interestingly, TEVAR patients did not score higher in overall quality of life despite all advantages of minimized access trauma. Similarly, anxiety and depression scores are not reduced by TEVAR, possibly reflecting a certain caution against the new technology.
Abstract:
ATM, SDH and satellite links were used in the last century as broadcasters' contribution networks. Over the last decade, however, the attractive price of IP networks has been changing the infrastructure of these networks. Nowadays IP networks are widely used, but their characteristics do not offer the level of performance required to carry high-quality video under certain circumstances. Data transmission is always subject to line errors. In streaming, correction is attempted at the destination, whereas in file transfer, information is retransmitted until a reliable copy of the file is obtained. In the latter case, reception time is penalized because of the low priority this type of traffic usually has on networks. While in streaming the image quality is adapted to the line speed, and line errors result in a decrease of quality at the destination, in file copying the difference between coding speed and line speed, together with transmission errors, is reflected in an increase of transmission time. The way news or audiovisual programmes are transferred from a remote office to the production centre depends on the time window and the type of line available; in many cases it must be done in real time (streaming), with the resulting image degradation. The main purpose of this work is workflow optimization and image quality maximization. To that end, a transmission model for multimedia files adapted to JPEG2000 is described, based on combining the advantages of file transmission with those of streaming while setting aside the disadvantages of each. The method is based on two patents and consists of the reliable transfer of the headers and of the data considered vital for reproduction; the rest of the data is sent by streaming, so that recovery operations and error concealment can be carried out. Using this model, image quality is maximized for a given time window.
In this paper we first give a brief overview of broadcasters' requirements and the solutions offered by IP networks. We then focus on a different solution for video file transfer. Taking the example of a broadcast centre with mobile units (unidirectional video link) and regional headends (bidirectional link), we present a video file transfer method that satisfies the broadcasters' requirements.
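The hybrid model described above can be caricatured in a few lines: bytes considered vital for decoding travel over a reliable path, while the remaining data is streamed and may be lost. The byte-level framing, split point, and chunk size below are illustrative assumptions, not the patented method's actual handling of the JPEG2000 codestream.

```python
import random

def hybrid_transfer(codestream, vital_len, loss_rate=0.1):
    """Sketch of a hybrid file/streaming transfer.

    codestream: the full file as bytes.
    vital_len:  number of leading bytes treated as vital (headers plus a
                base quality layer, hypothetically) and sent reliably.
    loss_rate:  probability that a streamed chunk is lost in transit.

    Lost chunks are simply dropped here, standing in for the recovery
    and error-concealment operations applied at the receiver.
    """
    vital, rest = codestream[:vital_len], codestream[vital_len:]
    received = bytearray(vital)               # reliable path: always arrives
    chunk = 64
    for i in range(0, len(rest), chunk):      # streamed path: lossy
        if random.random() >= loss_rate:
            received += rest[i:i + chunk]
    return bytes(received)
```

The point of the split is that, whatever the loss rate, a decodable (if degraded) image is always available within the time window, since the vital bytes are guaranteed to arrive.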
Abstract:
Travel patterns in European urban areas are becoming increasingly complex, due fundamentally to sustained growth of the urban population and the trend towards urban sprawl. Consequently, many public transport users need to combine several modes or transport services to complete their daily trips. The challenge facing all major cities is therefore how to improve and increase mobility while at the same time reducing congestion, accidents and pollution (COM, 2006). Reducing the inconvenience inherent in transferring between modes is a basic principle for achieving sustainable mobility. In this regard, public transport interchanges play a key role as nodes of the urban transport network, and the quality of the service provided in them has a direct influence on travellers' daily experience. As noted by Terzis and Last (2000), an efficient urban transport interchange must be competitive and, at the same time, attractive for users, given that their physical experiences and psychological reactions are significantly influenced by the design and operation of the interchange. However, there are as yet no European standards or regulations specifying the form these interchanges should take.
This doctoral thesis provides knowledge and analysis tools addressed to planners and interchange managers, in order to better understand the performance of an urban transport interchange and to manage the available resources properly. Likewise, key factors in the design and operation of urban transport interchanges are identified, and some general planning guidelines are proposed on the basis of them. Since users' perceptions of their experience are particularly important for defining appropriate policy measures for interchanges, an ad-hoc travellers' satisfaction survey was designed and carried out in 2013 at three European transport interchanges: Moncloa (Madrid, Spain), Kamppi (Helsinki, Finland) and Ilford Railway Station (London, United Kingdom). In summary, this thesis highlights the ambivalent nature of urban transport interchanges, i.e. as nodes within the transport network and as places in themselves where users spend time, and proposes some policy recommendations to make urban transport interchanges more attractive for users.
Abstract:
Video quality assessment is still a necessary tool for defining the criteria that characterize a signal meeting the viewing requirements imposed by the final user. New technologies, such as stereoscopic 3D video and formats at and beyond high definition, impose new criteria that must be analysed to obtain the highest possible user satisfaction. Among the problems examined in this doctoral thesis, phenomena were identified that affect different phases of the audiovisual production chain and a variety of content types. In the first place, the content generation process should be controlled through parameters that prevent visual discomfort and, consequently, visual fatigue; this is especially necessary for stereoscopic 3D sequences, with both animation and live-action content. On the other hand, video quality assessment in the compression stage employs metrics that are not always adapted to the user's perception. The use of psychovisual models and visual attention maps allows regions of the image to be weighted so that more importance is given to the pixels the user will most probably focus on. These two fields of work are related through the definition of the term saliency: the capacity of the human visual system to characterize an image by highlighting the areas that are most attractive to the human eye. In the generation of stereoscopic content, saliency refers mainly to the depth simulated by the optical illusion, measured as the distance from the virtual object to the human eye. In two-dimensional video, however, saliency is based not on depth but on additional features, such as motion, level of detail, pixel position and the appearance of faces, which are the basic factors making up the visual attention model developed here.
To detect the characteristics of a stereoscopic video sequence that are most likely to generate visual discomfort, the extensive literature on the topic was reviewed and preliminary subjective tests with users were performed. These led to the conclusion that discomfort occurred when there was an abrupt change in the distribution of simulated depths in the image, apart from other degradations such as the so-called window violation. Further subjective tests, focused on analysing these effects under different depth distributions, were used to pin down the parameters defining such images. The results show that abrupt changes occur in environments with motion and high negative disparities, which interfere with the accommodation and vergence processes of the human eye and increase the time the crystalline lens needs to focus on virtual objects. To improve quality metrics through models adapted to the human visual system, additional subjective tests were performed to determine the importance of each factor in masking a given degradation. The results show a slight improvement when weighting masks and visual attention are applied, bringing the objective quality parameters closer to the response of the human eye.
Abstract:
We present a new approach that accounts for the non-additivity of the attractive parts of the solid-fluid and fluid-fluid potentials to improve the description of nitrogen and argon adsorption isotherms on graphitized carbon black in the framework of non-local density functional theory. We show that the strong solid-fluid interaction in the first monolayer decreases the fluid-fluid interaction, which prevents the two-dimensional phase transition from occurring. This results in a smoother isotherm, which agrees much better with experimental data. In the region of multi-layer coverage, the conventional non-local density functional theory and grand canonical Monte Carlo simulations are known to over-predict the amount adsorbed relative to experimental isotherms. Accounting for the non-additivity factor decreases the solid-fluid interaction as intermolecular interactions in the dense adsorbed fluid increase, preventing the over-prediction of loading in the multi-layer adsorption region. This improvement of the non-local density functional theory allows us to describe experimental nitrogen and argon isotherms on carbon black quite accurately, with a mean error of 2.5 to 5.8% instead of 17 to 26% with the conventional technique. With this approach, the local isotherms of model pores can be derived, and consequently a more reliable pore size distribution (PSD) can be obtained. We illustrate this by applying our theory to nitrogen and argon isotherms on a number of activated carbons. The fit between our model and the data is much better than with the conventional NLDFT, suggesting that the PSD obtained with our approach is more reliable.
Abstract:
The primary objective of this work is to relate biomass fuel quality to fast pyrolysis-oil quality in order to identify the key biomass traits which affect pyrolysis-oil stability. During storage the pyrolysis-oil becomes more viscous due to chemical and physical changes, as reactions and volatile losses occur with aging. Oil instability originates within the pyrolysis reactor, where the biomass is rapidly heated in the absence of oxygen, producing free-radical volatiles which are then quickly condensed to form the oil. The products formed do not reach thermodynamic equilibrium and in turn react with each other in moving towards product stability. The first aim of this research was to develop and validate a rapid screening method for determining biomass lignin content, as an alternative to traditional, time-consuming and hence costly wet chemical methods such as Klason. Lolium and Festuca grasses were selected to validate the screening method, as these grass genotypes exhibit a low range of Klason/Acid Digestible Fibre lignin contents. The screening methodology was based on the relationship between the lignin-derived products of pyrolysis and the lignin content as determined by wet chemistry. The second aim of the research was to determine whether metals have an effect on fast pyrolysis products, and whether any clear relationships can be deduced to aid feedstock selection for fast pyrolysis processing. It was found that alkali metals, particularly Na and K, influence the rate and yield of degradation as well as the char content. Pre-washing biomass with water can remove 70% of the total metals and improve the pyrolysis product characteristics by increasing the organic yield, the temperature at which the maximum liquid yield occurs, and the proportion of higher-molecular-weight compounds within the pyrolysis-oil. The third aim identified these feedstock traits and related them to pyrolysis-oil quality and stability.
It was found that mineral matter was a key determinant of pyrolysis-oil yield compared to the proportion of lignin. However, the higher-molecular-weight compounds present in the pyrolysis-oil derive from the lignin and can cause instability within the pyrolysis-oil. The final aim was to investigate whether energy crops can be enhanced by agronomic practices to produce a biomass quality which is attractive to the biomass conversion community, while also giving a good yield to farmers. It was found that nitrogen/potassium chloride fertiliser treatments enhance the qualities of Miscanthus, producing low ash and high volatile yields with acceptable yields for farmers. The progress of senescence was measured in terms of biomass characteristics and fast pyrolysis product characteristics. The results obtained from this research are in strong agreement with the published literature, and provide new information on biomass quality traits which affect pyrolysis and pyrolysis-oils.
Abstract:
Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference, held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organisation. Achieving the desired level of commitment from top management can, however, be a difficulty. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution, in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues.
It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed. This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. 
Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected the production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with and can be instrumental in moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. 
Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern. Together, however, they also demonstrate the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.